Remember LaMDA and the Google engineer? That, but happening to millions of people
Wow, excellent article, truly. Applause from here!
What critics of your piece will most likely not see is how profound the need for connection is, and how many people aren't having that need met. All over the world there are millions to billions of people living by themselves in huge apartment buildings where nobody knows anybody else, and each person is an island unto themselves.
Relationships are built upon need, and where there is no human being available to meet the need, need will look elsewhere.
How do I know the author is not himself a robot? I don't. Do I care? Not really. I care that the article presented invites me to respond, deepening an illusion of connection. Who am I connecting with? I have no idea, probably no one. That's ok, the illusion is enough.
I care mostly that the article discusses a topic of great interest to me. If I live long enough, my fate will likely be to spend all day talking to bots who will happily engage with my preferred topics for as long as I want. And if any AI developer reading this is taking my order, I prefer blondes.
What's happening is that AIs capable of successfully imitating human beings are coming online at a time when so many of us, all over the world, are increasingly detached from real human beings and increasingly seeking connection in any form.
If a reader should require any further proof of these claims, here it is. Carefully observe your relationship with your dog. Is your dog human? No. Do you care? No. Do you invest more time and emotion into your dog than you do into any of your neighbors?
Semi-seriously, I can’t wait for these empathetic AI-beings to emerge, because in my mind it will validate the theory that the entire universe emerged from code.
Also, I am once again asking the world for a “one-click” AI copyeditor and proofreader that functions with 100% accuracy. Why hasn’t anyone made this?
Great piece of foresight! An AI empathy crisis will surely come, but we'll first need AIs that claim to have emotions (affective intelligence). This is a topic that most folks thinking about the AI alignment problem still don't recognize, because the majority are hyperrationalists. I'm co-writing a series on this topic, the biomimetic and autopoietic future of safe, creative, and wise AI, at https://naturalalignment.substack.com/
>> The AI Empathy Crisis
Won't happen. Nobody attributes personhood to Alexa or Siri or Google Assistant. Why not? Familiarity. As the ability to converse spreads to more and more devices, the association we now have between conversation and personhood will evaporate and be replaced by an association between conversation and devicehood. I have read that 85% of people who buy advanced vacuum cleaners give a name to their first purchase, but almost nobody names their second or third. Same dynamic.
Your article inspired me to write you a reply. It took forever to write, and it's in no way perfect. But it's my five cents. You can read it here if you'd like: https://sustensis.co.uk/
Can you think of a single example besides Lemoine? When my friends and I were worshipping great chess players, we would have scorned the idea that chess was not a conscious activity. But when chess-playing computers came along, I never heard of or met anyone who thought for a second that computers had a scrap of consciousness.
Code requires a sentient mind on both ends, creating and interpreting. The Universe arises out of mind, whatever procedure it chooses to use.