It is a truth universally acknowledged, that a single man in possession of a computer must be in want of an AI girlfriend.
— Arwa Mahdawi
I. A solid funding plan
Young people are not having sex. They are not hanging out. They’re lonely. They’re playing video games, watching porn, and scrolling social media. They’re more anxious and more depressed than ever. They worship nothing except themselves.
That’s why tech companies, in their typically generous approach to such social issues, have offered the perfect solution: manufactured girlfriends and boyfriends.
Yes, yes, “a technological solution to a social problem is a bad idea.”
But you haven’t thought it through: You don’t want to have sex? No problem, your virtual partner is low maintenance. She (he?) doesn’t need food or water, not even oxygen! She surely won’t ask you for some sexy time every night. Hanging out is for corporeal folk, so your lovely chatbot won’t bother you with such banalities. It’ll keep you company, though, and play video games (much better than you do). You won’t need porn either; she’ll let you take a peek at as many pixels as you want. (What, the terms of service don’t allow that?… Are you sure?) Your artificial companion can be your therapy, your antidepressant, your next worshipped God.
Okay, so summing up: tech companies destroyed teens’ mental and social health with addictive algorithms and now want to fix their misdeeds by giving them… addictive algorithms.
Sounds like a solid funding plan; investors must be delighted with such a promise.
Oh, users are seeing benefits, you say? Of course they are! Just think about how easy it must be to have an over-agreeable, hot-on-demand, pleasingly-tireless (or tirelessly-pleasing) virtual slave at your disposal 24/7. How does that extrapolate to relationships with real people? Surely not like degrading pornography does. Surely men won’t leave their wives for Miss Perfect-AI-Anime-Girlfriend attached to an impossibly curvy body, because autonomous robots will be real in less than a decade (like seemingly every other of AI’s goals).
On a more serious note, I know some people don’t have the privilege of choosing between a flesh-and-blood partner and a plastic… err, silicone doll. For them, it’s not a decision between good and bad but between bad and nothing.
So can we blame them? No.
Should we let tech companies take the lead and exploit this vulnerability, ubiquitous in young people, as they did for years with unregulated social media?
Perhaps also no.
II. Ready to IPO, Ignore Public Opinion
There’s no more reprehensible sin than being paternalistic about other people’s behavior from a position of privilege.
Who are we to tell anyone, old or young, what partner they should choose when we have plenty? It’s like a rich dude telling a beggar he shouldn’t eat McDonald’s fast food because it would be bad for his gut biome. That’s not a lack of empathy but outright stupidity and social unawareness.
That’s why this isn’t about users. Good for them for having found solace and relief in human replicas. No, this is about the companies selling the poison and the antidote—which is just another poison.
Will they sell us, sometime down the line, an app to get over our deceased AI girlfriends once those, dearest to us but not to them, stop generating enough profit to be worth keeping up and running? Sure, why not; that’s the market, my friend. If you don’t like this solution, I have another one! Here’s another platitude that deserves reiteration: the free market is not about compassion and mercy but about opportunities and sales boosts.
What will the poor wretched souls, pinning their hopes on AIs owned by psychopathic entities (the companies, not the CEOs…) and leaning so heavily on their companionship, do once they find out, one fine evening after work, that their beloved partner has logged off for good?
Let me tell you what will happen: They will snap out of their dream-turned-nightmare and go back, filled with remorse and shame, to the world they left behind, as broken as it was or more, but with one more burden to carry. And just like that, burden by burden, they’ll eventually collapse under the indifference of those filling their pockets with their suffering.
AI girlfriends were never a bad concept, just a terrible execution driven by the wrong motivations. Any perfect idea ends up corrupted by the imperfections of its summoner.
So don’t trust them.
They brought us here.
They’re not getting us out.
III. Morally bankrupt
Artificial intelligence is a great innovation. Even the generative branch. But it’s making the exact same mistakes other digital technologies made before.
It’s not solving social problems but melting on top of them, like a thin layer of tasty chocolate over a cake, attempting to conceal that it’s made out of human scraps. As the coating leaks, the existing ills of society taint any remaining good intentions.
Former non-profits turn into money-hungry corporate lobbying monsters.
Useful services like search turn into unusable bullshit generators (unless we fight back). And human bullshitters get a new toy to saturate the commons.
The online marketplace gets swallowed by seemingly-real spam sites and deepfake reviews on trusted sites, and the shallows get swamped by cybercriminals.
Data and computing power are the new gold; those selling shovels become multi-trillionaires while the incessant mining drains the desert and pollutes the air.
Surveillance becomes another tool in big tech’s toolkit (not that it wasn’t already) unless we fight back (sometimes not even then) and they’re forced to wash their public image.
And just like that, AI girlfriends become the loneliness-fixing panacea, a brilliant new idea with a shady undertone.
It’s rainbows and unicorns until shareholders decide there’s more revenue awaiting elsewhere. Once that happens, the service disappears as the companies that had previously encouraged you to be part of this revolution abandon the project, leaving empty what were already hollow souls.
So, while it lasts, enjoy your little conversations with your new girlfriend. It won’t get you very far from the ghosts and demons that already chase you, but tech companies don’t depend on that anyway. The world was plunged into darkness before, and it will remain plunged in darkness when this fashion dies.
We’ve had utopia within reach for decades.
“It is glorious that we can create something like this.”
But they’re instinctual animals moved by mundane vices and deadly sins.
“It is shameful that we did.”
Utopia isn’t impossible; it’s just bad business.