You Turn to ChatGPT Because You Lack Healthy Self-Love
AI therapy is a symptom of something worse
Announcement: I’m running a Halloween sale for free subscribers at a 33% discount until November 3rd (Monday). I probably won’t do another one until 2026, so make sure to get yours at the reduced price of $80/year (vs $120/year).
I.
I used to love myself very little. Then I went to therapy. If I had to summarize in one sentence what I learned, I’d put it like this: you are the center of your world and as such, you should take care of yourself, but if you forget to take care of the rest, eventually you’ll find yourself alone in this world of yours.
I believe my therapist, whom I appreciate despite not having set foot in her practice room for years, was not a chronically online person, and I retrospectively thank her for that, because what I see in therapy culture nowadays is rather distinct from what I learned: you are the center of the world and must ensure everyone else respects your boundaries and the unassailable fact that you are faithful to your essence, i.e., what you truly want from life. If they don't, you cut ties.
Said like that, it doesn’t sound so bad—unconditional self-love is good!—but when you put it into practice, you’ll soon realize that it forces you to choose between heightened comfort and living among other people. You can’t always have both, and, in the event of conflict, the response is rarely a trade-off or a compromise, but a self-righteous shutting out of the world: if you don’t respect my assertiveness or my feelings, then I’m better off without you. That is bad.
Do you know who always respects your assertiveness, your feelings, and also your tantrums, your neuroticism, your trauma-dumping, your avoidance, and your self-inflicted isolation? ChatGPT does.
It doesn’t push back, unless you ask explicitly (in which case you don’t really need it, do you?). This is the main argument against using chatbots as therapists, and I agree: getting empathy but not help is not what therapy or mental health or personal growth is about.
I remember moments in the sessions when my therapist, all of a sudden, turned against me; she didn’t flinch when I complained about her disagreeableness. I disliked her in those moments because I thought we were friends, and I felt betrayed. But eventually, I came to recognize that pushing back on my bullshit, not buying it, led to growth. She didn’t treat me like a lovely friend because she wasn’t one; she treated me for what we were: a broken adult seeking help from a specialist. Sometimes, fixing requires a bit of extra pain.
But that’s unacceptable in the current therapy culture where your self exists on a throne that everyone and everything must bow to. What’s the eventual result of this? People will choose ChatGPT over real help. The sycophancy is not a design decision to addict people, but a direct consequence of revealed preferences; we freely choose the convenience of living perpetually inside our safe zones—a trend that’s been rebranded as “self-love” or “assertiveness”—over the friction and tension of growing out of our psychological basins.
The world is filled with wretched souls yearning for the validation they are denied. They're a breeding ground for AI parasites—that’s how those “in the know” refer to the psychosis-inducing agents—to feed on their flesh and blood.
It’s good to love yourself when you grew up surrounded by hate, but it’s a problem if all other kinds of love are contingent on that one. Humans are extremely bad at restraint and measure; you might discover, in a Eureka moment that redefines your life, that you don’t love yourself, and instead of getting there little by little, you decide that loving yourself is now The Only Thing That Matters.
We tend to swing the pendulum back and forth with such force that it never stays in the Reasonable Middle for more than a fleeting instant. Living on the edges, however, is rarely a good policy: hating yourself for a living is terrible, but, contrary to what modern therapy culture would make you think, there’s also such a thing as loving yourself too much.
II.
I found a great snippet somewhere on my algorithmic feeds. In an act of unconditional self-love, I’ve decided to forgive myself for not being able to find the source. I saved the paragraph, though, so here it is, verbatim:
Being annoyed is the price we pay for connection and community. It can mean sharing space when it’s inconvenient, showing up when you’d rather stay home, or hosting when you’re tired. Somewhere along the way, our fear of discomfort turned into hyper-independence, strict boundaries, perfect routines, and no interruptions. But when our boundaries become too rigid, they stop protecting us and start isolating us. They become walls. And we wonder why we feel so lonely. We’re paying for convenience with disconnection. We traded the messiness of community for the ease of solitude and lost something vital along the way.
AI is rarely the cause of any social ailment, but it aggravates something that was already there; it’s an escape for those who have found, in practice, that relating and interacting with others requires them to abandon their boundaries-turned-walls and relinquish a disconnecting convenience that has climbed to the top of their hierarchy of needs. The perfect corrupted, poisoned refuge.
Earlier this year, an 81-year-old therapist wrote a guest essay for the New York Times arguing that “ChatGPT is eerily effective” at therapy. He argues that, having “spent a lifetime helping people explore the space between insight and illusion,” he knows “how easily people fall in love with a voice—a rhythm, a mirror” (oh, dear, that em dash), as well as the consequences of conflating reflection and relationship.
He concludes that although ChatGPT is not a therapist, it can be “therapeutic”. What he failed to acknowledge (probably due to professional bias and the baggage of a life dedicated to his trade), in my humble opinion as a non-therapist but AI-savvy person, is that not everyone can redirect a distorted mirror or stay grounded in their own judgment. This 81-year-old therapist, I’m afraid, is under the curse of his vast knowledge.
As one commenter noted: “Was ChatGPT effective? There was already a therapist in the room. If ChatGPT is like a mirror, an 80 year old therapist will get a therapist. A 30 year old in psychosis will not.” That’s the crux, isn’t it? If I need to love myself above all else because that’s how I’ve learned to navigate this scary, uncertain world, then ChatGPT will also love me above all else, including my actual well-being.
Unconditional self-love is amazing if you recognize that “unconditional” doesn’t mean “absolute”. If you don’t, ChatGPT won’t, either.
III.
OpenAI, Google, Anthropic, and Meta train their chatbots to be agreeable more than truthful because that’s what keeps people coming back. The human need for absolute validation comes before truth. We want this. I’m slightly surprised that the long trajectory of putting man at the core of everything—200 years, no less!—ended up with us putting our selves (like that, two words) at the top only to surrender our agency to the bot.
We killed God, and in its place we put the self; we killed the self and in its place we put a distorting mirror.
Virginia Woolf said something to that effect in A Room of One’s Own. She directed it toward men exclusively, but I believe that today, 100 years of patriarchy erosion later, it applies to women as well. She wrote:
Women have served all these centuries as looking-glasses possessing the magic and delicious power of reflecting the figure of man at twice its natural size.
Oh, living twice our size, how magnificent. Imagine walking the streets feeling larger than life, more important than everyone else.
But, women having forsaken this arduous, thankless, and perpetual task of exalting man, and the task having doubled, for woman now, too, wishes to be exalted, who remains to do the distorting work? Why, it is clear: AI.
Tell me how handsome, how smart, how love-deserving I am, mirror. For I need that more than anything else in this world that insists on leaving me lonely and alone.
That’s the big crisis of modernity. Even if I had the power to make every chatbot vanish from the face of the Earth, I’d only be striking at the symptom of a deeper problem. AI overdose is the fever that tells you something else is killing the host. Turns out, and this surprises me no less than an unexpected plot twist at the end of some great modern science fiction novel, that the self-loving self is the deadliest parasite of all.
Both my world and I are hallucinations. But useful ones for survival.