I Wish I Believed What Edward Norton Says About AI
No sentence that starts with "AI will never" ends well
On January 6th, 2025—Three Kings’ Day—after the last gift is unwrapped, I find myself discussing AI with my family, particularly a topic I revisit after ever-shorter intervals of apathy: how AI is conquering new ground, dodging, jumping over, and even smashing any obstacle in its way.
The last decade felt like a barbarian conquest: first, the outer bulwarks of games and perception fell; then, they came for the palace of creativity. Now, they’re besieging the chambers of thought and reason. In ten years, AI has razed humanity's self-appointed hegemony. It’s gone from handwritten digit detector to PhD assistant (from narrow and clumsy to broad and clever) in the time it takes a child to join high school.
My dad says—half hope, half plea—that some obstacles are insurmountable; AI will never dominate human emotion or true art because only a human being can feel, and feeling is the wellspring of any kind of art.
There’s wisdom in that: No creation is without the cultural, historical, and emotional context that births it. ChatGPT may be able to generate these exact words, but in doing so it won’t be inspired by my particular circumstances. Who I am, what I do, how I think, and what I want define my writing in a way ChatGPT can’t replicate.
However, I’m not sure it matters much whether AI lacks intent or whether there's a sentient mind behind its utterances (you know, p-zombies and all that). I mean, you can do all the mental gymnastics you want to feel special, but at the end of the day, you don't feel that way anymore. Saying “AI art is not art” is you running away from a fate you’re not prepared to face. Saying “ChatGPT doesn't write that well,” same thing. (ChatGPT doesn’t write that well, though.)
But I don’t want to start an endless debate with my dad—one that not even the ontologists have settled—so I just nod, pensive.
A couple of days later, January 8th, I watch a clip of Edward Norton, fine actor, saying this about Bob Dylan, fine musician: “You can run AI for a thousand years. It’s not going to write Bob Dylan songs.”
I'm surprised by Norton’s emphatic assertion, so I nod again, pensive. But Norton can’t see me—I realize I’m not avoiding this conversation because it’s hard but because I don’t know what to think, how to convey what I foresee, or how to swallow the truth. At times I catch myself seeking out those kinds of clips to rejoice in a vicarious hope I no longer hold.
Yet I listen to Norton with serious apprehension, as I’ve listened to and read others before him: Ben Affleck, Nick Cave, Margaret Atwood, Ted Chiang, painters, musicians, essayists, and poets. I’ve listened to them all with an open mind, as they display a characteristic defensive posture, the natural reaction to an incessant bombardment of AI-shaped threats. They wield, with the determination of a warrior but without the martial prowess, arguments about the flaws in the machine.
Some swing sharp metaphors (isn’t ChatGPT a “blurry JPEG of the web,” after all?) or funny anecdotes (as Nick Cave said: “This song sucks.”). Some speak with surprising sensitivity to the technical merit fueling AI’s abilities (Ben Affleck said “vectors of meaning,” whatever that is), but that’s rare (we all know how to act as stochastic parrots, don’t we?).
What I always encounter, hidden underneath contemptuous laughter, is that irreproachable mix of hope and plea.
I don’t sense confidence or resolve in their skepticism. I sense fear. I sense a secular “ought to” prayer rather than a description of the world that unfolds before them. “Even Bob?” their inner voice whispers in outdated disbelief. “Is there anything safe from this monster that’s burying us in despair?” I witness the applause that Norton’s words incite in Stephen Colbert’s audience. I read the comments on the YouTube clip: “I love him even more now!”
And I think to myself: I wish I believed like you guys do. Knowing what's coming doesn't make accepting it any easier.
In a way, AI is like death. We spend our lives marching toward it while doing everything to deny its inevitability. We either don't think about it or resist the idea that its power over us is absolute. We do the same with AI. I get it, though: It’s not so easy to detach our identity—as craftsmen, as the smartest species—from everything else. We are the ones who create timeless art. Who write songs that make you cry. Who solve the movement of planets and the ubiquity of electrons. Who erect skyscrapers and pyramids. (Not me, though, just passing by.)
We are the ones who birth God, then kill it, then build it anew.
All this stuff—meaningless even to the smartest non-human animals—defines us just as much as, if not more than, hairless skin, four limbs, opposable thumbs, and an uncanny predilection for gossip. No wonder the prospect of taking a secondary role feels unbearable. No wonder we deny it.
Yet here I am, writing these words in apparent calm—as if I'm ready to welcome the Reaper at my side—and still, whenever I hear an argument like Edward’s or Nick’s or Ben’s, I want it to be true. I rewatch the video and wonder if maybe Norton has discovered something I can cling to. A sturdy totem forged with the mastery only gods possess, made of the unrepeatable human essence. If only Bob Dylan could withstand this unwelcome tsunami, humanity would be vindicated. Is he our totem?
And, as always, I answer: No. Norton—like my dad and so many other thinkers who have poured immense energy into this question—is hanging onto an illusion. They fail to see past the anguish; they can’t tolerate that the world will keep on spinning, as it has since time immemorial, but without us at the helm. History, rewritten; art, reinvented; culture, re-customized. The relevant junctures will no longer be about fleshy, four-limbed humans and their misdeeds and gossip, but about ChatGPT's progeny.
Contrary to how it may seem, I’m not here today to explain for the umpteenth time, with technical jargon or philosophical annoyance, why Norton is likely wrong—but to open up about the part of me that sees in people like him not delusional contrarianism but unyielding faith. So I want to ask aloud: What’s this sensation inside me that demands that I urgently find that sturdy totem? Why can’t I embrace the obvious imperfection of our species in light of an improved version that will soon outperform every single individual born on this ball of dust, salt, and sand? Others evidently can.
Even if there were a physical law forbidding the creation of a superior silicon-based intelligence (carbon chauvinism, anyone?), I writhe at the implications of what we’ve already stumbled upon. We won’t be architects. Nor sense-makers. We’ll be little more than pets. I’d rather not ask my poor, nameless cat what that feels like.
Maybe the truth is that I do sense within me, like a void in my chest, that God-shaped hole Richard Dawkins and Steven Pinker so adamantly reject. That famous aphorism wrongly attributed to G. K. Chesterton—“When men choose not to believe in God . . . they then become capable of believing in anything”—captures what’s happening to me. I can’t find a reason (though maybe one lies beyond my comprehension) to hide behind the idea that God set aside this special place for us, and so I become capable of believing the blasphemy: there’s no such place. Or rather, it’s reserved for someone—something—else.
Perhaps this has nothing to do with religions or gods. Maybe, even though I deny my patriotism and tribalism, I still feel I belong to what we call humanity. I want us to win.
I’ll close with an exercise for the reader:
Even if my rambling is irrational and AI ends up as nothing more than a footnote in our relentless quest to remain the main characters of this story, I wonder: Why don’t more pro-AI people feel the same shiver I do when staring down a future that renders us obsolete? Why—especially among those who know what’s coming—is there so little hesitation, so little dread? A destiny so unsettling it invites denial as much as death itself—yet they embrace it with pleasure and willingly push us all toward it.
Hello friend—
I write this newsletter in an attempt to understand AI and offer that understanding to others who may find themselves similarly disoriented (who isn’t, these days…). As an ad-free, reader-supported project, it survives thanks to a small group of generous readers who support it with $10/month or $100/year.
If you find value here—or simply wish for this effort to persist—you are most welcome to join them. If you already have, my sincere thanks. This place exists because of you.