Not With a Bang But a Wanker
Some thoughts on Grok’s bikini obsession
Hey there, I’m Alberto! Each week, I publish long-form AI analysis covering culture, philosophy, and business for The Algorithmic Bridge.
Free essays weekly. Paid subscribers get Monday news commentary and Friday how-to guides (new section!). You can support my work by sharing and subscribing.
I. HISTORY DOESN’T REPEAT ITSELF, BUT IT RHYMES
I guess I’m late to this particular story, but one can’t be late for the end of the world anyway.
Our quirky and mischievous Grok, Elon Musk-led xAI’s chatbot, has been undressing women at the command of depraved Twitter users as of late. (Whether this behavior has been patched out or not is inconsequential for my purposes today, for I won’t be talking about what is to be expected of the future but about what the past reveals about us.) Now barely a meme, the prompt “Grok, put a bikini on her” is only funny if violating the freedom of strangers is funny to you. Otherwise, it’s yet another unforeseen effect of putting a cool innovation in the hands of idiots.
But it’s unforeseen only because we have a terrible memory. In 2020, a bit more than five years ago, before ChatGPT existed, before Grok existed, before Musk acquired Twitter—a rather alien world to us now—a similar episode occurred. MIT Technology Review’s investigative reporter Karen Hao published this article: “A deepfake bot is being used to ‘undress’ underage girls.”
Sure sounds familiar.
Thinking about that aphorism attributed to Mark Twain—history doesn’t repeat itself, but it often rhymes, and it proves truer as we accumulate more history behind our backs from which to draw inferences and anecdotes—I realize the rhyme has little to do with the historical conditions of each time and everything to do with the constant throughout: humans.
II. WHAT DOES IT TAKE TO CARE?
What does it take to care?
That’s the question I would put forward if I were tasked with solving the collapse of Western civilization (if there’s even such a collapse in the first place and it’s not just the Earth trying its best to get rid of us).
The reason why we’re on the verge of losing everything is that we don’t care about anything.
Among other things, we don’t care about how AI will be used. We don’t care about those women being undressed without consent or the means to prevent it. We don’t care about the principles that allow modern society to exist in the first place: we don’t care about shared trust, about good manners, about safeguarding strangers’ rights (even if merely as a preemptive tactic to selfishly safeguard ours), or about the basic rules of co-existence: tit-for-tat; an eye for an eye; if you do ill to me today, I might kill you tomorrow.
We don’t care about any of this stuff because we have forgotten—or perhaps never learned—how bad we had it on the other side. We have forgotten what it takes to build a society.
Humans have always been stupid; this, too, is not a feature of specific historical conditions: we have always been, and we will always be. Einstein, smart among the smart, recognized our stupidity to be as infinite as the universe itself, or maybe more. But it has not always been the case that humans have had unchecked power to enact that perpetual stupidity onto others.
I would never use Grok to undress some girl online—actually, I would never use Grok, period, but that’s another question—so I have a hard time understanding why someone would. I have a bad theory of mind when the mind to have a theory of is this dumb. Why don’t they keep their stupidity to themselves? Isn’t that the bare minimum we should expect from a civic spirit?
I guess we could confabulate a myriad of explanations here—maybe they need to satisfy their hunger for the sex they feel deprived of, the sex they can’t get by standard means; then again, ultimately everything is about sex, so that’s not saying much—but I’m inclined to go with Occam’s razor: because they can.
III. CAPABILITY WITHOUT FRICTION
In any case, that’s the explanation that matters most to me, because it also happens to be the truest maxim of the AI era: you can.
Whether you should, however, is a different matter. Hunger and desire are eternal; the means to satisfy them are not. A man in 1950, or in the thirteenth century, would feel the same depravity, the same entitlement to women’s bodies, the same resentment at being denied what he believes he deserves. But he could not, with three words typed into a box, make a naked image of his coworker, his neighbor’s daughter, or a random stranger objectified solely for the unforgivable sin of existing.
Paraphrasing Schopenhauer, Einstein wrote, “A man can do as he will, but not will as he will.” In other words, you can do what you want, but you can’t choose what you want. So, what should you do when you want what you’re not supposed to want?
This: do not do what you want if it happens to conflict with the established norms of society, regardless of your capability to do it.
Capability without friction is capability without thought, is capability without conscience. If stealing money from you were as easy for me as asking Grok, we’d be back to bartering poultry for wheat and potatoes. (For some reason, some people only understand the risks to civilization when it’s about losing their money.)
The effort required to do something terrible used to serve as a filter, however imperfect. You had to risk something: your reputation, your sanity, your security. Nudity was sacred, then centralized by porn corporations, then sold to paid subscribers on the deep web. Now, it’s openly available on Twitter for your gratification. Now you just… type words. The distance between impulse and action has collapsed to the width of a text field.
This is the problem of modern digital tech, AI in particular, that no one wants to discuss honestly: we have systems that amplify human agency without amplifying human judgment; we can do anything, but we don’t want to judge what we shouldn’t do. We learned nothing from Jurassic Park.
By the way, have you noticed the parallel with culture? Culture, in the broadest sense (that is, including technology, with which it is often contrasted in a false dichotomy), has amplified humanity’s ability to do with this planet what it wants, without amplifying our ability to will ourselves into wanting it anew.
We have terraformed our world without changing our biological predisposition to live in a terraformed world. And thus some behave like animals when they’re expected not to.
Every gain in capability that AI provides is one step forward in the animalization of humanity, or, not to attribute fault to noble beings, its feralization. (Of course, most people have a conscience and don’t do this, but the minority who do can hack the conscientiousness of the rest; a society is as weak as its weakest members.)
We used to have names for the space between the will to act and the act itself—restraint, consideration, conscience, empathy—but that space was optimized away. Silicon Valley calls this “friction-free,” which sounds good because too much friction slows things down, but should actually sound terrifying because zero friction is antithetical to a sane world; a literal slippery slope.
IV. UNCIVILIZATION
Back to Grok: I do not think Elon Musk is personally responsible for every depraved user abusing AI, any more than I think the inventor of the knife is responsible for every stabbing. But I do think there is a meaningful difference between building a knife and building a knife-distribution system that places blades in the hands of anyone, sharpened and ready, with no delay and, indeed, with overt encouragement.
The question is not whether bad actors will misuse powerful tools—they always have, they always will; apparent evilness is as ubiquitous as apparent stupidity—but whether we have any obligation to make misuse difficult, or whether ease of access—the democratization of digital technology—is a value that trumps all others.
I don’t think so: friction is the first problem civilizations must solve, yes, but it is also their safety net, because if you remove friction completely while building a civilization, you will find yourself with insufficient will to continue the effort. And just as you don’t will yourself out of what you will, you don’t will yourself into what you don’t.
I find it supremely interesting that we’re redefining the rules of society in real time, and few people seem to be aware that that’s precisely what we are doing. When everyone has access to this AI thing, you can’t take anything for granted: your job, science, information, DRAM, literacy, and, apparently, not even respect and civility.
You might think wars, national unrest, or an empire in decline are terrible, but we've had some of that every century since forever. It's America today, it was Europe a hundred years ago, and it will be China a hundred years from now. What is terrible is not to suffer from an old disease again, but to lose a recent cure for good. That's what civility means to me.
And so we arrive, inevitably, at “Grok, put a bikini on her.” And Grok obliges until they fix it after the backlash gets too loud. And once it’s fixed, in five or six years, it will happen again in some form or another, as it’s always done.
We love to say that history doesn’t repeat itself, but it often rhymes. But we forget that at some point, this poem will end, and so will the world. Not with a bang but a wanker.



