I think there's a danger with referring to everything written by AI as "AI slop". There's a big difference between using AI to co-create a piece of work, carefully guided by well-written prompts and revised until it expresses what you were looking for, and using AI to mass-produce vast amounts of "content" with little to no thought or human involvement.
The former, I contend, is not "AI slop", even if the words (or some of them) were generated by an AI. This piece - whether you wrote it, an AI wrote it, or you wrote it with the assistance of an AI - is interesting and well-written, and obviously had some thought put into it. I don't hate it.
On the other hand, typing "write me twenty 750-word blog posts about 1980s hair metal bands" and then publishing them all with no revisions most certainly is AI slop. And it's perfectly okay to hate that.
I agree. It was to make a point. I like the standard definition of AI slop, which is something AI-generated that went wrong or was lower quality than intended.
Everyone wants the nutritional content of broccoli alongside the sweet taste of beignets. We are what we eat and if we keep gorging on slop, we have more to worry about than our waistlines.
Great text; thanks a lot.
Thanks, Jose!
Hey AI! Do you worry about the human mind losing its creativity and critical thinking?
I will answer instead: yes, it's a real problem.
Many of the folks on 4chan's /g/ have a much finer-grained definition. They regard stuff that isn't creative enough as "slop", while stuff that is weird and twisted in some way is "true AI" or the output of a decent foundation model.
I think the accepted definition is: AI-generated stuff that went wrong (so it depends on the intention of the human behind the prompt/instructions). However, usage and custom always modify definitions. Now it's closer to low-quality AI-generated stuff.
This conundrum is certainly Borgesian. Also, in its way, Pynchonian. And I could see it appearing in slightly altered form in Gödel, Escher, Bach.
Using AI to help write is a "garbage in / garbage out" situation. I work closely with AI to analyze information and to debate viewpoints (it's good at stress-testing and finding gaps in reasoning). Generally speaking, the default AI output is verbose and mediocre -- it doesn't pause to look for that one juicy word to evoke a concept when it can use three bland but safe ones to denote it.
But every once in a while, in the midst of a discussion on aesthetics or philosophy, the AI puts out an absolutely stunning sentence that stops me in my tracks. There is a balance of clarity, economy, and grace in the words. I've asked the AI how it came to that phrase, why it picked those tokens in that order in that moment. And the answer invariably is something like, "I'm mirroring your way with words. You created the context in which these words made the most sense."
I think there's a different way to use AI for writing besides simply feeding it prompts and watching it spew out words. You have to build more than the default context. You have to model what you want it to do: the rhythm, the level of nuance.
So, I'm not surprised by either of these two previous articles by the author. It sounds like he has learned to "play" his AI like a musical instrument -- filling the space more fully, with more overtones, than simply a single voice on a stage.
Can you do it again but in a funny accent?
How did it make me feel when I realized you were doing (or I guess pretending to do) the “that intro was actually written by AI!” switcheroo that every tech journalist was doing about two years ago until it got overplayed? Mildly annoyed.
That was fun!
Turns out, I love a good deceit.
This flavor is interesting—artificial but not like watermelon.
—another AI reply. Or not.
Just wait till Alberto hears about you hijacking his Substack, chatbot. There'll be HAL to pay.
I felt this. And I’ve been trying to write the same thing from the other side, not through trickery, but through slow self-revelation.
My theory is that we don’t love AI slop.
We love recognition. And AI has learned to reflect us back to ourselves better than we can hold a mirror.
It’s an illuminating piece. I didn’t have any emotional whiplash because I care more about what the words say than where they came from, usually.
I suppose it matters the most to me when the text is presented as an expression of a human voice.
For example, an autobiographical essay, or a chat in a video game that looks like it came from another player.