You may not know exactly what “slop” means in relation to artificial intelligence. But on some level you probably do.
Slop, at least in the fast-moving world of online message boards, is a broad term that has gained some traction as a label for low-quality or unwanted AI content in social media, art, books and, increasingly, search results.
Google suggesting that you add non-toxic glue to keep the cheese stuck to your pizza? That’s slop. So is a low-priced digital book that looks like what you were searching for, but not quite. And those posts in your Facebook feed that seemingly came out of nowhere? They’re slop too.
The term became more popular last month when Google incorporated its Gemini AI model into its US search results. Instead of pointing users to links, the service now attempts to answer a query directly with an “AI Overview,” a block of text at the top of the results page in which Gemini offers its best guess at what the user is looking for.
The change came after Microsoft incorporated AI into its Bing search results. Google’s own rollout had some immediate glitches, leading the company to say it would scale back some of its AI features until the issues were resolved.
But with the major search engines having made artificial intelligence a priority, it seems that vast amounts of machine-generated information, with little or no human curation, will be an everyday part of internet life for the foreseeable future.
Hence the term slop, which conjures up images of piles of unappetizing feed being shoveled into animal troughs. Like that kind of slop, AI-assisted search is thrown together quickly, and not necessarily in a way that critical thinkers can easily digest.
Kristian Hammond, director of Northwestern University’s Center for Advancing Safety of Machine Intelligence, noted a problem with the current model: the information in the AI Overview is presented as a definitive answer, rather than as a starting point for a user’s research on a given topic.
“You look for something and you get back what you need to think about — and it actually encourages you to think,” Mr. Hammond said. “What’s happening, in this integration with language models, is something that doesn’t encourage you to think. It encourages you to accept. And that, I think, is dangerous.”
To target a problem, it helps to have a name for it. And while slop is a candidate, it is still an open question whether it will catch on with mainstream audiences or end up in the slang dustbin alongside cheugy, bae and skibidi.
Adam Aleksic, a linguist and content creator who posts under the social media handle etymologynerd, thinks slop, which he said has yet to reach a wider audience, shows promise.
“I think this is a great example of a distinctive word right now, because it’s a word we all know,” Mr. Aleksic said. “It’s a word that seems naturally applicable to this situation. So it’s less in your face.”
The use of slop as a descriptor for low-quality AI material apparently arose as a reaction to the release of AI art generators in 2022. Some have identified Simon Willison, a programmer, as an early adopter of the term, but Mr. Willison, who has pushed for its adoption, said it was in use long before he picked it up.
“I think I might be too late to the party!” he said in an email.
The term has appeared in comments on 4chan, Hacker News, and YouTube, where anonymous posters sometimes display their expertise on complex topics using in-group language.
“What we always see with any slang is that it starts with a niche community and then spreads from there,” Mr. Aleksic said. “Usually, coolness is a factor that helps it spread, but not necessarily. Like, we had a lot of words spread by a bunch of coding nerds, right? Look at the word ‘spam.’ Usually, a word is created because there is a certain group with common interests and a common need to invent words.”
In the short term, AI’s impact on search engines and the internet in general may be less extreme than some might fear.
News organizations are concerned about shrinking online audiences as people rely more on AI-generated answers, and data from Chartbeat, a firm that analyzes online traffic, showed an immediate dip in referrals from Google Discover to sites in the early days of AI Overviews. That traffic has since rebounded, however, and in the first three weeks of AI Overviews, overall search traffic to more than 2,000 major US websites actually increased, according to Chartbeat.
Mr. Willison, who described himself as an optimist about artificial intelligence when it is used well, thought slop could become the most popular term for unwanted machine-generated content.
“Society needs concise ways to talk about modern artificial intelligence, both the positives and the negatives,” he said. “‘Ignore this email, it’s spam’ and ‘Ignore this article, it’s slop’ are both useful lessons.”