Discussion about this post

Suw Charman-Anderson:

When it comes to content, whether that's academic papers, reviews, short stories, books or any other form of it, the adjustment from scarcity to abundance has been painful for many, even as it has brought benefits too. Traditional gatekeepers have been weakened, frequently replaced at least in part by curators, and in many cases this has been a good thing. That I can locate and read nearly any academic paper is a boon. That books and stories are so easy to find and read is amazing. Less fun are, for example, the spam reviews on pretty much every site that hosts them, Amazon being a particularly notable example of a company that hosts spam reviews and doesn't seem to care.

But we are now moving into an era of superabundance, and no one is prepared. I've recently been paying a lot of attention to superabundance in the literary industry. For example, the literary magazine Clarkesworld received 500 spam short story submissions between 1 and 20 February, at which point it had to close submissions completely. It usually receives 10-25 such submissions a month. Most of those 500 were LLM-generated. And Clarkesworld isn't the only magazine to have been affected.

It's only too easy to imagine a time when Kindle is flooded with LLM-generated novels and novellas, when reviews on Amazon and elsewhere are predominantly ChatGPT-created, and when magazines and agents are drowning in LLM submissions. I wrote about this here:

https://wordcounting.substack.com/p/can-publishing-survive-the-oncoming

Of course, if we were talking about a superabundance of quality content, that would be one thing. Discovery would get harder, but we'd still get something useful (ish) at the end of the process. We're not. We're talking about a superabundance of LLM-generated trash, whether that's citations or papers or opinion pieces or novellas or books or whatever. The infosphere is going to become horribly polluted. It's as if we've just crossed an information event horizon beyond which nothing found on the internet can be relied upon, because all that LLM trash is going to pollute the search engines.

I've really been trying to find a light at the end of this tunnel, but every conversation I've had about it just makes me more concerned. Even if OpenAI adds a digital watermark to its output so that it can be detected, there will be other LLMs that don't, and soon we won't be able to tell what's real from what's been made up. When LLMs get good enough to sound just like humans, and they will, how will we tell humans apart from LLMs? And honestly, that's not a rhetorical question.

Gerben Wierda:

Great observation, beautiful example. The current generative AI wave certainly brings back memories of the early days of the internet (the 1990s). At the time I was one of the lone voices against the naiveté of the tech-optimists (in newspaper opinion pieces and one TV debate). But looking back, while it was easy to spot simplistic nonsense, I missed the darker, more corrosive side of things, such as mass manipulation and the problematic effects of the 'attention economy'.

That we will be seeing a tsunami of 'noise masquerading as signal' seems likely. But I wonder what I am missing now.

Either we will not cope and we will culturally drown in it, or we will cope, but if so: how? Coping may, for instance, mean the internet waning in influence, with only sources that have strict policies of human-curated content (or human-curated sources) remaining, content and sources you have to pay for. The 'if you're not paying for the product, you are the product' model might become less and less workable, as what you can consume for 'free' (i.e. in exchange for data about you) will be worth almost nothing. Most open comment sections (like this one) might have to shut down, as will other open communities. 'Islands' (smaller, closed groups) may become a dominant pattern (again). If the sea of 'noise masquerading as signal' becomes orders of magnitude larger than the noise we already have, trust will be so rare that trust itself becomes valuable (again).

Food for thought and thank you for putting it so clearly, with such a great example, in front of us.
