“I don’t like this whole talk of ‘we’re living in a post-truth world,’ as if we ever lived in a truth world”
Ever since the first humans developed language, we’ve been navigating an information landscape pitted with lies, tall tales, myths, pseudoscience, half-truths, and plain old inaccuracies.
*Personally, I don’t agree with some of the statements, but I must acknowledge that panic and being judgemental won’t help us out either. In this day and age, proper contact appears to have become more difficult, and that’s even without the pitfalls of mis- and disinformation.*
Just yesterday I had my time wasted by LLM-generated nonsense. I wanted to know when whistling tea kettles similar to the classic design we know today first became popular. The first search result was a 3000-word essay on the history of kettles, so I started reading. You’ll know you’ve found the same one I did if, at various points, it claims that the kettle was invented “ca. 8000 BC”, “4000 years ago”, “around 3000 BC”, “15,000 years ago”, and “approximately 906-1127 AD”.
There are various other inconsistencies and things that make no sense at all by human standards, but it’s written in an authoritative tone, looks pretty nice, and was the first result on my searx instance, appearing in the results from several well-known search engines. It wasn’t immediately obvious to me that it’s all bullshit, and there’s probably at least some truth mixed in there somewhere.
It’s not exactly something to panic over, I’d say, but it sure is annoying.
Search results being polluted by LLM content is so annoying. As if all the SEO hadn’t already done enough to bring down overall web quality.
Recently I was searching for technical guidance on how to do a particular thing with IPython and found what looked like just the right page. Except when I tried to run the examples, nothing worked, because it was all made up! The entire site/domain was a collection of machine-generated answers dressed up as blog posts on common programming questions (which were probably scraped from some site with real human contributions).
I can’t even.
It wasn’t immediately obvious to me that it’s all bullshit
And here we have the biggest problem. Many people won’t fully read a long text, and the discrepancies in the generated bullshit might be more subtle and/or require a fair bit of understanding of the topic at hand to spot.
Bullshit generators (I refuse to call text generators, regardless of their working principle, by any other name) don’t just annoy and waste time; they have the potential to misinform people about virtually anything, and if those people make decisions based on a bunch of generated bullshit, it can cause real-world damage.
Bullshit generators
The problem with this term is that quite a few human professions could be described by it too, mainly in the spheres of populist politics, marketing, sales, and similar fields.
I don’t see a problem with including those, too. Maybe for a more nuanced view, we could introduce a distinction between biological and technological bullshit generators.
As if correct and incorrect don’t exist. Obey. Idiocracy. Right and wrong are an illusion. How dumb, but at least they’re trying to justify their null-information brain fart of an article.