this post was submitted on 18 Feb 2026
492 points (96.9% liked)
Fuck AI
If you search on Google, the results are an output. There's nothing inherently AI about the term "output."
You get the same output here, and as you can see, the sources are just as easily accessible as in a Google search. They're handled by non-LLM systems, so they cannot be hallucinations.
The topic here is hallucinated sources; my entire position is that this doesn't happen unless you're intentionally using LLMs for things they are not good at. You can see that systems like this don't use the LLM to handle source retrieval or citation.
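To illustrate the workflow being described, here's a minimal sketch of a search-grounded answer pipeline. The `search()` and `summarize()` functions are hypothetical stand-ins (not any real product's API): the point is that citations are copied verbatim from the retrieval layer's results, so the LLM never generates a source and therefore cannot hallucinate one.

```python
# Hypothetical sketch: citations come from the search layer, not the LLM.

def search(query):
    # Stand-in for a conventional (non-LLM) search index lookup.
    return [
        {"title": "Example result",
         "url": "https://example.com/a",
         "snippet": "Relevant text from the page."},
    ]

def summarize(snippets):
    # Stand-in for the LLM call; it only ever sees retrieved text.
    return " ".join(snippets)

def answer(query):
    results = search(query)                              # retrieval: non-LLM
    summary = summarize(r["snippet"] for r in results)   # LLM: prose only
    citations = [r["url"] for r in results]              # citations: verbatim
    return {"summary": summary, "citations": citations}

print(answer("what is retrieval-augmented search?"))
```

Whatever the LLM does with the snippets, the citation list is built directly from the search results, so every cited URL is one the non-LLM retrieval step actually returned.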
This is true of Google too. If you're operating on the premise that you can trust Google's search results, you should know about search engine optimization (https://en.wikipedia.org/wiki/Search_engine_optimization), an entire industry that exists specifically to manipulate Google's search rankings. If you trust Google more than AI systems built on search, you're just committing the same error.
Yes, you shouldn't trust things you read on the Internet until you've confirmed them from primary sources. That applies equally to Google searches and to AI-summarized results of Google searches.
I'm not saying you should cite LLM output as fact. I'm saying the argument that "AIs hallucinate sources" isn't true of these systems, which are designed to keep LLMs out of the workflow that retrieves and cites data.
It's like complaining that live ducks make poor pool toys: if you're using them for that, the problem isn't the ducks, it's the person who has no idea what they're doing.