FauxLiving@lemmy.world 1 points 5 hours ago

I'm not sure by what standard you're calling it unreliable.

You can see in the example I provided that it correctly answered the question and correctly cited where the answer came from, in the same amount of time it would take to type the query into Google.

Yes, LLMs by themselves can hallucinate, and they do so at a high enough rate that they're unreliable sources of information. That is 100% true. It will never be fixed, because LLMs are essentially trained as autocomplete: to produce syntactically plausible language. You should never depend on raw LLM-generated text from an empty context, like from a bare chatbot.

Academic study of this (example: https://arxiv.org/html/2312.10997v5) has found that an LLM's hallucination rate can be dropped to almost nothing (lower than a human's) when it is given text containing the information it is being asked about. So if you paste a document into the chat and ask a question about that document, the hallucination rate drops significantly.
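
Roughly, "pasting the document in" just means building a prompt that contains the source text plus the question. A minimal sketch of that, where `ask_llm` is a stand-in for whatever chat API you already use and the prompt wording is only illustrative:

```python
# Context-stuffing sketch: the document goes into the prompt so the model
# answers from the supplied text instead of from memory.
# `ask_llm` is a placeholder for any chat-completion call, not a real API here.

def build_grounded_prompt(document: str, question: str) -> str:
    return (
        "Answer the question using ONLY the document below. "
        "If the document doesn't contain the answer, say so.\n\n"
        f"--- DOCUMENT ---\n{document}\n--- END DOCUMENT ---\n\n"
        f"Question: {question}"
    )

# answer = ask_llm(build_grounded_prompt(open("report.txt").read(),
#                                         "What does the report conclude?"))
```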

This finding led to a technique called Retrieval-Augmented Generation (RAG): you use some non-AI means of finding data, like a search engine, and then put the retrieved documents into the context window along with the question. That way you can build systems that use the LLM for the tasks it's accurate and fast at (like summarizing text that is already in the context window) and non-AI tools for the things that require accuracy (like searching databases for facts and tracking citations). A toy version of that pipeline is sketched below.
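
Here's that toy sketch (the names `keyword_search`, `Doc`, and `ask_llm` are invented for illustration, not any particular library): a plain keyword search picks the documents, the documents and their IDs go into the prompt with the question, and the citations come from the retrieval step rather than from the model, so they can't be hallucinated.

```python
# Toy RAG sketch: non-AI retrieval picks the documents, the LLM only answers
# over what retrieval returned, and citations are tracked by the retrieval
# layer instead of being generated by the model.
from dataclasses import dataclass

@dataclass
class Doc:
    doc_id: str   # citation handle (URL, title, database key, ...)
    text: str

def keyword_search(corpus: list[Doc], query: str, k: int = 3) -> list[Doc]:
    """Crude non-AI retrieval: rank documents by query-term overlap."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda d: len(terms & set(d.text.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_rag_prompt(docs: list[Doc], question: str) -> str:
    """Stuff the retrieved documents (with their IDs) into the context window."""
    sources = "\n\n".join(f"[{d.doc_id}]\n{d.text}" for d in docs)
    return (
        "Answer using ONLY the sources below and cite the [id] you used.\n\n"
        f"{sources}\n\nQuestion: {question}"
    )

# docs = keyword_search(my_corpus, user_question)
# answer = ask_llm(build_rag_prompt(docs, user_question))  # ask_llm = your chat API
# citations = [d.doc_id for d in docs]  # citations come from retrieval, not the LLM
```

The point of the split is that the part needing exactness (finding and citing sources) never touches the generative model.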

You can see in the images I posted that it both answered the question and correctly cited the source, which was the entire point of contention.