this post was submitted on 01 Jan 2026
681 points (98.9% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything branded as "AI" to boost market valuations.

[–] Bazell@lemmy.zip 17 points 18 hours ago (2 children)

I knew that you could wear down an AI chatbot's safety guardrails and get it talking about forbidden topics like making explosives, but this is a whole new level of AI hallucination, and an even more dangerous one.

[–] AnarchistArtificer@slrpnk.net 4 points 15 hours ago

It gets worse the longer you engage with the chatbot. OpenAI didn't expect conversations to last for months and months, across thousands of messages. Of course, when they did learn that people were engaging with ChatGPT in this way, and that it severely compromised its already insufficient safeguards, their response was "huzzah, more engagement. How do we encourage more people to fall into this toxic cycle?"

[–] leftzero@lemmy.dbzer0.com 3 points 15 hours ago (1 child)

It's the same level of "hallucinations" as always, that is, zero.

This isn't hallucinating (LLMs don't have a mind; they aren't capable of hallucinating, or of any other form of thought); this is working as intended.

These things will tell you whatever you want to hear. Their purpose isn't to provide information; it's to create addiction, to keep the customer engaged and paying for as long as possible, regardless of the consequences.

The induced psychosis and brain damage are a feature, not a bug, since they make the victim more dependent on the LLM, and on the cartel selling access to it.

Given the costs, and the amount of money already burnt building them, these companies need to hook as many people as possible as fast as possible, and get them addicted enough that when they raise prices 100x to a sustainable level, their victims won't be able to leave.

And they need to do this fast, because the money is running out.