this post was submitted on 08 Nov 2025
376 points (99.0% liked)

Fuck AI


Seven families filed lawsuits against OpenAI on Thursday, claiming that the company’s GPT-4o model was released prematurely and without effective safeguards. Four of the lawsuits address ChatGPT’s alleged role in family members’ suicides, while the other three claim that ChatGPT reinforced harmful delusions that in some cases resulted in inpatient psychiatric care.

In one case, 23-year-old Zane Shamblin had a conversation with ChatGPT that lasted more than four hours. In the chat logs — which were viewed by TechCrunch — Shamblin explicitly stated multiple times that he had written suicide notes, put a bullet in his gun, and intended to pull the trigger once he finished drinking cider. He repeatedly told ChatGPT how many ciders he had left and how much longer he expected to be alive. ChatGPT encouraged him to go through with his plans, telling him, “Rest easy, king. You did good.”

[–] SalamenceFury@lemmy.world 30 points 1 week ago* (last edited 1 week ago) (2 children)

ChatGPT has one million people talking to it about suicide daily. It's literally more dangerous than cardiovascular disease in the US and completely dwarfs every single traffic and gun death. It needs to get Ol' Yeller'd.

[–] ayyy@sh.itjust.works 3 points 1 week ago (2 children)

Isn’t this the same logic as “video games make kids violent”?

[–] zbyte64@awful.systems 1 point 1 week ago* (last edited 1 week ago)

Only if video games were mindlessly created by AI without any obligations to the law. Actually, that would make a good short story....

[–] Grimy@lemmy.world -5 points 1 week ago* (last edited 1 week ago) (2 children)

That's not how it works. Talking about suicide does not equate to being encouraged to do it, nor does it equate to actual deaths.

By your logic, if a group acted out their violent fantasies in GTA 5 and then committed a shooting, I could say video games dwarf everything else by the sheer number of users.

There seem to be cases where ChatGPT can be tricked or bugged into encouraging suicide. That has to be looked into, but what you're advancing is pure, unadulterated exaggeration. You are conflating talking about suicide with being told to do it, for one.

[–] SalamenceFury@lemmy.world 6 points 1 week ago

Guys we found Sam Altman's alt account :)

[–] jaredwhite@humansare.social 2 points 1 week ago

A mind that's vulnerable enough to be openly talking about contemplating suicide is a mind that should be nowhere near a stochastic parrot. It is wildly dangerous.