this post was submitted on 05 Jun 2025
945 points (98.9% liked)

Not The Onion

[–] gwildors_gill_slits@lemmy.ca 2 points 2 days ago

You're not wrong, but there's also a ton of misinformation out there, from both bad journalism and pro-LLM advocacy, selling the idea that LLMs are actually real AI that can think and reason, and that they operate within some kind of ethical boundaries.

Neither of those things is true, but that's what a lot of the available information about LLMs would have you believe. So it's not hard to imagine someone engaging with a chatbot and ending up with a similar result without explicitly forcing it via prompt engineering.