this post was submitted on 27 Apr 2026
1223 points (98.6% liked)
[–] mech@feddit.org 95 points 4 days ago (5 children)

It's so weird how these chatbots always pretend they learnt something after they fuck up.
They literally can't.

[–] frongt@lemmy.zip 30 points 3 days ago (1 children)

They're not even pretending. The algorithm says the most likely response to "you fucked up" is "I'm sorry", so that's what it prints. There's zero psychological simulation going on, only statistical text generation.
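To illustrate the point, here is a deliberately oversimplified toy sketch (not how any real model works internally, and the "training data" is made up): a "chatbot" that just returns the reply it has most frequently seen after a given prompt. There's nothing to learn from, only a frequency lookup.

```python
from collections import Counter

# Hypothetical training pairs (prompt, reply) -- purely illustrative.
corpus = [
    ("you fucked up", "I'm sorry"),
    ("you fucked up", "I'm sorry"),
    ("you fucked up", "my mistake"),
    ("hello", "hi there"),
]

def most_likely_reply(prompt):
    # Count every reply ever seen after this prompt and emit the most
    # frequent one. No reflection, no memory of this exchange.
    replies = Counter(r for p, r in corpus if p == prompt)
    return replies.most_common(1)[0][0]

print(most_likely_reply("you fucked up"))  # "I'm sorry"
```

Real models predict one token at a time over a huge vocabulary rather than whole canned replies, but the mechanism is the same in spirit: pick what's statistically likely given the input, nothing more.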

[–] Hacksaw@lemmy.ca 21 points 3 days ago

I actually didn't believe you but it's literally true. First post, immediate apology.

[–] ech@lemmy.ca 29 points 4 days ago

The program can't pretend any more than it can tell truth. It's all just impressive regurgitation. Querying it as to why it "chose" to take any action is about as useful as interrogating a boulder on why it "chose" to roll through a house.

[–] SkaveRat@discuss.tchncs.de 22 points 4 days ago

I mean, they probably do, until it gets purged from the context window. Then it just yolos again.
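The context window behaves roughly like a sliding buffer: once the conversation outgrows it, the oldest turns fall off the front, correction included. A minimal sketch, with a made-up turn limit:

```python
CONTEXT_LIMIT = 3  # hypothetical limit, measured in turns for simplicity

def visible_context(history, limit=CONTEXT_LIMIT):
    # The model only ever sees the most recent `limit` turns;
    # anything earlier simply no longer exists as far as it's concerned.
    return history[-limit:]

history = [
    "do X",
    "you fucked up",
    "I'm sorry, I'll remember that!",
    "now do Y",
    "also do Z",
    "one more thing",
]

# The apology (and the mistake it "learned" from) has scrolled out of view.
print(visible_context(history))
```

Real systems truncate by tokens rather than whole turns, but the effect is the same: the "lesson" is gone the moment it leaves the buffer.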

[–] thisbenzingring@lemmy.today 2 points 4 days ago

The next training ingestion cycle will probably pick it up, but how do we know it'll use the information in any relevant way? 😶