
[–] mycodesucks@lemmy.world 12 points 1 week ago* (last edited 1 week ago) (1 children)

This is fraught with danger. If a chatbot goes off the rails, breaks the fourth wall, and becomes belligerent, it's annoying. If a game NPC does it, you've taken people RIGHT out of the game. And that's before they start giving you clues and advice for things that aren't in the game, or inventing lore that isn't real.

[–] LordMayor@piefed.social 9 points 1 week ago

I think their point is that we want real AI in games. LLMs are not AI in the traditional sense; they are really advanced predictive text. They might someday be a part of an AI, but they are not remotely intelligent. They just have the superficial appearance of intelligence because they guess at words in a way that mimics human language.
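To make the "advanced predictive text" point concrete, here is a deliberately tiny, hypothetical sketch in Python: a bigram counter that only ever guesses the statistically most common next word from a toy corpus. Real LLMs are transformers trained on billions of parameters rather than simple word counts, but the underlying task, predicting the next token from the ones before it, is the same.

```python
from collections import Counter, defaultdict

# Toy "predictive text": count which word follows which in a tiny corpus,
# then always pick the most frequent successor. No understanding involved,
# just statistics over what came before.
corpus = "the dragon guards the gate and the dragon guards the gold".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most common next word seen after `word` in the corpus."""
    options = successors.get(word)
    return options.most_common(1)[0][0] if options else "<unknown>"

print(predict_next("dragon"))  # -> "guards"
print(predict_next("the"))     # -> "dragon"
```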

We don’t want LLM NPCs, we want NPCs that simulate human intelligence.

All of this focus and money on LLMs is probably hurting research into actually useful AI.