this post was submitted on 25 Jul 2025
453 points (98.3% liked)
A Boring Dystopia
you are viewing a single comment's thread
They have no idea what LLMs are if they think LLMs can be forced to be "truthful". An LLM has no concept of "truth"; it simply uses its inputs to predict what it thinks you want to hear, based upon the data given to it. It doesn't know what "truth" is.
They are clearly incompetent.
That said, generally speaking, pursuing a truth-seeking LLM is sensible, and it can be done. What is surprising is that no one is currently doing it.
A truth-seeking LLM needs ironclad data. It cannot scrape social media at all. It needs a training incentive that values truth above satisfying the user, which makes it incompatible with profit-seeking organizations. It needs to tell a user "I do not know" and "You are wrong," among other user-displeasing phrases.
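As a toy illustration of that incentive structure (everything here is hypothetical, not any real training objective): a reward function can penalize confident wrong answers more than it penalizes admitting ignorance, so "I do not know" becomes the rational output when the model is unsure.

```python
def truth_reward(answer: str, correct: str, confidence: float) -> float:
    """Toy reward: calibrated honesty beats confident error.

    `confidence` is the model's stated confidence in [0, 1].
    """
    if answer == "I do not know":
        return 0.1                   # small credit for admitting ignorance
    if answer == correct:
        return confidence            # reward scales with stated confidence
    return -2.0 * confidence         # confident wrong answers cost the most
```

Under this scoring, a model that is 50% sure does better saying "I do not know" (+0.1) than gambling on a wrong answer (-1.0), which is the opposite of the "always satisfy the user" incentive.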
To get that data, you need a completely restructured society. Information must be open source. All information needs cryptographically signed origins, ultimately traceable to a credentialed source. Where possible, the information needs physical observational evidence ("reality anchoring").
That's the short of it. In other words, with the way everything is going, we will likely not see a "real" LLM in our lifetime. Society is degrading too rapidly, and all the money is flowing toward making LLMs compliant. Truth-seeking is a very low priority for people, so it is a low priority for the machines those people make.
But the concept itself? Actually a good one, if the people saying it actually knew what "truth" meant.
How are you going to accomplish this when there is disagreement on what is true? "Fake news."
"Real" truth is ultimately anchored to reality. You attach probabilities to datapoints based upon that reality anchoring, and include truthiness as another parameter.
Datapoints that are unsubstantiated or otherwise immeasurable are excluded. I don't need an LLM to comment on gossip or human-created issues. I need a machine that can assist in understanding and molding the universe, helping elevate our kind. Elevation is a matter of understanding the truths of our universe and ourselves.
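Concretely, the filtering rule above could be sketched like this (the field names and threshold are my own invention, just to make the idea runnable):

```python
from dataclasses import dataclass

@dataclass
class Datapoint:
    claim: str
    truthiness: float       # estimated probability the claim matches reality, 0.0-1.0
    reality_anchored: bool  # backed by physical observational evidence?

def filter_corpus(corpus: list[Datapoint], threshold: float = 0.9) -> list[Datapoint]:
    """Keep only reality-anchored datapoints above the truthiness threshold.

    Unsubstantiated or immeasurable claims are excluded entirely rather than
    down-weighted, so gossip never enters the training set.
    """
    return [d for d in corpus if d.reality_anchored and d.truthiness >= threshold]
```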
With good data, good extrapolations are more likely.