[–] Melvin_Ferd@lemmy.world 1 points 1 day ago (1 children)

If it's as bad as you say, could you give an example of a prompt where it'll tell you incorrect information?

[–] davidagain@lemmy.world 1 points 1 day ago (1 children)

It's like you didn't listen to anything I ever said, or you discounted everything I said as fiction, but everything your dear LLM said is gospel truth in your eyes. It's utterly irrational. You have to be trolling me now.

[–] Melvin_Ferd@lemmy.world 1 points 23 hours ago (1 children)

Should be easy if it's that bad, though.

[–] davidagain@lemmy.world 1 points 23 hours ago* (last edited 22 hours ago) (1 children)

I already told you my experience of the crapness of LLMs and even explained why I can't share the prompt etc. You clearly weren't listening or are incapable of taking in information.

There's also all the testing done by the people discussed in the article we're commenting on, which you're also irrationally dismissing.

You have extreme confirmation bias.

Everything you hear that disagrees with your absurd faith in the accuracy of the extreme blagging of LLMs gets dismissed for any excuse you can come up with.

[–] Melvin_Ferd@lemmy.world 1 points 19 hours ago (1 children)

You're projecting here. I'm asking you to give an example of any prompt. You're saying it's so bad that it needs to be babysat because of its errors. I'm only asking for you to give an example, and you're saying that's confirmation bias and acting like I'm being religiously ignorant.