AI summaries put another layer of interpretation between the reader and the source material. When having accurate, properly sourced information matters, they're just not trustworthy enough. At least Wikipedia tells you when material is potentially biased or improperly sourced. Search AI confidently asserts its summaries as though they were factual, regardless of how reliable or unreliable its own sources are.
So long as the citations are there, I'm not usually taking the summary at its word. I find searching "hard to Google" terms easier with AI.
When having accurate and properly sourced material matters, I hope you're not trusting the descriptions of citations laid out by Wikipedia editors, who are also just another layer of interpretation. It's always worth a double-check.
I've been an editor on Wikipedia for decades now. I've followed sources to clarify information, fix broken links, and remove inaccurate information. I know how it works.
That's exactly my point. Wikipedia is transparent about where it gets its information. You can double-check citations, and if the citations don't exist or don't support a relevant claim, you can discard them (or edit them to flag that fact, or go above and beyond to provide a new source, if you're so willing). With AI summaries, you can't do any of that. You're given a summation without automatic citations (or sometimes with bogus, made-up ones), and you can't do anything to correct any misinformation you encounter. Maybe you can report it, but you can't do anything in real time to prevent others from finding that same inaccurate information - not in the way you can immediately correct an inaccuracy on Wikipedia.
Same. But now this is a different topic.
For something like Perplexity under Brave, where you're given inline citations, yeah, go follow them and get to an authoritative source faster.
We didn't start with "I can't submit an updated review if I find mistakes"; we started at "there's another unnecessary layer of indirection." Which, sure, but it's hardly different from getting a start with a Medium article of "best xxx of 2025" or, yes, a Wikipedia page. It may not be to your taste, but I've had some occasions where it's convenient.
They make them up, and they don't source them properly.
Go ask perplexity.ai a programming or troubleshooting question and then follow a cited link. I assure you they are not all made up.
AI fabricates citations.
Not every citation is fake or irrelevant. On Wikipedia it's "citation needed" or "page does not exist". Same problems.
All you have to do is click it or search again.
But hey, if you prefer the old-fashioned way of opening every returned search result from page 1 to page 6 until you just search again anyway, go ahead and do that. I'll deal with sifting through occasional bad advice in an eighth of the time.
This ^.
I think people forget the fabled "old" internet was actually a pile of trolls, where one had to double-check what they read.
Basic sanity checks really aren't that hard. But it's a forgotten habit, I guess.
"oh my god, AI makes shit up!"