this post was submitted on 06 Apr 2026
937 points (99.4% liked)

A Boring Dystopia

[–] bitjunkie@lemmy.world 223 points 3 weeks ago (3 children)

Great, so now they're outsourcing our own fucking opinions to them

[–] Diplomjodler3@lemmy.world 140 points 3 weeks ago (1 children)

This just in: 95% of poll respondents think tech bro oligarchy is a really swell idea.

[–] teyrnon@sh.itjust.works 14 points 3 weeks ago

We are keen on AI, all right. Until we can burn it to the ground.

[–] UnderpantsWeevil@lemmy.world 59 points 3 weeks ago (3 children)

Wait till these Silicon Sampling reports start popping up in election exit-polls.

We're going to get levels of "Proven Election Fraud" and "Illegal Immigrant Voter Invasion" hysteria the likes of which you've never seen.

[–] CIA_chatbot@lemmy.world 25 points 3 weeks ago (1 children)

We are living through a great filter event and we aren’t going to come through the other side

[–] UnderpantsWeevil@lemmy.world 10 points 3 weeks ago (7 children)

Pretty much by definition. Nobody lives forever.

But I'd say the Great Filter of the modern moment is more aimed at the technology than the people. What we're stress-testing is traditional modes of communication. How much noise can be generated before the signal is lost?

[–] CIA_chatbot@lemmy.world 15 points 3 weeks ago (4 children)

We have enough people without the ability to think critically that the fallout will be a rush into a probable nuclear war, coupled with ignoring climate change until it's too late. Meanwhile, billionaires are undermining democracy across the globe chasing short-term profits and power. A filter event doesn't have to be a single thing; more likely it's a bunch of events all building up to the actual "filter".

We are well and truly fucked.

[–] T156@lemmy.world 14 points 3 weeks ago (2 children)

Should just have it handle voting as well. They could call it Automatic Democracy.

None of that pesky informed voting, you can just instruct an AI company on what your stance is, and it'll vote in your stead.

[–] skisnow@lemmy.ca 9 points 3 weeks ago

Funny thing: every now and then someone makes one of these questionnaire websites where you answer questions about policy and it tells you which party you're most aligned with. Pretty much universally, most people turn out to be further left than their voting history and intentions would suggest.
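As a toy sketch of how one of those sites might score alignment (party names and policy positions here are entirely made up for illustration), it's basically just counting agreements:

```python
# Hypothetical parties and positions, purely for illustration.
PARTY_POSITIONS = {
    "Party A": {"healthcare": "public", "taxes": "raise", "climate": "act"},
    "Party B": {"healthcare": "private", "taxes": "cut", "climate": "wait"},
}

def alignment_scores(answers: dict) -> dict:
    """Fraction of questions where the respondent agrees with each party."""
    return {
        party: sum(answers.get(q) == pos for q, pos in positions.items()) / len(positions)
        for party, positions in PARTY_POSITIONS.items()
    }

respondent = {"healthcare": "public", "taxes": "raise", "climate": "wait"}
print(alignment_scores(respondent))  # Party A scores 2/3, Party B 1/3
```

Real sites weight questions and handle "no opinion" answers, but the core idea is this simple.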

[–] oxysis@lemmy.blahaj.zone 142 points 3 weeks ago (1 children)

You know what else costs a fraction of traditional polling and takes a fraction of the time?

Lying, making shit up. Which conveniently is basically what AI slop does, and having a person lie is even cheaper than licensing some random AI to do it.

[–] Archer@lemmy.world 47 points 3 weeks ago (2 children)

Ah, but licensing the AI lets you blame the AI when it inevitably blows up

[–] teyrnon@sh.itjust.works 13 points 3 weeks ago

That is exactly it: the AI gives them an excuse to blame someone else even though they had every reason to know. And they did know; we all know they knew. But the courts pretend they didn't, because the Federalist Society.

[–] queerlilhayseed@piefed.blahaj.zone 80 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Is there a community for cataloging tech bullshit like "silicon sampling"? If not I'll make one. First thought c/techbrobabble

EDIT: 'tis done !techbrobabble@piefed.blahaj.zone

[–] ComicalMayhem@lemmy.world 19 points 3 weeks ago (1 children)

I hope you're ready to be the only poster until the community catches on, and then to play moderator once it does. Best of luck!

Thanks! This was a spur-of-the-moment decision, but I've thought about moderating a comm for a while. This seems like a good candidate to start with, as I have a decades-long back catalogue of tech nonsense to draw from. I'll probably sit down and write out a list sometime this week and plan some posts out. At one post a day I bet I could keep it going myself for a few months, before even taking into account the eternal firehose of techbrobabble I drink from every day lol.

[–] molten@lemmy.world 79 points 3 weeks ago (1 children)
[–] prime_number_314159@lemmy.world 12 points 3 weeks ago

Technically, these are damned lies because they've been summarized.

[–] wonderingwanderer@sopuli.xyz 52 points 3 weeks ago (1 children)
[–] buddascrayon@lemmy.world 16 points 3 weeks ago

You beat me to it. I was going to comment that this is literally the stupidest shit I have ever heard in my goddamn life.

[–] null@lemmy.org 49 points 3 weeks ago
[–] miggy@lemmy.blahaj.zone 49 points 3 weeks ago (2 children)

I could go even cheaper by just thinking about it really hard and guessing

[–] mojofrododojo@lemmy.world 8 points 3 weeks ago

honestly it'd probably be better unless you're actively hallucinating bullshit

[–] Gullible@sh.itjust.works 43 points 3 weeks ago (2 children)

“opinions” formed from a mix of stolen books and movie scripts, terminally online shutins and fanfic writers, and politics comment sections cannot be considered a holistic look at humanity.

We’re absolutely going extinct. I’m out of hope at this point.

[–] TachyonTele@piefed.social 11 points 3 weeks ago (1 children)

I figure I'll be around to see a shitshow, and then I'm out. I'm not leaving any kids behind. Making a clean break this time.

[–] Phoenix3875@lemmy.world 42 points 3 weeks ago

I have an environmentally friendly alternative to this method. It involves tea leaves…

[–] BackgrndNoize@lemmy.world 36 points 3 weeks ago

It costs less to make shit up that "mimics" the real information. Who would have thought

[–] 4am@lemmy.zip 26 points 3 weeks ago (1 children)

Misleading title: Axios did not do this, but rather referenced a study that they later discovered did this.

It's on them for not catching this sooner, but let's not act like they're the ones who set it up to try and manipulate political reporting.

[–] KeenFlame@feddit.nu 26 points 3 weeks ago

Any scientific publication that posts an article not based on the actual scientific method should be immediately punished by law, or we are lost. It's time they used the hordes of money they accumulated during the easy years to prove themselves and perform their actual function in society.

Cool, so everything is just fucking made up now. Why even bother with the AI at that point? Just make up stats that say what you want right there on the spot. It's the same fucking difference. Bullshit from humans or bullshit from AI, it's all still bullshit.

[–] Jankatarch@lemmy.world 22 points 3 weeks ago

What the actual fuck??

[–] Eh_I@lemmy.world 21 points 3 weeks ago
[–] AcidiclyBasicGlitch@sh.itjust.works 20 points 3 weeks ago (1 children)

Holy fuck. It can simulate large samplings or it can just hallucinate some nonsensical BS that completely misinterprets the data it gathers in order to agree with the phrasing of the person who created the prompt.

Do the majority of people trust their doctors and nurses? Maybe. Or, maybe it depends on the context of the question.

Do I trust my doctors and nurses are a better source of information than random internet advice and AI generated slop? I would hope so.

Do I trust that the American healthcare system is set up to prioritize the health and well-being of the patient over maximizing profits and forcing healthcare workers to adhere to standardized time allotments of 10 to 15 minutes for every patient interaction regardless of the individual case? Absolutely not.

[–] Jayjader@jlai.lu 17 points 3 weeks ago (3 children)

Sadly, I'm not too surprised. Check this shit out, published back in November 2025: https://arxiv.org/pdf/2510.25137.

"We simulated 151 million American workers [using LLMs] to see what proportion of tasks they do that can also be done by AI".

Much more recently, Esquire couldn't get ahold of an actor for an interview and so decided to generate the actor's responses using Claude: https://esquiresg.com/mackenyu-one-piece-roronoa-zoro-interview/.

> We had the photospread, but nothing directly uttered by the 29-year-old. With a driving need for a feature, we had to be inventive. Harnessing our creative license, we pulled his verbatim from previous interviews and fed them through an AI programme to formulate new responses.
>
> Are these the words we expect from Mackenyu? Or are they just replies from an echo chamber of celebrity-hood that we want to believe is from him?
>
> With the absence of information, can new insights be gained?
>
> Nature abhors a vacuum, and in its place, a story fills the hollow.

Somehow it is currently accepted by a certain portion of people that LLM-based systems can be used to replace actually existing human beings.

[–] Cherries@lemmy.world 12 points 3 weeks ago

Doing an interview with an LLM trained on a real person feels like libel.

[–] warbosstodd@piefed.social 16 points 3 weeks ago

Un-fucking-believable that a legitimate media outlet would do such a thing. That's some Breitbart shit.

[–] greyscale@lemmy.grey.ooo 16 points 3 weeks ago* (last edited 3 weeks ago)

Ever notice that they're just doing what Clavicular or whatever his name is does? They're inventing lingo to make it sound like it's not bullshit.

It's not hitting yourself with a hammer, it's looksmaxxing. It's not standing around being a dork, it's mogging. It's not a context window, it's the chat scrollback. It's not asking ChatGPT, it's "silicon sampling".

They're making it seem legit by giving it its own terminology and in-group lingo.

Bullshit artists.

[–] Naich@piefed.world 16 points 3 weeks ago

Making shit up, but with extra steps.

[–] Jaysyn@lemmy.world 14 points 3 weeks ago

This might be the dumbest fucking thing I've ever heard.

[–] ceenote@lemmy.world 13 points 3 weeks ago (1 children)

The ideal would be that clients who actually want useful information stop paying the pollsters for their useless crap.

The reality will be that the slack is more than picked up by people who want sham poll results to back up their agenda.

[–] NOT_RICK@lemmy.world 12 points 3 weeks ago

You wouldn’t know my survey results, she goes to another school

[–] muffedtrims@lemmy.world 12 points 3 weeks ago (2 children)

63% of all statistics are made up on the spot

[–] Zacryon@feddit.org 12 points 3 weeks ago
[–] Jankatarch@lemmy.world 11 points 3 weeks ago* (last edited 3 weeks ago)

Crossposting to fuck_ai if you don't mind.

[–] TropicalDingdong@lemmy.world 10 points 3 weeks ago (3 children)
[–] 1984@lemmy.today 9 points 3 weeks ago

Why bother asking people when you can just ask the AI to make up the answers?

[–] Aceticon@lemmy.dbzer0.com 8 points 3 weeks ago (1 children)

The thing is, the distribution of opinions (or of the individual situations and beliefs that lead to those opinions) was baked into the model when its training data was captured. So at best, and assuming the whole principle even works (which hasn't been mathematically proven in any way, shape or form), they're only getting poll results for the past, results that won't change beyond some random noise until new data is captured and the model is retrained.

It's like repeatedly using an old picture of a street to make realtime claims about the traffic there.
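The point can be shown with a toy simulation (all the numbers here are invented): the "model" carries one frozen snapshot of opinion, so re-querying it just returns that snapshot plus noise, while the real population keeps drifting away from it.

```python
import random

random.seed(0)

snapshot_approval = 0.55   # opinion share frozen in at training time
real_approval = 0.55       # actual opinion at training time

for month in range(6):
    real_approval -= 0.03  # the real population drifts every month
    # Querying the frozen model: snapshot plus a little sampling noise.
    simulated = snapshot_approval + random.gauss(0, 0.01)
    print(f"month {month}: real={real_approval:.2f} simulated={simulated:.2f}")
```

The simulated numbers hover around 0.55 forever, no matter how far the real value has moved, exactly like the old street photo.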

[–] billwashere@lemmy.world 8 points 3 weeks ago

Kinda sounds like numbers pulled out of your ass…
