Great, so now they're outsourcing our own fucking opinions to them
A Boring Dystopia
Pictures, Videos, Articles showing just how boring it is to live in a dystopic society, or with signs of a dystopic society.
Rules (Subject to Change)
--Be a Decent Human Being
--Posting news articles: include the source name and exact title from article in your post title
--If a picture is just a screenshot of an article, link the article
--If a video's content isn't clear from title, write a short summary so people know what it's about.
--Posts must have something to do with the topic
--Zero tolerance for Racism/Sexism/Ableism/etc.
--No NSFW content
--Abide by the rules of lemmy.world
This just in: 95% of poll respondents think tech bro oligarchy is a really swell idea.
We are keen on AI alright. Until we can burn it to the ground.
Wait till these Silicon Sampling reports start popping up in election exit-polls.
We're going to get levels of "Proven Election Fraud" and "Illegal Immigrant Voter Invasion" hysteria the likes of which you've never seen.
We are living through a great filter event and we aren’t going to come through the other side
Should just have it handle voting as well. They could call it Automatic Democracy.
None of that pesky informed voting, you can just instruct an AI company on what your stance is, and it'll vote in your stead.
You know what else costs a fraction of traditional polling and takes a fraction of the time?
Lying, making shit up. Which conveniently is basically what AI slop does, and having a person lie is even cheaper than licensing some random AI to do it.
Ah, but licensing the AI lets you blame the AI when it inevitably blows up
That is exactly it: the AI gives them an excuse to blame someone else even when they had every reason to know, and they did know. We all know they know, but the courts pretend like they didn't because the Federalist Society.
I could go even cheaper by just thinking about it really hard and guessing
honestly it'd probably be better unless you're actively hallucinating bullshit
Hey at least this is one poll result rather than zero
Ah, yes, lies.
Technically, these are damned lies because they've been summarized.
Sadly, I'm not too surprised. Check this shit out, published back in November 2025: https://arxiv.org/pdf/2510.25137.
"We simulated 151 million American workers [using LLMs] to see what proportion of tasks they do that can also be done by AI".
Much more recently, Esquire couldn't get ahold of an actor for an interview and so decided to generate the actor's responses using Claude: https://esquiresg.com/mackenyu-one-piece-roronoa-zoro-interview/.
We had the photospread, but nothing directly uttered by the 29-year-old. With a driving need for a feature, we had to be inventive. Harnessing our creative license, we pulled his verbatim from previous interviews and fed them through an AI programme to formulate new responses.
Are these the words we expect from Mackenyu? Or are they just replies from an echo chamber of celebrity-hood that we want to believe is from him?
With the absence of information, can new insights be gained?
Nature abhors a vacuum, and in its place, a story fills the hollow.
Somehow it is currently accepted by a certain portion of people that LLM-based systems can be used to replace actually existing human beings.
Doing an interview with an LLM trained on a real person feels like libel.
What the absolute fuck. Even if I was pro-AI, I'd find this to be incredibly unethical.
Lately, I've had some coworkers empowered by AI in really cool ways, building mockups using code they can't personally write to present to me, an actual engineer. Not with the expectations of a final product, but to express their thoughts and ideas and how they would envision a project moving forward. I think that's a really cool and exciting use of AI, allowing non-technical people to better communicate with technical people.
Then I see crap like this and think "we need to burn this shit down immediately."
Is there a community for cataloging tech bullshit like "silicon sampling"? If not I'll make one. First thought c/techbrobabble
EDIT: 'tis done !techbrobabble@piefed.blahaj.zone
I hope you're ready to be the only poster until the community catches on, and then that you're ready to play moderator. best of luck
Thanks! This was a spur of the moment decision, but I have thought about moderating a comm for a while. This seems like a good candidate to start with, as I have a decades-long back catalogue of tech nonsense to draw from. I'll probably sit down and write out a list sometime this week and plan some posts out. At one post a day I bet I could keep it going myself for a few months, before I take into account the eternal firehose of techbrobabble I drink from every day lol.
That's fucking stupid.
You beat me to it. I was going to comment that this is literally the stupidest shit I have ever heard in my goddamn life.

This might be the dumbest fucking thing I've ever heard.
Any scientific publication accidentally posting an article not based on the actual scientific method should be immediately punished by law, or we are lost. It's time they used the hoards of money they accumulated during the easy part to prove themselves and perform their actual function in society.
I have an environment friendly alternative to this method. It involves tea leaves…
“opinions” formed from a mix of stolen books and movie scripts, terminally online shutins and fanfic writers, and politics comment sections cannot be considered a holistic look at humanity.
We’re absolutely going extinct. I’m out of hope at this point.
It costs less to make shit up that "mimics" the real information. Who would have thought
What the actual fuck??
Why bother asking people when you can just ask AI to make answers.
Misleading title: Axios did not do this, but rather referenced a study that they later discovered did this.
It's on them for not learning this sooner, but let's not act like they're the ones who set it up to try and manipulate political reporting.

Holy fuck. It can simulate large samplings or it can just hallucinate some nonsensical BS that completely misinterprets the data it gathers in order to agree with the phrasing of the person who created the prompt.
Do the majority of people trust their doctors and nurses? Maybe. Or, maybe it depends on the context of the question.
Do I trust my doctors and nurses are a better source of information than random internet advice and AI generated slop? I would hope so.
Do I trust that the American healthcare system is set up to prioritize the health and well-being of the patient over maximizing profits and forcing healthcare workers to adhere to standardized time allotments of 10 to 15 minutes for every patient interaction regardless of the individual case? Absolutely not.
Cool, so everything is just fucking made up now. Why even bother with the AI at that point? Just make up stats that say what you want right there on the spot. It's the same fucking difference. Bullshit from humans or bullshit from AI, it's all still bullshit.
Un-fucking-believable that a legitimate media outlet would do such a thing. That's some Breitbart shit.
You wouldn’t know my survey results, she goes to another school

Ever notice that they're just doing what Clavicular or whatever his name is doing? They're inventing lingo to sound like it's not bullshit.
It's not hitting yourself with a hammer, it's looksmaxxing. It's not standing around being a dork, it's mogging. It's not the chat scrollback, it's a "context window". It's not asking ChatGPT, it's "silicon sampling".
They're making it seem legit by giving it its own terminology and in-group lingo.
Making shit up, but with extra steps.
The ideal would be that clients who actually want useful information stop paying the pollsters for their useless crap.
The reality will be that the slack is more than picked up by people who want sham poll results to back up their agenda.