this post was submitted on 31 May 2025
418 points (98.4% liked)

Fuck AI

3026 readers
606 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago
top 50 comments
[–] the_q@lemm.ee 12 points 6 days ago

AI ingesting AI slop and falling apart is not dissimilar to boomers ingesting rightwing slop and falling apart.

[–] AceFuzzLord@lemm.ee 7 points 6 days ago (1 children)

If I had the money and a computer able to handle the amount of stuff I'd be throwing at it with a local model, I would have a giant website full of AI-generated nonsense, purely for the purpose of letting AI gobble it up and feeding the AI incest problem.

Imagine if a whole metric ton of "websites" did this. The thieving AI companies would either have to start blocking all of these sites or deal with an issue they don't want to deal with, because they're too stingy and will probably just have their AI try (and fail) to fix the problem.

[–] ThefuzzyFurryComrade@pawb.social 5 points 6 days ago (1 children)
[–] AceFuzzLord@lemm.ee 1 points 6 days ago

That was certainly cool. Now I wish I could use tools like that Nepenthes on my Neocities page.

[–] Plesiohedron@lemmy.cafe 4 points 6 days ago* (last edited 6 days ago)

Human society does the same thing.

We're ok when we talk about what we saw.

Less so when we talk about what somebody else saw.

Crazier and crazier when we talk about what somebody said about what somebody said about what somebody saw. Which is arguably the internet.

[–] Phen@lemmy.eco.br 92 points 1 week ago (1 children)

Another problem I realized today is the proliferation of data that was originally hallucinated by AI.

I was discussing a software issue with a coworker, and he asked an AI for help configuring around it. He then sent me "apparently we can try changing this setting to this value". I told him to first validate that the setting really existed, because AI tends to make up things like that when it's what you would want to hear, and running a test would take us 20-30 minutes.

He found some discussions about that setting not working as people expected. "Ok, at least it exists then," and we tried it. It didn't work. I later cloned the source of that software and checked: the setting didn't exist - ever.

[–] alaphic@lemmy.world 85 points 1 week ago (1 children)

I love that you even specifically said, "Yeah, let's check to make sure that setting exists to begin with." To which, instead of actually fucking checking, they proceeded to google more about the setting and used someone else's online 'discussion' of it not working as proof that it does exist, even though they were likely having that discussion because the setting didn't exist.

This is also how I can tell this story is 100% true.

I don't miss working support at all, and things like this remind me of it daily.

[–] NaibofTabr@infosec.pub 63 points 1 week ago (2 children)

Garbage in, garbage out.

Who could have possibly predicted that?

[–] Kornblumenratte@feddit.org 10 points 1 week ago (1 children)

The recycling industry begs to differ. Well, exceptions prove the rule.

[–] jagged_circle@feddit.nl 8 points 1 week ago (5 children)

Depends on what materials you're recycling. Glass and plastic both require virgin material, else you'd get garbage out.

[–] dutchkimble@lemy.lol 15 points 1 week ago* (last edited 1 week ago)

Paper works up to six times at best, if someone is able to track the same batch. But recycling paper uses more energy than using virgin pulp, and if the virgin paper is sourced from a sustainable place like Canada, then recycled paper is actually worse for the environment because of the energy use and the de-inking water waste. Also, the timber is actually cut for housing, and only the edges of the logs are used to make chips for papermaking, so trees aren't being cut solely for paper (in sustainable countries). Until we meet again (insert Skeletor running away)

[–] Churbleyimyam@lemm.ee 43 points 1 week ago (3 children)

It's very tempting to feel schadenfreude about this failure, but it's also disgusting that so much has been invested in it that should have been put to better use.

It's just another example of a system whose narrow definition of success is taking human and environmental value and using it to extract more. It's not aimed at solving worthwhile problems or making things better, which is why people are becoming more miserable and the planet is getting wrecked.

You could say that the system we live in is itself the AI, feeding on itself and becoming sicker.

[–] chonglibloodsport@lemmy.world 9 points 1 week ago* (last edited 1 week ago) (1 children)

The schadenfreude is what we’re here for! We can’t do anything about the waste of investors’ money. They could’ve spent it all on fireworks instead. That probably would’ve been more fun!

As for the system? I prefer not to think about it. Too much systemic thinking is bad for mental health. Much better to enjoy some schadenfreude and save your serious thinking energy for things you have the power to change, especially where they can make life better for you and those around you.

[–] Churbleyimyam@lemm.ee 8 points 1 week ago

I agree with all your points! What I will add, though, is that what we think of as 'investors' money' is actually value that has been extracted from the environment and from workers.

[–] Blaster_M@lemmy.world 35 points 1 week ago* (last edited 1 week ago) (5 children)

So, reading this article, it's not about model collapse but about RAG - essentially letting the AI model google the question. The problem is that the first 10 pages of Google search results are all low-effort ad-farming slop sites, because of course they are, which makes the AI's answers worse, since these slop sites often have incorrect or otherwise unproofed articles, which biases the AI toward spitting out the wrong answer.
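
For anyone unfamiliar with the pattern, here's a minimal sketch of what that retrieval step looks like. The function names are placeholders, not any real service's API; a real system would call a search engine and a hosted model.

```python
# Rough sketch of retrieval-augmented generation (RAG): fetch search
# results, paste them into the prompt, and let the model answer from them.
# If the top results are SEO slop, the slop goes straight into the prompt.

def search_web(query: str, top_k: int = 3) -> list[str]:
    # Placeholder for a real search API call plus page scraping.
    return [f"[stub result {i} for: {query}]" for i in range(top_k)]

def ask_llm(prompt: str) -> str:
    # Placeholder for a call to whatever language model is in use.
    return f"[stub answer conditioned on a {len(prompt)}-character prompt]"

def rag_answer(question: str) -> str:
    snippets = search_web(question)
    # The retrieved text is inserted verbatim, so any errors in the source
    # pages are inherited by the answer: garbage in, garbage out.
    context = "\n\n".join(snippets)
    prompt = f"Use the sources below to answer.\n\nSources:\n{context}\n\nQuestion: {question}"
    return ask_llm(prompt)

if __name__ == "__main__":
    print(rag_answer("how do I change this setting?"))
```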

I'm sure the major AI services will try and fix this with some slop site detection routines.

[–] frunch@lemmy.world 16 points 1 week ago (1 children)

I'm sure the major AI services will try and fix this with some slop site detection routines.

Which will be run by AI 🙃

[–] melechric@lemmy.world 11 points 1 week ago (1 children)

Don't forget! A lot of the slop on those first few pages of results is AI-generated.

Ouroboros is a very apt moniker for this phenomenon.

[–] aesthelete@lemmy.world 33 points 1 week ago (1 children)

Good. Eat yourself, you technological prion disease.

[–] masta_chief@sh.itjust.works 7 points 1 week ago

This is a rare insult and I like it

[–] Showroom7561@lemmy.ca 31 points 1 week ago

Good. Poison the AI well. Rot this shit to the ground.

[–] _druid@sh.itjust.works 30 points 1 week ago (2 children)

Aww, boo hoo, did someone generate a degenerative feedback loop? Yeah? Did someone make a big ol' oopsy whoopsy that's gonna accelerate in hallucinations and slop as it collapses in on itself? How's the coded version of a microphone whine going to go, you silly buttholes?

[–] Rooty@lemmy.world 25 points 1 week ago

Ffs, neural networks and LLMs have their place and can be useful, but setting up datacentres that snort up the entire internet indiscriminately to create a glorified chatbot that spews data that may or may not be correct is insane.

[–] BigMacHole@lemm.ee 23 points 1 week ago (3 children)

Oh no! I HOPE us Taxpayers can Bail Out these AI Companies when they go Under! AFTER ALL we CUT my Child's LIFESAVING MEDICATION so I KNOW we have the Funds to Help these Poor Billionaire CEOS!

[–] SocialMediaRefugee@lemmy.world 23 points 1 week ago

I predicted this. It's similar to a photocopy of a photocopy that eventually ends up as a mess of garbage noise.

[–] Deflated0ne@lemmy.world 18 points 1 week ago
[–] PanArab@lemm.ee 17 points 1 week ago

The silver lining of AI slop filling the WWW

[–] AngryCommieKender@lemmy.world 17 points 1 week ago (1 children)

Cue Price is Right failure trombone.

[–] avattar@lemmy.sdf.org 15 points 1 week ago (2 children)

There is a solution to this: make a **perfect** AI-detecting tool. The only way I can think of is adding a tag to every bit of AI-generated data.

Though it could easily be removed from text, I guess. And no, training AI to recognize AI will never work. Also, every model would have to join this, or it won't work.
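
Just to illustrate how flimsy a plain-text tag would be, here's a toy sketch; the `<ai-generated>` marker is made up for the example, not any real standard:

```python
import re

# Hypothetical provenance tag wrapped around model output.
tagged = "<ai-generated>Here is some model-written text.</ai-generated>"

# One line of regex and the provenance information is gone.
untagged = re.sub(r"</?ai-generated>", "", tagged)
print(untagged)  # -> Here is some model-written text.
```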

Related XKCD

[–] bold_atlas@lemmy.world 1 points 6 days ago* (last edited 6 days ago)

Also, people won't be able to pass AI work off as their own if it's labeled as such. Cheating and selling slop are the chief uses for AI, so any tag or watermark will be removed from the vast majority of stuff.

There's also liability. If your AI generates code that's used to program something important and a lot of people are injured or die, do you really want a tag that can be traced back to the company sitting on the evidence? Or slapped all over the child sex abuse images that their wonderful invention is churning out?

[–] Etterra@discuss.online 10 points 1 week ago (1 children)

LOL you're suggesting people already doing something unbelievably stupid should do something smart to compensate.

[–] goldenquetzal@lemmy.world 13 points 1 week ago (2 children)

I've been predicting this for a while now and people kept telling me I was wrong. Prepare for dot com burst two, electric boogaloo.

[–] bold_atlas@lemmy.world 3 points 6 days ago* (last edited 6 days ago)

I hope it crashes but what if the market completely embraces feels-based economics and just says that incomprehensible AI slop noise is what customers crave? Maybe CEOs will interpret AI gibberish output in much the same way as ancient high priests made calls by sifting through the entrails of sacrificed animals. Tesla meme stock is evidence that you can defy all known laws of economic theory and still just coast by.

[–] late_night@sopuli.xyz 12 points 1 week ago

Him: Ugh I don't feel so good after all this data.

Her: Data is a nutritious source of information. However, ingesting too much data can trigger some unpleasant side effects. Here's what you can do to alleviate some of the symptoms:

  • Drink water
  • Lie down and rest
  • Listen to the sounds of nature

Is there anything else I can do for you?

[–] Kyrgizion@lemmy.world 10 points 1 week ago
[–] Pnut@lemm.ee 9 points 1 week ago

How much money was invested in reminding us that if the snake starts eating its tail it's eating itself?

[–] tostos@lemmy.world 8 points 1 week ago

Fill up your free cloud services with AI-generated info. I mean thousands of text files, like "how to make a homemade butterfly". All of them will get scraped by AI.

[–] HootinNHollerin@lemmy.dbzer0.com 8 points 1 week ago (2 children)
[–] omxxi@feddit.org 8 points 1 week ago

Makes me think of The Human Centipede.
