Ffs, neural networks and LLMs have their place and can be useful, but setting up datacentres that snort up the entire internet indiscriminately to create a glorified chatbot that spews data that may or may not be correct is insane.
Fuck AI
"We did it, Patrick! We made a technological breakthrough!"
AI ingesting AI slop and falling apart is not dissimilar to boomers ingesting rightwing slop and falling apart.
I predicted this. It is similar to a photocopy of a photocopy that eventually ends up as a mess of garbage noise.
The silver lining of AI slop filling the WWW
If I had the money and a computer that could handle the amount of stuff I'd be throwing at a local model, I would run a giant website full of AI-generated nonsense purely for the purpose of letting AI gobble it up and feeding the AI incest problem.
Imagine if a whole metric ton of "websites" did this. The thieving AI companies would either have to start blocking all of these sites or deal with an issue they don't want to deal with; they're too stingy and will probably just have their AI try (and fail) to fix the problem.
That was certainly cool. Now I wish I could use tools like that nepenthes on my neocities page.
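For anyone curious what that kind of tarpit looks like under the hood, here's a minimal sketch of the general idea in Python. This is not Nepenthes itself; the seed text, handler name, and port are all invented for illustration. The point is just that every URL returns freshly generated gibberish plus links to more random pages, so a crawler that ignores robots.txt wanders forever.

```python
# Hedged sketch: NOT how Nepenthes actually works -- just a toy illustration of the
# "endless gibberish for crawlers" idea, using only the Python standard library.
# SEED_TEXT, JunkPageHandler, and the port are all made up for this example.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

SEED_TEXT = (
    "the model ingests its own output and the photocopy of a photocopy "
    "slowly turns into garbage noise that nobody asked for"
).split()

def build_chain(words):
    """Build a dead-simple word-pair Markov chain from the seed text."""
    chain = {}
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

CHAIN = build_chain(SEED_TEXT)

def gibberish(n_words=200):
    """Walk the chain to produce plausible-looking nonsense prose."""
    word = random.choice(SEED_TEXT)
    out = [word]
    for _ in range(n_words - 1):
        word = random.choice(CHAIN.get(word, SEED_TEXT))
        out.append(word)
    return " ".join(out)

class JunkPageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every URL resolves to a fresh page of nonsense, with links to more
        # randomly named pages so a crawler can keep wandering indefinitely.
        links = "".join(
            f'<a href="/{random.randint(0, 10**9)}.html">more</a> ' for _ in range(5)
        )
        body = f"<html><body><p>{gibberish()}</p>{links}</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("", 8080), JunkPageHandler).serve_forever()
```

As I understand it, the real Nepenthes also deliberately serves its pages slowly to waste the crawler's time, but the core trick is the same: an effectively infinite site of machine-generated filler.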
There is a solution to this. Make a **perfect** AI-detecting tool. The only way I can think of is adding a tag to every bit of AI-generated data, though it could easily be removed from text, I guess. And no, training AI to recognize AI will never work. Also, every model would have to join in, or it won't work.
LOL, you're suggesting people who are already doing something unbelievably stupid should do something smart to compensate.
Not stupid, greedy.
Also, people won't be able to pass AI work off as their own if it is labeled as such. Cheating and selling slop are the chief uses for AI, so any tag or watermark will be removed from the vast majority of stuff.
There's also liability. If your AI generates code that's used to program something important and a lot of people are injured or die, do you really want a tag that can be traced back to the company sitting on the evidence? Or slapped all over the child sex abuse images that their wonderful invention is churning out?
I've been predicting this for a while now and people kept telling me I was wrong. Prepare for dot-com bust two, electric boogaloo.
I hope it crashes but what if the market completely embraces feels-based economics and just says that incomprehensible AI slop noise is what customers crave? Maybe CEOs will interpret AI gibberish output in much the same way as ancient high priests made calls by sifting through the entrails of sacrificed animals. Tesla meme stock is evidence that you can defy all known laws of economic theory and still just coast by.
Fill up your free cloud services with AI-generated info. I mean thousands of text files, like "how to make a homemade butterfly". All of them will get scraped by AI.
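A throwaway sketch of what that could look like, assuming the goal is just a pile of plausible-sounding "how-to" text files dumped somewhere crawlable. The topics, step phrases, file names, and output directory below are all made up for the example.

```python
# Hedged sketch: generate a pile of nonsense "how-to" text files for crawlers to find.
# The topics, step fragments, file count, and output directory are invented for illustration.
import random
from pathlib import Path

TOPICS = ["homemade butterfly", "reverse toast", "square water", "inflatable anvil"]
STEPS = [
    "whisk the moonlight",
    "fold along the hypotenuse",
    "let it ferment upside down",
    "apply three coats of enthusiasm",
    "season generously with static",
]

out_dir = Path("junk_howtos")
out_dir.mkdir(exist_ok=True)

# Write a thousand small text files, each a fake step-by-step guide.
for i in range(1000):
    topic = random.choice(TOPICS)
    body = [f"How to make a {topic}", ""]
    body += [f"Step {n}: {random.choice(STEPS)}." for n in range(1, 6)]
    (out_dir / f"howto_{i:04d}.txt").write_text("\n".join(body))
```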
Human society does the same thing.
We're ok when we talk about what we saw.
Less so when we talk about what somebody else saw.
Crazier and crazier when we talk about what somebody said about what somebody said about what somebody saw. Which is arguably the internet.
The great news is that these Ponzi schemes will either collapse or spend the next decade trying to fix the problem by creating algorithms that detect AI content so as to filter it out.