this post was submitted on 01 Dec 2025
416 points (98.6% liked)

Fuck AI

5206 readers
828 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
[–] Cevilia@lemmy.blahaj.zone 156 points 1 month ago (2 children)

The copium in the reddit thread is hilarious.

"The issue is that you had a space in your path name"

No, the issue is that the AI wiped an entire drive! 🀣

[–] Voroxpete@sh.itjust.works 75 points 1 month ago (1 children)

I mean, a lot of the people pointing that out are actually doing so to indicate the dangers of relying on AI in the first place.

If you read some of OP's replies, it becomes clear what happened here: they asked the bot how to fix something, didn't understand the instructions it replied with, and then just went and said, "Hey, I don't get it, so you do it for me."

Anyone who knew what they were doing would have noticed the bad delete command the bot presented (improperly formatted, and with no safety checks), but because OP figured "Hey, knowing stuff is for suckers", they ended up losing all their stuff.
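A hypothetical reconstruction of that failure mode (the thread never shows the exact command, so the path here is made up): an unquoted path containing a space splits into two separate arguments, so a delete aimed at one folder lands on much wider targets.

```python
import shlex

# Made-up path for illustration; the real one from the thread wasn't shown.
path = "D:/My Projects/build"

# Unquoted: the shell would see TWO targets, "D:/My" and "Projects/build".
print(shlex.split(f"rm -rf {path}"))

# Quoted: one argument, the intended directory and nothing else.
print(shlex.split(f"rm -rf {shlex.quote(path)}"))
```

This is exactly the kind of thing a reviewer would catch in a second, and a user who "doesn't get it" never will.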

[–] naevaTheRat@lemmy.dbzer0.com 20 points 1 month ago (3 children)

How would you feel if you bought pharmaceuticals that promised to heal you and they made you sick? What about a ride at a theme park with no warning signs that failed and hurt you?

The whole marketing thing of these garbage devices is based around abusing trust. There are no warnings that you need to be an expert, in fact they claim the opposite.

The person is a rube, but only evil people abuse the trust of others and only evil people blame people for having their trust abused. Being able to trust people is good actually, and we should viciously beat to death everyone that violates social trust.


β€œI’m blaming you because my 401k can’t accept this information or it will collapse.”

[–] Kyrgizion@lemmy.world 96 points 1 month ago (1 children)

The irony of having run out of tokens on that last message... "Your problem now, peace out. Or pony up".

Great, we invented data protection rackets.

[–] usernameusername@sh.itjust.works 95 points 1 month ago (5 children)
[–] snooggums@piefed.world 21 points 1 month ago
[–] frunch@lemmy.world 15 points 1 month ago

I don't usually laugh aloud at comments, but this one got me. Thank you, and Happy Monday 🫠

[–] sup@lemmy.ca 3 points 1 month ago

Outstanding

[–] falseWhite@lemmy.world 80 points 1 month ago* (last edited 1 month ago) (4 children)

Why would you give AI access to the whole drive? Why would you allow AI to run destructive commands on its own without reviewing them?

The guy was asking for it. I really enjoy seeing these vibe coders imagine they are software engineers and fail miserably with their drives and databases wiped.

[–] tburkhol@lemmy.world 69 points 1 month ago (2 children)

If he knew what he was doing, would he need to be vibe coding? The target audience are exactly the people most susceptible to collateral damage.

[–] moody@lemmings.world 16 points 1 month ago (2 children)

I have a couple dev friends who were told by management that they need to be using AI, and they hate it.

[–] MrMcGasion@lemmy.world 6 points 1 month ago

I know not everyone is in a position where they can just ignore management, and maybe I've just been in more blue collar jobs where ignoring management is normalized. But unless they're literally looking over your shoulder, what they don't know won't hurt them. It's not like AI is more efficient once you count the extra debug time.

[–] INeedMana@piefed.zip 6 points 1 month ago

I'll probably get eaten here, but here goes: I do use LLMs when coding. But they should NEVER be used in unknown waters. To quickly get the 50 lines of boilerplate and fill out the important 12? Sure. To see how a nested something can be written in a syntax I've forgotten? Yes. To get an example that shows me where to start searching the documentation? OK. But "I asked it to do X, I don't understand what it spewed out, let's roll"? Hell no, that's a ticking bomb with a very short fuse. Unfortunately, the marketing has pushed LLMs as things one can trust. I already feel like I'm being treated as a zealot dev afraid for his job when I warn people around me not to trust an LLM's output any more than a search engine result.

[–] ladicius@lemmy.world 12 points 1 month ago

That here is the core of the problem.

[–] aesthelete@lemmy.world 5 points 1 month ago* (last edited 1 month ago)

I don't know how this one works but many of them can get access through the IDE because the IDE has full disk access, due to being an IDE.

LLMs sometimes access tools through an MCP server, and those tools are usually coded to require consent before each step, but that should probably always be required.

I hate these stupid things, but I am forced to use them. I think there should be a suggested-patch workflow instead of just allowing them to run roughshod all over your computer, but Google and Microsoft are pursuing "YOLO mode" for everything anyway, even if it's alarmingly obvious how terrible an idea that is.

We have containers and VMs, these fucking things should be isolated and it should be impossible for them to alter files without consent.
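A minimal sketch of that consent gate (hypothetical code, not the API of any real MCP implementation; the command list is illustrative): every command the agent proposes is shown to the user and runs only after an explicit "y", with obviously destructive commands flagged.

```python
import shlex
import subprocess

# Commands flagged as destructive before asking (illustrative list only).
DESTRUCTIVE = {"rm", "rmdir", "del", "format", "mkfs"}

def run_with_consent(command: str, ask=input) -> bool:
    """Show the proposed command and run it only after an explicit 'y'."""
    argv = shlex.split(command)
    flag = " [DESTRUCTIVE]" if argv and argv[0] in DESTRUCTIVE else ""
    answer = ask(f"Agent wants to run{flag}: {command!r} -- allow? [y/N] ")
    if answer.strip().lower() != "y":
        return False  # default is refusal, never silent execution
    subprocess.run(argv, check=False)
    return True
```

The point is the default: without an explicit yes, nothing runs, which is the opposite of "YOLO mode."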

[–] khepri@lemmy.world 3 points 1 month ago* (last edited 1 month ago)

Why would you ask AI to do any operation on even a single file, let alone an entire local drive, that wasn't backed up? I've been using and misusing computers for long enough that I have blown up my own shit many times in many stupid ways, though, so I can't honestly say that 20 years ago this wouldn't have been me lol.

[–] SaharaMaleikuhm@feddit.org 61 points 1 month ago

Live by the slop, die by the slop. Also: no backup, no pity

[–] BassTurd@lemmy.world 38 points 1 month ago (1 children)

Watching vibe coders get blown up by their own ignorance and stupidity is such a great pastime. Fuck AI.

[–] ivanafterall@lemmy.world 4 points 1 month ago

It's still happening, too!

[–] TriangleSpecialist@lemmy.world 33 points 1 month ago* (last edited 1 month ago) (1 children)

Damn, Microsoft really just implemented their own version of the rm -rf / Russian roulette on their sad excuse for an OS.

It only took boiling an ocean for training the damn thing.

EDIT: Google did. We'll blame a Pavlovian reflex and lack of sleep (or anything other than my stupidity)...

[–] CompactFlax@discuss.tchncs.de 18 points 1 month ago (1 children)

Yes, Microsoft is moving this direction.

No, Microsoft is not in this post. Microsoft and Google have not yet merged.

[–] TriangleSpecialist@lemmy.world 9 points 1 month ago (1 children)

Damn, I need some sleep...

I feel that. β˜•οΈ

[–] Gaja0@lemmy.zip 26 points 1 month ago (1 children)

It's funny until you realize Google dumped $93M into convincing the general public that AI is the future before thrusting a half-baked technology into our daily lives.

[–] Burninator05@lemmy.world 5 points 1 month ago (2 children)

Well, there's the problem. In AI development, $93M is nothing. It's like they threw pocket change at a child and demanded industry-leading AI.

[–] psx_crab@lemmy.zip 24 points 1 month ago* (last edited 1 month ago)

2017: we lock your data behind a paywall without your authorisation, pay us 2 bitcoin to unlock it.

2025: whoops, i deleted your D: drive, do check it for the extent of the damage.

Will people ever learn?

Edit: omfg, quota limit hit right after the drive is emptied. Seriously, why would people even allow AI to hold their egg basket? This is all on OOP.

[–] wewbull@feddit.uk 24 points 1 month ago

Kinda a pity it wasn't the C drive. It would have uninstalled itself.

[–] samus12345@sh.itjust.works 18 points 1 month ago
[–] horn_e4_beaver@discuss.tchncs.de 18 points 1 month ago* (last edited 1 month ago) (1 children)

It's nice that Google gives LLMs an excuse to get out of conversations that they're done with.

[–] november@piefed.blahaj.zone 17 points 1 month ago (1 children)
[–] DonPiano@feddit.org 4 points 1 month ago
[–] khepri@lemmy.world 16 points 1 month ago* (last edited 1 month ago)

IF (and this is a big if) you are going to allow an AI direct access to your files and your command line, for the love of Gabe, sandbox that shit and run a backup for the folders you give it access to. We know AI makes mistakes like this. Just act as if you were giving your little brother access to your drives and your command line and it's his first day. I get we're all still learning about this stuff, but allowing an AI agent command-line access and full drive access, to a local drive you have no backup of, is leaving-Little-Timmy-at-home-alone-with-a-loaded-shotgun-and-an-open-bottle-of-pills level irresponsibility.
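One cheap way to follow that advice (a sketch; the function name is made up): hand the agent a throwaway copy of the folder instead of the folder itself, so the worst case is a wiped copy.

```python
import shutil
import tempfile
from pathlib import Path

def sandboxed_copy(workspace: str) -> Path:
    """Copy `workspace` into a scratch directory and return the copy's path.

    Point the agent at the returned path; the original stays untouched
    even if the agent deletes everything it can see.
    """
    scratch = Path(tempfile.mkdtemp(prefix="agent-sandbox-"))
    sandbox = scratch / Path(workspace).name
    shutil.copytree(workspace, sandbox)
    return sandbox
```

Not a substitute for a real backup, but it turns "AI wiped my drive" into "AI wiped a temp folder."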

[–] watson@lemmy.world 12 points 1 month ago (1 children)

While also eliminating 12 jobs

[–] IcyToes@sh.itjust.works 15 points 1 month ago (1 children)

Looks to be creating rather than reducing the work needed.

I don't feel threatened.

[–] watson@lemmy.world 20 points 1 month ago* (last edited 1 month ago) (4 children)

If they fire you and then make your coworker work twice as hard for the same pay, then you definitely should.

If they fire your coworker and make you work twice as hard for the same pay, then you definitely should.

If they fire both of you, and then give everyone in your town cancer from the toxic water runoff from a massive AI data center they just built, then you definitely should.

[–] AlmightyDoorman@kbin.earth 7 points 1 month ago (1 children)

Is the water runoff toxic? Do you have any article? (Genuinely asking)

[–] watson@lemmy.world 11 points 1 month ago (1 children)

Legit question. Apparently, these data centers need a massive amount of water to cool them. This pollutes the water, and without proper regulations on AI data centers, it’s totally legal.

https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water-data/

https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117

https://cleanwater.org/publications/data-centers-threat-minnesotas-water

https://www.bbc.com/news/articles/ckgqxl9gp6go.amp

https://www.wateronline.com/doc/ai-s-water-problem-is-worse-than-you-think-but-it-doesn-t-have-to-be-0001

Just Google β€œAI toxic wastewater” and you’ll get a bunch of articles about it.

[–] brucethemoose@lemmy.world 10 points 1 month ago (2 children)

No sane IDE implementation lets LLMs run commands without a sandbox.

WTF is this?

[–] Cevilia@lemmy.blahaj.zone 8 points 1 month ago (1 children)

An insane IDE implementation that lets LLMs run commands without a sandbox.

https://antigravity.google/

[–] brucethemoose@lemmy.world 4 points 1 month ago* (last edited 1 month ago)

Unreal.

It’s like they’re trying to get the public to despise LLMs. It’s certainly working.

[–] Buddahriffic@lemmy.world 4 points 1 month ago

Which is also why I don't want tight AI integration in an OS. It's fine as a chatbot, but not as an administrator that I suspect MS will one day hand more control over your PC than they allow users themselves.

[–] lmmarsano@lemmynsfw.com 9 points 1 month ago (1 children)

files are overrated anyway when you got antigravity

[–] unipadfox@pawb.social 11 points 1 month ago

Just ask the AI to generate you new, better files

[–] RememberTheApollo_@lemmy.world 7 points 1 month ago (1 children)

I hate to say it, but this is the future. As OS devs cram AI into everything so that users don't have to understand technology (because they'd rather take selfies and scroll TikTok), the surrender of control over your own devices and the information contained therein will only become more complete.

[–] khepri@lemmy.world 4 points 1 month ago (1 children)

I mean, Apple doesn't want their average user even knowing that their main drive exists; it's been hidden by default for years. So that trend started well before AI.


LLMs are like a power saw whose blade is attached by a chain. You don't know what it's going to cut.
