this post was submitted on 14 Mar 2026
544 points (98.9% liked)

Fuck AI

6367 readers
1695 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
[–] m3t00@lemmy.world 1 points 5 hours ago

the dumb ones stay dumb. at least read the $100 book

[–] BudgetBandit@sh.itjust.works 4 points 20 hours ago* (last edited 20 hours ago)

Me, cooking, having fucked up my tofu (and my wife always uses ChatGPT for everything): hey ChatGPT, how can I fix tofu that curdled into the smallest flakes possible? ChatGPT: vinegar and heat.

The one time I used ChatGPT for more than "I have this tax law and want my grandma to understand it, say the same thing in easy words" content, I ruined a whole pot of tofu and nut milk.

If you don’t know what you want it to say, it is wrong.

[–] Bazell@lemmy.zip 5 points 1 day ago

Of course. If you stop thinking about complicated things, what do you expect to happen after some time?

[–] Tollana1234567@lemmy.today 2 points 1 day ago

maybe you shouldn't have discredited students by using AI to accuse them of cheating with AI on essays.

[–] m3t00@lemmy.world 16 points 1 day ago

in India, it's common to pay someone to take exams. congratulations Doctor

[–] deadymouse@lemmy.world 7 points 1 day ago* (last edited 1 day ago)

This is the peak of civilization, get ready for regress.

[–] lechekaflan@lemmy.world 29 points 2 days ago (3 children)

what they're doing to push back against the tech.

Classify it as a cheating tool.

[–] bridgeburner@lemmy.world 18 points 1 day ago (3 children)

Works in theory, but not in practice, as no tool can tell with 100% reliability whether something was written by AI. The best way IMO to test students is via oral exams. Let them explain the things and topics they allegedly wrote about in their thesis; that way you can quickly see who actually bothered to learn and understand, and who just let AI write their thesis.

[–] dustyData@lemmy.world 17 points 1 day ago (1 children)

There was this cool initiative by a professor who is a friend of mine. He would give a pretty standard homework, but then the additional instructions were to complete said homework using an LLM. Then, the students would have to write, by hand, an analysis of all that the LLM got wrong, or could've done better. They then proceeded to discuss their analysis in class. Participating in the discussion with actual meaningful arguments was half of the points, the other half being the quality of the handwritten analysis.

It was more work, but at least the fuckers quickly appreciated that the machine was actually shit at doing their homework, and even if it could pass, it would be with the bare minimum. It also separated the students who actually wanted to learn from the slackers who were just wasting their parents' money.

[–] BradleyUffner@lemmy.world 2 points 1 day ago

I was with you up to the writing by hand part.

[–] Echolynx@lemmy.zip 10 points 1 day ago (1 children)

They should bring back oral exams and blue books.

[–] laranis@lemmy.zip 9 points 1 day ago (1 children)

This is it. Stop it with these take-home tests. Homework was always bullshit.

But then professors and teachers would have to think themselves instead of regurgitating the same lesson plan and worksheets from two decades ago

Hot take incoming: good public school teachers are criminally underpaid. Most teachers are paid exactly what they're worth.

[–] Tollana1234567@lemmy.today 2 points 1 day ago* (last edited 1 day ago)

oh but some professors and even TAs have their own things to do, like research and research papers. when i was in university last decade, one of my professors was doing STEM research on the side, which took most of his attention away from the class the whole semester; the TA wasn't any better, she had her own MS thesis/research to do at the same time. basically didn't learn anything in animal/bio physiology. he even went to another country for that research during the semester. the only DRs/PhDs that do have time are the ones at city colleges, which tend to have classes as tough as ivy league universities, since they don't do any research at all.

and then my biochem 1 teacher had so many bad reviews on rmf; she even said herself that her "research lab takes priority" (paraphrased), so she does minimal course lectures and convoluted tests.

[–] Atomic@sh.itjust.works 6 points 1 day ago (1 children)

It's also important for parents to genuinely take an interest in their children's education. Help them understand why we don't use AI for schoolwork. And be there for them when they need help, so they don't have to resort to AI when it feels hopeless.

I remember a study we all read when I was working as a sub teacher, ages 7-12: how much time does an average parent spend talking to their child on an average day? (Giving commands didn't count as talking for the purposes of the study.)

5 minutes. It was 5-6 minutes. It explains a lot, doesn't it?

[–] forkDestroyer@infosec.pub 2 points 1 day ago

There are plenty of negligent parents out there, but also plenty who don't have the time because they gotta pay those bills that crept up on them, especially in this economy.

[–] OctopusNemeses@lemmy.world 57 points 2 days ago (2 children)

No shit. LLMs are the antithesis of education, the basis of which is hands-on practice. Using a glorified autocomplete-my-assignments machine should be barred from education without exception.

This is the tip of the iceberg. The world is facing a critical collapse of professions. We're nearing a point that was foretold in sci-fi, where humans no longer understand how the machine works, just that they can use it and it works.

Except sci-fi was too optimistic. Science fiction has actual AGI. We have glorified autocomplete. A non-intelligence.

[–] TubularTittyFrog@lemmy.world 9 points 1 day ago

it's the equivalent of trying to become a marathon runner while you sit on your ass on the couch flying an FPV drone for your 'training' and saying it's the same thing.

[–] RushJet1@lemmy.world 3 points 1 day ago

This is worse than that, too: we're already at a point where most people don't understand how the machine works. But overuse of AI is going to make it so most people don't know how to do simple tasks that everybody used to know. They'll be non-functional without an internet connection.

[–] HugeNerd@lemmy.ca 7 points 1 day ago

But their ability to make Terminator-Robocop porn in under an hour is unmatched

[–] theBATCLAM@piefed.social 122 points 2 days ago* (last edited 2 days ago) (45 children)

It's absolutely terrifying. I am a returning student to uni in my thirties and the only person not using any AI. They literally depend on it.

I just had a classmate the other day turn to me, frustrated, saying "You ever ask chat(gpt) a question and it gives you a whole, like, paragraph you then have to read? like, why can't it simplify it?"

Did I mention I am an electrical/computer engineering double major? So yeah, even reading is too much for these kids. Future workforce is fucking cooked.

[–] Tollana1234567@lemmy.today 2 points 1 day ago* (last edited 1 day ago)

my older bro, who is in tech, depends on it for almost every question he has or is asked, even very simple ones. asked him something about laundry last year; he said "use CHATGPT". there's really no hope for these people. asked him later on what was causing "flies" to appear in the house, and he kept using AI even though it was giving pretty inaccurate info.

[–] Tollana1234567@lemmy.today 1 points 1 day ago

my bro, who is in tech, treats chatgpt as a go-to anytime he has a question or is asked a very simple one. yeah, pretty much screwed. and he thinks AI-slop-generated pictures are the epitome of some kind of art piece.

[–] fuck_u_spez_in_particular@lemmy.world 8 points 1 day ago (2 children)

Future workforce is fucking cooked.

Yep, and I predict that programmers who actually understand code (and especially can quickly and thoroughly review it) will become increasingly valuable (again?) in the future, when someone really has to vouch for what the AI actually generated (and let me tell you, there are still so many stupid things the AI does...).

[–] theBATCLAM@piefed.social 6 points 1 day ago

We're already seeing it a little; I know IBM had to start massively hiring just this Feb.

https://fortune.com/2026/02/13/tech-giant-ibm-tripling-gen-z-entry-level-hiring-according-to-chro-rewriting-jobs-ai-era/

[–] No1@aussie.zone 2 points 1 day ago

The new age COBOL programmers are anyone who can actually program.

[–] trackball_fetish@lemmy.wtf 18 points 2 days ago (3 children)

I have a friend who was frustrated that his programming exam (Python) was too hard and said, "why do I need to learn this? I can just use AI and get the job done." We're absolutely fugged.

[–] theBATCLAM@piefed.social 11 points 1 day ago* (last edited 1 day ago) (1 children)

It's honestly disheartening. I understand our education system is not well, in that tests aren't actually very conducive to learning, but they treat learning a skill like it's some obnoxious chore they just want over with so they never have to do it again. when it's like... bruh, civilization/tech grows exponentially, y'all gotta learn your whole lives, and it should be something you ENJOY. it should give you pride to be good at something or to understand a subject thoroughly.

i don't even know where to begin with this problem, but even if you pretend the environmental impacts are fine/manageable, I can't help but think this shit's gotta be destroyed for the future of humanity.

[–] Zink@programming.dev 35 points 2 days ago

It seems like a huge pervasive part of modern culture is that success and fortune equal never having to get your hands dirty, never worry about the details, and never learning to figure shit out because you can just pay somebody else to do it (or ask the "AI" to).

Obviously some amount of specialization and delegation is good. That's how you get a society.

But to just exist passively is something else. It's bad for us, and I don't mean that as a moral judgment. Nobody needs certain skills to justify their existence. I mean it in the clinical sense, like that you can be sedentary with more than just your physical body.

[–] m3t00@lemmy.world 6 points 1 day ago* (last edited 1 day ago)

trig test day, surprise! 'no calculators'. gaah, after all that time spent keying identities into my ti-83. still passed with my broken memory
