this post was submitted on 25 Feb 2026
98 points (97.1% liked)
Fuck AI
Really hate the framing of the title, though the article is a bit more nuanced. AI may be able to "do your homework" in the same way a robot may be able to "eat your dinner", but it cannot do these things for you, only independent of you. No one can "do your homework for you" because the point of homework is to help you understand the information through exercise. Saying it's done because all questions were answered is like saying you had dinner because a robot removed the food from your plate. You are not getting the actual point of the exercise, the way you're not getting the nutrition from the food. You can argue about how helpful/nutritious the work/food is, but to argue that the work/food is not necessary is a blatant lie told to you by people trying to starve you.
This idea that "memorizing things is bad" and that memorizing can be offloaded comes from a fundamental misunderstanding of education. To understand things, we have to know certain things. Even first-hand experiences are "committed to memory" so that you can draw on that information later. These people are attempting to rob others of an education and of the ability to understand the world around them. I don't know how they can live with themselves.
This seems like the same argument people had about calculators. We’ve had calculators, spreadsheets, and cash registers for half a century now, so why do they still teach math?
There's a difference between knowing things and memorizing them, though.
I agree with the broad notion that using reference material is perfectly fine at all levels of academia. You memorize information by putting it to use. Repetitive reading purely for memorization, with no intended application, is a massive waste of time.
That is fundamentally different from attending lectures or reading books and papers, and it's definitely not the same as putting in the work of writing your own or doing your own research.
My concern with this idea is the same as my concern with every other attempt at a "disruptive" AI product: you can already do all the valuable parts of this with existing tools, and the novel things it can do aren't particularly useful or something that chatbots do well.
By all means use AI tools to do schoolwork if and when they're useful. It's just that this doesn't sound like it is.
Though I think your first point is mostly semantics, I do think it's ok if some things are expected to be memorized. What do you mean by:
Is the class and test not the intended application? I bet most people who learn about DNA or Golgi bodies never apply that information outside of school. Most people who took an art history class had to learn about cubism and likely haven't uttered the word since. What about the difference between igneous and sedimentary rock? I think these classes are important, but you cannot expect people to have the time to build up an understanding of all of these subjects from first principles. At a certain point you have to memorize something. Even if you went to a volcano and watched the magma cool yourself, you'd still have to remember what the result is called. If a student can define a term and identify it in action when they see it, I don't think they need to have done any original research on it, and most coursework (lectures/videos/homework) gives them the tools to be able to define and identify it. It's about exposure and exploration, and for that kind of surface level understanding I think the coursework for most classes counts as sufficient "putting in the work".
What does "useful" mean in this context:
My point is that they are not useful because they don't help you learn the material. What is the "valuable part of this"? It literally just does the work for them. AI repeatedly makes factual errors, so I wouldn't even trust it to rephrase something, much less teach it to me, especially when there are a lot of trustworthy educational tools and sites out there.
If the class and test is the entire intended application then what's the point? I mean, at least throw personal growth in there or something. If going to the gym made you fat and unhealthy we wouldn't go around telling people to exercise.
Look, my point is that you learn about things when you use that knowledge repeatedly. It's a chicken-and-egg situation, and you do have to start from memorization (you wouldn't expect a medical doctor to look up the names of body parts until they just naturally stick, and you WILL have to learn some vocabulary from scratch to learn a language), but by and large, if something is written down and you have access to it, that's probably enough to learn it over time.
There's a bit of a sense that study has to be pain and work because... well, old people like to see young people suffer like they used to, whatever. But man, I can tell you I learned far more from the teachers and professors who gave us something to do and the tools to do it than from the ones who showed up with a PowerPoint deck and asked us to memorize bullet points.
As for what AI is useful for... I mean, yeah, it's not a lot. That was my point. AI is decent at reminding you of things you sorta vaguely know but can't recall, and does ok at summarization and at some coding tasks. Some of that is useful in school (I certainly would have spun up an OCR system instead of giving myself carpal tunnel cleaning up notes), but it's not much use to you if your job is to go to a lecture and... you know, learn from it.
I will say that they are not terrible teaching aids, though. Things like explaining language concepts, or answering specific, precise questions that you can otherwise verify, are not terrible uses. And, as a very much amateur coder, AI haters may have to accept that I've actually gotten better at coding by myself by using a chatbot to fix my problems (if only because the chatbot sucks at doing the thing from scratch, so I still do the parts I can do). You can use reference material and technology to learn stuff on your own; it doesn't matter if it's a chatbot or Wikipedia. It won't do you much good to have it replace you at doing the work if the point of the work is to teach you how to do it, though.
My first semester of college was in the fall of 2005. One of the courses I took that semester was titled Western Civilization. Basically a repeat of high school World History, "white people evolved and did everything, and some other people were around I guess" kind of stuff. We get a couple weeks into the semester when the teacher just...stops showing up. We walk in to find "Read Chapters 1-3 in your textbook" written on the board. He's gone for over a month, occasionally there's a substitute who has no fucking idea what's going on, I think they got someone from Financial Aid or the Registrar's office to walk in and tell us to read chapters of the book in person. Practically no attempt to teach this class was made. Turns out, the professor was on some kind of emergency response team that got deployed to the gulf coast in aid of Katrina/Rita.
He gets back for the second half of the semester, and the way he reviewed for tests was "One: A. Two: C. Three: C. Four: B."
This is what you're afraid of automating away.
You know the really sick thing? I had to earn a flight instructor certificate before I realized what a piss poor state our schools are in.
To work with the "robots can't eat your dinner for you" analogy, what you're doing here is you're saying you went to a bad restaurant once therefore people shouldn't bother cooking.
Look, I gave a single example for brevity. I did ten semesters of college, and every essay I turned in was graded on formatting, punctuation, grammar, and spelling, not on factual accuracy, validity of research, or strength of conclusions. Because doing that stuff is hard.
Multiple choice or short answer tests are easy to cram for and easy to grade. Basing curricula around them encourages cram-and-dump study methods that don't encourage actual long-term learning. You end up with students who can do high level calculus or discuss the lasting ramifications of the Treaty of Guadalupe-Hidalgo for a week or two. If AI makes it impossible to pretend we're teaching students this way anymore, so much the better.