So, before you get the wrong impression, I'm 40. Last year I enrolled in a master's program in IT to further my career. It's a special online master's offered by a university near me and geared towards people in full-time employment. Almost everybody is in their 30s or 40s. You actually have to show your employment contract as proof when you apply to the university.

Last semester I took a project management course. We had to find a partner and simulate a project: basically write a project plan for an IT project, think about what problems could arise and plan how to solve them, describe what roles we'd need for the team, etc. In other words, do all the paperwork of a project without actually doing the project itself. My partner wrote EVERYTHING with ChatGPT. I kept having the same discussion with him over and over: write the damn thing yourself. Don't trust ChatGPT. In the end we'll need citations anyway, so it's faster to write it yourself and insert the citations as you go than to retroactively figure them out for a chapter ChatGPT wrote. He didn't listen to me and had barely any citations in his part. I wrote my part myself. I got a good grade; he said he got one, too.

This semester has turned out to be even more frustrating. I'm taking a database course, SQL and such. There is again a group project: we get access to a database of a fictional company and have to perform certain operations on it. We decided in the group that each member would prepare the code on their own before we get together, compare our homework, and decide what code to run on the actual database. So far, whenever I checked the other group members' code, it was way better than mine. It incorporated a lot of things the course materials hadn't taught us at that point. I felt pretty stupid because they were obviously way ahead of me - until we had a video call. One of the other girls shared her screen and was working in our database. Something didn't work. What did she do? Open a ChatGPT tab and let the "AI" fix the code. She had also written a short Python script to help fix some errors in the data, and yes, of course that turned out to be written by ChatGPT as well.

It's so frustrating. To me it's cheating, but a lot of professors see using ChatGPT as simply using the latest tools at our disposal. I would genuinely love to learn how to do these things myself, but the majority of my classmates seem to see that differently.

[–] tungsten5@lemmy.zip 14 points 8 hours ago (1 children)

I'm a grad student (aerospace engineering) and I had to pick a class outside of my department from a given list; just one of the requirements we have in order to graduate. I picked a course on NDE (non-destructive evaluation). The class was tons of fun! But it involved a lot of code. We had 8 labs, and all of them were code based. I already knew how to write the code for this class (it was basically just doing math in Python, MATLAB, C, etc.), so I spent most of the class just figuring out the math.

We each had to pick a partner in the class to do these lab assignments with. I got stuck with a foreign student from China. She was awful. She refused to do any work herself. Every assignment, due to her incompetence, I would take charge and just assign a part of the lab to her, and I asked whether she knew how to do what I was asking of her and whether she wanted or needed any help with any of it. She always kindly declined and claimed she could do it. Turns out she couldn't. She would just use ChatGPT to do EVERYTHING, and her answers were always wrong. So it turned into me doing my part of the lab and then taking her shit AI code and fixing it to complete her part.

The grading for this class was unique. We would write a short lab report, turn it in to the professor, and then during lab time have an interactive grading session with him. He would read our report and ask us questions on it to gauge our understanding of our work. If he was satisfied with our answers, and our answers to the actual lab assignment were correct, he would give us a good grade. If not, he would hand the report back, tell us to fix it, and have us go over it to prepare for the next round of interactive grading. (If this sounds like a terrible system to you, I can assure you it wasn't. It was actually really nice and very much geared towards learning, which I very much appreciated.)

During these sessions it became clear my lab partner knew and learned nothing. But she was brave enough to have her laptop in front of her and pretend to reference the code she didn't write, while actually asking ChatGPT the question the professor had asked her and then giving that answer to the professor. It was honestly pathetic. The only reason I didn't report her is that I would have lost access to her husband. Her husband was also in this class, and he was the total opposite of her: he did the work himself and, like me, was motivated to learn the material. So when I would get stuck on a lab I would go over it with him, and vice versa. Basically, I worked on the labs with her husband while she played the middleman between us.

Your story, OP, reminds me of this. I think AI has some good use cases, but too many students abuse it and just want it to do everything for them.

[–] lmmarsano@lemmynsfw.com 8 points 8 hours ago (1 children)
[–] tungsten5@lemmy.zip 3 points 8 hours ago (1 children)
[–] lmmarsano@lemmynsfw.com 3 points 8 hours ago* (last edited 4 minutes ago)

text: what is it?
let's break the web & accessibility with images of text for no good reason

probably