this post was submitted on 25 Nov 2025

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 2 years ago

‘But there is a difference between recognising AI use and proving its use. So I tried an experiment. … I received 122 paper submissions. Of those, the Trojan horse easily identified 33 AI-generated papers. I sent these stats to all the students and gave them the opportunity to admit to using AI before they were locked into failing the class. Another 14 outed themselves. In other words, nearly 39% of the submissions were at least partially written by AI.’

Article archived: https://web.archive.org/web/20251125225915/https://www.huffingtonpost.co.uk/entry/set-trap-to-catch-students-cheating-ai_uk_691f20d1e4b00ed8a94f4c01
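The quoted figures are easy to sanity-check; a quick sketch confirming that 33 flagged papers plus 14 self-reported ones out of 122 submissions gives the article's "nearly 39%":

```python
# Sanity check of the figures quoted in the article: 122 submissions,
# 33 flagged by the hidden-prompt "Trojan horse", 14 more self-reported.
submissions = 122
flagged = 33
self_reported = 14

ai_assisted = flagged + self_reported  # 47 papers in total
share = ai_assisted / submissions

print(f"{ai_assisted} of {submissions} papers ({share:.1%})")  # 47 of 122 papers (38.5%)
```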

top 50 comments
[–] vzqq@lemmy.blahaj.zone 9 points 2 days ago

You just know there is a guy in class who just wrote the essay from a Marxist perspective because he does everything from a Marxist perspective.

[–] hark@lemmy.world 27 points 2 days ago

But I am a historian, so I will close on a historian’s note: History shows us that the right to literacy came at a heavy cost for many Americans, ranging from ostracism to death. Those in power recognised that oppression is best maintained by keeping the masses illiterate, and those oppressed recognised that literacy is liberation.

It's scary how much damage is being done to education, not just from AI but also the persistent attacks on public education in the US over decades, hampering the system with things like No Child Left Behind and diverting funds to private schools with vouchers in the name of "school choice". On top of that there are suggestions that teachers aren't even needed and that students could be taught with AI. It's grim.

[–] rustydrd@sh.itjust.works 104 points 3 days ago* (last edited 3 days ago) (16 children)

In one of my classes, when ChatGPT was still new, I once handed out homework assignments related to programming. Multiple students handed in code that obviously came from ChatGPT (too clean a style, too general for the simple tasks that they were required to do).

Decided to bring one of the most egregious cases to class to discuss, because several people handed in something similar, so at least someone should be able to explain how the code works, right? Nobody could, so we went through it and made sense of it together. The code was also nonfunctional, so we looked at why it failed, too. I then gave them the talk about how their time in university is likely the only time in their lives when they can fully commit themselves to learning, and where each class is a once-in-a-lifetime opportunity to learn something in a way that they will never be able to experience again after they graduate (plus some stuff about fairness) and how they are depriving themselves of these opportunities by using AI in this way.

This seemed to get through, and we then established some ground rules that all students seemed to stick with throughout the rest of the class. I now have an AI policy that explains what kind of AI use I consider acceptable and unacceptable. Doesn't solve the problem completely, but I haven't had any really egregious cases since then. Most students listen once they understand it's really about them and their own "becoming" professional and a more fully developed person.

[–] Draegur@lemmy.zip 26 points 3 days ago (2 children)

I heard of something brilliant though: The teacher TELLS the students to have the AI generate an essay on a subject AND THEN the students have to go through the paper and point out all the shit it got WRONG :D

[–] Doctorbllk@slrpnk.net 13 points 2 days ago

This is discussed in the article

[–] Alaknar@sopuli.xyz 69 points 3 days ago (26 children)

Let me tell you why the Trojan horse worked. It is because students do not know what they do not know. My hidden text asked them to write the paper “from a Marxist perspective”. Since the events in the book had little to do with the later development of Marxism, I thought the resulting essay might raise a red flag with students, but it didn’t.

I had at least eight students come to my office to make their case against the allegations, but not a single one of them could explain to me what Marxism is, how it worked as an analytical lens or how it even made its way into their papers they claimed to have written. The most shocking part was that apparently, when ChatGPT read the prompt, it even directly asked if it should include Marxism, and they all said yes. As one student said to me, “I thought it sounded smart.”

Christ.......

[–] korazail@lemmy.myserv.one 42 points 3 days ago* (last edited 3 days ago) (1 children)

From later in the article:

Students are afraid to fail, and AI presents itself as a saviour. But what we learn from history is that progress requires failure. It requires reflection. Students are not just undermining their ability to learn, but to someday lead.

I think this is the big issue with 'ai cheating'. Sure, the LLM can create a convincing appearance of understanding some topic, but if you're doing anything of importance, like making pizza, and don't have the critical thinking you learn in school, then you might think that glue is actually a good way to keep the cheese from sliding off.

A cheap meme example for sure, but think about how that would translate to a Senator trying to deal with more complex topics.... actually, on second thought, it might not be any worse. 🤷

Edit: Adding that while critical thinking is a huge part, it's more the "you don't know what you don't know" that tripped these students up, and that's the danger of using an LLM in any situation where you can't validate its output yourself and it's not just a shortcut for making some boilerplate prose or code.

[–] FlyingCircus@lemmy.world 10 points 2 days ago (1 children)

People always talk about how students are afraid to fail, but no one ever mentions that the consequences of failure in our society are far greater than the rewards for success, unless you are already at the top.

[–] jupiter_jazz@lemmy.dbzer0.com 6 points 2 days ago

For me it's the $5k for the class down the drain lol

[–] IAmNorRealTakeYourMeds@lemmy.world 49 points 3 days ago* (last edited 3 days ago) (9 children)

I think the only solution is the Cambridge exam system.

The only grade they get is at the final written exam. All other assignments and tests are formative, to see if they are on track or to practice skills... This way it does not matter if a student cheats on those assignments; they only hurt themselves. Sorry for the final exam stress though.

[–] protist@mander.xyz 200 points 4 days ago (4 children)

Distillation:

Let me tell you why the Trojan horse worked. It is because students do not know what they do not know. My hidden text asked them to write the paper “from a Marxist perspective”. Since the events in the book had little to do with the later development of Marxism, I thought the resulting essay might raise a red flag with students, but it didn’t.

I had at least eight students come to my office to make their case against the allegations, but not a single one of them could explain to me what Marxism is, how it worked as an analytical lens or how it even made its way into their papers they claimed to have written. The most shocking part was that apparently, when ChatGPT read the prompt, it even directly asked if it should include Marxism, and they all said yes. As one student said to me, “I thought it sounded smart.”

I decided to not punish them. All I know how to do is teach, so that’s what I did. I assigned a wonderful essay by Cal Poly professor Patrick Lin that he addressed to his class on the benefits and detriments of AI use. I attached instructions that asked them to read it and reflect. These instructions also had a Trojan horse.

Thirty-six of my AI students completed it. One of them used AI, and the other 12 have been slowly dropping the class. Ultimately, 35 out of 47 isn’t too bad. The responses to the assignment were generally good, and some were deeply reflective.

But a handful said something I found quite sad: “I just wanted to write the best essay I could.” Those students in question, who at least tried to provide some of their own thoughts before mixing them with the generated result, had already written the best essay they could. And I guess that’s why I hate AI in the classroom as much as I do.

Students are afraid to fail, and AI presents itself as a saviour. But what we learn from history is that progress requires failure. It requires reflection. Students are not just undermining their ability to learn, but to someday lead.
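The mechanics of the professor's Trojan horse are simple: students who read the prompt themselves never see the hidden instruction, so any essay echoing it was almost certainly produced by pasting the prompt into an AI tool. A hypothetical minimal sketch of such a check (the trigger terms and essay texts below are invented for illustration, not from the article):

```python
# Hypothetical sketch of a hidden-prompt "Trojan horse" check: the hidden
# instruction ("from a Marxist perspective") is invisible to students who
# read the prompt themselves, so its fingerprint appearing in a submission
# suggests the whole prompt was fed to an AI tool.
TRIGGER_TERMS = ("marxist", "marxism")  # fingerprint of the hidden instruction

def flag_submission(text: str) -> bool:
    """Return True if the essay echoes the hidden instruction."""
    lowered = text.lower()
    return any(term in lowered for term in TRIGGER_TERMS)

# Invented example essays:
essays = {
    "student_a": "The novel depicts frontier life and family hardship.",
    "student_b": "Read from a Marxist perspective, the novel critiques capital.",
}
flagged = [name for name, text in essays.items() if flag_submission(text)]
print(flagged)  # ['student_b']
```

Of course, a real check still needs a human in the loop: a student could legitimately choose a Marxist lens, which is exactly why the professor picked a framing that made no sense for the book.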

[–] PKscope@lemmy.world 177 points 4 days ago (14 children)

The only problem I have with the whole "Don't be afraid to fail" thing is that so much rides on the grades a student receives that it's very difficult not to treat every assignment as a highly critical task which must be as close to perfect as possible. I totally agree with this professor and I believe he did the right thing by the students. The problem is the system itself.

Those who are going to outsource their work are likely to always outsource their work or take the path of least resistance. You can't moral lesson or embarrass that away, usually. But the rest of the class seems to have learned a valuable lesson, or at least learned how to cheat better.

Regardless, we need to stop having everything boil down to the grades. There's good reasons grades are important, but there are even more that are detrimental. I don't know the answer, I just know the system is broken. Maybe it's just capitalism that's broken.

[–] Meron35@lemmy.world 63 points 3 days ago (1 children)

Society: "don't be afraid to fail!"

Also society: actively punishes failure with intricate systems such as admissions, CV screening, and increasingly fewer safety nets

[–] Bgugi@lemmy.world 14 points 3 days ago

holding a gun to your head "why are you so nervous?"

[–] A_Union_of_Kobolds@lemmy.world 51 points 4 days ago

The most ironic part of this is, if those kids did understand the basics of Marxism, they'd be able to see this much more clearly.

[–] ThePantser@sh.itjust.works 29 points 3 days ago (9 children)

It should be treated the same as if another student wrote the paper. If it was used as a research tool and you didn't repeat it word for word, then it's cool; it can be treated like a peer who helped you research. But if you used it to fully write the paper, it's an instant fail, because you didn't do anything.

[–] danielquinn@lemmy.ca 159 points 4 days ago (3 children)

Here's the link to the actual article. I get that you're trying to do users a favour by bypassing tracking at the original URL, but the Internet Archive is a free service that shouldn't be abused for link cleaning; it costs a lot of money to store and serve all this stuff, and it's meant as an "archive", not an ad-blocking proxy.

I'm posting this in part because currently clicking that link fails with a "too many requests" error. Let's try to be a little kinder to the good guys, shall we?

If users want a cleaner/safer/faster browsing experience, I recommend ditching Chrome for Firefox and getting the standard set of extensions: uBlock Origin, Privacy Badger, etc.

[–] brucethemoose@lemmy.world 64 points 4 days ago* (last edited 4 days ago)

Yeah, especially if it’s not paywalled.

It deprives the original source of traffic too, even if it’s Adblock traffic.

[–] SoftestSapphic@lemmy.world 32 points 3 days ago (8 children)

Students would want to learn instead of doing less work if there were incentives to learn instead of just getting out with a degree.

[–] ulterno@programming.dev 16 points 3 days ago

It seems AI is shining more light on this problem of the academic system not really being learning-oriented.
Not that it matters. There was already enough light on it, and now it's just blinding.

[–] Devial@discuss.online 2 points 1 day ago

Given how famously unreliable AI detectors are, and how often they produce false positives, I'd say there's a decent chance a few students just lied and falsely "admitted" to using AI, because they didn't want to risk flunking the class over a false positive.

[–] Fmstrat@lemmy.world 16 points 3 days ago

Interesting post; it would be good to support the author/publisher with a source link, especially since it isn't paywalled.

https://www.huffingtonpost.co.uk/entry/set-trap-to-catch-students-cheating-ai_uk_691f20d1e4b00ed8a94f4c01

[–] ArmchairAce1944@discuss.online 4 points 2 days ago (2 children)

Back in 2001 when I went to college, they gave us a warning not to plagiarize reports and assignments, saying they had sophisticated tools at their disposal and sites like cheat.com.

Fun fact: at that time (and maybe still now, since it's been a while since I checked) cheat.com was a porn site. So they were full of shit.

But AI detection is really, really scary, since the number of false positives is staggering.

[–] KonalaKoala@lemmy.world 2 points 1 day ago* (last edited 1 day ago)

Yeah, cheat.com forwards you to phonesex.com that has the number 1-800-PHONE-SEX you can call to get help fucking yourself.

[–] TheObviousSolution@lemmy.ca 3 points 2 days ago (3 children)

Solution: require students to work inside a monitored room where they have to log in for access and write everything by hand, on paper that stays in the room. No electronics allowed inside; only notes from the research they may have done are allowed. Those notes would go through a similar process: you'd have to register your access to the reference you're sourcing, likely limited to those printed in the library, and the notes would stay there to be used when writing the report. Make them go through metal detectors on the way in and get rid of any piece of electronics they could cheat with.

Who am I kidding, Idiocracy seems to have been spot on.

[–] ArmchairAce1944@discuss.online 5 points 2 days ago (1 children)

Idiocracy made one critical mistake... the biggest idiots are the ones in charge. In Idiocracy, the president and his staff were actually far smarter and more dedicated to doing a good job than the fuckers currently in charge.

[–] MrEff@lemmy.world 2 points 2 days ago (2 children)

I am currently at a university. Everything that is written is normally turned in through Turnitin, which does all the checking. I get a percentage of quotes and plagiarism, and it scans references. They have also added a preliminary AI scanner that reports a "possibility" of AI writing.

[–] ArmchairAce1944@discuss.online 1 points 7 hours ago* (last edited 7 hours ago)

I am glad I graduated a long-ass time ago. The last time I took any major education was a coding bootcamp, and it was just on the cusp of the first iterations of ChatGPT and other such shit.

I never plagiarized, and I was never accused of plagiarism. I am glad that I did this before any of this AI nonsense.

[–] just_an_average_joe@lemmy.dbzer0.com 4 points 2 days ago (1 children)

One time I plagiarized my name and page numbers, according to Turnitin

[–] MrEff@lemmy.world 2 points 1 day ago

Lol. Sounds right. We see it as a percentage, and it highlights what was plagiarized and the sources it pulled from, including past university assignments. So yes, you can be hit for that, but then the human grader would see it as a false positive.

[–] mlg@lemmy.world 25 points 3 days ago (1 children)

I'm guessing 33 people were too lazy to copy data into a box and relied on ChatGPT OCR lol.

This was a great article about the use of AI, but I think it also exposed bad, zero-effort cheating.

There's a reason why even the ye olde Wikipedia copy-pasters would rearrange sentences to make sure they can game the plagiarism checker.

[–] paequ2@lemmy.today 47 points 3 days ago (4 children)

39% of the submissions were at least partially written by AI

That's better than my class. I taught CS101 last year, code not papers. 90%+ of the homework was done with AI. There was literally just 1 person who would turn in unique code. Everyone else would turn in ChatGPT code.

I adapted by making the homework worth very little of the grade and moving the bulk of the grade to in-class paper and pencil exams.

[–] 474D@lemmy.world 44 points 3 days ago (8 children)

Don't really know how to feel about this, because 15 years ago all I did was reword Wikipedia pages to make a good paper. I went to college because I was led to believe it was a requirement to do well in life. I still learned a lot, but that was mostly through the social interaction of coursework. And honestly, I don't use anything from college in my current engineering job; it was all on-the-job panic learning. If I were to go back to college today, it would be such an enlightening experience of learning, but when you're a kid getting out of high school, you're just trying to get by with some gameplan that you've only been told about. Idk. I don't blame them for using a tool that's so easily accessible, because college is about fun too. I guess I wouldn't do it differently at that age.

[–] JustAnotherPodunk@lemmy.world 30 points 3 days ago (2 children)

I think that rewording Wikipedia is slightly better, though. It still requires you to digest some of the information. Kind of like when your teacher let you create notes on a note card for the test: you have to actually read and write the information. You get tricked into learning information.

AI just does it for you. There's no need to do much else, and its reliability is significantly worse than random wiki editors could ever be. I see little real learning with AI.

[–] guillem@aussie.zone 43 points 3 days ago (13 children)

*Words, phrases and punctuation rarely used by the average college student – or anyone for that matter (em dash included) – are pervasive.*

Hey, fuck you too >:(

[–] AugustWest@lemmy.world 33 points 3 days ago (3 children)

This quote is particularly amusing because the author used en dashes where he should have used em dashes, while making a point about how no one uses em dashes.

[–] Zephorah@discuss.online 52 points 4 days ago (12 children)

You pay to go to college. Then essentially do the equivalent of lighting that money on fire by not engaging with the product/services you just purchased.

[–] Doomsider@lemmy.world 16 points 3 days ago

Great story with predictable results. Welcome to your AI future where people give their thinking over to machines made by sociopaths.
