this post was submitted on 18 Apr 2026
123 points (96.9% liked)

Fuck AI


TL;DR: this doctor started using an AI tool to transcribe consultations with patients instead of writing up his own notes on each patient's symptoms, progression, concerns, etc. And it worked very well for that specific purpose.

Unfortunately, this doctor realized, the mental work of writing up his own notes was vital to actually remembering his patients and understanding their needs.

> I sat down to review a patient I had seen six weeks previously. I read the note. It was accurate. It was comprehensive. It contained no factual errors that I could identify. And I did not recognise it.
>
> The voice in the note was not my voice. The emphasis was not my emphasis. The clinical narrative — the selective, interpretive story that a GP constructs to capture not just what was said but what mattered, what worried them, what they decided to watch and why — was absent. In its place was a faithful transcription of everything spoken, organised by structure rather than by clinical significance.
>
> I had not written that note. An algorithm had written it, in response to sounds it had heard, and I had approved it, hurriedly, at the end of a session. Now, six weeks later, I was reading someone else’s account of a consultation I had conducted — and I couldn’t recall the patient clearly enough to reconstruct what had been left out.
>
> This is not a small thing. The clinical note in general practice is not merely a medicolegal record. It is, as research in the Journal of General Internal Medicine has articulated, a form of narrative medicine — a clinician-authored story that reflects how the physician understood the patient’s situation at that moment in time. The act of writing it is itself a cognitive process: it forces synthesis, prioritisation, and reflection. It is, in a real sense, how we think.
>
> When we outsource that act to an AI, we are not merely saving time. We are externalising a cognitive function that was doing clinical work we didn’t realise it was doing.

As SpaceNoodle reminded me in the comments, people in the tech industry have been warning for years that outsourcing basic coding tasks to LLM tools deprives new coders of the experience they need to take on more complex work.

I wonder how many other professions will discover that the "grunt work" (like interview transcription, the stuff simple enough to be outsourced to an AI) plays a role they hadn't recognized until they outsourced it.

top 14 comments
[–] leadore@lemmy.world 8 points 8 hours ago

> I read the note. It was accurate. It was comprehensive. It contained no factual errors that I could identify. And I did not recognise it. Now, six weeks later, I was reading someone else’s account of a consultation I had conducted — and I couldn’t recall the patient clearly enough to reconstruct what had been left out.

If you couldn't recall enough to reconstruct what had been left out, then how do you know it was accurate and comprehensive? hmm? HMMMM?

What a nightmare.

[–] DevDave@piefed.social 5 points 10 hours ago

When I was younger, I worked exclusively on a VT220 (best of the worst) Unix terminal, using vi to write programs in C/C++. Yeah, I got a little fancy near the end of that with three separate terminals, but the point is I had also memorized an absolutely ridiculous amount of information, starting with the syntax rules for four somewhat related languages (bash, C, C++, and whatever the fuck Makefiles are), plus hundreds if not thousands of functions in C and C++, without needing to consult documentation or rely on autocomplete to fill in the blanks.

30+ years later, I have a command-palette tool I absolutely depend on, along with degraded typing skills that show whenever I have to type something out completely like a peasant.

[–] merc@sh.itjust.works 14 points 13 hours ago (2 children)

This is an interesting story because:

  • The AI transcription was perfectly accurate, even with medical jargon
  • Not having to take his own notes allowed him to spend more time with his patients, and to listen to them more closely
  • He felt less stressed and less burned out as a result
  • It wasn't de-skilling him, at least not in the way we traditionally think of it

It's basically a best-case scenario for LLMs, and it still made things worse. Taking notes felt like a tedious thing that kept him from doing his job. But he discovered that taking notes was part of his job, and that if he didn't do it he couldn't properly care for his patients.

Maybe once he realizes why it was failing him, he'll be able to adjust his process so that he can take advantage of the machine learning system. It might be as simple as looking over the results immediately after the consultation and scribbling things in the margins so he doesn't forget the key takeaways. Or maybe the old note-taking process is simply the best one, and the LLM can't offer anything to actually help.

[–] Taleya@aussie.zone 13 points 12 hours ago

Transcribing your own notes helps data retention: you're revisiting the data, clarifying and rewording it, which reinforces it in your memory.

[–] youcantreadthis@quokk.au 3 points 13 hours ago (1 children)

Maybe we dhoukdnt be outtong this bullshit where it doesjt bel9ng, which is anywhere?

[–] ___@lemmy.blahaj.zone 4 points 9 hours ago (1 children)
[–] youcantreadthis@quokk.au 7 points 9 hours ago (1 children)

When i typed it, it was about literacy, but thr real incomprehrnsibility was the drugs we did along the way.

[–] victorz@lemmy.world 1 points 3 hours ago

Looks like it's slowly wearing off. 👍

[–] SpaceNoodle@lemmy.world 33 points 16 hours ago (2 children)

The amount I learn every time I code up some mundane tool may at times be small, but the cumulative effect is significant. It's surely how I've reached my current point in my career, and that's how it shall remain when I leave it.

Surrendering the last of our cognition to a tool has dire consequences.

[–] Saprophyte@lemmy.world 7 points 10 hours ago (1 children)

My sister is a teacher. She explains that her students don't need to know the information in their papers; they need to know how to research, cite, and write the papers themselves. She says she feels sorry for the students who just spit out a paper with an LLM and don't care about the skills they're letting atrophy and will eventually lose: the ability to think critically, do research, and then summarize that information to convey thoughts and ideas to others.

What happens when a generation of people lose that capability?

[–] SpaceNoodle@lemmy.world 7 points 10 hours ago

More sheep for the wolves.

[–] homes@piefed.world 10 points 16 hours ago

Practice makes perfect. But if you don’t keep practicing, how are you supposed to get any better?

[–] tiredofsametab@fedia.io 7 points 12 hours ago

Medical transcription has been a thing for years, so I find it interesting that I don't recall hearing about this concern until now, with AI. I guess it wasn't as popular a topic.

What he says, however, is a reason I don't like using AI for writing code or design work; I want to learn, know, and remember every part of the systems I have ownership in. Having AI do everything removes that.

[–] FederatedFreedom1981@lemmy.ca 15 points 15 hours ago

It's almost like you need to get dirty in order to learn how to dig a hole.