this post was submitted on 18 Apr 2026

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.


TL;DR: this doctor started using an AI tool to transcribe meetings with patients instead of writing up his own notes on each patient's symptoms, progression, concerns, etc. And it worked very well for that specific purpose.

Unfortunately, as this doctor came to realize, the mental work of writing up his own notes was vital to actually remembering his patients and understanding their needs.

From the article:

I sat down to review a patient I had seen six weeks previously. I read the note. It was accurate. It was comprehensive. It contained no factual errors that I could identify. And I did not recognise it.

The voice in the note was not my voice. The emphasis was not my emphasis. The clinical narrative — the selective, interpretive story that a GP constructs to capture not just what was said but what mattered, what worried them, what they decided to watch and why — was absent. In its place was a faithful transcription of everything spoken, organised by structure rather than by clinical significance.

I had not written that note. An algorithm had written it, in response to sounds it had heard, and I had approved it, hurriedly, at the end of a session. Now, six weeks later, I was reading someone else’s account of a consultation I had conducted — and I couldn’t recall the patient clearly enough to reconstruct what had been left out.

This is not a small thing. The clinical note in general practice is not merely a medicolegal record. It is, as research in the Journal of General Internal Medicine has articulated, a form of narrative medicine — a clinician-authored story that reflects how the physician understood the patient’s situation at that moment in time. The act of writing it is itself a cognitive process: it forces synthesis, prioritisation, and reflection. It is, in a real sense, how we think.

When we outsource that act to an AI, we are not merely saving time. We are externalising a cognitive function that was doing clinical work we didn’t realise it was doing.

As SpaceNoodle reminded me in the comments, people in the tech industry have been warning for years that outsourcing basic coding tasks to LLM tools deprives new coders of the experience they need to take on more complex tasks.

I wonder how many other professions will find that the "grunt work" - like interview transcription, the stuff simple enough to be outsourced to an AI - plays a role they hadn't recognized until they outsourced it.

Taleya@aussie.zone 13 points 14 hours ago

Transcribing your own notes helps retention. You're revisiting the material, clarifying and rewording it, which reinforces it in your memory.