this post was submitted on 13 May 2026
818 points (99.4% liked)
Linux
Would be lovely if my Government would even consider that.
I hate using Win11, but it seems we're so entrenched with Microslop that they're even giving "officially endorsed" courses on how to use Co-Pilot.
I understand that AI and Neural Networks have their uses, but why are people so willing to give up their ability to think and write for themselves??
Time is the main reason. In jobs where you write dozens of client-facing emails every day, small time savings compound fast.
Most people working in Outlook all day are doing exactly this kind of work: responding to clients, coordinating projects, clarifying requests, following up, documenting decisions, and managing constant communication.
Instead of writing every email from scratch, I can give AI instructions like:
“Read the email chain. The client needs X, Y, and Z. Write a draft reply in my voice.”
That takes seconds instead of several minutes per email. Across an entire workday, that can save hours.
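To make the workflow above concrete, here's a minimal sketch in Python of the "instruction" step: assembling a reusable drafting prompt from the email chain and the points the reply must cover. The function name and structure here are illustrative, not any specific tool's API; the resulting string would be pasted into (or sent to) whatever assistant you use, and the draft it returns still gets human review before sending.

```python
def draft_prompt(email_chain: str, needs: list[str],
                 tone: str = "concise and friendly") -> str:
    """Build an instruction prompt for an AI email-drafting assistant.

    This only assembles text locally; it makes no network calls.
    """
    # Turn each required point into a bullet the assistant must address.
    bullet_points = "\n".join(f"- {n}" for n in needs)
    return (
        "Read the email chain below and write a draft reply in my voice "
        f"({tone}). The reply must address:\n{bullet_points}\n\n"
        f"--- EMAIL CHAIN ---\n{email_chain}"
    )

# Example: the "client needs X, Y, and Z" case from above.
prompt = draft_prompt(
    email_chain="Client: Can you confirm the delivery date and final cost?",
    needs=["confirm delivery date", "state final cost",
           "offer a follow-up call"],
)
print(prompt)
```

Keeping the requirements as an explicit bullet list is the part that saves time: the chain and the needs change per email, while the instruction scaffold stays fixed.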
Fair enough, but if you're a manager, isn't that kind of the whole job? Communicating with people. If you're not doing that, what are you getting paid to do?
I can't imagine outsourcing the skill set you're being paid for to an AI tool is a great way to build up that skill. Sounds like humanity's typical great short-term idea with horrible long-term consequences.
I can write an amazing email word by word, or I can have my digital secretary draft it while I review, edit, and approve every part of it.
I don't send anything I haven't personally read and approved. The judgment, accountability, and intent are still mine.
You're absolutely right that outsourcing learning and critical thinking to AI would have serious long-term consequences. But using AI to accelerate execution after you've already developed those skills is different.
I'm paid for the experience and judgment to know what needs to be said, what matters, and what outcome the communication is supposed to achieve.
Again, fair enough - treating it as a draft-making machine isn't a terrible idea...
But I would argue that reviewing an existing draft, while a perfectly valid skill to have, is not the same skill as actually writing that draft.
I can say from plenty of experience writing and reviewing documentation that producing the first draft is always a much more demanding task than reviewing it and making corrections.
And while there's nothing wrong with making life a bit easier, maintaining skills is just as important as building them in the first place. If you want to keep your drafting skills sharp, you need to let yourself write some drafts too.
I mean, I have a microcosm of this myself. I used to be good at remembering phone numbers before I could store them all on a smartphone. Now, even if you put a gun to my head, I can really only remember my own. That's because I outsourced that part of my memory to my phone, just as most people have, without any attempt to reinforce it.
I’ve genuinely enjoyed this exchange. It’s rare to find someone willing to refine an argument instead of just defending a position. I appreciate that you’re actually thinking through the implications instead of reducing this to “AI good” or “AI bad.”
And honestly, I think we agree on more than we disagree.
I don’t think replacing human thought with AI is healthy. Your concern about skill atrophy is legitimate, and your point about drafting versus reviewing is stronger than many people realize. Creating a first draft exercises very different cognitive muscles than critiquing an existing one.
Where I think we differ slightly is that I see an important distinction between:
- outsourcing the thinking itself to AI, and
- using AI to accelerate execution after the thinking is already done.
To me, that distinction matters enormously.
Someone blindly accepting AI output without understanding it puts themselves in a dangerous intellectual position. But someone who already has strong writing, reasoning, and communication skills can use AI more like a junior assistant or drafting tool while still retaining judgment, accountability, and intent.
What concerns me more is exactly what you’re pointing at: competence itself is becoming rarer.
If people start outsourcing the very processes that develop critical thinking, writing ability, synthesis, and communication before those skills fully mature, then we could absolutely weaken society’s long-term cognitive resilience.
That should concern everyone, regardless of whether they’re optimistic or pessimistic about AI.