this post was submitted on 25 Jan 2026
37 points (89.4% liked)

Futurology

4112 readers
75 users here now

founded 2 years ago
all 12 comments
[–] eleijeep@piefed.social 24 points 1 month ago (2 children)

Every day, someone in a position of power tries ChatGPT for the first time and goes "Holy shit! The computer is actually talking to me! This is the biggest thing since the invention of the telegraph!"

Then they start writing memos and press releases without actually spending the other 60 minutes with it that it took the rest of us to realize "oh, it's actually just full of shit."

[–] WoodScientist@lemmy.world 5 points 1 month ago (1 children)

The surest sign that someone's job needs to be deleted is if they feel their job can be done by AI. If your work can be done by an LLM, you're simply not doing work that's worth doing.

[–] wolframhydroxide@sh.itjust.works 4 points 1 month ago* (last edited 1 month ago)

I disagree. If anyone genuinely thinks that their own job can be done by an LLM, either:

A) that job should be streamlined out of existence

B) they have a fundamental misunderstanding of what an LLM is capable of achieving, for instance because their job involves meaningless drudgery and they don't know enough about LLMs to realise that LLMs don't even have the capacity for thought needed to consistently follow simple heuristics without threatening nuclear annihilation.

[–] MotoAsh@piefed.social 4 points 1 month ago

ELIZA effect in full swing.

[–] yakko@feddit.uk 24 points 1 month ago

Imagine being retirement age and being in a position of power to issue big press releases like this. I bet it feels really great.

[–] SW42@lemmy.world 4 points 1 month ago (2 children)

Yeah nah. The tech is pretty impressive, but it can't replace many more entry-level jobs than pre-AI technology already could. Treat it like a tool and find use cases that make sense. I'd like to see small, efficient, specialized local models that help with basic or repetitive stuff.

[–] Windex007@lemmy.world 8 points 1 month ago* (last edited 1 month ago) (1 children)

It's so bizarre working in software development for a non-tech company.

Management is like "can you use it to automate X?" And my answer is almost always "No. It will do an unreliable job of that. But if you want X automated just TELL me that's what you want and I can seriously automate it for you in a day or two by just writing a tool"

Nope.

It blows my mind how much Management doesn't give two fucking shits about the RESULT. They ONLY want to be able to tell shareholders that something was accomplished USING AI.

Oh, and for what it's worth... since I've been at this company, I've had the same question asked of:

  • The Blockchain

  • The Metaverse

Getting questions that are solutions in search of a problem has been a harbinger of a hype train heading for a derailment.

[–] SW42@lemmy.world 2 points 1 month ago

I feel you. Can’t wait for AI to go the way of the blockchain :)

[–] 42Firehawk@lemmy.zip 1 points 1 month ago (1 children)

My work deals with parts that get damaged in shipping pretty often, and every shipper wants things formatted differently, with different asinine conventions for how descriptions should read, so it's a pain to describe damage in a way that doesn't turn into an email chain... So our IS team trained a model to take in 3-4 images of part damage plus the shipper it's from, and it generates everything for us to review and send. Saves a bunch of busywork for exactly that.

[–] SW42@lemmy.world 2 points 1 month ago

That’s a good use case! As long as it doesn’t hallucinate something, and you can always add automated validation steps to catch that.
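The validation step mentioned above can be sketched roughly like this: before a generated damage report goes out, check it against the receiving shipper's required fields and flag anything missing for human review. Everything here is hypothetical, the shipper names, field lists, and patterns are made up for illustration and are not from the original workflow.

```python
import re

# Hypothetical per-shipper requirements: each shipper's claim text
# must mention certain fields before it can be sent.
REQUIRED_FIELDS = {
    "AcmeFreight": ["tracking number", "part number", "damage type"],
    "GlobalShip": ["reference id", "part number", "photo count"],
}

# Loose patterns that detect whether a field appears in the draft.
FIELD_PATTERNS = {
    "tracking number": re.compile(r"\btracking\s*(number|#)\s*[:\-]?\s*\w+", re.I),
    "reference id": re.compile(r"\breference\s*id\s*[:\-]?\s*\w+", re.I),
    "part number": re.compile(r"\bpart\s*(number|#)\s*[:\-]?\s*[\w\-]+", re.I),
    "damage type": re.compile(r"\bdamage\s*type\s*[:\-]?\s*\w+", re.I),
    "photo count": re.compile(r"\bphoto\s*count\s*[:\-]?\s*\d+", re.I),
}

def validate_claim(shipper: str, generated_text: str) -> list:
    """Return the list of missing required fields; empty means OK to send."""
    missing = []
    for field in REQUIRED_FIELDS.get(shipper, []):
        if not FIELD_PATTERNS[field].search(generated_text):
            missing.append(field)
    return missing

draft = "Tracking number: 1Z999\nPart number: X-42\nDamage type: crushed corner"
# Empty result -> forward for review and send; non-empty -> flag for rewrite.
print(validate_claim("AcmeFreight", draft))
```

This kind of deterministic post-check is a common way to contain hallucination risk: the model drafts, but a dumb, auditable rule decides whether the draft is even eligible to leave the building.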

[–] Mika@piefed.ca 1 points 1 month ago

There'll be no shortage of job offers in the trenches, with how things are going.