this post was submitted on 30 Jan 2026
Stuff and Such
[–] artwork@lemmy.world 5 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

What nonsense, sorry...

A human is already human, regardless of how they speak or what they do...
There is always true sympathy and empathy to be found in a person, however busy, dark, or mistaken they may be...
That is the truly alive soul inside every person...

Medics may well be tired, doing their job every single day, yet they have empathy and will always try to listen if you actually make the effort...
They know what PAIN, AGONY, DEATH, SORROW... what fear means...

An LLM/"AI" will always only pretend to be human, since it is "trained"/designed to appear so, and it will always be limited and incomplete...
Not to mention that its training data, drawn from countless genuinely empathetic humans, is a fixed, limited memory that no one is accountable for.

What a hopeless sorrow that trendy, heavily advertised, mind-atrophying mess is...
Source
Source

[–] zecg@lemmy.world 2 points 3 weeks ago* (last edited 3 weeks ago)

Don't be sorry, it's not my text and not my mother. This is just a place where I dump things I find interesting and want to remember, not an endorsement of what I post. That said, I think the author's point is this: from the perspective of an old lady who spends her entire day travelling to see a specialist physician, only to get ten seconds and a prescription without being heard or practically even seen, an LLM genuinely is the more humane experience.

[–] TheWonderfool@lemmy.world 0 points 3 weeks ago

Human !== Humane

And given how the models are trained, they will always sound happy to talk, patient, and understanding. A machine will not have a rough morning or feel emotional (currently they are incapable of feeling emotions at all), so it will always reproduce the same agreeable behaviour.

On the other hand, you have humans who actively suppress their empathy when it conflicts with the demands of their job (I don't know how it is in China, but where I live doctors are expected to sound professional and authoritative, and as a result come across as detached).

It's not hard to imagine that a person in a weakened state would prefer speaking with a condescending, appeasing robot over a cold and detached person.

And on a tangent, this is why LLMs are so effective and so widely used: they are built specifically to cater to the human need for acceptance and approval... Given the massive rise of cases like the one in the article, it's high time we started analysing and improving how we handle human interaction in our society, especially in delicate situations like patient care (though really everywhere). You cannot delegate human interaction to a machine when the reason it's needed is that you require humans to behave like machines to keep your society going...