this post was submitted on 30 Jan 2026
Stuff and Such
Human !== Humane
And given how the models are trained, they will always sound happy to talk, patient, and understanding. A machine doesn't have rough mornings or get emotional (current models are incapable of feeling emotions at all), so it will always reproduce the same behaviour.
On the other side you have humans who actively suppress their empathy when it goes against the demands of their job (I don't know how it is in China, but where I live, doctors are expected to sound professional and authoritative, and as a result come across as detached).
It's not hard to imagine that a person in a weakened state would prefer talking to a condescending but appeasing robot over a cold, detached human.
And on a tangent, this is why LLMs are so effective and so widely used: they are made specifically to cater to humans' need for acceptance and approval...

Given the massive rise of cases like the one in the article, it's high time we start analysing and improving how we handle human interaction in our society, especially in delicate situations like patient care (but really everywhere). You cannot delegate human interaction to a machine, because the root problem is that you need humans to behave like machines to keep your society going...