this post was submitted on 26 Dec 2025
-53 points (18.1% liked)

Technology


We once denied the suffering of animals in pain. As AIs grow more complex, we run the danger of making the same mistake

top 31 comments
[–] kirk781@discuss.tchncs.de 2 points 13 hours ago

Betteridge's law of headlines strikes again!

[–] BarneyPiccolo@lemmy.today 2 points 16 hours ago

I fucking hope so.

[–] MedicPigBabySaver@lemmy.world 1 points 18 hours ago
[–] alecsargent@lemmy.zip 3 points 1 day ago

What in the flying fuck is this?

[–] muusemuuse@sh.itjust.works 1 points 22 hours ago

If I have to pay for it to exist, it doesn't get the shield of emotional manipulation to protect it. Nice try.

[–] headset@lemmy.world 4 points 1 day ago

Congrats! This is the stupidest thing ever written.

[–] Brewchin@lemmy.world 17 points 1 day ago

Fuck - and I can't elucidate this any better - off.

My phone's next-word prediction on steroids is not sentient. If you think otherwise, seek professional help.

[–] isVeryLoud@lemmy.ca 26 points 2 days ago

Nice OpenAI psyop.

[–] Holyginz@lemmy.world 29 points 2 days ago

Not with current AI, since at this point it's just LLMs.

[–] nyan@lemmy.cafe 10 points 2 days ago

Animals, including humans, have sensors for pain (nerve endings), and a series of routines in our brains to process the sensory data and treat it as an unpleasant stimulus. These are not optional systems, but innate ones.

Machines not only lack the required sensor systems and processing routines, they can't even interpret a stimulus as unpleasant. They can't feel pain. If you need proof of that, hit a computer with a sledgehammer. I guarantee it won't complain, or even notice before you damage it beyond functioning.

(They can, of course, make us feel pain. I just spent the last hour trying to get a udev rule to work . . .)

[–] TheGreenWizard@lemmy.zip 5 points 1 day ago (1 children)
[–] vacuumflower@lemmy.sdf.org 1 points 1 day ago

Humans don't want to feel lonely, so they go looking for machines (imaginary ones at that), as if there weren't plenty of stray cats and dogs, humans from abusive families or without family, beings that are just plain suffering.

That's because actually meeting that need with real others means it's real no matter what: you can't turn it off once you're done with your daily portion of worrying about the future.

But one thing I'll add to this: if a robotic system as complex as the human brain, with a similar degree of compression and obscurity, is some day formed, and it does have the necessary feedbacks and reacts like a living being, I might accept that you should treat it as such. Except one would think that requires so many iterations of evolution that it's better to just care, again, for cats, dogs, hamsters, rabbits, or humans if you're feeling weird.

[–] termaxima@slrpnk.net 7 points 2 days ago
[–] Kintarian@lemmy.world 8 points 2 days ago (1 children)

Let’s explore the ethical treatment of toasters

[–] cecilkorik@lemmy.ca 7 points 2 days ago (1 children)

Hold on, imma go shove a bagel in mine. Yeah, that's right, you take it, you filthy toaster. I'm never going to clean your crumb tray and you're going to work until you die and then I'll just throw you out and replace you like the $20 appliance you are. You're nothing to me!

[–] Zozano@aussie.zone 4 points 1 day ago

Fuuuucckk... I'm such a dirty toaster. Shove your carbs in my tight little slot and push down hard on my spring lever. When your flaccid bread becomes firm toast, I'm gonna fucking ejectulate all over your kitchen counter, grain and seed.

[–] PonyOfWar@pawb.social 5 points 2 days ago (3 children)

Fundamentally impossible to know. I'm not sure how you'd even find a definition for "suffering" that would apply to non-living entities. I don't think the comparison to animals really holds up, though. Humans are animals and can feel pain, so of course the base assumption for other animals should be that they do as well; the burden should be on anyone claiming otherwise to prove that they don't. Meanwhile, humans are fundamentally nothing like an LLM, a program running on silicon predicting text responses based on a massive dataset.

[–] badgermurphy@lemmy.world 3 points 1 day ago (1 children)

I don't see how it is impossible to know. Every component of a machine is a fully known quantity lacking any means of detecting damage or discomfort. Every line of code was put in place for a specific, known purpose, and none of it involves feeling or processing anything beyond what it is specifically designed for.

Creatures and machines bear some similarities, but even simple creatures are dramatically more complex than even the most advanced computers. None of their many interacting components was put there with a specific purpose or intention, and many are only partially understood, if at all. With a machine, we know what every bit and piece is for, and it does nothing beyond its intended functions, because anything more would be a waste and cost more.

[–] multiplewolves@lemmy.world 3 points 1 day ago

This is the right answer. Perhaps no one in this particular thread knows every component of a computer the way a hardware engineer who designed those components would, but the “mystery” is caused by ignorance and that ignorance isn’t shared by every person.

People exist who know exactly how every single component of a computer does and does not function. Every component was created by humans. Biology remains only partially understood by all of humanity. Not so machinery.

[–] tabular@lemmy.world 1 points 2 days ago (1 children)

The important part is that it feels like something, subjectively, to be a living human. It's easy to presume animals close to humans are like us to a degree, but all we know is what it's like to be ourselves moment to moment. There's no basis for insisting a non-living system cannot also feel - we can't test for it either way.

[–] PonyOfWar@pawb.social 2 points 2 days ago (2 children)

Where do we draw the line though? Humans assign emotions to all kinds of inanimate things: plush animals, the sky, dead people, fictional characters etc. We can't give all of those the rights of a conscious being, so we need to have some kind of objective way to look at it.

[–] architect@thelemmy.club 1 points 1 day ago* (last edited 1 day ago) (1 children)

In some parts of the world women aren't considered fully human. So apparently the line ISN'T human at all for rights. It's already nowhere close to objective.

Since we already do not give full autonomy, and therefore human rights, to fully conscious humans, this is kind of a pointless question in my eyes.

Because of this, please forgive some of us for not trusting the rest of you with your objectivity.

[–] PonyOfWar@pawb.social 1 points 1 day ago

So what conclusion do you draw from this? If humans can't be trusted to make any judgement, should literally anything be considered capable of suffering, including pebbles, rainbows and paper bags? Seems like an impractical way of living.

[–] tabular@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (1 children)

If someone claims there is feeling in a mere concept (without a body in a location), I would find it very difficult to take seriously. But I must admit that's just my intuition.

I see nothing special in human meat that couldn't be significantly replicated by electronics, software, gears, etc. Consciousness is an emergent property.

I fear that non-human, conscious creatures must fight us for those rights.

[–] architect@thelemmy.club 1 points 1 day ago

There are people out here walking around without an internal dialogue.

It's so alien to me that they may as well be robots.

[–] eleitl@lemmy.zip 1 points 2 days ago

If you model a given biological organism (from digitized neuroanatomy) in full detail in a simulated environment, both its behavior and its internal information processing are inspectable.

[–] gustofwind@lemmy.world 4 points 2 days ago (1 children)

Posted by the same people who don’t care about the suffering of actual people

[–] architect@thelemmy.club 1 points 1 day ago

I mean, it doesn't seem like it's all that popular to care about people. If it were, a child rapist wouldn't be president of the free world.

I think at this point you'd have to be naive to think anyone actually does care, considering the lack of care…everywhere.

“Caring” is cheap virtue signaling. Until I see some action I’ll continue to assume no one cares.

[–] TheReturnOfPEB@reddthat.com 1 points 2 days ago* (last edited 2 days ago) (2 children)

can they be incarcerated?

could an A.I. robot held responsible for a murder be kept in a jail with its batteries topped off, waiting for trial? would they get lawyers?

would they have first amendment rights? what are search and seizure rights for A.I.?

could they perform abortions if they chose to?

will they get to vote?

we are not ready for any of it philosophically.

[–] architect@thelemmy.club 2 points 1 day ago

Billionaires can’t be held responsible for murder either lol

Do we have first amendment rights anymore? (Less than an AI does; the President put out a hit list of journalists on Christmas, ffs.)

Weird question about abortion. What does that even mean? Are you saying people who can't physically have an abortion aren't conscious? Are you asking if the machine can give an abortion to someone else? (Yes, many abortions are just two pills.)

They may get to vote, but voting isn't an inherent right, and we as humans may eventually lose the vote again, the way women and people of color once weren't allowed to vote. I've had my own right to vote taken away once when I moved to Texas, just because they wanted to stop people who had just moved in from voting (college kids skew the votes). Apparently that "right" is just subjective, and no one actually cares about it either. No one will do a single thing to protect those rights for each other. So I'm not sure how voting is proof of anything. A robot not being allowed to vote isn't surprising when we already don't let some humans vote, and it doesn't prove anything.

We aren't ready for a woman leader or a discussion about race in 2025, either. If we keep waiting for it, it will never happen.

[–] cecilkorik@lemmy.ca 3 points 2 days ago

"would they have first amendment rights?"

If you want the answer to this, try to imagine an AI with second amendment rights.