this post was submitted on 21 Oct 2025
43 points (95.7% liked)

Philosophy


For more than a decade, researchers have wondered whether artificial intelligence could help predict what incapacitated patients might want when doctors must make life-or-death decisions on their behalf.

It remains one of the most high-stakes questions in health care AI today. But as AI improves, some experts increasingly see it as inevitable that digital “clones” of patients could one day aid family members, doctors, and ethics boards in making end-of-life decisions that are aligned with a patient’s values and goals.

Ars spoke with experts conducting or closely monitoring this research who confirmed that no hospital has yet deployed so-called “AI surrogates.” But AI researcher Muhammad Aurangzeb Ahmad is aiming to change that, taking the first steps toward piloting AI surrogates at a US medical facility.

top 21 comments
[–] L0rdMathias@sh.itjust.works 14 points 2 months ago (1 children)

No, for the same reason identical twins can't make that decision. They are not the same entity.

[–] foggy@lemmy.world 3 points 2 months ago (1 children)

Ah, but what if you give it your power of attorney?

If you're thinking ahead enough to set up a POA, then you're thinking ahead enough to write an advance directive. Putting your own wishes in writing is a better solution to this issue in every way.

[–] Feyd@programming.dev 12 points 2 months ago

"Digital clones" are not even close. Current "ai" is just fancy auto complete. This is simply yet another sensationalist article that exists to make people think "ai" is more than it is.

[–] ethaver@kbin.earth 12 points 2 months ago (1 children)

JUST TALK TO YOUR LOVED ONES ABOUT HOW YOU WANT TO DIE. It shouldn't be this taboo, we literally all do it!

[–] shalafi@lemmy.world 4 points 2 months ago* (last edited 2 months ago) (1 children)

Rounded up some EOL docs, filled in the blanks, tried to show my wife. She wouldn't hear it. Emailed her a plain text doc with everything she needs to know to get into my tech, insurance, money and EOL docs. Bet money she can't find it.

Guy I knew had just divorced his wife, told me he didn't have life insurance, "That bitch isn't getting any money!" "Larry, you have 3 small children and you're middle-aged."

Almost every doctor has signed a DNR. That kinda says something, don't it?!

Speaking of, I need to get this house and everything else in a trust so my wife and kids don't lose half of it in probate court. Mom went through that when dad died, what a fucking nightmare. She came out with plenty, but still, must have lost shitloads.

I used something like this for basic planning:

https://www.nia.nih.gov/health/advance-care-planning/getting-your-affairs-order-checklist-documents-prepare-future

In any case, every source agrees on what's important.

[–] ethaver@kbin.earth 4 points 2 months ago

I'm an RN and I'm a DNR. I'm psych now, but when I was in school I sat 1:1 with patients who for whatever reason couldn't cooperate with care (suicide watch, dementia, delirium, etc.). I think if you're going to use your POA powers to force your loved one to receive care they don't want, you should have to stand at the foot of the bed and watch and listen while we do it. If you can't stomach them screaming, then you shouldn't have the right to sign off on it.

[–] baines@lemmy.cafe 8 points 2 months ago

wtf is this stupid article

[–] moonluna@lemmy.world 7 points 2 months ago (1 children)

Easy answer: no. An A.I. clone... how dystopian are people trying to take this?

[–] Tollana1234567@lemmy.today 1 points 2 months ago (1 children)

people are willing to have relationships with AIs.

[–] moonluna@lemmy.world -1 points 2 months ago

Yeah, I know about those people. They should be reduced to A.I. because, come on, get with a human.

[–] Mk23simp@lemmy.blahaj.zone 6 points 2 months ago

Absolutely not.

[–] teft@piefed.social 5 points 2 months ago

I don’t trust AI to give me a cookie recipe without adding rat poison or glue, so why would I trust one to make a decision like that?

[–] henfredemars@infosec.pub 4 points 2 months ago

If I prompt inject my AI clone, is it closer or further away from expressing my opinions on the matter?

[–] cerebralhawks@lemmy.dbzer0.com 4 points 2 months ago (1 children)

Sure, if it has power of attorney, which is a dumb thing to grant AI.

Before even considering that, I’d need to know it has my best interests at heart. I don’t trust AI to serve me over the corporations that made it, but I’m a bit older. Younger generations trust AI more. I don’t think that’s great, but it’s their decision. It probably couldn’t really be me, though.

My other question is, if there’s a chance it would go against my wishes, is it really a copy of me? Shouldn’t a copy of me do what I want? That’s the real test, I think.

I’d kind of want to meet this AI. But even given all the advancements in AI, how can it be a perfect copy if it doesn’t have all my memories and know all my secrets?

[–] voracitude@lemmy.world 8 points 2 months ago

It's not possible to grant Power of Attorney to anyone or anything who doesn't meet the standard to sign a contract, e.g. they must be of sound mind. An LLM doesn't have a mind to be sound of in the first place, so it can't meet the standard.

[–] HubertManne@piefed.social 3 points 2 months ago

Maybe if my wife and all my brothers and sisters and nieces and nephews are gone. I don't trust its judgement over theirs.

[–] reactionality@lemmy.sdf.org 3 points 2 months ago

If it's a perfect clone, it and I should both agree, whether the answer is yes or no. So introducing AI really doesn't add anything here.

[–] 4am@lemmy.zip 3 points 2 months ago

No, for the same reason you should never teleport.

[–] Grimy@lemmy.world 3 points 2 months ago

I trust the AI to make the same decision I would (let me die).

[–] Kolanaki@pawb.social 2 points 2 months ago* (last edited 2 months ago)

Hell no. It's literally the same as tossing a coin or asking a relative.