this post was submitted on 04 Apr 2026
191 points (95.7% liked)

Ask Lemmy


Lemmy, I really would like to hear your opinions on this. I am bipolar. After almost a decade of being misdiagnosed and on medication that made my manic symptoms worse, I found stable employment with good insurance and have been able to find a good psychiatrist. I've been consistently medicated for the past 3 years, and this is the most stable I have been in my entire life.

The office has rolled out an app called MYIO. My knee-jerk reaction was to not be happy about it, but I managed my emotions, took a breath, and vowed to give it a chance. After I was sent the link to validate my account, the app would force-restart my phone at the last step of activation. (I have my phone locked down pretty tight, with a lot of Google services and data sharing disabled, so I'm thinking that might be the cause. My phone is also 4-5 years old, so that could be it too.)

Luckily I was able to complete the steps on my PC and activate that way. Once I was in the account there were standard forms to sign, like the HIPAA release. There was also a form requesting I consent to the use of AI. Hell to the NO. That's a no for me dawg.jpg.

I'm really emotional and not thinking rationally. I am hoping for the opinions of cooler heads.

If my doctor refuses to keep me as a patient unless I consent to AI, what should I do? What would you do? Refuse, even though I'd be giving up a provider I have a rapport with, who knows me well enough to know when my meds need adjusting? Or consent, even though this is a major line in the sand for me?

EDIT: This is the text of the AI agreement:

As part of their ongoing commitment to provide the best possible service, your provider has opted to use an artificial intelligence note-taking tool that assists in generating clinical documentation based on your sessions. This allows for more time and focus to be spent on our interactions instead of taking time to jot down notes or trying to remember all the important details. A temporary recording and transcript or summary of the conversation may be created and used to generate the clinical note for that session. Your provider then reviews the content of that note to ensure its accuracy and completeness. After the note has been created, the recording and transcript are automatically deleted.

This artificial intelligence tool prioritizes the privacy and confidentiality of your personal health information. Your session information is strictly used for the purpose of your ongoing medical care. Your information is subject to strict data privacy regulations and is always secured and encrypted. Stringent business associate agreements ensure data privacy and HIPAA compliance.

[–] slazer2au@lemmy.world 82 points 2 days ago (4 children)

I would nope the fuck out and change doctors. A regurgitation machine prone to hallucinations has no place in medical care.

[–] Gathorall@lemmy.world 3 points 1 day ago

Yeah, though that's about 4/5 of the actual people I've met working in psychology.

[–] Tollana1234567@lemmy.today 5 points 1 day ago

I would probably report him and leave him a bad Yelp review, warning others.

[–] oneser@lemmy.zip -4 points 2 days ago (3 children)

If this were for a GP, I would agree with this stance. But a good-fitting, competent mental health professional can be much harder to find.

[–] applebusch@lemmy.blahaj.zone 8 points 1 day ago (1 children)

That's the last fucking profession that should be using LLMs... People can already gaslight themselves with chatbots for free, without paying a trusted professional to reinforce that bullshit.

[–] oneser@lemmy.zip 0 points 1 day ago* (last edited 1 day ago)

OP didn't state this clearly, but I went and looked. The app is not for replacing consults, only billing etc., so I'd put it in the "annoying, but not world-ending" category.

[–] Zos_Kia@jlai.lu 2 points 1 day ago (2 children)

By god, they're going to make OP change doctors just because they hate "le stochastic parrot". And OP is probably in the US, which makes the whole thing even crueller.

Literally a horde of teenagers playing with a bipolar person's head because they have big feelings about stuff.

And all this for a fucking note-taking app. Jesus Christ. Yeah, sure, OP is probably risking their mental health in the process, but who gives a shit about that when you have an occasion to proclaim that le AI bad.

[–] Washedupcynic@lemmy.ca 1 points 22 hours ago

I am concerned about what is done with the data generated via the saved recording and transcription. Yes, I live in the USA. Our government is currently kidnapping people off the street and disappearing them for being brown. They are attempting to build databases identifying trans people. So yeah, I'm concerned that the third party my doctor is using, MYIO, could sell the data/transcripts, and before I know it I end up on a government list and disappeared because I am gay. Could the theft of the data generated by this app lead to identity theft? MYIO says the videos aren't stored long term and everything is encrypted; but companies lie, and the monetary penalties are just rolled into the cost of doing business. This isn't a note-taking app; there are already plenty of transcribers on the market. This is something entirely different.

I've already had my identity stolen and credit cards opened in my name.

And no one is going to "MAKE" me change doctors. That's something I decide for myself.

[–] WhyJiffie@sh.itjust.works 3 points 1 day ago (1 children)

you seem to have no clue about the problem at hand. the AI transcriber hallucinating is the lesser of the issues. the worse problem, which is irreversible, is that the treatment session and every private detail that gets discussed is funneled to at best questionable companies who will do whatever they want with your private information. once that has happened, you can't just make them delete what they stored in the process; it is completely unverifiable what they do beyond offering the original service. everything that was said in the session will not stay between the two of you.
accepting this unknowingly is very dangerous. accepting it knowingly will alter what you say, and the results with it, like going to a therapist you know personally, which is not allowed for very good reasons.

[–] Zos_Kia@jlai.lu 1 points 1 day ago (1 children)

You think therapists and doctors in general don't use Docs or Notes services that are hosted or backed up in the cloud? You think having your medical data leaked to tech companies is new? Just because the note-transcription app is AI doesn't make it magically worse. In fact it makes the data harder to access, as you'd need to re-infer the whole enchilada to mine it (as opposed to, say, Google Drive, which can just run a SQL query on your data and get it structured and ready to use).

It's nice that mental health is so inconsequential to you that you can balance it against privacy purity politics. It's really cool for you that you're in this position of privilege. It's not cool to be pushing someone with a clinical condition in a way that will probably leave them worse off, in a country with absolutely no mental health safety net. Just like antivax, it's coated in fake concern, but you're playing a dangerous game with someone else's life, and you're fine with it because you're insulated from the consequences.

You guys really are a pure product of those amoral hyper-individualistic times.

[–] WhyJiffie@sh.itjust.works 1 points 6 hours ago* (last edited 6 hours ago)

It's nice that mental health is so inconsequential to you that you can balance it against privacy purity politics.

oh now I'm a privacy purist! oh god what have I become! I want totally unreasonable things!!

or, it seems you don't care about privacy at all by default, because surely who needs it, and you've also already forgotten the case of women in the USA whose online period-tracker apps outed them for having an illegal abortion.

Just like antivax it's coated in fake concern,

fake concern, sure... my concerns are very real, and OP came here for advice, asking, among other things, what the consequences could be. well, this is one of the consequences there will be.

You guys really are a pure product of those amoral hyper-individualistic times.

yes, blame me, not the system that made this situation. don't you want to call the cops on me?

[–] phoenixarise@lemmy.world -2 points 1 day ago* (last edited 1 day ago)

I don’t believe that. They just don’t want to pay people what they’re worth. Machines don’t ask for days off or health insurance; that’s their rationale. I hope they go out of business.