this post was submitted on 04 Apr 2026
191 points (95.7% liked)

Ask Lemmy


Lemmy, I really would like to hear your opinions on this. I am bipolar. After almost a decade of being misdiagnosed and on medication that made my manic symptoms worse, I found stable employment with good insurance and have been able to find a good psychiatrist. I've been consistently medicated for the past 3 years, and this is the most stable I have been in my entire life.

The office has rolled out an app called MYIO. My knee-jerk reaction was to not be happy about it, but I managed my emotions, took a breath, and vowed to give it a chance. After being sent the link to validate my account, the app would force-restart my phone at the last step of activation. (I have my phone locked down pretty tight, with lots of Google stuff and data sharing disabled, so I'm thinking that might be the cause. My phone is also 4-5 years old, so that could also be it.)

Luckily I was able to complete the steps on PC and activate that way. Once I was in the account there were standard forms to sign, like the HIPAA release. There was also a form requesting that I consent to the use of AI. Hell to the NO. That's a no for me dawg.jpg.

I'm really emotional and not thinking rationally. I am hoping for the opinions of cooler heads.

If my doctor refuses to let me be a patient if I don't consent to AI, what should I do? What would you do? Agree even though this is a major line in the sand for me, or consent to keep a provider I have a rapport with, who knows me well enough to know when my meds need adjusting?

EDIT: This is the text of the AI agreement. As part of their ongoing commitment to provide the best possible service, your provider has opted to use an artificial intelligence note-taking tool that assists in generating clinical documentation based on your sessions. This allows for more time and focus to be spent on our interactions instead of taking time to jot down notes or trying to remember all the important details. A temporary recording and transcript or summary of the conversation may be created and used to generate the clinical note for that session. Your provider then reviews the content of that note to ensure its accuracy and completeness. After the note has been created, the recording and transcript are automatically deleted.

This artificial intelligence tool prioritizes the privacy and confidentiality of your personal health information. Your session information is strictly used for the purpose of your ongoing medical care. Your information is subject to strict data privacy regulations and is always secured and encrypted. Stringent business associate agreements ensure data privacy and HIPAA compliance.

Edit 2: I just wanted to say that I appreciate everyone who commented. For the most part everyone brought up valid points and helped me see things I had not considered. I emailed my doctor and let them know I did not want to agree to the use of AI. I said I was cool with transcription software being used as long as it was installed locally on their machines, but that I did not want a third-party online app having access to recorded sessions for the purposes of transcription. They didn't take issue with it.

Thank you everyone!

[–] Washedupcynic@lemmy.ca 20 points 2 days ago (3 children)

I 100% agree with you. I trust my doctor. I don't trust the app. Prior to this we were using zoom.

[–] WhyJiffie@sh.itjust.works 2 points 1 day ago* (last edited 1 day ago) (1 children)

Honest question: was it no problem that Zoom was being used for the sessions? I'm asking because, from the post, you seem to care about your privacy.

[–] Washedupcynic@lemmy.ca 1 point 1 day ago (1 children)

The Zoom sessions weren't being recorded or analyzed by AI to create a transcript. I met with my doctor via Zoom, and the doctor took notes.

[–] WhyJiffie@sh.itjust.works 2 points 13 hours ago* (last edited 13 hours ago) (1 children)

I understand that. My point is that Zoom has access to the video and audio feed in transit. Despite being very popular, they repeatedly made false claims about their systems when they got big during COVID, including the claim that their calls were end-to-end encrypted, which they were not.

There are better alternatives, but unfortunately only a small fraction of people know about them.

To be clear, I support you in looking to preserve your privacy with this AI transcription. I just wanted to let you know that information was already leaking, even if it was baselessly assumed that it was not.

[–] ace_garp@lemmy.world 8 points 2 days ago

A video-conferencing call is generally one-to-one with the clinician you know and have a relationship with.

An AI app on your phone opens your data to being viewed and scrutinised by a third party, within the medical practice or outside it. (Which may be a positive, adding insights that a single person might miss.) Unless this is agreed to, it would be a breach of patient trust. It seems the agreement you click gives your permission to share your data anywhere that 'furthers treatment'.

It seems like massive overreach to install it on your phone instead of on the doctor's computer (where it could still summarise all interactions).

I would say you are right to want to move away from this kind of imposition. If you do change doctors, make sure to indicate that you will not install any apps as part of your treatment.

At the very least, I would install the app under a separate user profile rather than my main account.

[–] CultLeader4Hire@lemmy.world 6 points 2 days ago (1 children)

As a person who has strong bipolar tendencies, though not over the threshold for a diagnosis, even I struggle with these sorts of things and often find myself asking, "Is this thought not just self-sabotage at the end of the day?" I'm also physically disabled and go to a lot of doctors' appointments, and my doctors now use AI for notes. I don't like it either, but to let that stand in the way of my care would absolutely be self-sabotage. If my doctors started outsourcing other aspects of their jobs to AI, I would seriously have a problem and would reconsider my position. But note-taking is incredibly time-consuming for doctors, and if using software that transcribes our conversations allows them to be better at their actual job of being a doctor, that's a compromise I can make, especially when I remind myself that bipolar symptoms often get in the way of a person's willingness to compromise.

[–] TwilitSky@lemmy.world 6 points 2 days ago

That's the biggest concern.

People who need life-saving mental healthcare are already on a brave journey just by admitting they need help, and it would be a shame if AI crap got in the way of that.

People just don't think.