This post was submitted on 19 Nov 2025
73 points (98.7% liked)

privacy


Big tech and governments are monitoring and recording your online activities. c/Privacy provides tips and tricks to protect your privacy against global surveillance.


cross-posted from: https://piefed.zip/c/privacy/p/717792/massive-leak-shows-erotic-chatbot-users-turned-womens-yearbook-pictures-into-ai-porn

Chatbot roleplay and image generator platform SecretDesires.ai left cloud storage containers holding nearly two million images and videos exposed, including photos and full names of women from social media, at their workplaces, graduating from universities, taking selfies on vacation, and more.
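For context on what "exposed cloud storage containers" usually means in reporting like this: storage that can be listed and downloaded without any authentication. The sketch below is purely illustrative and assumes an Azure-style Blob Storage container with anonymous "container" level public access enabled; the account and container names are invented for the example and have nothing to do with SecretDesires.ai.

```python
# Minimal sketch: checking whether a blob container allows anonymous listing.
# Account and container names are hypothetical, not the actual leaked storage.
import requests
import xml.etree.ElementTree as ET

ACCOUNT = "exampleaccount"   # hypothetical storage account
CONTAINER = "user-uploads"   # hypothetical container name

# Azure Blob Storage list-blobs endpoint; an unauthenticated request succeeds
# only if the container's public access level permits anonymous list/read.
url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}?restype=container&comp=list"

resp = requests.get(url, timeout=10)
if resp.status_code == 200:
    # A 200 with an XML blob listing means the container is publicly enumerable.
    root = ET.fromstring(resp.content)
    names = [blob.findtext("Name") for blob in root.iter("Blob")]
    print(f"Publicly listable: {len(names)} blobs, e.g. {names[:5]}")
else:
    print(f"Not publicly listable (HTTP {resp.status_code})")
```

A 200 response with an XML listing is the kind of misconfiguration that lets anyone enumerate and download everything in the container.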

top 7 comments
[–] Flickerby@lemmy.zip 3 points 1 hour ago

So this is really gross, obviously, but the question of whether it should be illegal, and if so where the line is, is going to be an interesting one moving forward. Is getting an AI to draw a naked picture of someone for you illegal? Commissioning an artist to draw a naked picture? What if it's just an original character who "happens to look like x person"? Learning to draw and making one yourself? Does it hinge on dissemination vs personal use? If you make a nude picture of someone else but no one ever knows, does it even matter? What if you have legal rights to their image? Would not want to be a lawyer involved in that field in the future, oof.

[–] itkovian@lemmy.world 6 points 4 hours ago

Social media and Big Tech in general have been really awful for privacy. This is just disgusting.

[–] florge@feddit.uk 30 points 10 hours ago

'AI girls never say no' is pretty awful

[–] panda_abyss@lemmy.ca 34 points 11 hours ago (1 children)

This is pretty gross and not at all surprising.

[–] lepinkainen@lemmy.world 5 points 4 hours ago

Half of the requests on /r/grok are “yes but how do I do this to pictures I uploaded??”

NaziGPT will do pretty NSFW images (no genitalia) for stuff it created itself, but even Elmo isn’t stupid enough to allow that for user uploads.

[–] Binzy_Boi@piefed.ca 15 points 10 hours ago* (last edited 10 hours ago)

This just reminds me of the thing Google pulled not long ago, asking users to submit pictures of their younger selves so it could generate an image of their adult self at the baseball game with them.

I immediately thought it'd be stupid to do that, since people are gonna work their ass off to bypass AI guardrails in any case, including generating child porn, and lo and behold, a few days later I find a tweet of someone calling out another person for creating AI-generated porn of their own daughter.

[–] swordgeek@lemmy.ca 9 points 11 hours ago

Should be fined $1000 for each image or identifiable piece of information about a human being.