this post was submitted on 07 Jan 2026
268 points (98.9% liked)

Technology


Importantly, this took deepfake undressing from a tiny niche to a huge phenomenon: harassment of women with this method is no longer rare or exceptional, but pervasive.

top 29 comments
[–] riskable@programming.dev 94 points 2 weeks ago (7 children)

The real problem here is that Xitter isn't supposed to be a porn site (even though it's hosted loads of porn since before Musk bought it). They basically deeply integrated a porn generator into their very publicly-accessible "short text posts" website. Anyone can ask it to generate porn inside of any post and it'll happily do so.

It's like showing up at Walmart and seeing everyone naked (and many fucking), all over the store. That's not why you're there (though: Why TF are you still using that shithole of a site‽).

The solution is simple: Everyone everywhere needs to classify Xitter as a porn site. It'll get blocked by businesses and schools and the world will be a better place.

[–] givesomefucks@lemmy.world 17 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

The solution is simple: Everyone everywhere needs to classify Xitter as a porn site

I think a large part of its popularity has become the porn, because it passes all those filters. Especially since Musk-backed conservatives are blocking porn in red states but, as far as I know, never Twitter.

Treat it like a porn site and lots of Republicans would need to hand over their ID to prove they're old enough. They can't VPN around it either, because social media sites hate VPNs.

[–] other_cat@piefed.zip 1 points 2 weeks ago (1 children)

I know this isn't the intent, but oh god, I never considered that blocking porn sites would open an avenue for a bloated billionaire to try to capitalize on it for himself.

[–] foggenbooty@lemmy.world 3 points 2 weeks ago (1 children)

Every new law changes the rules of the game. Every new rule presents an opportunity to those who can circumvent it.

The wealthy have always succeeded by gaming the system.

[–] Quexotic@infosec.pub 1 points 2 weeks ago

They had already succeeded because they had the workaround figured out when their friends made the rule.

[–] judgyweevil@feddit.it 13 points 2 weeks ago

I bet that many are simply ignorant of this new problem

[–] wltr@discuss.tchncs.de 5 points 2 weeks ago

I wonder, just another rename, X → XXX, would do well, wouldn’t it?

[–] db2@lemmy.world 5 points 2 weeks ago

It's like showing up at Walmart and seeing everyone naked (and many fucking), all over the store.

🤢🤮

[–] Lemming6969@lemmy.world 3 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

The real problem is that we ever gave a shit about human bodies, especially fake ones.

[–] riskable@programming.dev 2 points 2 weeks ago

I don't know how to tell you this but... Every body gives a shit. We're born shitters.

[–] a_non_monotonic_function@lemmy.world 2 points 2 weeks ago (1 children)

I thought the real problem was that it's generating *illegal* porn.

[–] riskable@programming.dev 2 points 2 weeks ago (1 children)

Well, the CSAM stuff is unforgivable but I seriously doubt even the soulless demon that is Elon Musk wants his AI tool generating that. I'm sure they're working on it (it's actually a hard computer science sort of problem because the tool is supposed to generate what the user asks for and there's always going to be an infinite number of ways to trick it since LLMs aren't actually intelligent).

Porn itself is not illegal.

He has 100% control over the ability to alter or pull this product. If he leaves it up while it's generating illegal pornography, that's on him.

And no s*** I'm concerned about the illegal stuff.

[–] mjr@infosec.pub 1 points 2 weeks ago (1 children)

(though: Why TF are you still using that shithole of a site‽).

Maybe some places don't have alternative suppliers than Walmart? Similarly, some places have governments that still only use the porno social network for some services.

[–] silence7@slrpnk.net 17 points 2 weeks ago (2 children)

Why the &#**### is California putting Amber Alerts on a porn site?

[–] athatet@lemmy.zip 7 points 2 weeks ago

Bastard? Idk what other swear has that many letters.

[–] riskable@programming.dev 3 points 2 weeks ago

I don't know, man... Have you even seen Amber? It might be worth an alert 🤷

[–] givesomefucks@lemmy.world 45 points 2 weeks ago (3 children)

But that doesn’t leave many options for the victims. Maddie, who said she’s a 23-year-old pre-med student, woke up on New Year’s Day to an image that horrified her. On X, she had previously published a picture of herself with her boyfriend at a local bar, which two strangers altered using Grok.

I've thought a lot of things would kill twitter...

But if every time a woman posts a picture of herself, and neckbeards reply asking twitter AI to sexualize her, and the AI responds right there with it where everyone following the original account can see...

I truly don't understand how or why any women are still using it.

[–] U7826391786239@lemmy.zip 23 points 2 weeks ago

I truly don’t understand how or why any women are still using it.

FOMO, along with addiction to fake likes from fake friends for fake validation of their fake lives is a powerful mind control technique

how many years have people been saying "DELETE TWITTER ALREADY"

and now that they're being turned into porn, they still won't

[–] Gsus4@mander.xyz 5 points 2 weeks ago

Nobody with a face should use it anymore...but that will reduce traffic like...5%...

[–] CosmoNova@lemmy.world 0 points 2 weeks ago (1 children)

How is this gigantic website even legal and still online? In the civilized world, I mean.

[–] givesomefucks@lemmy.world 1 points 2 weeks ago (1 children)

It all starts with a little bill called the Telecommunications Act of 1996...

And yeah, loads of people said it would lead to this shit.

But Silicon Valley gave a shit ton of money to the Clintons for the Internet part, and telecoms did the same for the part doing away with monopoly regulations.

[–] CosmoNova@lemmy.world 1 points 2 weeks ago

You see I specified civilized countries to exclude the USA.

[–] fuzzywombat@lemmy.world 20 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

I'm pretty sure if anyone else created a website that generated CSAM, they'd be in jail by now. Just because it's Elon Musk doing it, authorities are fine with it? At one time, law enforcement would at least make an effort to put up a facade of a justice system that's equal for everyone. That's how you keep the masses from rising up and dethroning the status quo. Anyone remember Martha Stewart going to jail for insider trading?

Aren't there Apple App Store and Google Play Store policies that say this isn't allowed? How come the app is still available on those mobile platforms? What are EU regulators doing? No fines? Nothing?

We've basically reached a point where billionaires are publicly mocking and daring the rest of us to react. Do these accelerationist billionaires really think they'll come out ahead when the masses burn everything to the ground?

[–] architect@thelemmy.club 2 points 2 weeks ago* (last edited 2 weeks ago)

Yes they do. They have been threatening us with their murder robots for a decade. Who do you think they are going to use those on?

They view most everyone as takers.

[–] REDACTED@infosec.pub -5 points 2 weeks ago

I actually refuse to believe you can simply undress people with Grok; more likely it's easily jailbroken, which conveniently makes this not the company's problem, since the service was essentially "cracked" and used outside its Terms of Service. Still, this IS a problem.

[–] Gsus4@mander.xyz 18 points 2 weeks ago

Ban twitter 🇧🇷🤝 🇪🇺

[–] No1@aussie.zone 4 points 2 weeks ago

I misread the title and thought it meant thousands of Musk undressed images per hour.

The horror!

[–] DreamMachine@lemmy.world -5 points 2 weeks ago

Scrolled @grok undress and bikini for a bit; most of it is girls jumping on the trend asking it to change their own photos, plus humor posts.