
Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[–] danciestlobster@lemmy.zip 7 points 2 days ago (4 children)

I don't fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

[–] wewbull@feddit.uk 1 points 8 hours ago

You know how, when you look at a picture of someone and cover up the clothed bits, they look naked? Your brain fills in the gaps with what it knows of general human anatomy.

It's like that.
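Concretely, that "filling in the gaps" is what image inpainting does: you mask part of a picture and a model paints in plausible content learned from images in general. A rough sketch with the Hugging Face diffusers library (the model ID and file names are placeholders, not a recommendation):

```python
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

# placeholder model ID; any inpainting-capable checkpoint works the same way
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

image = Image.open("photo.png").convert("RGB")  # assumed input files
mask = Image.open("mask.png").convert("L")      # white = region to repaint

# the model invents plausible content for the masked region, guided by
# the text prompt and everything it has learned about anatomy and scenes
result = pipe(prompt="a person standing on a beach",
              image=image, mask_image=mask).images[0]
result.save("out.png")
```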

[–] kayzeekayzee@lemmy.blahaj.zone 9 points 1 day ago (1 children)

I agree with the other comments, but I wanted to add an explanation of how deepfakes work, to show how simple they are and how much less data they need than LLMs.

Step 1: You take a bunch of photos and videos of a specific person and blur their face out in every frame.

Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all those faces. Now you have a neural net that's really good at turning blurry faces into that particular person's face.

Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how to produce, often with shockingly realistic results (see the sketch below).
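Here's roughly what steps 2 and 3 look like in code. This is a toy PyTorch sketch, not a real pipeline: the tiny network, the blur settings, and the training data are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torchvision.transforms.functional as TF

class UnblurNet(nn.Module):
    """Tiny conv net: blurred face in, sharp face out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

model = UnblurNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

# `faces` stands in for a batch of aligned face crops of ONE person
# (step 1), shape (B, 3, H, W), values in [0, 1].
def train_step(faces: torch.Tensor) -> float:
    blurred = TF.gaussian_blur(faces, kernel_size=21)  # destroy the detail
    opt.zero_grad()
    loss = loss_fn(model(blurred), faces)  # learn to restore THAT face
    loss.backward()
    opt.step()
    return loss.item()

# Step 3 at inference time: blur a *different* person's face region and run
# it through the network -- it can only produce the face it was trained on.
```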

[–] gkpy@feddit.org 2 points 1 day ago* (last edited 1 day ago) (3 children)

Cheers for the explanation, had no idea that's how it works.

So it's even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake would then have to have access to CP if they want to deepfake it!

[–] some_guy@lemmy.sdf.org 6 points 1 day ago

There are adults whose bodies resemble underage people's, and those could be used to train models. Kitty Yung has a body that would qualify. You don't necessarily need to use illegal material in training to get illegal output.

[–] swelter_spark@reddthat.com 3 points 1 day ago

AI can generate images of things that don't even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.

[–] Vinstaal0@feddit.nl 2 points 1 day ago

You can probably do it with adult material and swap in the faces. It will most likely work with models specifically trained on the person you selected.

People have also put dots on someone's clothing to trick the brain into thinking they're naked; with a good enough model you could probably fill those dotted regions in with the matching body parts.

[–] lime@feddit.nu 9 points 1 day ago* (last edited 1 day ago)

not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like "photograph"+"person"+"small"+"pose", and generate plausible material because those concepts share visual features.

you can also use small add-on models trained on very little data (tens to hundreds of images, as compared to millions to billions for a full model) to "steer" the output of a model towards a particular style.
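a rough diffusers sketch of both points (the model id and the lora path are placeholder assumptions, not real weights):

```python
import torch
from diffusers import StableDiffusionPipeline

# placeholder base model id
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# concepts the model only ever saw separately get composed into one image
image = pipe("photograph of an astronaut riding a horse").images[0]

# a small add-on (LoRA) trained on a handful of images steers every
# output toward a particular style; the path is hypothetical
pipe.load_lora_weights("path/to/style_lora")
styled = pipe("photograph of an astronaut riding a horse").images[0]
```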

you can make even a fully legal model output illegal data.

all that being said, the base dataset that most of the stable diffusion family of models started out with (LAION, scraped from the open web starting around 2021) is billions of images, so there could very well be bad shit in there; researchers did later find links to CSAM in LAION-5B. it's far too big to fully check by hand, and even back with stable diffusion 1.0 the final model ended up with only a few bits of weight per image in the data.
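back-of-the-envelope, with rough assumed numbers:

```python
# ~1B parameters for an SD 1.x model in fp16, trained on the
# ~2.3B-image LAION-2B-en subset -- both numbers are approximations
params = 1.0e9
bits = params * 16      # fp16 weights
images = 2.3e9
print(bits / images)    # ~7 bits of model weight per training image
```

at a few bits per image, the model can't be storing copies of its training data wholesale; it mostly keeps shared statistical features.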

[–] General_Effort@lemmy.world 3 points 1 day ago

This is mostly about swapping faces. You take a video and a photo of someone's face. Software can replace the face of someone in the video with that face. That's been around for a decade or so. There are other ways of doing it.

When the face belongs to an underage individual, and the video is pornographic...

LLMs only do text.