You can't... generate... abuse.
No more than you can generate murder.
The entire point of saying "child abuse images" is to distinguish evidence of rape from, just, drawings.
If you want drawings of this to also be illegal, fine, great, say that. But stop letting people use the language of actual real-world molestation of living human children when describing some shit a guy made up alone.
How did they train the model? I'd say it's just as problematic if the generator was trained using CSAM.
Theoretically, you should be able to generate it by cobbling together legal images.
But given the massive volume of scraped data, they've also ended up with actual CSAM in their training data. I recall seeing articles about them having to identify and remove it, but I'm definitely not adding that search to my history.
I'm not even talking about the accidentally scraped images. People retrain models to make porn that more accurately depicts their fetish all the time.