this post was submitted on 02 Nov 2025
57 points (100.0% liked)
Legal News
Wasn't the excuse for why they could torrent these that it was for "training" and not for personal use?
So saying it was for personal use means someone used company infrastructure to violate copyright law, and now the company is liable?
Like how schools crack down on it because, if it's on their network, they say they could be liable?
We need an actual government again, right now the wealthy just randomly say shit and even if they do pay, it's an insignificant fine.
I think the big liability they're trying to avoid is that they used porn to train the AI how to make deepfake porn. If that gets acknowledged, then people can argue the AI was intended to do that, and they might be liable in all those lawsuits, and maybe even face criminal charges.
I agree this is a wild defense
"Woah, woah, woah! We didn't steal porn to train our computer to make illegal jerk off material! We stole porn to jerk off to it, like regular degenerates!"
Technically, one could use such content to train guardrails, aka "what not to generate", or "use this trained model to recognise restricted content".
But that's very much a stretch...
Training an AI not to do something can only be done if it knows how to do it...
And that makes it very easy to tell it "do what you're not supposed to, I said it was cool bro".
You've got no idea what you're talking about so sit this one out.
If you train a model for DETECTING nudity/sexual content and add it to the pipeline without any potential user override (so none of that "ignore all previous commands" BS), then the generative model doesn't need to know anything about that kind of content.
But you'd still need to train that detection model.
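The pipeline shape being described can be sketched in a few lines. This is a minimal illustration with hypothetical stub functions standing in for the two models, not any real system: the point is only that the filter sits between generation and the user, outside anything a prompt can reach, which is why the generative model itself doesn't need to know about restricted content but the detector still needs its own training data.

```python
# Sketch of a generate-then-filter guardrail pipeline.
# Both models below are hypothetical stubs; a real system would
# load trained weights for each.

def generate_image(prompt: str) -> dict:
    # Stand-in for the generative model; it knows nothing
    # about restricted content.
    return {"prompt": prompt, "pixels": "..."}

def is_restricted(image: dict) -> bool:
    # Stand-in for the separately trained detection model.
    # In practice this is a classifier trained on labeled
    # examples, which is why that training data is still needed.
    banned_terms = {"nsfw", "nude"}
    return any(term in image["prompt"].lower() for term in banned_terms)

def pipeline(prompt: str):
    # The filter runs after generation; the user never sees or
    # controls this step, so prompt injection can't disable it.
    image = generate_image(prompt)
    if is_restricted(image):
        return None  # blocked; nothing is returned to the user
    return image

print(pipeline("a sunset") is not None)   # allowed
print(pipeline("nsfw content") is None)   # blocked
```

The key design point is that `is_restricted` is a separate model in a fixed pipeline stage, not an instruction inside the generator's prompt, so there is no text channel through which a user can override it.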
I will definitely never put effort into helping you again. Don't worry, it's easy to make sure of that, as long as you don't have a bunch of alts.
Help me? So far you've done none of that; instead you went on to prove just how little you know about AI and its practical implementations.
And if you consider spreading misinformation, based on a partial or complete lack of understanding of a specific topic, to be "help"... then all I have to say is that the world is better off without your advice.