Wasn't the excuse for why they could torrent these that it was for "training," and not for personal use?
So saying it was for personal use means someone used company infrastructure to violate copyright law, and now the company is liable?
Like how schools crack down on it, because if it happens on their network they say they could be held liable?
We need an actual government again; right now the wealthy just randomly say shit, and even when they do pay, it's an insignificant fine.
I think the big liability they're trying to avoid is that they used porn to train the AI on how to make deepfake porn. If that gets acknowledged, then people can argue the AI was intended to do that, and they could be on the hook for all those lawsuits and maybe even criminal charges.