anyone using any kind of AI either doesn't know how consent works, or they don't care about it.
a horrifying development in the intersection of technofascism and rape culture
Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they're here encouraging the next generation of misogynist scum by defending this shit, too.
And men (pretend to) wonder why we distrust them.
Ngl, I'm only leaving reply notifs on for this one to work on my blocklist.
Yeah there’s some nasty shit here. Big yikes, Lemmy.
Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.
I would categorise it as sexual harassment, not abuse. Still serious, but a different level.
Schools generally mean underage individuals are involved, which makes any such content depicting them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.
Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?
Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.
If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.
This is different because it's easier. It's not really different just because it can be more realistic, because it was never about being realistic; otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that fakes can be realistic may even help blunt the impact when real nudes leak.
I’m sure the laws will focus on protecting IP - specifically that of AI companies or megacorps, the famous and powerful, but not the small creators of content or the rabble negatively affected by AI abuse.
The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court, if you can even afford a lawyer to do so, and then be offered a judgement that probably won't be paid or won't cover the damage done by an image that can never be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.
So is this a way to take away rights by making it about kids?
I mean, what the fuck. We did much less and got punished, right? It didn't matter if we were on the property. Schools can hold students accountable for conduct with other students.
The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn't need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.
The problem is surely with the interaction between parents and schools. Or maybe it's just the old way of thinking. Maybe it's better to have police and courts start taking over discipline in schools.
I don't understand fully how this technology works, but, if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.
not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like "photograph"+"person"+"small"+"pose" and generate plausible material due to the fact that all of those concepts have features in common.
you can also use small add-on models (LoRAs) trained on very little data (tens to hundreds of images, as compared to millions to billions for a full model) to "steer" the output of a model towards a particular style.
you can make even a fully legal model output illegal data.
all that being said, the base dataset that most of the stable diffusion family of models started out with around 2021 (LAION, scraped from the public web) could very well have bad shit in there. it's billions of images so it's hard to check, and even back with stable diffusion 1.0 there were only a few bits of final-model data per image in the training set.
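to make the "small add-on model" point concrete, here's a minimal sketch of how such an adapter gets applied, assuming the Hugging Face diffusers library; the model id and the LoRA filename are illustrative placeholders, not anything specific:

```python
# minimal sketch, assuming the hugging face diffusers library is installed.
# "style_lora.safetensors" is a hypothetical add-on file trained on a small
# image set; it only nudges the big base model toward a particular style.
import torch
from diffusers import StableDiffusionPipeline

# full base model, trained on billions of web-scraped images
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# tiny adapter (megabytes, not gigabytes) steering the output style
pipe.load_lora_weights("./style_lora.safetensors")

# the prompt just composes concepts the base model already knows
image = pipe("a photograph of a person posing", num_inference_steps=30).images[0]
image.save("out.png")
```

the point being: the adapter and the prompt recombine what the base model already represents, so nothing about the output proves any particular image was in the training data.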
This is mostly about swapping faces. You take a video and a photo of someone's face. Software can replace the face of someone in the video with that face. That's been around for a decade or so. There are other ways of doing it.
When the face belongs to an underage individual, and the video is pornographic...
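For a sense of the moving parts, here's a minimal sketch of just the face-localisation stage that swap tools build on, assuming OpenCV is installed; the input filename is a placeholder, and no actual swapping happens here, only detection:

```python
# minimal sketch of the face-detection step only, using OpenCV's
# bundled Haar cascade; swap tools do the replacement downstream.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("input.mp4")  # placeholder filename
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # each (x, y, w, h) box is a region a swapper would warp a face into
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```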
LLMs only do text.
Aren't there already laws against making child porn?
I'd rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.
Alas, whether there's a law against that specific use case or not, it is somewhat difficult to police what people do in their homes without a third-party whistleblower. Making more impossible-to-enforce laws for this specific case does not seem that useful.