this post was submitted on 11 Dec 2025
561 points (96.4% liked)

Technology

[–] Devial@discuss.online 183 points 2 weeks ago* (last edited 2 weeks ago) (39 children)

The article headline is wildly misleading, bordering on being just a straight up lie.

Google didn't ban the developer for reporting the material; they didn't even know he had reported it, because he did so anonymously, and to a child protection org, not to Google.

Google's automated tools correctly flagged the CSAM when he unzipped the data, and his account was subsequently nuked.

Google's only failure here was not unbanning him on his first or second appeal. And while that is absolutely a big failure on Google's part, I find it very understandable that the appeals team generally won't accept "I didn't know the folder I uploaded contained CSAM" as a valid ban-appeal reason.

It's also kind of insane how this article somehow makes a bigger deal out of this developer being temporarily banned by Google than it does of the fact that hundreds of CSAM images were freely available online, openly shareable by anyone and to anyone, for god knows how long.

[–] forkDestroyer@infosec.pub 22 points 2 weeks ago (2 children)

I'm being a bit extra but...

Your statement:

The article headline is wildly misleading, bordering on being just a straight up lie.

The article headline:

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It

The general story in reference to the headline:

  • He found CSAM in a known AI dataset, which he had stored in his account.
  • Google banned him for having this data in his account.
  • The article mentions that he tripped the automated monitoring tools.

The article headline is accurate if you interpret it as

"A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It" ("it" being "csam").

The article headline is inaccurate if you interpret it as

"A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It" ("it" being "reporting csam").

I read it as the former, because the action of reporting isn't listed in the headline at all.

^___^

[–] Blubber28@lemmy.world 6 points 2 weeks ago (3 children)

This is correct. However, many websites/newspapers/magazines/etc. love to get more clicks with sensational headlines that are technically true but can easily be interpreted as something much more sinister/exciting. This headline is a great example. While you interpreted it correctly, or at least claim to, there will be many people who initially interpret it the second way you described. Me among them, admittedly. And the people deciding on the headlines are very much aware of that. Therefore the headline can absolutely be deemed misleading, for while it is a correct statement, there are less ambiguous ways to phrase it.

[–] MangoCats@feddit.it 17 points 2 weeks ago

Google’s only failure here was to not unban on his first or second appeal.

My experience of Google and the unban process: it doesn't exist, never works, and doesn't even escalate to a human evaluator in a third-world sweatshop; the algorithm simply ignores appeals, inscrutably.

[–] TheJesusaurus@sh.itjust.works 102 points 2 weeks ago (1 children)

Why confront the glaring issues with your "revolutionary" new toy when you could just suppress information instead?

[–] Kyrgizion@lemmy.world 23 points 2 weeks ago (1 children)

This was about sending a message: "stfu or suffer the consequences". Hence, people who subsequently encounter something similar will think twice about reporting anything.

[–] Devial@discuss.online 30 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

Did you even read the article? The dude reported it anonymously, to a child protection org, not Google, and his account was nuked as soon as he unzipped the data, because the content was automatically flagged.

Google didn't even know he had reported it, and Google has nothing whatsoever to do with this dataset. They didn't create it, and they don't own or host it.

[–] AngryishHumanoid@lemmynsfw.com 53 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

"Sign up for free access."

[–] floquant@lemmy.dbzer0.com 9 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Nooo I was liking 404 :/ ~~Sucks to see them enshittified too..~~

edit: that was too harsh, I take it back.

[–] StitchInTime@piefed.social 22 points 2 weeks ago

I think they've always been like this for some of their posts, and honestly I'm considering getting a paid subscription to support them. Sucks, but they've been putting out quality content in exchange for your email address and some metrics; I'd call it a fair trade.

[–] tja@sh.itjust.works 14 points 2 weeks ago (2 children)

They're doing it because of AI scrapers. But that's been the case for some time now.

[–] killea@lemmy.world 45 points 2 weeks ago (2 children)

So in a just world, Google would be heavily penalized not only for allowing CSAM on their servers, but also for violating their own ToS with a customer?

[–] shalafi@lemmy.world 19 points 2 weeks ago (3 children)

We really don't want that first part to be law.

Section 230 was enacted as part of the Communications Decency Act of 1996 and is a crucial piece of legislation that protects online service providers and users from being held liable for content created by third parties. It is often cited as a foundational law that has allowed the internet to flourish by enabling platforms to host user-generated content without the fear of legal repercussions for that content.

Though I'm not sure if that applies to scraping other servers' content. But I wouldn't say it's fair to expect the scraper to review everything. If we don't like that take, then we should make scraping illegal altogether, but I'm betting there are unwanted side effects to that.

While I agree with Section 230 in theory, in practice it is often only used to protect megacorps. For example, many Lemmy instances started getting spammed with CSAM after the Reddit API migration. It was very clearly some angry redditors trying to shut down instances in an attempt to keep people on Reddit.

But individual server owners were legitimately concerned that they could be held liable for the CSAM existing on their servers, even if they were not the ones who uploaded it. The concern was that Section 230 would be thrown out the window if the instance owners were just lone devs and not massive megacorps.

Especially since federation caused content to be cached whenever a user scrolled past another instance's posts. So even if they moderated their own server's content heavily (which wasn't even possible with the mod tools that existed at the time), there was still the risk that they'd end up caching CSAM from other instances. It led to a lot of instances moving from federation blacklists to whitelists instead. Basically, default to not federating with an instance unless that instance's owner takes the time to jump through some hoops and promises to moderate their own shit.

[–] abbiistabbii@lemmy.blahaj.zone 5 points 2 weeks ago (8 children)

This. Literally the only reason I could guess is that it's to teach AI to recognise child porn, but if that's the case, why is Google doing it instead of, like, the FBI?

[–] gustofwind@lemmy.world 6 points 2 weeks ago (1 children)

Who do you think the FBI would contract to do the work anyway 😬

Maybe not Google, but it would surely be some private company. Our government almost never does things itself; it hires the private sector.

[–] finitebanjo@lemmy.world 29 points 2 weeks ago (3 children)

My dumb ass sitting here confused for a solid minute thinking CSAM was in reference to a type of artillery.

[–] pigup@lemmy.world 15 points 2 weeks ago

Combined surface air munitions

[–] llama@lemmy.zip 4 points 2 weeks ago (1 children)

Right, I thought it was cybersecurity something-or-other, like API keys. Now DuckDuckGo probably thinks I'm a creep.

[–] cyberpunk007@lemmy.ca 25 points 2 weeks ago (2 children)

Child sexual abuse material.

Is it just me, or did anyone else already know what "CSAM" was?

[–] pipe01@programming.dev 13 points 2 weeks ago (4 children)

Yeah it's pretty common, unfortunately

[–] chronicledmonocle@lemmy.world 6 points 2 weeks ago

I had no idea what the acronym was. Guess I'm just sheltered or something.

[–] hummingbird@lemmy.world 19 points 2 weeks ago

It goes to show: developers shouldn't make their livelihood dependent on access to Google services.

[–] arararagi@ani.social 9 points 2 weeks ago

"stop noticing things" -Google

[–] rizzothesmall@sh.itjust.works 8 points 2 weeks ago (3 children)

Never heard that acronym before...

[–] TheJesusaurus@sh.itjust.works 20 points 2 weeks ago (1 children)

Not sure where it originated, but it's the preferred term in UK policing, and therefore in most media reporting, for what might have been called "CP" on the interweb in the past. Probably because "porn" implies it's art rather than crime, and it's also just a wider umbrella term.

[–] Zikeji@programming.dev 18 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

It's also more distinct. CP has many potential definitions. CSAM only has the one I'm aware of.

[–] yesman@lemmy.world 10 points 2 weeks ago (2 children)

LOL, you mean the letters C and P can stand for lots of stuff. At first I thought you meant the term "child porn" was ambiguous.

[–] drdiddlybadger@pawb.social 6 points 2 weeks ago (1 children)

Weirdly, people have also been intentionally diluting the term by expanding it to other things, which causes a number of legal issues.

[–] rizzothesmall@sh.itjust.works 11 points 2 weeks ago

Lol, why tf are people downvoting that? Sorry I learned a new fucking thing, jfc.

[–] devolution@lemmy.world 6 points 2 weeks ago

Gemini likes twins...

...I'll see myself out.
