[–] [email protected] 27 points 1 week ago* (last edited 1 week ago) (5 children)

Though this content could flourish in pockets of the fediverse, the scary scenario of prevalent child sexual abuse material is not the case. There are many moderation tools, including shared blocklists, that prevent it. However, the idea that the fediverse is full of harmful content was used by Elon Musk to justify his anti-competitive decision to block links from X to Mastodon.

Didn't he unban someone who posted one of the worst CSAM videos known?

[–] [email protected] 14 points 1 week ago (4 children)

CSAM is a risk on any platform. I guarantee there are private subreddits with it.

[–] [email protected] 11 points 1 week ago (3 children)

There are some public numbers on how many occurrences are found each year on the major platforms.

IIRC, Facebook handles around 75 million reports per year, while Twitter, Reddit, and others were in the range of 20 million reports per year.

I don't know how many are dealt with on Mastodon or Lemmy (or how you'd even get reliable numbers for that), but something tells me it's far lower than on the bigger platforms these days.

[–] [email protected] 5 points 1 week ago (1 children)

In the four years that I've been an admin here, I've only seen one CSAM case, and I don't want to see another. Dealing with it was very difficult on a personal level.

[–] [email protected] 2 points 1 week ago

I'm very sorry you had to go through that. People might not realize how traumatizing dealing with it can be. It definitely shouldn't fall to people without proper support or training.
