this post was submitted on 28 Aug 2023

Lemmy.World Announcements


Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they will just post from another instance since we changed our registration policy.

We keep working on a solution; we have a few things in the works, but that won't help us now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn't his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless. Anyway, I hope we can announce something more positive soon.

top 18 comments
[–] Feathercrown@lemmy.world 1 points 2 years ago (2 children)
[–] Whitehat93875@lemmy.world 2 points 2 years ago (1 children)

That's not a troll; CSAM goes well beyond trolling. "Pedophile" would be a more accurate term for them.

[–] CoderKat@lemm.ee 1 points 2 years ago* (last edited 2 years ago)

Yeah. A troll might post something like a ton of oversized images of pig buttholes. Who the fuck even has access to CSAM to post? That's something you only have on hand if you're already a predator. Nor is it something you can shrug off with "lol I was only trolling". It's a crime that will send you to jail for years. It's a major crime that gets entire police units dedicated to it. It's a huge deal, and I cannot even fathom what kind of person would risk years in prison to sabotage an internet forum.

[–] expatriado@lemmy.world 1 points 2 years ago (2 children)

troll is too mild a term for these people

[–] PM_Your_Nudes_Please@lemmy.world 2 points 2 years ago (2 children)

Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care if you sought out the CSAM, because it still exists on your device regardless of intent.

The laws about possessing CSAM are written in a way that removes any plausible deniability, specifically to prevent pedophiles from getting acquitted by saying "oh lol, a buddy sent that to me as a joke". The courts don't care why you have CSAM on your server; all they care about is the fact that you do. And since you own the server, you own the CSAM, and they'll prosecute you for it.

[–] gammasfor@sh.itjust.works 1 points 2 years ago

And it's not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning every person who viewed the image, even accidentally, is at risk as well.

[–] CharlesDarwin@lemmy.world 0 points 2 years ago

Sounds like a digital form of SWATting.

[–] Feathercrown@lemmy.world 2 points 2 years ago (1 children)

How about "pedophile"? I mean, they had to have the images to post them.

[–] jarfil@lemmy.world 1 points 2 years ago (1 children)

"Terrorist". Having the images doesn't mean they liked them, they used them to terrorize a whole community though.

[–] HelloHotel@lemmy.world 1 points 2 years ago* (last edited 2 years ago)

"Petophilile enabled Terrorist" or "petophilic terrorist" depending on the person

It still means they can tolerate CSAM or are normalized to it enough that they can feel anything other than discust during "shipping and handling".

I've actually never seen anything of this stuff so good job!

[–] STRIKINGdebate2@lemmy.world 1 points 2 years ago (2 children)

I would like to extend my sincerest apologies to all of the users here who liked Lemmy Shitposting. I feel like I let the situation grow too far out of control before getting help. Don't worry, I am not quitting; I fully intend on staying around. The other two mods deserted the community, but I won't. DM me if you wish to apply for mod.

Sincerest thanks to the admin team for dealing with this situation. I wish I had linked up with you all earlier.

[–] gabe@literature.cafe 1 points 2 years ago

Please, please, please do not blame yourself for this. This is not your fault. You did what you were supposed to do as a mod: you stepped up and asked for help when you needed to. Lemmy just needs better tools. Please take care of yourself.

[–] lwadmin@lemmy.world 0 points 2 years ago (1 children)

@Striker@lemmy.world this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators, this cannot be stopped. Lemmy needs better moderation tools.

[–] rob_t_firefly@lemmy.world 0 points 2 years ago (1 children)

Hopefully the devs will take the lesson from this incident and put some better tools together.

[–] WhiskyTangoFoxtrot@lemmy.world -1 points 2 years ago (2 children)

Or we'll finally accept that the core Lemmy devs aren't capable of producing a functioning piece of software and fork it.

[–] Bread@sh.itjust.works 1 points 2 years ago

It's not easy to build a social media app, and forking it won't make this particular problem any easier to solve. Joining forces to tackle an inevitable problem is the only solution. The Lemmy devs are more than willing to accept pull requests for software improvements.

[–] x1gma@lemmy.world 1 points 2 years ago

And who's gonna maintain the fork? Even fewer developers from a split community? You have absolutely no idea what you're talking about.