this post was submitted on 28 Feb 2025
20 points (100.0% liked)
Technologie - 🤖
752 readers
This community covers the broad field of technology: computing news, sharing software and code, showing off your Arduino projects, etc.
founded 2 years ago
So it does not scan your photos; it's a library that apps can call. The point, I think, is to let an app reject or tag a picture without ever sending it to a server for scanning, moving that load off the servers and onto the client side.
Say you're trying to post a story on Instagram: the app asks SafetyCore whether the picture contains porn/violence/something they don't want, SafetyCore answers yes or no, and Instagram accepts and tags, or refuses, the picture accordingly.
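To make that flow concrete, here's a minimal sketch of client-side gating. SafetyCore's actual API isn't public, so `check_image` and `handle_upload` below are hypothetical stand-ins, not the real interface; the point is just the shape of the flow, where the classifier runs locally and the image never leaves the device for scanning.

```python
def check_image(image_bytes: bytes) -> bool:
    """Stand-in for an on-device classifier (hypothetical; not the real
    SafetyCore API). Returns True if the image is flagged. The real thing
    would run a local ML model; this stub just looks for a marker so the
    flow can be demonstrated."""
    return b"FLAGGED" in image_bytes


def handle_upload(image_bytes: bytes) -> str:
    """The app queries the local classifier *before* any network upload,
    so the picture is never sent to a server for scanning."""
    if check_image(image_bytes):
        return "rejected"  # the app could also tag instead of rejecting
    return "accepted"
```

The design point is that only the yes/no verdict influences the upload; the server never sees the unvetted picture.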
The danger here, IMO, is less about privacy and more about censorship: every time something is pushed to fight child porn, it ends up being used to control activists and political opponents. People could be prevented from sharing proof of police violence, for example.
No problem if you're using Lemmy: since you can use any front-end, you can pick an app that doesn't use SafetyCore.