17 points · submitted 3 days ago by [email protected] to c/[email protected]

How can one configure their Lemmy instance to reject illegal content? And I mean the bad stuff, not just the NSFW stuff. There are some online services that will check images for you, but I'm unsure how they can integrate into Lemmy.

As Lemmy gets more popular, I'm worried nefarious users will post illegal content that I am liable for.

[-] [email protected] 5 points 3 days ago* (last edited 3 days ago)

https://github.com/db0/fedi-safety can scan images for CSAM both pre- and post-upload, including novel AI-generated material. If you want pre-upload scanning, you will also need to run https://github.com/db0/pictrs-safety alongside your instance. Both need a budget GPU to run the scans, but your home PC can do the job.
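To make the integration concrete: as I understand it, pict-rs (Lemmy's image store) can be pointed at an external validation URL, POSTs each new upload to it, and only keeps the file if the endpoint answers 2xx; pictrs-safety plays that endpoint role and hands the image to fedi-safety for the actual scan. Here's a minimal stand-in sketch of that webhook shape (the stub classifier, port, and path are made up for illustration; the real scanning logic lives in fedi-safety, and the exact pict-rs setting name varies by version, so check the docs):

```python
from http import HTTPStatus
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder classifier: a real deployment would call a scanner such as
# fedi-safety here. This stub just flags empty payloads so the example
# is self-contained and runnable.
def looks_unsafe(image_bytes: bytes) -> bool:
    return len(image_bytes) == 0

def validate_upload(image_bytes: bytes) -> int:
    """Return the HTTP status the webhook should answer with:
    2xx accepts the upload, anything else makes pict-rs reject it."""
    return HTTPStatus.FORBIDDEN if looks_unsafe(image_bytes) else HTTPStatus.OK

class ScanHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # pict-rs sends the raw image bytes in the request body.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        self.send_response(validate_upload(body))
        self.end_headers()

if __name__ == "__main__":
    # You would point pict-rs's external-validation setting at
    # http://127.0.0.1:8080/ (hypothetical port/path for this sketch).
    HTTPServer(("127.0.0.1", 8080), ScanHandler).serve_forever()
```

The nice property of this design is that the scanner is a separate service: Lemmy and pict-rs don't need GPU access, only the machine running the scan worker does.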

this post was submitted on 12 Jul 2025

Lemmy Support


Support / questions about Lemmy.

Matrix Space: #lemmy-space

founded 6 years ago