I've noticed a bit of panic around here lately, and since I've had to continuously fight against pedos for the past year, I've developed tools to help me detect and prevent this content.

As luck would have it, we recently published one of our anti-CSAM checker tools as a Python library that anyone can use, so I thought I could use it to help Lemmy admins feel a bit safer.

The tool can either go through all the images in your object storage and delete any CSAM it finds, or it can run continuously, scanning and deleting new images as they come in. The suggested approach is to run it once with --all, then leave it running as a daemon. A rough sketch of the idea is below.
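
To give a rough idea of what an --all pass does conceptually, here is a minimal sketch of walking an S3-compatible bucket and deleting flagged images. This is an illustration only: the boto3 usage, the bucket/endpoint parameters, and the `is_csam()` classifier call are my placeholders, not the actual library's interface.

```python
# Conceptual sketch of a one-shot scan over object storage.
# bucket/endpoint names and is_csam() are placeholders, not the tool's API.
import boto3


def scan_bucket(endpoint_url: str, bucket: str, is_csam) -> None:
    """Walk every object in the bucket and delete anything the classifier flags."""
    s3 = boto3.client("s3", endpoint_url=endpoint_url)
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            data = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            if is_csam(data):
                # Flagged images are deleted immediately; nothing is kept locally.
                s3.delete_object(Bucket=bucket, Key=key)
                print(f"deleted {key}")
```

A daemon mode would do the same thing in a loop, only looking at objects newer than the last pass.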

A better option would be to retrieve the exact images uploaded via the lemmy/pict-rs API, but we're not quite there yet.

Let me know if you have any issues or suggestions for improvement.

EDIT: Just to clarify, you should run this on a desktop PC with a GPU, not on your Lemmy server!

  • Morgikan@lemm.ee · 1 year ago

    My understanding was that it's bad practice to host images on Lemmy instances anyway, as it contributes to storage bloat. Instead of coming up with a one-off script solution (albeit a good effort), wouldn't it make sense to offload the scanning to a third party like Imgur or Catbox, which would already be doing that, and just link images into Lemmy? If nothing else, wouldn't that limit liability for the instance admins?

    • hoodlem@hoodlem.me · edited · 1 year ago

      I was thinking the same thing. Stop storing the images and offload to Imgur or whatever. They likely already have a solution for this issue. Show the images inline instead of as a link. Looks the same, no liability.

      That said, this is tremendously cool. I was given pause, though, by another poster in the thread mentioning the legality of using this in the U.S.