cross-posted from: https://beehaw.org/post/6795142

Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because CSAM had been posted to it. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • drdiddlybadger@pawb.social · 1 year ago

    Isn’t this bound to happen without built-in automated tools for flagging and moderation? Not quite sure how federation handles this sort of thing beyond community modding: say something if you see something.
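
    A built-in version of that could be as simple as hash matching against a shared blocklist. Here’s a minimal sketch, assuming each instance keeps a local set of known-bad media hashes (the KNOWN_HASHES set and flag_upload function are hypothetical names; real systems like PhotoDNA match perceptual hashes from vetted databases rather than plain SHA-256):

    ```python
    import hashlib

    # Hypothetical set of known-bad hashes. Production systems match
    # perceptual hashes (which survive re-encoding and resizing) against
    # vetted databases; plain SHA-256 only catches exact byte-for-byte
    # copies, so this is purely illustrative.
    KNOWN_HASHES = {
        # SHA-256 of the empty byte string, used here as a placeholder.
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def flag_upload(media_bytes: bytes) -> bool:
        """Return True if the uploaded media matches a known-bad hash."""
        return hashlib.sha256(media_bytes).hexdigest() in KNOWN_HASHES

    # On a match, an instance would quarantine the upload and notify
    # moderators instead of federating it out.
    ```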

  • candle_lighter@lemmy.ml · 1 year ago

    Fortunately it’s all on Japanese instances that many instances, like Mastodon.social, defederate from.

  • poVoq@slrpnk.net · 1 year ago

    Very sensationalist headline.

    If you read the paper, it is mostly about that one well-known Japanese instance, which is mostly legal under Japanese law.

  • macniel@feddit.de · 1 year ago

    Going by the blurb posted, not the link: how are they demanding more robust moderation and reporting tools when reporting something evidently even took down the instance in question?

    Who was the sponsor of this research, Zuck and Musk?

  • density@kbin.social · 1 year ago

    In just two days, researchers found 112 instances of known CSAM across 325,000 posts

    “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,”

    In the whole history of this group they have found fewer than 112 pieces of CSAM? It’s Stanford University. Why not drop in on a few of Jeffrey Epstein’s friends and fans? They can tell you where to look.

    • emeralddawn45@discuss.tchncs.de · 1 year ago

      Yeah, literally. What a propaganda piece. Now do Twitter, or Facebook, or Instagram. Except, due to the walled-garden effect of those platforms, the dangerous material probably isn’t viewable by just anyone. That doesn’t mean it’s not there, though.

      • Or Reddit. You know, the website where a community dedicated to sharing CSAM was one of the biggest on the site and its lead moderator was a sitewide celebrity (oh, and Reddit’s current top admin was also a moderator on that community).

      • Quik@infosec.pub · 1 year ago

        I don’t think it’s a propaganda piece, as it even brings up ideas on how to do moderation better in the Fediverse. It seems to me a bit too constructive to just call it propaganda and move on.

  • sciawp@lemm.ee · 1 year ago

    This is something I have worried about for a while. The core concept of the Fediverse makes stuff like this really easy to do, and there’s not really a solution. I guess government agencies just need to be on the lookout for it?