Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers at Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • crystal@feddit.de · 1 year ago

    You seem to be missing my point.

    This tool would not increase censorship.

    Admins are already able to implement all the censorship they want.

    Admins are already able to block left-wing opinions, right-wing opinions, child porn, or normal porn.

    And that already happens.

    Lots of instances (like feddit.de) block pornographic content.

    Lots of instances (like lemmy.blahaj.zone) block right-wing content.

    It is already possible, and it is already happening.

    An AI that can detect CSAM (and potentially other content) won’t change that. It will simply make the admins’ job easier.
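
    For context on what such a tool looks like in practice: large platforms mostly flag known CSAM by matching image hashes against databases maintained by clearinghouses such as NCMEC, with Microsoft’s PhotoDNA being the standard perceptual-hash approach. Below is a minimal Python sketch of that idea, assuming a hypothetical `load_known_hashes()` helper and using exact SHA-256 matching for simplicity; real systems use perceptual hashes that survive resizing and re-encoding.

    ```python
    import hashlib
    from pathlib import Path

    def load_known_hashes() -> set[str]:
        """Hypothetical helper: in reality, hash lists of verified CSAM
        are distributed to platforms by clearinghouses such as NCMEC."""
        return set()  # placeholder so the sketch runs

    KNOWN_HASHES = load_known_hashes()

    def flag_upload(path: Path) -> bool:
        """Return True if an uploaded file matches a known hash.

        Exact SHA-256 matching keeps the sketch simple; production
        systems use perceptual hashes (e.g. PhotoDNA, PDQ) that still
        match after resizing or re-encoding.
        """
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        return digest in KNOWN_HASHES

    # A match flags the post for admin review -- the tool assists
    # moderation decisions, it does not make them.
    ```

    Note the design choice matches the argument above: the tool only surfaces candidates for review, and admins still decide what their instance allows.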