Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).

  • remotelove@lemmy.ca · 14 days ago

    Yes, CSAM is bad.

    However, false positives from scans can also destroy lives. While I wouldn’t cry about Apple losing millions in false-positive lawsuits, it’s simply not a good thing in this case.