• pavnilschanda@lemmy.world
    4 months ago

    A problem I often see brought up is that AI-generated images make it harder to identify photos of actual victims, and therefore harder to locate and rescue them.

    • 30p87@feddit.de
      4 months ago

      It depends on what kind of images. 99% of CP is just kids naked on the beach or teens posting pictures of themselves. Especially with the latter, there is no one to rescue, nor does it really harm anyone, nor should it be as illegal as the actual 1% of rape footage. Even that footage is largely the same material circulating again and again, often decades old. On top of that, I don’t think any AI could produce “usable” material from it.

      And of course, the group violating the law most in this regard is the kids/teens themselves, sending nudes or uploading them to forums.