• macniel@feddit.de
    4 months ago

    Mhm, I have mixed feelings about this. I know that this entire thing is fucked up, but isn’t it better to have generated stuff than actual material that involved actual children?

    • retrospectology@lemmy.world
      4 months ago

      The arrest is only a positive. Allowing pedophiles to create AI CP is not a victimless crime. As others point out, it muddies the water for CP of real children, but it would also potentially give pedophiles easier ways to network in the open (if the images are legal, they can easily be platformed and advertised), and networking between abusers absolutely emboldens them and results in more abuse.

      As a society we should never allow the normalization of sexualizing children.

      • lily33@lemm.ee
        4 months ago

        Actually, that’s not quite as clear as it seems.

        The conventional wisdom used to be that (normal) porn makes people more likely to commit sexual abuse (in general). Then scientists decided to look into that. Slowly, over time, they’ve become more and more convinced that the availability of (normal) porn in fact reduces sexual assault.

        I don’t see an obvious reason why it should be different in the case of CP, now that it can be generated.

        • Lowlee Kun@feddit.de
          4 months ago

          It should be different because people cannot have it. It is disgusting, it makes them feel icky, and that’s just why it has to be bad. Conventional wisdom sometimes really is just conventional idiocracy.

    • pavnilschanda@lemmy.world
      4 months ago

      A problem that I see getting brought up is that AI-generated images make it harder to notice photos of actual victims, which in turn makes it harder to locate and save them.

      • 30p87@feddit.de
        4 months ago

        Depends on what kind of images. 99% of CP is just kids naked on the beach or teens posting pictures of themselves. And especially with the latter, there’s no one to save, nor does it really harm anyone, nor should it be as illegal as the actual 1% rape footage. And even said rape footage is just the same stuff again and again, often decades old. And based on that, I don’t think any AI could produce “usable” material.

        And of course, the group violating the law in this regard the most is the kids/teens themselves, sending nudes or uploading them to forums.

    • Catoblepas@lemmy.blahaj.zone
      4 months ago

      Did we memory-hole the whole ‘known CSAM in training data’ thing that happened a while back? When you’re vacuuming up the internet, you’re going to wind up with the nasty stuff too. Even if it’s not a pixel-by-pixel match of the photo it was trained on, there’s a non-zero chance that what it’s generating is based on actual CSAM. Which is really just laundering CSAM.