• TheObviousSolution@lemm.ee · 7 months ago

    He then allegedly communicated with a 15-year-old boy, describing his process for creating the images, and sent him several of the AI generated images of minors through Instagram direct messages. In some of the messages, Anderegg told Instagram users that he uses Telegram to distribute AI-generated CSAM. “He actively cultivated an online community of like-minded offenders—through Instagram and Telegram—in which he could show off his obscene depictions of minors and discuss with these other offenders their shared sexual interest in children,” the court records allege. “Put differently, he used these GenAI images to attract other offenders who could normalize and validate his sexual interest in children while simultaneously fueling these offenders’ interest—and his own—in seeing minors being sexually abused.”

    I think the fact that he was promoting child sexual abuse, communicating with children, and creating communities with them to distribute the content is the most damning thing, regardless of people’s take on the matter.

    Umm … That AI-generated hentai on the page of the same article, though … Do the editors have any self-awareness? Reminds me of the time an admin decided the best way to call out CSAM was to link directly to the source.

    • Saledovil@sh.itjust.works · 7 months ago

      Umm … That AI-generated hentai on the page of the same article, though … Do the editors have any self-awareness? Reminds me of the time an admin decided the best way to call out CSAM was to link directly to the source.

      The image depicts mature women, not children.

      • BangCrash@lemmy.world · 7 months ago

        Correct. And OP’s not saying it is.

        But placing that sort of image on an article about CSAM is very poorly thought out.

    • Maggoty@lemmy.world · 7 months ago

      Wait, do you think all hentai is CSAM?

      And sending the images to a 15-year-old crosses the line no matter how he got them.

  • Darkard@lemmy.world · 7 months ago

    And the Stable Diffusion team gets no backlash from this for allowing it in the first place?

    Why are they not flagging these users immediately when they put in text prompts to generate this kind of thing?

    • macniel@feddit.de · 7 months ago

      You can run the SD model offline, so on what service would that user be flagged?
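
      For anyone unfamiliar with what “offline” means here, a minimal sketch using the Hugging Face diffusers library (the checkpoint name and prompt are just illustrative examples, not anything from the case): once the weights are on disk, generation is a purely local computation, with no server in the loop that could inspect prompts.

      ```python
      # Minimal sketch of fully local Stable Diffusion inference.
      # Assumes the example checkpoint "runwayml/stable-diffusion-v1-5" was
      # downloaded beforehand; nothing below makes a network request.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",
          torch_dtype=torch.float16,
          local_files_only=True,  # load from the local cache only
      )
      pipe = pipe.to("cuda")  # or "cpu" if no GPU is available

      # The prompt never leaves this machine, so no hosted service ever sees it.
      image = pipe("a watercolor painting of a lighthouse").images[0]
      image.save("output.png")
      ```

      Prompt flagging is only possible on hosted services, not on a model someone runs on their own hardware.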

  • Kedly@lemm.ee · 7 months ago

    Ah yes, more bait articles rising to the top of Lemmy. The guy was arrested for grooming; he was sending these images to a minor. Outside of Digg, does anyone have suggestions for an alternative to Lemmy and Reddit? Lemmy’s moderation quality is shit, and I think I’m starting to figure out where I land on my experimental stay with Lemmy.

    Edit: Oh god, I actually checked Digg out after posting this, and the site design makes it look like you’re scrolling through all of the ads at the bottom of a bullshit clickbait article.

  • macniel@feddit.de · 7 months ago

    Mhm, I have mixed feelings about this. I know this entire thing is fucked up, but isn’t it better to have generated stuff than actual stuff that involved actual children?

    • retrospectology@lemmy.world · 7 months ago

      The arrest is only a positive. Allowing pedophiles to create AI CP is not a victimless crime. As others point out, it muddies the water for CP of real children, but it would also potentially give pedophiles easier ways to network in the open (if the images are legal, they can easily be platformed and advertised), and networking between abusers absolutely emboldens them and results in more abuse.

      As a society we should never allow the normalization of sexualizing children.

      • lily33@lemm.ee · 7 months ago

        Actually, that’s not quite so clear.

        The conventional wisdom used to be that (normal) porn makes people more likely to commit sexual abuse (in general). Then scientists actually looked into it, and over time they’ve become more and more convinced that (normal) porn availability in fact reduces sexual assault.

        I don’t see an obvious reason why it should be different in the case of CP, now that it can be generated.

        • Lowlee Kun@feddit.de · 7 months ago

          It should be different because people cannot have it. It is disgusting, makes them feel icky, and that’s just why it has to be bad. Conventional wisdom sometimes really is just conventional idiocracy.

    • Catoblepas@lemmy.blahaj.zone · 7 months ago

      Did we memory-hole the whole ‘known CSAM in the training data’ thing that happened a while back? When you’re vacuuming up the internet, you’re going to wind up with the nasty stuff too. Even if it’s not a pixel-by-pixel match of a photo it was trained on, there’s a non-zero chance that what it’s generating is based on actual CSAM. Which is really just laundering CSAM.

    • pavnilschanda@lemmy.world · 7 months ago

      A problem that I see getting brought up is that AI-generated images make it harder to notice photos of actual victims, which makes it harder to locate and rescue them.

      • 30p87@feddit.de · 7 months ago

        Depends on what kind of images. 99% of CP is just kids naked on the beach or teens posting pictures of themselves. And especially with the latter, there’s no one to save, it doesn’t really harm anyone, and it shouldn’t be as illegal as the actual 1% that is rape footage. And even said rape footage is just the same stuff again and again, often decades old. And based on that, I don’t think any AI could produce “usable” material.

        And of course, the group violating the law in this regard the most is the kids/teens themselves, sending nudes or uploading them to forums.

  • PirateJesus@lemmy.today · 7 months ago

    OMG. Every other post is saying they’re disgusted by the images but that it’s a grey area, and that he’s definitely in trouble for contacting a minor.

    Cartoon CSAM is illegal in the United States, and AI-generated images of CSAM fall into that category. It was illegal for him to make the images in the first place, BEFORE he started sending them to a minor.

    https://www.thefederalcriminalattorneys.com/possession-of-lolicon

    https://en.wikipedia.org/wiki/PROTECT_Act_of_2003

      • zbyte64@awful.systems · 7 months ago

        Big brain PDF tells the judge it is okay because the person in the picture is now an adult.

        • arefx@lemmy.ml · 7 months ago

          You can say pedophile… that “pdf file” stuff is so corny and childish. Hey guys, let’s talk about a serious topic by calling it things like “pdf files” and “Graping”. Jfc.

          • RGB3x3@lemmy.world · 7 months ago

            Why do people say “graping”? I’ve never heard that.

            Please tell me it doesn’t have to do with “The Grapist” video from early YouTube.