A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how readily generative AI can be turned to nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage of officers leading the uniformed McCorkle out of the theater in handcuffs.

  • timestatic · 18 days ago

    Then every artist creating loli porn would also have to be jailed for child pornography.

      • timestatic · 18 days ago

        But this is the US… and it's kind of a double standard if you're not arrested for drawing it but are for generating it.

        • oberstoffensichtlich · 17 days ago

          There is a difference between something immediately identifiable as a drawing and something almost photorealistic. If a generated image is indistinguishable from a real photo, it should be treated the same.

          • ContrarianTrail@lemm.ee · 16 days ago

            The core reason CSAM is illegal is not because we don't want people to watch it but because we don't want them to create it, which is synonymous with child abuse. Jailing someone for drawing a picture like that is absurd. While it might be in bad taste, there is no victim there. No one was harmed. Using generative AI is the same thing: no matter how much simulated CSAM you create with it, not a single child is harmed in doing so. Jailing people for that is the very definition of a moral panic.

            Now, if actual CSAM was used in the training of that AI, then it's a more complex question. However, it is a fact that such content doesn't need to be in the training data for the model to create simulated CSAM, and as long as that is the case, it is immoral to punish people for creating something that only looks like it but isn't.

          • puppycat@lemmy.blahaj.zone · 17 days ago

            I don't advocate for either, but they should NOT be treated the same. One doesn't involve a child being traumatized. I'd rather a necrophiliac make AI-generated pics instead of… you know.