A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • Modern_medicine_isnt@lemmy.world
    10 months ago

    In the end you can’t stop it any more than you can stop teen boys from wanking. Eventually there will just be fake nudes of everyone, so they will have no meaning. It sucks, but it is how it is. Maybe people should get out in front of it by generating their own deepfakes of themselves, but embellish them some so they have an obvious fakeness, and age them up to legal age or something.

    • lolcatnip@reddthat.com
      10 months ago

      Does it suck? A future where people have gotten over feeling ashamed of having bodies sounds pretty cool.

        • lolcatnip@reddthat.com
          10 months ago

          If nudity wasn’t a big deal, it wouldn’t even occur to them to harass girls with fake nudes, and nobody would care if they tried.

          • ParsnipWitch@feddit.de
            10 months ago

            They could still do it for self-gratification. And the problem with that is objectifying other people.

            Regardless of whether they would still do it if nudity were something humans didn’t have emotions about, it would still be a wrongdoing against another person. That’s the problem that has to be tackled.

            I don’t think that’s any less realistic than removing people’s emotions about nudity.

  • calypsopub@lemmy.world
    10 months ago

    So as a grown woman, I’m not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That’s more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

    I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won’t dare to share it outside their sick incels club.

    • foo@programming.dev
      10 months ago

      What if the deepfake was so real it was hard to tell? What if the deepfake was highly invasive and humiliating? Can you see the problem?

      • calypsopub@lemmy.world
        10 months ago

        Not really. The more extreme it is, the more easily people will believe you when you say it’s a deep fake. Everyone who matters (friends and family) will know it’s not you. The more this sort of thing becomes commonplace, the more people will simply shake their heads and move on.

        • ParsnipWitch@feddit.de
          10 months ago

          That depends on how a specific person is seen and treated by their surroundings.

          A teenage girl who is already a victim of harassment or bullying, for example, will be treated very differently when humiliating images of her surface in her peer group, compared to someone who is well liked in school.

          People who do this have to be judged much more harshly. This can’t become the next item on the list of common sexual harassment experiences every girl and woman “has to” go through.

    • WoahWoah@lemmy.world
      10 months ago

      That’s fine and well. Except they are videos, and it is very difficult to prove they aren’t you. And the internet is forever.

      High school isn’t like it was when you went to high school.

      Agreed on your last paragraph.

      • Margot Robbie@lemmy.world
        10 months ago

        Then nude leak scandals will quickly become a thing of the past, because now every nude video or picture can be assumed to be AI-generated, and is always fake until proven otherwise.

        That’s the silver lining of this entire ordeal.

        Again, this is a content distribution problem more than an AI problem; the liability should be on those who willingly host this deepfake content rather than on AI image generators.

        • finestnothing@lemmy.world
          10 months ago

          That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends and jobs and have their lives ruined, even if they prove that they are completely innocent.

          Plus, something I’ve already seen happen: someone says a nude is fake and is then told they have to prove it’s fake to get people to believe them… which is very hard without sharing an actual nude that shows something unique about their body.

          • derpgon@programming.dev
            10 months ago

            The rest of the human body has more unique traits than the nude parts. Freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows about them.

            Now that I think about it, we probably all need a tattoo. That should clear anyone instantly.

            • WoahWoah@lemmy.world
              10 months ago

              Yes I’m sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

              HR probably wouldn’t even allow a conversation about it. That person just never gets called back.

              And then the worst part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

              The entire thing is damaging and ugly.

  • NightAuthor@lemmy.world
    10 months ago

    I wonder what the prevalence of this kind of behavior is like in countries that aren’t so weird about sex.

    • ParsnipWitch@feddit.de
      10 months ago

      What does one have to do with the other? Where I live, nudity isn’t all that uncommon (compared to the US, for example). But sexually harassing someone with fake porn is a whole different issue.

      I see a lot of problems with people having trouble understanding consent and struggling to respect other people. Those boys are weird about sex. That’s the weirdness we should address.

      • NightAuthor@lemmy.world
        10 months ago

        My bad, I wasn’t as clear as I could have been. I meant, I wonder if boys would be so weird as to want to make such fake porn in places that are less weird about sex.

        Did you think I was advocating for the fake images?

          • NightAuthor@lemmy.world
            10 months ago

            I imagine, in a society where the appeal of such images is low, the sanctity of the image of the body probably isn’t a big deal and people wouldn’t be so hurt by them either.

            • ParsnipWitch@feddit.de
              10 months ago

              It’s still a questionable way to approach it. Why should the conclusion be that people in general are simply “too weird about sex” and should change, instead of that the boys are weird and should change? It’s typical victim blaming.

              This perspective (the victim should change) is very prevalent when the crime is sexual harassment of girls and women.

  • renrenPDX@lemmy.world
    10 months ago

    This is treading on some dangerous waters. Kids need to realize this is way too close to basically creating underage pornography/trafficking.

  • Snot Flickerman@lemmy.blahaj.zone
    10 months ago

    Maybe it is just me, but it’s why I think this is a bigger issue than just Hollywood.

    The rights to famous people’s “images” are bought and sold all the time.

    I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be “owned” by anyone but yourself.

    The fact that making a law like this isn’t a priority means this will get worse because we already have a society and laws that don’t respect our rights to control of our own image.

    A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else’s image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.

    • Zetta@mander.xyz
      10 months ago

      That sounds pretty dystopian to me. Wouldn’t that make filming in public basically illegal?

      • ParsnipWitch@feddit.de
        10 months ago

        In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or in close-up) without asking for permission first, with the exception of public events, as long as you do not focus on individuals. It doesn’t feel dystopian at all, to be honest. I’d rather have it that way than end up on someone’s stupid vlog or whatever.

  • Treczoks@lemm.ee
    10 months ago

    The problem is how to actually prevent this. What could one do? Make AI systems illegal? Make graphics tools illegal? Make the Internet illegal? Make computers illegal?

        • ParsnipWitch@feddit.de
          10 months ago

          I think in this case a harsher punishment would send the appropriate signal that this isn’t just a little joke or a small misdemeanor.

          There are still way too many people who believe sexual harassment etc. aren’t that huge of a deal. And I believe the fact that perpetrators so easily get away with it plays into this.

          (I am not sure how it is in the US; in my country the consequences of crimes against bodily autonomy are laughable.)

  • gandalf_der_12te@feddit.de
    10 months ago

    Honest opinion:

    We should normalize nudity.

    That’s the only healthy relationship that we can have with our bodies in the long term.

    • ParsnipWitch@feddit.de
      10 months ago

      For this to happen people would probably need to stop judging people on their bodies. I am pretty sure there is a connection there. With how extremely superficial media and many relationships are, and with how we value women in particular, this needs a lot of change in people and society.

      I also think it would be a good thing, but we still have to do something about it until we reach that point.

  • leaky_shower_thought@feddit.nl
    10 months ago

    Reading this, I don’t really know what is supposed to be protected here to be deemed worthy of protection in the first place.

    The closest reasonable one is the girl’s “identity”, so it could be fraud. But it’s not used to fool people; more likely, those getting the pics already know this is AI-generated.

    So it might be defamation?

    The image generation tech is already easily accessible, so the girl’s picture being easily accessible might be the weakest link?