The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the app presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

    • taladar@feddit.de · 1 year ago

      In the long term that might even lead to society stopping their freak-outs every time someone in some semi-sensitive position is discovered to have nude pictures online.

  • rufus@discuss.tchncs.de · 1 year ago

    Interesting. Replika AI, ChatGPT, etc. crack down on me for writing erotic stories and roleplay text dialogues. And this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head…

    I wonder why they have no address etc on their website and the app isn’t available in any of the proper app-stores.

    Obviously the police should ask Instagram who is blackmailing all these girls… Teach them a proper lesson. And then stop this company. Have them fined a few million for generating and spreading synthetic CP. At the very least, write a letter to their hosting or payment providers.

  • Margot Robbie@lemm.ee · 1 year ago

    Banning diffusion models doesn’t work; the tech is already out there and you can’t put it back in the box. Fake nudes used to be made with Photoshop; current generative AI models just make them faster to produce.

    This can only be stopped on the distribution side, and any new laws should focus on that.

    But the silver lining of this whole thing is that nude scandals for celebs aren’t really possible any more if you can just say it’s probably a deepfake.

    • GCostanzaStepOnMe@feddit.de · 1 year ago

      Other than banning those websites and apps that offer such services, I think we also need to seriously rethink our overall exposure to the internet, and especially rethink how and how much children access it.

      • MadSurgeon@sh.itjust.works · 1 year ago

        We’ll need an AI run police state to stop this technology. I doubt anybody has even the slightest interest in that.

        • GCostanzaStepOnMe@feddit.de · 1 year ago

          We’ll need an AI run police state to stop this technology.

          No? You really just need to ban websites that run ads for these apps.

  • Aetherion@feddit.de · 1 year ago

    Whatever you do, don’t stop posting your life on the internet; that would push people to create more child porn! /s

  • aard@kyu.de · 1 year ago

    This was just a matter of time - and there isn’t really that much those affected can do (and in some cases, should do). Shutting down that service is the correct move - but that will only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.

    So in the long term we’ll see this shift to images generated at home, by kids often too young to be prosecuted - and you won’t be able to stop that unless you start outlawing most AI image-generation tools.

    In Germany, at least, the laws dealing with child/youth pornography were badly botched by incompetent populists in the government - they would send any of those parents to jail for at least a year if they take possession of one of those generated pictures. Having such an image sent to their phone and going to the police to file a complaint would be enough to start a prosecution against them.

    There’s one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying “they’re AI generated” is becoming a plausible way out.

    • Turun@feddit.de · 1 year ago

      Source for the law you mentioned, please. I want to read it in detail.

  • iByteABit [he/him]@lemm.ee · 1 year ago

    Governments need to strike hard against all kinds of platforms like this, even if they can be used for legitimate reasons.

    AI is way too dangerous a tool to allow free innovation and market on, it’s the number one technology right now that must be heavily regulated.

    • Blapoo@lemmy.ml · 1 year ago

      What, exactly, would they regulate? The training data? The output? Which user inputs are accepted?

      All of this is hackable.

      • RaivoKulli@sopuli.xyz · 1 year ago

        Making unauthorized nude images of other people, probably. The service did advertise itself with “undress anyone”.

        • jet@hackertalks.com · 1 year ago

          The philosophical question becomes: if it’s AI-generated, is it really a photo of them?

          Let’s take it to an extreme. If you cut the face out of somebody’s Polaroid and paste it into a nudie magazine over the face of an actress, is that amalgam a nude photo of the person from the Polaroid?

          It’s a debate that could go either way, and I’m sure we’ll end up with an exciting legal landscape, with different countries setting different rules.

          • taladar@feddit.de · 1 year ago

            I suppose you could make a Ship of Theseus-like argument there too. At what point does it stop mattering where the parts of the picture came from? Most people would probably be okay with their hairstyle being added to someone else’s picture - but what about their eyes, their mouth… Where exactly is the line?

          • ParsnipWitch@feddit.de · 1 year ago

            How about we teach people a baseline of respect for other people? Punishing behaviour like this can help show that it’s not okay to treat other people like pieces of meat.

  • duxbellorum@lemm.ee · 1 year ago

    This seems like a pretty significant overreaction. Yes, it’s gross and it feels personal, but it’s not as though anyone believes the subjects were willing participants… their reputation is not being damaged. Would they lose their shit over a kid gluing a cut-out of their crush’s face over the face of a pornstar in a magazine? Is this really any different from that?

      • duxbellorum@lemm.ee · 1 year ago

        Why? They didn’t take or share any nudes, and nobody believes they did.

        This is only a nightmare if an ignorant adult tells them that it is.

        • ParsnipWitch@feddit.de · 1 year ago

          Did your picture get taken and shared when you were a teenager? Were you heavily sexualised and harassed? Believe me, it feels like a nightmare even if no one is telling you that it should.

          Take your “sexual harassment is only bad to teenage girls if you tell them” shit elsewhere.

        • 0x815@feddit.de (OP) · 1 year ago

          @duxbellorum

          Why? They didn’t take or share any nudes, and nobody believes they did.

          This is only a nightmare if an ignorant adult tells them that it is.

          So you don’t have children, right?