The police investigation remains open. The photo of one of the minors included a fly, which is the logo of Clothoff, the application presumably being used to create the images; it promotes its services with the slogan: “Undress anybody with our free service!”
Governments need to strike hard against all kinds of platforms like this, even if they can be used for legitimate reasons.
AI is far too dangerous a tool to leave to free innovation and the market; it’s the number one technology right now that needs heavy regulation.
What, exactly, would they regulate? The training data? The output? What kinds of user inputs are accepted?
All of this is hackable.
Making unauthorized nude images of other people, probably. The service did advertise “undress anybody”.
The philosophical question becomes: if it’s AI-generated, is it really a photo of them?
Let’s take it to an extreme: if you cut the face out of somebody’s Polaroid and paste it into a nudie magazine over the face of an actress, is that amalgam a nude photo of the person from the Polaroid?
It’s a debate that could go either way, and I’m sure we’ll end up with an exciting legal landscape as different countries adopt different rules.
How about we teach people some baseline of respect towards other people? Punishing behaviour like that can help show that it’s not okay to treat other people like pieces of meat.
I suppose you could make a Ship of Theseus-like argument there too. At what point does it matter where the parts of the picture came from? Most people would probably be okay with their hairstyle being added to someone else’s picture, but what about their eyes, their mouth, …? Where exactly is the line?