Then we should be able to charge AI (or more so, the developers) with the same disgusting crime, and shut AI down.
…no
That’d be like outlawing hammers because someone figured out they make a great murder weapon.
Just because you can use a tool for crime doesn’t mean that tool was designed or intended for crime.
It would be more like outlawing ivory grand pianos because they require dead elephants to make - the AI models in question here were trained on abuse.
Sounds to me more like outlawing grand pianos because of all the dead elephants - while some people claim it is possible to make a grand piano without killing elephants.
There’s CSAM in the training set[1] used for these models, so some elephants have been murdered to make this piano.
3,226 suspected images out of 5.8 billion. About 0.00006%. And probably mislabeled to boot, or it would have been caught earlier. I doubt it had any significant impact on the model’s capabilities.
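For what it’s worth, a quick back-of-the-envelope check of that percentage, taking the 3,226 and 5.8 billion figures quoted above at face value:

```python
# Sanity check of the ratio quoted above: 3,226 suspected images
# out of roughly 5.8 billion total (figures as quoted, not verified).
suspected = 3_226
total = 5_800_000_000

fraction = suspected / total
print(f"fraction:   {fraction:.2e}")         # ~5.56e-07
print(f"percentage: {fraction * 100:.5f}%")  # ~0.00006%
```

So the 0.00006% figure checks out, to rounding.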
Camera-makers, too. And people who make pencils. Lock the whole lot up, the sickos.