• Catoblepas@lemmy.blahaj.zone
    4 months ago

    Did we memory-hole the whole ‘known CSAM in the training data’ thing that happened a while back? When you’re vacuuming up the internet, you’re going to wind up with the nasty stuff too. Even if the output isn’t a pixel-by-pixel match of a photo it was trained on, there’s a non-zero chance that what it’s generating is based on actual CSAM. Which is really just laundering CSAM.