i had no problem distinguishing the models in the community from children.
maybe it’s more difficult in some cases without looking for the model’s onlyfans link or something similar somewhere in the post, but that’s just human anatomy.
that’s why the guy at the gas station asks for my ID card: it’s not always super clear. but apparently it’s clear enough for reddit admins and the PR people at ad companies.
i agree that playing into the innocent baby aspect is probably not great for sexual morals, and i wouldn’t recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in one form or another.
i get it if this instance wants to stay sexually pure and removed from evil carnal desires, though. that’s kind of cool too, for sure.
I thought about this some more and I have a lot more sympathy for your decision now.
It must be horrible to get a user report about CSAM and then see a picture that, at first glance, really could be CSAM.
Even if every such report turned out to be wrong from now until infinity, the initial suspicion each false report raises probably makes moderating a soul-crushing activity.
It is great that admins from other instances are willing to deal with these horror reports just to give their users a bigger platform, but that service is not something that can be taken for granted.
I’m sorry for coming across as ignorant; I just hadn’t really considered your perspective.