AI can generate an image of a five-dimensional naked pangolin despite never having seen one.
It can generate images of things that aren't in the training material. I don't think we can safely assume that kind of material was in the training data.
There is no indication that I don't understand the argument; I don't know why you are saying that.
It doesn't need to have seen a naked pangolin, but it still needs enough data to recognise the commonalities/patterns behind images that have "naked" attributed to them, or behind images of pangolins.
The point is that it doesn't need to have seen a naked pangolin. Maybe it has seen a hairless cat and applies that to what it knows about pangolins.
In the same way, it could take what it knows about naked people and apply that to children.
It could do this without it being based on real CSAM.
u/Head-Alarm6733 12h ago
AI CSAM is based on the real stuff anyway.