Someone abuses a child, films it, puts it into an AI. And now they have a model of that child.
Throw away the child and they currently walk free of any charges. Of course that won't be enough, so repeat the process.
It's not like someone is creating a model in Blender and then running that through an AI. Not that that doesn't happen anyway.
Yes, but given that CSAM data already exists, and we can't go back in time to prevent it, there's no further cost to obtaining that dataset. Unlike all future real CSAM, which will be produced by abusing children IRL.
I see parallels with Unit 731 here. Horrible things have already happened, so what do we do with the information?
Because of new content. If the AI is being trained on real data and new content, then the datasets don't end up stale.
For example, there was a time when, to get a flood effect, filmmakers actually flooded a set; three extras died. Later they were told they couldn't do that anymore, but they could simulate it. Tons of movies show people getting overcome by floods, but no one dies in real life anymore.