Someone abuses a child, films it, and feeds it into an AI. Now they have a model of that child.
Discard the child and, as things currently stand, they're free of any charges. Of course that won't be enough, so they repeat the process.
It's not like someone is creating a model in Blender and then running that through an AI. Not that that doesn't happen anyway.
Also, the only way to find out whether this has any effect at all (positive or negative) would disgust and outrage many, as that test would require a region where it's forbidden and a control group where it's allowed, then seeing which fares worse.
I'm not sure how many people would try to lynch (let alone vote out) whoever attempts that, but I'm sure it's enough that the exact number doesn't matter.
Yes, but given that CSAM data already exists, and we can't go back in time to prevent it, there's no further cost to obtaining that dataset. Unlike all future real CSAM, which will be produced by abusing children IRL.
I see parallels with Unit 731 here. Horrible things have already happened, so what do we do with the information?
The government is greedy in its lust for control and order in a chaotic world. It has a tendency to overreach, then overreach again (as we see in the overlap of privacy and counterterrorism).
However, I also think thoughtcrime is a very dangerous and slippery slope. It's not an easy question with an easy answer.
- "It's a safer outlet and prevents actual child abuse, so it's a good thing."
- "It will encourage and reinforce paedophilic tendencies and (indirectly) encourage actual child abuse, so it's a bad thing."
The last time I looked, the evidence is inconclusive. It's a difficult topic to research well, so I'm not expecting anything conclusive on this any time soon.
My own view is that most likely there are different kinds of paedophiles, and different things will be true for different groups, because these matters aren't that simple. That kind of nuance is even harder to research, especially on such a controversial topic fraught with ethics issues.
There's also the issue of training material, which is unique to AI. Is it possible to have AI-generated child abuse material without training material of that kind? I don't know enough about AI to comment on that.
A big unanswered question in the age of AI: how does a system of law work when breaking one law is bad, but the product of breaking many laws is totally exempt?
We're starting to see the milder form of this in debates around authorship and copyright. But when your AI model requires a shockingly large quantity of clearly verboten material as input, what is one to make of the output?
Those who seek sexual gratification from the abuse of a minor. The real deal.
And those who are aroused by the body of the minor, or by watching the abuse of a minor.
If the model is "good enough", then you could potentially argue that those with paedophilic interests won't seek out the further extremes to satisfy them.
However, in the long run they are still paedophiles, and for them the real thing will always hold more appeal.
For the moment, GenAI isn't.
If you had the opportunity to tune your AI on real photographs (of, say, a pig) rather than on self-generated images, and the real photographs produced higher-quality output with fewer generation defects, why would you not go for that?
That isn't how anything works.
Listen to the podcast "Hunting Warhead" before you make another comment so wildly uninformed on the topic anywhere.
Which is actually a perfectly valid defense, IMO, as it's horribly dumb to incriminate real people over fictional characters. Should everyone who owns a copy of IT go to jail for child pornography? It makes no sense.
Because of new content. If AI is being trained on real data and new content, then the datasets don't end up stale.
An AI can generate an image of a wizard with a frog on their head, and that doesn't imply the training set contained such an image.
For example, there was a time when, to get a flood effect, filmmakers flooded a set. Three extras died. Later they were told they can't do that, but they can simulate it. Tons of movies show people being overcome by floods, yet no one dies in real life anymore.
Stupid question, but why take kids then and not adult women? Why take the risk of buying CP if you aren't specifically after them being young?
Same with CP.
But real movies still use practical effects. Just a lot more of it is done on a green screen now, as a cost-saving exercise and because of the demand for the movie to be out now, now, now.
If the same quality went into making films as it did in the past, the movie industry wouldn't be such a shovel of shite. Those films were real, with real actors and real acting. Now you've got CGI; even so, plenty of scenes are still shot for real.