zlacker

[parent] [thread] 45 comments
1. iammjm+(OP)[view] [source] 2024-02-15 17:20:26
Who is the victim of this “crime” here?
replies(6): >>artnin+q >>PlutoI+u1 >>Ahoy87+S1 >>delich+X1 >>DalasN+y3 >>abough+S6
2. artnin+q[view] [source] 2024-02-15 17:23:14
>>iammjm+(OP)
One argument I've heard is that it makes investigation of real cp harder.
replies(3): >>mullet+U >>anonpo+o2 >>nickth+E5
◧◩
3. mullet+U[view] [source] [discussion] 2024-02-15 17:25:58
>>artnin+q
Pretty sure that is a terrible reason to make laws for it. I hear the Bill of Rights has the same problem.
replies(1): >>artnin+12
4. PlutoI+u1[view] [source] 2024-02-15 17:29:09
>>iammjm+(OP)
Real children, in the activity this kind of material promotes.
replies(2): >>5040+J2 >>kergon+M4
5. Ahoy87+S1[view] [source] 2024-02-15 17:31:01
>>iammjm+(OP)
This is probably controversial, but if we assume that pedos will (unfortunately) exist and will generate demand for this kind of material, isn't it better if it comes from an AI instead of real children?
replies(3): >>double+N3 >>tiltow+M5 >>arp242+46
6. delich+X1[view] [source] 2024-02-15 17:31:15
>>iammjm+(OP)
This is "thoughtcrime". The argument against outlawing thoughtcrime has never been that there isn't a victim. Of course there are victims of bad thoughts. The steel man argument is that we're better off without central thought control in spite of those real victims, because there are more and worse victims of authoritarian thought control.
◧◩◪
7. artnin+12[view] [source] [discussion] 2024-02-15 17:31:33
>>mullet+U
I'm also not sure how I feel about it. Guess I'd have to see some evidence of how people generating cp images affects real children, be it by encouraging actual acts or whatever. If it doesn't or has the opposite effect, I would be against criminalising it, even though it is disgusting
replies(1): >>realfe+Ga
◧◩
8. anonpo+o2[view] [source] [discussion] 2024-02-15 17:32:54
>>artnin+q
Possibly, but I'd also speculate that it makes real cp less profitable, and therefore start to disappear. Similar to how real OF girls are going to rapidly lose their income because dudes running AI camgirls will increasingly outcompete them for attention and money.
replies(1): >>realfe+za
◧◩
9. 5040+J2[view] [source] [discussion] 2024-02-15 17:35:01
>>PlutoI+u1
This is the old 'violence in movies promotes violence in real life' argument.
10. DalasN+y3[view] [source] 2024-02-15 17:38:06
>>iammjm+(OP)
One thing that comes to mind is this: imagine someone is found with cp on a device. They could defend themselves by saying it is AI-generated. Unless there is a reliable way to tell AI fakes from real material, people could plausibly use this defense.
replies(3): >>ng12+U3 >>ameliu+14 >>beaegl+x4
◧◩
11. double+N3[view] [source] [discussion] 2024-02-15 17:39:04
>>Ahoy87+S1
For it to come from AI, it needs to come from real children. Chicken-and-egg scenario.

Someone abuses a child, films it, feeds it into an AI. And now they have that child's model.

Throw away the child and they're currently free of any charges. Of course that won't be enough, so repeat the process.

It's not like someone is creating a model in Blender and then running that through an AI. Not that that doesn't happen anyway.

replies(4): >>Ahoy87+O4 >>nickth+j5 >>office+q5 >>kergon+B6
◧◩
12. ng12+U3[view] [source] [discussion] 2024-02-15 17:39:44
>>DalasN+y3
If AI generated porn were indistinguishable wouldn't that almost totally eliminate demand for the real stuff?
replies(3): >>ben_w+o5 >>Firmwa+A6 >>realfe+x7
◧◩
13. ameliu+14[view] [source] [discussion] 2024-02-15 17:40:01
>>DalasN+y3
If the technology gets to that point, who needs the real thing?
replies(1): >>mschus+S4
◧◩
14. beaegl+x4[view] [source] [discussion] 2024-02-15 17:41:07
>>DalasN+y3
Makes sense, but real people have real ages. Couldn't they just say the AI is a rendition of an 18-year-old with some hypothetical development deviation? You'd have to ban all AI porn, because the age can't be measured; it's non-existent.
replies(2): >>once_i+I5 >>birrac+Aa
◧◩
15. kergon+M4[view] [source] [discussion] 2024-02-15 17:42:03
>>PlutoI+u1
You mean, in the same way as black metal promotes burning churches and video games cause mass shootings?
◧◩◪
16. Ahoy87+O4[view] [source] [discussion] 2024-02-15 17:42:23
>>double+N3
It feels really wrong to write this but what happens if someone makes a model that's "good enough" and there's no incentive to abuse children any more? Also, a lot of models are never trained on real pictures of what they generate.
replies(2): >>double+07 >>realfe+ea
◧◩◪
17. mschus+S4[view] [source] [discussion] 2024-02-15 17:42:44
>>ameliu+14
People who are into it not because they like kids young (i.e. "classic" pedos), but because they want to have (or feel they have) the power of causing pain. There is a real market for custom pedo videos; it's utterly insane.
replies(2): >>ameliu+e6 >>almata+nd
◧◩◪
18. nickth+j5[view] [source] [discussion] 2024-02-15 17:44:12
>>double+N3
Gen AI does not need to be training on photos of nude children to produce photos of nude children. It can generate a flying pig soaring over Congress without ever being trained to do so.
replies(1): >>double+O9
◧◩◪
19. ben_w+o5[view] [source] [discussion] 2024-02-15 17:44:24
>>ng12+U3
Unknown. For example, I have heard most offenders abuse their relatives, and I don't expect synthetic material to have any impact in this category.

Also, the only way to find out if this has any effect at all (positive or negative) would disgust and outrage many, as that test would require having a region where it's forbidden and a control group where it's allowed and seeing which is worse.

I'm not sure how many people would try to lynch mob (let alone vote out) whoever tries to do that, but I'm sure it's enough that exact numbers don't matter.

replies(1): >>diputs+78
◧◩◪
20. office+q5[view] [source] [discussion] 2024-02-15 17:44:39
>>double+N3
> For it to come from AI it needs to come from real children.

Yes, but given that CSAM data already exists, and we can't go back in time to prevent it, there's no further cost to attain that dataset. Unlike all future real CSAM, which will be produced by abusing children IRL.

I see parallels with Unit 731 here. Horrible things have already happened, so what do we do with the information?

replies(1): >>double+Ca
◧◩
21. nickth+E5[view] [source] [discussion] 2024-02-15 17:45:09
>>artnin+q
I hear encryption does the same thing.
replies(1): >>ben_w+17
◧◩◪
22. once_i+I5[view] [source] [discussion] 2024-02-15 17:45:16
>>beaegl+x4
That would indeed be the probable next step for government or intergovernmental organizations. Criminalize AI porn. Then criminalize regular porn.

The government is greedy in its lust for control and order in a chaotic world. It has a tendency to overreach, then overreach again (as we see in the overlap of privacy and counterterrorism).

◧◩
23. tiltow+M5[view] [source] [discussion] 2024-02-15 17:45:30
>>Ahoy87+S1
Does the legality of fake material increase the demand of real material? Does availability of fake material "awaken" or otherwise normalize desires that might have remained dormant? Studies have shown a link between violent porn and abusive behavior. A link, of course, does not mean causation, but given the potential for monstrous harm, I think we need to be wary of legalizing this kind of material. There's also the question of the training set used to generate this type of imagery.

However, I also think thoughtcrime is a very dangerous and slippery slope. It's not an easy question with an easy answer.

◧◩
24. arp242+46[view] [source] [discussion] 2024-02-15 17:46:25
>>Ahoy87+S1
This is an old discussion about "pretend paedophilia" using either art or (adult) actors, and AI essentially changed very little about it. The two views, very briefly, are:

- "It's a safer outlet and prevents actual child abuse, so it's a good thing."

- "It will encourage and enforce paedophilic tendencies and (indirectly) encourages actual child abuse, so it's a bad thing."

The last time I looked, the evidence is inconclusive. It's a difficult topic to research well, so I'm not expecting anything conclusive on this any time soon.

My own view is that most likely there are different kinds of paedophiles, and different things will be true for different groups, because these types of things aren't that simple. This kind of nuance is even harder to research, especially on such a controversial topic fraught with ethics issues.

There's also the issue of training material, which is unique to AI. Is it possible to have AI generated child abuse material without training material of that kind of thing? I don't know enough about AI to comment on that.

◧◩◪◨
25. ameliu+e6[view] [source] [discussion] 2024-02-15 17:47:05
>>mschus+S4
I'm guessing the market will just serve them fake custom videos then ...
◧◩◪
26. Firmwa+A6[view] [source] [discussion] 2024-02-15 17:48:20
>>ng12+U3
But to generate faithful AI CP, it would mean the AI was trained on an actual CP dataset. So those who trained the AI would have some explaining to do.
replies(3): >>ithkui+zb >>aenvok+1c >>xigoi+Gf
◧◩◪
27. kergon+B6[view] [source] [discussion] 2024-02-15 17:48:22
>>double+N3
If a generative AI knows the concept of children and the concept of porn, it can generate porn with children in it (possibly with various degrees of success and realism). It’s not stuck and forced to produce only what was strictly in the training set. AIs are fundamentally extrapolation machines.
28. abough+S6[view] [source] 2024-02-15 17:49:29
>>iammjm+(OP)
"Anyone in the training dataset"?

A big unanswered question in the age of AI: how does a system of law work when breaking one law is bad, but the product of breaking many laws is totally exempt?

We're starting to see the milder form of this in debates around authorship and copyright. But when your AI model requires a shockingly large quantity of clearly verboten material as input, what is one to make of the output?

◧◩◪◨
29. double+07[view] [source] [discussion] 2024-02-15 17:50:10
>>Ahoy87+O4
If a "good enough" model ever came to be, it could split the pedophile category in two (on a basic level):

Those who seek sexual gratification from the actual abuse of a minor. The real deal.

And those who are aroused by the body of a minor, or by watching the abuse of a minor.

If the model is "good enough", then you could potentially say that the latter group won't seek out further extremes to fulfil their pleasure.

However, in the long run they are still pedophilic, and for the former the real deal will always be the stronger draw.

◧◩◪
30. ben_w+17[view] [source] [discussion] 2024-02-15 17:50:11
>>nickth+E5
It does make it harder to investigate; the counter-argument is that encryption is a literally unavoidable requirement for securing almost 100% of online activity, which is itself now critical to the functioning of modern economies.

For the moment, GenAI isn't.

◧◩◪
31. realfe+x7[view] [source] [discussion] 2024-02-15 17:52:10
>>ng12+U3
In that scenario, how tf would you know that "real stuff" was eliminated? Think, please.
◧◩◪◨
32. diputs+78[view] [source] [discussion] 2024-02-15 17:55:21
>>ben_w+o5
My guess is that offenders abuse relatives because they are easier to access and manipulate, not because there is a true preference there. More a crime of opportunity than a pursued goal.
◧◩◪◨
33. double+O9[view] [source] [discussion] 2024-02-15 18:02:06
>>nickth+j5
It may not need to be, but why would it not be trained on child imagery, if the goal is photorealistic results?

If you had the opportunity to tune your AI on real photography rather than on self-generated images, and real photography of a pig produced higher quality with fewer defects on generation, why would you not go for it?

◧◩◪◨
34. realfe+ea[view] [source] [discussion] 2024-02-15 18:04:15
>>Ahoy87+O4
> what happens if someone makes a model that's "good enough" and there's no incentive to abuse children any more?

That isn't how anything works.

◧◩◪
35. realfe+za[view] [source] [discussion] 2024-02-15 18:05:59
>>anonpo+o2
You think the main reason cp exists is profit? Really?

Listen to the podcast "Hunting Warhead" before you make another comment so wildly uninformed on the topic anywhere.

◧◩◪
36. birrac+Aa[view] [source] [discussion] 2024-02-15 18:06:02
>>beaegl+x4
Ah yes, the Japanese "1000 year old dragon loli" gambit.

Which is actually a perfectly valid defense imo, as it’s horribly dumb to incriminate real people because of fictional characters. Should everyone who has a copy of IT go to jail because of child pornography? It makes no sense.

◧◩◪◨
37. double+Ca[view] [source] [discussion] 2024-02-15 18:06:24
>>office+q5
It's not about the cost. Why do new movies get produced when existing movies already exist?

Because of new content. If AI is being trained on real data and new content, then the datasets don't end up stale.

replies(1): >>office+kc
◧◩◪◨
38. realfe+Ga[view] [source] [discussion] 2024-02-15 18:06:36
>>artnin+12
Luckily most other people in the world don't need further convincing.
◧◩◪◨
39. ithkui+zb[view] [source] [discussion] 2024-02-15 18:10:32
>>Firmwa+A6
I don't think that's necessarily true.

An AI can generate an image of a wizard with a frog on their head, and that doesn't imply that the training set had such an image.

◧◩◪◨
40. aenvok+1c[view] [source] [discussion] 2024-02-15 18:12:24
>>Firmwa+A6
You don't need to train on pictures of canine golfers to make highly convincing pictures of dogs driving golf carts on Mars. https://imgur.com/a/EIWUJYp The AIs are extremely good at mixing concepts.
◧◩◪◨⬒
41. office+kc[view] [source] [discussion] 2024-02-15 18:13:54
>>double+Ca
New movies get produced because people want to make and sell movies. They don't have to make movies that are 100% reality. Movies actually use special effects and CGI to fake all kinds of things that would be illegal in real life.

For example, there was a time when, to get a flood effect, filmmakers flooded a set; three extras died. Later on they were told they couldn't do that, but they could simulate it. Tons of movies show people getting overcome by floods, but no one dies in real life anymore.

replies(1): >>double+wg
◧◩◪◨
42. almata+nd[view] [source] [discussion] 2024-02-15 18:18:34
>>mschus+S4
> People who are into it not because they like their kids young (i.e. "classic" pedos), but because they want to (feel to) have the power of causing pain.

Stupid question but why take kids then and not adult women? Why take the risk of buying CP if you do not like the kids young?

◧◩◪◨
43. xigoi+Gf[view] [source] [discussion] 2024-02-15 18:29:38
>>Firmwa+A6
Are you sure? I’d guess that AI can extrapolate from adult porn and non-sexual depictions of children.
replies(1): >>Firmwa+8g
◧◩◪◨⬒
44. Firmwa+8g[view] [source] [discussion] 2024-02-15 18:31:09
>>xigoi+Gf
So the AI will generate children with adult private parts?
replies(1): >>xigoi+Sg
◧◩◪◨⬒⬓
45. double+wg[view] [source] [discussion] 2024-02-15 18:32:23
>>office+kc
> New movies get produced because people want to make and sell movies.

Same with CP.

But real movies still use real effects. It's just that a lot more of it is done on a green screen, as a cost-saving exercise and to meet the demand for the movie to be out now, now, now.

If the quality that went into making films in the past still went in today, the movie industry wouldn't be such a shovel of shite. Those were real, with real actors and real acting. Now you've got CGI; even so, scenes are still produced for real.

◧◩◪◨⬒⬓
46. xigoi+Sg[view] [source] [discussion] 2024-02-15 18:33:54
>>Firmwa+8g
Pretty sure there are non-sexual images of naked children too, such as in anatomy textbooks.