zlacker

[parent] [thread] 6 comments
1. tcfhgj+(OP)[view] [source] 2025-12-11 17:14:16
> Maybe AI detection is more ethically fraught since you'd need to keep hold of the CSAM until the next training run,

Why?

The damage is already done.

replies(3): >>tremon+36 >>pseuda+lK >>giantg+1y2
2. tremon+36[view] [source] 2025-12-11 17:39:23
>>tcfhgj+(OP)
Why would you think that? Every distribution, every view adds damage, even if the original victim doesn't know about it (or would rather not know).
replies(2): >>jjk166+J8 >>tcfhgj+aa
3. jjk166+J8[view] [source] [discussion] 2025-12-11 17:52:01
>>tremon+36
I don't think AI training on a dataset counts as a view in this context. The concern is predators getting off on what they've done, not developing tools to stop them.
replies(1): >>pseuda+dK
4. tcfhgj+aa[view] [source] [discussion] 2025-12-11 18:00:07
>>tremon+36
I don't think that's how it works.
5. pseuda+dK[view] [source] [discussion] 2025-12-11 20:50:29
>>jjk166+J8
Debating what counts as a view is beside the point. Some subjects of child pornography feel violated by any storage or use of their images, and government officials store and use them regardless.
6. pseuda+lK[view] [source] 2025-12-11 20:51:03
>>tcfhgj+(OP)
Some victims feel this way. Some do not.
7. giantg+1y2[view] [source] 2025-12-12 13:24:18
>>tcfhgj+(OP)
I would think a giant stockpile creates a greater risk of the material leaking or being abused. Those training sets would undoubtedly be commercialized in some way, which some could see as adding insult to injury.