
[return to "A Developer Accidentally Found CSAM in AI Data. Google Banned Him for It"]
1. bsowl+d5[view] [source] 2025-12-11 16:24:28
>>markat+(OP)
More like "A developer accidentally uploaded child porn to his Google Drive account and Google banned him for it".
2. jkaplo+q7[view] [source] 2025-12-11 16:33:45
>>bsowl+d5
The penalties for unknowingly possessing or transmitting child porn are far too harsh, both in this case and in general (far beyond just Google's corporate policies).

Again, to avoid misunderstandings, I said unknowingly - I'm not defending anything about people who knowingly possess or traffic in child porn, other than for the few appropriate purposes like reporting it to the proper authorities when discovered.

3. jjk166+em[view] [source] 2025-12-11 17:37:41
>>jkaplo+q7
The issue is that when you make ignorance a valid defense, the optimal strategy is to deliberately turn a blind eye, since that reduces your risk exposure. It also gives refuge to those who can convincingly feign ignorance.

We should make tools readily available and user friendly so it is easier for people to detect CSAM that they have unintentionally interacted with. This both shields the innocent from being falsely accused, and makes it easier to stop bad actors as their activities are detected earlier.
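
A rough sketch of what such a tool might look like, just to make the idea concrete (my assumption, not anything Google actually ships): a local scanner that hashes files and compares them against a published known-bad hash list. The hash-list file and command-line arguments here are hypothetical, and real detection pipelines rely on vetted lists plus perceptual hashing (PhotoDNA, PDQ) rather than exact cryptographic matches:

    # Sketch: flag local files whose SHA-256 appears in a known-bad hash list.
    # "known_hashes.txt" is a hypothetical file of hex digests, one per line.
    import hashlib
    import sys
    from pathlib import Path

    def load_known_hashes(path: str) -> set[str]:
        # One hex digest per line; blank lines ignored.
        return {line.strip().lower() for line in Path(path).read_text().splitlines() if line.strip()}

    def sha256_of(path: Path) -> str:
        # Stream the file in 1 MiB chunks so large files don't blow up memory.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def scan(root: str, known: set[str]) -> list[Path]:
        # Walk the tree and collect any file whose digest matches the list.
        return [p for p in Path(root).rglob("*") if p.is_file() and sha256_of(p) in known]

    if __name__ == "__main__":
        # usage: python scan.py <directory> <known_hashes.txt>
        known = load_known_hashes(sys.argv[2])
        for hit in scan(sys.argv[1], known):
            # Never open or redistribute a match; report it to the proper authorities.
            print(f"MATCH: {hit}")

The point isn't this particular implementation, it's that something this simple, wired up to a trustworthy hash source, would let ordinary people check a dataset or a drive before they get burned.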

4. pixl97+lU[view] [source] 2025-12-11 20:16:11
>>jjk166+em
No, it should be law enforcement's job to determine intent, not a blanket presumption of guilt. Treating this as a pure actus reus offense, with no mens rea required, is a huge mess that makes it easy to frame people and get them in trouble without any guilty intent.
5. jjk166+c61[view] [source] 2025-12-11 21:17:31
>>pixl97+lU
Determining intent takes time, is often not possible, and encourages people to avoid doing the work of checking whether something needs to be flagged. Not checking is at best negligent. Having everybody check and flag is the sensible option.