zlacker

[return to "A Developer Accidentally Found CSAM in AI Data. Google Banned Him for It"]
1. bsowl+d5 2025-12-11 16:24:28
>>markat+(OP)
More like "A developer accidentally uploaded child porn to his Google Drive account and Google banned him for it".
2. jkaplo+q7 2025-12-11 16:33:45
>>bsowl+d5
The penalties for unknowingly possessing or transmitting child porn are far too harsh, both in this case and in general (far beyond just Google's corporate policies).

Again, to avoid misunderstandings: I said unknowingly. I'm not defending people who knowingly possess or traffic in child porn, beyond the few legitimate purposes such as reporting it to the proper authorities when it's discovered.

3. burnt-+QD 2025-12-11 18:59:10
>>jkaplo+q7
That's the root problem with all mandated, invasive CSAM scanning. Non-signature-based scanning creates an unreasonable panopticon that leads to lifelong banishment based on imprecise, evidence-free guessing. It also hyper-criminalizes every parent who accidentally takes a picture of their kid who isn't fully dressed. And what about DoS victims who are anonymously sent CSAM without their consent, precisely to get them banned for "possession"? Pedophilia is gross and evil, no doubt, but extreme "think of the children" measures that sacrifice liberty and privacy create a different evil of their own. Handing total responsibility and ultimate decision-making for critical matters over to a flawed algorithm is lazy, negligent, and immoral. There's no easy solution to any such process, but requiring human review, a human in the loop (HITL), should be the moral and ethical minimum standard before any drastic measures are taken.
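To make the distinction concrete: signature-based scanning compares a file against hashes of material already verified by humans, non-signature scanning is a classifier guessing from the content itself, and a human-in-the-loop design means neither signal triggers a ban on its own. A minimal Python sketch, with entirely hypothetical function names, thresholds, and actions:

    # Illustrative sketch only; every name, threshold, and action here is
    # hypothetical. It contrasts signature-based matching (a lookup against
    # hashes of already-verified material) with classifier-based "guessing",
    # and routes both through human review before any drastic action.
    import hashlib

    KNOWN_HASHES = set()  # hex digests of independently verified material

    def scan(image_bytes, classifier_score):
        """Return (matched_known_signature, classifier_score)."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_HASHES, classifier_score

    def decide(matched_signature, classifier_score):
        # A signature hit is strong evidence, but still goes to a human.
        if matched_signature:
            return "escalate_to_human_review"
        # A bare classifier score is a guess; at most it queues a review.
        if classifier_score >= 0.9:  # hypothetical threshold
            return "queue_for_human_review"
        return "no_action"

    # Example: an unrecognized image with a middling score triggers nothing.
    print(decide(*scan(b"...image bytes...", classifier_score=0.4)))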