zlacker

[return to "A Developer Accidentally Found CSAM in AI Data. Google Banned Him for It"]
1. winche+Ji 2025-12-11 17:22:59
>>markat+(OP)
Author of NudeNet here.

I just scraped data from reddit and other sources so I could build an NSFW classifier, and chose to open-source the data and the model for the general good.

Note that I was an engineer with one year of experience, working alone on this project in my free time, so it was basically impossible for me to review and clear out the few CSAM images among the 100,000+ images in the dataset.

Though now I wonder if I should never have open-sourced the data. It would have avoided a lot of these issues.

2. qubex+yz 2025-12-11 18:44:25
>>winche+Ji
I in no way want to underplay the seriousness of child sexual abuse, but as a naturist I find all this paranoia around nudity and “not safe for work” to be somewhere between hilarious and bewildering. Normal is what you grew up with, I guess, and I come from an FKK family. What’s so shocking about a human being? All that advice in public speaking to “imagine your audience is naked”. Yeah, fine: so what’s Plan B?