zlacker

[return to "A Developer Accidentally Found CSAM in AI Data. Google Banned Him for It"]
1. jsnell+X6 2025-12-11 16:32:00
>>markat+(OP)
As a small point of order, they did not get banned for "finding CSAM" as the outrage- and clickbait title claims. They got banned for uploading a data set containing child porn to Google Drive. They did not find it themselves, and their later reporting of the data set to an appropriate organization is not why they got banned.
2. jfindp+Ba 2025-12-11 16:47:51
>>jsnell+X6
>They got banned for uploading child porn to Google Drive

They uploaded the full "widely-used" training dataset, which happened to include CSAM (child sexual abuse material).

While the title of the article is not great, your wording here implies that they deliberately uploaded standalone CSAM images, which is not accurate.
