zlacker

[return to "A Developer Accidentally Found CSAM in AI Data. Google Banned Him for It"]
1. jsnell+X6[view] [source] 2025-12-11 16:32:00
>>markat+(OP)
As a small point of order, they did not get banned for "finding CSAM," as the outrage-bait, clickbait title claims. They got banned for uploading a data set containing child porn to Google Drive. They did not find it themselves, and their later reporting of the data set to an appropriate organization is not why they got banned.
◧◩
2. jeffbe+H7[view] [source] 2025-12-11 16:34:25
>>jsnell+X6
Literally every headline that 404 Media has published about subjects I understand first-hand has been false.
◧◩◪
3. ameliu+6i[view] [source] 2025-12-11 17:20:21
>>jeffbe+H7
Can we use AI to fix this?

Make an LLM read the articles behind the links, and then rewrite the headlines (in a browser plugin for instance).

◧◩◪◨
4. add-su+ll[view] [source] 2025-12-11 17:34:14
>>ameliu+6i
HN already needlessly rewrites headlines with automation, and it's more annoying to see automation go stupidly wrong than to let the original imperfect situation stand. Having outrage about headlines is a choice.
◧◩◪◨⬒
5. ameliu+ms[view] [source] 2025-12-11 18:08:27
>>add-su+ll
I don't think HN's rewrite algorithm uses modern LLM techniques.

Also, it could be optional. It probably should be, in fact.

[go to top]