zlacker

[parent] [thread] 2 comments
1. jmcgou+(OP)[view] [source] 2026-02-04 17:12:53
Having an issue with users uploading CSAM (a problem for every platform) is very different from giving them a tool to quickly and easily generate CSAM, with apparently little-to-no effort to prevent this from happening.
replies(1): >>timmg+b1
2. timmg+b1[view] [source] 2026-02-04 17:17:55
>>jmcgou+(OP)
If the tool generates it automatically or spuriously, then yes. But if it's the users asking it to, then I'm not sure there is a big difference.
replies(1): >>dragon+is1
3. dragon+is1[view] [source] [discussion] 2026-02-05 00:41:24
>>timmg+b1
Well, it's worth noting that with the nonconsensual porn, child and otherwise, X would often rapidly punish the user who posted the prompt but leave the Grok-generated content up. It wasn't an issue of not having control; it was an issue of how the control was used.