zlacker

[parent] [thread] 2 comments
1. realus+(OP)[view] [source] 2026-02-04 16:36:39
> But I am having trouble justifying in a consistent manner why Grok / X should be liable here instead of the user.

Because Grok and X aren't even doing the most basic filtering they could do to plausibly claim they filter out CSAM.

replies(1): >>yibg+7K
2. yibg+7K[view] [source] 2026-02-04 20:01:16
>>realus+(OP)
Filtering on the platform, or filtering Grok's output? If the filtering / flagging on X is insufficient, that's a separate issue independent of Grok. If you mean filtering Grok's output, then while skipping it is irresponsible in my view, I don't see why that's different from, say, Photoshop not filtering its output.
replies(1): >>realus+do2
3. realus+do2[view] [source] [discussion] 2026-02-05 08:29:42
>>yibg+7K
Photoshop doesn't have a "transform to nude" button, and if it did, Adobe would be in exactly the same kind of legal trouble as Grok.

That's the difference between a tool being used to commit crimes and a tool specifically designed to commit crimes.
