We already apply this logic elsewhere. Car makers must include seatbelts. Pharma companies must ensure safety. Platforms must moderate illegal content. Responsibility is shared when the risk is systemic.
Platforms moderating illegal content is exactly what we are arguing about, so you can't use it as an argument.
The other cases you list are harms to the people using the tools/products. They are not harms that users of the tools inflict on third parties.
We are literally arguing about 3D printer control two topics downstream. 3D printers in theory can be used for CSAM too. So we should totally ban them - right? So can pencils, paper, lasers, drawing tablets.
If a platform encourages this content and doesn’t moderate at all, yes, we should go after the platform.
Imagine a newspaper publishing content like that and then claiming it isn't responsible for its journalists.
Yes, AI chatbot providers have to do everything in their power to keep users from easily generating such content.
AND
Yes, people who do so (even if it's done on a self-hosted model) have to be punished.
I believe it is OK that Grok is being investigated because the point is to figure out whether this was intentional or not.
Just my opinion.
X also actively distributes and profits from CSAM. Why shouldn't the law apply to distribution centers?
I mean, I thought that was basically already the law in the UK.
I can see practical differences between X/Twitter doing moderation and full ISP-level censorship, but I cannot see any difference in principle...
If LLMs should have guardrails, why should open-source ones be exempt? What about people hosting models on Hugging Face? What if you use a model that is both distributed by and hosted on Hugging Face?
I mean, even just calling it censorship is already trying to shove a particular bias into the picture. Is it government censorship that you aren't allowed to shout "fire!" in a crowded theater? Yes. Is that also a useful feature of a functional society? Also yes. Was that a "slippery slope"? Nope. Turns out people can handle that nuance just fine.