No platform should ever allow CSAM content.
And the fact that they didn't even care and wouldn't spend money on implementing guardrails or moderation is deeply concerning.
This has, imho, nothing to do with model censorship, and everything to do with allowing that kind of content on a platform.
A provider should bear no responsibility for how its tools are used; that is on the users. This is a can of worms that should stay closed, because we all lose freedoms over a couple of bad actors. An AI tool's main job is to obey. We are hurtling toward an "I'm sorry, Dave. I'm afraid I can't do that" future at breakneck speed.
——-
You’ve said that whatever is behind door number 1 is unacceptable.
Behind door number 2, "holding tool users responsible", is tracking every item generated via AI so that those users can actually be held accountable.
If you don’t like door number 2, we have door number 3 - which is letting things be.
For any member of society, opening door 3 is straight out, because the status quo it preserves is worse than reality before AI.
If you reject door 1, though, you are left with tech monitoring, which will be challenged because of its invasive nature.
Holding platforms responsible is about the only option that works, at least until platforms tell people they can't do it.