zlacker

1. hammyh+ (OP) 2023-05-22 19:24:18
But surely, if safety is an issue, releasing them in the capacity that you describe would be a far greater problem?
replies(1): >>mindsl+27
2. mindsl+27 2023-05-22 20:01:55
>>hammyh+(OP)
Releasing their models for direct use would make any actual problems present themselves sooner, before more advanced models are created that intensify those problems. Right now the stance is basically full speed ahead on building the thing that might be a problem, while planning to "solve" it with bespoke content-based filters and user bans. That is the setup for green-lighting problematic-but-profitable uses, i.e. bog-standard corporate behavior.