zlacker

1. Symmet+ (OP) 2024-05-15 13:35:27
The scenario I have in my head is that they had to override the safety team's objections to ship their new models before Google I/O.
replies(1): >>jdthed+cQ
2. jdthed+cQ 2024-05-15 17:36:01
>>Symmet+(OP)
The "safety" team can go eat grass.

I don't believe in AI "safety measures" any more than I do in kitchen cleaver safety measures.

That is, nothing beyond "keep out of kids' reach" and "don't use it like an idiot" but let the cleaver be a damn cleaver.

replies(2): >>Capric+t41 >>nialv7+Lu1
3. Capric+t41 2024-05-15 18:52:42
>>jdthed+cQ
> That is, nothing beyond "keep out of kids' reach" and "don't use it like an idiot" but let the cleaver be a damn cleaver.

Neither of which will be enforced with AI.

replies(1): >>jdthed+fa1
4. jdthed+fa1 2024-05-15 19:23:55
>>Capric+t41
Exactly, just like I can't "enforce" another person to not be an idiot about anything.
5. nialv7+Lu1 2024-05-15 21:20:47
>>jdthed+cQ
A cleaver isn't going to try to kill you without someone holding it...
replies(1): >>jdthed+BB3
6. jdthed+BB3 2024-05-16 17:00:45
>>nialv7+Lu1
I genuinely don't get your point. You mean as opposed to an LLM...?