zlacker

[return to "Elon Musk sues Sam Altman, Greg Brockman, and OpenAI [pdf]"]
1. HarHar+vu1[view] [source] 2024-03-01 19:23:01
>>modele+(OP)
Any competent lawyer is going to get Musk on the stand reiterating his opinions about the danger of AI. If the tech really is dangerous then being more closed arguably is in the public's best interest, and this is certainly the reason OpenAI have previously given.

Not saying I agree that being closed source serves the public good, although one could certainly argue that accelerating the efforts of bad actors to catch up would not be a positive.

2. nicce+1w1[view] [source] 2024-03-01 19:30:18
>>HarHar+vu1
> If the tech really is dangerous then being more closed arguably is in the public's best interest, and this is certainly the reason OpenAI have previously given.

Not really. That amounts to security through obscurity. The tech needs to be open so that we know the real risks and have the best information to combat them. Otherwise, someone who develops the same thing in a closed manner has a better chance of gaining an advantage by misusing it.

3. Feepin+9x1[view] [source] 2024-03-01 19:37:41
>>nicce+1w1
This only holds if defense outscales attack. With LLMs, it seems very likely to me that attack outscales defense.
4. nicce+hy1[view] [source] 2024-03-01 19:42:37
>>Feepin+9x1
Well then, isn’t the whole case about just denying the inevitable?

If OpenAI can do it, I would not say it is very unlikely for someone else to do the same, open or not. Our best chance is still to prepare with the best available information.

5. Feepin+p23[view] [source] 2024-03-02 10:51:09
>>nicce+hy1
Yep, it absolutely is about denying the inevitable, or rather, "playing for time." The longer we manage to delay, the more likely somebody comes up with some clever approach for actually controlling the things. Also humanity stays alive in the meantime, which is no small thing in itself.
6. nicce+lK5[view] [source] 2024-03-03 16:57:15
>>Feepin+p23
> with some clever approach for actually controlling the things.

But if we hide the things, we have no idea what we are trying to control.
