[return to "Elon Musk sues Sam Altman, Greg Brockman, and OpenAI [pdf]"]
1. HarHar+vu1 2024-03-01 19:23:01
>>modele+(OP)
Any competent lawyer is going to get Musk on the stand reiterating his opinions about the danger of AI. If the tech really is dangerous, then being more closed is arguably in the public's best interest, and that is certainly the reason OpenAI has previously given.

Not saying I agree that being closed source serves the public good, although one could certainly argue that accelerating bad actors' efforts to catch up would not be a positive.

2. andy_p+BM1 2024-03-01 21:09:23
>>HarHar+vu1
Are you a lawyer, or do you have some sort of credentials to be able to make that statement? I’m not sure Elon Musk being a hypocrite about AI safety would be relevant to the disputed terms of a contract.
3. HarHar+JO1 2024-03-01 21:23:36
>>andy_p+BM1
I don't think it's about him being a hypocrite - just about him undermining his own argument. It's a tough sell to say AI is unsafe but that it's still in the public's best interest to open source it (and hence that OpenAI is reneging on its charter).
4. advael+Xa3 2024-03-02 12:45:17
>>HarHar+JO1
Not really. The fact that "we keep our technology secret for safety reasons" is the reasoning given by many for-profit corporations does not make it a good argument, just a very profitable lie to tell, one that has proven false at every opportunity to test it. But the secrecy has also never stopped being profitable, which is why the likes of Apple and Microsoft make these claims so frequently.

This is, in many ways, the substance of the lawsuit. This logic of "we must guard this secret carefully... for safety!" doesn't inevitably arise from most lines of enabling research in academia, for example, but it does reliably come up once someone can enclose the findings in order to profit from exploiting that information asymmetry.

Secrecy for profit isn't a super benevolent thing to do, but it's generally speaking fine. We have whole areas of law about balancing the public benefit of widely available information against the private benefit to discoverers of some technique, technology, or even facts about the world, and most people understand that trade secrets aren't public knowledge. The plea to "safety," though, comes up exactly in cases where companies want to justify control over things that have become pervasive, and often mandatory to use in many contexts, in a way that lets those companies exert further control over the thing's users: in tech monopolies, in other words. That reasoning predicts, basically one-to-one, a business model that relies on DMCA 1201 (or its international equivalents) to function, a legal edifice designed by Microsoft lawyers that has become pervasive worldwide essentially at their behest.

That said, I don't think it's particularly hard to make the case that writing a whole-ass non-profit charter explicitly outlining the intent to do research in the open, and then suddenly switching to the very familiar corporate reality distortion field stance of a convicted monopolist you happen to have partnered with in order to justify effectively abandoning that charter, is a good basis for a lawsuit.
