zlacker

[return to "Elon Musk sues Sam Altman, Greg Brockman, and OpenAI [pdf]"]
1. BitWis+3T 2024-03-01 16:30:05
>>modele+(OP)
Wouldn't you have to prove damages in a lawsuit like this? What damages does Musk personally suffer if OpenAI has in fact broken their contract?
2. Kepler+yW 2024-03-01 16:46:58
>>BitWis+3T
A non-profit took his money, decided to go for-profit, and now competes with the AI efforts of his own companies?
3. a_wild+3Z 2024-03-01 16:58:12
>>Kepler+yW
Yeah, OpenAI basically grafted a for-profit entity onto the non-profit to bypass their entire mission. They’re now extremely closed AI, and are valued at $80+ billion.

If I donated millions to them, I’d be furious.

4. neilv+vb1 2024-03-01 17:54:29
>>a_wild+3Z
> and are valued at $80+ billion. If I donated millions to them, I’d be furious.

Don't get mad; convince the courts to divide most of the nonprofit-turned-for-profit company equity amongst the donors-turned-investors, and enjoy your new billions of dollars.

5. a_wild+jm1 2024-03-01 18:43:15
>>neilv+vb1
Or just simply... Open the AI. Which they still can. Because everyone is evidently supposed to reap the rewards of this nonprofit -- from the taxpayers/governments affected by supporting nonprofit institutions, to the researchers/employees who helped ClopenAI due to their nonprofit mission, to the folk who donated to this cause (not invested for a return), to the businesses and laypeople across humanity who can build on open tools just as OAI built on theirs, to the authors whose work was hoovered up to make a money-printing machine.

The technology was meant for everyone, and $80B to a few benefactors-turned-lotto-winners ain't sufficient recompense. The far simpler, more appropriate payout is literally just doing what they said they would.

6. neilv+Sn1 2024-03-01 18:51:13
>>a_wild+jm1
This is what I actually support. At this point, though, given how the non-profit effectively acted against its charter, and aggressively so, with impressive maneuvers by some (and inadequate maneuvers by others)... would the organization(s) have to be dissolved, or go through some sort of court-mandated housecleaning?
7. a_wild+0w1 2024-03-01 19:30:07
>>neilv+Sn1
OpenAI should be compelled to release their models under (e.g.) GPLv3. That's it. They can keep their services/profits/deals/etc. to fund research, but all products of that research must be openly available.

No escape hatch excuse of "because safety!" We already have a safety mechanism -- it's called government. It's a well-established, representative body with powers, laws, policies, practices, agencies/institutions, etc. whose express purpose is to protect and serve via democratically elected officials.

We the people decide how to regulate our society's technology & safety, not OpenAI, and sure as hell not Microsoft. So OpenAI needs a reality check, I say!

8. neilv+oy1 2024-03-01 19:42:47
>>a_wild+0w1
Should there also be some enforcement of sticking to the non-profit charter, and avoiding self-dealing and other conflict-of-interest behavior?

If so, how do you enforce that against what might be demonstrably misaligned/colluding/rogue leadership?

9. a_wild+nX1 2024-03-01 22:21:33
>>neilv+oy1
Yes, regulators should enforce our regulations, if that's your question. Force the nonprofit to not profit; prevent frauds from defrauding.

In this case, a nonprofit took donations to create open AI for all of humanity. Instead, they "opened" their AI exclusively to themselves wearing a mustache, and enriched themselves. Then they had the balls to rationalize their actions by telling everyone that "it's for your own good." Their behavior is so shockingly brazen that it's almost admirable. So yeah, we should throw the book at them. Hard.
