zlacker

[parent] [thread] 8 comments
1. neilv+(OP)[view] [source] 2024-03-01 17:54:29
> and are valued at $80+ billion. If I donated millions to them, I’d be furious.

Don't get mad; convince the courts to divide most of the nonprofit-turned-for-profit company equity amongst the donors-turned-investors, and enjoy your new billions of dollars.

replies(2): >>Timber+i7 >>a_wild+Oa
2. Timber+i7[view] [source] 2024-03-01 18:27:08
>>neilv+(OP)
The for-profit arm is what's valued at $80B, not the non-profit arm that Elon donated to. If any of this sounds confusing, that's because it is.

Hopefully the courts can untangle this mess.

replies(2): >>prepen+kh >>jeltz+1P1
3. a_wild+Oa[view] [source] 2024-03-01 18:43:15
>>neilv+(OP)
Or just simply...Open the AI. Which they still can. Because everyone is evidently supposed to reap the rewards of this nonprofit -- from the taxpayers/governments affected by supporting nonprofit institutions, to the researchers/employees who helped ClopenAI due to their nonprofit mission, to the folk who donated to this cause (not invested for a return), to the businesses and laypeople across humanity who can build on open tools just as OAI built on theirs, to the authors whose work was hoovered up to make a money printing machine.

The technology was meant for everyone, and $80B to a few benefactors-turned-lotto-winners ain't sufficient recompense. The far simpler, more appropriate payout is literally just doing what they said they would.

replies(1): >>neilv+nc
4. neilv+nc[view] [source] [discussion] 2024-03-01 18:51:13
>>a_wild+Oa
This is what I actually support. At this point, though, given how the non-profit effectively acted against its charter, and aggressively so, with impressive maneuvers by some (and inadequate maneuvers by others)... would the organization(s) have to be dissolved, or go through some sort of court-mandated housecleaning?
replies(1): >>a_wild+vk
5. prepen+kh[view] [source] [discussion] 2024-03-01 19:13:29
>>Timber+i7
The nonprofit owns the for profit.
6. a_wild+vk[view] [source] [discussion] 2024-03-01 19:30:07
>>neilv+nc
OpenAI should be compelled to release their models under (e.g.) GPLv3. That's it. They can keep their services/profits/deals/etc. to fund research, but all products of that research must be openly available.

No escape hatch excuse of "because safety!" We already have a safety mechanism -- it's called government. It's a well-established, representative body with powers, laws, policies, practices, agencies/institutions, etc. whose express purpose is to protect and serve via democratically elected officials.

We the people decide how to regulate our society's technology & safety, not OpenAI, and sure as hell not Microsoft. So OpenAI needs a reality check, I say!

replies(1): >>neilv+Tm
7. neilv+Tm[view] [source] [discussion] 2024-03-01 19:42:47
>>a_wild+vk
Should there also be some enforcement of sticking to non-profit charter, and avoiding self-dealing and other conflict-of-interest behavior?

If so, how do you enforce that against what might be demonstrably misaligned/colluding/rogue leadership?

replies(1): >>a_wild+SL
8. a_wild+SL[view] [source] [discussion] 2024-03-01 22:21:33
>>neilv+Tm
Yes, regulators should enforce our regulations, if that's your question. Force the nonprofit to not profit; prevent frauds from defrauding.

In this case, a nonprofit took donations to create open AI for all of humanity. Instead, they "opened" their AI exclusively to themselves wearing a mustache, and enriched themselves. Then they had the gall to rationalize their actions by telling everyone "it's for your own good." Their behavior is so shockingly brazen that it's almost admirable. So yeah, we should throw the book at them. Hard.

9. jeltz+1P1[view] [source] [discussion] 2024-03-02 10:26:45
>>Timber+i7
No, it does not. It is very simple.