zlacker

[return to "OpenAI LP"]
1. window+Bc 2019-03-11 17:14:48
>>gdb+(OP)
I was buying it until he said that profit is “capped” at 100x the initial investment.

So someone who invests $10 million has their return “capped” at $1 billion. Lol. That's basically unlimited unless the company grows to a FAANG-scale market value.
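For concreteness, a minimal sketch of the cap arithmetic, assuming the 100x multiple applies to total returns on the initial investment (the function name and numbers here are just illustrative):

    def capped_return(investment, gross_return, cap_multiple=100):
        # Hypothetical sketch: payout under a capped-profit structure
        # where the cap is cap_multiple times the initial investment.
        return min(gross_return, cap_multiple * investment)

    # A $10M investment is capped at $1B, however large the
    # uncapped return would have been.
    print(capped_return(10_000_000, 50_000_000_000))  # 1000000000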

2. gdb+vd 2019-03-11 17:20:31
>>window+Bc
We believe that if we do create AGI, we'll create orders of magnitude more value than any existing company.
3. komali+Gh 2019-03-11 17:46:12
>>gdb+vd
I was going to make a comment on the line

>The fundamental idea of OpenAI LP is that investors and employees can get a capped return if we succeed at our mission

Is that the mission? To create AGI? If you create AGI, there's a myriad of sci-fi books that have explored what happens next.

1. Post-scarcity. AGI creates maximum efficiency in every single system in the world, from farming to distribution channels to bureaucracies. Money becomes worthless.

2. Immortal ruling class. Somehow a few in power manage to keep total control over AGI without letting it (or anyone else) determine its fate. By leveraging "near-perfect efficiency," they become god-emperors of the planet. Money is meaningless to them.

3. Robot takeover. Money, and humanity, is gone.

Sure, it's silliness in fiction, but is there a reasonable alternative outcome from the creation of actual, strong general artificial intelligence? I can't see a world with this entity in it where "what happens to the investors' money" is a relevant question at all. Basically, if you succeed, why are we even talking about investor returns?

4. Veedra+171 2019-03-12 00:18:11
>>komali+Gh
As far as I can disentangle from [1], OpenAI posits only moderately superhuman performance. The profit would come from a scenario in which OpenAI subsumes much, but not all, of the economy and does not bring things to post-scarcity. The nonprofit would take ownership of almost all of the generated wealth, but the investments would still have value, since the traditional ownership structure might be left intact.

I don't buy the idea myself, but I could be misinterpreting.

[1] https://blog.gregbrockman.com/the-openai-mission
