zlacker

[return to "OpenAI LP"]
1. window+Bc 2019-03-11 17:14:48
>>gdb+(OP)
I was buying it until he said that profit is “capped” at 100x the initial investment.

So someone who invests $10 million has their return “capped” at $1 billion. Lol. Basically unlimited unless the company grows to a FAANG-scale market value.
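The arithmetic here is just multiplication; a minimal Python sketch, assuming the 100x cap figure stated in the announcement and the commenter's hypothetical $10 million investment:

```python
# Sketch of the capped-return math described above.
# The 100x multiple is the cap cited in the thread; the $10M
# investment is the commenter's hypothetical example.

def capped_return(investment: float, cap_multiple: float = 100.0) -> float:
    """Maximum payout an investor can receive under the cap."""
    return investment * cap_multiple

# $10M investment -> $1B ceiling
assert capped_return(10_000_000) == 1_000_000_000
```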

2. gdb+vd 2019-03-11 17:20:31
>>window+Bc
We believe that if we do create AGI, we'll create orders of magnitude more value than any existing company.
3. throwa+Qi 2019-03-11 17:53:42
>>gdb+vd
Leaving aside the absolutely monumental “if” in that sentence, how does this square with the original OpenAI charter[1]:

> We commit to use any influence we obtain over AGI’s deployment to ensure it is used for the benefit of all, and to avoid enabling uses of AI or AGI that harm humanity or unduly concentrate power.

Early investors in Google have received roughly a 20x return on their capital; Google is currently valued at $750 billion. Your bet is that you'll have a corporate structure which returns orders of magnitude more than Google in percentage terms (and therefore reaches at least an order of magnitude higher valuation), yet you don't want to "unduly concentrate power"? How will this work? What exactly is power, if not the concentration of resources?

Likewise, also from the OpenAI charter:

> Our primary fiduciary duty is to humanity. We anticipate needing to marshal substantial resources to fulfill our mission, but will always diligently act to minimize conflicts of interest among our employees and stakeholders that could compromise broad benefit.

How do you envision deploying enough capital to return orders of magnitude more than any company to date while "minimizing conflicts of interest among employees and stakeholders"? Note that the most valuable companies in the world are also among the most controversial, including Facebook, Google, and Amazon.

______________________________

1. https://openai.com/charter/

4. gdb+0k 2019-03-11 18:01:02
>>throwa+Qi
The Charter was designed to capture our values, as we thought about how we'd create a structure that allows us to raise more money while staying true to our mission.

Some companies to compare with:

- Stripe Series A was $100M post-money (https://www.crunchbase.com/funding_round/stripe-series-a--a3...) and Series E was $22.5B post-money (https://www.crunchbase.com/funding_round/stripe-series-e--d0...) — over a 200x return to date

- Slack Series C was $220M (https://blogs.wsj.com/venturecapital/2014/04/25/slack-raises...) and now is filing to go public at $10B (https://www.ccn.com/slack-ipo-heres-how-much-this-silicon-va...) — over 45x return to date
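The multiples above are simply the later valuation divided by the earlier post-money valuation; a quick sketch in Python using the figures quoted in the comment (Crunchbase/WSJ numbers as cited):

```python
# Return multiple = later valuation / earlier post-money valuation,
# using the figures quoted above.

def return_multiple(early_post_money: float, later_valuation: float) -> float:
    return later_valuation / early_post_money

stripe = return_multiple(100e6, 22.5e9)  # Series A -> Series E: 225x
slack = return_multiple(220e6, 10e9)     # Series C -> IPO filing: ~45x

assert stripe == 225.0  # "over a 200x return to date"
assert slack > 45       # "over 45x return to date"
```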

> you don't want to "unduly concentrate power"? How will this work?

Any value in excess of the cap created by OpenAI LP is owned by the Nonprofit, whose mission is to benefit all of humanity. This could be in the form of services (see https://blog.gregbrockman.com/the-openai-mission#the-impact-... for an example) or even making direct distributions to the world.

5. ericja+gr 2019-03-11 18:44:15
>>gdb+0k
If I understand correctly from past announcements, OpenAI has roughly $1B committed in funding? So up until OpenAI attains a $100B valuation, its incentives are indistinguishable from a for-profit entity?