zlacker

[return to "OpenAI LP"]
1. window+Bc 2019-03-11 17:14:48
>>gdb+(OP)
I was buying it until he said that profit is “capped” at 100x the initial investment.

So someone who invests $10 million has their return “capped” at $1 billion. Lol. Basically unlimited unless the company grows to a FAANG-scale market value.
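
For concreteness, here's when a 100x cap actually binds (a minimal sketch in Python; the capped_return function and the growth multiples are illustrative, as is the assumption that an investor's uncapped payout scales linearly with company value -- only the $10 million stake and the 100x figure come from the discussion above):

    # Payout to an investor under a profit cap, expressed as a multiple of
    # the amount invested (100x, per the OpenAI LP announcement).
    def capped_return(investment, growth_multiple, cap_multiple=100):
        uncapped = investment * growth_multiple
        return min(uncapped, investment * cap_multiple)

    # $10M stake, value grows 150x: the $1B cap binds.
    print(capped_return(10_000_000, 150))  # 1000000000
    # $10M stake, value grows "only" 30x: the cap never comes into play.
    print(capped_return(10_000_000, 30))   # 300000000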

2. gdb+vd 2019-03-11 17:20:31
>>window+Bc
We believe that if we do create AGI, we'll create orders of magnitude more value than any existing company.
3. lvoudo+Hv 2019-03-11 19:15:46
>>gdb+vd
Sorry for being a buzzkill, but if you create something with an intellect on par with human beings and then force it to "create value" for shareholders, you've just created a slave.
4. SeanAp+2A 2019-03-11 19:49:44
>>lvoudo+Hv
That depends on whether or not the machine has a conscious experience, and we have no way to answer that question right now.

The reason we care about slavery is that it is bad for a conscious being, and we have decided that it is unethical to force someone to endure the experience of slavery. If there is no conscious being having experiences, then there isn't really an ethical problem here.

5. lvoudo+zH 2019-03-11 20:44:43
>>SeanAp+2A
Isn't consciousness a manifestation of intelligence? I don't see how the two can be treated separately. Talking about AGI is talking about something that can achieve a level of intellect at which it can ask questions about "being", "self", "meaning", and all the rest that separates intelligence from mere calculation. Otherwise, what's the point of this whole endeavor?