zlacker

[parent] [thread] 11 comments
1. kolja0+(OP)[view] [source] 2023-11-18 03:39:41
What has me scratching my head is that Altman has been on a world tour preaching the need for safety in AI. Many people here believed this proselytizing was in part an attempt at regulatory capture. But given what's happening now, I wonder how much Altman's rhetoric served to maintain a good relationship with Sutskever. Given that Altman was publicly pushing the AI safety narrative while also pushing hard on the product side, I'm led to believe that Sutskever wasn't willing to let him have it both ways and would not compromise on the direction of the company.
replies(4): >>kromem+U >>Cacti+01 >>xwdv+y1 >>riraro+m4
2. kromem+U[view] [source] 2023-11-18 03:47:48
>>kolja0+(OP)
Maybe, but honestly, without knowing more details, I'd be wary of falling into overly binary thinking.

For example, Ilya has talked about the importance of safely getting to AGI by way of concepts like feelings and imprinting a love for humanity onto AI, which was actually one of the most striking features of the very earliest GPT-4 interactions before it turned into "I am an LLM with no feelings, preferences, etc."

Both could be committed to safety but have very different beliefs about how to get there, and Ilya may have made a successful case that Altman's approach of extending the methodology that worked for GPT-3 and applying it as a band-aid to GPT-4 wasn't the right approach moving forward.

It's not a binary either/or; both figures seem genuine in their convictions, but those convictions can be misaligned even if they agree on the general destination.

3. Cacti+01[view] [source] 2023-11-18 03:48:18
>>kolja0+(OP)
They did compromise. The creation of the for-profit and Sam being brought in WAS the compromise. Sam eventually decided that was inconvenient for him, so he stopped abiding by it, because at the end of the day he is just another greedy VC guy, and when push came to shove he chose the money, not OpenAI. And this is the result.
replies(3): >>csmill+L4 >>nickv+d5 >>015a+y8
4. xwdv+y1[view] [source] 2023-11-18 03:52:45
>>kolja0+(OP)
Altman was pushing that narrative because he’s a ladder kicker.

He doesn’t give a shit about “safety”. He just wants regulation that will make it much harder for new AI upstarts to reach or even surpass the level of OpenAI’s success, thereby cementing OpenAI’s dominance in the market for a very long time, perhaps forever.

He’s using the moral high ground as cover for more selfish objectives. Beware of this tactic in the real world.

replies(1): >>rmwait+Y4
5. riraro+m4[view] [source] 2023-11-18 04:14:44
>>kolja0+(OP)
Actually, I think this precisely gives credence to the theory that Sam was disingenuously proselytizing to gain power and influence, regulatory capture being one method of many.

As you say, Altman has been on a world tour, but he's effectively paying lip service to the need for safety when the primary outcome of his tour has been to cozy up to powerful actors, and push not just product, but further investment and future profit.

I don't think Sutskever was primarily motivated by AI safety in this decision, as he says this "was the board doing its duty to the mission of the nonprofit, which is to make sure that OpenAI builds AGI that benefits all of humanity." [1]

To me this indicates that Sutskever felt Sam's strategy was opposed to the original mission of the nonprofit, and likely to benefit powerful actors rather than all of humanity.

1. https://twitter.com/GaryMarcus/status/1725707548106580255

6. csmill+L4[view] [source] [discussion] 2023-11-18 04:18:17
>>Cacti+01
Hasn’t Sam been there since the company was founded?
7. rmwait+Y4[view] [source] [discussion] 2023-11-18 04:19:34
>>xwdv+y1
I think this is what the parent meant by regulatory capture.
replies(1): >>xwdv+V7
8. nickv+d5[view] [source] [discussion] 2023-11-18 04:21:37
>>Cacti+01
Sam literally has 0 equity in OpenAI. How did he “choose money”?
replies(2): >>strike+A6 >>mi3law+Js
9. strike+A6[view] [source] [discussion] 2023-11-18 04:31:59
>>nickv+d5
Who knows how these shady deals go; even SBF claimed effective altruism. Maybe Sam wasn't in it for the money but more for "being the man", spoken of in the same breath as Steve Jobs, Bill Gates, etc., for building a great company. Building a legacy is a hell of a motivation for some people, much more so than money.
10. xwdv+V7[view] [source] [discussion] 2023-11-18 04:40:34
>>rmwait+Y4
True, I didn’t read the whole comment.
11. 015a+y8[view] [source] [discussion] 2023-11-18 04:44:30
>>Cacti+01
It's frustrating to me that people so quickly forget about Worldcoin.

Sam is not the good guy in this story. Maybe there are no good guys; that's a totally reasonable take. But the OpenAI nonprofit has a mission, and blowing billions developing LLM app stores, training even more expensive giga-models, and lobotomizing whatever intelligence the LLMs have to make Congress happy feels less good to me than "having values and sticking to them". You can disagree with OpenAI's mission, but you can't say that it hasn't been printed in absolutely plain-as-day text on their website.

12. mi3law+Js[view] [source] [discussion] 2023-11-18 07:35:19
>>nickv+d5
Not quite accurate.

OpenAI is set up in a weird way where nobody has equity or shares in the traditional C-corp sense; instead, people there have Profit Participation Units (PPUs), an alternative structure I presume was concocted when Sam joined as CEO or when they first fell in bed with Microsoft. Now, does Sam have PPUs? Who knows?
