zlacker

[return to "We have reached an agreement in principle for Sam to return to OpenAI as CEO"]
1. eclect+79[view] [source] 2023-11-22 07:00:30
>>staran+(OP)
The media and the VCs are treating Sam like some hero and savior of AI. I'm not getting it. What has he done in life and/or AI to deserve so much respect and admiration? Why don't top researchers and scientists get equivalent (if not more) respect, admiration, and support? It looks like one should strive to become a product manager, not an engineer or a scientist.
2. busyan+JK[view] [source] 2023-11-22 12:07:33
>>eclect+79
> Why don’t top researchers and scientists get equivalent (if not more) respect, admiration and support?

I can't believe I'm about to defend VCs and "senior management" but here goes.

I've worked for two start-ups in my life.

The first start-up had dog-shit technology (initially) and top-notch management. The CEO told me early on that VCs invest based on the quality of management, because they trust good senior executives to hire good researchers and let them pivot into profitable areas (and pivoting is almost always needed).

I thought the CEO was full of shit and simply patting himself on the back. The company pivoted HARD, IPOed around 2006, and now has a market cap of ~$10 billion.

The second start-up I worked for was founded by a Nobel laureate, and the tech was based on his research. This time the management was dog-shit: they fumbled the tech and the company went out of business.

===

Not saying Altman deserves uncritical praise. All I'm saying is that I used to underestimate the importance of quality senior leadership.

3. rtsil+XM[view] [source] 2023-11-22 12:23:44
>>busyan+JK
> IPOed around 2006 and now has a market cap of ~$10 billion.

The interesting thing is that you used economic value to show their importance, not the innovations or changes they achieved. That's fine for ordinary companies, but OpenAI is supposed to be a non-profit, so those metrics shouldn't be relevant. Otherwise, what's the difference?

4. infect+pQ[view] [source] 2023-11-22 12:51:07
>>rtsil+XM
How do you do expensive bleeding-edge research with no money? Sure, you might get some grants in the millions, but what if it takes billions? Now let's assume the research is no small feat: it's not just a handful of individuals in a lab, you need to hire larger teams to make it happen, and you have to pay for those individuals and their benefits.

My take is that it's not cheap to do what they are doing, and adding a capped for-profit side is an interesting approach. After all, OpenAI's mission clearly states that AGI is happening, and if that's true, those profit caps are probably trivial to meet.
