I have said recently elsewhere that SV now devalues builders, but it is not just VCs/sales/product; a huge amount of this comes from devops and SRE departments. They make a huge amount of noise about how all development should be free and the value is in deploying and operating the developed artifacts. Anyone outside this watching would reasonably conclude developers have no self-respect; hardly aspirational positions.
He tells others what they like to hear, and manages to make money out of it
By the way, the AI scientists do get a lot of respect and admiration; see Ilya, for example.
But my reading of this drama is that the board were seen as literally insane, not that Altman was seen as spectacularly heroic or an underdog.
What I don’t understand is why they were allowed to stay on the board with all these conflicts of interest while having no (financial) stake in OpenAI. One of the board members even openly admitted that she considered destroying OpenAI a successful outcome of her duty as a board member.
A lot of Apple's engineering and product line back then owe their provenance and lineage to NeXT.
I'm sure that if Ilya had been removed from his role, the revolt movement would have been similar.
I only started to like Sam when he was removed from his position.
In my experience, product people who know what they are doing have a huge impact on the success of a company, product, or service. They also point engineering efforts in the right direction, which in turn also motivates engineers.
I've seen good product people leaving completely destroy a team; I've never seen that happen with a good engineer or individual contributor, no matter how great they were.
I don't see how this particular statement underscores your point. OpenAI is a non-profit with the declared goal of making AI safe and useful for everyone; if it fails to reach that or even actively subverts that goal, destroying the company does seem like the ethical action.
If both are not possible, I'd also rather compromise on the "conflicts of interest" part than on the member's competency.
Sam pontificated about fusion power, even here on HN. Beyond investing in Helion, what did he do? Worldcoin. Tempting impoverished people to give up biometric data in exchange for some crypto. And serving as the face of mass-market consumer AI. Clearly that's more cool, and more attractive to VCs.
Meanwhile, what have fusion scientists and engineers done? They kept on going, including by developing ML systems for pure technological effect. Day after day. They got to a breakthrough just this year. Scientists and engineers in national labs, universities, and elsewhere show what a real commitment to technological progress looks like.
I actually get the impression from the media that he's a bit shifty and sales orientated but seems effective at getting stuff done.
Sales usually is. It's the consequences, post-sale, that they're usually less effective at dealing with.
I have seen firing a great/respected/natural leader engineer result in pretty much the whole engineering team just up and leaving.
Google's full of top researchers and scientists who are at least as good as those at OpenAI; Sam's the reason OpenAI has a successful, useful product (GPT4), while Google has the far less effective, more lobotomized Bard.
He’s serving the right people by doing their bidding.
That being said, I have no idea of this guy's contributions. It's easy to dismiss entrepreneur/managers because they're not top scientists, but they also have very rare skills and without them, projects don't get done.
I have yet to find a product person who was not involved in the inception of the idea and is actually good (hell, even some founders fail spectacularly here).
Perhaps I'm simply unlucky.
I don't have much in the way of credentials (I took one class on A.I. in college and have only dabbled in it since, I work on systems that don't need to scale anywhere near as much as ChatGPT does, and while I've been an early startup employee a couple of times I've never run a company), but based on the past week I think I'd do a better job, and I can fill in the gaps as best I can after the fact.
And I don't have any conflicts of interest. I'm a total outsider, I don't have any of that shit you mentioned.
So yeah, vote for me, or whatever.
Anyway, my point is I'm sure there are actually quite a few people who could likely do a better job and don't have a conflict of interest (at least not one so obvious as investing in a direct competitor); they're just not already part of the Elite circles that would pretty much be necessary to even get on these people's radar in order to be considered in the first place. I don't really mean me; I'm sure there are other, better candidates.
But then they wouldn't have the cachet of 'Oh, that guy co-founded Twitch. That for-profit company is successful, that must mean he'd do a good job! (at running a non-profit company that's actively trying to bring about AGI that will probably simultaneously benefit and hurt the lives of millions of people)'.
This has been the case for every achievement of every major company: the CEO or whoever is on top gets the credit for all their employees' work. Why would it be different for OpenAI?
But he was also technical enough to have a pretty good feel for the complexity of tasks, and would sometimes jump in to help figure out some docker configuration issues or whatever problems we were having (mostly devops related) so the devs could focus on working on the application code. We were also a pretty small team, only a few developers, so that was beneficial.
He did such a good job that the business eventually reached out to him and hired him directly. He's now head of two of their product lines (one of them being the product I worked on).
But that's pretty much it. I can't think of any other product people I could say such positive things about.
Gee whiz, almost… exactly like what is happening?
I can't believe I'm about to defend VCs and "senior management" but here goes.
I've worked for two start-ups in my life.
The first start-up had dog-shit technology (initially) and top-notch management. CEO told me early on that VCs invest on the quality of management because they trust good senior executives to hire good researchers and let them pivot into profitable areas (and pivoting is almost always needed).
I thought the CEO was full of shit and simply patting himself on the back. Company pivoted HARD and IPOed around 2006 and now has a MC of ~ $10 billion.
The second start-up I worked with was founded by a Nobel laureate and the tech was based on his research. This time management was dog-shit. Management fumbled the tech and went out of business.
===
Not saying Altman deserves uncritical praise. All I'm saying is that I used to underestimate the importance of quality senior leadership.
No need for a conspiracy; everyone's seen this in some aspect, it just gets worse when these people are throwing money around in the billions.
All you need to do is witness someone like Elon Musk to see how disruptive this type of thing is.
Especially with putting Larry Summers on the board with this tweet.
So as a journalist you might have freedom to write your articles, but your editor (as instructed by his/her senior editor) might try to steer you toward writing in the correct tone.
This is how 'Starship test flight makes history as it clears multiple milestones' becomes 'Musk rocket explodes during test'
The interesting thing is you used economic values to show their importance, not what innovations or changes they achieved. Which is fine for ordinary companies, but OpenAI is supposed to be a non-profit, so these metrics should not be relevant. Otherwise, what's the difference?
A great analogy can be found on basketball teams. Lots of star players who should succeed sans any coach, but Phil Jackson and Coach K have shown time and again the important role leadership plays.
You're doing the same thing except with finances. Non-profit doesn't mean finances are irrelevant. It simply means there are no shareholders. Non-profits are still businesses - no money, no mission.
Of course you need the people who can deep-dive and solve complex issues; no one doubts that.
It's different with engineering managers (or team leads, lead engineers, whatever you want to call them). When they leave, that's usually a bad sign.
Though quite often, when the engineering leaders leave, I think of it as a canary in the coal mine: they are closer to the business, they deal more with business people, so they are the first to realize that "working with these people on these services is pointless, time to jump ship".
Recent OpenAI CEOs found themselves on the protagonist side not for their actions, but for the way they have been seemingly treated by the board. Regardless of actual actions on either side, "heroic" or not, of which the public knows very little.
Developers and value creators with power act like an anti-trust check on consolidation and concentration, but they have turned towards authoritarianism instead of anti-authoritarianism. What happened? Many think they can still get rich, but those days are over because they gave up their power. Now quality of life for everyone, value creators included, is worse off. Everyone loses.
The OpenAI board just seems irrational, immature, indecisive, and to have many other qualities you don’t want in a board.
I don’t see this so much as an “Altman is amazing” outcome so much as the board being incompetent and doing incompetent things; OpenAI’s products are popular, and the board’s actions put those products in danger.
Not that Altman isn’t cool, I think he’s smart, but I think a similar coverage would have occurred with any other ceo who was fired for vague and seemingly random reasons on a Friday afternoon.
Of course, if the product people are clueless, nobody is going to miss them; usually it's better to have no dedicated product people than to have clueless product people.
My take is it's not cheap to do what they are doing, and adding a capped for-profit side is an interesting take. After all, OpenAI's mission clearly states that AGI is happening, and if that's true, those profit caps are probably trivial to meet.
Maybe (almost certainly) Sam is not a savior/hero, but he doesn't need to be a savior/hero. He just needs to gather more support than the opposition (the now-previous board). And even if you don't know any details of this story, enough insiders who know more than any of us about what happens inside OpenAI - including hundreds of researchers - decided to support the "savior/hero". It's less about Sam and more about an incompetent board. Some of those board members are top researchers. And they are now in the losing camp.
The management skills you highlighted differentiated the success of the two firms. I can see how the lack of this might be widespread in academia.
You can see this at the micro level in a scrum team, between the scrum master, the product owner, and the tech lead.
If you look at who's running Google right now, you would be essentially correct.
Incubation of senior management in US tech has reached singularity and only one person's up for the job. Doom awaits the US tech sector as there's no organisational ability other than one person able and willing to take the big complex job.
Or:
Sam's overvalued.
One or the other.
They don't even really shill for their patron; they thrive on the relevance of having their name in the byline for the article, or being the person who gets the quote / information / propaganda from <CEO|Celebrity|Criminal|Viral Edgelord of the Week>.
Further, the current tech wave is all about AI, where there's a massive community of basically "OpenAI wrapper" grifters trying to ride the wave.
The shorter answer is: money.
Money is just a way to value things relative to other things. It's not interesting to value something using money.
And everywhere. You've only named public institutions for some reason, but a lot of progress happens in the private sector. And that demonstrates real commitment, because they're not spending other people's money.
Initially, when the idea is small, it is hard to sell it to talent, investors and early customers to bring all key pieces together.
Later, when the idea is well recognized and accepted, the organization usually becomes big and the challenge shifts to understanding the complex interaction of various competing sub-ideas, projects, and organizational structures. Humans did not evolve to manage such complex systems or to interact with thousands of stakeholders, beyond what can be directly observed and fully understood.
However, without this organization, engineers, researchers, etc. cannot work on big, audacious projects, which involve more resources than one person can provide by themselves. That's why the skill of organizing and leading people is so highly valued and compensated.
It is common to think of leaders as not contributing much, but this view might be skewed because we mostly look at executives in large companies at the time they have clear moats. At that point leadership might be less important in the short term: the product sells itself, talent is knocking on the door, and money is abundant. But this is an unusual, short-lived state between taking an idea off the ground and defending against quickly shifting market forces.
Quality senior leadership is, indeed, very important.
However, far, far too many people see "their company makes a lot of money" or "they are charismatic and talk a good game" and think that means the senior leadership is high-quality.
True quality is much harder to measure, especially in the short term. As you imply, part of it is being able to choose good management—but measuring the quality of management is also hard, and most of the corporate world today has utterly backwards ideas about what actually makes good managers (e.g., "willing to abuse employees to force them to work long hours", etc).
That said, I wish Helion wasn't so paranoid about Chinese copycats and was more open about their tech. I can't help but feel Sam Altman is at least partly responsible for that.
If you were to ask Altman himself, though, I'm sure he would highlight the true innovators of AI whom he holds in high regard.
They fired the CEO and didn't even inform Microsoft, who had invested a massive $20 billion. That's a serious lapse in judgment. A company needs leaders who understand business, not just a smart researcher with a sense of ethical superiority. This move by the board was unprofessional and almost childish.
Those board members? Their future on any other board looks pretty bleak. Venture capitalists will think twice before getting involved with anything they have a hand in.
On the other side, Sam did increase the company's revenue, which is a significant achievement. He got offers from various companies and VCs the minute the news went public.
The business community's support for Sam is partly a critique of the board's actions and partly due to the buzz he and his company have created. It's a significant moment in the industry.
I think that's what may be in the minds of several people eagerly watching this eventually-to-be-made David Fincher movie.
I could not convince them that this was actually evidence in favor of Coach K being an exceptional coach.
I wouldn't be so sure. While I think the board handled this process terribly, I think the majority of mainstream media articles I saw were very cautionary regarding the outcome. Examples (and note the second article reports that Paul Graham fired Altman from YC, which I never knew before):
MarketWatch: https://www.marketwatch.com/story/the-openai-debacle-shows-s...
Washington Post: https://www.washingtonpost.com/technology/2023/11/22/sam-alt...
That phrase is nothing more than a dissimulated way of saying “tough luck” or “I don’t care” while trying to act (outdatedly) cool. You don’t need to have grown up in any specific decade to understand its meaning.
That’s it. Nonprofit corporations are still corporations in every other way.
https://twitter.com/coloradotravis/status/172606030573668790...
A good leader is someone you'll follow into battle, because you want to do right by the team, and you know the leader and the team will do right by you. Whatever 'leadership' is, Sam Altman has it and the board does not.
https://www.ft.com/content/05b80ba4-fcc3-4f39-a0c3-97b025418...
The board could have said, hey we don't like this direction and you are not keeping us in the loop, it's time for an orderly change. But they knew that wouldn't go well for them either. They chose to accuse Sam of malfeasance and be weaselly ratfuckers on some level themselves, even if they felt for still-inscrutable reasons that was their only/best choice and wouldn't go down the way it did.
Sam Altman is the front man who 'gave us' ChatGPT regardless of everything else Ilya and everyone else did. A personal brand (or corporate) is about trust, if you have a brand you are playing a long-term game, a reputation converts prisoner's dilemma into iterated prisoner's dilemma which has a different outcome.
You can get big salaries; and pushing the money outside is very simple, you just need to spend it through other companies.
Additional bonus with some structures: if the co-investors are also the donors to the non-profit, they can deduct these donations from their taxes and still pocket back the profit; it's a double win.
No conspiracy needed; for example, it's very convenient that MSFT can politely "influence" OpenAI to spend a lot of the money it gave to the non-profit back on MSFT's for-profit (and profitable) platform.
For example, you can create a chip company, and use the non-profit to buy your chips.
Then the profit is channeled to you and your co-investors in the chip company.
Absolutely. The focus on the leadership of OpenAI isn't because people think that the top researchers and scientists are unimportant. It's because they realize that they are important, and as such, the person who decides the direction they go in is extremely important. End up with the wrong person at the top, and all of those researchers and scientists end up wasting time spinning wheels on things that will never reach the public.
Sure, you can talk about results in terms of their monetary value but it doesn’t make sense to think of it in terms of the profit generated directly by the actor.
For example Pfizer made huge profits off of the COVID-19 vaccine. But that vaccine would never have been possible without foundational research conducted in universities in the US and Germany which established the viability in vivo of mRNA.
Pfizer made billions and many lives were saved using the work of academics (which also laid the groundwork for future valuable vaccines). The profit made by the academics and universities was minimal in comparison.
So, whose work was more valuable?
Elon Musk’s Neuralink is a good example - the work they’re doing there was attacked by academics saying they’d done this years ago and it’s not novel, yet none of them will be the ones who ultimately bring it to market.
I didn't say anything about higher order values. Getting people to want what you want, and do what you want is a skill.
Hitler was an extraordinary leader. That doesn't imply anything about higher values.
My job also secures my loyalty and support with a financial incentive. It is probably the most common way for a business leader to align interests.
Kings reward dukes, and generals pay soldiers. Politicians trade policies. That doesn't mean they aren't leaders.
But deserved or not, that is never the question when congratulating a CEO for an achievement.
Can you explain this further? So Microsoft pays $X to OpenAI, then OpenAI uses a lot of energy and hardware from Microsoft and the $X go back to Microsoft. How does Microsoft gain money this way?
Furthermore, being removed from the board while keeping a role as chief scientist is different from being fired from CEO and having to leave the company.
For example, let's say I'm a big for-profit selling shovels. You're a naive non-profit who needs shovels to build some next gen technology. Turns out you need a lot of shovels and donations so far haven't cut it. I step in and offer to give you all the shovels you need, but I want special access to what you create. And even if it's not codified, you will naturally feel indebted to me. I gain huge upside for just my marginal cost of creating the shovels. And, if I gave the shovels to a non-profit I can also take tax write-offs at the shovel market value.
TBH, it was an amazing move by MS. And MS was the only big cloud provider who could have done it, b/c Satya appears collaborative and willing to partner. Amazon would have been an obvious choice, but they don't partner like that and instead tend to buy companies or repurpose OSS. And Google can't get out of their own way with their hubris.
Because what you just described would happen the same way with a for-profit company, no?