What does Altman bring to the table besides raising money from foreign governments and states, apparently? I just do not understand all of this. Like, how does him leaving and getting replaced by another CEO the next week really change anything at the ground level other than distractions from the mission being gone?
And the outpouring of support for someone who was clearly not operating how he marketed himself publicly is strange and disturbing indeed.
"Dozens" sounds like about right amount for a large org.
Of course, some employees may agree with the doom/safety board ideology, and will no doubt stay. But I highly doubt everyone will, especially the researchers who were working on new, powerful models — many of them view this as their life's work. Sam offers them the ability to continue.
If you think this is about "the big bucks" or "fame," I think you don't understand the people on the other side of this argument at all.
It is Sam Altman. He will have one in a week.
> It seems like half of the people working at the non-profit were not actually concerned about the mission but rather just waiting out their turn for big bucks and fame.
I would imagine most employees at any organization are not really there because of corporate values, but their own interests.
> What does Altman bring to the table besides raising money from foreign governments and states, apparently?
And one of the world's largest tech corporations. If you are interested in the money side, that isn't something to take lightly.
So I would bet it is just following the money, or at least the expected money.
The new board also wants to slow development. That isn't very exciting either.
They didn't "bring" a hyper capitalist. Sam Co-founded this entire thing lol. He was there from the beginning.
The world is filled with Sam Altmans, but surely not enough Ilya Sutskevers.
OpenAI would not exist if FAANG had been capable of getting out of its own way and shipping things. The moment OpenAI starts acting like the companies these people left, it's a no-brainer that they'll start looking for the door.
I'm sure Ilya has 10 lifetimes more knowledge than me locked away in his mind on topics I don't even know exist... but the last 72 hours are the most brain-dead actions I've ever seen out of the leadership of a company.
This isn't even cutting off your own nose to spite your face: this is like slashing your own tires to avoid driving in the wrong direction.
The only possible justification would have been some jailable offense from Sam Altman, and ironically their initial release almost seemed to want to hint that before they were forced to explicitly state that wasn't the case. At the point where you're forced to admit you surprise fired your CEO for relatively benign reasons how much must have gone completely sideways to land you in that position?
This is kind of like the leadership of the executive branch switching parties. You're not going to say "why would the staff immediately quit?" Especially since this is corporate America, and sama can have another "country" next week.
CEOs should be judged by their vision for the company, their ability to execute on that vision, bringing in funding, and building the best executive team for that job. That is what Altman brings to the table.
You make it seem that wanting to make money is a zero-sum game, which is a narrow view to take - you can be heavily emotionally and intellectually invested in what you do for a living and want to be financially independent at the same time. You also appear to find it “disturbing” that people support someone who is doing a good job - there has always been a difference between marketing and operations, and it is rather weird you find that disturbing - or that they appreciate stability, or love working for a team that gets shit done.
To address your initial strawman, why would workers quit when the boss leaves? Besides all the normal reasons listed above, they also might not like the remaining folks, or they may have lost faith in those folks, given the epic clusterfuck they turned this whole thing into. All other issues aside, if I saw my leadership team fuck up this badly, on so many levels, I'd be getting right out of Dodge.
These are all common sense, adult considerations for anyone that has an IQ and age above room temperature and that has held down a job that has to pay the bills, and combining that with your general tone of voice, I’m going to take a wild leap here and posit that you may not be asking these questions in good faith.
It's the AI era - VCs are going crazy funding AI startups. What makes you think Greg and Sam would have a hard time raising millions/billions and starting a new company in a week if they want to?
Better run for the lifeboat before the ship hits the iceberg.
> It is Sam Altman. He will have one in a week.
His previous companies were Loopt and Worldcoin. Won't his next venture require finding someone else to piggyback off of?
> If you are interested in the money side, that isn't something to take lightly.
I am interested in how taking billions from foreign companies and states could lead to national security and conflict of interest problems.
> The new board also wants to slow development.
It's not a new board as far as I know.
What is this instability, in your view? And how is this “desired stability” going to come back?
For example, Elon Musk was smart enough to do some things … then he crashed and burned with Twitter because it’s about people and politics. He could not have done a worse job, despite being “smart.”
If Ilya & co. want the staff to side with them, they have to give a reason first. It doesn't necessarily have to be convincing, but not giving a reason at all will never be convincing.
It's "such a big deal" because he has been leading the company, and apparently some people really like how and they really don't like how it ended.
Why would it require any other explanation? Are you asking what leaders do and why an employee would care about what they do...?
It's not a new board, but it's the time when the board decided to assert their power and make their statement/vision clear.
There is just a giant gap here where I simply do not get it, and I see no evidence that my not getting it comes from missing some key aspect of all this. This just seems like classic cargo cult, cult of personality, and following money and people who think they know best.
“Difficult to understate” would mean he has little to no social capital.
It won't be hard for them to hire researchers and engineers, from OpenAI or other places.
Questions like this make me wonder if you are a troll. I won't continue this thread.
This is what I referred to as "Cargo Cult AI". You can get the money, but money is not the only ingredient needed to make things happen.
edit: Looks like they won't have a brand new company next week, but joining an existing one.
So Ilya Sutskever, one of the most distinguished ML researchers of his generation, does not understand the technology?
The same guy who's been on record saying LLMs are enough for AGI?
This is like a bunch of people joining a basketball team where the coach starts turning it into a soccer team, and then the GM fires the coach for doing this and everyone calls the GM crazy and stupid. If you want to play soccer, go play soccer!
If you want to make a ton of money in a startup moving fast, how about don't set up a non-profit company spouting a bunch of humanitarian shit? It's even worse, because Altman very clearly did all this intentionally by playing the "I care about humanity" card just long enough, while riding on the coattails of researchers, so he could start up side processes to use his new AI profile to make the big bucks. But now people want to make him a martyr simply because the board called his bluff. It's bewildering.
Consider the relative charisma of the people around him, though.
In the field of AI, right now, "slowing down" is like deciding to stop the car and walk the track by foot in the middle of a Formula 1 race. It's like going backwards.
Unless things change from the current status quo, OpenAI will be irrelevant in less than 2 years. And of course many will quit such a company and go work somewhere where the CEO wants to innovate, not slow down.
He has a better chance than some other random guy who was not the CEO of OpenAI.
I would think to myself, what if management ever had a small disagreement with me?
I quit a line cook job once in a very similar circumstance, scaled down to a small restaurant. The inexperienced owners were making chaotic decisions and fired the chef, and I quit the same day - not out of any particular loyalty or anger, I just declined the chaos of the situation. I quit before the chaos could hurt me or my reputation by getting mixed up in it, and moved on to other things.
Do you? Because that part is way more irritating, and honestly, when I started reading your original comment I thought that was where you were going with this: why was he fired, exactly?
The way the statement was framed basically painted him as a liar, in a way so vague that people put forth the most insane theories about why. I can sense some animosity, but do you really think it's okay to fire anyone in a way where, to the outside, the possible explanations range from a big data slip to molesting their sister?
Nothing has changed. That is the part that needs transparency and its lack is bewildering.
If a CEO of a non-profit is raising billions of dollars from foreign companies and states to create a product that he will then sell to the non-profit he is CEO of, I view that as adding instability to the non-profit given its original mission. Because that mission wasn't to create a market for the CEO to take advantage of for personal gain.
As for Altman... I don't understand what's insignificant about raising money and resources from outside groups? Even if he wasn't working directly on the product itself, that role is still valuable in that it means he knows the amounts of resources that kind of project will require, while also commanding some amount of familiarity with how to allocate them effectively. And on top of that, he seems to understand how to monetize the existing product a lot better than Ilya, who mostly came out of this looking like a giant hazard to anyone who isn't wearing rose-tinted sci-fi goggles.
OpenAI already had the best technology fully developed and in production when Microsoft invested in them.
I believe "cargo cult" means something quite different to how you're using it.
It's not "cargo cult" to consider someone's CV when you hire them for a new job. Sam Altman ran a successful AI company before and he most likely can do it again if provided enough support and resources.
Still, what do they actually want? It seems a bit overly dramatic for such an organisation.
Yes, but that doesn't mean it's enough. Not every random guy who wasn't the CEO of OpenAI is about to start an AI company (though some probably are).
It's quite possible an AI company does need a better vision than "hire some engineers and have them make AI".
For example, one scenario someone in a different thread conjectured is that Sam was secretly green-lighting the intentional (rather than incidental) collection of large amounts of copyrighted training data, exposing the firm to a great risk of a lawsuit from the media industry.
If he hid this from the board, “not being candid” would be the reason for his firing, but if the board admits that they know the details of the malfeasance, they could become entangled in the litigation.
Bear in mind that the cause of an equity market crash and its trigger are two different things.
The 2000 crash in tech was caused by market speculation in enthusiastic dot-com companies with poor management, yes, but the trigger was simply the DOJ finally making Bill throw a chair (they'd had enough of being humiliated by him for decades as they struggled with old mainframe tech and limited staffing).
If the dot-com crash trigger had not arrived for another 12-18 months, I’m sure the whole mess could have been swept under the rug by traders during the Black Swan event and the recovery of the healthy companies would have been 5-6 months, not 5-6 years (or 20 years in MSFT’s case).
I think it's pretty obvious after reading it why people who were really committed to that Charter weren't happy with the direction that Sam was taking the company.
About him and Greg joining Microsoft.
> I believe "cargo cult" means something quite different to how you're using it.
I don't think so.
Tribes believed that building wooden airstrips or planes would bring the goods they had seen during wartime.
People believe that bringing Altman will bring the same thing (OpenAI as is) exactly where it's left off.
Altman is just the tip of the iceberg. He might have some catalyst in him, but he's not the research itself or the researchers themselves.
In fact, he is exactly the type to be on the board.
He is not the one saying 'slow down, we might accidentally invent an AGI that takes over the world'. As you say, he says LLMs are not a path to a world-dominating AGI.
We have a bunch of people talking about how worried they are and how we should slow down, and among them Sam Altman - yet you see he was shipping fast. And Elon Musk, who was concurrently working on his own AI startup while telling everyone how we should stop.
There's no stopping this and any person of at least average intelligence is fully aware of this. If a "top researcher" is in favor of not researching, then they're not a researcher. If a researcher doesn't want to ship anything they research, they're also not a researcher.
OpenAI has shipped nothing so far that is in any way suggesting the end of humanity or other such apocalyptic scenario. In total, these AI models have great potency in making our media, culture, civilization a mess of autogenerated content, and they can be very disruptive in a negative way. But no SINGLE COMPANY is in control of this. If it's not OpenAI, it'll be one of the other AI companies shipping comparable models right now.
OpenAI simply had the chance to lead, and they just gave up on it. Now some other company will lead. That's all that happened. OpenAI slowing down won't slow down AI in general. It just makes OpenAI irrelevant in 1-2 years time max.
That is, if you do not subscribe to one of the various theories that him sinking Twitter was intentional. The most popular ones I've come across are "Musk wants revenge for Twitter turning his daughter trans", "Saudi-Arabia wants to get rid of Twitter as a trusted-ish network/platform to prevent another Arab Spring" and "Musk wants to cozy up to a potential next Republican presidency".
Personally, I think all three have merits - because otherwise, why didn't the Saudis and other financiers go and pull an Altman on Musk? It's not Musk's personal money he's burning on Twitter; it's to a large degree other people's money.
> startup investing does not consist of trying to pick winners the way you might in a horse race. But there are a few people with such force of will that they're going to get whatever they want.
- The elite ML/AI researchers and engineers.
- The elite SV/tech venture capitalists.
These types come with their own followings - and I'm not saying that these two never intersect, but on one side you get a lot of brilliant researchers that truly are in it for the mission. They want to work there, because that's where ground zero is - both from the theoretical and applied point of view.
It's the ML/AI equivalent of working at CERN - you could pay the researchers nothing, or everything, and many wouldn't care - as long as they get to work on the things they are passionate about, AND they get to work with some of the most talented and innovative colleagues in the world. For these, it is likely more important to have top ML/AI heads in the organization, than a commercially-oriented CEO like Sam.
On the other side, you have the folks that are mostly chasing prestige and money. They see OpenAI as some sort of springboard into the elite world of top ML, where they'll spend a couple of years building cred, before launching startups, becoming VP/MD/etc. at big companies, etc. - all while making good money.
For the latter group, losing commercial momentum could indeed affect their will to work there. Do you sit tight in the boat, or do you go all-in on the next big player - if OpenAI crumbles the next year?
With that said, leadership conflicts and uncertainty are never good - whatever camp you're in.
Besides the argument that creshal brought up in a sibling comment that some people are more charismatic live and some are more charismatic through a camera:
In my observation, quite a few programmers are much more immune to "charisma influence" (or rather: manipulation by charisma) than other people. For example, in the past someone sent me an old video of Elon Musk where, in some TV show (I think), he explained how he wants to build a rocket to fly to the moon, and that person claimed the video makes you want Musk to succeed because of the confidence that Elon Musk shows. Well, this is not the impression that the video made on me ...
> The reason I was a founding donor to OpenAI in 2015 was not because I was interested in AI, but because I believed in Sam. So I hope the board can get its act together and bring Sam and Greg back.
I guess other people joined for similar reasons.
As regards the 'strange and disturbing' support, personally I thought OpenAI was doing cool stuff and it was a shame to break it because of internal politics.
What I say is, both lost their status quo (OpenAI as the performer, Sam as the leader), and both will have to re-adjust and re-orient.
The magic smoke has been let out. Even if you restore the "configuration" of OpenAI with Sam and all employees before Friday, it's almost impossible to get the same company from these parts.
Again, Sam was part of what made OpenAI what it is, and without it he won't be able to perform the same. The same is equally valid for OpenAI.
Things are changing, it's better to observe rather than dig for an entity or a person. Life is bigger than both of them, even when combined.
Seems like all these "business guys" think that's all it takes.
His previous endeavor was YC partner, right? So a rich VC turning into a CEO. To make even more money. How original. If any prominent figure were to be credited here beyond Ilya S., that would probably be Musk. Not S.A., who as a YC partner/whatever played Russian roulette with other rich folks' money all these years... As for MS hiring S.A., they are just doing the smart thing: if S.A. is indeed that awesome and everyone misses the "charisma", he'll pioneer AI and even become the next MS CEO... Or Satya Nadella will have his own "Windows Phone" moment with SamAI ;)
If your analysis is based solely off YouTube interviews, I think your perspective on Sam’s capabilities and personality is going to be pretty surface level and uninteresting.
Here's more about Emmett Shear, the Justin.tv co-founder who is the new interim CEO. It isn't paywalled. https://www.cnbc.com/2023/11/20/who-is-emmett-shear-the-new-...
Wrong question. From the behavior of the board this weekend, it seems like the question is more "Do you understand how he was fired?".
I.e.: immediately, on a Friday before market close, before informing close partners (like Microsoft, with its 49% stake).
The "why" can be correct, but if the "how" is wrong that's even worse in some regards. It means that the board's thinking process is wrong and they'll likely make poor decisions in the future.
I don't know much about Sam Altman, but the behavior of the board was closer to a huge scandal. I was expecting news of some crazy misdeed of some kind, not just a simple misalignment with values.
Under these misalignment scenarios, you'd expect a stern talking to, and then a forced resignation over a few months. Not an immediate firing / removal. During this time, you'd inform Microsoft (and other partners) of the decision to get everyone on the same page, so it all elegantly resolves.
EDIT: And mind you, I don't even think the "why" has been well explained this weekend. That's part of the reason why "how" is important, to make sure the "why" gets explained clearly to everyone.
As he approached, lightning crackled around him, as if he was commanding the elements themselves. With a deft flick of his wrist, he sent a bolt of lightning to scare away a school of flying sharks that were drawn by the storm. Landing on the deck of my boat with the grace of a superhero, he surveyed the chaos.
"Need a hand with those lobsters?" he quipped, as he single-handedly wrangled the crustaceans with an efficiency that would put any seasoned fisherman to shame. But Sam wasn't done yet. With a mere glance, he reprogrammed my malfunctioning GPS using his mind, charting a course to safety.
As the boat rocked violently, a massive wave loomed over us, threatening to engulf everything. Sam, unfazed, simply turned to the wave and whispered a few unintelligible words. Incredibly, the wave halted in its tracks, parting around us like the Red Sea. He then casually conjured a gourmet meal from the lobsters, serving it with a fine wine that materialized out of thin air.
Just as quickly as he had appeared, Sam mounted his drone once more. "Time to go innovate the weather," he said with a wink, before soaring off into the storm, leaving behind a trail of rainbows.
As the skies cleared and the sea calmed, I realized that in the world of Silicon Valley CEOs, having a "Sam Altman saved my butt" story was more than just a rite of passage; it was a testament to the boundless, almost mythical capabilities of a man who defied the very laws of nature and business. And I, a humble lobster fisherman, had just become part of that legend.
Of the $46 Billion Twitter deal ($44 equity + $2 debt buyout), it was:
* $13 Billion Loans (bank funded)
* $33 Billion Equity -- of this, ~$9 Billion was estimated to be investors (including Musk, Saudis, Larry Ellison, etc. etc.)
So it's about 30% other investors and 70% Elon Musk money.
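A quick sanity check of that split, taking the comment's figures at face value (the ~$9B outside-investor number is the commenter's estimate, not a verified one):

```python
# Equity portion of the ~$46B Twitter deal, per the figures above (in $B).
total_equity = 33.0        # equity slice of the deal
other_investors = 9.0      # estimated outside-investor equity
musk = total_equity - other_investors  # Musk's personal share (~$24B)

other_pct = other_investors / total_equity * 100
musk_pct = musk / total_equity * 100
print(f"other investors: {other_pct:.0f}%, Musk: {musk_pct:.0f}%")
# → other investors: 27%, Musk: 73%
```

Which rounds to roughly the "30% / 70%" split stated above.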
Whether or not he works at the company is symbolic and indicative of who is in charge: the people who want to slow AI progress, or the people who want to speed it up.
This guy is a villain.
This definitely sounds like someone the average person - including the average tech worker, exceptionally income-engorged as they may be - would want heading the, "Manhattan Project but potentially for inconceivably sophisticated social/economic/mind control et al." project. /s
Sam will be leading a new division at Microsoft. He will do alright now that he has access to all of the required resources.
> better to observe rather than dig for an entity or a person
Yes agreed. I don't know much about Sam personally and don't care. OpenAI itself has not made any fundamental breakthroughs in AI research. AI is much bigger than these two.
It's like an appeal to authority against an authority that isn't even saying what you're appealing for.
I'm starting to believe these workers are mostly financially motivated and that's why they follow him.
Pure f***g Greed. He is basically a front-man for a bunch of VCs/Angels/Influential Business Folks/Shady Investors/etc. who were betting on making big bucks through him.
Unfortunately, Ilya and his philosophical/ethical/moral stance has gotten in their way and hence they have let loose their dogs in the media to play up Sam Altman's "indispensability" to OpenAI.
However, pg has been working with many founders. And he has been working with Altman. I haven’t.
So while it may puzzle me, I do have to wonder what there is that I may be missing.
This is exactly it! Thanks for calling it out.
Sam Altman was just using the researchers and their IP to enrich himself (and his select group of friends) while shafting everybody else, including the researchers themselves.
The rest is "Halo Effect" - https://en.wikipedia.org/wiki/Halo_effect
Tech execs are trained to be toned down when in public and the camera is on them (so they don't say something that will make negative headlines later).
Example:
Elon's recent biography shows that he swears a lot casually while working (as do many of us). You wouldn't glean that from any of his public interviews.