Makes me wonder what the world would look like if, say, the Manhattan Project had been managed the same way.
Well, a younger me working at OpenAI would have resigned, at the latest, after my colleagues staged a coup against the board out of what I view as a personality cult. Probably would have resigned once the third CEO was announced. Older me would wait for a new gig to be lined up before resigning, starting the search after CEO number 2 at the latest.
The cycles are getting faster, though. It took FTX a little longer to go from hottest startup to a crash-and-burn trajectory; OpenAI did it faster. I just hope this helps cool down the ML-sold-as-AI hype a notch.
People keep talking about this. That was never going to happen. Look at Sam Altman's career: he's all about startups and building companies. Moreover, I can't imagine he would have agreed to sign any kind of contract with OpenAI that required exclusivity. Know who you're hiring; know why you're hiring them. His "side-projects" could have been hugely beneficial to them over the long term.
How can you make a claim like this when, right or wrong, Sam's independence is literally, currently, tanking the company? How could allowing Sam to do what he wants benefit OpenAI, the non-profit entity?
And maybe it’s not. The big mistake people make is hearing "non-profit" and thinking it implies greater morality. It’s the same mistake as assuming everyone who is religious is therefore more moral (worth pointing out that religions are nonprofits as well).
Most hospitals are nonprofits, yet they still make substantial profits and overcharge customers. People are still people, and still have motives; they don't suddenly become more moral when they join a non-profit board. In many ways, removing the motive with the most direct connection to quantifiable results (profit) can actually make things worse. Anyone who has seen how nonprofits work knows how dysfunctional they can be.
No one knows why the board did this. No one is talking about that part. Yet everyone is on Twitter talking shit about the situation.
I have worked with a lot of PhDs, and some of them can be "disconnected" from anything that isn't their research.
This looks a lot like that: disconnected from what average people would do, almost childlike (not child-ish; child-like).
Maybe this isn't the group of people who should be responsible for "alignment".
Not that I had any illusions about this being a fig leaf in the first place.
If it is just ML sold as AI hype, are you really worried about the threat of AI?
Are you talking about American hospitals?
Let's take personalities out of it and see if it makes more sense:
How could a new supply of highly optimized, lower-cost AI hardware benefit OpenAI?
In addition, public hospitals still charge for their services; what changes, in some nations, is who pays the bill (the government as the insuring body vs. a private insurer or the individual).
They don't make large profits otherwise they wouldn't be nonprofits. They do have massive revenues and will find ways to spend the money they receive or hoard it internally as much as they can. There are lots of games they can play with the money, but experiencing profits is one thing they can't do.
The danger of generative AI is that it disrupts all kinds of things: the arts, writing, journalism, propaganda... That threat already exists, and the tech no longer being hyped might allow us to properly address that problem.
This is a common misunderstanding. Non-profits/501(c)(3) can and often do make profits. 7 of the 10 most profitable hospitals in the U.S. are non-profits[1]. Non-profits can't funnel profits directly back to owners, the way other corporations can (such as when dividends are distributed). But they still make profits.
But that's besides the point. Even in places that don't make profits, there are still plenty of personal interests at play.
[1] https://www.nytimes.com/2020/02/20/opinion/nonprofit-hospita...
Priceless. The modern version of Pascal's wager.
Honestly, I think they did that to themselves.
Any reason good enough to fire him is good enough to share with the interim CEO and the rest of the company, if not the entire world. If they can’t even do that much, you can’t blame employees for losing faith in their leadership. They couldn’t even tell SAM ALTMAN why, and he was the one getting fired!
It was not possible for a war-time government crash project to have been managed the same way. During WW2 the existential fear was an embodied threat currently happening. No one was even thinking about a potential for profits or even any additional products aside from an atomic bomb. And if anyone had ideas on how to pursue that bomb that seemed like a decent idea, they would have been funded to pursue them.
And this is not even mentioning the fact that security was tight.
I'm sure there were scientists who disagreed with how the Manhattan project was being managed. I'm also sure they kept working on it despite those disagreements.
I've only really been close to one (the owner of the small company I worked at started one), and in the past I did some consulting work for another, but that describes what I saw in both situations fairly aptly. From my limited experience, there's a massive amount of power and ego wrapped up in the creation and running of these things. Being invited onto a board is one thing, but starting a non-profit takes a lot of time and effort, and that time and effort could usually go to some existing non-profit instead. So I think it's worth considering why someone would opt for the much more complicated and harder route rather than just donating time and money to something that helps in roughly the same way.
Profit is money that ends up in the bank to be used later. Compensation is what gets spent on yachts. Anything spent on hospital supplies is an expense. This stuff matters.
Then where do these profits go?
The fact that Altman and Brockman were hired so quickly by Microsoft gives a clue: it takes time to hire someone. For one thing, they need time to decide. These guys were hired by Microsoft between close-of-business on Friday and start-of-business on Monday.
My supposition is that this hiring had been in the pipeline for a few weeks. The board of OpenAI found out on Thursday and went ballistic, understandably (lack of candidness). My guess is there are more shenanigans to uncover. I suspect that Altman gave Microsoft an offer they couldn't refuse, and that OpenAI was already screwed by Thursday. So, realizing that OpenAI was done for, the board figured "we might as well blow it all up".
https://en.wikipedia.org/wiki/501(c)_organization
"Religious, Educational, Charitable, Scientific, Literary, Testing for Public Safety, to Foster National or International Amateur Sports Competition, or Prevention of Cruelty to Children or Animals Organizations"
However, many other forms of organizations can be non-profit, with utterly no implied morality.
Your local Frat or Country Club [ 501(c)(7) ], a business league or lobbying group [ 501(c)(6), the 'NFL' used to be this ], your local union [ 501(c)(5) ], your neighborhood org (that can only spend 50% on lobbying) [ 501(c)(4) ], a shared travel society (timeshare non-profit?) [ 501(c)(8) ], or your special club's own private cemetery [ 501(c)(13) ].
Or you can do sneaky stuff and change your 501(c)(3) charter over time like this article notes. https://stratechery.com/2023/openais-misalignment-and-micros...
If this incident is representative, I'm not sure there was ever a possibility of good governance.
So yeah, Mayo Clinic makes a $2B profit. That money isn't going to shareholders, though; it's funds for a future building, or increased salaries, or expanded research, or something. It supposedly has to be used for the mission. What is the outrage over these orgs making this kind of profit?
This is not an interview process for hiring a junior dev at FAANG.
If you're Sam & Greg, and Satya gives you an offer to run your own operation with essentially unlimited funding and the ability to bring over your team, then you can decide immediately. There is no real lower bound of how fast it could happen.
Why would they have been able to decide so quickly? Probably because they prioritized bringing over the entire team as fast as possible. Even though they could raise a lot of money for a new company, that would take time; they consider it so critical to hire the team over within days that they'll accept whatever downsides there may be to being a subsidiary of Microsoft.
This is what happens when principals see opportunity and are unencumbered by bureaucratic checks. They can move very fast.
Employees might suddenly feel they deserve to be paid a lot more. Suppliers will play a lot more hardball in negotiations. A middle manager may give a sinecure to their cousin.
And upper managers can extract absolutely everything through lucrative contracts to their friends and relatives. (Of course the IRS would clamp down on obvious self-dealing, but that wouldn't make such schemes disappear. It would make them far more complicated and expensive instead.)
Outside of the US, private hospitals tend to be overtly for-profit. Price-gouging "non-profit" hospitals are mostly an American phenomenon.
That is, I think Greg and Sam were likely fired because, in the board's view, they were already running OpenAI Global LLC more as a for-profit subsidiary of Microsoft driven by Microsoft's commercial interests than as the organization it was publicly declared to be, and that the board very much intended it to be: one able to earn and return profit but focused on the nonprofit's mission. And, apparently, in Microsoft's view, they were very good at that, so putting them in a role overtly exactly like that is a no-brainer.
And while it usually takes a while to vet and hire someone for a position like that, it doesn't if you've been working for them closely in something that is functionally (from your perspective, if not on paper for the entity they nominally reported to) a near-identical role to the one you are hiring them for, and the only reason they are no longer in that role is because they were doing exactly what you want them to do for you.
Before the board's actions this Friday, the company was on one of the most incredible success trajectories in the world. Whatever Sam had been doing as CEO worked.
It takes time if you're a normal employee under standard operating procedure. If you really want to you can merge two of the largest financial institutions in the world in less than a week. https://en.wikipedia.org/wiki/Acquisition_of_Credit_Suisse_b...
https://en.wikipedia.org/wiki/German_nuclear_weapons_program
I don't know anything about how executives get hired. But supposedly this all happened between Friday night and Monday morning. This isn't a simple situation; surely one man working through the weekend can't decide to set up a new division and appoint two poached executives to head it without consulting lawyers and other colleagues. I mean, surely they'd need to go through Altman's and Brockman's contracts with OpenAI to check that the hiring is even legal?
That's why I think this has been brewing for at least a week.
Hey, maybe this means the AGIs will fight amongst themselves and thus give us the time to outwit them. :D
I totally agree. I don't think this is universally true of non-profits, but people are going to look for value in other ways if direct cash isn't an option.
To entertain your theory, let's say they were planning on hiring him prior to the firing. If that was the case, why is everybody so upset that Sam got fired, and why is he working so hard to get reinstated to a role he was about to leave anyway?
That just sounds like a biased and overly emotive+naive response on your part.
Again, most hospitals in the world operate the same way as in the US. You can go almost anywhere in SE Asia, Latin America, Africa, etc. and see this. There's a lot more to "outside the US" than Western+Central Europe/CANZUK/Japan. The only difference is that there are strong business incentives to keep the system in place, since the entire industry (in the US) is valued at more than most nations' GDP.
But feel free to keep twisting the definition or moving goalposts to somehow make the American system extra nefarious and unique.
There may be drawbacks to the "instant hiring" model.