The board may have been incompetent and shortsighted. Perhaps they should even try to bring Altman back, and reform themselves out of existence. But why would the vast majority of the workforce back an open letter that fails to signal where they stand on the crucial issue: the purpose of OpenAI and their collective work? Given the stakes the AI community likes to claim are at issue in the development of AGI, that strikes me as strange and concerning.
I have no inside information. I don't know anyone at OpenAI. This is all purely speculation.
Now that that's out of the way, here is my guess: money.
These people never joined OpenAI to "advance sciences and arts" or to "change the world". They joined OpenAI to earn money. They think they can make more money with Sam Altman in charge.
Once again, this is all speculation. I have not spoken to anyone at OpenAI, anyone at Microsoft, or anyone at all, really.
EDIT: I don't know why this is being downvoted. My speculation as to the average OpenAI employee's place in the global income distribution (of course wealth matters too) was not plucked out of thin air. See: https://www.vox.com/future-perfect/2023/9/15/23874111/charit...
I legitimately don't understand comments that dismiss the pursuit of better compensation because someone is "already among the highest lifetime earners on the planet."
Superficially it might make sense: if all your lifetime economic needs are already satisfied, you can optimize for other things. But does working at OpenAI actually put most employees in that position?
I probably fall into that "highest earners on the planet" bucket, statistically speaking. I certainly don't feel like it: I still live in a one-bedroom apartment, and I'm saving up for a down payment on a house / budgeting for retirement / etc. So I can completely understand someone at OpenAI signing such a letter if a move the board made would cut into their ability to move their family into a house / pay down student debt / plan for retirement / etc.