That in itself is not critical in the mid to long term; what matters is how fast they figure out WTF they want and recover from it.
The stakes are gigantic. They may even have AGI cooking inside.
My interpretation is relatively basic, and maybe simplistic, but here it is:
- Ilya had grievances with Sam Altman rushing development and releases, and with his conflicts of interest from his other new ventures.
- Adam was alarmed by GPTs competing with his recently launched Poe.
- The other two board members were tempted by the ability to control the golden goose that is OpenAI, potentially the most important company in the world, recently valued at $90 billion.
- They decided to organize a coup, but Ilya didn't think it would get that far out of hand, while the other three saw only power and $$$ in sticking to their guns.
That's it. It's not as clean and nice as a movie narrative, but life never is. Four board members aligned to kick Sam out, and Ilya wants none of it at this point.
Too many people quit too quickly unless OpenAI are also absolute masters of keeping secrets, which became rather doubtful over the weekend.
“You are fanciful, mon vieux,” said M. Bouc.
“It may be so. But I could not rid myself of the impression that evil had passed me by very close.”
“That respectable American LLM?”
“That respectable American LLM.”
“Well,” said M. Bouc cheerfully, “it may be so. There is much evil in the world.”
Also, when I said "cooking AGI" I didn't mean an actual superintelligent being ready to take over the world; I meant just research that seems promising, if in its early stages, but enough to seem potentially very valuable.
Anyway, their actions speak for themselves. Also calling the likes of GPT-4, DALL-E 3 and Whisper "normal things" is hilarious.