Yet where OpenAI’s attempt at signaling may have been drowned out by other, even more conspicuous actions taken by the company, Anthropic’s signal may have simply failed to cut through the noise. By burying the explanation of Claude’s delayed release in the middle of a long, detailed document posted to the company’s website, Anthropic appears to have ensured that this signal of its intentions around AI safety has gone largely unnoticed [1].
That is indeed quite the paper to write whilst on the board of OpenAI, to say the least.
[1] https://cset.georgetown.edu/publication/decoding-intentions/
During the call, Jason Kwon, OpenAI’s chief strategy officer, said the board was endangering the future of the company by pushing out Mr. Altman. This, he said, violated the members’ responsibilities.
Ms. Toner disagreed. The board’s mission is to ensure that the company creates artificial intelligence that “benefits all of humanity,” and if the company was destroyed, she said, that could be consistent with its mission. In the board’s view, OpenAI would be stronger without Mr. Altman.
It strikes me as exactly the sort of thing she should be writing given OpenAI's charter. Recognizing and rewarding work towards AI safety is good practice for an organization whose entire purpose is the promotion of AI safety.
Yeah, such a person totally blocks your startup from making billions of dollars instead of benefiting humanity.
Oh wait...
On the other hand, it's quite apparent that essentially all of the OpenAI workforce (understandably, given a compensation package that creates a financial interest at odds with the nonprofit's mission), and in particular the entire executive team, saw the charter as a useful PR fiction rather than a mission. The possible exception is Ilya, though his flip-flop in the middle of this action may mean he saw it the same way, but concluded that, given the conflict, dumping Sam and Greg was the only way to preserve the fiction, and that whatever cost doing so carried would be worthwhile.
Altman is past borderline.