zlacker

[parent] [thread] 8 comments
1. foobie+(OP)[view] [source] 2023-11-18 02:16:32
I think you've got it completely backward. A board doesn't do that unless it absolutely has to.

Think back in history. Consider, for example, the absolutely massive issues at Uber that had to become public before the board did anything. There is no way this is over some mere disagreement; there has to be serious financial, ethical, or social wrongdoing for the board to rush a move like this and put a company worth tens of billions of dollars at risk.

replies(2): >>apstls+hj >>tsimio+bz
2. apstls+hj[view] [source] 2023-11-18 04:44:36
>>foobie+(OP)
The board, like any, is a small group of people, and in this case one divided into two sides by conflicting ideological perspectives. I imagine these board members have far broader and longer-term perspectives and considerations factoring into their decision making than those of the vast majority of other companies/boards. Generalizing doesn’t seem particularly helpful here.
replies(1): >>foobie+Wx
3. foobie+Wx[view] [source] [discussion] 2023-11-18 06:42:44
>>apstls+hj
Generalizing is how we reason, and having been on boards and worked with them closely, I can straight up tell you that's not how it works.

In general, everyone is professional unless there's something really bad. This was quite unprofessionally handled, and so we draw the obvious conclusion.

replies(1): >>apstls+Op2
4. tsimio+bz[view] [source] 2023-11-18 06:56:06
>>foobie+(OP)
Per other profiles of OpenAI, this is an organization of true believers in the benefits and dangers of AGI. It's also a non-profit, not a company.

All this to say that the board is probably unlike the boards of the vast majority of tech companies.

replies(1): >>danbmi+CI
5. danbmi+CI[view] [source] [discussion] 2023-11-18 08:24:22
>>tsimio+bz
This. There were no investors on the board; as Jason @ all-in said, "that's just crazy."
replies(1): >>banana+tT
6. banana+tT[view] [source] [discussion] 2023-11-18 10:00:59
>>danbmi+CI
> as Jason @ all-in said

lol

> "that's just crazy".

Why is it crazy? The purpose of OpenAI is not to make investors rich; having investors on the board trying to make money for themselves would be crazy.

replies(1): >>Raston+hX
7. Raston+hX[view] [source] [discussion] 2023-11-18 10:31:10
>>banana+tT
Exactly. If we assume the issue was Altman wanting to pursue commercialization at the cost of safety, the board did its job by advancing its mandate of "AI for the benefit of humanity," though I'm not sure why they went with the nuclear option.
replies(1): >>tsimio+P21
8. tsimio+P21[view] [source] [discussion] 2023-11-18 11:17:50
>>Raston+hX
Very true.

Though I would go further than that: if that is indeed the reason, the board has proven itself very much incompetent. It takes real incompetence to invite this kind of shadow of scandal over what was a fundamentally reasonable disagreement.

9. apstls+Op2[view] [source] [discussion] 2023-11-18 19:30:38
>>foobie+Wx
I am also not a stranger to board positions. However, I have never been on the board of a non-profit that is developing technology with genuinely deep, and as-of-now unknown, implications for the status quo of the global economy and - at least as the OpenAI board clearly believes - the literal future and safety of humanity. I haven’t been on a board where a semi-idealist engineer board member has played a (if not _the_) pivotal role in arguably the most significant technical development in recent decades, and who maintains ideals and opinions completely orthogonal to the CEO’s.

Yes, generalizing is how we reason, because it lets us strip away information that is not relevant in most scenarios and reduce complexity and depth without losing much in most cases. My point is that this is not a scenario that fits in the set of “most cases.” This is probably one of the most unique, corner-casey examples of board dynamics in tech. Adhering to generalizations without considering applicability and corner cases doesn’t make sense here.
