zlacker

[parent] [thread] 5 comments
1. branda+(OP)[view] [source] 2023-11-19 00:08:29
I hope there is an investigative report out there detailing why the 3 outsiders, 2 of them complete unknowns, are on the board, and how it truly benefits proper corporate governance.

That's way too much power for people who seemingly have no qualifications to make decisions about a company this impactful to society.

replies(1): >>sfink+i6
2. sfink+i6[view] [source] 2023-11-19 00:47:26
>>branda+(OP)
Unless "proper corporate governance" is exactly what makes the company dangerous to society, in which case you will need to have some external people in charge. You might want to set things up as a non-profit, though you'll need some structure where the non-profit wholly owns the for-profit wing given the amount of money flowing around...

Oh wait, that's what OpenAI is.

(To be clear, I don't know enough to have an opinion as to whether the board members are blindingly stupid, or principled geniuses. I just bristled at the phrase "proper corporate governance". Look around and see where all of this proper corporate governance is leading us.)

replies(1): >>branda+pa
3. branda+pa[view] [source] [discussion] 2023-11-19 01:14:41
>>sfink+i6
Well, with this extremely baffling level of incompetence, the suspect backgrounds of the outside members fit the bill (EA, SingularityU/shell companies... Logan Roy would call them "not serious people"; Quora - why, for the data mining?!).

The time to do this was before ChatGPT was unleashed on the world, before the MS investment, and before this odd governance structure was set up.

Yes, having outsiders on the board is essential. But come on, we need folks who have recognized industry experience in this field: leaders, people with deep backgrounds who are recognized for their contributions. Hinton, Ng, Karpathy, etc.

replies(2): >>sfink+Qb >>notabo+qg
4. sfink+Qb[view] [source] [discussion] 2023-11-19 01:26:51
>>branda+pa
Isn't that like saying that the Manhattan Project should have only been overseen by people with a solid physics background? Because they're the best judges of whether it's a good idea to build something that could wipe out all life on Earth? (And whether that's an exaggeration in hindsight is irrelevant; that was exactly the sort of question that the overseers needed to be considering at that time. Yes, physicists' advice would be necessary to judge those questions, but you couldn't do it with only physicists' perspectives.)
replies(1): >>branda+Rc
5. branda+Rc[view] [source] [discussion] 2023-11-19 01:35:15
>>sfink+Qb
Not sure I follow. The Manhattan Project was thoroughly staffed by many of the best in the field, in service to their country, to build a weapon before Germany did. There was no mission statement they abided by saying they were building a simple deterrent that wouldn't be used. There was no nuance about what the outcome could be, and there were no aspirations to agency over its use.

In the case of AI ethics, the people who are deeply invested in this are also some of the pioneers of the field who made it their life's work. This isn't a government agency. If the mission statement of guiding it toward non-profit AGI, as soon as possible and as safely as possible, were to be adhered to, and if where it is today is wildly off course, then having a competent board would have been key.

6. notabo+qg[view] [source] [discussion] 2023-11-19 01:59:35
>>branda+pa
> Quora - why, for data mining?

What shocked me most was that Quora IMHO _sucks_ for what it is.

I couldn't think of a _worse_ model to guide the development and productization of AI technologies. I mean, StackOverflow is actually useful, and it's threatened by the existence of CoPilot, et al.

If the CEO of Quora was on my board, I'd be embarrassed to tell my friends.
