zlacker

[parent] [thread] 1 comments
1. lucubr+(OP)[view] [source] 2023-11-19 04:50:31
...or they don't respond to these pressure tactics, continue talking to their employees to ameliorate legitimate concerns, and accept that some of Sam's hires will leave to join him. OpenAI's core (remembering that OpenAI's Charter doesn't demand that it make cool consumer/developer AI products; it demands that OpenAI build AGI safely) is not the ChatGPT product team or admin, it is the research team that Ilya leads (or led until a month ago, when Sam tried to sideline him). The company isn't going to leave to follow Sam, or at least the scientists and engineers aren't. They've lost some technical leads that Sam hired and will probably lose more, but it's worth it to make sure that OpenAI actually follows its Charter.
replies(1): >>chatma+T
2. chatma+T[view] [source] 2023-11-19 04:56:14
>>lucubr+(OP)
> continue talking to their employees to ameliorate legitimate concerns, and accept that some of Sam's hires will go to join him

This is wishful thinking. If an employee is inclined to follow the innovation, it's clear where they'll go.

But otherwise, the point you raise is a good one: this is about the board's charter. Many of us are presuming a financial incentive, but the structure of the company means they might actually be incentivized to stop the continued development of the technology if they think it poses a risk to humanity. Now, I personally find this to be hogwash, but it is a legitimate argument for why the board might actually be right in acting in a way that appears irrational.
