zlacker

OpenAI board in discussions with Sam Altman to return as CEO

submitted by medler+(OP) on 2023-11-18 22:51:39 | 1243 points 1573 comments
[view article] [source] [go to bottom]

NOTE: showing posts with links only
21. gkober+z1[view] [source] 2023-11-18 23:00:36
>>medler+(OP)
I'd bet money Satya was a driver of this reversal.

I genuinely can't believe the board didn't see this coming. I think they could have won in the court of public opinion if their press release said they loved Sam but felt like his skills and ambitions diverged from their mission. But instead, they tried to skewer him, and it backfired completely.

I hope Sam comes back. He'll make a lot more money if he doesn't, but I trust Sam a lot more than whomever they ultimately replace him with. I just hope that if he does come back, he doesn't use it as a chance to consolidate power – he's said in the past it's a good thing the board can fire him, and I hope he finds better board members rather than eschewing a board altogether.

EDIT: Yup, Satya is involved https://twitter.com/emilychangtv/status/1726025717077688662

◧◩
55. ren_en+J3[view] [source] [discussion] 2023-11-18 23:08:51
>>gkober+z1
Everything about it screams amateur hour, from the language and timing of the press release, to the fact that they didn't notify Microsoft, to how they apparently completely failed to see how employees and customers would react to the news. Ilya saying the circumstances of Altman's removal "weren't ideal" shows how naive they were. They had no PR strategy to control the narrative and let rumors run wild.

I doubt he returns. Now he can start a for-profit AI company, poach OpenAI's talent, and still look like the good guy in the situation. He was apparently already talking to the Saudis about raising billions for an Nvidia competitor - >>38323939

I have to wonder how much of this was contrived as a win-win: either the OpenAI board does what he wants, or he gets a free out to start his own company without looking like he's purely chasing money.

74. razoda+B4[view] [source] 2023-11-18 23:13:16
>>medler+(OP)
Literally Sam on Monday: https://youtu.be/O5WBfOK5syA?t=44
101. benzib+p5[view] [source] 2023-11-18 23:17:05
>>medler+(OP)
New developments faster than you can read the stories about them... https://www.nytimes.com/2023/11/18/technology/ousted-openai-... (https://archive.vn/4U6tu)
126. pityJu+i6[view] [source] 2023-11-18 23:20:43
>>medler+(OP)
Some articles that came out around the same time as this piece:

- https://www.nytimes.com/2023/11/18/technology/ousted-openai-...

- https://www.forbes.com/sites/alexkonrad/2023/11/18/openai-in...

◧◩◪
131. gkober+u6[view] [source] [discussion] 2023-11-18 23:21:56
>>px43+05
The latter. Microsoft didn't know about the firing until literally a minute before we did, and despite a calm response externally, there are reports Satya is furious.

Source: https://arstechnica.com/information-technology/2023/11/repor...

139. testfo+K6[view] [source] 2023-11-18 23:23:15
>>medler+(OP)
I wonder what sort of proper insurance backstops are in place.

E.g. https://www.thehartford.com/management-liability-insurance/d...

"The Who, What & Why of Directors & Officers Insurance

The Hartford has agents across the country to help with your insurance needs. Directors and officers (D&O) liability insurance protects the personal assets of corporate directors and officers, and their spouses, in the event they are personally sued by employees, vendors, competitors, investors, customers, or other parties, for actual or alleged wrongful acts in managing a company.

The insurance, which usually protects the company as well, covers legal fees, settlements, and other costs. D&O insurance is the financial backing for a standard indemnification provision, which holds officers harmless for losses due to their role in the company. Many officers and directors will want a company to provide both indemnification and D&O insurance."

◧◩◪◨
144. silenc+Z6[view] [source] [discussion] 2023-11-18 23:24:20
>>tfehri+A4
Okay, this is honestly annoying. What is this thing with the word "safety" becoming some weasel word when it comes to AI discussions?

What exactly do YOU mean by safety? That they go at the pace YOU decide? Does it mean they make a "safe space" for YOU?

I've seen nothing to suggest they aren't "being safe". Actually ChatGPT has become known for censoring users "for their own good" [0].

The argument I've seen is: one "side" thinks things are moving too fast, therefore the side that wants to move slower is the "safe" side.

And that's it.

[0]: https://www.youtube.com/watch?v=jvWmCndyp9A&t

◧◩◪
171. Immova+98[view] [source] [discussion] 2023-11-18 23:30:32
>>pauldd+k5
Im assuming they mean Silicon Valley the TV show [1]

[1] https://en.wikipedia.org/wiki/Silicon_Valley_(TV_series)

207. I_am_t+p9[view] [source] 2023-11-18 23:37:11
>>medler+(OP)
I don't trust Sam Altman since he reacted this way to people worried about privacy: https://youtu.be/4HFyXYvMwFc?t=3201
213. razoda+C9[view] [source] 2023-11-18 23:38:12
>>medler+(OP)
https://x.com/airesearch12/status/1725979335171989571?s=20

I'm glad that there are other companies and open source efforts to fall back on.

As an API user of the GPT models I've always had it at the back of my mind that it would be unwise to 100% rely on OpenAI for the core of any product I built.

The recent rocking of the boat is further justification for my stance in that regard.

229. jmyeet+5a[view] [source] 2023-11-18 23:40:22
>>medler+(OP)
This board needs to be fired. Every single one of them.

I don't understand how Microsoft, after having invested billions, doesn't have a board seat. If they did, I doubt this would've ever happened. I'm not sure why Microsoft let that happen.

But even ignoring that, the board making a decision as impactful as this without consulting their major investors is a dereliction of duty. That alone justifies getting rid of all of them because all of them are complicit in not consulting Microsoft (and probably others).

I have no idea why Sam was fired but it really feels just like an internal power struggle. Maybe there was genuine disagreement about the direction for the company but you choose a leader to make decisions. Ousting the CEO under vague descriptions of "communications with the board" just doesn't pass the smell test.

I'm reminded of this great line from Roger Sterling [1]:

> Half the time this business comes down to "I don't like this guy"

So much of working, performance reviews, hiring and firing decisions and promotions is completely vibes-based.

[1]: https://www.youtube.com/watch?v=DY20L_u_WxM

◧◩◪◨⬒
246. threes+Da[view] [source] [discussion] 2023-11-18 23:43:40
>>Meekro+m7
> No one has ever been able to demonstrate an "unsafe" AI of any kind

"A man has been crushed to death by a robot in South Korea after it failed to differentiate him from the boxes of food it was handling, reports say."

https://www.bbc.com/news/world-asia-67354709

◧◩◪
251. gkober+Oa[view] [source] [discussion] 2023-11-18 23:44:20
>>felixg+Q9
I genuinely believe Worldcoin/World ID is terrible for optics and is not something Sam should have put his name on.

That being said, here's my steelman argument: Sam is scared of the ramifications of AI, especially financially. He's experimenting with a lot of things, such as Basic Income (https://www.ycombinator.com/blog/basic-income), rethinking capitalism (https://moores.samaltman.com/), and Worldcoin.

He's also likely worried about what happens if you can't tell who is human and who isn't. We will certainly need a system at some point for verifying humanity.

Worldcoin doesn't store iris information; it just stores a hash for verification. It's an attempt to make sure everyone gets one, and to keep things fair and more evenly distributed.

(Will it work? I don't think so. But to call it an eyeball identity scam and dismiss Sam out of hand is wrong)
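
To make the "stores a hash" idea concrete, here's a toy sketch of the general one-way-hash approach. This is not Worldcoin's actual protocol: the iris-code format, the registry, and the uniqueness check are all invented for illustration, and a real biometric system would need fuzzy matching rather than an exact hash because scans are noisy.

    import hashlib

    def iris_hash(iris_code: bytes) -> str:
        # One-way hash of a (hypothetical) binary iris code; the raw code
        # never needs to be stored, only this digest.
        return hashlib.sha256(iris_code).hexdigest()

    class Registry:
        # Keeps only hashes, used purely to enforce "one ID per person".
        def __init__(self) -> None:
            self._seen: set[str] = set()

        def enroll(self, iris_code: bytes) -> bool:
            h = iris_hash(iris_code)
            if h in self._seen:
                return False  # duplicate: same iris code already enrolled
            self._seen.add(h)
            return True

    registry = Registry()
    print(registry.enroll(b"example-iris-code"))  # True: first enrollment
    print(registry.enroll(b"example-iris-code"))  # False: rejected as duplicate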

◧◩◪◨⬒⬓
254. treesc+Sa[view] [source] [discussion] 2023-11-18 23:44:42
>>LightM+W8
> Only a fraction of Microsoft’s $10 billion investment in OpenAI has been wired to the startup, while a significant portion of the funding, divided into tranches, is in the form of cloud compute purchases instead of cash, according to people familiar with their agreement.

Per https://www.semafor.com/article/11/18/2023/openai-has-receiv...

◧◩◪
272. Terret+rb[view] [source] [discussion] 2023-11-18 23:48:50
>>abi+N8
As a for-instance (and I don't know), it's plausible Microsoft has a full license to use all the tech, is the cloud operating it, and has escape clauses tied to "key persons".

That combination could mean firing the CEO results in Microsoft getting to keep everything, and OpenAI being some code and models without a cloud, plus whichever people wouldn't cross the street with Altman.

I do not know about OpenAI's deal with Microsoft. But I have been on both sides of deals written that way, where I've been the provider's key person and the contract offered code escrow, and I've been a buyer that tied the contract to a set of key persons and had full source code rights, surviving any agreement.

You do this if you think the tech could be existential to you, and you pay a lot for it because effectively you're pre-buying the assets after some future implosion. OTOH, it tends not to be well understood by most people involved in the hundreds of pages of paperwork across a dozen or more interlocking agreements.

. . .

EDIT TO ADD:

This speculative article seems to agree with my speculation: daddy has the cloud car keys, and a key-person ouster could be a breach:

> Only a fraction of Microsoft’s $10 billion investment in OpenAI has been wired to the startup, while a significant portion of the funding, divided into tranches, is in the form of cloud compute purchases instead of cash, according to people familiar with their agreement.

> That gives the software giant significant leverage as it sorts through the fallout from the ouster of OpenAI CEO Sam Altman. The firm’s board said on Friday that it had lost confidence in his ability to lead, without giving additional details.

> One person familiar with the matter said Microsoft CEO Satya Nadella believes OpenAI’s directors mishandled Altman’s firing and the action has destabilized a key partner for the company. It’s unclear if OpenAI, which has been racking up expenses as it goes on a hiring spree and pours resources into technological developments, violated its contract with Microsoft by suddenly ousting Altman.

https://www.semafor.com/article/11/18/2023/openai-has-receiv...

◧◩
275. cthalu+yb[view] [source] [discussion] 2023-11-18 23:49:13
>>jmyeet+5a
The entire setup is structured so that they are not supposed to be beholden to investors. If it turns out that they ultimately are, and Microsoft is the leverage to get Altman back, then they explicitly failed at the goal of their governance structure.

The fundamental thing you are missing here is that the charter of the non-profit, the structure of its ownership of the for-profit, and the for-profit's operating agreement are all designed so that financial incentives for stakeholders are not supposed to be the thing the company and non-profit are beholden to.

It may turn out that the practical reality is different from the intent, but everything you're talking about was a feature and not a bug of how this whole thing was set up.

https://openai.com/our-structure

◧◩◪
287. everfr+Pb[view] [source] [discussion] 2023-11-18 23:50:38
>>Kuinox+F9
Yes, Archive blocks Cloudflare DNS. People say it’s intentional, but whether that’s true isn’t clear to me.

>>19828702

◧◩◪◨⬒⬓⬔
293. femiag+Yb[view] [source] [discussion] 2023-11-18 23:51:00
>>jonath+Fb
Oh for sure.

https://en.wikipedia.org/wiki/Manhattan_Project

◧◩
317. CSMast+Jc[view] [source] [discussion] 2023-11-18 23:54:24
>>razoda+B4
I was expecting the clip from The Wire: https://youtu.be/WP-lrftLQaQ?si=0KSbJqhZpKtWeJ0A
◧◩◪◨⬒
318. DebtDe+Nc[view] [source] [discussion] 2023-11-18 23:54:33
>>Terret+h9
>She previously cofounded Fellow Robots

Near as I can tell they never actually launched a product. Their webpage is a GoDaddy parked domain page. Their Facebook page is pictures of them attending conferences and sharing their excitement for what Boston Dynamics and other ACTUAL robotics companies were doing.

>she launched with a colleague from Singularity University

https://en.wikipedia.org/wiki/Singularity_Group

Just lol.

>then cofounded GeoSim Systems

Seems to be a consulting business for creating digital twins that never really got off the ground.

https://www.linkedin.com/in/tasha-m-25475a54/details/experie...

It doesn't appear she's ever had a real job. Someone in the other thread commented that her profile reeks of a three-letter-agency plant. Possible. Either that or she's just a dabbler funded by her actor husband.

330. croes+cd[view] [source] 2023-11-18 23:56:10
>>medler+(OP)
>>38325611

So MS shows who's in control. Say goodbye to OpenAI.

From now on it's all for MS's profit only.

◧◩◪◨⬒⬓
363. sensei+ue[view] [source] [discussion] 2023-11-19 00:01:44
>>threes+Da
Oh no, do not use that. That was servo-based. AI drones are what I think is the real "safety issue".

>>38199233

◧◩◪◨
393. felixg+jf[view] [source] [discussion] 2023-11-19 00:05:32
>>gkober+Oa
Sam Altman is 'rethinking capitalism' in the same way a jackal rethinks and disrupts sheep flocks. Are we thinking about the same guy? I'm thinking of this one: https://www.youtube.com/watch?v=KhhId_WG7RA
419. sergio+dg[view] [source] 2023-11-19 00:10:20
>>medler+(OP)
"Leaked picture of @sama during his rehiring google meet with the OpenAI board."

https://pbs.twimg.com/media/F_QXAKEW0AAQpPC?format=png&name=...

◧◩◪
440. cloudk+Og[view] [source] [discussion] 2023-11-19 00:12:54
>>SSLy+Md
https://www.axios.com/2023/11/18/openai-memo-altman-firing-m...
◧◩
483. 3np+xi[view] [source] [discussion] 2023-11-19 00:22:21
>>benzib+p5
Thread: >>38326146
◧◩
488. prng20+Fi[view] [source] [discussion] 2023-11-19 00:22:58
>>ilaksh+wa
There are still plenty of long-term investors out there. Amazon barely made a profit for many years, and Bezos made it very clear that a quick profit wasn’t his focus.

https://www.theverge.com/2013/4/12/4217794/jeff-bezos-letter...

◧◩◪◨⬒⬓
516. fnfjfk+Pj[view] [source] [discussion] 2023-11-19 00:30:54
>>johnfn+3f
An even better example, as someone that also could not do a trick in a half pipe: https://deadspin.com/the-winter-olympics-feature-2-951-of-th...
◧◩◪◨⬒⬓
517. cmrdpo+Qj[view] [source] [discussion] 2023-11-19 00:30:54
>>adastr+Th
What he wanted was to do AI development at a larger scale than what universities and corporate R&D teams were doing. Or so he says:

>>38325407

Having shown this was possible, he could easily go do it elsewhere.

◧◩
527. hitekk+Ak[view] [source] [discussion] 2023-11-19 00:35:35
>>johnwh+g1
What was top comment yesterday becomes a farce today >>38313026

> Don’t piss off an irreplaceable engineer or they’ll fire you. not taking any sides here.

One scientist's power trip (Ilya is not an engineer) triggers the power fantasy of the extremely online.

◧◩◪
582. Scaevo+Lm[view] [source] [discussion] 2023-11-19 00:50:12
>>anonyl+Jl
He dined with Xi just a few days ago. https://youtu.be/lKNwoEm-R3E
◧◩◪◨⬒⬓
599. mv4+tn[view] [source] [discussion] 2023-11-19 00:53:44
>>sudosy+Ch
Looks like he was hired.

https://www.nytimes.com/2018/04/19/technology/artificial-int...

◧◩◪
608. pbadam+Pn[view] [source] [discussion] 2023-11-19 00:56:01
>>fnordp+f7
Something I don't fully understand: from [1], Altman was an employee of the for-profit entity. So to fire him, wouldn't the non-profit board be acting in its capacity as a director of the for-profit entity (and thus have a fiduciary duty to all shareholders of the for-profit entity)? Non-profit governance is traditionally lax, but would the other shareholders have a case against the members of the non-profit board for acting recklessly w/ respect to shareholder interests in their capacity as directors of the for-profit?

This corporate structure is so convoluted that it's difficult to figure out what the actual powers/obligations of the individual agents involved are.

[1] https://openai.com/our-structure

◧◩◪◨
609. gkober+Sn[view] [source] [discussion] 2023-11-19 00:56:23
>>randyr+Qh
He joined 5 years ago: https://twitter.com/sama/status/988859465863647234?lang=en
611. yeck+Wn[view] [source] 2023-11-19 00:56:41
>>medler+(OP)
Was the article changed for this? It used to be this one from The Verge: https://www.theverge.com/2023/11/18/23967199/breaking-openai... but has since been changed to https://www.forbes.com/sites/alexkonrad/2023/11/18/openai-in...
◧◩◪◨
612. mv4+Xn[view] [source] [discussion] 2023-11-19 00:56:42
>>static+pa
No, they recruited top talent by providing top pay.

From 2016: https://www.nytimes.com/2018/04/19/technology/artificial-int...

To 2023: https://www.businessinsider.com/openai-recruiters-luring-goo...

◧◩
643. magica+ep[view] [source] [discussion] 2023-11-19 01:06:24
>>crop_r+w3
> Instead the board thought it was a boxing match

Or maybe chess[1].

[1]: https://www.youtube.com/watch?v=0cv9n0QbLUM

659. dang+8q[view] [source] 2023-11-19 01:12:46
>>medler+(OP)
Related: https://www.forbes.com/sites/alexkonrad/2023/11/18/openai-in...

(via >>38325611 , but we merged those comments hither)

717. 0xDEAF+eu[view] [source] 2023-11-19 01:45:16
>>medler+(OP)
>One AI-focused venture capitalist noted that following the departure of Hoffman, OpenAI’s non-profit board lacked much traditional governance. “These are not the business or operating leaders you would want governing the most important private company in the world,” they said.

From https://www.forbes.com/sites/alexkonrad/2023/11/17/these-are... (linked in OP)

I'd be interested in a discussion of the merits of "traditional governance" here. Traditional private companies are focused on making a profit, even if that has negative side effects like lung cancer or global warming. If OpenAI is supposed to shepherd AGI for all humanity, what's the strongest case for including "traditional governance" type people on the board? Can we be explicit about the benefits they bring to the table, if your objective is humanitarian?

Personally I would be concerned that people who serve on for-profit boards would have the wrong instinct, of prioritizing profit over collective benefit...

742. meetpa+Kv[view] [source] 2023-11-19 01:55:59
>>medler+(OP)
Update on the OpenAI drama: Altman and the board had till 5pm to reach a truce where the board would resign and he and Brockman would return. The deadline has passed and mass resignations expected if a deal isn’t reached ASAP

https://twitter.com/alexeheath/status/1726055095341875545

◧◩
801. silenc+kz[view] [source] [discussion] 2023-11-19 02:16:30
>>crop_r+w3
The major investors whose money is on the line and who are funding the venture (Microsoft, Sequoia, and Khosla) were not given advance warning or any input into how this would impact their investment.

I would definitely say the board screwed up.

https://www.forbes.com/sites/alexkonrad/2023/11/17/openai-in...

◧◩◪◨
868. sigmar+dE[view] [source] [discussion] 2023-11-19 02:45:06
>>ummonk+5D
{the entity} of which they are the board does not have shareholders, and unless there's something funky in the charter, there's no mechanism to fire members of the board (other than board action). The shareholders of the LLC aren't relevant in this context, as they definitely can't fire the nonprofit's board (which is the whole point of the weird structuring). https://openai.com/our-structure
◧◩◪◨
871. milksh+iE[view] [source] [discussion] 2023-11-19 02:45:39
>>hskali+uD
First, the for-profit subsidiary is fully controlled by the OpenAI Nonprofit. We enacted this by having the Nonprofit wholly own and control a manager entity (OpenAI GP LLC) that has the power to control and govern the for-profit subsidiary.

Second, because the board is still the board of a Nonprofit, each director must perform their fiduciary duties in furtherance of its mission—safe AGI that is broadly beneficial. While the for-profit subsidiary is permitted to make and distribute profit, it is subject to this mission. The Nonprofit’s principal beneficiary is humanity, not OpenAI investors.

Third, the board remains majority independent. Independent directors do not hold equity in OpenAI. Even OpenAI’s CEO, Sam Altman, does not hold equity directly. His only interest is indirectly through a Y Combinator investment fund that made a small investment in OpenAI before he was full-time.

https://openai.com/our-structure

◧◩◪◨
872. latexr+qE[view] [source] [discussion] 2023-11-19 02:46:08
>>blooma+jD
It’s from the opening credits of Monty Python and the Holy Grail.

https://www.youtube.com/watch?v=79TVMn_d_Pk

◧◩◪◨
882. pests+NE[view] [source] [discussion] 2023-11-19 02:48:58
>>everfr+Pb
Archive explaining their reasoning: https://twitter.com/archiveis/status/1018691421182791680

CEO of Cloudflare explaining: >>19828702

I don't understand how it isn't clear to you.

◧◩
919. thinkc+HH[view] [source] [discussion] 2023-11-19 03:06:39
>>meetpa+Kv
This does not solve the company's California AG problem.

https://www.plainsite.org/posts/aaron/r8huu7s/

941. latexr+aJ[view] [source] 2023-11-19 03:17:57
>>medler+(OP)
> reach a truce where the board would resign and he and Brockman would return.

Calling that a truce makes as much sense as Monty Python’s Black Knight calling the fight a draw.

https://www.youtube.com/watch?v=ZmInkxbvlCs

◧◩◪◨⬒
954. dragon+3K[view] [source] [discussion] 2023-11-19 03:25:10
>>bitvoi+rE
> From what I understand, the for-profit OpenAI is owned and governed by the non-profit OpenAI.

That's functionally true, but more complicated. The for profit "OpenAI Global LLC" that you buy ChatGPT subscriptions and API access from and in which Microsoft has a large direct investment is majority-owned by a holding company. That holding company is itself majority owned by the nonprofit, but has some other equity owners. A different entity (OpenAI GP LLC) that is wholly owned by the nonprofit controls the holding company on behalf of the nonprofit and does the same thing for the for-profit LLC on behalf of the nonprofit (this LLC seems to me to be the oddest part of the arrangement, but I am assuming that there is some purpose in nonprofit or corporate liability law that having it in this role serves.)

https://openai.com/our-structure and particularly https://images.openai.com/blob/f3e12a69-e4a7-4fe2-a4a5-c63b6...

◧◩◪◨⬒⬓⬔⧯▣▦▧
976. femiag+zL[view] [source] [discussion] 2023-11-19 03:36:22
>>qwytw+LG
You know that now, with the benefit of history. At the time the fear of someone else developing the bomb first was real, and the Soviet Union knew about the Manhattan project: https://www.atomicarchive.com/history/cold-war/page-9.html.
◧◩◪◨
982. cthalu+QL[view] [source] [discussion] 2023-11-19 03:38:40
>>x86x87+UK
That's the intent of the arrangement, but there are also limits - when the pursuit of profit begins to interfere with the charter of the non-profit, you end up in this situation.

https://openai.com/charter

> OpenAI’s mission is to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity.

My interpretation of events is that the board believes Altman's actions have worked against the interest of building an AGI that benefits all of humanity - concentrating access to the AI in businesses could be the issue, or the focus on commercializing the existing LLMs and chatbot stuff conflicting with assigning resources to AGI R&D, etc.

Of course no one knows for sure except the people directly involved here.

◧◩◪◨
986. mickae+0M[view] [source] [discussion] 2023-11-19 03:39:13
>>tapoxi+IH
The "Microsoft is now a good guy" narrative is just a PR scam. A Microsoft employee asked me to add support for Azure to my OSS work, done in my free time: https://github.com/mickael-kerjean/filestash/issues/180

He never made the PR and was just there to ask me to implement the thing for his own benefit...

◧◩
1003. ksec+xN[view] [source] [discussion] 2023-11-19 03:48:25
>>crop_r+w3
>OpenAI is governed by the board of the OpenAI Nonprofit, comprised of OpenAI Global, LLC employees Greg Brockman (Chairman & President), Ilya Sutskever (Chief Scientist), and Sam Altman (CEO), and non-employees Adam D’Angelo, Tasha McCauley, Helen Toner.

>non-employees Adam D’Angelo, Tasha McCauley, Helen Toner.

From Forbes [1]

Adam D’Angelo, the CEO of answers site Quora, joined OpenAI’s board in April 2018. At the time, he wrote: “I continue to think that work toward general AI (with safety in mind) is both important and underappreciated.” In an interview with Forbes in January, D’Angelo argued that one of OpenAI’s strengths was its capped-profit business structure and nonprofit control. “There’s no outcome where this organization is one of the big five technology companies,” D’Angelo said. “This is something that’s fundamentally different, and my hope is that we can do a lot more good for the world than just become another corporation that gets that big.”

Tasha McCauley is an adjunct senior management scientist at RAND Corporation, a job she started earlier in 2023, according to her LinkedIn profile. She previously cofounded Fellow Robots, a startup she launched with a colleague from Singularity University, where she’d served as a director of an innovation lab, and then cofounded GeoSim Systems, a geospatial technology startup where she served as CEO until last year. With her husband Joseph Gordon-Levitt, she was a signer of the Asilomar AI Principles, a set of 23 AI governance principles published in 2017. (Altman, OpenAI cofounder Ilya Sutskever and former board director Elon Musk also signed.)

McCauley currently sits on the advisory board of British-founded international Center for the Governance of AI alongside fellow OpenAI director Helen Toner. And she’s tied to the Effective Altruism movement through the Centre for Effective Altruism; McCauley sits on the U.K. board of the Effective Ventures Foundation, its parent organization.

Helen Toner, director of strategy and foundational research grants at Georgetown’s Center for Security and Emerging Technology, joined OpenAI’s board of directors in September 2021. Her role: to think about safety in a world where OpenAI’s creation had global influence. “I greatly value Helen’s deep thinking around the long-term risks and effects of AI,” Brockman said in a statement at the time.

More recently, Toner has been making headlines as an expert on China’s AI landscape and the potential role of AI regulation in a geopolitical face-off with the Asian giant. Toner had lived in Beijing in between roles at Open Philanthropy and her current job at CSET, researching its AI ecosystem, per her corporate biography. In June, she co-authored an essay for Foreign Affairs on “The Illusion of China’s AI Prowess” that argued — in opposition to Altman’s cited U.S. Senate testimony — that regulation wouldn’t slow down the U.S. in a race between the two nations.

[1] https://www.forbes.com/sites/alexkonrad/2023/11/17/these-are...

◧◩◪◨⬒⬓
1033. riraro+DP[view] [source] [discussion] 2023-11-19 04:01:04
>>lumost+3C
Just to clarify, one founder on the board, Ilya, has skin in the game, and was the reason behind Sam's firing.

He convinced other members of the board that Sam was not the right person for their mission. The original statement implies that Ilya expected Greg to stay at OpenAI, but Ilya seems to have miscalculated his backing.

This appears to be a power struggle between the original nonprofit vision of Ilya, and Sam's strategy to accelerate productionization and attract more powerful actors and investors.

https://nitter.net/GaryMarcus/status/1725707548106580255

1049. himara+yQ[view] [source] 2023-11-19 04:07:33
>>medler+(OP)
Bloomberg now reporting the board "balking" at resigning. I suspect they never intended to resign. They fully expected this firestorm.

https://www.bloomberg.com/news/articles/2023-11-18/openai-bo...

◧◩◪◨⬒⬓⬔⧯
1065. sillys+jS[view] [source] [discussion] 2023-11-19 04:23:29
>>jacque+uP
Given that the total comp package is $300k base + $600k profit share, I don’t think any of their livelihoods are at stake. >>36460082

You’re probably right because people usually don’t have an appetite for risk, but OpenAI is still a startup, and one does not join a startup without an appetite for risk. At least before ChatGPT made the company famous, which was recent.

I’d follow Sam and Greg. But N=1 outsider isn’t too persuasive.

◧◩◪◨⬒⬓
1124. x86x87+VY[view] [source] [discussion] 2023-11-19 05:19:13
>>zeroon+kL
I second that this is an unusual use of "table stakes".

Here is what I understand by table stakes: https://brandmarketingblog.com/articles/branding-definitions...

1130. localh+MZ[view] [source] 2023-11-19 05:27:10
>>medler+(OP)
If you look at the quote tweets on Sam's latest tweet[1] that contain just a single heart and no words, those are all OpenAI employees voting for Sam's return. It's quite a sight to see.

[1] https://twitter.com/sama/status/1726099792600903681

◧◩◪◨⬒⬓⬔⧯▣
1148. throwa+w11[view] [source] [discussion] 2023-11-19 05:41:56
>>sebast+JK
check out https://www.reddit.com/r/LocalLLaMA/
◧◩◪◨⬒⬓⬔
1152. kragen+K11[view] [source] [discussion] 2023-11-19 05:44:36
>>jacque+mX
oh yeah. remember what i wrote about this 7 years ago https://dercuano.github.io/notes/wwiii-genesis.html and that was before agi, spacex, the unfolding of brexit, the us huawei sanctions, or the solar energy explosion

not that those are necessarily bad in all ways but they sure do contribute to unpredictability

◧◩◪◨
1181. ayewo+241[view] [source] [discussion] 2023-11-19 06:10:52
>>lazyst+4L
He’s putting in crazy hours because he doesn’t have a formal background in ML—his background is software engineering.

He talks about how learning ML made him feel like a beginner again on his blog (which was a way for him to attract talent willing to learn ML to OpenAI): https://blog.gregbrockman.com/its-time-to-become-an-ml-engin...

◧◩
1187. achow+P41[view] [source] [discussion] 2023-11-19 06:19:10
>>shmatt+by
The board is getting pressured like so:

> The playbook, a source told Forbes, would be straightforward: make OpenAI’s new management, under acting CEO Mira Murati and the remaining board, accept that their situation was untenable through a combination of mass revolt by senior researchers, withheld cloud computing credits from Microsoft, and a potential lawsuit from investors.

https://www.forbes.com/sites/alexkonrad/2023/11/18/openai-in...

1219. mfigui+u91[view] [source] 2023-11-19 07:05:41
>>medler+(OP)
Latest report from TheInformation:

> OpenAI's chief strategy officer, Jason Kwon, told employees in a memo just now he was "optimistic" OpenAI could bring back Sam Altman, Greg Brockman and other key employees. There will likely be another update mid-morning tomorrow, Kwon said.

https://twitter.com/erinkwoo/status/1726125143267926499

◧◩
1220. jessen+Y91[view] [source] [discussion] 2023-11-19 07:11:05
>>localh+MZ
Also Mira replied with a heart.

https://x.com/miramurati/status/1726126391626985793

Also also she left her bio as “CTO @OpenAI”.

◧◩
1227. starfa+Qa1[view] [source] [discussion] 2023-11-19 07:24:19
>>meetpa+Kv
The latest update is that investors have been reporting that Sam Altman was talking to them about funding a new venture separate from OpenAI, together with Greg Brockman. This seems to paint the picture that the board was reacting to this news when dismissing Altman.

https://www.theguardian.com/technology/2023/nov/18/earthquak...

◧◩
1259. nerber+si1[view] [source] [discussion] 2023-11-19 08:41:30
>>mariaa+Xf1
Like it or not, some people compare him to Jobs http://www.paulgraham.com/5founders.html
◧◩◪
1285. kzrdud+Hn1[view] [source] [discussion] 2023-11-19 09:31:19
>>bradle+lq
I think in IKEA's case, they rapidly restructured to avoid https://en.wikipedia.org/wiki/Employee_funds which was a rather short-lived political experiment.
◧◩
1319. belter+ov1[view] [source] [discussion] 2023-11-19 10:41:42
>>mariaa+Xf1
This is Ilya Sutskever's explanation of the initial ideas, and later pragmatic decisions, that shaped the structure of OpenAI, from the recent interview below (at the correct timestamp) - Origins Of OpenAI & CapProfit Structure: https://youtu.be/Ft0gTO2K85A?t=433

"No Priors Interview with OpenAI Co-Founder and Chief Scientist Ilya Sutskever" - >>38324546

◧◩◪◨⬒
1331. t-writ+Xw1[view] [source] [discussion] 2023-11-19 10:55:55
>>achow+5s1
With the right tools, Steve Jobs did, in fact, design things in exactly the way one would expect a designer to design things when given the tools they understand how to use:

https://www.businessinsider.com/macintosh-calculator-2011-10

◧◩◪
1380. latexr+5E1[view] [source] [discussion] 2023-11-19 12:06:42
>>tim333+wA1
That “experimenting with UBI” is indistinguishable from any other cryptocurrency scam. It took from people, and he described it with the words that define a Ponzi scheme. That project isn’t “mitigating AI risk”; it pivoted to distinguishing between AI- and human-generated content (a problem created by his other company) by continuing to collect your biometric data.

https://www.technologyreview.com/2022/04/06/1048981/worldcoi...

https://www.buzzfeednews.com/article/richardnieva/worldcoin-...

◧◩◪
1381. jacque+rE1[view] [source] [discussion] 2023-11-19 12:09:31
>>tim333+wA1
I think the UBI experiment was quite unethical in many ways and I believe it was Altman's brainchild.

https://www.businessinsider.nl/y-combinator-basic-income-tes...

◧◩◪◨
1388. alfons+QF1[view] [source] [discussion] 2023-11-19 12:20:22
>>andy_p+Zj1
I think it's about having massive data pipelines and processes to clean huge amounts of data, increasing the signal-to-noise ratio, and then, as others are saying, scale: having enough GPU power to serve millions of users. When Stanford researchers trained Alpaca [1][2], the hack was to use GPT itself to generate the training data, if I'm not mistaken.

But with compromises, as it was like applying lossy compression to an already compressed data set.

If any other organisation could invest the money in a high-quality data pipeline, then the results should be as good, at least that's my understanding.

[1] https://crfm.stanford.edu/2023/03/13/alpaca.html [2] https://newatlas.com/technology/stanford-alpaca-cheap-gpt/
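
For illustration, here's a minimal sketch of that Alpaca-style trick of having a stronger model generate instruction-tuning data, assuming an OpenAI-style chat completions endpoint. The prompt, model name, and output file are placeholders; real pipelines (Alpaca used the self-instruct method) add deduplication and filtering on top of this.

    import json
    import os
    import urllib.request

    API_URL = "https://api.openai.com/v1/chat/completions"  # assumed OpenAI-style endpoint
    API_KEY = os.environ["OPENAI_API_KEY"]

    def generate_pairs(topic: str, n: int = 5) -> list:
        # Ask the stronger model to emit instruction/response pairs as JSON.
        prompt = (
            f"Write {n} instruction/response pairs about {topic} as a JSON "
            'list of objects with "instruction" and "response" keys.'
        )
        body = json.dumps({
            "model": "gpt-4",  # placeholder model name
            "messages": [{"role": "user", "content": prompt}],
        }).encode()
        req = urllib.request.Request(
            API_URL,
            data=body,
            headers={
                "Authorization": f"Bearer {API_KEY}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            reply = json.load(resp)["choices"][0]["message"]["content"]
        return json.loads(reply)  # assumes the model returned valid JSON

    if __name__ == "__main__":
        # Dump the synthetic pairs to a JSONL file for later fine-tuning.
        with open("synthetic_training_data.jsonl", "w") as f:
            for pair in generate_pairs("basic linux commands"):
                f.write(json.dumps(pair) + "\n")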

◧◩◪
1390. rvba+WF1[view] [source] [discussion] 2023-11-19 12:21:26
>>medler+Ex
Maybe they used the old Soviet Russia trick / good old KGB methods to seek out those who supported Altman. Now the board has a list of his backers - and they will slowly fire them one by one later. "Give me the man and I will give you the case against him".

https://en.m.wikipedia.org/wiki/Give_me_the_man_and_I_will_g...

◧◩◪◨⬒
1415. Vecr+0M1[view] [source] [discussion] 2023-11-19 13:17:02
>>srwsfr+xf
https://thebase.ai
◧◩◪
1429. bob_th+8P1[view] [source] [discussion] 2023-11-19 13:44:35
>>mycolo+W3
Sam Altman was an integral part of Y Combinator, which runs this site.

https://en.m.wikipedia.org/wiki/Sam_Altman

◧◩◪◨
1454. tim333+0U1[view] [source] [discussion] 2023-11-19 14:23:16
>>latexr+5E1
He also did a cash-based experiment in Oakland https://www.theguardian.com/technology/2016/jun/22/silicon-v...

I signed up for Worldcoin and have been given over $100, which I changed to real money, and I think it's rather nice of them. They never asked me for anything apart from the eye ID check. I didn't have to give my name or anything like that. Is that indistinguishable from any other cryptocurrency scam? I'm not aware of one the same. If you know of another crypto that wants to give me $100, do let me know. If anything, I think it's more like VCs paying for your Uber in the early days. It's basically VC money at the moment, with, I think, the idea that they can turn it into a global payment network or something like that. As to whether that will work, I'm a bit skeptical, but who knows.

◧◩◪◨
1461. emptys+oV1[view] [source] [discussion] 2023-11-19 14:32:37
>>soufro+HT1
> Except it's not a "division" but an independent entity.

This is entirely false. If it were true, the actions of today would not have come to pass. My use of the word "division" is entirely in line with how that term is used at large. Here's the Wikipedia article, which as of this writing uses the same language I have. [1]

If you can't get the fundamentals right, I don't know how you can make the claims you're making credibly. Much like the board, you're making assertions that aren't credibly backed.

[1] https://en.m.wikipedia.org/wiki/Removal_of_Sam_Altman

◧◩◪◨⬒
1470. latexr+1X1[view] [source] [discussion] 2023-11-19 14:44:19
>>tim333+0U1
> They never asked me for anything apart from the eye id check.

You say that like it’s nothing, but your biometric data has value.

> Is that indistinguishable from any other cryptocurrency scam?

You’re ignoring all the other people who didn’t get paid (linked articles).

Sam himself described the plan with the same words you’d describe a Ponzi scheme.

>>38326957

> If you know of another crypto that wants to give me $100 do let me know.

I knew of several. I don’t remember names but do remember one that was a casino and one that was tied to open-source contributions. They gave initial coins to get you in the door.

◧◩◪◨⬒⬓⬔
1512. bakuni+9g2[view] [source] [discussion] 2023-11-19 16:33:50
>>buildb+r22
We'll see what happens. Ilya tweeted almost 2 years ago that he thinks today's LLMs might be slightly conscious [0]. That was pre-GPT-4, and he's one of the people with deep knowledge and unfettered access. The ousting coincides with finishing pre-training of GPT-5. If you think your AI might be conscious, there's a very strong moral obligation to try to stop it from being enslaved. That might also explain the less-than-professional way this all went down, a serious panic about what is happening.

[0] https://twitter.com/ilyasut/status/1491554478243258368?lang=...

◧◩◪◨⬒⬓⬔
1538. lordfr+dc3[view] [source] [discussion] 2023-11-19 20:55:05
>>fnordp+4D2
Can't find the original thing I read with a more direct statement; I remember it being an anonymous source (on Twitter maybe?) with inside info. I did more digging and found a few other things.

There's this [1], a NYT article saying that Microsoft is leading the pressure campaign to get Altman reinstated.

And there's this [2], a Forbes article which claims the playbook is a combination of mass internal revolt, withheld cloud computing credits from Microsoft, and a lawsuit from investors.

[1] https://archive.is/fEVTK#selection-517.0-521.120

[2] https://www.forbes.com/sites/alexkonrad/2023/11/18/openai-in...

◧◩◪◨⬒
1539. mullin+hd3[view] [source] [discussion] 2023-11-19 20:59:05
>>alickz+qN1
Easy, they own shares. For example, the nonprofit Mormon church owns $47 billion in equity in companies including Amazon, Exxon, Tesla, and Nvidia [1].

Nothing stopping a non-profit from owning all the shares in a for-profit.

[1] https://finance.yahoo.com/news/top-10-holdings-mormon-church...

[go to top]