zlacker

[parent] [thread] 26 comments
1. jacque+(OP)[view] [source] 2023-11-20 14:51:34
Yes, indeed, and that's the real loss here: any chance of governing this properly got blown up by incompetence.
replies(5): >>hef198+v2 >>postmo+u7 >>zer00e+Xa >>slavik+tx >>bart_s+2z
2. hef198+v2[view] [source] 2023-11-20 15:04:08
>>jacque+(OP)
If we ignore the risks and threats of AI for a second, this whole story is actually incredibly funny. The sheer childish stupidity on display on all sides is just hilarious.

Makes me wonder what the world would look like if, say, the Manhattan Project had been managed the same way.

Well, a younger me working at OpenAI would have resigned at the latest after my colleagues staged a coup against the board out of what is, in my view, a personality cult. Probably would have resigned after the third CEO was announced. Older me would wait for a new gig to be lined up before resigning, starting the search after CEO number 2 at the latest.

The cycles get faster, though. It took FTX a bit longer to go from hottest startup to the crash-and-burn trajectory; OpenAI did it faster. I just hope this helps cool down the ML-sold-as-AI hype a notch.

replies(3): >>jacque+Vd >>jibe+ne >>anonym+rs
3. postmo+u7[view] [source] 2023-11-20 15:37:37
>>jacque+(OP)
Ignoring "Don't be Ted Faro" to pursue a profit motive is indeed a form of incompetence.
4. zer00e+Xa[view] [source] 2023-11-20 15:58:57
>>jacque+(OP)
> any chance of governing this properly got blown up by incompetence

No one knows why the board did this. No one is talking about that part. Yet everyone is on Twitter talking shit about the situation.

I have worked with a lot of PhDs, and some of them can be "disconnected" from anything that isn't their research.

This looks a lot like that: disconnected from what average people would do, almost childlike (not childish, childlike).

Maybe this isn't the group of people who should be responsible for "alignment".

replies(1): >>kmlevi+ip
5. jacque+Vd[view] [source] [discussion] 2023-11-20 16:15:41
>>hef198+v2
The scary thing is that these incompetents are supposedly the ones to look out for the interests of humanity. It would be funny if it weren't so tragic.

Not that I had any illusions about this being a fig leaf in the first place.

replies(1): >>stingr+Rh
6. jibe+ne[view] [source] [discussion] 2023-11-20 16:18:33
>>hef198+v2
> If we ignore the risks and threats of AI for a second [..] just hope this helps cool down the ML-sold-as-AI hype

If it is just ML sold as AI hype, are you really worried about the threat of AI?

replies(1): >>hef198+Pk
7. stingr+Rh[view] [source] [discussion] 2023-11-20 16:36:46
>>jacque+Vd
Perhaps they were put in that position precisely because of their incompetence, not in spite of it.
replies(1): >>jacque+km
8. hef198+Pk[view] [source] [discussion] 2023-11-20 16:51:47
>>jibe+ne
It can be both a hype and a danger. I don't worry much about AGI for now (I stopped insulting Alexa though, just to be sure).

The danger of generative AI is that it disrupts all kinds of things: the arts, writers, journalism, propaganda... That threat already exists; the tech no longer being hyped might allow us to properly address that problem.

replies(1): >>jacque+Em
9. jacque+km[view] [source] [discussion] 2023-11-20 16:57:22
>>stingr+Rh
I wouldn't rule that out. Normally you'd expect a bit more wisdom rather than only smarts on a board. And some of those really shouldn't be there at all (conflicts of interest, lack of experience).
10. jacque+Em[view] [source] [discussion] 2023-11-20 16:58:07
>>hef198+Pk
> I stopped insulting Alexa though, just to be sure

Priceless. The modern version of Pascal's wager.

11. kmlevi+ip[view] [source] [discussion] 2023-11-20 17:07:26
>>zer00e+Xa
The fact that still nobody knows why they did it is part of the problem now, though. They have already clarified it was not for any financial, security, or privacy/safety reason, so that rules out all the important ones that spring to mind. And they refuse to elaborate in writing despite being asked to repeatedly.

Any reason good enough to fire him is good enough to share with the interim CEO and the rest of the company, if not the entire world. If they can’t even do that much, you can’t blame employees for losing faith in their leadership. They couldn’t even tell SAM ALTMAN why, and he was the one getting fired!

replies(1): >>denton+8w
12. anonym+rs[view] [source] [discussion] 2023-11-20 17:17:47
>>hef198+v2
> Makes me wonder what the world would look like if, say, the Manhattan Project had been managed the same way.

It was not possible for a war-time government crash project to have been managed the same way. During WW2 the existential fear was an embodied threat, happening right then. No one was even thinking about a potential for profits or any additional products aside from an atomic bomb. And if anyone had ideas for pursuing that bomb that seemed decent, they would have been funded to pursue them.

And this is not even mentioning the fact that security was tight.

I'm sure there were scientists who disagreed with how the Manhattan project was being managed. I'm also sure they kept working on it despite those disagreements.

replies(2): >>Apocry+WN >>hooand+rD1
13. denton+8w[view] [source] [discussion] 2023-11-20 17:28:20
>>kmlevi+ip
> The fact that still nobody knows why they did it is part of the problem now, though.

The fact that Altman and Brockman were hired so quickly by Microsoft gives a clue: it takes time to hire someone. For one thing, they need time to decide. These guys were hired by Microsoft between close-of-business on Friday and start-of-business on Monday.

My supposition is that this hiring was in the pipeline a few weeks ago. The board of OpenAI found out on Thursday, and went ballistic, understandably (lack of candidness). My guess is there's more shenanigans to uncover - I suspect that Altman gave Microsoft an offer they couldn't refuse, and that OpenAI was already screwed by Thursday. So realizing that OpenAI was done for, they figured "we might as well blow it all up".

replies(6): >>mediam+WA >>jrajav+fC >>dragon+SE >>jacque+AK >>jowea+7M >>kmlevi+lp1
14. slavik+tx[view] [source] 2023-11-20 17:32:52
>>jacque+(OP)
> that's the real loss here: any chance of governing this properly got blown up by incompetence

If this incident is representative, I'm not sure there was ever a possibility of good governance.

15. bart_s+2z[view] [source] 2023-11-20 17:39:02
>>jacque+(OP)
Was it due to incompetence though? The way it has played out has made me feel it was always doomed. It is apparent that those focused on AI safety were gravely concerned with the direction the company was taking, and were losing power rapidly. This move by the board may have simply done in one weekend what was going to happen anyway over the coming months or years.
16. mediam+WA[view] [source] [discussion] 2023-11-20 17:45:45
>>denton+8w
The problem with this analysis is the premise: that it "takes time to hire someone."

This is not an interview process for hiring a junior dev at FAANG.

If you're Sam & Greg, and Satya gives you an offer to run your own operation with essentially unlimited funding and the ability to bring over your team, then you can decide immediately. There is no real lower bound of how fast it could happen.

Why would they have been able to decide so quickly? Probably because they prioritize bringing over the entire team as fast as possible. Even though they could raise a lot of money for a new company, that still takes time, and they consider moving the team over within days so critically important that they accept whatever downsides there may be to being a subsidiary of Microsoft.

This is what happens when principals see opportunity and are unencumbered by bureaucratic checks. They can move very fast.

replies(1): >>denton+oP
17. jrajav+fC[view] [source] [discussion] 2023-11-20 17:50:47
>>denton+8w
I suspect it takes somewhat less time and process to hire somebody, when NOT hiring them by start-of-business on Monday will result in billions in lost stock value.
replies(1): >>denton+nJ3
18. dragon+SE[view] [source] [discussion] 2023-11-20 17:58:43
>>denton+8w
I don't think the hiring was in the pipeline, because until the board action it wasn't necessary. But I think this is still in the area of the right answer, nonetheless.

That is, I think Greg and Sam were likely fired because, in the board's view, they were already running OpenAI Global LLC more as a for-profit subsidiary of Microsoft driven by Microsoft's commercial interests than as the organization it was publicly declared to be and that the board very much intended it to be: one able to earn and return profit, but focused on the mission of the nonprofit. And, apparently, in Microsoft's view they were very good at that, so putting them in a role overtly exactly like that is a no-brainer.

And while it usually takes a while to vet and hire someone for a position like that, it doesn't if they've been working closely with you in something that is functionally (from your perspective, if not on paper for the entity they nominally reported to) a near-identical role to the one you are hiring them for, and the only reason they are no longer in that role is that they were doing exactly what you want them to do for you.

19. jacque+AK[view] [source] [discussion] 2023-11-20 18:18:32
>>denton+8w
The hiring could have been done over coffee in 15 minutes to agree on basic terms and then it would be announced half an hour later. Handshake deal. Paperwork can catch up later. This isn't the 'we're looking for a junior dev' pipeline.
20. jowea+7M[view] [source] [discussion] 2023-11-20 18:24:21
>>denton+8w
> My supposition is that this hiring was in the pipeline a few weeks ago. The board of OpenAI found out on Thursday, and went ballistic, understandably (lack of candidness). My guess is there's more shenanigans to uncover - I suspect that Altman gave Microsoft an offer they couldn't refuse, and that OpenAI was already screwed by Thursday. So realizing that OpenAI was done for, they figured "we might as well blow it all up".

It takes time if you're a normal employee under standard operating procedure. If you really want to, you can merge two of the largest financial institutions in the world in less than a week. https://en.wikipedia.org/wiki/Acquisition_of_Credit_Suisse_b...

21. Apocry+WN[view] [source] [discussion] 2023-11-20 18:30:59
>>anonym+rs
That's what happened to the German program, though.

https://en.wikipedia.org/wiki/German_nuclear_weapons_program

replies(1): >>anonym+kQ
22. denton+oP[view] [source] [discussion] 2023-11-20 18:36:10
>>mediam+WA
> There is no real lower bound of how fast it could happen.

I don't know anything about how executives get hired. But supposedly this all happened between Friday night and Monday morning. This isn't a simple situation; surely one man working through the weekend can't decide to set up a new division, and appoint two poached executives to head it up, without consulting lawyers and other colleagues. I mean, surely they'd need to go into Altman and Brockman's contracts with OpenAI, to check that the hiring is even legal?

That's why I think this has been brewing for at least a week.

23. anonym+kQ[view] [source] [discussion] 2023-11-20 18:39:13
>>Apocry+WN
Well, yes, but they were the existential threat.

Hey, maybe this means the AGIs will fight amongst themselves and thus give us the time to outwit them. :D

replies(1): >>jowea+HT
24. jowea+HT[view] [source] [discussion] 2023-11-20 18:49:52
>>anonym+kQ
Actual scifi plot.
25. kmlevi+lp1[view] [source] [discussion] 2023-11-20 20:49:51
>>denton+8w
This narrative doesn't make any sense. Microsoft was blindsided and (like everyone else) had no idea Sam was getting fired until a couple of days ago. The reason they hired him quickly is that Microsoft was desperate to show the world they had retained OpenAI's talent prior to the market opening on Monday.

To entertain your theory, let's say they were planning on hiring him prior to the firing. If that were the case, why is everybody so upset that Sam got fired, and why is he working so hard to get reinstated to a role he was about to leave anyway?

26. hooand+rD1[view] [source] [discussion] 2023-11-20 21:48:54
>>anonym+rs
For real. It's like, did you see Oppenheimer? There's a reason they put the military in charge of that.
27. denton+nJ3[view] [source] [discussion] 2023-11-21 13:39:36
>>jrajav+fC
Yeah, like OpenAI hired their first interim CEO on Thursday night, hired their second on Monday, and are now talking about rehiring Sam (who probably doesn't care to be rehired).

There may be drawbacks to the "instant hiring" model.
