zlacker

42 comments
1. tsunam+(OP)[view] [source] 2023-11-19 03:21:13
The fact that HN engineering grunts have no idea what table stakes are vs titles and authority shows how they aren’t cut out for executive brinksmanship.

Sam has superior table stakes.

replies(4): >>gtirlo+L >>zeroon+P1 >>djur+33 >>juped+pc
2. gtirlo+L[view] [source] 2023-11-19 03:26:58
>>tsunam+(OP)
Such as?
replies(1): >>tsunam+12
3. zeroon+P1[view] [source] 2023-11-19 03:34:59
>>tsunam+(OP)
I don’t think you are using table stakes correctly
replies(2): >>tsunam+i2 >>x86x87+qf
4. tsunam+12[view] [source] [discussion] 2023-11-19 03:36:10
>>gtirlo+L
Talent following

Financial backing to make a competitor

Internal knowledge of roadmap

Media focus

Alignment with the 2nd most valuable company on the planet.

I could go on. I strongly dislike the guy but you need to recognize table stakes even in your enemy. Or you’ll be like Ilya: a naive fool who is gonna get wrecked thinking that doing the “right” thing in his own mind automatically means he wins.

replies(2): >>cthalu+y3 >>hansSj+Ko
5. tsunam+i2[view] [source] [discussion] 2023-11-19 03:38:28
>>zeroon+P1
Really? Aka Sam has the ability to start a new business and take the contracts with him and Ilya doesn’t. Because that’s table stakes. Exactly.
replies(4): >>cthalu+i3 >>nostra+r6 >>Random+n9 >>resolu+Bj
6. djur+33[view] [source] 2023-11-19 03:42:37
>>tsunam+(OP)
What does any of that have to do with whether it's a "coup" or not? "Coup" has an implication of illegitimacy, but by all accounts the board acted within its authority. It doesn't matter if it was an ill-advised action or if Altman has more leverage here.
replies(3): >>jacque+u4 >>tsunam+S4 >>chatma+f8
7. cthalu+i3[view] [source] [discussion] 2023-11-19 03:43:38
>>tsunam+i2
Are you saying that Sam has the ability to generate new contracts when you say “take the contracts with him”, or do you think that somehow the existing contracts with Microsoft and other investors are tied to where he is?
replies(1): >>tsunam+15
8. cthalu+y3[view] [source] [discussion] 2023-11-19 03:45:18
>>tsunam+12
From everything we can see Ilya appears to be a true believer.

A true believer is going to act along the axis of their beliefs even if it ultimately results in failure. That doesn't necessarily make them naive or fools - many times they will fully understand that their actions have little or no chance of success. They've just prioritized a different value than you.

replies(2): >>tsunam+p4 >>jacque+I4
9. tsunam+p4[view] [source] [discussion] 2023-11-19 03:50:16
>>cthalu+y3
Agree but I see that as potato potahto. Failure by a different name with imaginary wins by the delusional ethicist.
10. jacque+u4[view] [source] [discussion] 2023-11-19 03:50:45
>>djur+33
They acted within their authority, but possibly without the support of those who asked them to join in the first place, possibly without sufficient grounds, and definitely in a way that wasn't in the interest of OpenAI, as far as the story is known today.
replies(1): >>hansSj+2o
11. jacque+I4[view] [source] [discussion] 2023-11-19 03:52:23
>>cthalu+y3
That's fair, but by messing this up OpenAI may well end up without any oversight at all. Which isn't the optimum outcome by a long shot and that's what you get for going off half-cocked about a thing like this.
replies(1): >>Sebb76+zd3
12. tsunam+S4[view] [source] [discussion] 2023-11-19 03:53:37
>>djur+33
Legitimacy is derived from power, not from abstraction. Sorry, that’s the reality. Rules are an abstraction. Power lets you do whatever you want, including making new rules.
replies(1): >>x86x87+b6
13. tsunam+15[view] [source] [discussion] 2023-11-19 03:54:27
>>cthalu+i3
I’d say so. Or bring Satya with him.
14. x86x87+b6[view] [source] [discussion] 2023-11-19 04:01:13
>>tsunam+S4
Yeah no. While you may be onto something that still does not make it a coup.
replies(1): >>tsunam+27
15. nostra+r6[view] [source] [discussion] 2023-11-19 04:03:33
>>tsunam+i2
Everyone on that board is financially independent and can do whatever they want. If Sam & Ilya can't get along that basically means there are 2 companies where previously there was OpenAI. (4 if you add Google and Anthropic into the mix; remember that OpenAI was founded because Ilya left Google, and then Anthropic was founded when a bunch of top OpenAI researchers left and started their own company).

Ultimately this is good for competition and the gen-AI ecosystem, even if it's catastrophic for OpenAI.

replies(1): >>tsunam+X6
16. tsunam+X6[view] [source] [discussion] 2023-11-19 04:06:49
>>nostra+r6
Anyone can do whatever they want; it doesn’t mean it will work out the way they want it to.
replies(1): >>nostra+J7
17. tsunam+27[view] [source] [discussion] 2023-11-19 04:07:29
>>x86x87+b6
It doesn’t matter what you call it.
replies(1): >>x86x87+ra
18. nostra+J7[view] [source] [discussion] 2023-11-19 04:14:03
>>tsunam+X6
I'm curious what you're inferring to be "the way they want it to"?

From my read, Ilya's goal is to not work with Sam anymore, and relatedly, to focus OpenAI on more pure AGI research without needing to answer to commercial pressures. There is every indication that he will succeed in that. It's also entirely possible that that may mean less investment from Microsoft etc, less commercial success, and a narrower reach and impact. But that's the point.

Sam's always been about having a big impact and huge commercial success, so he's probably going to form a new company that poaches some top OpenAI researchers, and aggressively go after things like commercial partnerships and AI stores. But that's also the point.

Both board members are smart enough that they will probably get what they want, they just want different things.

replies(1): >>stdgy+ma
19. chatma+f8[view] [source] [discussion] 2023-11-19 04:17:49
>>djur+33
There's a distinction between what's technically allowed and what's politically allowed. The board has every right to vote Sam and Greg off the island with 4/6 voting in favor. That doesn't mean they won't see resistance to their decision on other fronts, and especially those where Sam and Greg have enough soft power that the rest of the board would be obviously ill-advised to contradict them. If the entire media apparatus is on their side, for example (soft power), then the rest of the board needs to consider that before making a decision that they're technically empowered to make (hard power).

IMO, there are basically two justifiably rational moves here: (1) ignore the noise; accept that Sam and Greg have the soft power, but they don't have the votes so they can fuck off; (2) lean into the noise; accept that you made a mistake in firing Sam and Greg and bring them back in a show of magnanimity.

Anything in between these two options is hedging their bets and will lead to them getting eaten alive.

replies(1): >>tsunam+ob
20. Random+n9[view] [source] [discussion] 2023-11-19 04:27:44
>>tsunam+i2
But by its structure it isn't a business at heart. Commercially I agree that Sam's position is superior, but purely focusing on the non-profit's mission (not even the non-profit itself) - not so sure.
21. stdgy+ma[view] [source] [discussion] 2023-11-19 04:36:11
>>nostra+J7
You need to remember that most people on this site subscribe to the ideology that growth is the only thing that matters. They're Michael Douglas 'greed is good' type of people wrapped up in a spiffy technological veneer.

Any decision that doesn't make the 'line go up' is considered a dumb decision. So to most people on this site, kicking Sam out of the company was a bad idea because it meant the company's future earning potential had cratered.

replies(3): >>peyton+1b >>tsunam+If >>int_19+wi
22. x86x87+ra[view] [source] [discussion] 2023-11-19 04:37:08
>>tsunam+27
it sort of does. a coup is usually regarded as a bad thing. firing a ceo? not so much.

pushing to call it a coup is an attempt to control the narrative.

23. peyton+1b[view] [source] [discussion] 2023-11-19 04:42:14
>>stdgy+ma
That’s unfair. The issue is poor governance. Why would anybody outside OpenAI care how much money they make? The fact is a lot of people now rely in one way or another on OpenAI’s services. Arbitrary and capricious decisions affect them.
24. tsunam+ob[view] [source] [discussion] 2023-11-19 04:44:44
>>chatma+f8
Except you are discounting the major player with all the hard power, who can literally call any shot with money.
replies(2): >>chatma+nc >>Random+Nd
25. chatma+nc[view] [source] [discussion] 2023-11-19 04:52:24
>>tsunam+ob
You mean Microsoft, who hasn't actually paid them the money they said they eventually would, and who can change their Azure billing arrangement at any time?

Sure, I guess I didn't consider them, but you can lump them into the same "media campaign" (while accepting that they're applying some additional, non-media related leverage) and you'll come to the same conclusion: the board is incompetent. Really the only argument I see against this is that the legal structure of OpenAI is such that it's actually in the board's best interest to sabotage the development of the underlying technology (i.e. the "contain the AGI" hypothesis, which I don't personally subscribe to - IMO the structure makes such decisions more difficult for purely egotistical reasons; a profit motive would be morally clarifying).

26. juped+pc[view] [source] 2023-11-19 04:52:30
>>tsunam+(OP)
lmao
27. Random+Nd[view] [source] [discussion] 2023-11-19 05:02:30
>>tsunam+ob
The objective functions might be different enough that there is nothing the hard power can do to get what it wants from OpenAI. The non-profit might consider a wind-down more in line with its mission than something else, for example.
replies(1): >>chatma+rf
28. x86x87+qf[view] [source] [discussion] 2023-11-19 05:19:13
>>zeroon+P1
I second that this is an unusual use of table stakes.

Here is what I understand by table stakes: https://brandmarketingblog.com/articles/branding-definitions...

29. chatma+rf[view] [source] [discussion] 2023-11-19 05:19:20
>>Random+Nd
The threat to the hard power is that a new company emerges to compete with them, and it's led by the same people they just fired.

If your objective is to suppress the technology, the emergence of an equally empowered competitor is not a development that helps your cause. In fact there's this weird moral ambiguity where your best move is to pretend to advance the tech while actually sabotaging it. Whereas by attempting to simply excise it from your own organization's roadmap, you push its development outside your control (since Sam's Newco won't be beholden to any of your sanctimonious moral constraints). And the unresolvability of this problem, IMO, is evidence of why the non-profit motive can't work.

As a side-note: it's hilarious that six months ago OpenAI (and thus Sam) was the poster child for the nanny AI that knows what's best for the user, but this controversy has inverted that perception to the point that most people now see Sam as a warrior for user-aligned AGI... the only way he could fuck this up is by framing the creation of Newco as a pursuit of safety.

replies(1): >>Random+1g
30. tsunam+If[view] [source] [discussion] 2023-11-19 05:21:29
>>stdgy+ma
I’m sorry, how is OpenAI going to pay for itself then? On goodwill and hopes?

Please get real.

replies(1): >>stdgy+xi
31. Random+1g[view] [source] [discussion] 2023-11-19 05:24:17
>>chatma+rf
If they cannot fulfill their mission one way or another (because it isn't resolvable in the structure) then dissolution isn't a bad option, I'd say.
replies(1): >>chatma+zg
32. chatma+zg[view] [source] [discussion] 2023-11-19 05:28:59
>>Random+1g
That's certainly a purist way of looking at it, and I don't disagree that it's the most aligned with their charter. But it also seems practically ineffective, even - no, especially - when considered within the context of that charter. Because by shutting it down (or sabotaging it), they're not just making a decision about their own technology; they're also yielding control of it to groups that are not beholden to the same constraints.
replies(1): >>Random+Vg
33. Random+Vg[view] [source] [discussion] 2023-11-19 05:31:40
>>chatma+zg
Given that their control over the technology at large is limited anyway, they are already (somewhat?) ineffective, I would think. Not sure what a really good and attainable position for them would look like in that respect.
replies(1): >>chatma+5h
34. chatma+5h[view] [source] [discussion] 2023-11-19 05:33:07
>>Random+Vg
Yeah, agreed. But that's also why I feel the whole moral sanctimony is a pointless pursuit in the first place. The tech is coming, from somewhere, whether you like it or not. Never in history has a technological revolution been stopped.
35. int_19+wi[view] [source] [discussion] 2023-11-19 05:46:16
>>stdgy+ma
> You need to remember that most people on this site subscribe to the ideology that growth is the only thing that matters

I'm not sure that's actually true anymore. Look at any story about "growth", and you'll see plenty of skeptical comments. I'd say the audience has skewed pretty far from all the VC stuff.

36. stdgy+xi[view] [source] [discussion] 2023-11-19 05:46:20
>>tsunam+If
My best guess is they turn off the commercial operations that are costing them the most money (and that they didn't want Sam to push in the first place), pump up the prices on the ones they can actually earn a profit from, and then try to coast for a while.

Or they'll do something hilarious like sell VCs on a world wide cryptocurrency that is uniquely joined to an individual by their biometrics and somehow involves AI. I'm sure they could wrangle a few hundred million out of the VC class with a braindead scheme like that.

37. resolu+Bj[view] [source] [discussion] 2023-11-19 06:00:29
>>tsunam+i2
No, to continue the poker metaphors, that's taking your chips and going home, perhaps to create your own casino with blackjack and hookers (h/t to Bender).

"Table stakes" simply means having enough money to sit at the table and play, nothing more. "Having a big pile of GPUs is table stakes to contest in the AI market."

38. hansSj+2o[view] [source] [discussion] 2023-11-19 06:48:23
>>jacque+u4
You're speaking as if Altman and Brockman did Sutskever a favour by "asking him to join". They were practically begging.
replies(1): >>jacque+iO
39. hansSj+Ko[view] [source] [discussion] 2023-11-19 06:54:17
>>tsunam+12
Ilya IS the talent. They were desperate to hire him.
replies(1): >>tsunam+YR1
40. jacque+iO[view] [source] [discussion] 2023-11-19 11:05:18
>>hansSj+2o
Doesn't change the fact that this probably wasn't the outcome they were going for.
41. tsunam+YR1[view] [source] [discussion] 2023-11-19 18:02:32
>>hansSj+Ko
I’ve been in his shoes at a smaller level. Once the company believes they have a stable, sellable product, they have no interest in any new breakthroughs. His table stakes are gone, and Microsoft probably believes GPT-4 Turbo will be billable for years to come.
42. Sebb76+zd3[view] [source] [discussion] 2023-11-20 01:05:06
>>jacque+I4
Assuming he sees OpenAI spinning out of control either way, it's probably better to have tried to change it and, if it fails, to at least not be part of the problem.
replies(1): >>jacque+Lz3
43. jacque+Lz3[view] [source] [discussion] 2023-11-20 04:04:05
>>Sebb76+zd3
I think that could have been done more graciously. And there are other drivers still on the table besides good governance; a good old palace revolution in disguise is definitely not ruled out at this point.