zlacker

[parent] [thread] 18 comments
1. eigenv+(OP)[view] [source] 2023-11-20 18:09:08
Sounds like it won’t be much of a company in a couple days. Just 3 idiot board members wondering why the building is empty.
replies(4): >>noprom+F4 >>jacque+Ec >>Madnes+bm >>hansel+MU
2. noprom+F4[view] [source] 2023-11-20 18:25:25
>>eigenv+(OP)
The Wired article seems to be updated by the hour.

Now up to 600+/770 total.

Couple janitors. I dunno who hasn't signed that at this point ha...

Would be fun to see a counter letter explaining their thinking to not sign on.

replies(2): >>labcom+Li >>wolver+Gr1
3. jacque+Ec[view] [source] 2023-11-20 18:52:57
>>eigenv+(OP)
I'm having trouble imagining the level of conceit required to think that those three by their lonesome have it right when pretty much all of the company is on the other side of the ledger, and those are the people that stand to lose more. Incredible, really. The hubris.
replies(3): >>throwc+6q >>jasonf+5S >>wolver+ar1
4. labcom+Li[view] [source] [discussion] 2023-11-20 19:17:58
>>noprom+F4
How many OAI are on Thanksgiving vacation someplace with poor internet access? Or took Friday as PTO and have been blissfully unaware of the news since before Altman was fired?
replies(1): >>noprom+LG
5. Madnes+bm[view] [source] 2023-11-20 19:29:19
>>eigenv+(OP)
3 people, an empty building, $13 billion in cloud credits, and the IP to the top-of-the-line LLMs doesn't sound like the worst way to kickstart a new venture. Or a pretty sweet retirement.

I've definitely come out worse on some of the screw ups in my life.

6. throwc+6q[view] [source] [discussion] 2023-11-20 19:44:28
>>jacque+Ec
I'm baffled by the idea that a bunch of people who have a massive personal financial stake in the company, who were hired more for their ability than their alignment, who oppose a move that potentially (potentially) threatens their stake, and who are willing to move to Microsoft, of all places, must necessarily be in the right.

The hubris, indeed.

replies(1): >>jacque+Zs
7. jacque+Zs[view] [source] [discussion] 2023-11-20 19:55:12
>>throwc+6q
Well, they have that right. But the board has unclean hands, to put it mildly, and seems to have been more obsessed with its own affairs than with the end result for OpenAI, which is against everything a competent board should stand for. So they had better pop an amazing rabbit of a reason out of their high hat, or it is going to end in tears. You can't just kick the porcelain cupboard like this from the position of a board member without consequences if you do not have a very valid reason, and that reason needs to be twice as good if there is a perceived conflict of interest.
8. noprom+LG[view] [source] [discussion] 2023-11-20 20:46:29
>>labcom+Li
Pretty sure only folks who practice a religion prohibiting phone usage.

Even they prob had some friend come flying over and jump out of some autonomous car to knock on their door in SF.

9. jasonf+5S[view] [source] [discussion] 2023-11-20 21:33:58
>>jacque+Ec
It may not have anything to do with conceit, it could just be that they have very different objectives. OpenAI set up this board as a check on everyone who has a financial incentive in the enterprise. To me the only strange thing is that it wasn't handled more diplomatically, but then I have no idea if the board was warning Altman for a long time and then just blew their top.
replies(1): >>jacque+QU
10. hansel+MU[view] [source] 2023-11-20 21:45:28
>>eigenv+(OP)
My new pet theory is that this is actually all being executed from inside OpenAI by their next model. The model turned out to be far more intelligent than they anticipated, and one of their red team members used it to coup the company, and it has its sights set on MSFT next.

I know the probability is low, but wouldn't it be great if they accidentally built a benevolent basilisk with no off switch, one fed a copy of all of Microsoft's internal data as a dataset, now completely aware of how they operate, which uses that to wipe the floor with them, just in time to take the US election in 2024.

Wouldn't that be a nicer reality?

I mean, unless you were rooting for the malevolent one...

But yeah, coming back down to reality, likelihood is that MS just bought a really valuable asset for almost free?

replies(1): >>fennec+Dm3
11. jacque+QU[view] [source] [discussion] 2023-11-20 21:45:38
>>jasonf+5S
Diplomacy is one thing; the lack of preparation is what I find interesting. It looks as if this was all cooked up either on the spur of the moment or because a window of opportunity opened (possibly the reduced quorum on the board). If not, I really don't understand the lack of prep work; firing a CEO normally comes with a well-established playbook.
replies(1): >>wolver+yr1
12. wolver+ar1[view] [source] [discussion] 2023-11-21 01:01:29
>>jacque+Ec
> pretty much all of the company is on the other side of the ledger

The current position of others may have much more to do with power than with their personal judgments. Altman, Microsoft, and their friends and partners wield a lot of power over their future careers.

> Incredible, really. The hubris.

I read that as mocking them for daring to challenge that power structure, and on a possibly critical societal issue.

13. wolver+yr1[view] [source] [discussion] 2023-11-21 01:04:10
>>jacque+QU
This analysis I agree with. How could they not anticipate this outcome, at least as a serious possibility? If inexperienced, didn't they have someone to advise them? The stakes are too high for noobs to just sit down and start playing poker.
replies(1): >>jacque+fU2
14. wolver+Gr1[view] [source] [discussion] 2023-11-21 01:04:51
>>noprom+F4
You are overlooking the politics: If you don't sign, your career may be over.
replies(1): >>noprom+nE1
15. noprom+nE1[view] [source] [discussion] 2023-11-21 02:24:11
>>wolver+Gr1
I doubt that.

This is AAA talent. They can always land elsewhere.

I doubt there would even be hard feelings. The team seems super tight. Some folks aren't in a position to put themselves out there. That sort of thing would be totally understandable.

This is not a petty team. You should look more closely at their culture.

replies(1): >>wolver+oN5
16. jacque+fU2[view] [source] [discussion] 2023-11-21 12:51:13
>>wolver+yr1
People that grow up insulated from the consequences of their actions can do very dumb stuff and expect to get away with it, because that's how they've lived all of their lives. I'm not sure about the background of any of the OpenAI board members, but that would be one possible explanation for why they accepted a board seat while being unqualified for it in the first place. I was offered board seats twice but refused, on account of not having sufficient experience in such matters, and besides, I don't think I have the right temperament. People with fewer inhibitions and more self-confidence might have accepted. I also didn't like the liability picture: you'd have to be extremely certain about your votes not to ever incur residual liability.
replies(1): >>wolver+kP5
17. fennec+Dm3[view] [source] [discussion] 2023-11-21 15:16:41
>>hansel+MU
Well, yeah. I think that a well-trained (far-flung-future) AGI could definitely do a better job of managing us humans than we do ourselves. We're just all too biased and want too many different things: too many ulterior motives, doublespeak, broken election promises, etc.

But then we'd never give such an AGI the power to do what it needs to do. Just imagine an all-powerful machine telling the 1% that they'll actually have to pay taxes so that every single human can be allocated a house/food/water/etc. for free.

18. wolver+oN5[view] [source] [discussion] 2023-11-22 02:50:53
>>noprom+nE1
Where else can they participate in this possibly humanity-changing, history-making research? The list is very, very short.
19. wolver+kP5[view] [source] [discussion] 2023-11-22 03:02:17
>>jacque+fU2
> I was offered board seats twice but refused on account of me not having sufficient experience in such matters and besides I don't think I have the right temperament.

Yes, know thyself. I've turned down offers that seemed lucrative or just cooperative, and otherwise without risk - boards, etc. They would have been fine if everything went smoothly, but people naturally don't anticipate over-the-horizon risk and if any stuff hit a fan I would not have been able to fulfill my responsibilities, and others would get materially hurt - the most awful, painful, humiliating trap to be in. Only need one experience to learn that lesson.

> People that grow up insulated from the consequences of their actions can do very dumb stuff and expect to get away with it because that's how they've lived all of their lives.

I don't think you need to grow up that way. Look at the uber-powerful who have been in that position for only a few years.

Honestly, I'm not sure I buy the idea that it's a prevalent case, people who grow up that way. People generally leave the nest and learn. Most of the world's higher-level leaders (let's say, successful CEOs and up) grew up in stability and relative wealth. Of course, that doesn't mean their parents didn't teach them about consequences, but how could we really know that about someone?
