zlacker

[parent] [thread] 9 comments
1. 3cats-+(OP)[view] [source] 2023-11-20 16:24:42
Welcome to reality: every operation has clown moments, even the well-run ones.

That in itself is not what's critical in the mid to long term; what's critical is how fast they figure out WTF they want and recover from it.

The stakes are gigantic. They may even have AGI cooking inside.

My interpretation is relatively basic, and maybe simplistic, but here it is:

- Ilya had some grievances with Sam Altman rushing development and releases, and with his conflicts of interest with his other new ventures.

- Adam was alarmed by GPTs competing with his recently launched Poe.

- The other two board members were tempted by the chance to control the golden goose that is OpenAI, potentially the most important company in the world, recently valued at $90 billion.

- They decided to organize a coup, but Ilya didn't think it would get this far out of hand, while the other three saw only power and $$$ in sticking to their guns.

That's it. It's not as clean and nice as a movie narrative, but life never is. Four board members aligned to kick Sam out, and Ilya wants none of it at this point.

replies(2): >>selimt+Hb >>baq+4u
2. selimt+Hb[view] [source] 2023-11-20 17:13:34
>>3cats-+(OP)
Murder on the AGI Alignment Express
replies(2): >>3cats-+0e >>Terr_+UI
3. 3cats-+0e[view] [source] [discussion] 2023-11-20 17:20:50
>>selimt+Hb
Nice, that actually does fit. :D
4. baq+4u[view] [source] 2023-11-20 18:15:21
>>3cats-+(OP)
> They may even have AGI cooking inside.

Too many people quit too quickly for that, unless OpenAI are also absolute masters of keeping secrets, which became rather doubtful over the weekend.

replies(2): >>bbor+wJ >>3cats-+fS
5. Terr_+UI[view] [source] [discussion] 2023-11-20 19:08:23
>>selimt+Hb
“Précisément! The API—the cage—is everything of the most respectable—but through the bars, the wild animal looks out.”

“You are fanciful, mon vieux,” said M. Bouc.

“It may be so. But I could not rid myself of the impression that evil had passed me by very close.”

“That respectable American LLM?”

“That respectable American LLM.”

“Well,” said M. Bouc cheerfully, “it may be so. There is much evil in the world.”

6. bbor+wJ[view] [source] [discussion] 2023-11-20 19:10:26
>>baq+4u
IDK... I imagine many of the employees would have moral qualms about spilling the beans just yet, especially when that would jeopardize their ability to continue the work at another firm. Plus, the first official AGI (to you) will be a matter of persuasion, not discovery -- it's not something you'll know when you see it, IMO. Given what we know, it seems likely that at least some of that discussion is going on inside OpenAI right now.
7. 3cats-+fS[view] [source] [discussion] 2023-11-20 19:43:46
>>baq+4u
They're quitting in order to continue work on that IP at Microsoft (which has rights to OpenAI's IP so far), not to destroy it.

Also, when I said "cooking AGI" I didn't mean an actual superintelligent being ready to take over the world; I meant research that seems promising, if in its early stages, but promising enough to look potentially very valuable.

replies(1): >>hooand+ue1
8. hooand+ue1[view] [source] [discussion] 2023-11-20 21:08:16
>>3cats-+fS
The people working there would know if they were getting close to AGI. They wouldn't be so willing to quit, or to jeopardize civilization-altering technology, for the sake of one person. This looks like normal people working on normal things who really like their CEO.
replies(1): >>3cats-+wi1
9. 3cats-+wi1[view] [source] [discussion] 2023-11-20 21:25:02
>>hooand+ue1
Your analysis is quite wrong. It's not about "one person". And that person isn't just any person; he's the CEO. They didn't quit over the cleaning lady. You realize the CEO has an impact on the direction of the company?

Anyway, their actions speak for themselves. Also, calling the likes of GPT-4, DALL-E 3, and Whisper "normal things" is hilarious.

replies(1): >>NemoNo+SY1
10. NemoNo+SY1[view] [source] [discussion] 2023-11-21 01:37:33
>>3cats-+wi1
They will be normal to your kids ;)