zlacker

[return to "Ilya Sutskever "at the center" of Altman firing?"]
1. Bjorkb+87[view] [source] 2023-11-18 03:32:31
>>apsec1+(OP)
I have a hard time believing this simply because it seems so ill-conceived. Sure, maybe Sam Altman was being irresponsible and taking risks, but they had an insanely good thing going for them. I'm not saying Sam Altman was responsible for the good times they were having, but you're probably going to bring them to an end by abruptly firing one of the most prominent members of the group, exposing where individual loyalties lie, and pissing off Microsoft by tanking their stock price without giving them any heads-up.

I mean, none of this would be possible without insane amounts of capital and world-class talent, and they probably just made it a lot harder to acquire both.

But what do I know? If you can convince yourself that you're actually building AGI by making an insanely large LLM, then you can probably convince yourself of a lot of other dumb ideas too.

2. hilux+88[view] [source] 2023-11-18 03:39:59
>>Bjorkb+87
Reading between lots of lines, one possibility is that Sam was directing this "insanely good thing" toward making lots of money, whereas the non-profit board ranked other goals higher.
3. Bjorkb+o9[view] [source] 2023-11-18 03:50:27
>>hilux+88
Sure, I get that, but handling a disagreement over money in such a consequential fashion just doesn't make sense to me. They must have understood that arriving at a position where they have to fire the CEO with little warning would have profound consequences, perhaps even existential ones.
4. 015a+vg[view] [source] 2023-11-18 04:43:51
>>Bjorkb+o9
AGI is existential. That's the whole point, I think. If they can get to AGI, then building an LLM app store is such a distraction along the path that any reasonable person would look back and laugh at how cute an idea it was, despite how big or profitable it feels today.
5. js8+Iw[view] [source] 2023-11-18 06:56:57
>>015a+vg
It's a distraction only if you are not an effective altruist. To build AGI (so that all humans can benefit) you need money, so this was a way to make money that could FINALLY be spent on the goal of AGI. /s

I think the next AGI startup should perhaps try the communist revolution route, since the capitalist-based one didn't pan out. After all, Lenin was a pioneer in effective altruism. /s

6. timeon+hS[view] [source] 2023-11-18 10:11:44
>>js8+Iw
Can a '/s' after a straw man really sneak the message across?
7. js8+TZ[view] [source] 2023-11-18 11:15:30
>>timeon+hS
I am strawmanning effective altruism in the same way that effective altruism strawmans just plain old altruism.
8. OJFord+Hh1[view] [source] 2023-11-18 13:21:24
>>js8+TZ
Ha, that's brilliantly put. I think the fundamental idea of EA is perfectly sound, but then, instead of just being basic advice (for when 'doing altruism'), it's somehow become a cult?