zlacker

[return to "Ilya Sutskever to leave OpenAI"]
1. zoogen+Ix[view] [source] 2024-05-15 04:50:43
>>wavela+(OP)
Interesting, both Karpathy and Sutskever are gone from OpenAI now. Looks like it is now the Sam Altman and Greg Brockman show.

I have to admit, of the four, Karpathy and Sutskever were the two I was most impressed with. I hope he goes on to do something great.

◧◩
2. nabla9+pH[view] [source] 2024-05-15 06:45:38
>>zoogen+Ix
The top 6 science guys are long gone. OpenAI is run by marketing, business, software, and productization people.

When the next wave of deep learning innovations sweeps the world, Microsoft will eat what's left of them. They make lots of money, but they don't have a future unless they replace what they lost.

◧◩◪
3. fnordp+SH[view] [source] 2024-05-15 06:52:31
>>nabla9+pH
I don’t feel that OpenAI has a huge moat against, say, Anthropic. And I don’t think OpenAI needs Microsoft nearly as much as Microsoft needs OpenAI.
◧◩◪◨
4. cm2187+bN[view] [source] 2024-05-15 07:49:53
>>fnordp+SH
But is it even clear what the next big leap after LLMs is? I have the feeling many tend to extrapolate the progress of AI from the last 2 years to the next 30, but research doesn't always work like that (though improvements in computing power did).
◧◩◪◨⬒
5. benter+7R[view] [source] 2024-05-15 08:23:07
>>cm2187+bN
Extrapolating 2 years might give you the wrong idea, but extrapolating just the last year suggests that making another leap like GPT-3 or GPT-4 is much, much more difficult. The only considerable breakthrough I can think of is Google's huge context window, which I hope will be the norm one day, but in terms of actual results it's not mind-blowing yet. We see little improvements every day, and for sure there will be some leaps, but I wouldn't count on a revolution.
◧◩◪◨⬒⬓
6. trasht+E21[view] [source] 2024-05-15 10:36:45
>>benter+7R
Unlike AI in the past, there are now massive amounts of money going into AI. And the number of things humans still do significantly better than AI is going down continuously.

If something like Q* ships organically with GPT-5 (which may have a different name) and allows proper planning, error correction, and direct interaction with tools, that gap gets really close to 0.

◧◩◪◨⬒⬓⬔
7. varjag+d41[view] [source] 2024-05-15 10:56:50
>>trasht+E21
AI in the past (adjusted for the 1980s) was pretty well funded. It's just that fundamental scientific discovery bears little relationship to the pallets of cash.
◧◩◪◨⬒⬓⬔⧯
8. trasht+0i1[view] [source] 2024-05-15 12:35:44
>>varjag+d41
> AI in the past (adjusted for the 1980s) was pretty well funded.

A tiny fraction of the current funding: 2-4 orders of magnitude less.

> It's just that fundamental scientific discovery bears little relationship to the pallets of cash

Heavy funding may not automatically lead to breakthroughs like Special Relativity or Quantum Mechanics (though it helps there too). But once the most basic ideas are in place, massive funding is what drives breakthroughs like the Manhattan Project and the Apollo Program.

And it's not only the money itself; it's the attention, and all the talent that gets pulled in because of it.

And in this case, there is also the fear that the competition will reach AGI first, whether the competition is a company or a foreign government.

It's certainly possible that the difficulty of monetizing the investments may lead to some kind of slowdown at some point (say, if there is a recession).

But it seems to me that such a recession would have no more impact on the development of AGI than the dotcom bust had on the importance of the internet.
