zlacker

[parent] [thread] 6 comments
1. mianos+(OP)[view] [source] 2023-11-20 07:52:33
This is exactly why you would want people on the board who understand the technology. Unless they have some other technology that we don't know about, which maybe brought all this on, a GPT is not a clear path to AGI. That is a technical point that seems to be beyond most people without real experience in the field. It is certainly beyond the understanding of some dude who lucked into a great training set and became an expert, much the same way The Knack became industry leaders.
replies(1): >>famous+D1
2. famous+D1[view] [source] 2023-11-20 08:01:32
>>mianos+(OP)
>Unless they have some other technology that we don't know about, that maybe brought all this on, a GPT is not a clear path to AGI.

So Ilya Sutskever, one of the most distinguished ML researchers of his generation, does not understand the technology?

The same guy who's been on record saying LLMs are enough for AGI?

replies(3): >>lucubr+0f >>mianos+Cf >>fallin+aj
3. lucubr+0f[view] [source] [discussion] 2023-11-20 08:59:16
>>famous+D1
To be clear, he thinks that LLMs are probably a general architecture, and thus capable of reaching AGI in principle with enormous amounts of compute, data, and work. He also thinks that for cost and economics reasons it's much more feasible to build or train other parts and have them work together, because that's much cheaper in terms of compute. As an example, with a big enough model, enough work, and the right mix of data you could probably have an LLM interpret speech just as well as Whisper can. But how much work does it take to make that happen without losing other capabilities? How efficient is the resulting huge model? Is the end result better than having the text/intelligence segment separate from the speech and hearing segment? The answer could be yes, depending, but it could also be no. Basically, his belief is that it's complicated, and that it's not really a "Can X architecture do this?" question but a "How cheaply can this architecture accomplish this task?" question.
replies(1): >>famous+TQ1
4. mianos+Cf[view] [source] [discussion] 2023-11-20 09:03:30
>>famous+D1
Sorry, I am not including Ilya when I say they don't understand the technology.

In fact, he is exactly the type to be on the board.

He is not the one saying 'slow down, we might accidentally invent an AGI that takes over the world'. As you say, he says LLMs are not a path to a world-dominating AGI.

5. fallin+aj[view] [source] [discussion] 2023-11-20 09:26:12
>>famous+D1
AGI doesn't exist. There is no standard for what makes an AGI, nor a test to prove that an AI is or isn't an AGI once built. There is no engineering design for even a hypothetical AGI like there is for other hypothetical tech, e.g. a fusion reactor, so we have no idea whether it would even resemble existing machine learning designs. So how can you be an expert on it? Being an expert on existing machine learning tech, which Ilya absolutely is, doesn't grant that status.
replies(1): >>famous+7R1
6. famous+TQ1[view] [source] [discussion] 2023-11-20 17:40:38
>>lucubr+0f
This is wholly beside the point. The person I'm replying to is clearly saying the only people who believe "GPT is on the path to AGI" are non-technical people who don't "truly understand". Blatantly false.

It's like an appeal to authority against an authority that isn't even saying what you're appealing for.

7. famous+7R1[view] [source] [discussion] 2023-11-20 17:41:20
>>fallin+aj
This is wholly beside the point. The person I'm replying to is clearly saying the only people who believe "GPT is on the path to AGI" are non-technical people who don't "truly understand". Blatantly false. It's like an appeal to authority against an authority that isn't even saying what you're appealing for.