zlacker

1. achow+(OP)[view] [source] 2024-03-01 08:11:07
>>modele+z4
> Microsoft gained exclusive licensing to OpenAI's GPT-3 language model in 2020. Microsoft continues to assert rights to GPT-4, which it claims has not reached the level of AGI, which would block its licensing privileges.

Not sure this is common knowledge - the MSFT licence vis-à-vis AGI.

2. modele+z4[view] [source] 2024-03-01 08:56:05
3. rickde+06[view] [source] 2024-03-01 09:11:44
>>achow+(OP)
It's described here: https://openai.com/our-structure

Quote:

  Fifth, the board determines when we've attained AGI. Again, by AGI we mean a highly autonomous system that outperforms humans at most economically valuable work. Such a system is excluded from IP licenses and other commercial terms with Microsoft, which only apply to pre-AGI technology.

> "Musk claims Microsoft's hold on Altman and the OpenAI board will keep them from declaring GPT-4 as a AGI in order to keep the technology private and profitable."

Well.....sounds plausible...

◧◩
4. jp_nc+Zr[view] [source] 2024-03-01 13:10:24
>>rickde+06
If he thinks GPT-4 is AGI, Elon should ask a team of GPT-4 bots to design, build and launch his rockets and see how it goes. If “economically valuable work” means creating terrible, wordy blog posts then yeah I guess it’s a risk.
◧◩◪
5. bart_s+Cw[view] [source] 2024-03-01 13:47:44
>>jp_nc+Zr
I don’t think GPT-4 is AGI, but that seems like a foolish idea. An AGI doesn’t need to be hyperproficient at everything, or even anything. Ask a team of any non-aeronautical engineers to build a rocket and it will go poorly. Do those people not qualify as intelligent beings?
◧◩◪◨
6. spence+p11[view] [source] 2024-03-01 16:48:25
>>bart_s+Cw
Outperforming humans does not mean outperforming an average untrained human
◧◩◪◨⬒
7. stubis+0l2[view] [source] 2024-03-02 00:39:00
>>spence+p11
Why does AGI imply outperforming any humans? Half the humans on the planet perform worse than the average human. Does that make them non-intelligent?
◧◩◪◨⬒⬓
8. spence+yp2[view] [source] 2024-03-02 01:30:27
>>stubis+0l2
That's the definition given by Sam Altman above