zlacker

[parent] [thread] 6 comments
1. bart_s+(OP)[view] [source] 2024-03-01 13:47:44
I don’t think GPT-4 is AGI, but that seems like a foolish standard. An AGI doesn’t need to be hyperproficient at everything, or even anything. Ask any team of non-aeronautical engineers to build a rocket and it will go poorly. Do those people not qualify as intelligent beings?
replies(2): >>blibbl+Yr >>spence+Nu
2. blibbl+Yr[view] [source] 2024-03-01 16:35:01
>>bart_s+(OP)
> Ask a team of any non-aeronautical engineers to build a rocket and it will go poorly. Do those people not qualify as intelligent beings?

I suspect you'd have one person on the team that would say "perhaps you'd be better choosing a team that knows what they're doing"

meanwhile GPT-4 would happily accept and emit BS

replies(1): >>ToValu+YG
3. spence+Nu[view] [source] 2024-03-01 16:48:25
>>bart_s+(OP)
Outperforming humans does not mean outperforming an average untrained human
replies(1): >>stubis+oO1
4. ToValu+YG[view] [source] [discussion] 2024-03-01 17:42:53
>>blibbl+Yr
Have you used GPT-4? I'd criticize it in the opposite direction. It routinely defers to experts on even the simplest questions. If you ask it to tell you how to launch a satellite into orbit, it leads with:

>Launching a satellite into orbit is a complex and challenging process that requires extensive knowledge in aerospace engineering, physics, and regulatory compliance. It's a task typically undertaken by governments or large corporations due to the technical and financial resources required. However, I can give you a high-level overview of the steps involved:

5. stubis+oO1[view] [source] [discussion] 2024-03-02 00:39:00
>>spence+Nu
Why does AGI imply outperforming any humans? Half the humans on the planet perform worse than the average human. Does that make them non-intelligent?
replies(1): >>spence+WS1
6. spence+WS1[view] [source] [discussion] 2024-03-02 01:30:27
>>stubis+oO1
That's the definition given by Sam Altman above