People who say this nonsense need to start properly defining human-level intelligence, because nearly anything you throw at GPT-4, it performs at at least average human level, often well above.
Give criteria that GPT-4 fails and that a significant chunk of the human population doesn't also fail, and we can talk.
Otherwise this is just another instance of people struggling to see what's right in front of them.
It just blows my mind the lengths some will go to to ignore what is already easily verifiable right now. "I'll know AGI when I see it," my ass.
"Average human level" is a pretty boring bar, though. Computers have been doing arithmetic well above "average human level" since they were first invented. The premise of AGI isn't that it can do *something* better than people; it's that it can do *everything* at least as well. Which is clearly still not the case.
It seems you have the wrong idea of what is being conveyed, or of what average human intelligence is. It isn't about being able to do math. It's about being able to invent, mimic quickly, abstract, memorize, specialize, and generalize. There's a reason humans have occupied every continent on Earth, and even places beyond it. It's far more than being able to do arithmetic or play chess. This all seems unimpressive to us because it's normal, to us. But it certainly isn't normal if we look outside ourselves. Yes, there's intelligence in many lifeforms, even ants, but there is some ineffable, difficult-to-express uniqueness to human intelligence (specifically in its generality) that is being referenced here.
To put it one way: a group of machines that could think at the level of an average teenager (or even lower), but 100x faster, would probably outmatch a group of human scientists at solving complex and novel math problems. That isn't "average human level"; it's below it. "Average human level" is just a shorthand for this ineffable _capacity_ to generalize and adapt so well. Because we don't even have a fucking definition of intelligence.
But this is exactly why average is boring.
If you ask ChatGPT what it's like to be in the US Navy, it will have texts written by Navy sailors in its training data and will produce something based on those texts in response to related questions.
If you ask the average person what it's like to be in the US Navy, they haven't been in the Navy, may not know anyone who has, and haven't taken any time to research it, so their answers will be poor. ChatGPT could plausibly give a better response.
But if you put the same questions to someone who *has* been in the Navy, they'll answer better than ChatGPT, even though the average person who has been in the Navy has no greater intelligence than the average person who hasn't.
It's not better at reasoning. It's barely even capable of it; it just has access to training data that the average person lacks.