LLMs don't have to be smart enough to be AGI. They just have to be smart enough to create AGI. And if creating something smarter than yourself sounds crazy, remember that we were created by simpler ancestors that we now effortlessly dominate.
>> Footke (OP)
I don't disagree with the general notion, but it seems to me that LLMs being smart enough to create AGI is even more far-fetched than their simply being smart enough to be AGI.