* By this definition I mean that humanity will no longer be able to contribute in any meaningful, qualitative way to intellectual tasks (e.g. AGI > human; AGI > human + computer; AGI > human + internet; AGI > human + LLM).
Fundamentally, I believe AGI will never happen without a body. Intelligence requires constraints, and the ultimate constraint is life itself. Some omniscient, immortal thing sounds neat, but I doubt it would be as smart, since it lacks any constraint to drive it toward growth.
Francis Fukuyama wrote in "The End of History and the Last Man":
> The life of the last man is one of physical security and material plenty, precisely what Western politicians are fond of promising their electorates. Is this really what the human story has been "all about" these past few millennia? Should we fear that we will be both happy and satisfied with our situation, no longer human beings but animals of the genus homo sapiens?
It's a fantastic essay (really, the second half of his seminal book) that I think everyone should read.
Happiness is always fleeting. Aren't our lives a bit dystopian already? We have to work, and for what? So that we can feel meaningful, hoping we never lose our ability to be useful.