* It also goes without saying that, by this definition, humanity will no longer be able to meaningfully help in any qualitative way with intellectual tasks (e.g. AGI > human; AGI > human + computer; AGI > human + internet; AGI > human + LLM).
Fundamentally, I believe AGI will never happen without a body. Intelligence requires constraints, and the ultimate constraint is life. Some omniscient, immortal thing seems neat, but I doubt it would be as smart, since it lacks the constraints that drive growth.
As I mention in another post, this is why I make no distinction between AGI and superintelligence; I believe they are the same thing. A thought experiment: what would it mean for a human to be superintelligent? Presumably it would mean learning things from the least possible amount of exposure (not necessarily omniscience).