* By this definition, I also mean that humanity will no longer be able to contribute meaningfully, in any qualitative way, to intellectual tasks (e.g. AGI > human; AGI > human + computer; AGI > human + internet; AGI > human + LLM).
Fundamentally, I believe AGI will never happen without a body. Intelligence requires constraints, and the ultimate constraint is life. An omniscient, immortal thing sounds neat, but I doubt it would be as smart, since it lacks any constraints to drive its growth.
You're basically requiring AGI to be smarter/better than the smartest/best humans in every single field.
What you're describing is ASI.
If we have AGI on the level of an average human (which is pretty dumb), it's already very useful. That gives you a robotic paradise where robots do ALL the mundane tasks.