* By this definition I mean that humanity will no longer be able to help in any meaningful, qualitative way with intellectual tasks (e.g. AGI > human; AGI > human + computer; AGI > human + internet; AGI > human + LLM).
Fundamentally, I believe AGI will never happen without a body. Intelligence requires constraints, and the ultimate constraint is life. Some omniscient, immortal thing seems neat, but I doubt it will be as smart, since it lacks the constraints that drive growth.
Such a system needs vast resources to operate. As competition in AI heats up, it will continually have to create new levels of value to survive.
I'm not making any predictions about OpenAI, except that as its machines get smarter, they will also become more explicitly focused on its survival.
(As opposed to AI's implicit contribution to value creation today. For now, the AI is in a passive role.)