I have to admit, of the four, Karpathy and Sutskever were the two I was most impressed with. I hope they go on to do something great.
When the next wave of new deep learning innovations sweeps the world, Microsoft eats what's left of them. They make lots of money, but they don't have a future unless they replace what they lost.
E.g. Oppenheimer's team created the bomb, then the experts who followed fine-tuned the subsequent weapon systems and payload designs. Etc.
Engineering level:
Solve rising CO2 levels.
End sickness and death.
Enhance cognition by integrating with willing minds.
Safe and efficient interplanetary travel.
Harness vastly higher levels of energy (solar, nuclear) for global benefit.
Science: Uncover deeper insights into the laws of nature.
Explore fundamental mysteries like the simulation hypothesis, the Riemann hypothesis, multiverse theory, and the existence of white holes.
Effective SETI.
Misc: End of violent conflicts.
Fair yet liberal resource allocation (if still needed): "from scarcity to abundance".
AI does not experience fatigue or distractions => consistent performance.
AI can scale its processing power significantly, despite the associated challenges (which I acknowledge).
AI can ingest and process new information at an extraordinary speed.
AIs can rewrite themselves
AIs can be replicated (solving the scarcity of intelligence in manufacturing).
Once AGI is achieved, progress could compound rapidly, for better or worse, due to the above points.

A model that is as good as an average human but costs $10,000 per effective man-hour to run is not very useful, but it is still an AGI.
Geohot (https://geohot.github.io/blog/) estimates that a human brain equivalent requires 20 PFLOPS. Current top-of-the-line GPUs deliver around 2 PFLOPS and consume up to 500 W. Scaling that linearly gives 10 GPUs drawing about 5 kW, which translates to approximately 3 EUR per hour if I calculate correctly.
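The arithmetic behind that estimate can be sketched as a few lines of Python. Note the electricity price of 0.60 EUR/kWh is an assumption on my part, chosen so the result lands near the ~3 EUR/hour figure; pick your own local rate.

```python
# Back-of-the-envelope cost of running a "human brain equivalent" in GPUs,
# using the numbers from the comment above.

BRAIN_PFLOPS = 20      # geohot's estimate for a human brain
GPU_PFLOPS = 2         # rough throughput of a top-end GPU
GPU_WATTS = 500        # per-GPU power draw
EUR_PER_KWH = 0.60     # assumed electricity price (adjust for your region)

gpus_needed = BRAIN_PFLOPS / GPU_PFLOPS        # 20 / 2 = 10 GPUs
power_kw = gpus_needed * GPU_WATTS / 1000      # 10 * 500 W = 5 kW
cost_per_hour = power_kw * EUR_PER_KWH         # 5 kWh/h * 0.60 EUR/kWh

print(f"{gpus_needed:.0f} GPUs, {power_kw:.1f} kW, ~{cost_per_hour:.2f} EUR/hour")
# → 10 GPUs, 5.0 kW, ~3.00 EUR/hour
```

This counts electricity only; amortized hardware cost would add a comparable amount per hour, but either way it is orders of magnitude below the $10,000 per man-hour figure mentioned above.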