I have to admit, of the four, Karpathy and Sutskever were the two I was most impressed with. I hope he goes on to do something great.
When the next wave of new deep learning innovations sweeps the world, Microsoft eats what's left of them. They make lots of money, but don't have a future unless they replace what they lost.
E.g. Oppenheimer’s team created the bomb, then the experts who followed fine-tuned the subsequent weapon systems and payload designs. Etc.
GPT-4o's context window is 128k tokens, which at roughly 4 bytes per token is somewhere on the order of 500kB. Your brain's context window (all the subliminal activations from the nerves in your gut and the parts of your visual field you aren't necessarily paying attention to) is on the order of 2MB. So a similar order of magnitude, though GPT has a sliding window and your brain has more of an exponential decay in activations. That LLMs can accomplish everything they do with just what seems analogous to human reflex rather than human reasoning is astounding, and more than a bit scary.
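A quick back-of-envelope sketch of that comparison. Every figure here is an assumption for illustration (the bytes-per-token average, the brain estimate, and the decay half-life are not measurements):

```python
import math

# Back-of-envelope: rough "working memory" sizes. All figures are assumptions.
TOKENS = 128_000          # GPT-4o context window, in tokens
BYTES_PER_TOKEN = 4       # rough average for English text (assumed)
llm_bytes = TOKENS * BYTES_PER_TOKEN
print(f"LLM context ~ {llm_bytes // 1000} kB")   # prints: LLM context ~ 512 kB

# Sliding window: every token inside the window counts fully, older ones not at all.
def sliding_window_weight(age_tokens: int, window: int = TOKENS) -> float:
    return 1.0 if age_tokens < window else 0.0

# Exponential decay: activation fades smoothly with age (half-life is assumed).
def decay_weight(age_seconds: float, half_life: float = 30.0) -> float:
    return math.exp(-math.log(2) * age_seconds / half_life)
```

The point of the two weight functions is just the shape difference the comment describes: a hard cutoff versus a smooth fade, e.g. `sliding_window_weight(127_999)` is still `1.0` while `decay_weight(30.0)` has already dropped to `0.5`.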