Relative to his level of fame, though, I'm not so sure about his actual contribution to pushing AI forward.
I deeply appreciate his educational content and I'm glad it has given him a way to gain influence and sustain a career. Hopefully he's earned enough from it that he can focus 100% on the educational stuff!
That blog post inspired Alec Radford at OpenAI to do the research that produced the "Unsupervised sentiment neuron": https://openai.com/research/unsupervised-sentiment-neuron
OpenAI then decided to see what would happen if they scaled up that model using the new Transformer architecture invented at Google, and the result was something called GPT: https://cdn.openai.com/research-covers/language-unsupervised...
"In fact, I’d go as far as to say that
The concept of attention is the most interesting recent architectural innovation in neural networks."
when the initial attention paper was less than a year old, and two years before the transformer paper.