>>nikhil (OP)
The most interesting thing about this article is the claim that GPT-4 has 1 trillion parameters.
Microsoft's recent GPT-4 paper [0] hints at the "unprecedented scale of compute and data" used to train the model. What else do we know about the new model itself?