1. wsgeor (OP) 2023-03-25 22:53:21
The most interesting thing about this article is the claim that GPT-4 has 1 trillion parameters.

Microsoft's recent GPT-4 paper [0] hints at the "unprecedented scale of compute and data" used to train the model. What else do we know about the new model itself?

[0] https://www.microsoft.com/en-us/research/publication/sparks-...
