zlacker

[parent] [thread] 2 comments
1. himara+(OP)[view] [source] 2023-11-19 01:12:51
It's not trivial given current supply bottlenecks, not to mention research expertise.
replies(2): >>draken+R4 >>initpl+Ec
2. draken+R4[view] [source] 2023-11-19 01:50:35
>>himara+(OP)
I don't feel like compute for pretraining the model was a huge constraint?

The supply bottlenecks have been around commercializing the ChatGPT product at scale.

But pretraining the underlying model I don't think was on the same order of magnitude, right?

3. initpl+Ec[view] [source] 2023-11-19 02:35:06
>>himara+(OP)
Control of the supply is with Microsoft, who are likely falling on Sam's side here.
[go to top]