1. alexel+(OP) 2023-05-16 23:19:04
I don’t think you really disagree with GP? I think the argument is that we’ve peaked on “throw GPUs at it”?

We have all kinds of advancements to make training cheaper, models computationally cheaper, smaller, etc.

Once that happens (or already has), it benefits OAI to throw up walls via legislation.

replies(2): >>Neverm+c6 >>yarg+tc1
2. Neverm+c6 2023-05-17 00:00:02
>>alexel+(OP)
No way has training hit any kind of peak in cost, compute, or training-data efficiency.

Big tech advances, like the models of the last year or so, don't happen without a long tail of significant improvements, based on fine-tuning at a minimum.

The number of advances being announced by disparate groups, even individuals, also indicates improvements are going to continue at a fast pace.

3. yarg+tc1 2023-05-17 11:13:14
>>alexel+(OP)
Yeah, it's a little bit RTFC to be honest.