1. sigmai (OP) 2025-05-06 22:34:13
If the scaling continues. We just don't know whether it will.

It is kind of a meme at this point that there is no more "publicly available"... cough... training data. And while there have been real architectural breakthroughs, a lot of the progress of the last couple of years has come from ever more training of ever larger models.

So, at this point we either need (a) some previously "hidden" super-massive source of training data, or (b) another architectural breakthrough. Without either, this is a game of optimization, and the scaling curves are going to plateau really fast.
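To put a rough number on that plateau, here is a minimal sketch, not anything from the post, using a Chinchilla-style loss form L(N, D) = E + A/N^alpha + B/D^beta. The constants, the 15T-token data cap, and the loss() helper are all illustrative assumptions, not measured values.

    # Sketch of a Chinchilla-style scaling law: L(N, D) = E + A/N**alpha + B/D**beta.
    # Constants are illustrative assumptions, roughly in the range fitted by
    # Hoffmann et al. (2022); the 15T-token data cap is also an assumption.
    E, A, B = 1.69, 406.4, 410.7      # irreducible loss and scale coefficients (assumed)
    alpha, beta = 0.34, 0.28          # parameter and data exponents (assumed)

    def loss(n_params, n_tokens):
        """Loss as a function of model size N and training tokens D."""
        return E + A / n_params**alpha + B / n_tokens**beta

    D_CAP = 15e12                     # assume usable public text tops out around 15T tokens

    for n in (1e9, 1e10, 1e11, 1e12): # 1B -> 1T parameters, data held at the cap
        print(f"{n:.0e} params: loss ~ {loss(n, D_CAP):.3f}")

With D pinned at the cap, the A/N^alpha term keeps shrinking but the B/D^beta floor never moves, so each 10x in parameters buys less and less improvement. That flattening is the plateau.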
