zlacker

[parent] [thread] 2 comments
1. joshjo+(OP)[view] [source] 2025-05-06 22:11:51
I mean, if you draw the scaling curves out and believe them, then sometime in the next 3-10 years (plausibly sooner) AIs will be able to match best-case human performance at everything that can be done with a computer, at 10-1000x less cost than a human. Shortly thereafter, robots will be able to do something similar for physical labor (though with a smaller cost delta), and shortly after that we get atomically precise manufacturing and post-scarcity. So the set of things that amounts to nothing is plausibly every field of endeavor that isn't slightly advancing or delaying AI progress itself.
replies(2): >>sigmai+43 >>namari+j81
2. sigmai+43[view] [source] 2025-05-06 22:34:13
>>joshjo+(OP)
If the scaling continues. We just don't know.

It's kinda a meme at this point that there is no more "publicly available"... cough... training data. And while there have been massive architectural breakthroughs, a lot of the progress of the last couple of years has come from ever more training of ever larger models.

So at this point we either need (a) some previously "hidden" super-massive source of training data, or (b) another architectural breakthrough. Without either, this becomes a game of optimization, and the scaling curves are going to plateau really fast.

3. namari+j81[view] [source] 2025-05-07 11:54:05
>>joshjo+(OP)
"Extrapolation" https://xkcd.com/605/