zlacker

[return to "Thousands of AI Authors on the Future of AI"]
1. mmaund+in[view] [source] 2024-01-08 23:13:58
>>treebr+(OP)
I think history has shown us that we tend to underestimate both the rate of technological progress and its acceleration.

It's tempting to look at Moore's law and use, say, the development of the 8080, Z-80, and 6502 in 1975 as an epoch. But it's hard to use that to get a visceral sense of how much things changed. I think RAM - in other words, available memory - may be more helpful, and it relates, if distantly, to model size and available GPU memory.

So the question is, if we surveyed a group of devs, engineers and computer scientists in 1975 and asked them to extrapolate and predict available RAM a few decades out, how well would their predictions map to reality?

In 1975 the Altair 8800 microcomputer with the 8080 processor had 8K of memory in the high-end kit (4096 words).

8 years later, in 1983, the Apple IIe (which I learned to program on) had 64K RAM as standard, or 8 times the RAM.

13 years after that, in 1996, 16 to 32 MB was fairly commonplace in desktop PCs. That's 32,768K, which is 4096 times the 8K available 21 years earlier.

30 years on from 1975, in 2005, it wasn't unusual to find 1GB of RAM, or 1,048,576K, which is 131,072 times the 8K from 30 years earlier.

Is it realistic to expect a 1975 programmer, hardware engineer or computer scientist to predict that available memory in a desktop machine would be over 100,000 times greater 30 years in the future? We're not even taking into account moving from byte-oriented CPUs to 32-bit CPUs, or memory bandwidth.
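The multipliers above are easy to sanity-check, and they also imply a rough doubling time for RAM. A minimal sketch (these are the ballpark figures quoted in this comment, not authoritative hardware specs):

```python
import math

# RAM data points as quoted above, in bytes (commenter's ballpark figures).
points = {
    1975: 8 * 1024,           # Altair 8800, 8K high-end kit
    1983: 64 * 1024,          # Apple IIe, 64K standard
    1996: 32 * 1024 * 1024,   # typical desktop, 32 MB
    2005: 1024 ** 3,          # typical desktop, 1 GB
}

base_year, base = 1975, points[1975]
for year, ram in sorted(points.items()):
    if year == base_year:
        continue
    factor = ram // base                      # growth multiple since 1975
    years = year - base_year
    doubling = years / math.log2(factor)      # implied doubling time (years)
    print(f"{year}: {factor:,}x in {years} years "
          f"(~{doubling:.1f}-year doubling time)")
```

Running this reproduces the 8x, 4,096x, and 131,072x figures, and shows the implied doubling time tightening toward roughly every 1.8 years, which is why a linear-minded 1975 extrapolation would have missed so badly.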

2054 is 30 years in the future. It's going to fly by. Given the unbelievable rate of change we've seen in the past, and how it keeps accelerating, I think any prediction today from the smartest and most forward-thinking people in AI will vastly underestimate what 2054 will look like.

Edit: 32-bit CPUs, not 64. Typo.

2. arp242+9v[view] [source] 2024-01-08 23:55:39
>>mmaund+in
> I think history has shown us that we tend to underestimate both the rate of technological progress and its acceleration.

It's also been overestimated tons of times. Look at some of the predictions from the past. It's been a complete crap-shoot. Many things have changed significantly less than people have predicted, or in significantly different ways, or significantly more.

Just because things are accelerating at a great pace right now doesn't really mean anything for the future. Look at the predictions people made during the "space age" 1950s and 60s. A well-known example would be 2001 (the film and novel). Yes, it's "just" some fiction, but it was also a serious attempt at predicting what the future would roughly look like, and Arthur C. Clarke wasn't some dumb yahoo either.

The year 2001 is more than 20 years in the past, and obviously we're nowhere near the world of 2001, for various reasons. Other examples include things like the Von Braun wheel, predictions from serious scientists that we'd have a moon colony by the 1990s, etc. etc. There were tons of predictions and almost none of them have come true.

They all assumed that the rate of progress would continue as it had, but it didn't, for technical, economic, and pragmatic reasons. What's the point of establishing an expensive moon colony when we've got a perfectly functional planet right here? Air is nice (in spite of what Spongebob says). Plants are nice. Water is nice. Non-cramped space to live in is nice. A magnetosphere to protect us from radiation is nice. We kind of need these things to survive and none are present on the moon.

Even when people are right they're wrong. See "Arthur C Clarke predicts the internet in 1964"[1]. He did accurately predict the internet; "a man could conduct his business just as well from Bali as London" pretty much predicts all the "digital nomads" in Bali today, right?

But he also predicts that the city will be obsolete and "ceases to make any sense". Clearly that part hasn't come true, and likely never will. Can't "remotely" get a haircut, or get a pint with friends, or all sorts of other things. And where are all those remote workers in Bali? In the Denpasar/Kuta/Canggu area. That is: a city.

It's half right and half wrong.

The takeaway is that predicting the future is hard, and that anyone who claims to predict the future with great certainty is a bullshitter, an idiot, or both.

[1]: https://www.youtube.com/watch?v=wC3E2qTCIY8

3. idopms+TC[view] [source] 2024-01-09 00:52:48
>>arp242+9v
> What's the point of establishing an expensive moon colony when we've got a perfectly functional planet right here?

I think this is the big difference between what you're describing and AI. AI already exists, unlike a moon colony, so we're talking about pushing something forward vs. creating brand new things. It's also pretty well established that it's got tremendous economic value, which means that in our capitalist society, it's going to have a lot of resources directed at it. Not necessarily the case for a moon colony whose economic value is speculative and much longer term.
