The power of Moore's Law, which observes that the number of transistors on a
chip doubles roughly every two years, is such that the next five years of
digital progress should bring bigger absolute leaps in raw processor power
than the previous five years did. It's at least possible that we really will
see a massive leap forward in productivity someday soon, one that starts
substantially reducing the amount of human labor needed to drive the economy forward.
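A quick back-of-the-envelope sketch makes that arithmetic concrete. This is purely illustrative, assuming a clean, uninterrupted doubling every two years; the function and baseline here are made up for the example:

```python
# Illustrative sketch: under exponential growth, each five-year span
# brings a bigger *absolute* gain than the one before it.
# Assumes a clean doubling every two years (a stylized Moore's Law).

def capability(years, base=1.0):
    """Relative chip capability after `years`, doubling every 2 years."""
    return base * 2 ** (years / 2)

# Absolute gains over two consecutive five-year spans, measured
# against an arbitrary baseline of 1.0 at year 0.
prev_gain = capability(5) - capability(0)    # years 0-5
next_gain = capability(10) - capability(5)   # years 5-10

print(f"previous five years: +{prev_gain:.2f}x the baseline")  # +4.66
print(f"next five years:     +{next_gain:.2f}x the baseline")  # +26.34
```

Even though the growth *rate* is constant, the absolute jump in the second span (about 26x the baseline) dwarfs the first (about 4.7x), which is the sense in which the coming years "should" outpace the previous ones, so long as the doubling actually holds.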
I wonder how long it's going to take for word of Moore's Law's end to filter out into the wider public perception. Here in the industry, we look at Intel's delays in moving first to 14nm and then to 10nm as clear signs that Moore's Law is sputtering out (as all exponential growth does, eventually). But out in the wider public, there seems to be a perception that chips will get "twice as good" every year in perpetuity. It's going to be interesting to see when (or if) people realize that their hardware isn't getting better at the same year-over-year rate it used to.

I'd think people in the wider public have no clue exactly how much faster computers get every year. Considering that software usually gets slower at about the same rate that processors get faster, I'd imagine many people think the increases in speed are much smaller than they actually are.