Tuesday, September 8, 2009

Moore's Law is Dead, and I'm not feeling so good myself

Despite widespread assumptions to the contrary, the law attributed to Gordon Moore, popularly read as the doubling of computer chip speeds every eighteen months, has in fact reached the end of its rope. CPUs are no faster today than they were five years ago. Yes, dual- and even quad-core CPUs have arisen to claim the title of state-of-the-art on the laptop/desktop. But for almost all typical applications and workloads, multi-core machines in 2009 actually run slightly slower than a single-core 3.2 GHz Pentium 4 did back in 2005.
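
To put a rough number on why extra cores don't rescue a mostly serial workload, here is a minimal sketch based on Amdahl's law. The 10% parallel fraction below is purely an illustrative assumption, not a measurement of any real application:

    /* Sketch of Amdahl's law: if only a fraction p of a program's work can
       run in parallel, N cores give a speedup of 1 / ((1 - p) + p / N).
       The value of p below is an assumption for illustration only. */
    #include <stdio.h>

    int main(void)
    {
        double p = 0.10;              /* assumed parallelizable fraction */
        int cores[] = { 1, 2, 4 };

        for (int i = 0; i < 3; i++) {
            double speedup = 1.0 / ((1.0 - p) + p / cores[i]);
            printf("%d core(s): %.2fx speedup\n", cores[i], speedup);
        }
        return 0;
    }

Under that assumption, even four cores buy only about an 8% speedup, which is easily eaten by the lower clock rates that multi-core parts typically ship with.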

Strangely, for a phenomenon that has been so central to the advancement and pervasion (invasion?) of computer-based tech in every part of our lives, the law's demise has received remarkably little recognition, and the prospect that computers might not get any faster has been met with general disinterest. I think that small fact is actually a Very Big Deal. It has the potential to stop computer-driven tech advancements dead in their tracks. I think it's entirely possible that those forms of science and technology that depend on ever-faster computers may in fact not develop much further until a new revolution in processor design arrives (e.g. optical or quantum computers). Until we can break the CMOS clock roadblock, and the exponential rise in power consumption that accompanies it, we ain't going nowhere.

In fact, I think the problem is worse than our just standing still. There is a long history of code bloat and slowdown accompanying every advancement in software. New apps, new operating systems, new development tools: all of these are invariably slower than their predecessors. It's been so long since CPUs were slow that an entire generation of programmers has never had to write fast or efficient code. The skill has been lost.

Now it's also true that skill in writing and optimizing code is no substitute for the exponential growth rate of Moore's Law. In most cases, writing fast code (efficient cache management, for example) can achieve no more than a linear speedup. The problem is that the tools (languages, libraries, OSes) continue to slow down, and there's no chance that this trend is going to change for some time to come. Until then, given the end of Moore's Law, we'd better get used to computers that are getting slower, not faster.
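
As a concrete illustration of what that kind of optimization buys, here is a minimal sketch of the classic cache-management win. The matrix size is an arbitrary assumption and the exact ratio varies by machine, but the row-order loop is typically several times faster than the column-order one, a useful but strictly constant-factor gain:

    /* Sketch of cache-friendly vs. cache-hostile traversal of a matrix.
       C stores arrays row-major, so the first loop touches memory
       sequentially while the second jumps N doubles ahead on every access. */
    #include <stdio.h>

    #define N 2048

    static double a[N][N];

    int main(void)
    {
        double sum = 0.0;

        /* Cache-friendly: consecutive accesses share cache lines. */
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                sum += a[i][j];

        /* Cache-hostile: nearly every access is a cache miss. */
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                sum += a[i][j];

        printf("%f\n", sum);   /* keep the compiler from discarding the loops */
        return 0;
    }

That gap stays roughly fixed over time; it doesn't compound every eighteen months the way Moore's Law did.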

I'm not sure what impact this imminent slowdown will have on the warp and woof of computing, especially on mainstays like laptops and cell phones. Probably it means that phones and computers will require greater engineering effort than usual. Designers can't just assume that the CPU will continue to carry the full workload. More custom secondary chipsets will be needed to keep up with specialized tasks (voice processing, speech recognition, media streaming, etc.). The rise of GPUs (graphics processing units) is one example of the inability of CPUs to keep on doing all the heavy lifting. I think new standards in audio processing chipsets may be next (for both speech recognition and generation).

However, I'm more concerned with the impact of Moore's ex-law on two tech areas: telecommunications and information-exploiting computing. If CPUs can't speed up, then networks can't speed up. If networks remain static, then the cost of distributing broadband is not going to keep dropping. ISPs and telecommunications pipelines may 'fail to thrive' and stabilize well below the level of performance needed to realize the full potential of internet-based and/or time-shifted TV broadcasting, universally net-distributed education (e.g. on-demand education), or ubiquitous teleconferencing (badly needed to deliver business collaboration more efficiently, independent of geography or time zone). Without big advances in telecom, the US is going to have a VERY hard time competing with low-wage nations that can charge 1/3 to 1/8 of our labor rate.

My other area of concern, information-exploitation, covers techniques that maximize the value of data and information, like data mining, artificial intelligence, and robotics. Further advancements in these technologies will expand the limits of what computers can do. Without a computer speedup, progress in autonomous and tele-operated robots, software bots, intelligent assistants, and even the indexing and organization of data-into-information-into-value is going to slow substantially. And that has the potential to be surprisingly bad for mature economies like the US.

For more than a century, America's fortunes have depended, if not on outright technological leadership, then at least on our ability to stay ahead of the competition by being first to deliver products that embody the state of the art in technology, something that has not been possible elsewhere in the world, even where labor costs handily out-compete ours. But if advancement of the state of the art slows with the end of Moore's Law, the rising tides in the BRIC (Brazil, Russia, India, China) countries threaten to overtake not only our already-lost heavy industries, but also those tech-driven markets in which we currently enjoy an advantage, like minimal time-to-market for novel uses of IT. Corporations like Intel and Apple that deliver 'hot' or 'cool' goods and services first in the US, and enable US business to trailblaze and invent new markets, likely will not continue to deliver their new toys here first if advancement in info tech slows and China or India gain a chance to invent those novel markets first. (See the books 'The Innovator's Dilemma' and 'The Innovator's Solution' for more on this.)

I think Intel's Andy Grove was right in saying 'Only the paranoid survive.' If the end of Moore's Law means what I think it might, then it's high time for the law's inventors to get as paranoid as possible.
