Chip Improvements and the Real Truth of Moore’s Law

Last month, when Dataquest forecast that the home personal computer market in the U.S. would grow by 8% in 1996, the computer industry reacted with stunned disbelief.

A casual observer couldn’t be blamed for thinking the 8% growth figure was a misprint. After all, that kind of annual growth would have most industries dancing with glee.

But the computer industry has always believed that double-digit growth is its birthright. Why? Because it has Moore’s Law on its side. Coined by Gordon Moore, one of the founders of Intel Corp., Moore’s Law states that the number of transistors that can be squeezed onto a silicon chip doubles every 18 months. For the last 30 years, the law has held true.
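The compounding implied by Moore’s Law is easy to underestimate. A quick sketch in Python (illustrative arithmetic only, not a claim about any particular chip) shows what doubling every 18 months adds up to over the law’s 30-year run:

```python
# Moore's Law: transistor counts double every 18 months.
# Over 30 years, that is 30 * 12 / 18 = 20 doublings.
months = 30 * 12
doubling_period = 18

doublings = months // doubling_period   # 20 doublings
growth_factor = 2 ** doublings          # 2^20 = 1,048,576

print(f"{doublings} doublings -> roughly {growth_factor:,}x more transistors")
# -> 20 doublings -> roughly 1,048,576x more transistors
```

A factor of about a million in 30 years is why the industry came to treat relentless improvement as its birthright.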

The result has been twofold: The cost of making a given chip is regularly cut in half, allowing manufacturers to see high profit margins for a little while. Even better, the constant and dramatic improvements made possible by Moore’s Law keep customers coming back to computer stores with open checkbooks.

So now, as evidence of an industrywide slowdown continues to mount, people are starting to wonder if Moore’s Law is somehow ailing. Perhaps, like Yellowstone’s Old Faithful, which becomes less regular every year, the chip improvements predicted by Moore’s Law are finally slowing down.

Indeed, scientists have been suggesting for years that we will eventually run up against inflexible physical limits on the number of transistors that can fit on a chip. And even Gordon Moore has argued that improvements in chip density will eventually end, because the capital investment required to build each new generation of chips will keep increasing until it is no longer affordable.

But a close look at the numbers shows that Moore’s Law is still doing fine. What Moore’s Law actually says and what people think it says, however, have increasingly diverged. Clearing up some of the misconceptions goes a long way toward understanding the real limits--and limited importance--of Moore’s Law. The most common misconception is that Moore’s Law states that chip speed--rather than transistors per chip--doubles every 18 months.

*

In fact, there isn’t a direct correlation between the number of transistors on a chip and the chip’s processing speed. For proof, just look at RISC--Reduced Instruction Set Computing. RISC, developed in the late 1970s and early 1980s, was based on the idea that simple, streamlined chips could actually run faster than the increasingly byzantine microprocessors of the time. The result was chips that were not only faster than their more complex rivals, but also cheaper and easier to design.

Admittedly, having more transistors to play with does usually translate into improved performance. Engineers can use the extra transistors to duplicate a chip’s processing units, letting it work on more instructions at once. But, as RISC shows, this is not the only path to speed gains.

If, in the future, improvements in transistor density do finally slow, we’ll have to rethink how we use the transistors. The result may be revolutionary improvements that surpass even the steady, evolutionary ones we are accustomed to.

A second misconception people have is that since computers improve exponentially, they will soon be able to solve problems that currently take hours in just seconds. You hear this logic a lot among computer programmers. Ask them why your spreadsheet is so slow, or why the Internet always seems to be operating at a crawl, and they will blithely tell you to just wait another six months for the hardware to catch up.

But sometimes it never does. The problem is that the tasks grow more difficult faster than computers improve. In part, the problem occurs because we constantly expect computers to do more: We want fancier graphics, smarter interfaces and computers that talk. But sometimes the problem is more subtle.

Look, for example, at the “404 Not found” messages that Web surfers are starting to encounter with increasing frequency. The problem stems from devices on the network known as routers, which steer messages to their destination. To do their job, these routers must know how to get to every possible address.

However, the number of computers hooked up to the Net is doubling every nine months--twice as often as computer memory doubles in capacity. The result? Router vendors must throw more and more money at the problem just to keep their heads above water. Stories like these start making Moore’s Law look a lot less impressive.
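The mismatch compounds quickly. A rough sketch, using only the doubling periods stated above (network size every nine months, memory capacity every 18), shows how fast the routing burden outruns the memory available to hold it:

```python
# Network hosts double every 9 months; memory capacity every 18.
# Both are normalized to 1 at the start; the ratio is the router's
# widening shortfall.
for years in range(1, 6):
    months = years * 12
    hosts = 2 ** (months / 9)     # addresses a router must track
    memory = 2 ** (months / 18)   # memory available to track them
    ratio = hosts / memory        # shortfall factor
    print(f"year {years}: table outgrows memory by {ratio:.1f}x")
# After 3 years the table has grown exactly 4x faster than memory
# (2^(36/9) / 2^(36/18) = 2^2).
```

The shortfall itself doubles every 18 months, which is why router vendors must keep spending just to stand still.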

A final misconception is that Moore’s Law is a natural one, something we have fortuitously stumbled upon. That may sound ridiculous, but I’ve heard smart people wonder what physical laws make it possible to double transistor density only in 18-month increments.

The truth is that what appears to be an autonomous technological trajectory is really just a self-fulfilling prophecy. We believe computer performance will double every 18 months, so we invest the resources necessary to make it happen.

*

Companies that build chips amortize their costs based on the assumption that their machinery will be obsolete after just a few years. Computer manufacturers time their product cycles to coincide with the introduction of the next generation of microprocessors. Even software vendors take Moore’s Law into account when planning what capabilities to add to future program versions.

When you look at Moore’s Law in this way--as a rule of thumb predicting improvements in transistor density, rather than an unstoppable dynamo driving the computer industry forward--it becomes easier to understand why sales growth will not always be in the double digits. This may come as a disappointment, but with growth predicted to become more anemic over the next couple of years, it might be best to get our expectations in line with reality now.

* Steve G. Steinberg (steve@wired.com) is an editor at Wired.
