Better, Cheaper Chips: It’s His Law

Two score years ago, a young Silicon Valley executive observed in the pages of an industry journal that the integrated circuit would eventually become cheap enough to be embedded in our daily lives.

The term “Silicon Valley” hadn’t yet been coined (that wouldn’t happen until 1971). Integrated circuits, or chips, were so expensive they were considered suitable only for multimillion-dollar projects, like Apollo moon missions. The home computer wouldn’t appear for nearly 20 years.

Yet Gordon E. Moore, then the research and development director of Fairchild Semiconductor, which he had co-founded, foresaw in 1965 that silicon chips were going to plummet in price. He even tried to approximate the rate, projecting a drop of 50% a year for the next 10 years.

“I never had any idea it was going to be at all precise,” he said last week of his prediction. Yet it was accurate enough to have since gained the stature of holy writ: Moore’s Law.

Moore’s Law has become one of the most oft-quoted insights in science and industry. Stripped of its technical details, it amounts to an expression of faith in the march of technology -- the inexorable trend in electronics toward the smaller, the cheaper, the more powerful.

“Moore’s Law is an example of a tangible belief in the future -- that the future isn’t just a rosy glow,” says Carver Mead, Caltech’s emeritus Betty and Gordon Moore Professor of Engineering and Applied Science, who joined Moore in a public discussion celebrating the law’s 40th anniversary last week. (Mead, who is generally credited with the term “Moore’s Law,” believes he first so described Moore’s insight during a casual chat with a writer many years ago. “It certainly wasn’t anything premeditated,” he told me.)

Unusually for a senior statesman of electronics, Moore was trained as a chemist. Armed with a doctorate from Caltech, he was recruited by William Shockley, soon to win the Nobel Prize as co-inventor of the transistor, to join his pioneering Palo Alto electronics company in 1956. How new was the field? “Hardly anybody knew anything about semiconductors,” Moore related, “especially silicon.”

Shockley’s paranoid management style soon drove Moore and seven other executives to quit. They formed Fairchild, earning the lifelong enmity of Shockley, who labeled them “the traitorous eight.” In 1968, along with Robert Noyce, another member of the original renegade group, Moore left Fairchild and co-founded what would become semiconductor giant Intel Corp.

At 76, Moore projects a wry, self-effacing sense of humor about his distinguished career. The hearing aids he wears as a concession to age, he says, are the consequence of his youthful infatuation with chemical experimentation in his parents’ Redwood City home. A drop of homemade nitroglycerin placed on an anvil and slammed with a hammer, he had discovered, “made an absolutely superior firecracker.”

Moore’s Law first appeared in an article he published in the journal Electronics on April 19, 1965. Inelegantly entitled “Cramming more components onto integrated circuits,” the piece observed that (technically speaking) the complexity of the most economical chips being manufactured had approximately doubled every year for the prior four years, and that their cost per component had fallen at almost the same rate. Projecting the timeline out another decade, he estimated that by 1975 the model chip would hold 65,000 transistors on a wafer of silicon a quarter-inch square.

He was not off by much; there would be nine doublings of complexity in that 10-year span. Subsequently, however, the pace slowed, largely because chip designers had already exploited the easiest miniaturization strategies. Since 1975, the doubling has come every 21 months, on average. Even at that pace, the power of electronics has exploded. A version of Intel’s Itanium 2 chip introduced last year holds 592 million transistors. Its forthcoming Montecito chip will comprise 1.72 billion.
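
For readers who want to check the arithmetic, the law is simple compounding: a count that doubles every T months grows by a factor of 2^(t/T) after t months. Here is a minimal sketch in Python. The 64-component 1965 baseline is taken from Moore's original article; the Itanium 2 comparison is a back-of-the-envelope illustration, and the pace it implies differs somewhat from the 21-month average cited above, which tracks a broader mix of chips.

import math

def project(count, years, months_per_doubling):
    """Project a component count forward given a doubling interval."""
    doublings = years * 12 / months_per_doubling
    return count * 2 ** doublings

# Moore's 1965 extrapolation: roughly 64 components, doubling every
# 12 months, gives 64 * 2^10 = 65,536 by 1975 -- the "65,000" above.
print(f"1975 projection: {project(64, 10, 12):,.0f}")

# Back-of-the-envelope: how many doublings separate the 1975 chip from
# the 592-million-transistor Itanium 2 of 2004, and at what average pace?
doublings = math.log2(592_000_000 / 65_536)
print(f"doublings, 1975-2004: {doublings:.1f}")   # about 13
print(f"average pace: {29 * 12 / doublings:.0f} months per doubling")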

Moore is careful to explain that he was not intending to draw a technological roadmap in 1965, but merely arguing that silicon chips would have more applications than anyone anticipated. “People thought integrated circuits were very expensive, that only the military could afford the technology. But I could see that things were beginning to change. This was going to be the way to make inexpensive electronics.”

His foresight was exemplary; the 1965 article predicted that the drop in cost would “lead to such wonders as home computers ... automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today.”

Over four decades, technological progress has been so closely tied to Moore’s Law that Silicon Valley gurus constantly worry about when the price/miniaturization trend will run out. Several times in the last 40 years, seemingly insurmountable physical barriers have loomed. Each time, however, a sharp turn in technology gave Moore’s Law another couple of decades of life. “It still amazes me how far we’ve been able to go,” Moore says.

Still, it’s assumed in the industry that the trend finally is nearing its ultimate limit: The devices crammed onto silicon wafers are approaching atomic scale, at which point moving electrons along the minuscule pathways etched on the chips will become impossible. (Some estimates place the end around 2020.)

“One of these days we’re going to have to stop making things smaller,” Moore says. But he’s wary of predicting developments too far into the future, mindful that even his brash forecast in 1965 turned out to be conservative. “As engineers,” he observes, “we’re always way too optimistic in the short run. But in the long run, things will always evolve much further than we can see.”

*

Golden State appears every Monday and Thursday. You can reach Michael Hiltzik at golden.state@latimes.com.
