The computer chips that physicist Marvin Cohen and his colleagues at the Lawrence Berkeley Laboratory have in mind are so small and powerful that four wine bottles could contain enough to store all the information in all the human brains in the world. A thread of the same circuits less than one-hundredth of an inch long could easily hold all the information in all the books ever written.
These theoretical computer circuits would be constructed from complex carbon molecules called nanotubes that have the same electrical properties as the silicon semiconductors used in most computers today. They would be a hundred times stronger than steel, as fast as a conventional supercomputer and, best of all, would assemble themselves. These are chips measured in nanometers: one billionth of a meter--the size of some viruses. They promise circuits 100 times smaller than the most miniature devices available today--computers that can be woven into clothing, painted onto walls, injected into the bloodstream or sprinkled like fairy dust in the air.
It is one vision of what lies beyond Silicon Valley, when the technology of conventional semiconductors has exhausted its possibilities and the cost of producing increasingly complex silicon chips becomes more than anyone can pay.
Researchers at Caltech, Stanford and MIT also are exploring the quirky quantum properties of subatomic particles for special purpose calculators. At USC and other centers, groups have harnessed the genetic code to program molecules of DNA so that the natural biological machinery of life will solve scientific equations.
"Just as the transistor was mind-boggling 50 years ago, and then the silicon chip, the idea that you are getting down to devices that are just a few atoms thick truly is mind-boggling," said Cohen of UC Berkeley.
But scientists have yet to discover a way to assemble these molecules for nanocomputers into the flawless circuits that today's computers demand.
It is not for want of trying. Already, scientists wielding electron beams like arc welders have built experimental structures thousands of times smaller than a human hair--gears that turn, pumps that operate, electric turbines only 60 microns in diameter that run on static electricity, transistors only 10 atoms in diameter. IBM researchers recently built a working abacus in which carbon molecules slide along microscopic copper grooves.
Not to be outdone, two Cornell University scientists crafted a guitar just 10 microns long, about the size of a single cell. They pluck its six silicon strings--each about 100 atoms wide--with an atomic force microscope.
But even this skill in molecular machining falls short of the manufacturing perfection required for conventional circuits, said computer designer Philip J. Kuekes at Hewlett-Packard Laboratories in Palo Alto.
Instead, scientists like UCLA chemist James R. Heath hope this next generation of molecular machines will build themselves. In theory, they could be grown the way crystals grow their complex lattices, arising spontaneously from the right combination of chemicals. "They would be transistors in a beaker," Heath said.
The problem is that some fraction of these self-assembling molecules will always be defective, an unavoidable consequence of the chemistry used to assemble them.
At first glance, the nanotube, a molecule discovered in 1991, seems the perfect candidate for a self-assembling computer circuit.
So tiny that 10,000 will fit in the thickness of a human hair, nanotubes form naturally from a mist of heated carbon vapor in sheets exactly one atom thick. Without any prompting, the sheets neatly roll up into tubes as part of their natural chemistry. They can conduct electricity as well as any copper wire and they also can form the semiconductors used to make computer circuits.
At the same time, however, the tubes also tangle themselves into snarls of conducting and semiconducting tubes that contain countless flaws, any one of which would crash a conventional computer.
Now, Heath at UCLA and Kuekes at Hewlett-Packard have found a way around that stumbling block. In the process they turn conventional computing on its head.
There is no need to eliminate the errors in complex chips, they determined. Instead, let the computer diagnose and heal itself. "It is possible to make detours around defects," Kuekes said.
They proved their idea on an experimental computer built at Hewlett-Packard called Teramac, which contains 220,000 hardware defects, any one of which would be fatal to a conventional computer.
The key to the Teramac design is the thousands of extra connections between its components--11,000 wires on each chip, compared with several hundred on a Pentium chip--that let the computer steer around any defect. It can still work even if only a fraction of its components are functioning properly.
Despite the defects, the refrigerator-sized Teramac computer not only works; it operates 100 times faster than a normal high-performance computer workstation.
"For most computers, a defective chip or connection must be physically repaired or replaced for the system to be operational," Heath and his colleagues reported recently in Science. "For Teramac, all 'repair' work was done with software. A program was written to locate the mistakes and create a defect database for the computer."
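The idea the researchers describe--test the hardware, record what failed in a defect database, then route every signal around the bad spots--can be illustrated with a toy sketch. This is not Teramac's actual software, just a minimal, assumed model: components sit on a grid, failed ones are logged in a defect set, and a breadth-first search finds a working detour between any two good components.

```python
from collections import deque

def map_defects(grid):
    """Build a 'defect database': the coordinates of every component
    that failed its self-test (marked 'X' in this toy grid)."""
    return {(r, c) for r, row in enumerate(grid)
            for c, cell in enumerate(row) if cell == 'X'}

def route_around(grid, start, goal):
    """Breadth-first search for a path from start to goal that detours
    around every component recorded in the defect database."""
    defects = map_defects(grid)
    if start in defects or goal in defects:
        return None
    rows, cols = len(grid), len(grid[0])
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nxt = (nr, nc)
            if (0 <= nr < rows and 0 <= nc < cols
                    and nxt not in seen and nxt not in defects):
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # no working route: too many defects in the way

# A 4x4 fabric with three dead components ('X'); '.' cells work.
fabric = ["..X.",
          ".X..",
          "....",
          "..X."]
path = route_around(fabric, (0, 0), (3, 3))
```

As in Teramac, all the "repair" happens in software: the hardware stays flawed, and the routing step simply treats defective components as obstacles.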
As an advance in computer architecture, their defect-tolerant system promises to help open the way for a revolution in molecular engineering by freeing designers to build molecular devices with defective circuits. While the extra wiring it requires is too expensive for commercial silicon-based computers, the Teramac approach could prove cost-effective for nanocomputers, whose self-assembled components are cheap but inevitably flawed.
Cohen, who helped formulate some conceptual underpinnings of the new computer architecture, said it might prove especially useful for molecule-sized nanocomputers that assemble themselves by following the natural laws of chemistry.
"They built an architecture that was very tolerant," Cohen said. "It is laid back. If something is wrong with one path, it takes another path. That architecture might be useful for molecular computers or computers based on DNA sequences."
At the Lawrence Berkeley Lab, Alexander Zettl and his team of condensed matter physicists already are testing a very primitive prototype of an electronic circuit made of nanotubes. It is a first step toward what researchers have dubbed the "tube cube"--a device the size of a sugar cube packed with billions of nanotubes wired into a network of miniature computers. Still, Cohen cautioned that any working molecular computer is still at least a decade or more away.
"Our hope is that we will hook up eventually with their computer architecture and use it for our [nanotube] system," Cohen said. "But we are not ready for them and they are not ready for us."
Beyond Silicon Valley
* Existing silicon chips: Today's computers use silicon chips that must be flawless to work properly.
* Using imperfect chips: The Teramac computer can find a way around the defects. It locates the mistakes and creates a defect database.
* Nanotubes: These unusual carbon molecules can conduct electricity. They can be used as diodes, transistors and other components of computer circuits.
* Replacing silicon chips: Nanotubes form naturally into complex but imperfect chip-like structures. Teramac technology could be used to make nanotube chips work despite the imperfections.
* The new chip: A nanocomputer chip would take up an area of about 0.1 square millimeters. A half-inch cube could contain billions of nanotubes wired into a computer network. This kind of computer is at least a decade or more away.
* Connecting wires: Made of self-assembling metallic particles, they are so small that the distance between them is a single molecule.
HITTING THE SILICON WALL
* Eventually, the complex silicon chip will cost too much to manufacture. By 2010 it is estimated that the cost of a silicon chip fabrication facility will exceed $30 billion, compared to $2 billion today.
Celeron 300A chip in 1998: 19 million transistors
Pentium II chip in 1997: 7.5 million transistors
Pentium chip in 1993: 3.1 million transistors
486 DX chip in 1990: 1.2 million transistors
386 chip in 1985: 275,000 transistors
4004 chip in 1971: 2,300 transistors
Sources: Science, UCLA, Hewlett-Packard, Intel.