Times are grim, but Silicon Valley knows how to invent its way out of adversity. Three times the vultures have circled, three times the valley's technologists have chased them away.
In the 1970s, foreign manufacturers undercut Intel and other local titans with cheaper memory chips, ending years of easy profits and threatening to return Santa Clara County to farmland. Intel then shifted its focus to microprocessors, the more expensive brains of the computer. Microprocessors not only saved Intel, they also stimulated a cycle of innovation that brought personal computers to the masses.
When the PC era began in 1975, economist Doug Henton says, Silicon Valley had 830 technology companies; by 1990 it had 3,000. Five years later that boom was going bust as PCs became ubiquitous, boring and cheap.
But the valley roared back again when the Internet created a surge of innovations, from Web sites to e-commerce to high-speed telecommunications links for the home. And long before the collapse of the Internet economy, valley scientists were advancing theories and technologies that might again shape the future. Hopeful signs point to biotechnology for new drugs and medical treatments, and nanotechnology--structures at the molecular or atomic scale--or maybe a combination of the two. Nano-processors could create computers thousands of times more powerful than today's models. Some optimists predict that nano-bots the size of a virus and with the computational abilities of today's most powerful computers could be injected into the bloodstream to hunt and kill cancer cells.
In electronics, smaller traditionally means cheaper--think $20 laptop computers within a generation. Such pricing forces companies to rethink old notions about who and what technology is designed for. Consider GrameenPhone of Bangladesh, which has sold 20,000 cellular phones and solar rechargers to desperately poor "phone ladies." They are earning, on average, twice the nation's per capita income by selling air time to friends and neighbors in their villages, and linking local farmers to markets offering the best price for their harvest. What if a cell phone could be produced so cheaply that every one of those farmers could afford one?
Giant new technology markets would develop, assuming Silicon Valley was open to what A. Richard Newton, dean of UC Berkeley's College of Engineering, calls "radical, disruptive business models" that will define the future. To win that future, companies will have to buck quarter-to-quarter capitalism and risk failure on far-out plans. Fortunately, some in the valley already are doing just that.
The long-promised emergence of biotech as a dominant force in the economy has never quite materialized. Silicon Valley may change that with a marriage of bio- and electronic technologies.
The region boasts 750 life sciences firms--plus Stanford University, UC Berkeley and UC San Francisco--to form the largest concentration of biotech talent on the planet. About 150 companies are using "bioinformatics" to create pharmaceutical products, or to find cures for cancer or AIDS, according to economist Henton, president of Collaborative Economics in Mountain View. "The next big thing will come out of convergences," he predicts.
Examples abound. Supercomputers made Applera Corp. in Foster City a prime mover in decoding the human genome in 2001--considered one of the great accomplishments in bioscience history. Also, one of the most compelling new tools is the microarray, pioneered by Santa Clara-based Affymetrix Inc. The company builds "gene chips"--dime-sized pieces of silicon onto which are placed half a million unique strands of DNA. In one of many applications for gene chips, cancer researchers pour tumor-cell RNA--genetic material that binds selectively to DNA--over the gene chip, then use a specialized electronic scanner and computer supplied by Affymetrix to analyze the resulting information. The method has helped identify hereditary links to leukemia, and it recently revealed a genetic marker for whether a breast tumor is treatable with surgery--or will become a metastasizing killer.
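The analysis step described above can be sketched in miniature: each spot on the chip carries a known DNA probe, and the scanner reports how strongly labeled RNA bound to it. The probe names, intensity values and threshold below are hypothetical, purely for illustration of the idea.

```python
# Toy model of microarray readout analysis: each probe on the chip carries
# a known DNA sequence; the scanner reports a fluorescence intensity
# proportional to how much tumor-cell RNA hybridized (bound) to that probe.
# Probe names and the threshold are invented for illustration only.

scanner_readout = {
    "PROBE_A": 120,   # weak signal: little RNA bound, gene barely expressed
    "PROBE_B": 5400,  # strong signal: gene highly expressed in the tumor
    "PROBE_C": 4800,
    "PROBE_D": 300,
}

THRESHOLD = 1000  # intensities above this count as "expressed"

def expressed_genes(readout, threshold):
    """Return probes whose signal meets or exceeds the expression threshold."""
    return sorted(name for name, signal in readout.items() if signal >= threshold)

print(expressed_genes(scanner_readout, THRESHOLD))  # ['PROBE_B', 'PROBE_C']
```

A real study compares such expression profiles across many tumors to find the patterns, such as the breast-cancer marker described above, that separate treatable cases from dangerous ones.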
Such technologies are expected to lead to personalized medicine, such as drug regimens customized for a person's genetic profile. Since the decoding of the human genome, the market for such products has exploded. Last year Affymetrix sold 280,000 gene chips to researchers and pharmaceutical companies.
If things go well, a local planning organization called Joint Venture: Silicon Valley Network projects that, by 2005, the region could generate the lion's share of revenues in a $10-billion to $15-billion bio-high-tech market.
Consider sensors so small and durable that they can be dropped from aircraft and reach the earth undamaged, and so smart that they form self-organized, sentient networks. Such devices might be dropped by the thousands onto the forest floor of the San Bernardino Mountains to sniff for the first signs of fire. This is one idea behind "smart dust"--a plan to revolutionize the use of sensors in society.
If UC Berkeley scientists are right, tiny wireless sensors may soon save California $7 billion in annual energy costs. People leave lights on needlessly, and buildings are heated or cooled inefficiently because of expensive, centralized thermostat systems--or no automatic controls at all. The solution: cheap, rice-grain-sized wireless devices that sense light and heat and communicate with one another to cut or dial back power wherever it is being wasted.
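A minimal sketch of that control logic, assuming each sensor reports ambient light and recent occupancy: the field names and thresholds here are invented, not taken from the Berkeley work.

```python
# Minimal sketch of the smart-dust energy idea: each tiny sensor reports
# ambient light and whether the room is occupied; a simple rule decides
# when the lights can be cut. Names and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class SensorReading:
    room: str
    ambient_lux: float   # daylight already present in the room
    occupied: bool       # motion detected recently?

DAYLIGHT_LUX = 300.0  # enough natural light to switch lamps off

def lights_needed(reading: SensorReading) -> bool:
    """Lights stay on only if the room is occupied and daylight is low."""
    return reading.occupied and reading.ambient_lux < DAYLIGHT_LUX

readings = [
    SensorReading("conference", ambient_lux=450.0, occupied=True),   # sunny: off
    SensorReading("hallway",    ambient_lux=40.0,  occupied=False),  # empty: off
    SensorReading("office-12",  ambient_lux=80.0,  occupied=True),   # keep on
]

for r in readings:
    print(f"{r.room}: lights {'ON' if lights_needed(r) else 'OFF'}")
```

In a real deployment the rule would run in the network itself, with neighboring motes sharing readings so a single failed sensor doesn't plunge a room into darkness.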
Smart dust with radiation-detection abilities could soon be inexpensive enough to be embedded in every shipping container to make sure it isn't used to smuggle nuclear weapons. UC Berkeley electrical engineering professor Kris Pister's new company, Dust Inc., plans to commercialize micro-sensors for inventory control and security.
"Any reasonably valuable piece of consumer electronics equipment--like a stereo, a Walkman or a watch--can be tracked with a sensor and disabled if it is stolen," he says. Sensors will eventually be so small and cheap, Pister says, as to allow self-tracking of FedEx parcels. Sensors might be woven into cashmere sweaters or pasted onto a box of cereal to control inventory and foil shoplifters.
These are only the obvious applications. Disneyland might like to offer sensor networks to families. Pin a sensor that looks like Mickey Mouse on 8-year-old Susie's collar, and every time she strays more than 50 feet away, Mom's cell phone beeps--and displays Susie's location on a miniature map.
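The alert logic behind that scenario is nothing exotic: compare the tag's reported position against the parent's and beep past a radius. Everything below (coordinates, the flat-map distance) is a simplified, hypothetical sketch.

```python
# Sketch of the child-locator idea: alert when the tagged sensor strays
# more than 50 feet from the parent's position. Coordinates are in feet
# on a flat park map; positions and names are hypothetical.

import math

ALERT_RADIUS_FT = 50.0

def distance_ft(a, b):
    """Straight-line distance between two (x, y) points, in feet."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def should_alert(parent_pos, child_pos):
    return distance_ft(parent_pos, child_pos) > ALERT_RADIUS_FT

mom = (0.0, 0.0)
susie = (30.0, 45.0)   # about 54 feet away
print(should_alert(mom, susie))  # True: the phone beeps and shows the map
```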
While privacy advocates are leery of a society of ubiquitous sensors, Pister argues that they are being developed in collaboration with anthropologists and legal experts to balance intrusiveness against social norms. And personal networks will be encrypted, he says, so Mom can find Susie, but a predator can't.
Nearly two decades ago, Silicon Valley theoretician K. Eric Drexler, drawing on the ideas of Nobel physicist Richard P. Feynman, predicted that someday armies of "assemblers"--autonomous microscopic robots--would create any object or substance, from microchips to a genetically perfect human liver, much the way an auto assembly line builds cars today.
Nanotechnology has elicited unlimited fascination and anxiety. Michael Crichton's latest bestseller, "Prey," involves self-reproducing nano-bot swarms that become a sentient plague that infects and tries to kill its makers. Real-world nanotechnologists dismiss such far-fetched scenarios. "I'd put the probability of a Drexlerian assembler being built at about that of this [conference room] table jumping up and flying to the moon," says R. Stanley Williams, a top nano-scientist at Hewlett-Packard Labs in Palo Alto.
But nanotech circuits already are a reality. Researchers in several labs have created functional electrical switches out of a single molecule. If they can figure out how to wire up billions of such molecules, the work would move into products such as processors that are thousands of times more powerful than today's, yet so tiny that they could draw all the power they need from a nano-scale solar cell.
HP scientists have used an imprinting technique--akin to minting a microscopic coin--to create a simple but fully programmable microcircuit that manipulates 64 discrete "0s" and "1s," the binary foundation of digital data. The circuit is 20 times smaller than today's most advanced silicon chips; about 1,000 can fit on the end of a strand of human hair.
Within a few years, researchers expect to build commercially viable nano-chips. They don't foresee the new technology supplanting silicon microprocessors, but working with today's technologies to open up a range of applications--from hand-held X-ray machines to aerial surveillance drones the size of wasps.
Hip business executives are fond of flashing super-thin, 3-pound notebook computers that cost about $2,000. What if you could have a 1- or 2-pound equivalent for $500? It's possible if promising research pans out in a field that can be described in one word: Plastics.
Researchers are close to getting rid of the glass in notebook PCs, global-positioning systems for cars, and all manner of portable electronic devices, and replacing it with inexpensive polymers the thickness and flexibility of a zip-lock freezer bag. The glass--as well as the circuits and hard plastic or metal that protect it--amounts to about one-third of the weight of such devices.
Researchers at Xerox Corp.'s Palo Alto Research Center are creating such displays by spraying circuitry onto plastic sheets using ultrafine drops of conductive "ink." The process--far less costly than today's display manufacturing methods--works much like printing on paper with an ink-jet printer. It's one of several similar methods being explored by a range of companies that could open entirely new options of size and shape for any device that uses a glass display. For example, imagine a cell phone that resembles a small pencil, with a rolled-up screen inside. Scientists eventually plan to build plastic microprocessors in the same manner--again, at a fraction of today's costs.
Most such products are still years away. Among the problems to be solved: durability. For example, a navigation system "has to be able to sit in your car, on your dashboard, in Tucson in the summer and Wisconsin in the winter," says Raj Apte, who directs polymer research at the Palo Alto Research Center. However, within five years, he says, "we'll make something like a PalmPilot that you can sit on without having to worry about it." An inexpensive TV screen covering a large living room wall, yet only a few millimeters thick, may be on the horizon.
The cost of making flexible electronics would be a pittance compared to the average $5 billion cost of building a typical plant for microprocessors. If Apte is right--and big companies are betting hundreds of millions of dollars in R&D that he is--flexible electronics could drop the price consumers pay for a hand-held computer to $20. "That will totally change who uses these devices--it will go down from the technology elite to the masses," Apte says.
So how about a 5-cent cell phone?
Apte pauses to ponder, then smiles. "That's at least a decade away."
MINING UNSTRUCTURED DATA
"In the next three years humankind will generate more data"--video footage, Internet traffic, corporate records and even newspapers--"than it has generated in all of human history," says Nelson Mattos, a software expert at IBM's Santa Teresa Laboratory in San Jose. That's hardly reassuring for those of us already drowning in data. So the quest continues for effective ways to tackle information overload.
The goal, Mattos says, is to "organize, index and mine [diverse data types] so that you can discover the trends and patterns." Then exploit that knowledge for everything from corporate marketing to research surveys of thousands of medical papers in multiple languages to detecting potential terrorist plots amid billions of innocuous activities by billions of law-abiding citizens.
Until recently, software that organizes and allows rapid searches of data files was limited to text and numbers. Today's image search tools, for example, require each photo to be labeled with words or numbers before it can be found. You can find more than 4,000 pictures of Winston Churchill on the Internet, but if the late British prime minister's face is posted without his name, it becomes invisible to a search engine such as Google. That's a problem because the vast majority of digital data comes in the form of sounds, images and other nuanced information more difficult to define in easily searchable terms.
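The limitation is easy to see in the way a text search engine is built: its index maps words to documents, so a photo with no caption contributes no words and can never match a query. The file names and captions below are invented for illustration.

```python
# Why an unlabeled photo is invisible to text search: the inverted index
# maps words to documents, so an image with no caption contributes no
# words and can never match a query. File names and captions are invented.

from collections import defaultdict

documents = {
    "photo_001.jpg": "Winston Churchill at Yalta 1945",
    "photo_002.jpg": "",  # the same face, but no caption: nothing to index
}

def build_index(docs):
    """Map each caption word to the set of documents containing it."""
    index = defaultdict(set)
    for doc_id, caption in docs.items():
        for word in caption.lower().split():
            index[word].add(doc_id)
    return index

index = build_index(documents)
print(index.get("churchill", set()))  # only photo_001.jpg can be found
```

Content-based search flips this around: instead of indexing caption words, the system must extract searchable features from the pixels themselves.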
IBM and other companies are developing methods to organize everything from millions of hours of commercial television archives to medical X-rays to satellite images scattered across thousands of locations on the Internet. Commercial products are starting to apply similar search techniques by labeling video streams--to recognize scene changes, individuals, locations and voices. Before long a news network will be able to identify every video clip of John Lennon singing "Imagine" in archives dating back decades. Or a company wondering how it is perceived by the public might conduct a "sentiment analysis" that distills millions of media references, images and opinions, sifting and measuring them with artificial intelligence software.
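In its simplest form, such sentiment analysis just scores each mention against lists of positive and negative words; real systems are far more sophisticated. The lexicon and sample headlines below are invented for illustration.

```python
# Toy "sentiment analysis" of the kind described: score each media mention
# against small lists of positive and negative words. The lexicon and the
# sample headlines are hypothetical, for illustration only.

POSITIVE = {"innovative", "praised", "reliable", "breakthrough"}
NEGATIVE = {"recall", "lawsuit", "criticized", "outage"}

def sentiment(text: str) -> int:
    """Positive-word count minus negative-word count for one mention."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

mentions = [
    "Analysts praised the innovative new chip",       # score +2
    "Customers criticized the service outage",        # score -2
]
for m in mentions:
    print(sentiment(m), m)
```

Averaged over millions of mentions, even a crude score like this can show whether coverage of a company is trending warmer or colder.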
Security companies are also trying to develop reliable facial recognition systems--deployed so far with mixed results at the Super Bowl two years ago and at a few airports since Sept. 11, 2001. The FBI hopes to use such methods to interpret security-cam pictures, phone taps, immigration files, financial records and millions of other data points to thwart terror attacks.
That ability, to predict events from "the soup of billions of possible coincidences," as Stanford University computer scientist Jeffrey D. Ullman has called it, is the holy grail of data mining. It's also still years away. But more modest efforts to use and exploit the data stream are expected to create major new market opportunities.