
Where Technology Is Taking Us: Innovation: Consumers were offered loads of new gadgets and choices in the 1980s. The 1990s will allow people to customize products to their own needs.

SPECIAL TO THE TIMES

The most important technological innovation of the 1980s wasn’t in recombinant DNA or software or silicon chips or superconductivity. Nor was it manufactured by Silicon Valley start-ups, refined by the Japanese or lurking in university laboratories.

The greatest breakthrough of the past decade--and the one that will relentlessly shape the 1990s--has been in our radically increased expectations of what technology can do. People fully expect their computers to become faster, cheaper and more powerful--and if American companies can’t deliver, the Japanese will. Molecular biologists are supposed to turn the genetic keys that unlock the cures for cancer. Detroit and Toyota are expected to manufacture cars that deliver more miles per gallon while spewing fewer noxious gases into the atmosphere. Even if the scripts are lousy, next year’s special effects from Hollywood are expected to be even more dazzling than last year’s.

“Technology is breeding a whole new concept of what can or cannot be done,” says Georgetown University humanities professor O. B. Hardison Jr., author of “Disappearing Through the Skylight,” a book that explores how technology has shaped popular culture in this century. “The dynamic is that we’re not going to be limited by history; we’re going to make up new traditions as we go along. . . . Anything goes.”


In the 1980s, technology became personal, portable, accessible and pervasive. Fax machines, cellular phones, test tube babies, compact discs, videocassette recorders and MTV didn’t just transform markets; they transformed our perceptions. Technology wasn’t just a vessel of pop culture; it also became the content. No fewer than six of the top 10 grossing films of the 1980s were based on technological themes. The Challenger, Barney Clark’s artificial heart, the shooting down of Korean Air Lines Flight 007, the fate of the Strategic Defense Initiative and the role of faxes and video in the Tiananmen Square uprising became real-life dramas based on technology. The 1990s begin with technology acknowledged as a powerful medium for personal wealth, political change and global competitiveness.

But where the 1980s were a decade that created a new consciousness of technology by offering a tsunami of choice, the 1990s will be fundamentally different. The emphasis will shift from choosing options to designing alternatives. Instead of selecting a personal computer because you like its interface, you will buy it because you can design the interface to your own liking. Manufacturers will build products in ways that enable you to customize them yourself--if that’s what you want. Technology--as much as politics and culture--will become the medium through which society expresses its values and priorities.

“The way we now think about technology is as an independent force in its own right,” asserts sociologist Christopher Lasch. “Technologies are political; what technologies finally appear as is not something that is decided in a laboratory.”

What are the forces shaping our techno-expectations? As one looks at disciplines ranging from computers to pharmaceuticals to materials science, one of the most powerful themes driving technological development is the belief that everything can be engineered.

In computer technology, notes Lotus Development Corp. founder Mitch Kapor, one of the hottest fields is “software engineering”--the idea that millions of lines of computer code can be engineered as rigorously and reliably as a suspension bridge or a silicon chip. In organic chemistry, chemists and molecular biologists talk about “protein engineering” as enthusiastically as they do about genetic engineering. Even on Wall Street, mathematically gifted “quants” promote the “financial engineering” of options, futures and other instruments as the profitable path to money management.

“There’s definitely a new sensibility here,” says Harvard University’s Daniel Bell, the sociologist who coined the phrase “Information Economy” in the 1950s. Where the symbol of American engineering prowess in the 1960s was the Apollo space program, with its huge multibillion-dollar systems and phalanxes of engineers, the symbols of the new engineering ethic, says Bell, can be found on a scale orders of magnitude smaller: the silicon chip and the strand of DNA. Instead of engineering industrial mass, the focus is on engineering molecules and information.


Nowhere is this better illustrated than the emerging cult of “nanotechnology”--a term coined by Stanford University scientist Eric Drexler to describe his vision of a new generation of micromolecular machinery. Where Archimedes once said, “Give me a lever long enough and I can move the Earth,” the nanotechnologists now say, “Give us a lever small enough, and we can move a single molecule.”

Using tools like the scanning tunneling microscope--an invention of the 1980s--Drexler and other nanotechnologists believe that they will be able to build machines and materials one molecule at a time. Drexler actually believes that nanotechnologists will be able to engineer machines that can be injected into the human bloodstream to clean out clogged arteries.

Running through this engineering ethic is the growing feeling that technology is silly putty--technologists can make it whatever they want it to be. Nothing has to be taken for granted. The only things that are immutable are the laws of physics--and, with genetic engineering, society is already re-engineering the laws of nature.

So the revolution isn’t just in the technologies themselves but in how people approach technology. This has led to a post-modern, more freewheeling notion of technology. Computer scientists now work with organic chemists to gain new insights into molecular structures. Electrical engineers now collaborate with neurophysiologists to design neural network silicon chips. Physicists work side by side with radiologists to map new ways to use magnetic resonance imaging scanners. The new technologies are dissolving disciplinary boundaries. Technologists feel unencumbered by limits. The questions have shifted from “How?” and “Why?” to “Why not?” and “How much?”

“Science and technology--and scientists and technologists--are now much more connected to the real world and not just the world of ideas,” says John Seely Brown, the director of Xerox’s Palo Alto Research Center, a pioneer in computer science and technology.

The real challenge all these technologies impose, scientists like Brown insist, is how best to “package” and “integrate” them into forms that people can meaningfully use. The emphasis is less on discovery than design. Instead of just creating technological “components” and “systems,” technologists increasingly talk about creating “environments” and “ecologies.” One of the most intriguing new domains of computer science is the “ecology of computation,” which explores the behavior of large computer networks. Computer scientists are scrutinizing these systems in the same way that ecologists study the rain forests and the greenhouse effect.


“Our conception of technology is still rooted in machines,” says Harvard’s Bell, “but you now hear scientists and engineers using organic metaphors more and more frequently.”

In other words, technologists want to engineer tools and technologies that have different properties from ordinary machines. They want to design technologies that offer more than just functionality. To a very large extent, technologists want to become more like architects, designing environments that people live in and use. That means they want their technologies to offer an aesthetic.

Just as with automobile design, the question of style is assuming greater importance in technology. This can clearly be seen, asserts Massachusetts Institute of Technology Media Lab director Nicholas Negroponte, in the design of computer interfaces--such as Apple’s Macintosh computer, Steve Jobs’ NeXT workstation and the nation’s telecommunications networks.

“The objective of network design in the past was transparency; basically no detectable style at all,” says Robert Lucky, the executive director of research at AT&T Bell Labs. “We’re moving away from that now; networks have to have some intelligence associated with them, but their styles are rather abstract right now.”

Even the nodes of the network can exhibit a style. For example, says Lucky, consider the answering machine. “The message conveys something of the personality of the individual and so many people’s messages are outright boring,” he notes. “People are now working on intelligent answering machines that can give customized messages. That raises basic design questions: Do we want computerized speech to sound perfectly real? Do I want my phone machine surrogate to project my style or its style?”

This ability to imbue technology with intelligence and style changes the rules of design. “The answering machine becomes a new genre,” asserts Xerox’s Brown, “like the play and the novel are genres. Over time, we will become more concerned about the genre of the technology than the technology itself. How the technology is constructed will almost seem to disappear.”


For example, says Brown, “There are now, in one form or another, at least a dozen computers inside your car. . . . What all that computation is in service of is keeping things simple for you and keeping you connected. Instead of distancing you from the world, the technology is making you feel more attached.”

The idea of using technology to enhance one’s feeling of being connected with the world is emerging as a key design ethic of this decade. “Our times have become very fragmented,” says AT&T’s Lucky, “and technologies spread your existence over time and space. They’re going in the direction that you want to be everywhere at once; you don’t want to miss anything,” which explains why cellular phones became a best-selling innovation of the late 1980s.

To counter this fragmentation, places like Xerox PARC, Bell Labs and MIT’s Media Lab are championing the idea of using computer-based “intelligence” to capture complexity and repackage it in more palatable forms. “There’s a new role for organizational technology from being primarily information processors to collaborative sense makers,” says Brown. Where people now use the fast-forward button on their VCRs to rapidly scan the news, says Brown, a VCR imbued with intelligence could restructure the news into a mini-documentary for you.

In other words, technologists assert that the elements of style now concern how to use complex technologies to recapture simplicity. For example, because programming VCRs has proven beyond the skill of most college-educated Americans, Panasonic now sells a model with a device that can scan in the desired program from your television guide and automatically record it. “I think people want the world to be a simpler place,” says AT&T’s Lucky. “We all yearn for simplification, but we’re all governed by the second law of thermodynamics.”

At the same time, technologists want to use technology as a medium to evoke and manage meaning. “These media amplify our ability to be reflective,” says Xerox’s Brown. “We’re going to be able to bootstrap understandings in a way we weren’t before. We’re moving much more from a world that worships analysis to one that worships synthesis.” For example, instead of using a spreadsheet to analyze a company’s numerical cash flow, the next generation of computer models will allow you to track the flow of money in a corporation with the same visual acuity as an angiogram lets you follow the flow of blood in the heart.

Indeed, one of the most powerful aspects of computer technology is that it enables disparate information--video, image, text, data, acoustics--to be integrated, synthesized and manipulated. As microprocessors and new generations of software and silicon chips spill out over the next decade, the technology will increasingly perform the synthesizing. In the 1980s, technologists programmed computers to perform tasks; in the 1990s, the computers will begin to program themselves and communicate with each other about what they’ve learned. Instead of programming the VCR to record a certain show at a certain time, the VCR will “learn” what shows you want to have taped.


As these technologies continue to diffuse throughout society, people will be forced to place greater trust in their abilities. “A lot of this technology is virtual,” says AT&T’s Lucky. “You have less time to delve into the reality of things; you have to take so much on faith.”

Unfortunately, notes sociologist Christopher Lasch, “One important effect sophisticated technology has is to mystify; people grow too dependent on it. In some respects, technology could become the ‘magic of the 1990s’ that tends to make people feel lost in the world.”

It’s not that technology will split society into the H. G. Wells vision of the technical but brutish Morlocks and the gentle but stupid Eloi in “The Time Machine”--but rather that society’s increasing ability to engineer technology along whatever dimensions that it desires forces people to confront design decisions that they didn’t have to before.

Technology is far more widely distributed than it ever was. Microwaves, VCRs, Nintendo games and personal computers are products of the masses; the average household today in one form or another has more than three dozen silicon chips. Computer hackers are in virtually every high school. Ghetto high schools have computer wizards who can program synthesizers to generate rap music rhythms. Some biologists wonder if we won’t see teen-aged “bio-hackers” who tinker with recombinant DNA sometime soon. Indeed, crack is the most obvious example of a street-based pharmaceutical innovation.

In the 1980s, technology became increasingly democratized; in the 1990s, society will take more “votes” on what it wants its technologies to be and to do. In the 1980s, we already gained a sense of the difficult choices that medical technologies impose; in the 1990s, as genetic engineering becomes a mainstream technology, the alternatives become even more difficult. In the 1980s, the question was which computer to buy; the question of the 1990s will be what kind of computer environment does the organization want to live in.

In essence, the very ability to engineer new medicines, materials and systems--and do so with technologically enhanced style and intelligence--confers new responsibilities. In the 1980s, technology increasingly defined pop culture. In this new decade, pop culture is increasingly going to define technology. In the process, our new expectations of technology will lead to new expectations of ourselves.


“I think what is happening is irreversible,” says humanities futurist Hardison. “There’s no way to stop it.”

