The Future Lies Beyond the Box

Reporters and columnists attempting to sum up developments in the high-tech field in 1998, or speculating on the big stories for this year, are all ticking off what one would expect: the boom in Internet commerce, the Microsoft antitrust trial, highflying Internet stocks, the resurgence of Apple Computer, the merger of America Online and Netscape Communications, the rise of open-source software such as the Linux operating system, and the year 2000 bug, among other notable subjects.

The interesting developments I saw in 1998 were mostly in research laboratories, and they pointed to a profound rethinking of how networks operate and how information circulates.

The buzzwords to watch this year and beyond are “embedded” or “ubiquitous” computing and “distributed” computing, terms now used by computer scientists to describe a reorienting of how we’ll use networking and information technologies in the future. This new paradigm, whose building blocks only began to appear in 1998, will be the next big thing in computing.

Embedded or ubiquitous computing refers first of all to the trend of building computational and networking capabilities into devices and services other than the familiar “screen-keyboard-box” of the personal computer. We’re seeing a huge shift among technology companies that are looking beyond the PC toward a proliferation of hand-held network devices such as 3Com’s PalmPilot, a new network-enabled cellular phone from Microsoft and Qualcomm, a promised palmtop system from Apple and electronic books from Rocketbook and Softbook, among other related products.

Compaq’s Western Research Laboratory in Palo Alto, which the company acquired when it bought Digital Equipment Corp., has even produced a working prototype of a hand-held computer that runs Linux.

At the Internet Society convention in Geneva in July, I frequently heard the slogan, “IP on everything, everything on IP.” IP stands for Internet protocol, the basic data standard that allows information to be “seen” or passed around on the Internet. With everything on IP and IP on everything, nearly all our common, everyday devices will be “smart” and “on” the Internet: cars, refrigerators, household appliances and light switches, manufacturing tools, TVs, cameras, sensors and even smart cards in our wallets or purses.
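
To make the slogan concrete, here’s a rough sketch, in Java, of what “everything on IP” can mean: an ordinary household device answering requests over a plain Internet socket. The LightSwitch class and its one-word protocol are my own invention for illustration; a real device would speak standardized protocols layered on IP.

    import java.io.PrintWriter;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Hypothetical "smart" light switch: a device simple enough to run on a
    // few kilobytes of code, yet a full citizen of the Internet because it
    // speaks IP like any other host.
    public class LightSwitch {
        private boolean on = false;

        public static void main(String[] args) throws Exception {
            LightSwitch sw = new LightSwitch();
            try (ServerSocket server = new ServerSocket(8080)) {
                while (true) {
                    try (Socket client = server.accept();
                         PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                        sw.on = !sw.on;                    // toggle on each connection
                        out.println(sw.on ? "ON" : "OFF"); // report the new state
                    }
                }
            }
        }
    }

Anything that can run this much code--a light switch, a camera, a sensor--can be addressed and queried from anywhere on the network.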

In order to get all these devices to talk to each other and to be identified on the network, we need a new standard of software that’s small, platform-independent and ubiquitous itself. Sun Microsystems’ solution is called Jini (https://java.sun.com/products/jini/), which was previewed for developers in 1998 and will be formally announced Jan. 25 in New York.

*

Jini is based on Java, the programming language whose “virtual machine” can be included with any operating system. Jini-enabled devices contain “agents,” small segments of software code that tell other Jini machines what they do, where they are and how they operate. A Jini-powered house, for example, would show up in a Web browser or on a PC desktop displaying its capabilities, such as the ability to turn lights on or off, inventory its refrigerator or cupboards, set the temperature, check phone messages, etc.
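
Here’s a rough idea of what such an agent might look like, sketched in Java. This is not the actual Jini class library, which was still in preview as of this writing; the DeviceAgent interface and ThermostatAgent class are hypothetical stand-ins for the self-describing agents Jini envisions.

    import java.io.Serializable;

    // Hypothetical stand-in for a Jini-style agent: a small, serializable
    // object a device publishes so other machines can learn what it does,
    // where it is and how to operate it.
    interface DeviceAgent extends Serializable {
        String whatIDo();      // human-readable capability description
        String whereIAm();     // location within the house
        void operate(String command);
    }

    // A thermostat advertising itself to the household network.
    class ThermostatAgent implements DeviceAgent {
        private int temperature = 68;

        public String whatIDo()  { return "set or report the temperature"; }
        public String whereIAm() { return "hallway, first floor"; }

        public void operate(String command) {
            if (command.startsWith("set ")) {
                temperature = Integer.parseInt(command.substring(4));
            }
            System.out.println("Temperature is now " + temperature + " degrees");
        }
    }

    public class AgentDemo {
        public static void main(String[] args) {
            DeviceAgent agent = new ThermostatAgent();
            System.out.println(agent.whatIDo() + " (" + agent.whereIAm() + ")");
            agent.operate("set 72");
        }
    }

The detail worth noticing is Serializable: the object describing the device is designed to travel over the network, so a machine that has never heard of a thermostat can still receive the code that knows how to operate one.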

This is how embedded computation and distributed computing intersect: Machine intelligence shifts from general-purpose computing, such as in a PC, to device-specific intelligence, and the network itself becomes smart as an aggregate of billions of devices performing specific tasks and sharing information.

The network architecture may change too. Instead of the familiar client-server model we use today, distributed computing allows a peer-to-peer architecture, which means there’s no longer any need for large, centralized computers running huge operating systems such as Windows NT. Jini resources can “see” one another without having to be switched through a server, and Jini agent software can run in under a megabyte of memory.
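
Here is a minimal sketch of what serverless discovery can look like, using nothing but Java’s standard networking classes: each peer announces itself on a multicast group, and every other peer on the local network hears it directly, with no central registry in between. The group address, port and message format are arbitrary choices for illustration.

    import java.net.DatagramPacket;
    import java.net.InetAddress;
    import java.net.MulticastSocket;

    // Peers "see" one another by multicasting a hello message;
    // no central server switches the traffic.
    public class PeerAnnounce {
        public static void main(String[] args) throws Exception {
            InetAddress group = InetAddress.getByName("230.0.0.1"); // arbitrary group
            try (MulticastSocket socket = new MulticastSocket(4446)) {
                socket.joinGroup(group);

                // Announce this peer to everyone listening on the group.
                byte[] hello = "HELLO from thermostat-agent".getBytes();
                socket.send(new DatagramPacket(hello, hello.length, group, 4446));

                // Wait for an announcement (on most systems a peer also
                // hears its own hello).
                byte[] buf = new byte[256];
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);
                System.out.println("Discovered peer: "
                    + new String(packet.getData(), 0, packet.getLength()));
            }
        }
    }

Run a copy on two machines on the same local network and each discovers the other with no server in the middle.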

Microsoft is aware of the threat this model poses to its core software products. It has its own version of the Jini approach, called the Millennium Project. Lucent Technologies has one called Inferno (https://www.lucent-inferno.com/), and General Magic, in Sunnyvale, Calif., has a product called Odyssey (https://www.generalmagic.com/technology/odyssey.html).

This is where the money and research are headed these days. Another interesting and related development is underway at Caltech in Pasadena: the Infospheres Project (https://www.infospheres.caltech.edu/), directed by computer science professor K. Mani Chandy.

The Infospheres Project, funded by the U.S. Air Force and the National Science Foundation, grew out of the tragedy of TWA Flight 800 near Long Island, N.Y., in 1996, Chandy says. After the airliner blew up, a large array of institutions and people needed to talk to one another--the FBI, hospitals, TWA, the Coast Guard, families of the victims, etc. The technological problem became how these people could communicate without knowing one another or knowing how to get in touch with one another in advance.

*

The thrust of the Infospheres Project has been to develop a new mode of information distribution, which Chandy calls “content-based” addressing, in contrast to today’s “address-based” model. In other words, people who need to communicate would be able to pass information to one another based on the content of their messages rather than on knowing the precise address of their correspondents.

To accomplish this task, the Infospheres Project also uses Java agents, software that organizes information into classes and then searches for matches on a network. A user who wants to buy a product online, for example, might use such an agent to search for sellers and have all potential sellers with a match for the product report back instead of requiring the user to search individual Web sites.
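
As a toy illustration of the principle--my own sketch in Java, with no connection to the actual Infospheres code--a message here carries no destination at all, and a router delivers it to whichever agents registered a test that the message’s content satisfies.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Predicate;

    // Content-based routing: a message carries no destination address;
    // it reaches whoever has declared an interest in its content.
    public class ContentRouter {
        // A subscription pairs a subscriber's name with a content test.
        record Subscription(String subscriber, Predicate<String> matches) {}

        private final List<Subscription> subscriptions = new ArrayList<>();

        void subscribe(String subscriber, Predicate<String> matches) {
            subscriptions.add(new Subscription(subscriber, matches));
        }

        void publish(String message) {
            // Deliver by content match, not by address.
            for (Subscription s : subscriptions) {
                if (s.matches().test(message)) {
                    System.out.println(s.subscriber() + " <- " + message);
                }
            }
        }

        public static void main(String[] args) {
            ContentRouter network = new ContentRouter();
            // A buyer's agent asks to hear from any seller offering a camera.
            network.subscribe("buyer-agent", m -> m.contains("camera"));

            network.publish("FOR SALE: used camera, excellent condition");
            network.publish("FOR SALE: bicycle"); // no match; delivered to no one
        }
    }

A real system would match on structured classes of information rather than raw strings, but the inversion is the same: the seller doesn’t need the buyer’s address, and the buyer doesn’t need the seller’s.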

Using the Infospheres Project software (a trial version will be available later this month), users will be able to find information not by using Internet addresses but by creating conditions, or parameters, for agents that will search the network for appropriate responses. In March, the project will launch an experimental model in the San Francisco Bay Area with both end users and vendors.

“I think this is going to be the metaphor for the future,” Chandy says. Most other computer scientists seem to agree--the phrase “post-PC era” is heard often among researchers now.

The challenge for this new approach will be how to make the ubiquitous and distributed models of computing reliable, safe, secure and transparent to users. Privacy will also be vastly more complicated--the present model of online privacy, which depends on the “informed consent” of consumers, will be far harder to sustain if we have millions of devices constantly communicating all around us without our direct control or knowledge. There is also the interesting question of whether future software agents will bear any of the legal rights or responsibilities we attach to individuals.

Right now, cyberspace is typically viewed by the public as something glimpsed through a screen, the “window” of a PC monitor. Even that level of abstraction, however, is about to expand beyond most people’s imaginations as the ideas incubating in research labs begin to migrate into our daily lives. As we might expect, given the rapid technological change we’ve seen in the last five years, things are about to get very different from what we know now.

*

Gary Chapman is director of the 21st Century Project at the University of Texas at Austin. His e-mail address is gary.chapman@mail.utexas.edu.
