
Is Making Our Machines More Human Misguided?


It’s one of the hottest new products in telephony, and it’s straight out of the 1950s. Called Wildfire, it’s an automated assistant that screens your calls, takes messages, keeps track of your contact list and routes callers to your current phone number.

Do you want only a few important callers to be transferred to you when you’re in the car? No problem: Just give Wildfire a few spoken commands. With its sophisticated voice recognition and a fair amount of built-in intelligence, Wildfire takes us back to the days before voicemail, when secretaries answered the phones.

The result is extraordinarily useful; almost everyone who has tried Wildfire raves about the productivity boost it provides. The system has been around since 1994 and has achieved only modest commercial success so far, but it’s likely to take off in popularity over the next year as cellular companies begin to offer it as a value-added service. Yet I feel a bit queasy whenever I use the system.


My discomfort stems from how you interact with Wildfire. For one thing, Wildfire speaks in a perky female voice that smacks of misguided nostalgia. (Couldn’t they at least offer a male option?) But even more disturbing is that Wildfire pretends to be human. The interface is not a list of touch-tone choices read off in a deadpan, as we’re accustomed to. Instead, with its ability to understand spoken commands, its distinctive voice and details like yawning when it answers the phone late at night, Wildfire mimics a human assistant as closely as possible.

Of course, technologists have been talking about “intelligent agents” that are modeled after human characters for years. But Wildfire is the first system I’ve seen that is good enough that users routinely suspend their disbelief and refer to it as a person. By doing so, Wildfire confronts us with some big questions.

Like, what are the dangers of anthropomorphizing computer programs? And does the fact that we are used to interacting with people necessarily mean it’s the best way to interact with computers? Or is it a case of blind and ultimately pointless imitation, similar to the early days of plastic when manufacturers tried to make their products look like wood?

Proponents of anthropomorphic agents see the richness of human interaction as a resource we would be foolish to ignore; those human mannerisms can act as a kind of shorthand for ideas that would otherwise be hard to express. For example, says user Alex Knight, “Wildfire’s tone of voice lets the caller know that important calls will get through but that she won’t be fooled into transferring unimportant ones.”

Mimicking human traits not only allows computers to express subtleties, it also affects our relationship with technology. When asked why the company chose an anthropomorphic interface, Wildfire Vice President Nick d’Arbeloff points to the failure of AT&T Corp.’s 700 numbers, single phone numbers that follow users around.

“People eventually got tired of punching in all the digits necessary to update the system of their location,” d’Arbeloff said. But with Wildfire, he said, users build up a relationship with the system that makes the task seem less onerous and more enjoyable. In my experience, that’s true: You feel a sense of responsibility with Wildfire that doesn’t come into play when dealing with, say, voicemail.


But computers masquerading as people can lead to problems when users forget they are dealing with an illusion. For one thing, all the smoke and mirrors can cause people to overestimate the agent’s intelligence. This problem has long plagued interface designers. It’s possible, for example, to design databases that respond to basic English queries such as “How many widgets were sold last year?” But such a product invariably ends up disappointing users once they discover that the system doesn’t really “understand” English; it just knows some simple rules.
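To see how shallow those rules can be, consider a minimal sketch in Python of how such a database front end might work. Everything here, the table name, the columns and the single hard-coded pattern, is invented for illustration; it is not the code of any real product.

    import re

    # One hard-coded rule: "How many <product> were sold last year?"
    RULE = re.compile(r"how many (\w+) were sold last year\??$", re.IGNORECASE)

    def translate(question):
        """Map an English question to SQL by pattern matching; nothing is 'understood'."""
        match = RULE.match(question.strip())
        if match:
            # "last year" is baked into the rule, not derived from the question.
            return ("SELECT COUNT(*) FROM sales "
                    "WHERE product = '%s' AND year = 1995" % match.group(1))
        return None  # Any phrasing the rule doesn't anticipate simply fails.

    print(translate("How many widgets were sold last year?"))
    # SELECT COUNT(*) FROM sales WHERE product = 'widgets' AND year = 1995
    print(translate("What were last year's widget sales?"))
    # None: the same request, reworded, falls outside the rules

The user who asks the second question discovers the trick immediately, and realizes the system never understood the first one either.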

There is also a more insidious problem: Once we are accustomed to seeing machines act like people, some argue, we may start treating people more like machines. If you grow used to the fairly limited repartee of Wildfire, the argument goes, you may start restricting your conversations with real administrative assistants as well.

It’s not clear whether these problems with anthropomorphizing outweigh the advantages. But they do point to the need for a lot more research. If Wildfire is a harbinger of what’s to come, it’s critical that we figure out which human characteristics are useful to imitate and which are confusing or harmful. As Stewart Brand, creator of the Whole Earth Catalog, once said, if we are going to be as gods, we might as well get good at it.

* Steve G. Steinberg is an editor at Wired magazine. He can be contacted via the Internet at steve@wired.com.
