Children accuse computers of cheating at Tic-Tac-Toe. Adults swear their PCs know exactly what they want them to do, but perversely refuse to do it. People scream at voice-mail systems. And in the event that computers don't totally traumatize their human users, it's common for them to be patted, praised and effusively blessed.
Of course, everyone knows that computers are just machines. But however much we deny it, new research shows we respond to them as we would to other people. That is the kernel of a simple but controversial theory that may soon crack open the alienating world of high-tech: make computers more human.
Microsoft chairman Bill Gates, who made his fortune selling frustrating computer software to millions of people, is an unlikely convert to this philosophy. But last week he renounced the past and introduced Bob, one of the first software products to use a "social interface" aimed at making a computer behave according to basic social norms.
Gone is the inscrutable "C-prompt" of the DOS operating system, condemned by the new school as "passive aggressive." Gone, too, are the graphic icons and mouse-controlled menus of the Apple Macintosh computer or Microsoft's Windows software. Bob brings, instead, a menagerie of animated characters, each with its own personality. They are devoted guides designed to make you and your computer best buddies.
Bob, a group of eight programs for automating household tasks, is only a primitive example of what eventually may be done to instill humanity in a machine. Taken to its logical extreme, the concept conjures up images of the personable female computers of "Star Trek" and other sci-fi fantasies in which talking to a computer is simple and stress-free.
"The coming revolution in computing is social," Gates declared in a speech at the Consumer Electronics Show in Las Vegas. "The C-prompt served its purpose, and today we can say that it is completely obsolete. We see the social interface as something that has immense depth, to make the machine fun and to make it really predictable."
The idea may seem obvious to some, but many in the industry consider it misguided or even ethically offensive.
"Why should a computer act like a person? A computer is not a person. A computer is a computer," says Don Norman, who is leading Apple's competing effort to create new types of software.
But proponents are making progress in convincing a skeptical technology elite.
"It was always like, 'Well, that's cute, but computers are serious machines,' " says Clifford Nass, a professor of communications at Stanford University. His work with colleague Byron Reeves--a novel blend of social psychology and computer science--laid the groundwork for Bob and may yet spur an industry-wide sea change in the way software is made.
Gates is not exactly a touchy-feely kind of guy, and he surprised even Microsoft insiders with his strong public support of the new paradigm, vowing to use it in future products ranging from spreadsheets to interactive television software.
Microsoft's interest in a rather ethereal set of ideas--and that of other companies including IBM, Compaq and US West--is inspired in part by the industry's new-found success in selling computers to consumers for use at home.
But transforming Joe Six-Pack into a happy home computer geek is harder than it sounds. It requires not only high-powered hardware that handles video and sound, but also software that is far easier to use--friendlier, if you will.
Bob, which helps with tasks such as scheduling, bill paying or letter writing, was born in part out of the horror with which Karen Fries, its chief developer, watched focus group reactions to another software product.
"It was awful to watch them," says Fries, a one-time psychology major. "You sit with someone and you ask them to write a letter, and they can't do it and they blame themselves. The idea that you would create a product that would hurt someone's self-esteem was really horrible to me."
Bob employs 14 animated characters that give users instructional tips. Microsoft says a manual, the bane of many a computer owner's existence, is not necessary. None is included with the product.
The guides, which include a yellow dog called Rover and a sewer rat called Scuzz, are meant to appeal to a broad spectrum of personality types. They were intensively tested to ensure compliance with the 200-odd "social rules" identified by Nass and Reeves.
The researchers based their work on a theory that social psychologists say predicts marital attraction and friendship patterns: People like people who are like themselves. So they created personality profiles for about 100 testers and developed the characters to match.
The Microsoft crew unanimously hated Hopper the Bunny, for example. But Hopper--a submissive female--was enormously popular among a group of older men who were insecure about computers.
Ruby the Parrot, who is "aggressive and unfriendly," was a common favorite among Microsoft employees, including Gates.
All of the characters conform to rules of American social behavior such as "Don't turn your back when speaking," "Emotions should be consistent with the situation" and "Comments should be cooperative and relevant."
A menu system like Windows never changes, says Nass. "It makes no suggestions. It doesn't produce relevant responses. It just sits there. If I did that in a conversation eventually you'd punch me."
Not everyone thinks humanizing computers is such a great idea. Many computer scientists argue that anthropomorphic software tends to be cutesy, patronizing and hard to use. Some warn against the personification of computers on ethical grounds. They raise the specter of HAL, the psychotic computer from the movie "2001: A Space Odyssey," and the dangers of equating humans and machines.
And experts on the study of media caution that the suspension of disbelief can be manipulated in dangerous ways. The wielding of such power by a company like Microsoft--which had a 3-D parrot singing the hit song "Everybody Wants to Rule the World" during Gates' demonstration last week--is unsettling to some.
If the goal is to empower users, critics say, engineers should try to make computers "smarter," not more social.
Moreover, writing algorithms that portray personality traits is difficult--and sometimes risky.
A screen that offers dozens of command choices under pull-down menus may be confusing, but it's not offensive. A talking olive named Martina--one of the proposed Bob characters rejected because of its association with alcohol--might be.
Still, the convergence of technological advances--notably voice recognition and 3-D graphics--and the pursuit of profit is prompting experimentation with Nass and Reeves' theories throughout the high-tech industry.
The result is that computer companies are hiring social scientists, psychologists and anthropologists to weigh in on software design. And computer scientists are becoming more sensitive to social concerns.
John Tyler, a scientist at IBM's personal operating systems programming center in Boca Raton, Fla., and a convert to the theory, says it can be tough to get through to his colleagues.
"We think this phenomenon is a pretty important part of being successful with products in the future," says Tyler. "But if you go down to the development lab and look out at all the computer science majors and people who have spent their careers developing operating systems, it's rather difficult to persuade them that social science is a very viable approach to solving problems."
The study of "human factors," as the field is dispassionately known, has a long history in the computer industry. Apple popularized the field in the early 1980s, using teams of psychologists in the early work on the Macintosh.
And in 1987, the firm came out with a videotape called "The Knowledge Navigator," introducing an intelligent guide called Phil, who acted as the user's conduit to the computer. Xerox PARC, which pioneered the graphical interface that inspired the Macintosh and many other computer breakthroughs, has long employed anthropologists and other social scientists.
What's different about the work of Nass and Reeves is that they look at human interaction with computers from a social perspective, often substituting computers for people in classic social psychology experiments.
"We do traditional human factors work--how legible is the display, how audible are the prompts in a voice messaging system, for example," says US West's Adam Marx, who discovered Nass and Reeves when he was in the firm's advanced research group. "But until recently we haven't looked at the social aspect--is this pleasant to use? Not 'Do you understand what a person is saying?' but 'How do you react to it?' "
Among other things, Nass and Reeves have found that when you have a computer ask people questions about its own performance, they'll be more tactful than when answering the same survey on a different PC.
They say that's because people extend the same courtesy to the machine as they would to other human beings.
In addition, they say, people apply gender stereotypes to voice-based technologies. Computers with female voices are perceived as better teachers of love and relationships; male voices are considered more credible on technical subjects.
The findings, Nass and Reeves say, have implications for much-hated voice-mail systems, where one voice could be used to mention the company name while another voice puts the user on hold.
"Of course when you ask them, virtually all the people in our experiments deny that any of this could be going on for them," says Reeves. "They say, 'I'm not stupid. I know this is only a machine. Why would I gender stereotype? Why would I get angry at it?' But they do. They behave in ways consistent with treating it as a person."
Not surprisingly, most of Bob's developers are major fans of Nass and Reeves, eagerly reciting their mantra: "Human interaction with computers is fundamentally social and natural."
They marvel at the professors' ability to solve problems by citing simple rules of social behavior. One of the animated characters they were testing for Bob had his back turned toward the user. Focus groups hated him.
"Nass and Reeves told us it was rude," says Microsoft's Fries. When the character was turned around, so were the test results.
Painstaking care was taken with Bob to develop characters that make users feel at ease.
Seattle resident Paula Smith, an early tester of the product, said that when Rover barked she thought he was telling her she'd done something wrong. The ruff-ruffing was immediately cut back. A robot-like character was rejected as having Nazi connotations when it saluted.
For some computer scientists, all of this is overkill--and potentially dangerous.
"There are certain social things that are going to have to be involved in the technology to make them practical, to make people feel at ease," says Tony Fernandes, manager of human experience at software maker Claris Inc. in Santa Clara.
"But where is it heading? Are we going to have to practice safe computing in the future? Whose social values are we going to be pushing on people? People in East Los Angeles and downtown Boston have a different way of doing things. I guess it kind of scares me," Fernandes said.
"The question is not 'Can we do some of this?' but 'Should we do some of this?' "
Nass and Reeves, who have been bowled over by the attention they have received since Gates' announcement last week, say there's no way around it.
Whether machines are given coherent personalities or not, people will react to them as though they are social beings.
"Everyone is expert at being human," says Nass. "Why not take advantage of it?"