
Science / Medicine: Peril for Pilots: Psychologists call it the “glass cockpit” syndrome, a computer information overload in which the flood of technical information, faulty communication and outside stress lead to judgment errors.

Squires is a medical writer for the Washington Post health section, from which this is reprinted.

The increasing use of sophisticated automation is threatening to overwhelm airline pilots, naval crews and others who deal with high technology in stressful situations.

Psychologists call it the “glass cockpit” syndrome--referring to the ubiquitous glass computer screens. It is a situation in which the combination of a flood of technical information, faulty communication among crew members and outside stress leads to major judgment errors that can cause accidents.

Testifying for the American Psychological Assn. recently before the House Armed Services Committee, experts said the growing use of complicated technological devices often isolates those who most need to communicate with each other.


The hearing examined the psychological factors that helped lead the crew of the cruiser Vincennes to mistake an Iranian airliner for an attacking F-14 fighter jet. The Vincennes shot down the Iranian airplane in July, killing all 290 people aboard. Although a Department of Defense investigation found cause to discipline only one of the crew members, questions remain about how such a gross error could have occurred.

But the panel of witnesses said the Vincennes incident was just a symptom of a larger problem facing society.

Research is badly needed to understand just how much automation to introduce--and when to introduce it--in situations where the ultimate control and responsibility must rest with human operators, said psychologist Richard Pew, manager of the experimental psychology department at BBN Systems and Technologies Corp. in Cambridge, Mass.

“Everywhere we look we see the increasing use of technology,” Pew said. “In those situations where the operator has to remain in control, I think that we have to be very careful about how much automation we add.”

The growing use of high-tech devices in the cockpit or on ships can have two seemingly contradictory effects. One is to lull crew members into a false sense of security. They “regard the computer’s recommendation as more authoritative than is warranted,” Pew said. “They tend to rely on the system and take a less active role in control.” Sometimes crews are so mesmerized by technological hardware that they slip into what University of Texas psychologist Robert Helmreich calls “automation complacency.”

The other is to fall victim to information overload and ignore the many bits of data pouring from myriad technical systems. In two recent airline crashes, Helmreich said, black box recordings showed that the crews talked about “how the systems sure were screwed up” but did not verify what was wrong. In both cases, the systems were working properly, but the crews failed to check the information and crashed.


The stress of combat or poor weather or machine failure only serves to compound the errors that can be made. Yet “most military personnel feel impervious to stress,” Helmreich said. “I am amazed at their lack of awareness of its effects.”

But many stress effects can be overcome even in combat--if people are conscious of their vulnerability. “When the whole team is aware of stress, they are more likely to avoid its effects,” Helmreich said. Had the Vincennes crew had more training in high-stress situations, for example, he said, “they might have stopped and examined the links in the chain.”

Helmreich, who was in the Navy during the Cuban missile crisis, noted that when “multiple people verify information and decisions” there is less chance of error. “I’m not saying they should stop and reread the manual,” Helmreich said, “just that they should have double-checked what was happening.”

Research shows, for example, that when people must switch from one task to another without completing the first job they are more likely to make errors. This kind of work pattern “puts heavy demands on memory and the ability to keep things straight,” Pew said.

It was precisely this type of situation, a DOD investigation found, that helped set the stage for the downing of the Iranian airliner by the Vincennes in July.

“Errors of the sort made by Vincennes personnel can be anticipated, and procedures to reduce their likelihood or their gravity can be instituted,” University of Michigan psychologist Richard E. Nisbett told the committee. “Many kinds of judgmental errors can be reduced by training.” Research shows, for example, that under stress “people may treat other people’s judgments as fact,” Nisbett said.


An investigation of the Vincennes incident found that at the time the airliner was shot down, the ship’s crew had been coping with multiple emergencies.

“Not only were Iranian gunboats shooting at them, but at the same time they had a major malfunction of one of their guns,” Helmreich said. A live round of ammunition had failed to fire from the gun. The crew had to scramble to make sure that the ammunition didn’t explode on the ship, and then the captain had to maneuver the ship quickly so that its remaining gun could fire on the gunboats. During that maneuver, the ship heeled 30 degrees.

“Everything that wasn’t bolted down flew through the air,” adding to the chaos, Helmreich said. Men in the Combat Information Center had to dodge debris just as they noticed an incoming plane--the airliner that was mistaken for an F-14.

According to the report of the commander-in-chief of the U.S. Central Command, during the final minute and 40 seconds of the Vincennes incident, the anti-air warfare officer told his captain that the aircraft had veered from the flight path into an attack profile and was rapidly descending at increasing speed directly toward the ship.

“The anti-air warfare (officer) made no attempt to confirm the reports on his own,” the commander-in-chief of the U.S. Central Command reported. “Quick reference to . . . the console directly in front of him would have immediately shown increasing, not decreasing, altitude.”

Instead, the report notes, this “experienced and highly qualified officer, despite all of his training, relied on the judgment of one or two second-class petty officers, buttressed by his own preconceived perception of the threat, and made an erroneous assessment to his commanding officer.”
