U.S. Medicine Reexamining Its Human Errors

When it comes to preventing mistakes, American medicine is just beginning to see the error of its ways.

Though it prides itself on huge advances in technology and skill, medicine has lagged far behind other life-and-death industries in its understanding of simple human error. Ask those who run nuclear reactors for a living or ferry passengers through the sky: Accidents will happen; the trick is to minimize them with built-in precautions.

But doctors have been expected to individually transcend their mortal flaws or, failing that, to accept the sting of blame. Moreover, the nation’s malpractice system, the only public means of assigning blame and compensating victims, has left many physicians terrified and patients traumatized. What it has not done is substantially stem the tide of medical errors.

With jarring recent estimates that tens of thousands of Americans a year die from medical mistakes, however, the health care profession is being forced to examine itself. The result is shaping up as a profound transformation in the way the profession treats the scourge of human error.

Many health care leaders are coming around to the view that mistakes are not, by and large, the fault of “bad” doctors; more frequently, they are committed by good--but imperfect--doctors working in a badly designed system.

Reform, in this view, must target the system and not errant individuals. Through technology, computerized reminders and routine double-checks, doctors can be kept from making mistakes in the first place rather than punished for them afterward.

“When you have a culture that stresses the only way to keep patients safe is to be perfect, that’s not a good system,” said Dr. James Bagian, director of the National Center for Patient Safety. “You can want to be perfect and strive to be perfect, but when you fall short, the question is, what do you do about it?”

Increasingly, medical leaders are looking to other high-risk fields for advice. Often, they look skyward, to aviation, which for decades has used built-in safeguards to make it tough for pilots to err.

Many of the leaders in the new medical safety movement are aviators themselves, or at least flying buffs. Bagian is a former astronaut. John Nance, a commercial airline pilot and consultant, is a member of the National Patient Safety Foundation at the American Medical Assn.

According to Nance, aviation learned more than 30 years ago that “when the question is ‘who’s wrong’ rather than ‘what’s wrong,’ you accomplish nothing.”

What the airline industry did was establish a reporting system that allowed pilots and others to flag errors and close calls confidentially, minimizing the risk to their careers. From this wealth of experience on possible calamities, it designed safety measures to make such disasters nearly impossible.

Nance gives a simple example: “If you don’t want people to put the landing gear in the ‘up’ position on the ground, then you disconnect their ability to retract it.”

In medicine, the same concept is now being applied to the operating room: To prevent an anesthesiologist from killing a patient by mistaking the nitrous oxide hose for the oxygen hose, you make the fittings to the anesthesia machine in different sizes for different gases.

Aviation is full of such “forcing functions” that steer pilots away from errors. Its manuals also contain checklists and standardized procedures that Nance says have sometimes been “paid for in human blood.”
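In software terms, a forcing function is simply an interlock: the system refuses the unsafe command outright rather than trusting the operator to remember. Here is a minimal sketch in Python of the landing-gear example Nance describes, with invented names standing in for any real avionics or hospital controls:

```python
# Minimal sketch of a "forcing function": the unsafe action is not merely
# discouraged, it is made impossible in the unsafe state.
# All names here are hypothetical illustrations.

class LandingGear:
    def __init__(self):
        self.on_ground = True   # weight-on-wheels sensor
        self.retracted = False

    def retract(self):
        # The interlock: the command is refused while the plane is on the
        # ground, so the error cannot be committed at all.
        if self.on_ground:
            raise RuntimeError("Interlock: cannot retract gear on the ground")
        self.retracted = True

gear = LandingGear()
try:
    gear.retract()            # on the ground -> refused
except RuntimeError as e:
    print(e)
```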

The flying industry doesn’t wait for real crises; it simulates them. During sometimes hair-raising rides in pseudo-cockpits, captains are taught to avoid panicky errors and trained to work well in teams.

By comparison, medicine’s attention to basic safety lags far behind, according to a report last November by the Institute of Medicine. That report, titled “To Err Is Human,” gave momentum to change like nothing before. It jolted the media, Congress, President Clinton and--perhaps most important--the medical industry itself into awareness and action.

The report estimated that as many as 98,000 people each year die from medical mistakes. Although the accuracy of that estimate has been challenged, the impact of the report was “huge--just huge,” said Dr. Gil Kuperman, director of clinical systems research at Partners HealthCare System, a nonprofit company based in Boston. “The numbers were staggering. It’s impossible to avoid this issue now.”

Yet there are formidable barriers to change, perhaps the greatest being the prevailing culture of medicine itself.

Doctors aren’t universally required to report their mistakes, and they lack incentives to come forward. Pilots, at least, have the ultimate incentive: avoiding what Randall R. Bovbjerg of the Urban Institute calls “capital punishment.” If they really blow it, they go down with the plane.

In medicine, many doctors say, fear of litigation often discourages the candor needed to pinpoint where and why errors occur--an essential prelude to designing prevention systems.

How to produce that candor is a matter of enormous debate.

Physicians groups, including the American Medical Assn., argue that any nationwide error reporting system should be a voluntary program in which the doctor has some legal immunity--much like a pilot--and feels free to convey the full truth without fear of retribution.

The Assn. of Trial Lawyers of America counters that there is already too much confidentiality in medicine. “Consumers can learn more information about the refrigerator they want to buy than they can about the doctor who is going to cut them open,” the association says.

Some advocates for reforms in medicine say the future safety of patients hinges on resolving these differences.

Faced with the specter of outside regulation, the health care industry is under pressure to reform itself. Physicians, hospital administrators and other experts who have pioneered systematic and often high-tech approaches to error prevention now find their counsel in great demand.

One of them is Dr. David Gaba, a Stanford University anesthesiologist and former private pilot whose offices at the Veterans Affairs Medical Center in Palo Alto are adorned with model planes and rockets and a giant poster of a cockpit.

Gaba has created his own version of a cockpit simulator. Several years ago, he began training young anesthesiologists, using a simulated human being that breathes, suffers allergic reactions and regularly cheats death. Medical residents struggle to keep him alive while Gaba and his team of trainers watch in a control room behind a one-way mirror.

The $200,000 simulator is so sophisticated that the dummy mimics human physiological responses to what the residents do, and instructors can insert additional challenges--such as failing vital signs--with a computer. Distractions from crabby, clueless surgeons--played by Gaba or his colleagues--are thrown in for good measure.

Afterward, pale and humbled residents emerge from the fake operating room to view a videotape of their performance and discuss how they might have called for help sooner, or checked the oxygen source earlier, or administered the epinephrine faster.

No one is berated; the atmosphere is friendly. But Gaba uses the opportunity to help the anesthesiologists learn from their omissions and missteps so they aren’t flummoxed when a real patient is on the table. Gaba deliberately challenges them with complex cases and bizarre complications, knowing that though such occurrences are rare, the consequences of bad decisions can be deadly.

“We are not shooting to make people immune from errors but to make them better prepared to avoid them and to detect them when they occur,” Gaba said.

Far from simply pointing out mistakes, Gaba encourages teamwork and leadership that too often can crumble in a crisis. The approach, again, is adapted from an aviation concept called “crew resource management.”

For an anesthesiologist, this means promptly advising a colleague when a patient’s breathing tube is in the wrong place, the same way a co-pilot tells a pilot when the plane is headed for a mountain.

Of course patients are far more complicated and far less predictable than planes, and they don’t come with how-to manuals. But the idea is that the same human failings that can bring down a plane--fatigue, frenzy or not flagging a problem as soon as it is perceived--can doom patients as well.

Computers Can Refuse to Allow Mistakes

Simulators, used in about 100 medical centers worldwide, are just one example of technological innovations that can help prevent medical errors. In the fast-paced, real world of medicine, the beeps, flashed warnings and alarms of machines can provide reminders that busy doctors and nurses need to keep from making critical mistakes.

A small but growing number of medical centers--about 5%--have computerized medical orders for treatment and medication, as opposed to relying on the scrawled instructions in bulky paper charts. The advantage, beyond efficiency, legibility and convenience, is that computers can be programmed to question, prod and even refuse certain demands.

For example, if a patient is allergic to a medication, or the doctor is ordering something that doesn’t mix with another drug, the computer will flash a warning on the screen. Most systems allow doctors to override such alerts, but only with acceptable explanations.
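The logic behind such an alert is straightforward: compare the new order against what is already known about the patient, and hold it until the physician documents a reason to proceed. A simplified sketch, with a made-up interaction table standing in for a real drug database:

```python
# Hypothetical sketch of a computerized order-entry check: warn on a known
# allergy or interaction, and accept an override only with a stated reason.
KNOWN_INTERACTIONS = {frozenset(["warfarin", "aspirin"])}   # illustrative only

def place_order(patient_allergies, current_meds, drug, override_reason=None):
    warnings = []
    if drug in patient_allergies:
        warnings.append(f"Patient has a documented allergy to {drug}")
    for med in current_meds:
        if frozenset([med, drug]) in KNOWN_INTERACTIONS:
            warnings.append(f"{drug} interacts with {med}")
    if warnings and not override_reason:
        # The order is held until the physician acknowledges the alert
        # and documents a reason for proceeding.
        return {"accepted": False, "warnings": warnings}
    return {"accepted": True, "warnings": warnings, "override": override_reason}

print(place_order({"penicillin"}, ["warfarin"], "aspirin"))
print(place_order({"penicillin"}, ["warfarin"], "aspirin",
                  override_reason="Low-dose aspirin indicated; INR monitored"))
```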

“Some physicians worry about . . . the [lost] ‘art of medicine,’ but no physician wants to write an order for a medication that a patient has a known allergy to,” says Dr. David Bates, who helped implement computerized ordering at Brigham and Women’s Hospital in Boston. “It’s like flying with a better cockpit. It’s so much safer. Would you want a pilot to fly without any instruments through the clouds?”

Medication errors are the most common sort, accounting for more than 7,000 deaths annually, according to the Institute of Medicine report.

In a hospital, where many thousands of orders are given daily, the wrong drugs can go to the wrong patients, or the right drugs in the wrong dosages. Physicians’ notoriously unreadable handwriting and pharmaceutical companies’ knack for giving entirely different drugs similar names don’t help.

Even the most reputable institutions are susceptible, as the well-publicized experience of two cancer patients at the Dana Farber Cancer Institute proved in 1994.

Both received massive overdoses of experimental chemotherapy; one, a health care reporter for the Boston Globe, died. The problem? A doctor wrote the drug’s total dose, to be given over four days, as though it should be given each day for four days. There was no backup system in place to catch the mistake.

Dana Farber has since instituted a computerized ordering system for chemotherapy that it hopes will prevent future tragedies. Now, the doctor types in the patient’s name, weight and height along with the drug’s dose and the number of days it is to be given. If the dosage exceeds precalculated guidelines, the computer says so in a big red box.
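In rough outline, a check of this kind converts the order into a daily dose per unit of body surface area and compares it with a preset ceiling. The sketch below is illustrative only; the drug name, numbers and limit are invented, and real protocols set their own guidelines:

```python
import math

# Hedged sketch of a precalculated dose check. The limit table and numbers
# are invented for illustration.
MAX_DAILY_MG_PER_M2 = {"drug_x": 1000}   # hypothetical ceiling

def bsa_m2(weight_kg, height_cm):
    # Mosteller formula for body surface area
    return math.sqrt(weight_kg * height_cm / 3600)

def check_order(drug, total_dose_mg, days, weight_kg, height_cm):
    daily_per_m2 = (total_dose_mg / days) / bsa_m2(weight_kg, height_cm)
    limit = MAX_DAILY_MG_PER_M2[drug]
    if daily_per_m2 > limit:
        return f"WARNING: {daily_per_m2:.0f} mg/m2/day exceeds guideline of {limit}"
    return "Within guidelines"

# A total course dose mistakenly entered as a one-day dose blows past the ceiling:
print(check_order("drug_x", total_dose_mg=6520, days=1, weight_kg=60, height_cm=165))
print(check_order("drug_x", total_dose_mg=6520, days=4, weight_kg=60, height_cm=165))
```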

The Veterans Affairs health care system, the largest such network in the country, has forged ahead with the same notion. Beyond computerizing all their orders and medical records--a monumental undertaking in itself--veterans hospitals are now bar-coding patients’ wristbands and drug packets to make sure the right drugs go to the right patients.

In the pharmacy at the VA medical center in Palo Alto, a sliding, rotating robot resembling a large camera scans the bar codes of medication packets and sucks the requested packets off the shelf and onto a hook. At the bedside, the drugs’ bar codes then are checked against the codes on patients’ wrists.
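The bedside check itself is a simple match: no packet is administered unless its code corresponds to an order on file for the scanned wristband. A bare-bones sketch, with invented identifiers:

```python
# Minimal sketch of bedside bar-code matching. Identifiers and orders
# here are invented for illustration.
ORDERS = {"patient-0042": {"NDC-1234": "metoprolol 25 mg"}}   # hypothetical

def verify_at_bedside(wristband_code, packet_code):
    patient_orders = ORDERS.get(wristband_code, {})
    if packet_code not in patient_orders:
        return f"STOP: {packet_code} is not ordered for {wristband_code}"
    return f"OK to administer {patient_orders[packet_code]}"

print(verify_at_bedside("patient-0042", "NDC-1234"))   # match
print(verify_at_bedside("patient-0042", "NDC-9999"))   # wrong drug -> stop
```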

Hospital Cuts Serious Medication Errors 55%

Such efforts appear to pay off mightily in preventing mistakes. Through computerizing its ordering system alone, Brigham and Women’s says it has reduced its serious medication errors by 55%.

Some doctors--particularly the younger set--aren’t waiting for wholesale reform. Many are now keeping key medical information and programs on hand-held computers, so they can rely less on their own admittedly overtaxed memories.

“You can’t keep it all in your brain” anymore, says Dr. Jeff Goldsmith, a 31-year-old UCLA family doctor who depends on a new program called ePocrates to remind him of drug side effects and interactions.

For an entire hospital or health system, however, the cost of error-prevention technology can be steep.

“This is a good way to go broke,” says Dr. Brent James, vice president for research at Intermountain Health Care, a chain of hospitals and clinics based in Salt Lake City, which began electronic record keeping a quarter-century ago.

While upfront costs of converting a 520-bed hospital to a computerized ordering system are about $5 million to $7 million, he says, much more goes to training staffers to adapt. And a lot of dollars are lost in the “organizational trauma” of switching over.

Of course, savings may be huge when errors plummet and efficiency increases. But health plans capture that windfall unless, as in Intermountain’s case, hospitals arrange beforehand to reap a share of the savings.

Even if a hospital makes a substantial investment in technology, that’s no panacea. Medical treatment cannot be entirely computerized, nor would patients want that. Sometimes humans put in the wrong data, and the computer, lacking the ability to truly think, doesn’t notice. Sometimes computers go down.

“People get more and more so they . . . do whatever the computer tells them,” said Dr. Michael McCoy, associate dean of information technology at UCLA and a former fighter pilot in Vietnam. “That gets to be a risk. Suppose it fails? . . . Computers can make huge errors because they don’t have sensibility. They can screw up by a magnitude or more. Humans don’t generally do that.”
