Op-Ed: Second-guessing may do more harm than good — especially among doctors

You always remember the first patient who died on your watch. Mine was an older man with a faulty heart — the main pump had failed and his heart was beating irregularly and far too fast. We tried to slow it down with medications, but later that night, it suddenly stopped beating completely.

In the months that followed, I kept questioning whether I should have done something differently. Whenever I had a similar case, I found myself second-guessing my clinical management. But it turns out that thinking twice may actually cause more harm than good.

In a working paper, Emory University researchers found that when a delivery goes badly, the doctor is more likely to switch to a different delivery method with the next patient, often unnecessarily and sometimes with worse results. For instance, if a C-section didn’t go well, the doctor may be more inclined to try a vaginal delivery with the next patient even when that isn’t the best choice.

The doctors overreacted to an emotional event: a delivery that went badly. But this phenomenon is not unique to medicine. Overreacting is a common hazard of everyday life.

A 2008 study by two Harvard University economists collected many examples showing that the title of their paper, “Overreaction to Fearsome Risks,” holds true for broader society.

For instance, sensational headlines about shark attacks on humans in Florida in 2001 caused a panic and led the state to prohibit shark-feeding expeditions. Yet shark attacks had actually fallen that year, and, according to the study, the ban was probably unnecessary given the minuscule risk of an attack.

A similar outcry occurred after the 2001 anthrax scare in which anonymous letters laced with deadly anthrax spores began arriving at media and congressional offices. Even though only five people died from inhaling the anthrax, the U.S. government responded to the public’s fear by devoting significant resources to anthrax-fighting drugs and vaccines.

In the face of a “fearsome risk,” people often exaggerate the benefits of preventive, risk-reducing measures.

Because doctors make so many decisions that have serious consequences, the fallout from second-guessing looms especially large for us. A bad outcome can mean the difference between life and death. Overreacting to that outcome can do even more harm.

A 2006 study found that if a patient suffered a bleed after being prescribed warfarin, a blood thinner that prevents strokes, the physician was about 20% less likely to prescribe it to subsequent patients. However, if a patient who was not on warfarin had a stroke, physicians were no more likely to prescribe warfarin to their other patients.

These findings reveal a telling asymmetry in how doctors weigh harm. In the blood-thinner study, doctors were more affected by the act of doing harm (prescribing a blood thinner that ended up hurting a patient) than by letting harm happen (not prescribing a blood thinner and having the patient suffer a stroke). Yet a stroke is often more permanent and damaging than a bleed.

The choices doctors made about prescribing blood thinners are reminiscent of the “trolley problem,” a well-known ethics thought experiment that poses a dilemma: A runaway trolley will kill five people unless you pull a lever that diverts it onto a track where it will kill one person instead. Most people say they would pull the lever. But if they must push one person in front of the trolley to save the five, they won’t do it.

The problem shows that humans are averse to feeling they directly caused harm; the harm itself bothers them less.

The onus shouldn’t necessarily be on physicians to make better decisions on their own, but on creating systems that help them do so. Any such effort must be “balanced against the possibility that increasing bureaucratic barriers to care or making physicians more risk-averse might actually harm patients even more,” said Manasvini Singh, an author of the Emory study on delivery decisions.

Several institutions are studying the problem, including the University of Pennsylvania’s Penn Medicine Nudge Unit, which literally designs ways to “nudge” physician behavior, with the goal of helping clinicians make the right decisions where they are most susceptible to making the wrong ones. Artificial intelligence and machine learning are also being applied to the problem.

Problems in clinical care occur when we stray too far from what we know works. For example, VIP patients may receive worse care because the doctors caring for them give in to their requests and end up ordering unnecessary tests and interventions.

When we overthink, we stop relying on judgment grounded in what we know or have experienced. Instead, we may inadvertently overanalyze and reach the wrong conclusion, such as attempting a vaginal delivery when the patient should have a C-section.

Humans are susceptible to emotional and often irrational thinking when processing information, adverse events and mistakes. As much as we want to keep an unfortunate event from recurring, in a medical setting or in the wider world, we need to remember that a worst-case outcome doesn’t necessarily mean we did anything wrong.

I have treated dozens of patients presenting with the same illness as that first patient, who died more than a year ago. Instead of second-guessing myself, I trusted my clinical instinct and stayed the course. Every one of those patients survived.

Abraar Karan is a medical doctor at Brigham and Women’s Hospital and a clinical fellow in medicine at Harvard Medical School. @AbraarKaran