
Bias is hard to spot in ourselves


Being frank and open doesn’t help; neither do religious convictions nor a lifetime of good decisions.

Even the most thoughtful, ethically cautious people can find their judgment questioned, their reputations at risk over actions that others consider questionable, if not blatantly improper.

In December, top scientists at the National Institutes of Health, a federal research agency, acknowledged taking hundreds of thousands of dollars in consulting fees from drug companies with products being evaluated by the agency. In January, reports surfaced that Supreme Court Justice Antonin Scalia went on an exclusive hunting trip with Vice President Dick Cheney, who’s involved in a case now before the court.


And earlier this month, Rep. W.J. “Billy” Tauzin (R-La.), chairman of the House Energy and Commerce Committee, became the subject of much Washington discussion when it became known he was being offered jobs by powerful motion picture and drug industry lobbies with issues before the committee.

Tauzin (who resigned as committee chairman effective today), Scalia and the NIH scientists said that, despite appearances, their judgment had not been compromised and that they were capable of making independent decisions uncolored by bias.

It’s a belief that many people share -- whether a manager weighing a promotion for her son-in-law or someone selling a car to a friend. Honest soul-searching, most of us believe, will ensure impartiality.

And we’re almost always wrong, say psychologists who study the effect of bias on decision-making.

Even when we think we’re compensating for it, bias is not something we can easily remove, or factor out of our decisions, they say. It operates unconsciously, often as a protection against the kind of self-doubt that can cause or deepen mood problems. We’re far better at spotting it in others than in ourselves.

“It’s not a matter of being corrupt, but of being human,” said Max Bazerman, a Harvard Business School professor who has studied the psychology behind accounting scandals after the collapse of Enron Corp. “It’s like asking a parent how smart their child is; no one expects a reasonable answer.”


Evidence for this kind of bias is easy to find. On surveys, most people rate themselves as significantly more decent, responsible and kind than their peers, as better drivers, harder workers and, of course, as less biased. In one experiment, people randomly assigned to act as independent advisors for “sellers” of an imaginary company rated it as worth about 30% more than independent advisors for the “buyers” did, based on the same information.

Other research has shown that a pharmaceutical researcher who receives funding from a drug company is less likely to write a negative cost-benefit analysis of a new drug than one who receives no company support. And anyone who has watched a pro basketball game knows that each sideline sees only the other team’s fouls, not its own.

Perhaps more important, this subjective judgment persists even when people are given a chance to correct it.

In one recent study, psychologists asked 91 Stanford University students to rate their personal qualities, such as friendliness and selfishness, as compared with their peers. The investigators then explained to the students how self-favoring bias skews answers to such surveys, and let them reevaluate their responses. Of the 79 students who had rated themselves as significantly better than average, only 19, or 24%, acknowledged that their answers had been biased. The other 76% said that their initial ratings were objective, or that they had been too modest.

In another experiment, Harvard students showed dramatic differences in how they viewed a “social sensitivity” test in which their scores were randomly selected as high or low.

Those who did well on the exam said it was useful and fair; those who didn’t do well thought otherwise. Even after learning that the scores were bogus, the students concluded that their own opinion of the test was significantly less biased than others’ opinions were.


The students then were asked how they arrived at their judgments: by self-questioning, by projecting themselves into the heads of peers, or by applying theories of human behavior.

“What we found is that they tended to go inside their own heads to look for evidence of bias in themselves, but judged others using general theories of behavior,” said Emily Pronin, an assistant professor of psychology and public affairs at Princeton University and lead author of the Stanford and Harvard studies.

Self-serving prejudices cannot be filtered from our experience because they form our experience in a fundamental way, researchers say. For instance, it’s psychologically protective for people who get a bad grade or evaluation to conclude that the test or the supervisor was not fair or perceptive enough. From then on, the memory of the disaster is colored by the explanation, the perception, the bias.

In searching our own experience, then, we run into what Pronin calls “introspection illusion,” the assumption that our own golden rule of objectivity works well for ourselves -- but others’ rules don’t work for them. The result is a blind spot that can lead otherwise careful people to exempt themselves from rules of behavior they would rigorously apply to others.

As Pronin’s work shows, about one in 10 people do seem to judge themselves objectively, or close to it, on some measures. And a couple of the other nine correct biased assessments when given the chance (and some may, for that matter, overcorrect).

But the same person who’s fairly objective when judging himself or herself may be highly biased when assessing others, or vice versa. Psychologists have no way to predict whether a person is likely to be unbiased, or when. And contrary to expectations, a successful career built on making carefully reasoned decisions may only reinforce the illusion of objectivity, says psychologist Steven Berglas of the UCLA Anderson School of Management, who specializes in advising highly successful people, including rock stars and athletes.


“When you’re rich, you’re powerful, it’s natural to think you’re smarter than other people -- you must be, that’s how you got where you are,” he said. “And along with it comes this syllogistic quality, ‘I know what’s good, I know what’s an ethical breach, I know when I can be corrupted and when I can’t, and I have the success to prove it.’ ”

That son-in-law may truly be the best person for the job. And a trip to Louisiana to shoot ducks may have nothing to do with legal issues and everything to do with ducks. But the insidious nature of bias means that, however good your instincts, you -- and your colleagues -- can never know for sure.
