Brain’s Use of Shortcuts Can Be a Route to Bias: Perception: The mind relies on stereotypes to make fast decisions. But in hiring, that can lead to discrimination.

By K.C. COLE, TIMES SCIENCE WRITER

Affirmative action stirs up powerful emotions in both supporters and opponents. But while both sides battle for the hearts of voters, psychologists say the real issues have more to do with the mechanisms of the mind.

Human brains are finely tuned, decision-making machines designed to make quick judgments on a wide variety of confusing events. How far away is that car in the distance? Is that form in the shadows a garbage can or a man with a gun? Is that round red thing a cherry or a marble?

In general, the brain uses past experience to jump to the “most likely” conclusion. Yet these same assumptions can lead people grossly astray.

“This acceptance by the brain of the most probable answer,” writes British perceptual psychologist Richard Gregory, makes it “difficult, perhaps sometimes impossible, to see very unusual objects.”

When “unusual objects” are women and minorities, it may be impossible to see them as qualified for a variety of jobs, psychologists say.

“Even if you have absolutely no prejudice, you are influenced by your expectations,” said Diane Halpern, professor of psychology at Cal State San Bernardino. “A small woman of color doesn’t look like a corporate executive. If you look at heads of corporations, they are tall, slender, white males. They are not fat. They are not in a wheelchair. They are not too old. Anything that doesn’t conform to the expectation is a misfit.”

“Similarity is a strong predictor of attraction,” said David Kravitz, psychologist at Florida International University. “So there is a natural human tendency to prefer and hire people like you.”

A growing number of behavioral studies point to patterns of perception that influence how people view everything from the moon to minority job candidates. These patterns, experts say, confirm that perception is an active process in which people color the world with their expectations. They do not so much believe what they see as see what they believe.

The ideal of a society free of prejudice may not be possible, experts say, simply because of the makeup of the human mind. Stereotypes are not only inevitable, but essential for survival. If people couldn’t make lightning-fast decisions on limited information, they would not be able to discriminate between friend and foe, shadow and object, far and near. To a very real extent, people have to judge every book by its cover. And once a judgment is made, virtually no amount of contrary evidence can turn it around.

People aren’t normally aware of the amount of guesswork that goes on in the brain because these perceptual tricks hit upon the right answer the vast majority of the time. Not only do perceptual processes work to ensure survival, they allow people to make music, play baseball, create art. In fact, one of the great puzzles of cognitive science is how a mind capable of dreaming up the music of Mozart and the equations of quantum mechanics can make so many egregious mistakes.

Social psychologists are finding that the occasional errors the mind makes reveal the hidden rules it uses to make decisions. For example, the brain weighs apparent size against distance: People don’t mistake a car in the distance for a toy because the brain knows through past experience that distant objects appear smaller, so it automatically compensates, perceiving the car at its true size.

But when the information is ambiguous, the brain often leaps to the wrong conclusion. For example, the moon appears to be much larger when it floats just above the horizon than when it shines overhead. The moon doesn’t change size, but the brain’s estimation of its distance does--in turn automatically changing its apparent size.
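
A rough way to see the arithmetic behind this compensation is the size-distance relationship perceptual psychologists describe: the same image on the retina implies very different objects depending on how far away the brain assumes it to be. The sketch below is only an illustration; the function name and the numbers are invented for this article, not drawn from any study.

```python
import math

def inferred_size(visual_angle_deg, assumed_distance_m):
    """Physical size implied by an image of a given visual angle,
    if the object really is at the assumed distance (size-distance invariance)."""
    return 2 * assumed_distance_m * math.tan(math.radians(visual_angle_deg) / 2)

# The same half-degree image on the retina implies very different objects:
print(round(inferred_size(0.5, 3), 2))    # ~0.03 m: a toy-sized object a few meters away
print(round(inferred_size(0.5, 500), 1))  # ~4.4 m: a full-sized car far down the road
```

Misjudge the assumed distance, as the brain does with a moon sitting on the horizon, and the inferred size comes out wrong in exactly the same automatic way.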

By studying how the mind can fool us, psychologists explore the nature of cognitive weak points. They have found that to a large extent, people see what they expect to see, and reject any information that would challenge their already established point of view. “It’s the one thing that everyone agrees on,” said psychologist Rachel Hare-Mustin, formerly of Harvard. “Unconscious prevailing ideologies are like sand at the picnic. They get into everything.”

Errors about everyday objects tend to provide immediate feedback, which makes people unlikely to repeat them. Even a slight mistake in estimating the size of a step can lead to a serious fall.

But errors about other people can more easily slip by unnoticed. “If you’re wrong about that car coming at you, it’s going to run you down,” said psychologist Jennifer Crocker of the State University of New York at Buffalo. “But if you’re wrong about whether someone is stupid, you don’t hire that person and you never find out how brilliant they are.”

The subversive nature of unconscious thought is revealed by this riddle:

A father and son are en route to a baseball game when their car stalls on the railroad tracks. The father can’t restart the car. An oncoming train hits the car. The father dies. An ambulance rushes the boy to a nearby hospital. In the emergency room, the surgeon takes one look and says: “I can’t operate on this child; he’s my son.”

As cognition researcher Douglas Hofstadter pointed out, even intelligent, broad-minded people go out of their way to invent bizarre scenarios--sometimes involving extraterrestrials--to solve the riddle. What prevents most people from seeing that the surgeon is the boy’s mother is the reliance of the brain on the “default assumption” that a surgeon is a man.

“A default assumption,” Hofstadter explained, “is what holds true in what you might say is the ‘simplest’ or ‘most likely’ case. But the critical thing is that they are made automatically, not as a result of consideration and elimination.”

Default assumptions are one of the strategies the brain uses to judge the most likely interpretation of an ambiguous situation. In effect, the brain calculates what psychologists call a “base rate”--the normal frequency of a certain event in a normal population.

Base rates have enormous survival value. A mail carrier who assumes that most pit bulls are dangerous is more likely to escape injury than a more open-minded colleague.
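
In numerical terms, a default assumption is simply a bet on the base rate, which is why it feels automatic and why it usually pays off. The figures below are invented purely to illustrate the logic; they are not drawn from any survey cited in this article.

```python
# Hypothetical base rate: suppose 80% of the surgeons a person has encountered
# (in life, on television, in print) have been men.
base_rate_male_surgeon = 0.80

# Betting the default on every new case pays off most of the time...
print(f"Default guess ('the surgeon is a man') is right {base_rate_male_surgeon:.0%} of the time")

# ...but it fails systematically: every miss is a case that breaks the pattern,
# like the mother in the riddle above.
print(f"It is wrong for the remaining {1 - base_rate_male_surgeon:.0%} -- every one of them a woman")
```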

Other peculiarities of social perception have been uncovered in a wide variety of controlled experiments, mostly with college students. For example, subjects judge attractive colleagues as smarter, kinder and happier than their unattractive (but otherwise similar) counterparts.

They judge people perceived to be powerful as taller than less powerful people, even when they are actually the same height. They judge people living in poverty as less intelligent than people in affluent neighborhoods.

In one experiment, college students watched a short film of a girl taking a math test and getting a numerical grade. When the girl was portrayed in a suburban neighborhood, viewers remembered her score as higher than when she was shown in a ghetto--even though both the girl and the score were the same in both cases.

The brain also grabs for the most readily available image at hand. This automatic response--which psychologists refer to by the tongue-tangling term “availability-mediated influence”--can be easily manipulated.

In one frequently cited series of experiments, three groups of people were introduced to one of two bogus prison guards--one sweet natured and humane, the other sadistic and brutish. All three groups were later asked to make inferences about “prison guards in general.”

The first group was told that whatever guard they met was typical of all prison guards. The second group was told nothing. The third group was told that the guard they met was not at all typical; in fact, they were specifically warned that any inferences they made from this one case were likely to be wrong.

Nonetheless, all three groups described “prison guards in general” as either kind or brutish, depending on which guard they met.

The experiment, described in the classic book, “Human Inference” by Lee Ross of Stanford and Richard Nisbett of the University of Michigan, presents what the authors describe as “a humbling picture of human . . . frailty.” When presented with a single vivid “available” example, the mind tends to bury all other evidence under the carpet of the unconscious.

This reliance on one vivid example sheds light on one of the most painful contradictions of the affirmative action debate. Many white males, studies show, are angry because they are convinced that less qualified women and minorities are taking their jobs.

Yet minorities and women still feel excluded--and apparently for good reason. The recently published report of the Glass Ceiling Commission, established by legislation introduced in 1990 by then-Senate minority leader Bob Dole, concluded that 95% of top positions are still occupied by white men, even though they constitute only 43% of the work force.

“So much of the public discourse on this is debate by anecdote,” said William Bielby, chairman of the sociology department at UC Santa Barbara. “We hear from so many students that they have a white friend from high school who couldn’t get into UCSB, but a black kid got in with no problem. And we know how many black kids are on campus. If all those anecdotes were true, then 15% of our students, rather than 3%, would be black.”

In the same way, Bielby said, it’s easier to hang onto stereotypes in settings where only one or two women, for example, are in management positions. When only one woman occupies the executive suite, she becomes a target for all expectations about women in general. “But when the proportion of women is 40% or 50%,” Bielby said, “(their colleagues) can see the extent to which the women differ among themselves and the men differ among themselves.”

Psychologist Faye Crosby of Smith College conducted an experiment with a group of Yale undergraduate men that vividly showed how inequality becomes imperceptible on a case-by-case basis. Patterns of discrimination that are easy to see in a broad context become invisible when seen in individual instances.

Crosby and her colleagues created bogus job descriptions of various men and women at a hypothetical company. The students were instructed to look for unfairness in the salaries. Unknown to them, the women’s salaries were rigged to be 80% of the salaries of comparable men.

When the students compared one man with one woman at a time, they did not see any unfairness. But when they saw all the salaries of all the men and all the women at the same time, they could easily spot the pattern.
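
A small simulation makes Crosby’s point concrete: when every woman is paid 80% of a comparable man but salaries also vary for ordinary reasons, any single pairing looks ambiguous, while the aggregate gap is plain. The sketch below is a loose, hypothetical reconstruction; the salary figures and the amount of random variation are invented, not taken from the actual experiment.

```python
import random

random.seed(1)

# Ten matched pairs across five job levels. Each salary also varies for
# ordinary reasons (seniority, performance, negotiation); the woman in each
# pair is then paid 80% of what a comparable man would get.
pairs = []
for base in [30_000, 45_000, 60_000, 80_000, 110_000] * 2:
    man = base * random.uniform(0.85, 1.15)
    woman = base * random.uniform(0.85, 1.15) * 0.80
    pairs.append((man, woman))

# Case by case: any single gap could plausibly be ordinary variation.
for man, woman in pairs[:3]:
    print(f"one pair: the woman earns {woman / man:.0%} of the man's salary")

# In aggregate, the gaps all lean the same way and the pattern is unmistakable.
men_total = sum(m for m, _ in pairs)
women_total = sum(w for _, w in pairs)
print(f"overall: women earn {women_total / men_total:.0%} of what comparable men earn")
```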

Crosby stresses that this inability to see unfairness on a case-by-case basis has nothing to do with sexism or bad attitudes. It has to do with how the mind works. “We’re not saying people are stupid. It’s just (a normal cognitive process) like optical illusions.”

However one’s perceptions are planted, they soon become almost impossible to root out.

In a process psychologists call “belief perseverance,” people do almost anything to cling to cherished notions. “If we were constantly changing the way we view the world, things would be too confusing,” Crocker said. So people tend to discount evidence that contradicts their “schema,” or theory about the world. “If you believe lawyers are slimy and you meet some who aren’t, you don’t revise your schema; you say, oh, that’s an exception.”

People also routinely change their memories, Halpern said, to fit their beliefs. If you think that successful people have to be aggressive, and you work with a successful person who is not aggressive, “you remember that person as more aggressive,” Halpern said. “What we remember depends very much on our biases and beliefs.”

These self-fulfilling prophecies, known to psychologists as “behavioral confirmation biases,” were dramatically illustrated by a series of experiments in which similar black and white job applicants were questioned by a white interviewer while researchers watched behind a one-way mirror. When the job applicants were black, interviewers sat farther back in their chairs, avoided eye contact, stumbled over their speech and posed fewer questions.

The next part of the test was designed to look at the behavior of the job applicants. This time, the researchers became the interviewers. For consistency, all the applicants were white. With half of the applicants, the researchers intentionally mimicked the behaviors that the interviewers in the first part of the experiment used on blacks (sitting back, stumbling over words and so on); with the other half, they behaved as the interviewers had with whites--that is, they sat forward in the chairs, maintained eye contact, spoke clearly and asked more questions.

Other researchers watching from behind one-way mirrors evaluated how the applicants seemed to perform during the interview. The result was that the white applicants, when treated as the black applicants had been, were rated less confident, less articulate and less qualified for the job.

What makes these behaviors hard to correct is that they’re completely unconscious; the brain jumps to conclusions in less than 100 milliseconds, “the time it takes to recognize your mother,” Hofstadter noted.

In study after study, “the most important finding is that (biases) operate unconsciously, even in people who don’t want them to,” said Anthony G. Greenwald, psychologist at the University of Washington. One of the greatest misconceptions that people have, he said, “is that wanting to be fair is enough to enable you to be fair--not recognizing the unconscious forces that influence your judgments.”

In the end, he says, the best approach to affirmative action may have nothing to do with putting people’s hearts in the right place. Instead, it should come from understanding what goes on in the brain.

“If you understand that your car tends to drive to the left because your wheels are out of line, you can correct it,” he said. Affirmative action, says Greenwald, is a way to compensate not only for past discrimination, but also for future discrimination “by persons who have no intent to discriminate.”

About This Series

In this series, The Times examines affirmative action, a policy that has left its imprint on the workplace and college campuses over the last 30 years. With some now questioning whether giving preferences to minorities has been fair to all, this series, which will appear periodically throughout 1995, will measure the policy’s impact on American institutions, ideas and attitudes.

* Previously: Why affirmative action became an issue in 1995, its legal underpinnings, its impact on presidential politics, the difficulties of defining a minority, the views of its beneficiaries and a Times poll showing ambivalent attitudes on the issue.

* Sunday: Before affirmative action, there was merit. Or was there? Informal systems of preferences have always molded American life, extending advantages to some and shutting out others.

* Today: How the same mechanisms of the mind involved in seeing optical illusions may be at work in racial stereotyping.

* Tuesday: Diversity programs aimed at defusing racial conflict in the workplace have survived both the recession and the anti-affirmative action backlash.

The Lessons of Illusions

Psychologists use illusions to catch the brain in the act of jumping to conclusions. Most of the time, these perceptual shortcuts work quite well, so we don’t notice them. But in unusual situations--such as considering women and minority applicants for jobs traditionally held by white males--the same tricks can lead to egregious mistakes. Most psychologists believe that the unconscious mechanisms people employ to make judgments about other people are very similar to those behind visual illusions.

TRUE MOON

* THE ILLUSION: The moon appears larger when it’s low on the horizon than when it’s high overhead, even though the moon doesn’t change size.

* HOW IT WORKS: When the moon sits low in the sky, the horizon serves as a reference point, making the moon look unnaturally large. (If you view the moon upside down, so that the horizon becomes the sky--thereby changing the apparent distance--the illusion disappears.)

* WHAT IT SHOWS: That the brain can jump to the wrong conclusions when information is ambiguous. Also, that knowing something is an illusion does not make the illusion go away.

SHAPE AND FORM

* THE ILLUSION: A white triangle appears to float in front of three black circles, even though no triangle exists.

* HOW IT WORKS: The brain constructs the triangle as the “most likely” solution to the figure of three pie-shaped wedges. People who don’t immediately see the triangle usually find it after someone points it out to them.

* WHAT IT SHOWS: That people can see something that doesn’t exist, especially if they go looking for it. Also that it’s much easier to see something familiar.

PARIS, PARIS

* THE ILLUSION: A sign appears to read “Paris in the spring,” but it actually has an extra “the.”

* HOW IT WORKS: Since people do not expect to see a double “the,” most do not perceive it.

* WHAT IT SHOWS: That prior expectation influences what people see.

Source: The Exploratorium, San Francisco

Researched by K.C. COLE / Los Angeles Times
