In humans, the distance between the brain and heart can be a foot or more. But when it comes to processing such raw emotions as fear and racial prejudice, new research finds the two organs are closer than we may want to believe.
A study has demonstrated that the snap judgments many of us make about people of color are exquisitely attuned to a mechanism that operates almost entirely beyond our conscious control: the beating of our hearts.
Physiologists know that during the moment when the heart ejects blood from its lower chambers, electrical signals race to one of the human brain's most primitive structures — the fear-processing, threat-responsive amygdala — and call it to high alert. In the brief interlude when the cardiac muscle relaxes before squeezing again, new electrical signals direct the amygdala, ever so briefly, to stand down.
That mind-body connection, of course, is powerful. It helps explain why a rush of anxiety or emotion makes us jumpy, and prepares us to flee or put up our dukes and fight when we're under attack.
Sometimes, however, the mind and body tricks that evolved to keep us safe prompt us to see danger where it doesn't exist.
In a series of experiments published Tuesday in the journal Nature Communications, British psychologists showed that judgments that inaccurately presume a black person to pose a threat are more common when the heart has signaled the amygdala to pay attention than when it is off-duty.
The researchers administered a series of tests for what's called "implicit bias" to 32 experimental subjects, nearly all of them white. Mounting evidence in this line of research suggests that even when people harbor no overt racism, they often assign stereotyped attributes — most of them negative — more readily to people with African features than to white people.
The experiments certainly found evidence of implicit bias. But the study also showed how deeply such judgments are rooted in the threat-assessment processes that help humans survive and prosper in a world with many dangers.
"We all know that to a very large extent, social and racial stereotypes seem to be embedded in our culture," said University of London psychologist Manos Tsakiris, the paper's senior author. "What we're showing with this study is they also become embodied in our physiology."
Implicit bias has become a hot-button political issue of late. During and in the days after their first debate, then-presidential candidates Hillary Clinton and Donald Trump squared off over the idea that implicit bias might be contributing to police shootings of black men. Trump, now the president-elect, ridiculed Clinton's suggestion that more training could help.
The researchers primed subjects by first flashing a photo of a black man's face, then showing a man holding an object in his hand. When that "priming" coincided with the heartbeat, subjects were more likely to assume the man in the photograph was holding a gun, even when the object was actually a wrench or a cellphone. When the priming came between heartbeats, subjects were more likely to distinguish a gun from an everyday object in the pictured person's hand.
In the average human, this ebb and flow of perceived danger happens 72 times per minute. But in a person, say, navigating a dark passageway in an unfamiliar city or responding to frantic calls for help, this cycle of "warning/all clear" speeds up.
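As a back-of-the-envelope illustration of that speed-up, the cycle times work out as follows (the 72 beats per minute is the resting figure cited above; the 120 bpm stressed rate is an assumed example, not a figure from the study):

```python
# At the cited resting rate of 72 beats per minute, each full
# "warning / all clear" cycle lasts:
resting_bpm = 72
resting_cycle_s = 60 / resting_bpm
print(f"{resting_cycle_s:.3f} s per cycle")  # → 0.833 s per cycle

# Under stress the heart speeds up; at a hypothetical 120 bpm,
# the amygdala receives its "pay attention" signal far more often:
stressed_bpm = 120
stressed_cycle_s = 60 / stressed_bpm
print(f"{stressed_cycle_s:.3f} s per cycle")  # → 0.500 s per cycle
```

In other words, a stressed heart compresses the interval between alert signals from roughly five-sixths of a second to half a second, leaving less time in the "all clear" phase between beats.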
That, the authors suggest, makes people under stress more prone to implicit bias, especially when they perceive a threat. When the researchers tried to "prime" implicit biases that were less about threat and more about cultural beliefs — that black people are better athletes than whites, for example — they found no greater bias on the heartbeat than during the period between beats.
In the kinds of real-life situations faced by an officer pursuing a suspect, "a faster, stronger heartbeat gives greater potential to tap into these signals, resulting in more stereotyped and racist behavior," suggested co-author Sarah N. Garfinkel.
While this can have "devastating real-life consequences," Garfinkel said, training that makes people more aware of their implicit biases does appear to make their threat assessments more accurate.