Psychologists ask: What makes some smart people so skeptical of science?
In Washington, D.C., revelers and protesters are marking the ascendance of a new president and the populist movement he says he has mobilized.
Some 1,600 miles away in San Antonio, thousands of psychologists from around the world are also marking the dawn of the Trump era by focusing their attention on the thought processes that prompt some people to resist and reject science. Matters for which there is a broad scientific consensus — including man-made climate change, the safety of childhood vaccines and Darwin’s theory of evolution — have been attacked as hoaxes and lies by senior members of the new administration.
Psychologists have come up with a name for this trend: the “anti-enlightenment movement.”
To better understand it, these professional observers of human behavior will draw from a recent election campaign in which fake news exploded, conspiracy theories flourished and derision was heaped on elites of all kinds.
“We were motivated by anxiety,” said social psychologist Matthew Hornsey, who organized a symposium on the issue for this weekend’s annual meeting of the Society for Personality and Social Psychology.
The popular rejection of scientific thinking — and sometimes of facts that are plainly evident — didn’t begin with the campaign that brought forth Donald Trump’s presidency, Hornsey and others said. But if anyone doubted its existence before, they could do so no longer.
“We’re asking, ‘What are these biases leading people to resist science? Where do they come from? How do they operate and what can be done about them?’” said University of Oregon social psychologist Troy H. Campbell, who will be speaking at the symposium.
Those questions won’t be easy to answer. Psychologists will have to delve into the guts of human decision-making. They will dissect the ways in which we discount information — however well evidenced — that conflicts with what we want to believe about ourselves and the ways things work. They will examine the role of our social networks, and the cognitive shortcuts we take to interpret scientific conclusions we don’t really understand. They will consider the role that declining trust plays in people’s decision to believe what they’re told.
“People don’t act like scientists, weighing up evidence in an even-handed way,” said Hornsey, a professor of psychology at the University of Queensland in Australia. “When someone wants to believe something — for whatever reason — then they act more like lawyers trying to prosecute what they already want to be true. And they selectively attend to and critique the evidence to be able to do that.”
Social psychologists like Hornsey and Campbell are science nerds, but with a curious twist: They have a peculiarly keen interest in people.
You may have taken notice when that quiet guy in accounting shared an improbably dark theory about Hillary Clinton. Or marveled at the well-educated woman in your Zumba class who spouts ideas about medicine you know to be discredited. But then you go on with your life.
Social psychologists, by contrast, ponder those dark impulses and irrational beliefs — and the behavior they spawn.
They diagram the tangle of missed cues and crossed wires that is woven every time a person makes decisions with incomplete or imperfect information, in a context where she must guess other people’s motives.
They come up with theories and test them on undergraduates, on strangers in the street, and on denizens of the internet.
Their science is young, and the universe they study — society — still seems chaotic, conforming to few recognized rules. But they are gleaning regularities. And where you shrug your shoulders at your neighbors’ confusing, erratic attitudes and behaviors, they have begun to discern some patterns.
So it came as no surprise to them that fake news would gain armies of adherents. They expected that, despite near-unanimity among scientists on the existence and causes of climate change, disbelievers would continue to take issue.
And when a disproportionate number of police shootings involved white cops and black victims, these psychologists offered an explanation for the phenomenon — implicit bias — that is more nuanced than outright racism.
In a paper to be presented at the symposium, Dan M. Kahan of Yale University asserts that ordinary people have always engaged in “motivated reasoning” when it comes to science — picking and choosing the facts that support their sense of who they are and the group to which they belong.
When lay people have felt that scientific conclusions superseded political or social affiliations, Kahan argues, it was because the cultural and social leaders they looked to shared a common sensibility about the public’s best interests.
When that breaks down and leaders wield facts like weapons in a struggle for cultural supremacy, Kahan says, the result is “a polluted science communication environment.”
That would seem to describe the current situation. And that breakdown has made social psychologists feel an urgent need to communicate their findings to scientists, leaders and ordinary people befuddled by a resurgence of doubt over matters they thought settled.
“We grew up in an era when it was just presumed that reason and evidence were the ways to understand important issues; not fear, vested interests, tradition or faith,” Hornsey said. “But the rise of climate skepticism and the anti-vaccination movement made us realize that these enlightenment values are under attack.”
And the stakes, he added, “are just too high to ignore. Anti-vaccination movements cost lives. Climate change skepticism slows the global response to the greatest social, economic and ecological threat of our time.”