Scientists blame you for limited political viewpoints in your Facebook feed

A new study indicates that Facebook's filters limit a user's exposure to politically challenging material, but that our own choices do so even more.

People who use social networking sites like Facebook to “curate” the news they see wind up reading stories that largely fit their worldview and rarely challenge their beliefs or broaden their perspectives, according to a new study.

That’s not entirely because your Facebook news feed is pandering to you, the study authors found. Even more powerful are the choices we make as individuals within a social network. People who rely on a social networking site for their news click more often on articles that appear to cater to what they already know and think, and less often choose to view suggested content that would carry them out of their comfort zone.

When it comes to narrowing the breadth of an individual’s exposure to challenging views, the power of the “feed” -- the algorithms that sort and present material of presumed value to a Facebook user -- is great. But the power of individuals’ choices within that system is even greater, concludes the study published Thursday in the journal Science.

Researchers from Facebook and the University of Michigan’s School of Information set out to gauge how Americans’ growing reliance on social media to steer them to news and information might influence the political breadth of material to which they are exposed.

To do so, they analyzed the activity of more than 10 million users of the social networking site who publicly listed their political preference. The researchers looked not only at the news and information that online friends and Facebook's news-sorting algorithms put in front of those users, but also at which of those links the users actually clicked on.

The extent to which the nation’s Facebook users are immersed in a diverse “marketplace of ideas” has broad implications for the tenor of the nation’s civic debate, the authors suggest. If the social algorithms that size us up and make educated guesses about what we’d like are narrowing our exposure to challenging political views, for instance, that might help explain a political and civic environment that appears to be increasingly polarized.

What they found, observes Northeastern University’s David Lazer in a commentary published alongside the study, is that “the deliberative sky is not yet falling, but the skies are not completely clear either.”

Facebook’s news-feed ranking algorithm reduced the proportion of belief-challenging news users saw by an average of 1 percentage point, the study’s authors found. But individuals’ own choices about what to click on reduced the proportion of challenging content they saw by 4 percentage points.

“Our work suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals,” the authors wrote.

According to the Pew Research Center, 74% of American adults who are online use social networking sites. A 2010 survey by Pew found that Facebook users are more trusting and more politically engaged than those who use other sites, and that users of MySpace “are more likely to be open to opposing points of view.”

Social media news feeds, wrote the authors, do expose Facebook users to “at least some ideologically cross-cutting viewpoints.” On average, a Facebook user who publicly asserts his political alignment has one online friend who is his ideological opposite for every four who are ideologically aligned with him. That evidence should ease fears that Facebook users might be exclusively visiting ideologically friendly news sites, or might have opted out of hard news altogether, the researchers noted.

But that average obscures at least one interesting pattern: The authors found that “liberals tend to be connected to fewer friends who share conservative content than conservatives (who tend to be linked to more friends who share liberal content).”
