Facebook Inc., which is cutting the amount of news in its news feed, will prioritize information from the publishers that remain on the social network based on how trustworthy they are, the company said.
Trustworthiness is based on a recent survey of U.S. Facebook users that gauged their familiarity with, and trust in, different sources of news. The results will inform how the company ranks news sources in the news feed, the stream of updates people see when they log in. News sources should also be “informative” and relevant to people’s local communities, the company said Friday. The move seems geared toward helping Facebook avoid perceptions of bias in selecting which news providers to highlight.
“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” Chief Executive Mark Zuckerberg wrote in a Facebook post Friday. “We could try to make that decision ourselves, but that’s not something we’re comfortable with.”
Publishers expressed concern about the news feed changes announced last week because many news sites have come to depend on traffic from Facebook. On Friday, Zuckerberg said he expects news to make up roughly 4% of the news feed, down from roughly 5% today. “This is a big change, but news will always be a critical way for people to start conversations on important topics,” he said.
Facebook, which has come under fire for the spread of fake news on its service, recently said it will reduce the amount of content from brands and other company pages — including those run by news outlets — in the news feed. That move refocuses the Menlo Park, Calif., company on content from users’ friends and family members, taking Facebook back to its roots, but it could mean less time spent on the site, Zuckerberg said last week.
The social network has had trouble managing its role as one of the world’s most powerful news distributors. Ahead of the 2016 U.S. presidential election, Facebook was criticized for bias because human curators of its “Trending Topics” section were allowed to pick links only from a set of sources Facebook designated as trusted, a list that excluded some conservative sites.
Since then, the company has sought to address the spread of fake news while trying to avoid being the arbiter of what is true or false. It works with third-party fact checkers who look at articles flagged by users as potentially false or misleading. Those efforts have had little impact on the overall problem.
Frier writes for Bloomberg.