Op-Ed: Google and Facebook aren’t fighting fake news with the right weapons

The Google logo as seen on March 23, 2010. (Virginia Mayo / Associated Press)

We know a lot about fake news. It’s an old problem. Academics have been studying it — and how to combat it — for decades. In 1925, Harper’s Magazine published “Fake News and the Public,” calling its spread via new communication technologies “a source of unprecedented danger.”

That danger has only increased. Some of the most shared “news stories” from the 2016 U.S. election — such as Hillary Clinton selling weapons to Islamic State or the pope endorsing Donald Trump for president — were simply made up.

Unfortunately, as a conference we recently convened at Harvard revealed, the solutions that Google, Facebook and other tech giants and media companies are pursuing are in many instances not the ones social scientists and computer scientists are convinced will work.

We know, for example, that the more you’re exposed to claims that aren’t true, the more likely you are to eventually accept them as true. As recent studies led by psychologist Gordon Pennycook, political scientist Adam Berinsky and others have shown, people tend over time to forget where or how they first learned of a news story. When they encounter it again, it is familiar from the prior exposure, and so they are more likely to accept it as true. It doesn’t matter whether the story was labeled as fake or unreliable from the start; repetition is what counts.

Reducing acceptance of fake news thus means making it less familiar. Editors, producers, distributors and aggregators need to stop repeating these stories, especially in their headlines. For example, a fact-check story about “birtherism” should lead by debunking the myth, not restating it. This flies in the face of a lot of traditional journalistic practice.

The online Washington Post regularly features “Fact Checker” headlines consisting of claims to be evaluated, with a “Pinocchio Test” appearing at the end of the accompanying story. The problem is that readers are more likely to notice and remember the claim than the conclusion.

Another thing we know is that shocking claims stick in your memory. A long-standing body of research shows that people are more likely to attend to and later recall a sensational or negative headline, even if a fact checker flags it as suspect. Fake news stories nearly always feature alarming claims designed to grab the attention of Web surfers. Fact checkers can’t compete — especially if their findings are writ small.

To persuade people that fake news is fake, the messenger is as important as the message. When it comes to correcting falsehoods, a fellow partisan is often more persuasive than a neutral third party. For instance, Trump is arguably the individual most closely associated with birtherism. But in September 2016, he publicly announced that Obama was a native-born American, “period.” Polling a few days later showed an 18-percentage-point drop among registered Republicans in acceptance of the birther myth. Countless debunking stories by fact checkers had far less impact.

The Internet platforms have perhaps the most important role in the fight against fake news. They need to move suspect news stories farther down the lists of items returned through search engines or social media feeds. The key to evaluating credibility, and story placement, is to focus not on individual items but on the cumulative stream of content from a given website. Evaluating individual stories is simply too slow to reliably stem their spread.
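
To make the source-level idea concrete, here is a minimal sketch in Python, purely our own illustration rather than any platform’s actual ranking system; the site names, track-record numbers and function names are all hypothetical.

```python
# Toy illustration of source-level ranking: each site carries a credibility
# score built from its cumulative track record (here, the share of its past
# stories that fact checkers rated false), and every new item inherits that
# score immediately, with no per-story review needed before ranking.

# Hypothetical track records: (stories checked, stories rated false).
SITE_HISTORY = {
    "established-paper.example": (500, 5),
    "mixed-record.example":      (200, 40),
    "fabrication-mill.example":  (120, 110),
}

def credibility(site: str) -> float:
    """Fraction of a site's checked stories that held up, with a
    neutral prior of 0.5 for sites with no history."""
    checked, rated_false = SITE_HISTORY.get(site, (0, 0))
    if checked == 0:
        return 0.5
    return 1.0 - rated_false / checked

def rank_feed(items):
    """Order feed items by engagement weighted by source credibility,
    so a viral story from a low-credibility site sinks in the list."""
    return sorted(items,
                  key=lambda it: it["engagement"] * credibility(it["site"]),
                  reverse=True)

feed = [
    {"headline": "Shocking claim!",    "site": "fabrication-mill.example",  "engagement": 9000},
    {"headline": "City budget passes", "site": "established-paper.example", "engagement": 1200},
]
for item in rank_feed(feed):
    print(f"{credibility(item['site']):.2f}  {item['headline']}")
```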

Google recently announced some promising steps in this direction, responding to criticism that its search algorithm had elevated to front-page status some stories featuring Holocaust denial and false information about the 2016 election. But more remains to be done. Holocaust denial is, after all, low-hanging fruit, relatively easy to flag. Yet even here Google’s initial efforts produced at best mixed results: the fix at first pushed the denial site down the rankings, then stopped working reliably, before Google ultimately removed the site from search results.

The platforms must also wrestle more seriously with how to resist manipulation. Recent research led by computer scientist Filippo Menczer highlights the synchronized push of fake news by millions of bots on social media, and his team has developed new ways of detecting them. In a white paper released last month, Facebook claims that its top priority is making sure accounts are owned by real people. Yet its visible efforts to date to purge fake accounts — most notably 30,000 in France ahead of that nation’s presidential election — seem small relative to the scale of the problem. By its own estimates, Facebook may have as many as 138 million duplicate or false accounts.
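
As a toy illustration of the kind of coordination signal such detectors rely on (our sketch, not Menczer’s actual method), one simple heuristic flags pairs of accounts that share the same link within seconds of one another; the time window and account names below are hypothetical.

```python
# Toy coordination signal: accounts that repeatedly post the same URL within
# a narrow time window look synchronized, one hallmark of bot networks.
# (Illustrative only; real detectors use many more behavioral features.)
from collections import defaultdict
from itertools import combinations

WINDOW_SECONDS = 10  # assumed threshold for "synchronized" posting

def synchronized_pairs(posts):
    """posts: list of (account, url, unix_timestamp) tuples.
    Returns pairs of accounts that shared the same URL within the window."""
    by_url = defaultdict(list)
    for account, url, ts in posts:
        by_url[url].append((account, ts))
    flagged = set()
    for shares in by_url.values():
        for (a1, t1), (a2, t2) in combinations(shares, 2):
            if a1 != a2 and abs(t1 - t2) <= WINDOW_SECONDS:
                flagged.add(tuple(sorted((a1, a2))))
    return flagged

posts = [
    ("bot_01",  "http://fake.example/story", 1000),
    ("bot_02",  "http://fake.example/story", 1003),
    ("human_7", "http://fake.example/story", 5400),
]
print(synchronized_pairs(posts))  # {('bot_01', 'bot_02')}
```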

Finally, the public must hold Facebook, Google and other platforms to account for their choices. It is almost impossible to assess how real or effective their anti-fake-news efforts are, because the platforms control the data necessary for such evaluations. Independent researchers must have access to these data in a way that protects user privacy but helps us all figure out what is or is not working in the fight against misinformation.

For all the talk about fake news, the truth is that we know a lot about why people read, believe and share things that aren’t true. Now we just need the big technology platforms and media companies to take the truth to heart.

Matthew A. Baum is the Marvin Kalb professor of global communication and professor of public policy at Harvard University’s John F. Kennedy School of Government. David Lazer is a distinguished professor of political science and computer and information science at Northeastern University and a visiting scholar at the Institute for Quantitative Social Science at Harvard.
