As anti-vaccine comments began proliferating, Facebook froze

Protesters against vaccination and mask mandates demonstrate near the New Mexico state capitol in Santa Fe.
(Cedar Attanasio / Associated Press)

In March, as claims about the dangers and ineffectiveness of COVID-19 vaccines spun across social media and undermined attempts to stop the spread of the coronavirus, some Facebook employees thought they had found a way to help.

By subtly altering how vaccine-related posts were ranked in people’s newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw and instead surface posts from legitimate sources such as the World Health Organization.

“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote in March, responding to the internal memo about the study.

Instead, Facebook shelved some suggestions from the study. Other changes weren’t made until April.

When another Facebook researcher suggested disabling comments on vaccine posts in March until the platform could do a better job of tackling anti-vaccine messages lurking in them, that proposal was ignored.

Critics say Facebook was slow to act because it worried the changes might hurt the company’s profits.


“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate, an internet watchdog group. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”

In an emailed statement, Facebook said it has made “considerable progress” this year with downgrading vaccine misinformation in users’ feeds.

Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including the Associated Press. The Los Angeles Times is not a part of the consortium.

The trove of documents shows that in the midst of the COVID-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. They also reveal that rank-and-file employees regularly suggested solutions for countering anti-vaccine misinformation on the site, to no avail. The Wall Street Journal reported last month on some of Facebook’s efforts to deal with anti-vaccine comments.


The inaction raises questions about whether Facebook prioritized controversy and division over the health of its users.

“These people are selling fear and outrage,” said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook who is now a vocal critic. “It is not a fluke. It is a business model.”

Typically, Facebook ranks posts by engagement — the total number of likes, dislikes, comments and re-shares. That ranking scheme may work well for innocuous subjects like recipes, dog photos or the latest viral singalong. But Facebook’s own documents show that when it comes to divisive, contentious issues like vaccines, engagement-based ranking only emphasizes polarization, disagreement and doubt.

To study ways to reduce vaccine misinformation, Facebook researchers changed how posts were ranked for more than 6,000 users in the U.S., Mexico, Brazil and the Philippines. Instead of seeing posts about vaccines that were chosen based on their engagement, these users saw posts selected for their trustworthiness.


The results were striking: a nearly 12% decrease in content that made claims debunked by fact-checkers and an 8% increase in content from authoritative public health organizations such as the WHO or the U.S. Centers for Disease Control and Prevention.

Employees at the company reacted with exuberance, according to internal exchanges.

“Is there any reason we wouldn’t do this?” one Facebook employee wrote in response.

Facebook said it did implement many of the study’s findings — but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.

Internal Facebook documents acknowledge that the company often does a poor job of monitoring content because of its own linguistic shortcomings.

In a statement, company spokeswoman Dani Lever said the internal documents “don’t represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation.”

The company also said it took time to consider and implement the changes.

Yet the need to act urgently couldn’t have been clearer: At that time, states across the U.S. were rolling out vaccines to their most vulnerable — the elderly and sick. And public health officials were worried. Only 10% of the population had received their first dose of a COVID-19 vaccine, while one-third of Americans were thinking about skipping the shot entirely, according to a poll by the Associated Press-NORC Center for Public Affairs Research.

Despite this, Facebook employees acknowledged that they had “no idea” just how bad anti-vaccine sentiment was in the comments sections on Facebook posts. But company research in February found that as much as 60% of the comments on vaccine posts were anti-vaccine or vaccine-reluctant.

Even worse, company employees admitted they didn’t have a handle on catching those comments, or a policy in place to take them down.

“Our ability to detect [vaccine hesitancy] in comments is bad in English — and basically non-existent elsewhere,” another internal memo posted March 2 said.

Los Angeles resident Derek Beres, an author and fitness instructor, sees anti-vaccine content thrive in the comments every time he promotes immunizations on his accounts on Instagram, which is owned by Facebook. Last year, Beres began hosting a podcast after noticing that conspiracy theories about COVID-19 and vaccines were swirling on the social media feeds of health and wellness influencers.

Earlier this year, when Beres posted a picture of himself receiving the COVID-19 shot, some on social media told him he would likely drop dead in six months’ time.


“The comments section is a dumpster fire for so many people,” Beres said.

Some Facebook employees suggested disabling all commenting on vaccine posts while the company worked on a solution.

“Very interested in your proposal to remove ALL in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal,” one Facebook employee wrote March 2.

The suggestion went nowhere.


Instead, Facebook CEO Mark Zuckerberg announced March 15 that the company would start labeling posts about vaccines that described them as safe.

The move allowed Facebook to continue to get high engagement — and ultimately profit — off anti-vaccine comments, said Ahmed of the Center for Countering Digital Hate.

“Facebook has taken decisions which have led to people receiving misinformation which caused them to die,” Ahmed said. “At this point, there should be a murder investigation.”