Facebook institutes sweeping ban on QAnon. Will it work?

In this Aug. 2, 2018 photo, David Reinert holding a Q sign waits in line with others to enter a campaign rally with President Trump in Wilkes-Barre, Pa.
(AP Photo/Matt Rourke, File)
In its most sweeping content policy decision to date, Facebook implemented a comprehensive ban on QAnon-related pages, groups and Instagram accounts Tuesday. The action represents a sharp escalation against purveyors of the vast conspiracy theory that baselessly claims that a shadowy, Satan-worshipping cabal of Democrats and other elites operates a child sex-trafficking ring.

In August, the company said it would remove QAnon-related content, accounts and groups only when they discussed potential violence. That initial policy led the company to remove 790 QAnon groups and restrict an additional 1,950 related to the conspiracy theory.

Facebook seems to be moving rapidly to enforce the new ban. Three QAnon-focused groups identified by The Times that were online last week, including one called “ 🇺🇸🎖🎖POTUS|W•W•G•1•W•G•A[Q]🎖🎖🇺🇸,” were unavailable Tuesday after the announcement.

In a Tuesday blog post, Facebook said that although QAnon posts may not directly promote violence, they are often nonetheless linked to “different forms of real world harm.” The company cited a barrage of fake claims in recent weeks by QAnon followers that wildfires ravaging the West Coast were started by members of leftist anarchist groups such as antifa, diverting the attention of local officials from the important task of managing the fires and protecting residents.

The company also noted that it has begun directing users to credible child-safety resources when they search for hashtags co-opted by QAnon supporters, such as #SaveTheChildren, which refers to the false claim by conspiracy theorists that children have been kidnapped as part of the alleged human trafficking ring.

But the effort may be too little too late.

QAnon was a fringe movement when it sprouted in convoluted 4chan posts in 2017. But the movement has grown enormously, bubbling into the mainstream this year and animating right-wing politics.

While Reddit and YouTube took earlier action against QAnon, Twitter and Facebook did not make moves to shut down or place limits on QAnon-linked accounts until this summer.

A Reddit executive told The Atlantic last month that the company hadn’t made any focused effort to keep QAnon off the platform. Instead, channels where followers gathered were removed amid a broader crackdown on harassment and hate speech. Reddit removed one of the original channels related to the conspiracy, r/CBTS_stream, in March 2018 for inciting violence, and in September of that year the company banned a main forum for QAnon conspiracy theories that had about 70,000 subscribers.

YouTube spokesman Alex Joseph said in an email that the company has removed tens of thousands of QAnon-related videos and channels for violating the site’s hate and harassment policies since 2018. In early 2019, the company also aimed to reduce recommendations of videos referencing QAnon-related conspiracy theories.

Facebook has struggled, and failed, to enforce previous policies comprehensively. In its August move, Facebook aimed to limit the reach of QAnon pages and accounts, even if the platform wouldn't ban them outright. Still, the company's own algorithm continued to recommend groups discussing the theory to users, and those groups kept growing, adding hundreds of new members, a New York Times investigation found in September.

Since the August crackdown, QAnon groups have found ways to make their references to the conspiracy theory less explicit in order to escape sanctions by the company.

“Facebook helped the QAnon community grow exponentially — and refused to take appropriate action earlier this year when it would have mattered,” said Angelo Carusone, president of nonprofit watchdog organization Media Matters for America, in a statement.

Carusone said the company bungled its earlier crackdown, giving the QAnon community warning with its announcement and “ample time” to change group names and page descriptions.

The U.S. House of Representatives approved a bipartisan resolution last week condemning the conspiracy theory. Top Senate Democrats including U.S. Sen. Mark R. Warner (D-Va.) have called on Trump to disavow the movement, to no avail. Far from doing so, Trump has praised QAnon followers for supporting him.

“I’m pleased to see Facebook take action against this harmful and increasingly dangerous conspiracy theory and movement,” Warner said in a statement. “Ultimately the real test will be whether Facebook actually takes measures to enforce these new policies — we’ve seen in a myriad of other contexts, including with respect to right-wing militias like the Boogaloos, that Facebook has repeatedly failed to consistently enforce its existing policies.”