Facebook bans some QAnon groups and accounts, lets others stay

A person holding up a Q sign with an American flag pattern waits to enter a Trump campaign rally in Wilkes-Barre, Pa., in 2018.
(Matt Rourke / Associated Press)

Facebook Inc. says it will restrict the right-wing conspiracy movement QAnon and no longer recommend that users join groups supporting it, although the social media giant isn’t banning it outright.

Facebook said Wednesday that it is banning groups and accounts associated with QAnon and a variety of U.S. militia and anarchist groups that support violence. But the company will continue to allow people to post material that supports these groups — so long as they do not otherwise violate policies against hate speech, abuse and other provocations.

QAnon groups have flourished on Facebook in recent years. Twitter Inc. announced a similar crackdown last month.


The QAnon conspiracy theory centers on the baseless belief that President Trump is waging a secret campaign against enemies in the “deep state” and a child-sex trafficking ring run by satanic pedophiles and cannibals. For more than two years, followers have pored over tangled clues purportedly posted online by a high-ranking government official known only as “Q.”

The conspiracy theory emerged in a dark corner of the internet and has recently crept into mainstream politics. Trump has retweeted QAnon-promoting accounts, and followers of the theory flock to his rallies wearing clothes and hats with QAnon symbols and slogans.

Last week, Marjorie Taylor Greene, a House candidate who openly supports QAnon and has been criticized for a series of racist comments, won her Republican primary in Georgia. She’s part of a growing list of candidates who have expressed support for QAnon. Lauren Boebert, another such candidate, recently upset a five-term congressman in a Republican primary in Colorado.

Facebook said it will remove groups and accounts outright only if they discuss potential violence, including in veiled language.

“We will continue studying specific terminology and symbolism used by supporters to identify the language used by these groups and movements indicating violence and take action accordingly,” the company said.

Facebook will still restrict the material it doesn’t remove, initially by no longer recommending it. For instance, when people join a QAnon group, Facebook will not recommend similar groups to join. The company also says it won’t suggest QAnon references in searches or, in the near future, allow QAnon content in ads.

As a result of the policy changes, Facebook says it has removed more than 790 groups, 100 pages and 1,500 ads tied to QAnon on the Facebook platform and has blocked more than 300 hashtags across Facebook and Instagram. It says it has also identified 1,950 other groups, 440 pages and 10,000 Instagram accounts that remain on its platforms but face restrictions.


For militia organizations and those encouraging riots, including some that may represent themselves as part of the anti-fascist movement known as antifa, the company said it has removed more than 980 groups, 520 pages and 160 ads from Facebook.

Facebook said it is not banning QAnon outright because the movement does not meet the criteria necessary for the platform to designate it a “dangerous organization.” But the company is expanding that policy to address the movement because it has “demonstrated significant risks to public safety.”