YouTube bans some QAnon content, following Facebook and Twitter

Conspiracy theorists hold a protest July 31 in Hollywood over child trafficking, a favorite topic of the QAnon conspiracy theory.
(Kent Nishimura / Los Angeles Times)

YouTube will ban videos that promote QAnon and other conspiracy theories when they target specific people or groups, a move to crack down on potentially dangerous misinformation after criticism that the service helped such fringe movements expand.

The decision comes a week after Facebook Inc. said it would remove accounts associated with QAnon, a far-right movement that the FBI has reportedly labeled a domestic terrorism threat.

YouTube’s ban is an attempt to stamp out the conspiracy theory without hindering the massive volume of news and political commentary on its service. Rather than a blanket prohibition of QAnon videos or accounts, YouTube is expanding its hate and harassment policies to include conspiracies that “justify real-world violence,” the company said Thursday.

“Context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups, may stay up,” YouTube, a unit of Alphabet Inc.’s Google, wrote in a blog post.

Technology platforms have released a blitz of new rules to curb misinformation amid mounting momentum for movements such as QAnon. Twitter Inc. recently said it would make it harder for people to find tweets supporting QAnon, while Etsy Inc. removed QAnon-related merchandise from its online marketplace.

Pressure on these companies to act has been building for months. YouTube had already quietly instituted a policy similar to Twitter's. Starting last year, the service began to treat QAnon videos as "borderline content," meaning the clips are recommended and shown in search results less often. Views from recommendations on "prominent" QAnon videos have dropped 80% since then, the company said.

YouTube was a key driver of QAnon’s early popularity, said Angelo Carusone, president and chief executive of Media Matters for America, a nonprofit group that analyzes conservative misinformation.

A QAnon evangelist called PrayingMedic attracted almost 400,000 subscribers to his YouTube channel, for instance. And even after YouTube's borderline-content move last year, QAnon videos spread from the Google service to other sites. YouTube broadcasts about the conspiracy theory featured regularly in Facebook groups and pages until Facebook's recent ban, and YouTube QAnon clips continued to be shared on niche services such as Parler.

Still, Carusone said YouTube’s efforts to slow the spread of the conspiracy theory have been relatively effective in recent months.

The tech platforms and QAnon supporters will now probably enter a game of cat and mouse, in which users come up with new hashtags and different claims to evade automated filters. QAnon followers have proved particularly adept at this, Carusone said.

“There has never been a community where their participants are as adaptable,” he said.

A significant unanswered question is how well YouTube can identify videos designed to be less obvious upon initial inspection, Carusone added.

“It is very easy for them to identify explicitly identified QAnon content and accounts,” he said. “What they have not articulated is how well that can be applied to less explicit accounts.”