Op-Ed: Facebook drove QAnon’s mad growth and enhanced its power to poison elections

A supporter of the QAnon conspiracy theory at a rally for President Trump in 2018. The network has moved aggressively into the mainstream, with candidates for office promoting the conspiracy theory.

(Matt Rourke / Associated Press)

Five weeks out from the November election, virulent disinformation from domestic and foreign sources continues to fill internet platforms such as Facebook, Twitter and Google. The most extreme voices are being amplified and conspiracy theories and lies are outcompeting expertise and facts on the most critical issues this nation faces — from the legitimacy of the election to public health measures against the coronavirus.

The damage has been continuous since 2016 and there’s little likelihood that these platforms will stop the subversion of truth at this point.

Postmortems of the election months or years from now may tell us with real accuracy how these platforms affected American democracy. For now, examining how this dynamic works could help inoculate voters and limit the harm from disinformation.


Of all the many sources of falsehoods and conspiracies, the latest wave comes from QAnon, which journalists have described as a cult and a collective delusion, wholly detached from reality. QAnon claims there is a deep state conspiracy of Satan-worshipping Democrats and celebrities operating a pedophilia ring that only Donald Trump can stop. A mysterious character, Q, issues clues that followers decode and share.


QAnon began in 2017 on the fringe platform 4chan and has since grown to global scale, a growth made possible by social media platforms. The game designer Adrian Hon said in an interview with the New York Times that QAnon applies the principles of alternate reality video games — specifically the use of the real world as a platform for storytelling — to grow its conspiracy theory.

QAnon treats every event, no matter how far off script, as part of the design. This has allowed QAnon to absorb every conspiracy it encounters and launch many new ones. The recent subversion of the hashtag #SaveTheChildren has enabled QAnon to put a softer face on its movement, attracting millions of women to a far-right network once largely filled with disaffected men.

As QAnon has grown in scale, it has become an animating force of right-wing politics around the world. A poll by Daily Kos/Civiqs revealed that 33% of Republicans believe that QAnon is “mostly true,” while an additional 23% believe “some parts” of the theory are true. Media Matters identified 70 candidates for Congress this year who expressed some level of support for QAnon. At least one of these candidates is favored to win.

Most extreme conspiracies that begin on fringe sites never get any further. The success of QAnon required far more than the embrace of gaming architectures. Internet platforms such as Facebook, Instagram, Google, YouTube and Twitter have provided the algorithmic amplification to drive QAnon from the fringes into the mainstream.

This was not an accident. QAnon is huge and dangerous because these platforms empowered it for their own profits and power.

These platforms have become the most powerful businesses in our economy by converting human attention into revenue. Among the many problematic aspects of their business model are the algorithmic amplification of emotionally engaging content to maintain attention and the use of recommendation engines to steer behavior.

When Trump posts a message on Twitter or Facebook that implies support for QAnon, algorithms give it maximum reach because it grabs and holds attention. Trump’s posts appear relatively benign to nonbelievers, but to QAnon followers they are validation. The movement’s presence on Twitter and Facebook allows it to recruit, indoctrinate and influence its audiences.


When Facebook’s systems analyze the immense amounts of personal data the company collects, they identify people who might be curious about conspiracy theories and recommend Facebook Groups to join. Facebook did a study in 2018 that revealed that 64% of the time when a person joins an extremist Facebook Group, they do so because of a Facebook recommendation.

With QAnon, Facebook’s influence cannot be overstated. Internal Facebook documents shared with NBC journalist Brandy Zadrozny revealed that the largest QAnon pages and groups had more than 3 million followers and members. Applying Facebook’s own 64% finding, the company’s recommendation engine would be responsible for pushing roughly 2 million of those people toward QAnon. Even allowing for significant double counting, Facebook cannot escape responsibility for mainstreaming QAnon. According to NBC, Facebook may have waited nearly a year after an FBI warning about the QAnon threat before conducting an internal investigation of the site.

Under pressure from politicians, Twitter banned thousands of QAnon accounts. In August, Facebook removed thousands of QAnon pages and groups, but only ones that discuss potential violence. It did not ban thousands of other QAnon pages, including ones that hijacked #SaveTheChildren. The users on those pages are among the most engaged on the site. In neither case has the action of an internet platform impeded the growth of QAnon, even on that platform.

Twitter and Facebook claim to be proponents of free speech, but their sweet spot is extreme speech, the kind that spreads fast, engages users and harms democracy and public health. They have the power to eliminate most of the harm on their platforms, but not the interest or the will.

Facebook’s business model has been undermining democracy since at least 2016. Nothing short of eliminating features and policies such as algorithmic amplification of emotionally charged content and a recommendation engine that pushes users into extremist groups will protect democracy from further damage. The government has no power to force these changes immediately; Facebook would have to make them itself. It would be an act of good citizenship, but only a temporary fix.

Threats like QAnon will continue to thrive on internet platforms until policymakers force permanent changes to platforms’ business models. This will require a combination of safety, privacy and antitrust regulations.

It is bad enough to face the presidential election under the influence of a pandemic and economic contraction. But we are also struggling with unchecked assaults on reason and democracy. Thanks to amplification by internet platforms, QAnon is a key factor in both assaults.

Roger McNamee, a technology investor and early advisor to Facebook’s Mark Zuckerberg, is the author of “Zucked: Waking Up to the Facebook Catastrophe.”