Trump’s coronavirus test sparks online misinformation spree

A QAnon adherent waits to enter a Trump campaign rally in Wilkes-Barre, Pa.
(Matt Rourke / Associated Press)

Adherents of QAnon, the vast conspiracy theory that baselessly claims a satanic cabal of high-profile liberals runs a global human trafficking operation, are used to scouring the headlines for news they can point to as evidence they’re onto something. Social media and communications companies are used to watching those claims spread across their platforms in real time.

As soon as President Trump announced he had tested positive for the coronavirus, both sprang into action.

QAnon believers distorted the news, falsely claiming that the president was only pretending to go into isolation as part of a grand plan to take down the alleged human trafficking cabal. Trump has said he does not know much about the QAnon phenomenon but has appeared to condone its supporters, saying they are people who “love America” and “like me very much.”

YouTube and Facebook both said they immediately began monitoring for misinformation related to the diagnoses after Trump announced his positive test and that of First Lady Melania Trump.

“Within minutes of their diagnosis being made public, our systems began surfacing authoritative news sources on our homepage, as well as in search results and watch next panels regarding the President and COVID-19,” YouTube spokesman Alex Joseph said in a statement.

A Facebook representative, who declined to be named because the situation is “rapidly evolving,” said in an email that the company is tracking the spread of conspiracy theories and will work to fact-check and label misleading content.

The company said it would also remove content that violates its policies, such as calls for death and claims that the election is being canceled or postponed.

Twitter did not outline new efforts to contain conspiracy theories around Trump’s diagnosis.

“Using a combination of technology and human review, our teams have taken steps to address coordinated attempts to spread harmful misinformation around COVID-19. This applies today too,” Twitter spokeswoman Liz Kelley said in a statement.

Taking action on content that breaks rules, including spam and content that expresses a desire for death, serious bodily harm or fatal disease, is part of ongoing work to protect public conversation, she said. Twitter has specifically said it would take action against accounts that express a wish for Trump’s death. Facebook allows users to express a wish for the death of a public figure as long as users don’t tag that individual.

Social media companies have long been under fire for allowing false information and discriminatory ideologies to spread on their platforms. In recent months, they’ve been under pressure to more comprehensively tackle white supremacist content as well as disinformation related to COVID-19 and the election.

Facebook has come up short in its attempts to contain QAnon content, which burst into the mainstream this year.

In August, Facebook said it had removed 790 QAnon groups and restricted an additional 1,950 groups related to the conspiracy theory. Since then, a QAnon Facebook group has added hundreds of new followers, and the company’s own algorithm has recommended groups discussing the theory to users, the New York Times found.

In July, Twitter said it was removing thousands of QAnon accounts, but many returned within weeks, according to the same New York Times investigation.

Reddit did not immediately respond to an inquiry about what types of Trump coronavirus conspiracy theories and other harmful activity have circulated on the platform.
