
Op-Ed: How you can fight Russia’s plans to troll Americans during Campaign 2020

Twitter users could try to foil Russia’s campaign to divide the country before the November election by avoiding posting content that they have not fact-checked.
(Loic Venance / AFP/Getty Images)

Another presidential election is approaching, which means Russian election interference is back in the news. Maybe you’ve already made up your mind about your favorite candidate, and so you’re immune to the social media messaging being circulated by Russian trolls — right?

Not exactly. Russian trolls aren’t targeting only behaviors, like pulling a voting lever. They’re targeting beliefs, trying to stoke tribalism and polarization. Those who think they are immune to Russian tactics could become complacent and play right into Russia’s hands.

To understand what the Russians are up to, a bit of a history lesson is in order. During the Cold War, researchers at Rand Corp. began applying game theory to national security policy. The Soviet military, meanwhile, was developing its own theories, including one known as reflexive control.


Reflexive control is, in part, the intellectual basis for current Russian efforts to interfere with U.S. elections and, more broadly, with American democracy. If you understand reflexive control, you can better understand Russian strategy and devise ways to combat it.

The theory is mathematically dense, drawing on models from graph theory and abstract algebra. But the core idea is simple: The theory assumes that people live in a polarized world of cooperation versus conflict, and it describes how people make decisions based on whom they view as friends or enemies, and on how they think others view them. The Russians are trying to feed people information that distorts these views.
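To make the idea concrete, here is a minimal sketch, written in Python purely for illustration. It is not the formal mathematics of reflexive control; it only shows how feeding someone distorted signals about a neighbor can flip a decision from cooperation to conflict. Every name and number in it is a hypothetical assumption.

```python
# Toy sketch only: not the formal reflexive control model, just an
# illustration of the idea above. The function name, scores and signal
# values are all hypothetical assumptions.

def decide(perceived_alignment: float, threshold: float = 0.0) -> str:
    """Cooperate if the other person is perceived as more friend than enemy."""
    return "cooperate" if perceived_alignment > threshold else "conflict"

# You start out seeing your neighbor as slightly friendly (you agree on taxes).
perceived_alignment = 0.3
print(decide(perceived_alignment))   # cooperate

# Each polarizing meme nudges your perception toward "enemy" without
# changing anything about the neighbor's actual views.
for signal in (-0.2, -0.2, -0.3):
    perceived_alignment += signal

print(decide(perceived_alignment))   # conflict: consensus is now out of reach
```

The point of the sketch is that nothing about the neighbor changes; only your perception of the neighbor does, and that perception is the lever reflexive control tries to pull.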

The end goal of these efforts is to trigger emotional reactions and drive people to ideological extremes, making it nearly impossible to build consensus. The Russians also hope that those who are not driven to extreme positions will throw up their hands in frustration and check out. The result is political paralysis.

Here’s a notional example: Suppose you and a neighbor agree that your property taxes are too high but disagree on sensitive topics like race relations or immigration. You start seeing online memes that push extreme views on those topics. The memes evoke strong reactions, painting each issue as a battle between two extremes. You begin thinking of your neighbor in terms of this false dichotomy. The neighbor becomes one of “them” rather than a person with whom you had some common ground. After all, it’s difficult to agree on much of anything when you and your neighbor view each other as racist or anti-American.

The Russian objective is to create an illusion of deep-seated divisions between people like you and people who aren’t like you, so that you won’t be able to agree on anything.

The Russians don’t particularly care about the details of our social and political issues when they are trolling Americans. Their focus is to gin people up against one another, regardless of identity or political beliefs. That’s why Russia tries to infiltrate both Black Lives Matter groups and white nationalist groups online.


We don’t know whether these efforts are working, but we believe Russia is trying to divide U.S. society by seeding extremist views. And the current political environment, coupled with the nature of social media, makes combating these efforts difficult.

Short, shareable (and sensationalist) content is the currency of social networks — and it does not naturally promote nuanced conversations. Micro-targeting makes it easy to feed people customized content based on what they already like, which enables manufactured content to get a foothold with the right audience and go viral more quickly.

Everyone has the opportunity to fight Russian efforts to drive U.S. citizens to extremes before the November election. Tech firms have a responsibility to root out Russian social media content and ensure their users are who they claim to be. Political, religious and civic leaders could bring people together and help build consensus on divisive issues — like race relations, immigration, and economic inequality — that Russia may try to exploit.

Most importantly, users could be more careful about what information they share online. Don’t forward content from unknown sources. Don’t post content that you have not fact-checked. Be aware that even a humorous meme may have an underlying dark goal — to make you think less of another group.

Americans are less likely to have their emotions manipulated if they are aware that manipulation is the goal. Behind the veil of extreme positions are groups of people who may well have much in common. It’s important to recognize that disinformation efforts that target emotionally held beliefs could further erode the national discourse.

Marek N. Posard is a military sociologist, Jim Marrone is an associate economist and Todd Helmus is a senior behavioral scientist at the nonprofit, nonpartisan Rand Corp.
