
Russia tried and failed to sow discord in America. Then it discovered social media


Russia has been trolling the United States for decades.

It bankrolled American authors who claimed Lee Harvey Oswald assassinated President Kennedy under the direction of the FBI and CIA; it planted articles arguing Martin Luther King Jr. was not radical enough; and it spread a conspiracy theory that the U.S. manufactured the AIDS virus.

None of these disinformation campaigns succeeded in undermining American stability, in part because the Soviets didn’t have access to what may be the world’s most powerful weapon for fomenting fear, outrage and unverified information: social media.

The indictments last week by special counsel Robert S. Mueller III against 13 Russians and three Russian companies accused of interfering in the 2016 presidential election laid bare the way America’s biggest tech platforms have altered the centuries-old game of spycraft and political warfare.


Russian operatives couldn’t have asked for better tools than Facebook and Twitter to spark conflict and deepen divisions among Americans, experts say. Never before could they fan propaganda with such ease and speed and needle the people most vulnerable to misinformation with such precision.

“They’re using the same playbook; it’s just a new medium,” said Clint Watts, a former FBI agent and a senior fellow at the Center for Cyber and Homeland Security at George Washington University. “Social media is where you do this stuff now. It wasn’t possible during the Cold War.”

At the root of the strategy are the algorithms social networks employ to encourage more engagement — the comments, likes and shares that generate advertising revenue for their makers.

The problem, researchers say, is that people online typically gravitate toward things that make them angry. Outrage is more stimulating to the brain, increasing the odds that users respond to news and posts that tick them off. The algorithms register this and serve up such content accordingly.

“Online platforms have profoundly changed the incentives of information sharing,” Yale psychologist M.J. Crockett wrote in a paper for Nature Human Behaviour. “Because they compete for our attention to generate advertising revenue, their algorithms promote content that is most likely to be shared, regardless of whether it benefits those who share it — or is even true.”
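Stripped of the branding, the incentive Crockett describes amounts to a ranking objective that optimizes for predicted engagement and nothing else. The sketch below is a minimal, hypothetical illustration in Python; the weights, field names and scoring formula are assumptions made for the sake of example, not any platform’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float      # model's estimate of likes if shown
    predicted_shares: float     # model's estimate of shares if shown
    predicted_comments: float   # model's estimate of comments if shown
    is_accurate: bool           # note: never consulted below

def engagement_score(post: Post) -> float:
    """Score a post purely by predicted engagement.
    Shares and comments are weighted more heavily than likes
    (hypothetical weights) because they keep users interacting longer."""
    return (1.0 * post.predicted_likes
            + 3.0 * post.predicted_shares
            + 2.0 * post.predicted_comments)

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order the feed by engagement alone; accuracy never enters the objective."""
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Measured policy analysis", 120, 10, 15, True),
        Post("Outrage-bait falsehood", 90, 400, 250, False),
    ])
    for post in feed:
        print(f"{engagement_score(post):7.1f}  {post.text}")
```

Because inflammatory falsehoods tend to draw more shares and comments, they win this kind of ranking even though nothing in the code singles out misinformation by design.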

Since the platforms insist they aren’t media companies, they’re under no legal obligation to verify what’s posted. That allows falsehoods to spread faster, not least because most people don’t actually read the links they share, according to a 2016 study by researchers at Columbia University and the French National Institute.


Social media companies argue that they help bring people together. Yet studies suggest anonymity and fake accounts are having a corrosive effect on discourse. People who would never dare shout someone down in public can do so freely from behind the safety of their screens. And access to information in real time — highlighted under “trending topics” or amplified with a hashtag — ensures there’s never a shortage of issues to shout about.

The result is a feedback loop in which social media algorithms reward the loudest and angriest voices — often on some of the nation’s most sensitive topics, be it gun control, abortion or race. Reasoned debate is made even more difficult because users are often siloed with like-minded people.

“It further inflames a topic or debate,” said Karen North, a social media expert who teaches at USC’s Annenberg School for Communication and Journalism. “And there’s no incentive to compromise.”

Nuance, on the other hand, is rarely rewarded. Facebook’s idea of expanding the range of emotions users can express was to add reaction emojis such as a heart and a frowning face alongside the obligatory “like” button. For Twitter, it meant doubling the length of a tweet to 280 characters.

That would have been fine for internet users in the early days of social media more than a decade ago, when tech companies had a better excuse to operate under the naive assumption that people would behave online the same way they do in the real world, said Jonathon Morgan, chief executive of New Knowledge, a company that tracks online disinformation.

“Social media was built around engagement that was very fast and almost like low-fidelity social contact,” Morgan said. “What’s changed over the years is that most people now get their information from these platforms, which were designed for frivolous interaction. There’s a disconnect when people look for substance where it doesn’t exist.”


It was in this environment that Russian operatives allegedly plied their trade, according to the indictment filed Friday.

They established hundreds of accounts posing as politically active Americans on Facebook, Instagram and Twitter, investigators allege. They parroted both sides of the political spectrum in an effort to heighten acrimony, and launched Facebook groups to ensnare more unwitting supporters, according to the indictment. The activism even spilled out into the real world after the operatives organized dueling rallies in New York for and against then-President-elect Trump, authorities say.

“They’ve been doing this stuff on their own population since the 1990s,” said Watts, the former FBI agent.

It wasn’t until the Arab Spring, Watts said, that Russia gained a greater appreciation for the power of social media. If these tools could help activists coordinate a revolt, it wouldn’t be hard to imagine what they could do in the hands of the state, he said.

The platforms, slow to publicly acknowledge the meddling, have since cooperated with authorities and contacted users who engaged with Russian trolls. They’ve vowed to disclose backers of political ads to prevent a repeat of the Russian campaign. Twitter has also deleted thousands of automated bots.

But experts expect the likes of Facebook and Twitter to continue to be targeted by Russian operatives as long as Washington refrains from taking punitive action against Moscow for its interference.


“There’s really no reason for Russia to stop trying to influence election outcomes through the use of social media,” said Kimberly Marten, a professor of political science at Barnard College, Columbia University. “There is no meaningful response to what Russia is trying to do, beyond attempting to punish the perpetrators.”

If misinformation continues to flood social media and technology companies fail to improve their moderation, the sole remedy may lie in media literacy, Marten said.

“The only way we can address the problem effectively overall is to improve our own elementary and high school educational systems, so that as many people as possible become critical readers and thinkers, able to call out any fake news they read on social media,” Marten said.

For now, it appears Russian influence campaigns aren’t missing a beat.

Such networks have directed their accounts to pile onto divisive issues like the clamor earlier this year to release a controversial memo by House Intelligence Committee Chairman Devin Nunes, according to the Alliance for Securing Democracy, a project of the nonpartisan German Marshall Fund think tank. More recently, Russian accounts have reportedly perpetuated a conspiracy theory that a Florida school shooting survivor is a paid actor.

In a sign that the tech platforms remain ill-equipped to deal with the onslaught, a YouTube video pushing that conspiracy theory was the top trending video on the platform at one point Wednesday.

Morgan of New Knowledge said the Russian interference campaign will inspire others to exploit social media as long as the platforms remain vulnerable.


“The solution available in the short term is to stop a particular behavior,” Morgan said. “But to stop it in a general way will require years of redesigning the platforms. By then, the adversaries will be one step ahead. They’ve opened a can of worms and we probably have to accept things will never be the same.”


david.pierson@latimes.com

Follow me @dhpierson on Twitter
