Op-Ed: Prepping for what comes after the Kremlin’s beta attack on our elections

President Trump meets with Russian President Vladimir Putin at the G-20 Summit in Hamburg on July 7, 2017. (Evan Vucci / Associated Press)
The revelations of Russian-backed efforts to influence the 2016 election should come as no surprise to people familiar with history. The term dezinformatsiya — disinformation — was coined by Joseph Stalin, after all. And the Kremlin’s digital media intervention was just a beta test. We’ve yet to act to prevent another such campaign, and our nation remains vulnerable to an even stronger attack in the 2018 midterm elections.

In my work for Google and for Hillary Clinton’s presidential campaign, my objective was to harness media to communicate with the public within the framework of our democratic process. Since the 1920s, Kremlin-backed efforts have worked far outside that framework, exploiting vulnerabilities in our media infrastructure in an effort to stoke divisions in Western societies.

Operation INFEKTION, launched by the Soviets in the 1980s to exploit the AIDS epidemic, is just one example. It promulgated the false idea that the U.S. government manufactured HIV and spread it among developing nations and minority communities in the United States.

The Soviets’ methodology then (as now) was chillingly effective. They first planted stories in marginal media outlets outside the Soviet Union, then had more mainstream Soviet news outlets quickly pick them up. Eventually, the spurious reports appeared in more than 80 countries and 30 languages. The effects of this one campaign still linger: a 2005 study found that 50% of African Americans believed AIDS was man-made, and 15% considered it a form of genocide.

Operation INFEKTION succeeded at a time when Americans largely consumed news from just three network television stations and their local newspapers. Information-sharing was limited to neighbors talking over their backyard fences or sending newspaper clippings in the mail. Such campaigns today can make use of thousands of media outlets delivering around-the-clock content that can be amplified on instantaneous, global distribution platforms like Facebook, Twitter and Google.

This offers foreign adversaries a powerful new formula for an old strategy. They apply disinformation to the gaping and growing tears in our social fabric. Then they propagate the misleading information with an armada of automated Twitter accounts and fictitious Facebook profiles. They can spread anti-democratic lies and half-truths like a virus, faster and more persuasively than we could imagine even 10 years ago.

There is no simple way to prevent these attacks and at the same time protect the internet’s promise of open access to information and our basic right to free speech. But a handful of tactics, employed now, would help:

Disclose the source of political advertising. The most obvious way to inject transparency into political influence is to know who’s paying for political ads and how much they are spending. Paid digital advertising provides powerful targeting opportunities, allowing bad actors to use data to reach citizens most susceptible to disinformation efforts. Disclosure has been the norm for political advertising in other media for years.

Eliminate automated internet account activity. Automated accounts on platforms like Twitter are one of the most common vectors for propagating disinformation. The accounts appear to be operated by legitimate users, but they are actually operated by machines or by groups and individuals using fake identities. Technology platforms can’t prevent the creation of false identities in every situation, but they can and must shutter the automated accounts.

Create a bipartisan, independent commission to explore next-generation safeguards. Disinformation attacks by foreign adversaries will grow more sophisticated in coming years, yet we do not have a forward-looking body to assess and mitigate future threats.

Reestablish standards for truth in media. The current media and political environment has blurred the differences between lies and truth, opinion and verifiable information, allowing exploitation by enemies of democracy. Counterintelligence expert Clint Watts argued before the Senate Intelligence Committee in favor of a nonprofit, non-governmental “Consumer Reports”-style organization that would provide “nutrition labels” for information outlets. The goal would be to better identify fact versus fiction and reporting versus editorializing in media.

We know the Kremlin’s playbook. And we know there are vulnerabilities in our media infrastructure. In 2016, Clinton’s presidential bid was the focus of Russia’s efforts. But democracy’s enemies are not partisan. They will sow division wherever the opportunity presents itself.

While it is important that we uncover exactly what happened in 2016, Democratic leaders need to be equally focused on future threats. Republican leaders, for their part, must stop equating concern about Russian campaign intervention with a challenge to President Trump’s legitimacy and must unambiguously recognize that disinformation campaigns, combined with digital technology, are a potent threat to democracies around the world. Silicon Valley has to stop resisting regulation as a matter of principle and devise countermeasures to prevent the misuse of its platforms.

All sides have a stake in the long-term benefits of an internet and a media culture that engender public trust. We have seen Russia’s beta test. Now we must defend against the full rollout.

Jason Rosenbaum is a media consultant and founder of Seward Square Strategies. He is the former director of digital advertising at Hillary for America and the former director of elections and advocacy media at Google.
