By now, the world is well aware of how Russia used armies of bots and online commentary to manipulate information on social media and spread disinformation during the United States’ 2016 presidential campaign.
Less well known is how those methods have spread internationally, with dozens of countries, including the Philippines, Turkey and Sudan, using social media to suppress dissenting voices and promote an anti-democratic agenda.
Inspired by Russia’s pioneering work, authoritarian countries have turned to “fake news” and Twitter bots to create disinformation campaigns that undermine their internal enemies. The spread of such techniques was chronicled in a report released Tuesday by Freedom House, a Washington-based nonprofit that advocates for freedom and democracy.
Of the 65 countries surveyed for the report from June 2016 to May 2017, 30 were using information-manipulation and disinformation methods largely developed and tested by Russia and China over the last decade, according to the report. At least 18 countries’ elections, including the 2016 presidential campaign in the U.S., were affected by such manipulation during the research period.
In the Philippines, investigative reporters found that a “keyboard army” used fake news, fake accounts, bots and trolls to promote the presidential campaign of Rodrigo Duterte, the authoritarian firebrand who eventually won the election. Once in office, Duterte quickly drew criticism for his government’s harsh crackdown on drug dealers and drug abuse. Human rights advocates have accused him of extrajudicial killings.
Freedom House researchers found that after Duterte took office in July 2016, these keyboard soldiers were paid as much as $10 a day to keep using social media to attack journalists and other online voices who spoke out against Duterte’s government.
“It’s not a censorship technique in the traditional sense, but it is a method that is being used to undermine democracy,” said Sanja Kelly, the director of the project, called Freedom on the Net. Government-backed online commentators or bots are limiting free speech, and “we are seeing a chilling effect,” she said in an interview Monday.
“If you have an opposition figure speaking out against the government, and then they are attacked by pro-government commentators or bots online, in the future they might think twice about voicing their opinions,” she said.
“The practice has become significantly more widespread and technically sophisticated over the last few years,” the report said.
Freedom House’s annual report typically focuses on global trends in restrictions to internet freedom, including government restrictions on particular websites or applications, such as messaging services, as well as state-sponsored limitations on virtual private networks and mobile connectivity, prosecution for posting online oppositional views, and attacks on online citizens, or “netizens.”
This year’s report examined what it estimates to be 87% of the world’s internet users in 65 countries. Overall, internet freedom declined in 32 countries of those surveyed. China topped the list of most restricted countries for the third year in a row. (North Korea, which is even more restrictive than China, was not included in the survey.)
The 2017 report finds that only 23% of the world’s internet users have access that is free of government restrictions; 36% live where the internet is not free, and 28% where it is only partly free.
Despite the concerns from Freedom House and warnings from human rights groups about the effects of disinformation campaigns, the United States has fallen behind in developing ways to counter such methods of information manipulation, said Melissa Hooper of Human Rights First, a U.S.-based organization that advocates for the United States to champion human rights globally. Hooper has done substantial research on Kremlin-linked disinformation campaigns and testified before a congressional committee on the subject in September.
“The U.S. is way behind on understanding ... what disinformation is and how it operates in the mainstream media and social media,” Hooper said. “We are way behind in saying, ‘This is a problem, and we need to do something about it.’”
Both Hooper and Kelly pointed to Scandinavian and Baltic countries, many of which have started education programs and school curricula that include news literacy skills to teach participants how to identify false reports as well as reliable sources.
Ironically, many of the countries implementing such programs have imported expertise and methods developed by U.S. experts, Hooper said.
Freedom House’s report recommends that governments strengthen regulations to ensure that political advertising is as transparent online as it is offline. Tech and social media companies, it adds, should develop comprehensive technical measures to restrict the proliferation of automated political bots. One suggestion is for Twitter to find a way to identify and label automated tweets from political bots, Kelly said.
“This is where policy makers need to focus,” Hooper said. “This is where real research needs to happen and hard decisions need to be made.”