Facebook was used in Myanmar to stoke ethnic violence. It could’ve done more to stop it, study says

Facebook said it didn't do enough to prevent its use "to foment division and incite offline violence" in Myanmar.
(Matt Rourke / Associated Press)

For most of Myanmar’s 20 million internet users, Facebook is effectively the internet.

The social network comes pre-loaded on mobile phones in the Southeast Asian nation, often providing people there with their first portal to the Web.

But for all the good that access did, Facebook also provided a means to exploit the deep-rooted ethnic and religious strife in Myanmar that saw the slaughter of tens of thousands of the country’s Rohingya minority, a human rights report commissioned by the company concluded.

The findings underscore how the world’s largest social network is often ill-equipped to fight rumor-mongering, hate speech and calls for mob violence, especially in unstable societies still developing digital literacy.

Contributing to the potency of misinformation in Myanmar, the study said, “A large population of internet users lacks basic understanding of how to use a browser, how to set up an email address and access an email account, and how to navigate and make judgments on online content.”

Facebook Inc. faces similar conditions in Sri Lanka, South Sudan and, to a lesser extent, the Philippines and India. Its messaging app WhatsApp has been used in India to spread hoaxes and rumors that led to murders.

The human rights report released Monday was prepared by the San Francisco-based nonprofit Business for Social Responsibility, which found Facebook had served as a conduit for fanning tensions in Myanmar, a country that spent decades under military dictatorship before a weak civilian government was introduced in 2010.

Instability in the country has led to the demonization of the Rohingya, a mostly Muslim minority group that until recently numbered about 1 million in Myanmar’s western Rakhine state, which borders Bangladesh. The group has long been the target of ultra-nationalists, who belong to Myanmar’s Buddhist majority.

In 2016, Myanmar’s military launched a crackdown on the Rohingya, committing atrocities and driving an estimated 700,000 of them out of the country and into squalid refugee camps in Bangladesh. Military leaders disseminated misinformation on Facebook to stir hatred of the Rohingya to help justify the campaign.

Civil society groups, researchers and journalists had already documented the spread of harmful content on Facebook in the nation as the terror inflicted on the Rohingya reached a crescendo in 2017.

Facebook was criticized for not employing Burmese speakers to flag and stop the torrent of posts that called for the mass murder of Rohingya.

This week’s report does not address those criticisms in detail. In a blog post, a Facebook official reiterated the company’s position that it fell short of expectations in handling the situation in Myanmar.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence,” said Alex Warofka, product policy manager for Facebook. “We agree that we can and should do more.”

The report acknowledges that some of its recommendations build on efforts Facebook is already making. It does not say the company has solved the problem.

Business for Social Responsibility recommends that Facebook develop a robust human rights policy, establish teams that understand local conditions better, and engage with local stakeholders to prevent the problems that marred the social network in Myanmar from occurring again. The changes are all the more urgent as the country prepares for elections in 2020.

Facebook said it was already acting on many of the recommendations, including hiring human rights specialists to sharpen its policies. It said it had nearly made good on its promise to hire 100 content moderators fluent in local languages in Myanmar, though it’s unclear whether that means just Burmese, the country’s main language.

There are now 99 such moderators, Warofka said. None of those reviewers is based in Myanmar, however. Facebook cited safety concerns.

Facebook has also been used as a vector for misinformation in the United States. After underestimating Russian meddling in the 2016 election, it has been working to crack down on political influence campaigns. On Monday, the Menlo Park, Calif., company announced that it recently blocked 115 accounts “that may be engaged in coordinated inauthentic behavior.” The accounts — 30 of them on Facebook and 85 on Instagram — were initially flagged by U.S. law enforcement, the company said.

Facebook did not say what the accounts were aiming to do, but said it felt it needed to disclose the findings because of Tuesday’s U.S. midterm elections. The company said the Facebook accounts posted mostly in French and Russian, while the Instagram accounts were mostly in English.

david.pierson@latimes.com | Twitter: @dhpierson


UPDATES:

2:10 p.m.: This article was updated with Facebook’s blocking of 115 accounts.

This article was originally published at 1:15 p.m.
