YouTube’s campaign against hateful and racist videos is claiming some unintended victims: researchers and advocates working to expose racist hatemongers.
A video published by the Southern Poverty Law Center was among those taken down after the company announced plans Wednesday to remove more videos and channels that advocate white supremacy.
The civil rights advocacy group received an email notification early Thursday that a video of journalist Max Blumenthal interviewing prominent British Holocaust denier David Irving was removed from the SPLC’s YouTube channel.
“We know that this might be disappointing, but it’s important to us that YouTube is a safe place for all. If content breaks our rules, we remove it,” YouTube said in the email.
A video channel tied to Cal State San Bernardino’s Center for the Study of Hate and Extremism also disappeared from YouTube, the center’s director, Brian Levin, said. YouTube declined to confirm whether the dozen or more academic videos were removed as part of the recent crackdown, but after the Los Angeles Times inquired, it said Thursday that it had reinstated the channel.
YouTube’s moves to start banning content promoting bigotry were “positive and well-intended,” but the execution has been botched, Levin said.
“Artificial intelligence has not been honed to the level where it can distinguish between content that is promoting the most odious bigotry, and that which is reporting and analyzing it,” he said.
That organizations working to raise awareness of hate speech may have been casualties of an effort to reduce the spread of hate speech was not surprising to Heidi Beirich, director of the SPLC’s Intelligence Project. That kind of ironic collateral damage has often resulted from tech companies’ efforts to police their platforms with software that relies on keywords and other ambiguous signals, backed up by human moderators.
Another anti-racist group, One People’s Project, had an informational video removed from its YouTube page after Wednesday’s policy change, according to a report by the Daily Beast. A high school history teacher and a South African blogger were among others affected while attempting to counter white supremacy.
“It indicates that they have not refined well enough the difference between someone who is exploring issues of racism and hatred and someone who’s promoting it,” Beirich said.
Other large internet platforms have fallen prey to the same types of errors. Trying to curb anti-gay posts, Facebook accidentally censored posts by LGBT users who use terms such as “queer.” Last year, some LGBTQ creators on YouTube raised concerns about their content being hidden, restricted to adult users or demonetized by the company, The Verge reported.
Jessica J. González, vice president of strategy at the media advocacy organization Free Press, said it’s important for tech companies to rely on human moderators rather than algorithms, to train staff in cultural competency, and to ensure that their appeals processes are simple, transparent and rapid.
González’s organization helped develop a set of suggested content moderation policies. She said the recommendations were informed by the experiences of people whose posts had been taken down on Twitter and Facebook for calling out racism.
“A policy that attempts to ban hateful content is only as effective as the mechanisms implemented to enforce such rules,” Color of Change President Rashad Robinson said in a statement. “If executed poorly, this policy could contribute to even more harm for black communities and other communities targeted by white supremacist ideologies.”
In the SPLC video, Blumenthal was exploring how people could believe the Holocaust was a hoax and how that belief contributes to anti-Semitism, Beirich said.
“YouTube saw someone speaking of Holocaust denial and assumed it was promotion. But it was the opposite — it was exposure and condemnation of Holocaust denial thinking.”
YouTube said it posts clear policies on what content is acceptable and removes videos violating those policies, but with the massive volume of videos on the site, sometimes the company makes the wrong call.
“When it’s brought to our attention that a video has been removed mistakenly, we act quickly to reinstate it,” YouTube said in a statement. “We also offer uploaders the ability to appeal removals and we will re-review the content.”
Beirich said the SPLC’s channel also hosts videos training law enforcement on how to identify lone-wolf domestic terrorists or Aryan prison gangs, or what to do if a hate group comes to town.
“Hopefully they don’t get caught up in this purge, but you could see how they might … if you have a ham-handed process,” she said.
Beirich said the SPLC plans to appeal the video’s removal.