
YouTube will start labeling videos that receive government funding

YouTube has had to address criticism that it hasn’t done enough to police its site for propaganda, conspiracy theories and harmful content. (Danny Moloshok / Associated Press)

In another bid to quell criticism that its platform is overrun with misinformation, YouTube said Friday that it would start labeling news broadcasters’ videos that receive at least some government or public funding.

The move comes a year after the Office of the Director of National Intelligence detailed how Russian state broadcaster RT racked up hundreds of millions of views on YouTube promoting Kremlin propaganda.

YouTube yanked RT from its list of premium channels marketed to advertisers in October amid growing congressional pressure. The Russian broadcaster, which has produced a wealth of reports critical of Hillary Clinton and promoted the viewpoints of figures such as Julian Assange, in 2013 became the first news organization to surpass 1 billion views on YouTube.


RT did not respond to a request for comment.

In addition to RT, state and public broadcasters such as PBS and New China TV will see notices directly below their videos, above even their titles, YouTube said.

Links to the broadcasters’ Wikipedia pages will also be included below their videos.

PBS said it was misleading for YouTube to include the broadcaster in the initiative, arguing that the label suggested the U.S. government had influence over its editorial content.

“PBS and its member stations receive a small percentage of funding from the federal government; the majority of funding comes from private donations,” the broadcaster said in an emailed statement. “More importantly, PBS is an independent, private, not-for-profit corporation, not a state broadcaster. YouTube’s proposed labeling could wrongly imply that the government has influence over PBS content, which is prohibited by statute.”

PBS said it was conducting discussions with YouTube to address its concerns.

Experts say it’s impossible to know whether such disclosures would have limited RT’s influence in the past. But they welcomed YouTube’s move as a way to improve media literacy.

“It’s a small but not insignificant step,” said Bret Schafer, an analyst at the German Marshall Fund’s Alliance for Securing Democracy, which tracks Russian influence networks over social media.

“The connection between RT and the content it publishes on YouTube has often been less than transparent,” he added. “This, in theory, would help solve that problem.”


YouTube, which is owned by Google’s parent company, Alphabet Inc., is resistant to legal oversight of its content. But it has made efforts to police its platform after a year in which the company was criticized for surfacing conspiracy theories, hoaxes and inappropriate content directed at children.

Last year, the company said, it tweaked its algorithm so that more established news sources surface in search results in the wake of breaking news. The change came after a slew of conspiracy theories spread on YouTube within moments of the Las Vegas mass shooting in October.

“News is an important and growing vertical for us and we want to be sure to get it right,” wrote Geoff Samek, senior product manager for YouTube News, in a blog post Friday.

Google, like Facebook and Twitter, is slowly coming to grips with its role in the Russian campaign to influence the 2016 presidential election. Forced to testify on Capitol Hill, the tech giants have since pledged to promote more trusted news sources and have disclosed more data on Russian-controlled accounts.

The three companies would much rather stay out of the business of editorial oversight. Exercising that kind of judgment could bring them closer to being labeled media companies rather than platforms, a critical distinction that largely absolves them of liability for the content and activity that appear on their products.

By promoting transparency measures instead, the firms can argue it’s up to their users to decide what to watch and read.


“The principle here is to provide more information to our users, and let our users make the judgment themselves, as opposed to us being in the business of providing any sort of editorial judgment on any of these things ourselves,” Neal Mohan, YouTube’s chief product officer, told the Wall Street Journal.

Facebook announced last month that it would let its users determine which news sources are trustworthy. The social network had previously employed curators to cherry-pick news for its users, a strategy that was abandoned after the company was accused of omitting conservative viewpoints.

In the eyes of Silicon Valley, editorial judgment could be more trouble than it’s worth. Deciding what counts as acceptable content is fraught with risk in today’s political climate, and harder still now that outlets that once would have been dismissed outright, such as Alex Jones’ Infowars, have gained mainstream notoriety.

But transparency alone won’t stop the spread of propaganda and misinformation given the complexity of policing platforms with billions of users accessible to almost anyone in the world.

“The nature of an open platform means we never know what trends or moments are going to arise next,” YouTube Chief Executive Susan Wojcicki wrote in a blog post Thursday addressing the rash of objectionable material on her platform.

Data compiled by the Alliance for Securing Democracy show Russia’s influence campaign remains active on social media.


The group says Russian-linked influence networks on Twitter continue to promote hashtags such as #releasethememo, a reference to the House Intelligence Committee’s controversial memo on the Russia inquiry disclosed to the public Friday.

david.pierson@latimes.com

Follow me @dhpierson on Twitter


UPDATES:

2:55 p.m.: This article was updated with a comment from PBS.

This article was originally published at 11:35 a.m.
