Internet video giant YouTube has found itself drawn into a global drama being played out in violent Mideast protests over a 14-minute video trailer for “Innocence of Muslims,” raising questions about the website’s responsibilities as the Internet’s preeminent distributor of video.
The trailer has been blamed for inciting violence in Libya, Egypt and Yemen. Obama administration officials said Thursday that they have asked YouTube to review the video and determine whether it violates the site’s terms of service, according to people close to the situation but not authorized to comment.
Some media observers predict that the incident will prompt calls for Google Inc.’s YouTube to play a more active role in curating the billions of hours of videos found on its site. One prominent 1st Amendment lawyer even suggested that YouTube should seek a judge’s ruling about whether to remove potentially incendiary content.
Other digital media experts, however, cited the technical limitations of scouring the torrent of videos that are uploaded to the site every minute and making value judgments about those likely to incite anger, hate or murder.
“We keep making that mistake that technology is just neutral,” said Ken Doctor, a longtime newspaper industry analyst. “But it can be cynically used. There are forces of darkness in the world that will rapidly use them, as well as forces for good.”
The controversial video depicts the prophet Muhammad as a child molester, womanizer and killer. On Wednesday, YouTube blocked access to “Innocence of Muslims” in Libya and Egypt in the wake of the violence.
For now, the video remains on the site because YouTube has determined that it does not violate the community guidelines that prohibit sexually explicit content, graphic or gratuitous depictions of violence, hate speech or other “bad stuff,” such as animal abuse, drug abuse or underage drinking.
“We work hard to create a community everyone can enjoy and which also enables people to express different opinions,” a YouTube spokesman said in a statement. “This can be a challenge because what’s OK in one country can be offensive elsewhere. This video — which is widely available on the Web — is clearly within our guidelines and so will stay on YouTube.”
YouTube declined to comment on the Obama administration’s request.
Search giant Google acquired YouTube for $1.65 billion in 2006. The site attracts about 800 million visitors each month from around the world, with 70% of its traffic coming from outside the U.S. Mark S. Mahaney, an Internet analyst at Citigroup Investment Research, estimates that YouTube will generate more than $3.6 billion in revenue this year.
In the past, YouTube has complied with government requests to remove videos that officials feared would incite violence. As recently as a month ago, YouTube blocked a series of videos of rioting in northeast India at the request of officials, who worried that the videos could lead to further civil unrest, according to a person who requested anonymity because of the sensitivity of the situation.
One graphic 2009 video, which captured an Iranian woman dying on the street, remained on the site because of its intrinsic news value — but YouTube added a warning that flagged it as graphic and potentially offensive.
The “Innocence of Muslims” episode may represent a turning point for YouTube, which has emerged as the leading online source for eyewitness and surveillance-camera videos that afford unique views of natural disasters or significant world events like the Arab Spring uprisings.
“The era of uncurated and unmediated commenting is pretty much over,” said Tom Rosenstiel, an author, journalist and founder of the Project for Excellence in Journalism. “Almost all sites that I know of have moved toward curation and pulling down content that they think is objectionable. Where YouTube will end up on this, I don’t know.”
Noted 1st Amendment lawyer Martin Garbus said YouTube should develop a process for dealing with potentially inflammatory videos like “Innocence of Muslims.” He suggested a judge, working as a neutral arbiter, could make a ruling about whether a video should be posted to the site.
“Otherwise, a place like YouTube would be subject to various interest groups saying, ‘Take that off,’” Garbus said. “You can’t have them do that. They would become subject to too much pressure.”
Some digital experts question the feasibility of asking YouTube to play an active role in vetting videos.
“It has to be an automated, machine-driven process. And machines are nowhere near a level of sophistication where they can watch a film and make value judgments about whether or not it’s likely to incite anger, hate or a murder,” said Rebecca Lieb, a digital media analyst at the Altimeter Group.
Although the “Innocence of Muslims” incident has justifiably provoked soul-searching, Lieb doesn’t see the problem as having a technological solution.
“This is almost a freak occurrence that something so qualitatively bad and obscure could rise to this kind of prominence and have this catastrophic result,” Lieb said. “There is already so much objectionable, racist, hateful content on the Internet. A more apt question is, ‘How did this one piece of content provoke this extreme reaction?’”