Facebook Inc. plans to hire 3,000 additional people to review videos and other posts after being criticized for not responding quickly enough to homicides shown live on its service.
The hires over the next year will be on top of the 4,500 people Facebook already employs to identify crime and other questionable content for removal. Chief Executive Mark Zuckerberg wrote Wednesday that the Menlo Park, Calif., social media giant is “working to make these videos easier to report so we can take the right action sooner — whether that's responding quickly when someone needs help or taking a post down.”
Videos and posts that glorify violence are against Facebook's rules, but Facebook has been criticized for being slow in responding to such content, including live videos of a murder in Cleveland and the killing of a baby in Thailand. In both those cases, police said the killers were the ones who posted the videos on Facebook. The Thailand video was up for 24 hours before it was removed.
In most cases, content is reviewed and possibly removed only if users complain. News reports and posts that condemn violence are allowed. This makes for a tricky balancing act for the company. Facebook does not want to act as a censor; videos of violence, such as those documenting police brutality or the horrors of war, can serve an important purpose.
Also on Wednesday, Facebook reported first-quarter results that beat Wall Street’s expectations.
The company reported a profit of $3.06 billion, or $1.04 a share. Adjusted for stock option expenses, earnings came to $1.23 a share. Analysts surveyed by Zacks Investment Research expected $1.10 a share.
Facebook’s revenue for the quarter was $8.03 billion. Analysts surveyed by Zacks expected $7.85 billion.
Facebook shares slipped 0.6% Wednesday to close at $151.80 before the company announced its earnings. The shares are up nearly 32% since the beginning of the year; for comparison, the Standard & Poor's 500 index has climbed almost 7%.
Policing live video streams is especially difficult, since viewers don't know what will happen. This rawness is part of their appeal.
Although the negative videos make headlines, they are just a tiny fraction of what users post every day. The good? Families documenting a toddler's first steps for faraway relatives, journalists documenting news events, musicians performing for their fans and people raising money for charities.
“We don't want to get rid of the positive aspects and benefits of livestreaming,” said Benjamin Burroughs, professor of emerging media at the University of Nevada, Las Vegas.
Burroughs said Facebook clearly knew that livestreams would help the company make money, because they keep users on Facebook longer, making advertisers happy. If Facebook hadn't also considered the possibility that livestreams of crime or violence would inevitably appear alongside the positive stuff, “they weren't doing a good enough job researching implications for societal harm,” Burroughs said.
With a quarter of the world's population on it, Facebook can serve as a mirror for humanity, amplifying both the good and the bad. But lately, it has drawn outsized attention for the bad, having allowed the spread of false news and government propaganda as well as videos of horrific crimes.
Livestreamed videos showing murder, kidnapping and torture have made international headlines even when the crimes themselves wouldn't have, simply because they were on Facebook, visible to people who wouldn't have seen them otherwise.
As the company introduces new features, it will continue to grapple with the reality that those features will not always be used for positive or even mundane purposes. From his interviews and Facebook posts, it appears that Zuckerberg knows this, even if he is not always as quick to respond as some would hope.
“It's heartbreaking, and I've been reflecting on how we can do better for our community,” Zuckerberg wrote Wednesday about the recent videos.
It's a learning curve for Facebook. In November, for example, Zuckerberg called the idea that false news spread on Facebook influenced the U.S. election “crazy.” A month later, the company introduced a slew of initiatives aimed at combating false news and supporting journalism. And just last week, it acknowledged that governments or other entities were using its social network to influence political sentiment in ways that could affect national elections.
What to do
Zuckerberg said Facebook workers review “millions of reports” every week. In addition to removing videos of crime or getting help for people who might hurt themselves, he said the reviewers will “help us get better at removing things we don't allow on Facebook like hate speech and child exploitation.”
Wednesday's announcement is a clear sign that Facebook still needs human reviewers to monitor content, even as it tries to shift some of that work to software — a necessity driven in part by the company's sheer size and the volume of material users post.
It's not all up to Facebook, though. Burroughs said users themselves need to decide how close they want to be to violence — whether, for example, they want to watch the videos that are posted, or even circulate them. And news organizations should decide for themselves whether each killing livestreamed on Facebook is a story.
“We have to be careful that it doesn't become a kind of voyeurism,” he said.
2:50 p.m.: This article was updated with Facebook’s earnings results and stock movement.
10:20 a.m.: This article was updated throughout with additional details and analysis and with comments from Benjamin Burroughs.
8:25 a.m.: This article was updated with more details from Mark Zuckerberg’s Facebook post and more information about violent videos posted to Facebook.
This article was originally published at 7:40 a.m.