Former content moderator files lawsuit against Facebook, claims the job gave her PTSD
A woman who worked as a content moderator for Facebook has filed a lawsuit against the social media giant, claiming she developed post-traumatic stress disorder as a result of “disturbing” images the job required her to view.
The suit, filed in San Mateo County last week, alleges negligence and failure to maintain a safe workplace. Selena Scola said she worked at Facebook offices for nine months under a contract through Pro Unlimited Inc., a staffing company that is also a named defendant in the suit. Scola stopped working for Facebook in March.
“Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job,” Korey Nelson, an attorney with Burns Charest LLP, said in a statement. The firm is seeking class-action status for the lawsuit.
“From her cubicle in Facebook’s Silicon Valley offices, Ms. Scola witnessed thousands of acts of extreme and graphic violence,” the suit alleges.
Facebook’s chief executive, Mark Zuckerberg, acknowledged last year that some people were using the platform to broadcast self-harm.
“Just last week, we got a report that someone on [Facebook] Live was considering suicide,” he wrote. “We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”
Facebook and other internet service providers voluntarily established industry standards for training, counseling and supporting content moderators more than a decade ago, attorneys said. The lawsuit claims Facebook does not follow the workplace safety guidelines it helped create.
The suit asks that both Facebook and Pro Unlimited fund a medical monitoring program that would help diagnose and treat Scola as well as other content moderators for psychological injuries, including PTSD.
In a July blog post, the tech company stressed that reviewing reported content is “essential to keeping people safe on Facebook” and said it was doubling the number of people working on its safety and security teams this year to 20,000, including 7,500 content reviewers.
“We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously,” said Bertie Thomson, director of corporate communications at Facebook, adding that every person reviewing Facebook content is offered psychological support.
Facebook requires companies that it partners with to provide resources and psychological support as well, “including onsite counseling — available at the location where the plaintiff worked — and other wellness resources like relaxation areas at many of our larger facilities,” Thomson said.
Pro Unlimited did not immediately respond to a request for comment.
2:30 p.m.: This article was updated with a statement from Facebook.
This article was originally published at 11:35 a.m.