Zuckerberg was warned on social media addiction, filing says
Employees at Meta Platforms and ByteDance were aware of the harmful effects of their platforms on young children and teenagers but disregarded the information or in some cases sought to undermine it, according to claims in a court filing.
The allegations were disclosed in a lawsuit over social media addiction that had been filed previously but with key portions sealed from public view. An unredacted version filed over the weekend in federal court in Oakland offers details about how much engineers and others, including Meta Chief Executive Mark Zuckerberg, knew about the harms of social media and their misgivings about it.
“No one wakes up thinking they want to maximize the number of times they open Instagram that day,” one Meta employee wrote in 2021, according to the filing. “But that’s exactly what our product teams are trying to do.”
The case in Oakland consolidates scores of complaints filed across the U.S. on behalf of adolescents and young adults who allege that Facebook, Instagram, TikTok, Snapchat and Google’s YouTube caused them to suffer anxiety, depression, eating disorders and sleeplessness. More than a dozen suicides also have been blamed on the companies, based on claims that they knowingly designed algorithms that drew children down dangerous and addictive paths. Several public school districts have filed suits too, alleging they can’t fulfill their educational mission while students are coping with mental health crises.
In their defense, the social media giants point to a 1996 law that gives internet platforms broad immunity from claims over harmful content posted by users. Both sides are closely watching a Supreme Court case that probably will determine the fate of the litigation in Oakland.
According to the new filing, internal documents at TikTok parent ByteDance show that the company knows young people are more susceptible to being lured into trying dangerous stunts they view on the platform — known as viral challenges — because their ability to weigh risk isn’t fully formed.
Young people are more likely to “overestimate their ability to cope with risk,” and their “ability to understand the finality of death is also not fully fledged,” according to the filing.
Another unsealed portion of the filing contends that instead of moving to address the problems around children using Instagram and Facebook, Meta defunded its mental health team.
The filing says Zuckerberg was personally warned: “We are not on track to succeed for our core well-being topics (problematic use, bullying & harassment, connections, and SSI), and are at increased regulatory risk and external criticism. These affect everyone, especially Youth and Creators; if not addressed, these will follow us into the Metaverse.”
A Meta spokesperson said the claim that it defunded work to support people’s well-being is false.
“We actually increased funding, shown by the over 30 tools we offer to support teens and families,” the spokesperson said. “Today, there are hundreds of employees working across the company to build features to this effect.”
Snap had no immediate comment on the court filing. Representatives of TikTok didn’t immediately respond to a request for comment.
“These never-before-seen documents show that social media companies treat the crisis in youth mental health as a public relations issue rather than an urgent societal problem brought on by their products,” according to a statement by the three plaintiffs’ lawyers leading the lawsuit, Lexi Hazam, Previn Warren and Chris Seeger. “This includes burying internal research documenting these harms, blocking safety measures because they decrease ‘engagement,’ and defunding teams focused on protecting youth mental health.”
The companies have previously said that user safety is a priority and that they have taken affirmative steps to give parents more control over their kids’ use of the platforms and to provide more mental health resources.