Editorial: Social media can harm kids. Lawsuits could force Meta, others to make platforms safer

Mobile phone app logos for Facebook, left, Instagram and WhatsApp in 2021.
(Richard Drew / Associated Press)

It’s a rare issue that can bring 41 states together for a bipartisan fight. This week, state attorneys general across the political spectrum joined forces in suing Facebook parent company Meta for allegedly using features on Instagram and other platforms that hook young users, while denying or downplaying the risks to their mental health.

This comes two years after states began investigating Meta following revelations that the company’s internal research found Instagram was having a negative effect on some teen users’ mental health. Since then, health professionals, including Surgeon General Dr. Vivek Murthy and the American Psychological Assn., have urged tech companies to make their products safer for young people.

But there hasn’t yet been significant change in the industry. Most companies haven’t been willing to overhaul their platforms to curb addictive features or harmful content for users under 18 years old, such as by setting time limits on their apps or changing the algorithms that steer kids into “rabbit holes” to keep them online. Nor have federal lawmakers been able to enact comprehensive product safety regulations, because legislation has stalled in Congress or been blocked by courts.

In the absence of policy changes, lawsuits are the next logical step in prodding technology companies to ensure their products are safe for young people or be held accountable. Some have compared the states’ legal strategy to lawsuits against Big Tobacco and opioid manufacturers that revealed how the companies lied about the harm caused by their products, and forced them to change their business practices.

Meta is the first target because of the 2021 revelations, but the state attorneys general said this is an industry-wide investigation. They have also begun looking into TikTok.

The federal complaint alleges Meta used harmful and “psychologically manipulative product features,” such as “likes,” infinite scroll and constant alerts, to hook young people on Instagram and Facebook and keep them engaged for as much time as possible in order to boost profits. Despite knowing that young users’ brains are particularly vulnerable to manipulation by such features and internal studies warning that kids were being harmed, Meta allegedly concealed, denied and downplayed the harms.

The lawsuit, which was filed jointly by 33 states, including California, also accuses Meta of violating the Children’s Online Privacy Protection Act, a federal law that protects the digital privacy of children under 13 years old. Eight states and the District of Columbia filed separate lawsuits in state or federal courts, many alleging that Meta violated state consumer protection laws.

Meta said in a statement that it has already rolled out 30 tools to support teens on its apps since 2021, including reminders on Instagram for teens to take a break and sharing expert resources if kids search for posts on suicide or eating disorders. That’s a good start. The company lamented that the states chose to sue rather than work with tech firms “across the industry to create clear, age-appropriate standards.”

Indeed, there is a need for comprehensive safety standards across social media platforms. But a tech lobbying group of which Meta is a member has sued to stop an effort by California, which passed a first-in-the-nation law last year requiring age-appropriate design and child privacy protection. The law was recently put on hold by a federal judge citing 1st Amendment concerns. California Atty. Gen. Rob Bonta has filed an appeal.

This is complex legal and regulatory terrain, and the states’ lawsuits are not a sure bet given existing laws that protect online platform companies from being held liable for content posted by users on their sites. Nor will any of these cases be resolved quickly. That’s OK. This is an essential fight for the future.