TikTok violated FTC orders on child privacy, complaint alleges


A coalition of child and consumer privacy advocates filed a complaint Wednesday night with the Federal Trade Commission alleging TikTok continues to unlawfully store and collect children’s personal information despite a prior agreement to stop.

Last year, the FTC reached a settlement with the company now known as TikTok that included a $5.7 million fine — its biggest ever for a child privacy violation. The agency accused the company of illegally gathering children’s sensitive personal data, such as email addresses, names, pictures and locations, without parental permission through its app, and of refusing parents’ requests to have their children’s data deleted.

In the new complaint, 20 groups including the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy accuse the company of violating the terms of this agreement with the FTC as well as the Children’s Online Privacy Protection Act more broadly. The organizations allege TikTok failed to comply with the FTC’s order to destroy all personal information it stores on users under 13 — including data on users who were younger than 13 at the time the data was gathered but are now older.

Owned by Chinese company ByteDance, TikTok allows users to create and share short videos of themselves with millions of users. In a statement, a TikTok spokesperson said, “We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users.”

TikTok offers accounts for users under 13 that collect less data and prevent children from sharing videos with others. But these accounts still collect some personal identifiers and user activity data, advocacy groups argue, and children can simply lie about their age to access TikTok without restrictions.

The groups said in the complaint they had found many regular TikTok accounts operated by young children with videos uploaded as far back as 2016. Regular accounts collect a swath of data that is shared with third parties and used for targeted advertising.

“It’s clear now a year later they haven’t properly done the things the FTC told them to do,” said Michael Rosenbloom, a fellow at the Institute for Public Representation, a technology law clinic at Georgetown University that is representing the advocacy groups filing the FTC complaint.

The 1998 child privacy law mandates that online services directed at children, or those with “actual knowledge” that children are using them, secure parental permission before collecting personal data from users under 13. But even in the limited accounts for younger users, TikTok has not created any real mechanism to get parental consent, Rosenbloom said.

A Dutch privacy watchdog said last week it would investigate how TikTok handles children’s data and whether parental consent is required to collect and use that data. The company said it was cooperating with Dutch authorities, according to Reuters.

The company has announced new features that could change how some teen users navigate the platform. On April 15, TikTok said it would roll out a set of parental controls called “family pairing,” which allows parents to link to the accounts of users aged 13 to 16 and manage how long teens can spend on the app each day, limit content that might be inappropriate, and restrict who can send direct messages to an account. The company also said that beginning April 30, it would automatically disable direct messages for users under 16.