Google’s YouTube has agreed to pay more than $150 million to resolve U.S. allegations that it violated children’s privacy laws, according to a person familiar with the matter.
The settlement with the Federal Trade Commission resolves an investigation into whether the video service broke a law that makes it illegal to collect information about children younger than 13 and disclose it to others without parental permission. Last year a group of activists asked the FTC to look into the matter.
Bloomberg previously reported that the agency had approved the settlement but did not disclose the size of the fine. The agreement could be announced as early as next week. The FTC and Google declined to comment.
The settlement with the world’s largest video service represents the most significant U.S. enforcement action against a big technology company in the last five years over practices involving minors. The federal government is stepping up scrutiny of internet platforms that have largely operated with few regulatory constraints. Politico reported the amount of the settlement earlier Friday.
The FTC has been cracking down on firms that violate the 1998 Children’s Online Privacy Protection Act, or COPPA. It fined the popular app now known as TikTok $5.7 million in February to resolve claims that the video service failed to obtain parental consent before collecting names, email addresses and other information from children under 13.
Some children’s privacy advocates said the government hadn’t gone far enough.
“Once again, this FTC appears to have let a powerful company off the hook with a nominal fine for violating users’ privacy online,” Sen. Edward J. Markey (D-Mass.), a key figure behind the passage of COPPA, said in a statement. “We owe it to kids to come down hard on companies that infringe on children’s privacy and violate federal law.”
The amount is “woefully low, considering the egregious nature of the violation, how much Google profited from violating the law, and given Google’s size and revenue,” said Katharina Kopp, deputy director of the Center for Digital Democracy, which helped lead the complaint against YouTube.
Alphabet Inc.’s Google doesn’t break out YouTube’s sales figures, but it recently reported that the video site is its second-largest source of revenue behind search advertising. Research firm Loup Ventures estimates that about 5% of YouTube’s annual revenue, or roughly $750 million a year, comes from content aimed at children.
YouTube has long said that children under 13 don’t use its site without parental supervision, as its terms of service stipulate. There’s ample evidence these young viewers do flock to the site, however, and consumer groups complained last year that the presence of children means YouTube is collecting data on these younger viewers to serve them ads in violation of COPPA.
The site has already made tweaks as it tries to create a safer platform for kids. In recent months, it changed its algorithm to promote what it called “quality” kids’ videos, a shift that alarmed many of its video creators. YouTube also began working on plans to stop offering targeted ads — which rely on information such as web-browsing cookies — on videos aimed at children. Some consumer advocates say the move would do little to stop tracking of kids when they watch content aimed at general audiences.
The company also introduced more parental controls for YouTube Kids, the app it launched in 2015 to offer a smaller selection of YouTube’s massive library, and created a web version of the app. The kids’ service draws a far smaller audience than YouTube’s main site, which has more than 2 billion monthly visitors, and data show the main site is used by more children than the kids’ app.