YouTube axes tens of millions of comments in crackdown on child sexual exploitation

Washington Post

YouTube cracked down Thursday on pedophilic content on the video site by purging tens of millions of comments, which serve as a kind of powerful but overlooked social network.

A video blogger this week published a report on YouTube documenting how comments and recommendations on the platform direct users to potentially sexual videos of children, enabling those users to participate in what the blogger called a “soft-core pedophile ring.”

YouTube, which is part of Alphabet Inc.’s Google unit, also terminated more than 400 channels Thursday that posted the comments on videos featuring children.


This week, several major brands such as Disney and Nestle halted their advertising on YouTube because their ads were played alongside videos with abusive or sexually explicit comments — a repeat of a brand boycott two years ago, when advertisers protested the placement of their ads alongside inappropriate videos.

YouTube’s latest controversy focuses on the abusive aspect of its comments section.

“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” YouTube spokeswoman Andrea Faville said in a statement Thursday. “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”

In a video that has been viewed more than 2 million times since its release Sunday, video blogger Matt Watson detailed how users who visit YouTube for bikini-shopping videos can eventually be nudged to watch videos featuring young girls. After clicking on several bikini videos, YouTube’s recommendation engine suggests that users watch videos with minors, Watson said. The videos themselves are not sexual in nature; they show children talking to the camera, performing gymnastics or playing with toys, but some users interpret them in inappropriate ways. The comments on the videos include hyperlinked time stamps, Watson said, enabling viewers to jump to moments when the girls are in compromising positions; in other instances, users posted sexually explicit comments about the children.

“Once you are in this loophole, there is nothing but more videos of little girls,” he said in the video. “How has YouTube not seen this?”

YouTube also said it removed dozens of videos that were posted without malicious intent but were nonetheless putting children at risk. The company said it continues to invest in technology that enables it and its industry partners to detect and remove sexually abusive imagery.

In a company blog post from 2017, YouTube outlined the ways it was “toughening” its approach to protecting families on its platform. One aspect of that approach was blocking inappropriate comments on videos featuring minors. The company said it had historically used a combination of automated systems and human reviewers to flag inappropriate and predatory comments for review and removal. YouTube said at the time that it would take a more “aggressive stance” on curbing abusive posts by turning off the commenting feature when it detected such posts. It is technically easier for software to scan text, such as comments, than video for anything that would violate YouTube’s policies.
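To illustrate why comment text is easier to scan automatically than video, here is a minimal sketch of the kind of text-side signals a moderation pipeline might compute. All names, the watchlist contents, and the function itself are hypothetical illustrations, not YouTube's actual system; the timestamp pattern reflects the "2:14"-style strings that YouTube auto-links as jump-to points in comments.

```python
import re

# Hypothetical placeholder watchlist; real moderation term lists are not public.
FLAGGED_TERMS = {"example_flagged_term"}

# Strings like "2:14" or "1:02:30" are auto-linked by YouTube as jump-to timestamps.
TIMESTAMP = re.compile(r"\b\d{1,2}(?::\d{2}){1,2}\b")

def scan_comment(text: str) -> dict:
    """Return simple signals a text-based moderation pipeline might extract."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return {
        "has_timestamp": bool(TIMESTAMP.search(text)),
        "flagged_terms": sorted(words & FLAGGED_TERMS),
    }

print(scan_comment("jump to 2:14"))
print(scan_comment("nice video"))
```

Even this toy version runs in microseconds per comment, whereas classifying video frames requires computer-vision models that are far more expensive and error-prone, which is one reason disabling comments wholesale was a tractable first response.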


In the wake of the latest controversy, YouTube said it has been hiring more experts dedicated to child safety on the platform and to identifying users who wish to harm children.

YouTube has previously grappled with publishing exploitative videos of children. In 2017, the company cracked down on accounts that posted disturbing videos aimed at young viewers, featuring children in predatory or compromising situations, which drew massive audiences.

“YouTube, in addition to other social media platforms, should offer regular, independent, external audits of online hate and harassment,” said George Selim, senior vice president of the Anti-Defamation League.

Watson said that some of the YouTube videos feature ads for big-name companies, including Disney.

Nestle said, “An extremely low volume of some of our advertisements were shown on videos on YouTube where inappropriate comments were being made,” adding that it is investigating the matter with YouTube and its partners and has decided to pause its advertising on the platform globally.

Disney has also suspended its advertising on YouTube, according to Bloomberg. Disney did not immediately respond to a request for comment.


“Fortnite” maker Epic Games said it has “paused” its advertising on YouTube that runs before videos, but it’s unclear whether Epic’s ads appeared alongside the controversial content. “Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service,” Epic said in a statement.

Hamza Shaban writes for the Washington Post.