Google came under fire Tuesday after a string of children's advocacy groups filed an update to their Federal Trade Commission complaint alleging the YouTube Kids App contains content that "would be extremely disturbing" and "potentially harmful" for young children.
The news comes after the coalition of opponents filed their initial complaint in early April, asking the FTC to investigate whether the newly launched YouTube Kids app is exposing young audiences to an unwanted barrage of commercials.
The app, which rolled out in February and is available on iPhone and Android platforms, enables kids to choose from hundreds of channels of entertainment and educational programming, including tutorials, sing-alongs and other kid-friendly content. However, opponents of the app argue that mixed in with this programming is branded content from corporate giants such as McDonald's, Mattel and Hasbro.
The updated complaint argues that the app has content that includes: "explicit sexual language presented amidst cartoon animation; videos that model unsafe behaviors such as playing with lit matches, shooting a nail gun, juggling knives, tasting battery acid and making a noose; a profanity-laced parody of the film 'Casino' featuring Bert and Ernie from 'Sesame Street'; graphic adult discussions about family violence, pornography and child suicide; jokes about pedophilia and drug use; advertising for alcohol products."
Advocacy groups that filed the complaint include the Center for Digital Democracy, the Campaign for a Commercial-Free Childhood, American Academy of Child and Adolescent Psychiatry, Center for Science in the Public Interest, Children Now, Consumer Federation of America, Consumer Watchdog, Consumers Union and Public Citizen.
“The same lack of responsibility Google displayed with advertising violations on YouTube Kids is also apparent in the content made available on the app,” said Dale Kunkel, a University of Arizona professor specializing in children's media who helped draft the FTC complaint. “There is a serious risk of harm for children who might see these videos. It’s clear Google simply isn’t ready to provide genuinely appropriate media products for children.”
A YouTube spokeswoman emphasized that the Google-owned video platform takes "feedback very seriously."
"Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed," she said in a statement addressing the updated complaint. "For parents who want a more restricted experience, we recommend that they turn off search."
Ahead of the app's launch, YouTube emphasized that the app is family friendly. The company also pointed out that parents can control how much time their kids spend on the app and monitor what they watch.
YouTube also screens all advertising that is aired on the app to make sure it's family friendly. Before the app went live, the video sharing network received endorsements from content creators and advocacy groups, such as Common Sense Media and the Family Online Safety Institute.
Parents can limit available videos on the homescreen by turning off search. They can also get additional information for any show on the network, such as the synopsis, target age range and educational goals. Content in the app is selected through a mix of automated controls and user feedback.
"We work to make the videos in YouTube Kids as family-friendly as possible," the YouTube spokeswoman said. "We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video."