Advocates Say YouTube Kids App Flush with Inappropriate Content


Google’s YouTube Kids app has come under fire for including inappropriate content. The app, launched in February, purported to narrow YouTube’s content down to videos appropriate for children ages 5 and under, but its filtering system has proved faulty, allowing children to view objectionable videos.

The Campaign for a Commercial-Free Childhood and the Center for Digital Democracy complained to the Federal Trade Commission that Google’s marketing was deceptive, charging that YouTube could not advertise the app as safe for children while it allowed videos that parents found so objectionable.

The groups found explicit sexual language; jokes about pedophilia and illegal drugs; risky activities like juggling knives, tasting battery acid, and making a noose; and disturbing discussions about domestic violence, pornography, and the suicide of children. Chris Leo Palermino of Digital Trends cites a dubbed video of Bert and Ernie swearing and President Obama joking about marijuana.

Alistair Barr of the Wall Street Journal reported that the videos available on the app are chosen through automatic filters, user feedback, and manual review. Parents can flag videos they find inappropriate, which are then reviewed by Google employees. A spokeswoman added that parents who want to limit content further can switch off the ability to search, which also turns off links to recommended or related content that may be inappropriate.

Aaron Mackey, a graduate fellow at Georgetown University Law Center’s Institute for Public Representation, said:

When Google set up the app, it said it was a child-friendly app that parents can feel comfortable leaving their kids alone to navigate and learn how to start searching using their voice. We found videos that are not appropriate for kids under 5. It’s deceptive to tell kids that this is a safe product. …Anyone, with just a little bit of searching, can find a lot of inappropriate content.

Heather Kelly of CNN Money says that the CCFC suggests changing how the app works on a fundamental level. Instead of “blacklisting” certain videos, meaning filtering out those with objectionable content and leaving the rest, Google should be “whitelisting” content, meaning that only pre-approved videos are available to children.

Last month, other child advocacy groups complained that branded videos marketing to children were included in the app’s programming.

Josh Golin, Associate Director at CCFC, said:

Young children are uniquely vulnerable to commercial persuasion. They don’t understand persuasive intent, role-playing, or abstract thinking. I saw that the YouTube Kids app was advertised and marketed as family-friendly and safe when it came out. When I used it, I was dumbfounded. Most of the advertising and marketing tactics would be illegal on TV.

Golin is referring to the Children’s Television Act of 1990, which requires those who hold television broadcast licenses to provide educational programming for children and limits the amount of advertising that can appear in those shows. Online videos are exempt from these requirements since they don’t require a broadcast license, writes Alexander Howard of the Huffington Post.

May 27, 2015