YouTube Hit With Complaint By Child Advocacy Groups, Which Say It Illegally Targets Kids

By Marco della Cava, USA TODAY

April 9, 2018

SAN FRANCISCO — YouTube increasingly has been called an inappropriate video playground for kids, charges that have caused executives to vow reform.

As the Google-owned platform grapples with the implications of a shooting last week by a disgruntled YouTube creator at its San Bruno, Calif., headquarters, the company now stands accused of pocketing billions by illegally targeting minors with ads based on data mined from their devices and viewing habits.

A consortium of child advocacy groups filed a complaint with the Federal Trade Commission, urging the federal agency to investigate and sanction Google for violations of the Children’s Online Privacy Protection Act. COPPA aims to safeguard online privacy for children under 13.

YouTube a haven for young viewers

Child advocates say that YouTube is being disingenuous when it comes to acknowledging just how much influence the site has over shaping the worldview of younger viewers, many of whom have come to regard the platform as a replacement for television. YouTube says users must be 13 or older, but critics say it tacitly allows violations of this restriction by enabling content aimed at young children to flourish.

“For years, Google has abdicated its responsibility to kids and families by disingenuously claiming YouTube, a site rife with popular cartoons, nursery rhymes and toy ads, is not for children under 13,” Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, said in a statement. “It’s time for the FTC to hold Google accountable.”

Golin’s organization is among more than a dozen signing on to the complaint, including Common Sense Media, Consumer Watchdog and the Parent Coalition for Student Privacy.


The groups argue in their complaint to the FTC that YouTube is well aware of its popularity among children considering that it hosts channels such as ChuChuTV Nursery Rhymes and created the YouTube Kids app. 

“YouTube’s privacy policy discloses that it collects many types of personal information, including geolocation, unique device identifiers, mobile telephone numbers … from children under the age of 13,” reads the complaint summary. YouTube does so “without giving notice or obtaining advanced, verifiable parental consent as required by COPPA.”

The advocacy groups want the FTC to levy penalties against Google in the billions of dollars.

“Just like Facebook, Google has focused its huge resources on generating profits instead of protecting privacy,” Jeff Chester of the Center for Digital Democracy said in a statement.

The charges against YouTube come at a time of growing scrutiny from regulators and consumers about what personal information technology companies have collected and profited from in the past decade as social media use has skyrocketed. 

The child advocacy groups initially had planned to reveal the complaint last week but postponed after Nasim Aghdam arrived at the company’s San Bruno, Calif., headquarters on April 3 and opened fire with a handgun, injuring three employees before she took her own life, according to police.

Aghdam complained in social media posts that YouTube had been restricting access to her videos — an action taken by the company in the wake of criticism that it does not adequately restrict content that violates its community guidelines — which in turn reduced her income from ads posted alongside those videos.

Police continue to investigate the matter. Meanwhile, more tough questions arise about a site that started in 2005 as a way to share homemade videos but has grown into an entertainment force.

For all the positive aspects of YouTube — it’s not only a repository of obscure how-to videos but essentially acts as a video log of history — the site continues to draw negative attention thanks to some of its users.

Recently, YouTube courted controversy for everything from dangerous stunts gone tragically wrong (as in the recent jailing of a woman for accidentally shooting her boyfriend on camera) to dumb pranks gone horribly viral (cue the eating of Tide Pod laundry detergent). 

When it comes to minors in particular, YouTube has come under pressure for allowing users to upload videos aimed at children that sometimes feature violent and sexual themes.

Last summer, the company took steps to address the issue, including no longer allowing creators to make money from videos that featured the inappropriate use of family friendly characters. 

In November, YouTube announced that it would age-restrict such content on the main YouTube site when it was flagged. But for the most part, algorithms, not humans, police videos on the site.

The often ineffectual nature of that machine-learning approach to gatekeeping was addressed in December by YouTube CEO Susan Wojcicki, who announced that the company would expand the number of people working to oversee content to more than 10,000 in 2018.

“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” said Wojcicki.

But just a month later, YouTube and its coterie of often millionaire stars were caught in an unflattering spotlight when Logan Paul blasted out to his fans a video of a dead body hanging from a tree in a Japanese forest known for attracting those intent on committing suicide. The public backlash was intense; Paul apologized, and YouTube later dropped him from its Google Preferred ad program.

Child advocacy groups have not been swayed by YouTube’s efforts to use algorithms, human curators and community whistleblowers to improve the policing of its content for minors. 

“It’s not until something tragic is shown via a video, and viewers react, that the content is removed or dealt with by the platform,” Jill Murphy, editor in chief of Common Sense Media, said after the Logan Paul video incident. 

Now Murphy’s group and many others want the FTC — which recently launched an investigation into Facebook’s role in the Cambridge Analytica consumer data scandal — to turn off the ad-dollar tap if YouTube doesn’t stop targeting children with ads.

Says Angela Campbell, attorney for two of the groups filing the complaint: “The FTC needs to impose large civil penalties to show it is serious about protecting children’s privacy online.”

Follow USA TODAY tech writer Marco della Cava on Twitter.
