
CARU referral to FTC results in record $5.7M payment to settle claims that app violated children’s privacy law

The Federal Trade Commission has reached a record-breaking settlement in a children’s privacy action. Musical.ly, a popular video social networking app now known as TikTok, will pay $5.7 million to settle the FTC’s claims that it violated the Children’s Online Privacy Protection Act (COPPA) by failing to notify parents that it was collecting personal information from children under age 13 and by failing to obtain their consent, the agency announced on Feb. 27.

The app let users create short videos of themselves lip-syncing to music and then share those videos with other users. To register, the app required users to provide an email address, phone number, username, and first and last names, plus a short biography and a profile picture. User accounts were public by default, but even when the account was set to private, users’ profile pictures and bios remained public, and they could still receive direct messages from other users.

More than 200 million users have downloaded the app since 2014, with 65 million accounts registered in the United States, according to the FTC. Many of the app’s users have been children.

The Children’s Advertising Review Unit (CARU), the children’s arm of the advertising industry’s self-regulation program, referred the case to the FTC last year after Musical.ly refused to comply with CARU’s recommendations to protect children’s privacy. Musical.ly maintained that the app targeted a general audience and was not specifically directed at children.

The FTC disagreed, asserting that under COPPA, the app was directed at children because it used subject matter, visual content, music and celebrities that appeal to children. Musical.ly’s child-oriented content includes songs from Disney movies, pop stars with large tween followings (such as Miley Cyrus and Ariana Grande), and colorful emojis of cute animals and smiley faces.

In addition, Musical.ly was aware that personal information was being collected from children, the FTC asserted. A look at user profiles showed many users who provided their date of birth or grade in school. Further, since 2014, Musical.ly has received thousands of complaints from parents of children under 13 who used the app, the agency said.

The FTC also noted reports of adults trying to contact children on the app, as well as the fact that, until October 2016, the app included a feature that let users view other users within a 50-mile radius.

In this case, whether the company intended to direct the app at children was ultimately irrelevant to whether it is covered by COPPA. In its analysis, the FTC says it looks at a site’s or product’s look and feel, as well as any evidence that the operator or manufacturer knows users are children under 13.

In a separate joint statement about the case and the record-breaking settlement, Commissioners Rohit Chopra and Rebecca Kelly Slaughter lauded the action and settlement as “a big win in the fight to protect children’s privacy” and called out “disturbing practices” by Musical.ly, including “collecting and exposing the location and other sensitive data of young children.” Noting that the investigation started before the current slate of commissioners was in place, Commissioners Chopra and Slaughter advocated for a change in the FTC’s investigative focus: to investigate, and hold liable, individuals within companies that have engaged in illegal conduct.

While the statement was made in the context of this COPPA case, in which the company engaged in “egregious conduct,” the joint statement didn’t suggest any kind of limitation on holding individuals accountable. “When any company appears to have made a business decision to violate or disregard the law, the Commission should identify and investigate those individuals who made or ratified that decision and evaluate whether to charge them,” the commissioners wrote. “As we continue to pursue violations of law, we should prioritize uncovering the role of corporate officers and directors and hold accountable everyone who broke the law.”