Children’s Online Privacy in 2025: The Amended COPPA Rule

As the dust settles following the presidential administration changes earlier this year, children’s privacy is expected to be a priority at the state and federal levels. However, we’re still reading the tea leaves as to what actions regulators and legislators will take to protect children and minors. Below is a recap of significant children’s privacy updates thus far in 2025 and predictions about what’s to come.

Key Takeaways

  • The Federal Trade Commission (FTC) published final amendments to the Children’s Online Privacy Protection Act (COPPA) Rule on April 22.

  • Updates include expanded definitions of “personal information” and “online contact information,” new standards for “mixed audience” services, enhanced parental notice and consent requirements, stricter data retention and security obligations, and greater safe harbor program transparency.
  • Companies must comply by April 22, 2026, though planning and operational adjustments should start now.

What Happened?

On April 22, the FTC finalized amendments to the COPPA Rule, its first major update since 2013. The amendments modernize the rule to better protect children under 13 online, accounting for intervening developments, including biometric recognition technologies, widespread mobile device usage and evolving data security threats.
The amended rule was approved unanimously by the FTC before the change in administrations, suggesting minimal risk of reversal despite political shifts.

Key Changes to the COPPA Rule:

1. Broader definitions:

  • Personal information: Now includes biometric identifiers (e.g., fingerprints, facial patterns, DNA sequences, voiceprints, gait patterns) and government-issued identifiers (e.g., state IDs, birth certificates).
  • Online contact information: Expanded to include a mobile telephone number, provided the operator uses it only to send text messages (e.g., to obtain parental consent).

2. Expanded factors for determining when a website is directed to children:

  • The FTC will now consider marketing or promotional materials and plans, representations made to consumers or third parties, reviews by users or third parties, and the age of users on similar websites or services.

3. Mixed audience clarification:

  • Defines mixed-audience services as those that are directed to children but do not target children as their primary audience.
  • Requires age screening before any personal information is collected (a simplified sketch follows this list).
  • Mixed-audience operators can use the same consent exceptions available to child-directed operators.
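
For operators trying to picture what the age-screening requirement might look like in practice, the TypeScript sketch below shows one minimal, hypothetical approach: a neutral age gate that runs before any personal information is collected and routes under-13 visitors to a flow that withholds collection until verifiable parental consent is obtained. The type names, threshold handling and routing logic are illustrative assumptions, not language from the rule.

```typescript
// Minimal, hypothetical sketch of a neutral age screen for a mixed-audience
// service. Names and routing below are illustrative assumptions only.

type ScreenResult =
  | { path: "child"; collectPersonalInfo: false }   // under 13: hold collection until parental consent
  | { path: "general"; collectPersonalInfo: true }; // 13 or older: standard experience

// The screen asks only for a declared age (or birth year) and runs before
// any personal information is collected.
function screenVisitor(declaredAge: number): ScreenResult {
  if (declaredAge < 13) {
    return { path: "child", collectPersonalInfo: false };
  }
  return { path: "general", collectPersonalInfo: true };
}

console.log(screenVisitor(11)); // -> child path, no personal information collected yet
console.log(screenVisitor(16)); // -> general path
```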

4. Stronger parental notice and consent requirements:

  • Direct notice: Must now include how personal information will be used, the identities or specific categories of third parties receiving it, and the purposes for which it is disclosed to them.
  • Privacy policy updates: Must disclose data retention practices, persistent identifier usage, audio file handling where applicable, and the specific identities and categories of third parties receiving personal information.
  • Separate consent: Operators must obtain separate parental consent for nonintegral third-party disclosures (e.g., marketing, AI training).

5. New verifiable parental consent methods:

  • Text plus: Allows operators to initiate the consent process by sending a text message to the parent, available only where children’s personal information is not disclosed to third parties.
  • Knowledge-based authentication: Now codified as an approved method.
  • Facial recognition: Permitted when matching a parent’s webcam image to a government-issued ID, with mandatory immediate deletion after verification.

6. Enhanced security and data retention requirements:

  • Operators must implement a written information security program proportionate to their size and the sensitivity of the data they collect, and may not retain children’s personal data indefinitely (an illustrative retention schedule follows this list).
  • Requires the designation of a security coordinator, annual risk assessments, regular testing of safeguards and oversight of service providers.
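
To make the retention point concrete, the sketch below shows one hypothetical way to express a written retention schedule for children’s data in code: every category carries a finite retention period tied to a stated purpose, so nothing is kept indefinitely. The categories, purposes and periods are invented for illustration; actual periods must reflect an operator’s own documented purposes.

```typescript
// Hypothetical sketch of a retention schedule for children's personal data.
// Categories, purposes and periods are illustrative assumptions only.

interface RetentionRule {
  dataCategory: string;
  purpose: string;
  retentionDays: number; // finite by design; no rule may keep data indefinitely
}

const childDataRetentionSchedule: RetentionRule[] = [
  { dataCategory: "account_email", purpose: "account maintenance", retentionDays: 365 },
  { dataCategory: "voice_clip", purpose: "speech-to-text command", retentionDays: 1 },
  { dataCategory: "support_ticket", purpose: "customer support", retentionDays: 90 },
];

// Flags records that have outlived their stated purpose and should be deleted.
function isExpired(rule: RetentionRule, collectedAt: Date, now: Date = new Date()): boolean {
  const ageDays = (now.getTime() - collectedAt.getTime()) / (1000 * 60 * 60 * 24);
  return ageDays > rule.retentionDays;
}
```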

7. Safe harbor programs:

  • Programs must publicly disclose membership lists and increase transparency through regular reporting to the FTC.

What’s Missing?

Notably absent from the amended rule:

  • Codification of the school authorization exception: The FTC did not codify its long-standing guidance permitting schools to authorize the collection of personal information on behalf of parents for educational technology services. The agency explained that it avoided making amendments that could conflict with upcoming potential changes to the U.S. Department of Education’s Family Educational Rights and Privacy Act (FERPA) regulations. The FTC indicated it will continue to enforce COPPA in the education technology context based on existing guidance.
  • Express prohibition of the use of “support for internal operations” for engagement-enhancing techniques: The FTC chose not to adopt a proposed amendment that would have expressly barred the use of the support for internal operations exception to justify engagement-enhancing techniques such as push notifications. While the agency acknowledged concerns that these techniques can harm children’s physical and mental health, it concluded that the proposed language was overly broad and might limit beneficial use cases. Instead, the FTC reaffirmed that unfair or deceptive practices encouraging harmful prolonged use will continue to be addressed through enforcement under Section 5 of the FTC Act.

What Do Companies Need to Do?

  • Reevaluate websites to determine whether they are directed to children: Review company marketing materials, merchandise and similar websites or services, since the FTC will now consider each of these as a factor in determining whether a website is directed to children, even where a company’s own analytics do not show that it collects personal information from children under 13.
  • Reevaluate whether an age gate is needed: Based on the analysis above, reevaluate websites to determine whether they are primarily directed to children (age gating is prohibited), mixed audience (an age gate is required) or general audience (age gating is not necessary). To avoid having to age gate an entire site, consider separating child-directed or mixed-audience sections of the website from general-audience sections.

  • Review current practices and update policies: Audit existing data collection, use and disclosure practices to determine whether any data collected falls under the expanded definitions of personal information and online contact information. These audits may also surface the information required to meet the enhanced disclosure requirements for direct notices and privacy policies (e.g., how persistent identifiers are used and which specific third parties receive data).

  • Update consent workflows (if needed): Consider whether the new options for obtaining verifiable parental consent (e.g., text plus or knowledge-based authentication) are appropriate, and ensure separate consent is obtained for nonintegral disclosures; a simplified consent-record sketch follows this list.
  • Strengthen security programs: Create or update written information security programs and data retention schedules specific to children’s data, which may be a significant lift for companies that have not yet designated a security coordinator or implemented annual risk assessments, regular testing of safeguards and oversight of service providers.
  • Prepare for safe harbor transparency: If participating in a safe harbor program, note that these programs must meet new disclosure and reporting obligations, which means participating companies will face increased scrutiny.

  • Reassess third-party contracts: Ensure data-sharing practices can be disclosed and justified.
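
As a rough illustration of the separate-consent point flagged in the consent-workflow item above, the sketch below models a parental consent record that captures consent for core collection and use separately from consent for nonintegral third-party disclosures such as marketing or AI training. The field names, consent-method labels and helper function are assumptions made for illustration, not terms defined by the amended rule.

```typescript
// Hypothetical sketch of a parental consent record that separates consent for
// core collection and use from consent for nonintegral third-party disclosures.
// Field names and method labels are illustrative assumptions only.

type ConsentMethod = "text_plus" | "knowledge_based_auth" | "facial_id_match" | "credit_card";

interface ParentalConsent {
  childAccountId: string;
  method: ConsentMethod;
  grantedAt: Date;
  coreCollectionAndUse: boolean;   // consent to collect and use data to run the service
  nonintegralDisclosures: {        // captured separately, never bundled with core consent
    marketing: boolean;
    aiTraining: boolean;
  };
}

// A marketing disclosure to a third party proceeds only if the separate,
// specific consent for that purpose was affirmatively given.
function mayDiscloseForMarketing(consent: ParentalConsent): boolean {
  return consent.coreCollectionAndUse && consent.nonintegralDisclosures.marketing;
}
```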

The updated rule becomes effective June 23, and companies must comply by April 22, 2026. Companies that embrace these updates thoughtfully will not only reduce legal risk but also demonstrate a strong commitment to child safety—a reputational asset as regulators, parents and advocates focus increasingly on kids’ online privacy.

What Should We Expect Next?

In the waning days of the Biden administration, the FTC took several other actions related to children’s privacy and online safety. We can look to FTC Chair Andrew Ferguson’s opinions on those actions and some upcoming events to help us predict what may come next: 

  • On Jan. 16, the FTC announced it had referred a complaint against Snap Inc. to the U.S. Department of Justice (DOJ) alleging that the company’s deployment of an AI chatbot resulted in risk and harm to young users of its Snapchat application. The case emerged from compliance reviews following Snap’s 2014 FTC settlement of charges that the company deceived consumers about the disappearing nature of messages sent through Snapchat. While it’s rare for the FTC to publicly announce such a referral, the agency stated that doing so in this case was in the public interest. Notably, then-Commissioner Ferguson issued a separate statement opposing the complaint, arguing that it misapplied Section 5 of the FTC Act (as a matter of statutory interpretation) and conflicted with the First Amendment of the U.S. Constitution. Some say the referral was announced publicly to pressure the DOJ into taking action, but there are indications the DOJ will decline to do so. For example, in January the DOJ ordered its Civil Rights Division to pause ongoing litigation left over from the Biden administration and has since stepped back from other cases.

  • On Jan. 17, the FTC announced a settlement with Cognosphere, the maker of the video game Genshin Impact, for violating COPPA by collecting the personal information of children under 13 without parental consent, deceiving users about the cost of in-game transactions and the odds of winning loot box prizes, and engaging in unfair conduct by marketing and offering in-game virtual currency and loot boxes to children and teenagers. As part of the settlement, Cognosphere was required to pay $20 million, block children under 16 from making in-game purchases without parental consent and delete any personal information previously collected from children under 13 unless it obtains parental consent to retain such data. It’s worth noting that then-Commissioner Ferguson concurred with the COPPA allegations in the Cognosphere complaint but dissented from the remaining claims. He objected to the complaint’s broad use of Section 5 in alleging that offering loot boxes to children is an unfair act or practice. Ferguson argued that the substantial injury in this case (i.e., the amount of money that children and teens may spend in games) was avoidable, because parents can decline to give children access to a credit card or can use the parental control systems widely available on mobile platforms, and therefore did not violate Section 5. Ferguson also argued that marketing a loot box system to children and teens is not unfair, stating that the FTC’s allegations were unclear and that he “will not support novel theories of liability advanced in the final hours of the Biden-Harris Administration.” If this view prevails, we may see fewer expansions of Section 5 authority under the current FTC.

  • The FTC will also take aim at tech companies’ potentially harmful practices in an upcoming event. The agency will convene a panel called “The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families” to discuss how Big Tech imposes addictive design features, erodes parental authority and fails to protect children from exposure to harmful content. Originally scheduled for May 28, the event has been rescheduled for June 4 due to “increased interest.” The event will be held in person and virtually, and the FTC has asked speakers with expertise on these topics to contact the agency by April 30. The event appears to be a reframed version of the FTC’s earlier-planned meeting, “The Attention Economy: Monopolizing Kids’ Time Online,” likely adjusted to align more closely with the new administration’s priorities. It remains to be seen whether the tone of the discussion will be less aggressive toward Big Tech.