

FTC's "Top Priority": Children's Privacy

Last week at the International Association of Privacy Professionals (IAPP) Summit, several officials from the Federal Trade Commission (FTC) spoke, including FTC Commissioner Mark R. Meador. The FTC officials emphasized that both the FTC and the Trump Administration are committed to protecting children online. In particular, the FTC laid out a roadmap for how it intends to approach these cases, one that closely follows the path of the recent social media addiction cases in New Mexico and California. The FTC appears to be taking the following three-step approach to evaluating online services directed to children:

  1. Is the design of the online service harming children?

  2. What is the cause of the harm?

  3. Can the FTC address the cause of the harm?

Whether the FTC can address the cause of the harm will depend on whether it falls within the FTC's Section 5 powers. Specifically, the FTC will look at whether the online service is making false statements about its children's privacy practices or has failed to be transparent about those practices.

In addition to examining whether website designs are harming children, the FTC appears focused on three other areas related to children's privacy:

  1. Age verification—Following up on the FTC's Age Verification Workshop in January 2026 and its policy statement in February 2026, the FTC still appears focused on advancing the use of age verification technologies.
  2. Enforcement of the Take It Down Act—The Take It Down Act prohibits any person from knowingly publishing "intimate visual depictions" of minors and of non-consenting adults. Under the Act, a covered platform subject to the Act's notice-and-removal requirements must remove non-consensual intimate images within 48 hours of receiving a valid removal notice. Starting in May 2026, the FTC is responsible for ensuring that covered platforms comply with these requirements, and it appears ready to take up that role.
  3. Chatbots—In September 2025, the FTC issued 6(b) orders to seven companies seeking information on whether children and teens are negatively affected by chatbots that simulate human-like communication and interpersonal relationships. The FTC appears ready to address this issue further over the next year.