Lindsey Tonsager, Jenna Zhang, Natalie Maas, Bryan Ramirez, and Irene Kim of Covington & Burling write:
Since our mid-year recap on minors’ privacy legislation, several significant developments have emerged in the latter half of 2025. We recap the notable developments below.
Age Signaling Requirements
In October, California enacted the Digital Age Assurance Act (AB 1043), which, effective January 1, 2027, requires operating system providers—defined as entities that develop, license, or control operating system software—to offer an accessible interface at account setup that prompts the account holder to indicate the user’s birth date, age, or both. The developer is required to request an age signal about a particular user when the application is downloaded and launched.
This California law differs from other app store legislation that passed earlier this year, such as in Utah, Texas, and Louisiana, because AB 1043 applies to operating system providers rather than solely mobile application store providers. While the Texas App Store Accountability Act is scheduled to take effect on January 1, 2026, in October, the Computer & Communications Industry Association (“CCIA”) filed a lawsuit challenging the Texas law as unconstitutional.
Regulation of AI Chatbots
The second half of 2025 saw increased attention to minors’ use of AI chatbots. The Federal Trade Commission (“FTC”) launched an inquiry under its Section 6(b) authority into AI chatbots acting as companions, including what actions companies are taking to mitigate alleged harms and to comply with the Children’s Online Privacy Protection Act Rule. State legislatures have also been active in enacting laws on the use of AI chatbots:
- California recently enacted SB 243, which requires operators of companion chat platforms to provide a clear and conspicuous notice indicating that the chatbot is AI-generated. If operators know a user is a minor, they must (1) disclose to the minor user that they are interacting with AI, (2) provide a notification every three hours that reminds the minor user to take a break and that the chatbot is AI, and (3) institute “reasonable measures” to prevent the chatbot from sharing or encouraging the minor user to engage in sexually explicit content.
- Several bills pending in Congress would require AI chatbots to implement age-verification measures. Under the CHAT Act, if an entity determines that a user is a minor, the minor’s account would need to be affiliated with a parental account, and the entity would need to obtain verifiable parental consent for the minor to use the service. Under the GUARD Act, minors would be prohibited from accessing or using AI companions. Under the SAFE BOTs Act, chatbots would need to disclose their non-human and non-professional status and provide crisis intervention resources to minor users.
Read more about other state- and federal-level activity this year, as well as developments concerning mental health warning labels, at Inside Privacy.