Apple leans into age checks as Australia’s teen social media ban goes live, putting the App Store in the middle of a political fight over kids, tech, and compliance.
Australia’s new social media law takes effect on December 10, and it targets the apps, not the devices. The law requires major social platforms to prevent anyone under 16 from holding an account, and to deactivate existing underage accounts.
The eSafety Commissioner can issue fines in the tens of millions of dollars for violations. Regulators expect platforms to rely on age-assurance tools and behavioral signals instead of mandatory ID scans.
Legal challenges and practical workarounds are already forming as companies figure out how to comply.
The law targets social media platforms, but those services run inside Apple’s ecosystem on every iPhone and iPad. Apple still feels the pressure even if it isn’t the statute’s primary focus.
The company now offers a compliance toolkit that developers can use to meet the rules. The App Store ends up acting as an intermediary layer between lawmakers and the apps they regulate.
Apple’s developer update, dated December 8, spells out what social media apps operating in Australia need to do next. The company reminds developers that they, not Apple, are responsible for disabling under-16 accounts, blocking new signups, and otherwise following the law.
Then, Apple lays out a set of tools that essentially turn platform plumbing into age-check infrastructure.
The primary feature is a Declared Age Range API that signals a user’s age band. Apps can check whether someone falls under 16 and tailor their behavior around that answer.
In Australia, a social app could block new accounts or log out underage users while still offering a full experience to adults; a code sketch of that flow follows the list below. Apple also points to several App Store levers that sit around the app rather than inside it.
- Use the App Store description to flag that a service is unavailable to people under 16 in Australia.
- Answer updated age-rating questions that now include fields for age assurance and parental controls, so that customers and regulators can see which apps claim to use these tools.
- Raise the minimum age rating for an app above the automatic score generated by Apple’s questionnaire.
- Add an "Age Suitability URL" pointing to a developer-hosted page that explains regional age rules, including Australia’s under-16 ban.
None of these features ensures compliance with the law on its own. Together, they give regulators a clear view of an app’s age-assurance posture and give developers an easier way to plug into Apple’s framework instead of building their own.
From privacy branding to compliance infrastructure
Apple has spent years arguing that it already protects children and families through Screen Time, content limits, and on-device safety features. However, lawmakers often ignore protections Apple already ships.
Australia has pushed Apple harder than most markets on child safety. In 2022, the eSafety Commissioner criticized Apple and other tech firms for insufficient action on child abuse material after demanding detailed reports on their detection methods.
In 2024, Apple tested iOS features in Australia that let children report inappropriate content directly to the company. The effort built on its existing nudity detection tools in Messages and other apps.
The Declared Age Range API and new App Store options continue that trajectory. The tools let Apple show it can help enforce age-based rules at the platform level. They also keep identity documents and raw biometrics out of developers’ hands.
Lawmakers in Australia prefer age checks that rely on existing data and inference instead of mandatory ID uploads. The approach fits neatly with how Apple wants to present its own system.
Regulators push platforms to solve social problems, and developers often push back against new obligations. Apple steps in to handle some of the work in return for more control over how apps operate in its ecosystem.
What this means for developers and platforms
For social media companies, the Australian law is still the main risk. Apps need to identify under-16 users in Australia, deactivate their accounts, and stop new ones from appearing.
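To make those three obligations concrete, here is a small, purely hypothetical Swift helper that maps a user’s region and declared age band onto the actions the law implies. None of these types comes from Apple or the eSafety Commissioner; they are illustrative stand-ins.

```swift
// Hypothetical compliance helper; all names are illustrative stand-ins,
// not Apple or eSafety Commissioner types.
enum AccountAction {
    case allow                // No restriction applies.
    case requireAgeAssurance  // Age band unknown: run a fallback check first.
    case blockSignup          // Prevent a new under-16 account in Australia.
    case deactivateExisting   // Disable an existing under-16 account.
}

struct UserContext {
    let regionCode: String        // ISO country code, e.g. "AU"
    let declaredUpperBound: Int?  // Upper edge of the declared age band, if shared
    let hasExistingAccount: Bool
}

func requiredAction(for user: UserContext) -> AccountAction {
    // The ban applies only in Australia.
    guard user.regionCode == "AU" else { return .allow }
    // No declared band: don't guess; escalate to another assurance method.
    guard let upperBound = user.declaredUpperBound else { return .requireAgeAssurance }
    // Users 16 and over are unaffected.
    guard upperBound < 16 else { return .allow }
    // Under 16: deactivate an existing account, or block a new signup.
    return user.hasExistingAccount ? .deactivateExisting : .blockSignup
}
```

A platform would run a check like this both at signup and against existing sessions, since the law covers new and existing under-16 accounts alike.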
Apple’s message is clear: compliance rests with developers, even as Apple supplies the tools. Using Apple’s age-range signal may be more attractive than running separate ID checks or buying third-party verification services, especially for smaller platforms.
Updated ratings, age-assurance disclosures, and an optional suitability URL help companies show good-faith compliance. Regulators can also see those signals when they review an app’s approach to the law.
There are real trade-offs for developers. OS-level age signals concentrate power in Apple’s hands and may disadvantage rival app stores, while also requiring developers to trust that Apple’s age estimates won’t lock out legitimate users.
Regulators may start treating Apple’s toolkit as a baseline standard. Once the API exists, lawmakers elsewhere could ask why social apps aren’t using it beyond Australia.
Apple’s growing role as regulator by proxy
Australia’s ban puts heavy burdens on private platforms, and critics think teenagers will bypass it with a VPN or fake details. Apple can’t solve those problems, but its response shows how far it’s willing to go as an intermediary between lawmakers and developers.
By making age checks and disclosures part of the platform, Apple argues that regulators should work through the App Store instead of adding conflicting rules. The same pattern is emerging in the United States, where politicians want stronger age verification despite the controls Apple already provides.
Developers now have a clearer path for complying with Australia’s law and future child-safety rules if they follow Apple’s system. The open question is whether regulators let Apple become a de facto standards body for age assurance or push back against the idea that one platform should shape how laws get enforced.