Meta has announced that it will begin removing Australian users under the age of 16 from Facebook and Instagram on December 4, one of the company's most sweeping safety actions to date. The decision follows mounting pressure from Australian regulators, who are demanding stricter age verification and stronger protection for minors from online harms.
Why Australia Is Forcing Stricter Age Enforcement
The move comes as Australia's Online Safety Amendment (Social Media Minimum Age) Act 2024, which requires social media platforms to take reasonable steps to prevent under-16s from holding accounts, takes effect alongside Online Safety Codes demanding clearer age verification and greater accountability from digital platforms. Regulators argue that children face significant online risks, from predatory behaviour to exposure to harmful content, and that platforms must do more to restrict access.
Authorities have warned that companies failing to comply face penalties of up to A$49.5 million, a threat that has prompted Meta to tighten its age-check processes.
How Meta Plans to Identify Underage Users
Meta says it will use a combination of technology and user-driven verification to enforce the new rule:
- AI-based age detection tools analysing activity patterns and profile behaviour
- Government-approved identity checks where necessary
- Facial age estimation technology in partnership with third-party verification firms
- User and parent reporting mechanisms for suspected underage accounts
Accounts flagged as potentially belonging to under-16 users will be suspended until age is confirmed. If verification fails, the accounts will be removed entirely.
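Meta has not published the technical details of this pipeline, but the flow described above (flag, suspend, verify, then restore or remove) can be made concrete. The Python sketch below is purely illustrative: the function names, the dummy return values, and the threshold logic are hypothetical stand-ins for the steps Meta describes, not the company's actual systems.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Status(Enum):
    ACTIVE = auto()
    SUSPENDED = auto()   # flagged, awaiting age confirmation
    REMOVED = auto()     # verification failed or was declined

@dataclass
class Account:
    user_id: str
    status: Status = Status.ACTIVE

MINIMUM_AGE = 16

def estimated_age_from_behaviour(account: Account) -> float:
    """Hypothetical stand-in for an AI model scoring activity
    patterns and profile behaviour."""
    return 14.5  # dummy value for the sketch

def verified_age(account: Account) -> float | None:
    """Hypothetical stand-in for an ID check or facial age
    estimation; returns None if the user fails or declines."""
    return None  # dummy value for the sketch

def enforce(account: Account, reported: bool = False) -> Account:
    # Step 1: flag via AI age estimation or a user/parent report.
    if reported or estimated_age_from_behaviour(account) < MINIMUM_AGE:
        account.status = Status.SUSPENDED
        # Step 2: a suspended account must confirm age to be restored.
        age = verified_age(account)
        if age is not None and age >= MINIMUM_AGE:
            account.status = Status.ACTIVE
        else:
            # Step 3: failed or refused verification ends in removal.
            account.status = Status.REMOVED
    return account

print(enforce(Account("example-user")).status)  # Status.REMOVED in this dummy run
```

The key design point in the described process is that suspension is reversible while removal is not: an account leaves the suspended state only through a successful age check.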
Impact on Australian Teens and Parents
Industry analysts estimate that the changes will affect hundreds of thousands of young users, many of whom joined the platforms before strict age checks were enforced. Parents have shown mixed reactions: some welcome the additional safety layers, while others worry about limiting online social interaction or access to support communities.
Digital experts caution that removing underage users without offering alternative youth-safe platforms may push them toward unregulated apps.
Meta Says Safety Is the Priority
In a statement, Meta said the update aims to create a “safer and more age-appropriate online environment” for young Australians. The company acknowledged that new verification measures may feel intrusive to some users but insists the changes are necessary to align with emerging global standards on youth protection.
Australia Continues to Lead in Online Safety Regulation
Australia has been at the forefront of global online safety enforcement, imposing strict codes on social media giants and empowering the eSafety Commissioner to demand rapid platform reforms. Its minimum-age approach may set a precedent for Europe and the U.S., where lawmakers are considering heightened digital safety rules for minors.