TECH TIMES NEWS

EU Flags Meta Over Failure to Block Under-13 Users on Facebook and Instagram

Deepika Rana / Updated: Apr 29, 2026, 17:22 IST

The European Union has concluded that Meta is falling short in enforcing its own age restrictions, allowing children under 13 to access Facebook and Instagram despite longstanding policies prohibiting such use. The findings come as part of broader regulatory scrutiny under Europe’s evolving digital governance framework, particularly the Digital Services Act (DSA).

At the core of the issue is a gap between policy and enforcement. While Meta officially bars users under 13, regulators argue that its current verification mechanisms are too weak to effectively prevent underage sign-ups.


What the EU Investigation Found

According to officials familiar with the review, the EU identified systemic weaknesses in how Meta verifies user age during account creation. In many cases, the process relies heavily on self-declaration—an approach widely seen as easy to bypass.

The investigation suggests that:

  • Underage users can still create accounts with minimal friction
  • Age verification lacks robust cross-checking mechanisms
  • Existing safeguards do not adequately detect or remove underage accounts

These shortcomings raise concerns not just about compliance, but about minors' exposure to harmful or inappropriate content.


Why This Matters: Beyond Policy to Real-World Risks

The implications go far beyond technical compliance. Social media platforms are increasingly under pressure to ensure safer environments for young users, especially as concerns grow around:

  • Mental health impacts
  • Exposure to harmful or addictive content
  • Data privacy and targeted advertising

For regulators, the issue is simple: if a platform claims to restrict access, it must prove that the restriction works in practice—not just on paper.


Meta’s Position and Ongoing Efforts

Meta has previously stated that it is investing in advanced age verification tools, including AI-based systems and partnerships aimed at improving detection of underage users. The company has also introduced parental controls and teen-specific safety features in recent years.

However, the EU’s findings indicate that these measures may not yet be sufficient to meet regulatory expectations, especially under stricter frameworks like the DSA, which demands proactive risk mitigation rather than reactive fixes.


Regulatory Pressure Is Only Increasing

This development is part of a broader shift in how governments—particularly in Europe—are approaching Big Tech. The Digital Services Act places clear obligations on platforms to:

  • Assess and mitigate systemic risks
  • Protect vulnerable users, including minors
  • Maintain transparency in safety practices

Failure to comply can lead to significant penalties, including fines tied to global revenue.


What Happens Next

While the EU has not yet announced final penalties, the findings could trigger formal enforcement actions if Meta is deemed non-compliant. That may include:

  • Mandatory changes to age verification systems
  • Increased monitoring and audits
  • Financial penalties

For Meta, the challenge is not just regulatory—it’s reputational. Trust around user safety, particularly involving children, remains a sensitive and high-stakes issue.


The Bigger Picture for Users and Industry

This case underscores a growing reality: self-regulation is no longer enough for major tech platforms. Governments are stepping in with stricter rules, and enforcement is becoming more aggressive.

For users and parents, it reinforces the importance of awareness and active supervision. For the industry, it signals a shift toward accountability that could reshape how platforms design onboarding, verification, and safety systems.