EU Flags Meta’s Child Safety Failure: Under-13 Users Still Slipping Into Facebook and Instagram

Sapatar / Updated: Apr 29, 2026, 17:23 IST

The European Union has concluded that Meta is falling short in enforcing its own minimum age rules, allowing children under 13 to access Facebook and Instagram. The findings, emerging from ongoing regulatory reviews, suggest that existing safeguards—largely based on self-declared age inputs—are insufficient to prevent underage sign-ups.

Under both platforms’ policies, users must be at least 13 years old to create an account. However, EU regulators argue that Meta’s systems lack robust verification layers, making it easy for minors to bypass restrictions by simply entering a false birth date.


Digital Services Act Puts Pressure on Big Tech

The issue is not just about policy—it directly ties into the EU’s Digital Services Act (DSA), which entered into force to impose stricter accountability on large digital platforms. Under the DSA, companies like Meta are categorized as “Very Large Online Platforms” (VLOPs), meaning they face heightened scrutiny, especially regarding user safety and risk mitigation.

Failure to comply can lead to fines of up to 6% of global annual turnover, a significant threat even for a company of Meta’s scale. Regulators are particularly focused on how platforms assess and mitigate risks related to minors, including exposure to harmful content, addictive design patterns, and data privacy concerns.


Why Age Verification Remains a Weak Link

At the core of the problem is a long-standing industry challenge: verifying user age without compromising privacy or creating friction. Meta currently relies heavily on:

  • Self-reported age during sign-up
  • AI-based detection of suspicious accounts
  • Reporting mechanisms for underage users

However, experts say these measures are reactive rather than preventive. Without stronger identity verification—such as document checks or biometric estimation—platforms remain vulnerable to misuse.
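The weakness of self-declared age gating is easy to illustrate. Below is a minimal, hypothetical Python sketch of such a check—not Meta’s actual implementation—showing that the gate trusts whatever birth date the user types in, so entering an earlier year bypasses it entirely:

```python
from datetime import date

MIN_AGE = 13  # the stated minimum age for Facebook and Instagram accounts

def age_on(birth_date: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    years = today.year - birth_date.year
    # Subtract one if the birthday hasn't occurred yet this year.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def passes_age_gate(claimed_birth_date: date, today: date) -> bool:
    """A self-declared age gate: it simply trusts the date the user enters."""
    return age_on(claimed_birth_date, today) >= MIN_AGE

today = date(2026, 4, 29)
# An 11-year-old entering their real birth date is blocked...
print(passes_age_gate(date(2015, 3, 1), today))  # False
# ...but the same child entering any earlier year sails through.
print(passes_age_gate(date(2005, 3, 1), today))  # True
```

Because there is no verification step behind the form field, the check only filters out honest under-13s—precisely the “reactive rather than preventive” gap regulators are pointing to.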

At the same time, stricter verification raises its own concerns around data protection, especially in Europe where GDPR sets high standards for handling personal information.


Expert Insight: A Structural Problem Across Platforms

Industry analysts note that Meta is not alone in facing this issue. Nearly all major social media platforms struggle with enforcing age restrictions effectively.

“Age gating on the internet has always been more symbolic than secure,” says a digital policy expert familiar with EU proceedings. “The difference now is that regulators are no longer willing to accept that limitation.”

The EU’s stance signals a shift from passive compliance to proactive enforcement, where companies are expected to demonstrate measurable effectiveness—not just intent.


Potential Consequences for Meta

If the EU determines that Meta has failed to meet DSA obligations, the company could face:

  • Financial penalties
  • Mandatory changes to platform design
  • Stricter auditing and reporting requirements
  • Increased oversight from EU regulators

More importantly, repeated violations could damage Meta’s credibility at a time when trust in social media platforms is already under pressure.


What This Means for Users and Parents

For everyday users, especially parents, the findings reinforce a critical reality: platform safeguards are not foolproof. While companies provide tools like parental controls and reporting systems, much of the responsibility still falls on guardians to monitor online activity.

For young users, early exposure to social media raises concerns around mental health, privacy, and content safety—issues that regulators are now prioritizing more aggressively.


The Bigger Picture: A Turning Point in Platform Accountability

The EU’s findings against Meta are part of a broader global trend toward stricter tech regulation. Governments are increasingly unwilling to rely on self-regulation by platforms, especially when it comes to protecting minors.

The takeaway is clear: compliance is no longer about having policies in place—it’s about proving they work. For Meta and others in the industry, the next phase will require deeper investment in safety infrastructure, transparency, and accountability.