In a move aimed at enhancing online safety for minors, Instagram has announced a major policy update that will restrict users under the age of 16 from livestreaming unless they have explicit parental or guardian consent. The change is part of Meta's broader effort to create a safer digital environment for young users amid growing global concerns about child privacy and internet safety.
New Safeguards for Young Users
The new rule, which will roll out globally over the coming weeks, is designed to give parents more oversight over their children’s online activity, especially in interactive features like livestreaming. According to Instagram, minors attempting to go live will now see a prompt asking them to confirm whether they have the required parental permission — and accounts suspected of non-compliance may face feature restrictions or further verification steps.
“We’re committed to building a platform that prioritizes safety and responsibility,” said a spokesperson from Instagram. “Livestreaming is a powerful tool for self-expression, but it also comes with risks. By requiring parental consent for younger users, we’re taking a meaningful step to ensure more age-appropriate use of the platform.”
Why This Matters
Livestreaming has become increasingly popular among younger users, but it also presents challenges — from online harassment and exposure to inappropriate content to the potential for oversharing personal information in real time. Advocacy groups and regulators have repeatedly urged social platforms to bolster protections for children, particularly around features that allow real-time interaction with strangers.
This latest move follows several policy adjustments Instagram has implemented in recent years, including defaulting accounts for users under 18 to private settings and limiting how advertisers can target minors.
Growing Regulatory Pressure
Instagram’s new policy arrives amid rising scrutiny from lawmakers in the U.S., Europe, and other regions, who are pushing tech companies to take more accountability for how their platforms impact young users’ mental health and safety. The European Union’s Digital Services Act (DSA) and proposed U.S. legislation like the Kids Online Safety Act (KOSA) have intensified the pressure on social media companies to enforce stricter content moderation and age controls.
What's Next?
Instagram has not specified how it will verify parental consent but hinted that new in-app tools and prompts will be introduced to facilitate the process. Industry analysts suggest that Meta may eventually integrate age verification technology more widely across its apps as part of a comprehensive child safety initiative.
In the meantime, parents are encouraged to talk with their children about the risks and responsibilities of livestreaming — and to use the platform's parental control tools to stay involved in their kids' digital lives.