Meta has announced new safety measures for teens across its platforms, including Instagram, Facebook, and Messenger. Starting now, teens under 16 will need parental permission to use livestreaming features on Instagram. The update is part of a wider effort to improve digital safety for young users. Meta is rolling out these changes in the US, UK, Canada, and Australia as the tech giant responds to growing calls for stronger online protections.
New Livestreaming Rules for Teens
Teens under 16 will no longer be able to go live on Instagram without a parent’s approval. This move aims to prevent harmful or inappropriate content from being shared by younger users.
In addition, Meta is expanding its image-blurring feature in direct messages. This tool hides photos suspected to show nudity. Teens will now need a guardian’s consent to turn this setting off.
These changes are part of a broader update to teen account controls across all Meta platforms.
What’s Changing on Facebook and Messenger?
Meta has started rolling out teen-focused features on Facebook and Messenger. These include:
- Default safety settings for users under 18
- Time limits set by parents or guardians
- Blocked access during certain hours (like late at night)
- Parental insight into who their child is messaging
Teens aged 16 and 17 will still have some control over these settings, but younger users will need guardian permission to make changes.
Meta says the goal is to make it easier for parents to support their children’s safety online, without completely removing teens’ ability to connect with friends and communities.
Why Meta Is Making These Changes
The announcement comes at a time when major tech platforms face mounting pressure to protect younger users. The UK recently began enforcing its Online Safety Act, which requires platforms to remove harmful content—especially material related to abuse, self-harm, fraud, or terrorism.
Meta reports that 90% of Instagram users aged 13 to 15 already use the default safety settings, and more than 54 million users under 18 worldwide now have teen accounts on Meta platforms.
These numbers suggest progress, but safety experts say more needs to be done.
Experts React to the Update
Child safety advocates have welcomed the changes. The NSPCC, one of the UK’s top child protection groups, said the move is a step in the right direction. However, the group also warned that tech companies must take even stronger steps.
“To be truly effective, platforms must also prevent dangerous content from appearing in the first place,” said Matthew Sowemimo, NSPCC’s Head of Policy.
Parents and educators have long called for easier tools to monitor teen activity and protect children from harmful content online. These updates offer a stronger foundation, but they may not be enough on their own.
A Push Toward Greater Accountability
Meta says these changes are designed to “shift the balance in favour of parents.” The company believes many families still don’t know about existing controls or how to use them.
Nick Clegg, Meta’s former global affairs chief, emphasized the need to empower families. He noted that the digital world has moved faster than many people can keep up with, and it’s time to give parents the tools to guide their children safely.
The company is also running educational campaigns to help families understand new features and how to use them.
What’s Next for Teen Safety Online?
Meta’s changes may lead the way for more platforms to introduce similar controls. As laws like the UK’s Online Safety Act take effect, social media companies could face fines or other penalties if they fail to act.
Other countries, including the US and Canada, are also considering new laws that could push tech firms to protect children more proactively.
As young people spend more time online, safety tools must keep up. Meta's latest move is a key step toward building a safer internet for teens. But experts agree that lasting change will require both strong technology and active involvement from parents.