Meta is launching Teen Accounts on Facebook and Messenger to protect young users. The feature automatically places teens into accounts with built-in safety settings and will roll out first in the U.S., U.K., Australia, and Canada, with other regions to follow.
Built-in Protections for Teen Accounts
Teen Accounts come with built-in protections, such as restricting who can contact teens and view their content. Teens under 16 must get parental permission to change these settings. They can only receive messages from people they follow or have previously messaged, and only friends can see and reply to their stories.
New Restrictions for Instagram
On Instagram, teens under 16 will need parental approval to go live or to turn off the feature that blurs potentially inappropriate images in direct messages. These restrictions are meant to give parents more oversight of how teens use live video and direct messaging.
Focus on Teen Mental Health and Safety
These changes are part of Meta's response to growing concerns from lawmakers and health professionals about social media's impact on teen mental health. By limiting who can contact teens and what content they see, Meta aims to create a safer online environment for young people.
Positive Feedback from Parents
Meta reports that 54 million teens have already been moved into Teen Accounts on Instagram. In a survey cited by the company, 94% of parents said these accounts are helpful for keeping their teens safe, and 85% said the protections make it easier to guide their teens toward positive online experiences. Meta plans to continue improving these features based on feedback.