In a significant step toward safeguarding young users, Meta has launched its Teen Accounts initiative on Facebook and Messenger, extending a framework first introduced on Instagram in September 2024. The rollout, which began this week in the United States, United Kingdom, Australia, and Canada, aims to create a safer digital space for teens by curbing exposure to harmful content and unsolicited interactions. Meta has pledged to bring these features to more countries in the near future.
The Teen Accounts system automatically enrolls users under 16 into a restricted environment. Key features include blocking messages from strangers, so teens can communicate only with existing friends or people they have previously contacted, and limiting who can view or engage with their stories, tags, and comments. To encourage healthy usage habits, teens will receive reminders to log off after an hour of daily activity and will be placed in "Quiet mode" overnight, during hours Meta has not publicly specified.
Parental oversight is a cornerstone of the update. Teens under 16 must obtain parental consent to modify default safety settings, such as disabling nudity filters in direct messages or enabling live streaming on Instagram. “We’re empowering parents to guide their teens’ online experiences,” Meta stated in a blog post, emphasizing its commitment to adolescent well-being.
The rollout follows growing scrutiny from global regulators and mental health advocates over social media's effects on young people. Since its Instagram debut, over 54 million teens have been enrolled in Teen Accounts, and 97% of 13- to 15-year-olds have kept the protective settings in place, according to company data. An Ipsos survey commissioned by Meta found that 94% of parents considered the features valuable.
As Meta integrates these tools into Facebook and Messenger, the company hopes to set a new standard for teen safety across its platforms, blending innovation with accountability to address a pressing societal challenge.