April 9, 2025

Meta expands teen safety features across platforms


Meta announced a new set of safety features for teen users across Instagram, Facebook and Messenger, part of an ongoing effort to create a safer digital environment for adolescents on its platforms.

The updates, announced Tuesday, will roll out over the next few months and include expanded restrictions for users under 16.

Separately, children must be at least 13 years old to create an account on any Meta-owned platform, in compliance with the Children's Online Privacy Protection Act (COPPA).

Meta introduced Teen Accounts last year in response to growing concerns about social media's impact on youth mental health and safety.

A 2023 U.S. Surgeon General advisory warned that social platforms can expose adolescents to harmful content, cyberbullying, and exploitation. The expansion of these protections comes as lawmakers and regulators in the U.S. and Europe continue to scrutinize how tech companies safeguard young users.

Expert concerns over gaps in Meta's approach

While the updates mark a continuation of Meta's safety efforts, some online safety experts say the changes fall short of addressing deeper risks.

Titania Jordan, chief marketing officer and chief parent officer of Bark Technologies, an online safety company focused on kids, said she's concerned the new protections don't apply to children who register accounts using fake ages or who maintain hidden profiles.

"My concern is for the kids who have an adult account after entering in a fake age, which Instagram still makes incredibly easy to do," Jordan tells "Good Morning America." "These restrictions do nothing for them. Or for kids whose parents may not know about secret additional Instagram accounts."

Instagram introduces mandatory 'Teen Accounts' with built-in limits, parental controls

Jordan added that Meta has not clarified whether adult users can still send DM requests to teens, nor addressed algorithmic practices that could make youth accounts more visible to potential predators.

She also questioned the practical reach of the updates on Facebook and Messenger, saying, "This energy could have been directed more toward additional Instagram protection, an app that's far more popular with young people."

PHOTO: Stock photo of a group of teens using their smartphones. (Daniel De La Hoz/Getty Images)

In an email to "GMA" Meta said that it has “significantly changed content controls so that teens are recommended less potentially sensitive content” and that it continues to evolve how teen profiles are presented and surfaced to other users.

Meta also stated that "adults cannot send DM requests to teens who don't follow them," reiterating that this has been a key element of its existing safety policies.

In terms of its reach with young people across all platforms, Meta noted that “millions of teens use Facebook and Messenger, especially in regions where data access is more limited.” The company also emphasized that the current announcement includes “new, additional Instagram protections,” not just a cross-platform expansion.

This story has been updated with additional information from Meta.