Meta on Tuesday announced new “built-in protections” for its Instagram Teen Accounts, including a requirement that parents consent before children under 16 can go “live” on the platform or unblur images containing suspected nudity that they receive in direct messages.
The updates expand on restrictions Meta rolled out last year as part of its Instagram Teen Accounts program, which came in response to heightened concern about the harmful effects of social media on children’s and teens’ mental health.
With the new features announced Tuesday, teens under 16 will be prevented from using the Instagram Live feature without parental consent. They will also be required to get their parents’ permission to turn off a feature that blurs images containing suspected nudity in direct messages.
Meta said the updates will be available in the next couple of months.
Meta also announced on Tuesday that it will expand its teen account program to Facebook and Messenger. Teen accounts will first be available in the U.S., UK, Australia and Canada, before expanding to “other regions soon.”
Meta said the Facebook and Messenger Teen Accounts will include features similar to those in the Instagram Teen Accounts, which launched in September.
The Instagram Teen Account program includes expanded protections for users under 18, such as making accounts private by default, allowing direct messages only from people they follow or are connected to, and limiting the sensitive content young users see. Users also get notified when they’ve been on the app for more than 60 minutes, and “sleep mode” is enabled at night to disable notifications and auto-reply to direct messages.
These features are automatically turned on for all teen accounts, but 16- and 17-year-olds can disable the features themselves, and children under 16 can do so with parental consent.
Meta touted the success of the program, saying at least 54 million teen accounts have been active globally since the program launched in September.