Meta launches Instagram Teen Accounts to address social media concerns

Meta on Tuesday launched Instagram Teen Accounts, a more limited experience for younger users of the platform, in what is the technology company’s latest effort to assuage concerns about the impact of social media on kids. 

Meta will automatically migrate all Instagram users under the age of 16 to the new service, which features built-in protections through settings controlled by their parents. The move is designed to address mounting criticism that social media can harm young people’s mental health, as well as to put parents at ease about the kind of content their children are exposed to and who is able to interact with them. 

User profiles on Teen Accounts are automatically made private and can be viewed only by followers whose requests the teen has accepted. The new tool also places restrictions on messaging, allowing parents to see who their kids are communicating with, and includes a feature that silences notifications at night. Such features may be deactivated, but only with a parent’s permission. 

“We know parents want to feel confident that their teens can use social media to connect with their friends and explore their interests, without having to worry about unsafe or inappropriate experiences,” Meta said in a statement Tuesday. “We understand parents’ concerns, and that’s why we’re reimagining our apps for teens with new Teen Accounts.” 

Beyond giving caregivers more control over a child’s Instagram usage, a new “Explore” feature lets teens select topics they want to see more of, according to Meta. 

Facing legal pressure to change

Antigone Davis, global head of safety at Meta, told CBS News that Meta designed Teen Accounts in consultation with parents of teens and that the changes will affect tens of millions of Instagram users. Although Meta has made incremental changes over the years, the new service “standardizes the experience,” she said.

“It gives parents peace of mind. Their teens are in a certain set of protections,” Davis said, adding that Meta is seeking to “reimagine how parents and teens interact online.” 


In 2023 dozens of states sued Meta, alleging the company deliberately engineered Instagram and Facebook to be addictive to young users in a bid to boost profits. The lawsuits also accused Meta of collecting data on children under 13 without their parents’ consent, a violation of federal law.

Meta has denied such allegations, saying that it is focused on providing teens with “positive experiences online” and that it has introduced dozens of tools aimed at making social media safer for teens. 

How will Teen Accounts be enforced?

With Teen Accounts, users under 16 need their parents’ permission to roll back restrictions, according to Meta. An additional feature lets parents further shape their teens’ online experiences by showing who they’re messaging and how much time they’re spending on the platform. Parents can also block teens from accessing Instagram during certain times of day.

To keep teens honest, Meta is asking them to verify their ages by uploading an ID card or by using a tool called Yoti, which analyzes a person’s facial features to determine whether they appear to be under or over 18. 

Teens will be notified that their accounts are being migrated into Teen Accounts. The transition is expected to take place over 60 days in the U.S., U.K., Canada and Australia. 

—CBS News’ Jo Ling Kent contributed to this report
