Instagram makes teen accounts private as pressure mounts

Instagram is making teen accounts private by default as it tries to make the platform safer for children amid a growing backlash against how social media affects young people’s lives.

Beginning on Tuesday in the United States, United Kingdom, Canada and Australia, anyone under 18 who signs up for Instagram will be placed into restrictive teen accounts, and those with existing accounts will be migrated over in the next 60 days. Teens in the European Union will see their accounts adjusted later this year.

Meta acknowledged that teenagers may lie about their age and said it will require them to verify their ages in more instances, such as if they try to create a new account with an adult birthday. The Menlo Park, California-based company also said it is building technology that proactively finds teen accounts that pretend to be grown-ups and automatically places them into the restricted teen accounts.

Private messages in these accounts are restricted so teens can only receive them from people they follow or are already connected to. “Sensitive content”, such as videos of people fighting or those promoting cosmetic procedures, will be limited, Meta said. Teens will also get notifications if they are on Instagram for more than 60 minutes and a “sleep mode” will be enabled that turns off notifications and sends autoreplies to direct messages from 10pm until 7am.

While these settings will be turned on for all teens, 16- and 17-year-olds will be able to turn them off. Kids under 16 will need their parents’ permission to do so.

“The three concerns we’re hearing from parents are that their teens are seeing content that they don’t want to see or that they’re getting contacted by people they don’t want to be contacted by or that they’re spending too much time on the app,” said Naomi Gleit, head of product at Meta. “So teen accounts is really focused on addressing those three concerns.”

The announcement comes as the company faces lawsuits from dozens of US states that accuse it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

New York Attorney General Letitia James said Meta’s announcement was “an important first step, but much more needs to be done to ensure our kids are protected from the harms of social media”. James’s office is working with other New York officials on how to implement a new state law intended to curb children’s access to what critics call addictive social media feeds.

Giving parents more options

In the past, Meta’s efforts to address teen safety and mental health on its platforms have been met with criticism that the changes don’t go far enough. For instance, while kids will get a notification when they’ve spent 60 minutes on the app, they will be able to bypass it and continue scrolling.

That’s unless the child’s parents turn on “parental supervision” mode, through which parents can limit teens’ time on Instagram to a specific amount, such as 15 minutes.

With the latest changes, Meta is giving parents more options to oversee their children’s accounts. Those under 16 will need a parent’s or guardian’s permission to change their settings to less restrictive ones. They can do this by setting up “parental supervision” on their accounts and connecting them to a parent or guardian.

Nick Clegg, Meta’s president of global affairs, said last week that parents don’t use the parental controls the company has introduced in recent years.

Gleit said she thinks teen accounts will create a “big incentive for parents and teens to set up parental supervision”.

“Parents will be able to see, via the family centre, who is messaging their teen and hopefully have a conversation with their teen,” she said. “If there is bullying or harassment happening, parents will have visibility into who their teen’s following, who’s following their teen, who their teen has messaged in the past seven days and hopefully have some of these conversations and help them navigate these really difficult situations online.”

US Surgeon General Vivek Murthy said last year that tech companies put too much responsibility on parents when it comes to keeping children safe on social media.

“We’re asking parents to manage a technology that’s rapidly evolving, that fundamentally changes how their kids think about themselves, how they build friendships, how they experience the world — and technology, by the way, that prior generations never had to manage,” Murthy said in May 2023.