Social media and video sharing platforms put on notice over poor children’s privacy practices

We are calling on 11 social media and video sharing platforms to improve their children’s privacy practices. Where platforms do not comply with the law, they will face enforcement action.

This follows our ongoing review of social media platforms (SMPs) and video sharing platforms (VSPs) as part of our Children’s Code Strategy. Our Tech Lab reviewed 34 SMPs and VSPs, focusing on the process young people go through to sign up for accounts.

The review found varying levels of adherence to our Children’s code, with some platforms not doing enough to protect children’s privacy.

Following concerns raised by the review, 11 of the 34 platforms are being asked about issues relating to default privacy settings, geolocation or age assurance, and to explain how their approach conforms with the code.

We are also speaking to some of the platforms about targeted advertising, setting out our expectations for changes to ensure their practices are in line with both the law and the code.

Emily Keaney, Deputy Commissioner, said:

“There is no excuse for online services likely to be accessed by children to have poor privacy practices. Where organisations fail to protect children’s personal information, we will step in and take action.

“Online services and platforms have a duty of care to children. Poorly designed products and services can leave children at risk of serious harm from abuse, bullying and even loss of control of their personal information.”

We also identified areas where further evidence is needed to improve understanding of how these services are impacting children’s privacy. We are launching a call for interested stakeholders, including online services, academics and civil society, to share their views and evidence on two areas of children’s privacy:

  • How children’s personal information is currently being used in recommender systems (algorithms that use people’s details to learn their interests and preferences in order to deliver content to them); and
  • Recent developments in the use of age assurance to identify children under 13 years old.

The evidence gathered will be used to inform our ongoing work to secure further improvements in how SMPs and VSPs protect children’s privacy.

As the SMP and VSP market is constantly changing and children’s internet usage continues to evolve, the Tech Lab review included a mixture of new and established providers whose services are accessible to under-18s.

The project is part of our ongoing work on children’s privacy, which has pushed industry to make significant changes so that children benefit from a more positive online experience, helping the UK’s fast-moving technology sector look after young people’s information.

Our work supports platforms to develop services that recognise and cater for the fact that children warrant special protection in how their personal information is used, while also offering them plenty of opportunity to explore and develop online.

Ms Keaney added:

“Our world-leading Children’s code has helped stop targeted advertising at children on some of the biggest social media platforms. The code has even encouraged other areas, including tech-famous California, to create their own codes. We’re now building on the code’s achievements to gather more evidence and push for further changes.”

Further details of what we have found have been published in the Children’s Code Strategy update.

Notes to editors
  • Children’s privacy is a priority area of work for the ICO and more can be found on our Children’s code strategy web pages.
  • The call for evidence will help build the ICO’s knowledge in this area, with responses and a summary report shared at a later date. 
  • The Children’s code explains how organisations can make sure their digital services safeguard children’s personal information, giving them online experiences that are appropriate for their age. Platforms are responsible for complying with the UK General Data Protection Regulation (UK GDPR).
  • The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals. 
  • The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations. 
  • The ICO can take action to address and change the behaviour of organisations and individuals that collect, use, and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit. 
  • To report a concern to the ICO, telephone our helpline on 0303 123 1113 or go to ico.org.uk/concerns.
