Social media given ‘last chance’ to tackle illegal posts


Online platforms must begin assessing whether their services expose users to illegal material by 16 March 2025 or face financial punishments as the Online Safety Act (OSA) begins taking effect.

Ofcom, the regulator enforcing the UK’s internet safety law, published its final codes of practice for how firms should deal with illegal online content on Monday.

Platforms have three months to carry out risk assessments identifying potential harms on their services or they could be fined up to 10% of their global turnover.

Ofcom head Dame Melanie Dawes told BBC News this was the “last chance” for industry to make changes.

“If they don’t start to seriously change the way they operate their services, then I think those demands for things like bans for children on social media are going to get more and more vigorous,” she said.

“I’m asking the industry now to get moving, and if they don’t they will be hearing from us with enforcement action from March.”

Under Ofcom’s codes, platforms will need to identify if, where and how illegal content might appear on their services, and the ways they will stop it reaching users.

According to the OSA, this includes content relating to child sexual abuse material (CSAM), controlling or coercive behaviour, extreme sexual violence, promoting or facilitating suicide and self-harm.

But critics say the Act fails to tackle a wide range of harms for children.

The Molly Rose Foundation – set up in memory of teenager Molly Russell, who took her own life in 2017 after being exposed to self-harm images on social media – said the OSA has “deep structural issues”.

Andy Burrows, its chief executive, said the organisation was “astonished and disappointed” by a lack of specific, targeted measures for platforms on dealing with suicide and self-harm material in Ofcom’s guidance.

“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life,” he said.

And children’s charity the NSPCC has also voiced its concerns.

“We are deeply concerned that some of the largest services will not be required to take down the most egregious forms of illegal content, including child sexual abuse material,” said acting chief Maria Neophytou.

“Today’s proposals will at best lock in the inertia to act, and at worst create a loophole which means services can evade tackling abuse in private messaging without fear of enforcement.”

The OSA became law in October 2023, following years of wrangling by politicians over its detail and scope, and campaigning by people concerned over the impact of social media on young people.

Ofcom began consulting on its illegal content codes that November, and says it has now “strengthened” its guidance for tech firms in several areas.

Ofcom codes

Ofcom says its codes include greater clarity around requirements to take down intimate image abuse content, and more guidance on how to identify and remove material related to women being coerced into sex work.

The codes also include child safety features, such as requiring social media platforms to stop suggesting that people befriend children’s accounts, and warnings about the risks of sharing personal information.

Certain platforms must also use a technology called hash-matching to detect CSAM – a requirement that now applies to smaller file hosting and storage sites.

Hash matching is where media is given a unique digital signature which can be checked against hashes belonging to known content – in this case, databases of known CSAM.
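In outline, the idea can be sketched as follows. This is a minimal illustration only: it uses an ordinary cryptographic hash (SHA-256) and a hypothetical in-memory database, whereas real deployments use perceptual hashing systems such as PhotoDNA against vetted databases, so that near-duplicate images still match.

```python
import hashlib

# Hypothetical database of signatures of known content (illustration only).
# This entry is the SHA-256 hash of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Compute a unique digital signature for a piece of media."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Check an uploaded file's signature against the database of known hashes."""
    return file_hash(data) in KNOWN_HASHES

# An exact copy of known content matches; any other file does not.
print(matches_known_content(b"test"))   # True
print(matches_known_content(b"other"))  # False
```

Because only hashes are compared, the platform never needs to store or redistribute the original illegal material to recognise copies of it.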

Many large tech firms have already brought in safety measures for teenage users and controls to give parents more oversight of their social media activity in a bid to tackle dangers for teens and pre-empt regulations.

For instance, on Facebook, Instagram and Snapchat, users under the age of 18 cannot be discovered in search or messaged by accounts they do not follow.

In October, Instagram also started blocking some screenshots in direct messages to try and combat sextortion attempts – which experts have warned are on the rise, often targeting young men.

Technology Secretary Peter Kyle said Ofcom’s publication of its codes was a “significant step” towards the government’s aim of making the internet safer for people in the UK.

“These laws mark a fundamental reset in society’s expectations of technology companies,” he said.

“I expect them to deliver and will be watching closely to make sure they do.”

‘Snail’s pace’

Concerns have been raised throughout the OSA’s journey over its rules applying to a huge number of varied online services – with campaigners also frequently warning about the privacy implications of platform age verification requirements.

And parents of children who died after exposure to illegal or harmful content have previously criticised Ofcom for moving at a “snail’s pace”.

The regulator’s illegal content codes will still need to be approved by parliament before they can come fully into force on 17 March.

But platforms are being told to act now, on the presumption that the codes will pass through parliament without issue, and firms must have measures in place to prevent users from accessing outlawed material by this date.
