Online harms regulator Ofcom has published its first code of practice for tackling illegal harms under the Online Safety Act (OSA), giving businesses three months to prepare before enforcement begins in March 2025.
Published on 16 December 2024, Ofcom’s Illegal Harms Codes and guidance outline the steps providers should take to address illegal harms on their services.
This includes nominating a senior executive to be accountable for OSA compliance, properly funding and staffing content moderation teams, improving algorithmic testing to limit the spread of illegal content, and removing accounts that are run by, or on behalf of, terrorist organisations.
Covering more than 100,000 online services, the OSA applies to search engines and firms that publish user-created content, and contains 130 “priority offences” covering a variety of content types – including child sexual abuse, terrorism and fraud – that firms will need to proactively tackle through their content moderation systems.
With the publication of the codes, providers now have a deadline of 16 March 2025 to fulfil their legal duty to assess the risk of illegal harms taking place on their services, after which they will immediately be expected to implement the safety measures set out in the codes, or use other effective measures to protect their users.
Ofcom has said it is ready to take enforcement action if providers do not act promptly to address the risks on their services. Under the OSA, failure to comply with its measures – including a failure to complete the risk assessment process within the three-month timeframe – could see firms fined up to 10% of their global revenue or £18m (whichever is greater).
“For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today,” said Ofcom chief executive Melanie Dawes.
“The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”
Technology secretary Peter Kyle – who set out his draft Statement of Strategic Priorities (SSP) to the regulator in November 2024 – described the codes as a “material step change in online safety”, meaning platforms will have to proactively take down a host of illegal content.
“This government is determined to build a safer online world, where people can access its immense benefits and opportunities without being exposed to a lawless environment of harmful content,” he said.
“If platforms fail to step up the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites. These laws mark a fundamental re-set in society’s expectations of technology companies. I expect them to deliver and will be watching closely to make sure they do.”
While the SSP is set to be finalised in early 2025, the current version contains five focus areas: safety by design, transparency and accountability, agile regulation, inclusivity and resilience, and innovation in online safety technologies.
Under the OSA, Ofcom will have to report back to the secretary of state on the actions it has taken against these priorities to ensure the laws are delivering safer spaces online; this reporting will then be used to inform next steps.
Ofcom said it will be holding a further consultation in spring 2025 to expand the codes, which will include looking at proposals on banning accounts that share child sexual abuse material, crisis response protocols for emergency events such as the August 2024 riots in England, and the use of “hash matching” to prevent the sharing of non-consensual intimate imagery and terrorist content.
Under Clause 122 of the OSA, Ofcom has the power to require messaging service providers to develop and deploy software that scans phones for illegal material. Known as client-side scanning, this method hashes content on a user’s device before it is encrypted and sent, and compares those hash values against a database of hash values of known illegal content held on the device.
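For illustration only, the sketch below shows the basic hash-matching idea in Python, assuming exact SHA-256 hashes and a placeholder on-device hash database; real deployments typically rely on perceptual hashing so that near-duplicate images still match, and neither the OSA nor Ofcom’s codes prescribe a particular implementation.

```python
import hashlib

# Placeholder standing in for an on-device database of hashes of known
# illegal content; the single entry below is illustrative only (it is the
# SHA-256 digest of the string "blocked example").
KNOWN_CONTENT_HASHES = {
    hashlib.sha256(b"blocked example").hexdigest(),
}

def matches_known_content(payload: bytes) -> bool:
    """Hash the content on the device, before encryption, and check the
    digest against the local database of known-content hashes."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_CONTENT_HASHES

# An outgoing message is checked locally before it is encrypted and sent.
print(matches_known_content(b"blocked example"))   # True  - flagged
print(matches_known_content(b"ordinary message"))  # False - passes the check
```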
Encrypted communication providers have said Ofcom’s power to require blanket surveillance in private messaging apps in this fashion would “catastrophically reduce safety and privacy for everyone”.
Responding to the publication of the codes, Mark Jones, partner at law firm Payne Hicks Beach, said the fact that there will have been 14 months between the OSA receiving Royal Assent in October 2023 and the codes coming into force in March 2025 shows “there has been no urgency” in tackling illegal harms.
“Let’s be clear – this is, to a degree, self-regulation. Providers decide for themselves how to meet the legal duties and what is proportionate for them,” he said. “Ofcom does, however, have enforcement powers such as fines of up to £18m or 10% of global turnover or even blocking sites in the most serious of cases. But will we see these powers being used swiftly, or at all? Critics say the Codes of Practice do not go far enough and that a gradualist approach is being taken to illegal harms.”
Xuyang Zhu, partner at global law firm Taylor Wessing, added that while further codes of practice are set to be published, companies now have strict timelines to adhere to and can no longer delay implementing safety measures.
“Companies need to act now if they want to avoid failing compliance and facing potentially significant fines,” she said. “For many services, it will take substantial time and effort to do the risk assessment, going through the system and data to identify risks as well as putting in compliance measures to mitigate the identified harms. It won’t be an easy task and, to ensure that companies can make it by the deadline, they need to start now.”
Ofcom previously published its draft online child safety codes for tech firms in April 2024. Under the codes, Ofcom expects any internet services that children can access (including social media networks and search engines) to carry out robust age checks, to configure their algorithms to filter out the most harmful content from children’s feeds, and to implement content moderation processes that ensure swift action is taken against such content.
The draft codes also include measures to ensure tech firms’ compliance, including having a named senior person accountable for compliance with the children’s safety duties, an annual senior-body review of all risk management activities relating to children’s safety, and an employee code of conduct that sets standards for employees around protecting children.