As Ofcom begins enforcing the Online Safety Act, we review the Illegal Harms Codes of Practice and their impact on service providers.
Background
The Online Safety Act (the Act), passed in October 2023, is the UK’s most ambitious attempt to regulate digital platforms. Aiming to make the UK “the safest place in the world to be online”, its focus is threefold:
Our first Online Safety Act in Focus article outlined key challenges arising from the Act for online services, search engines and tech companies. These included free speech concerns, questions over the Act’s long-term viability and divergence from other regulatory regimes, including the EU’s Digital Services Act. Since then, we’ve seen several significant milestones, including the implementation of Ofcom’s Illegal Harms Codes of Practice (the Codes) from 17 March 2025.
Illegal Harms Codes of Practice
Both search and user-to-user (U2U) services must now comply with Ofcom’s safety measures or use other effective measures to protect users from illegal content and activity. The Information Commissioner has stressed that providers must also take a “data protection by design and default” approach when implementing safety systems under the Act.
In this update, we outline the scope of illegal content under the Codes, key requirements on platforms and Ofcom’s approach to enforcement. We also provide a breakdown of the Act’s notice and takedown regime, focusing on the Section 10(3) duty on U2U services to swiftly remove illegal online content.
What is illegal content?
The Act introduces a new legal concept of illegal content, defined at Section 59 as “content that amounts to a relevant offence”. For these purposes, both relevant offences and content are given broad interpretations. While the former includes over 130 terrorism, child sexual abuse material (CSAM) and other priority offences, content means certain words, images, speech or sounds whose use, possession, viewing, access, publication or dissemination constitutes an offence.
The Act requires platforms to collect and assess all relevant information to determine whether online content is illegal. The threshold is “reasonable grounds to infer” that a relevant offence has been committed, although Ofcom has clarified that providers are not required to make findings to a criminal standard or to determine whether there has in fact been a breach of UK criminal law.
Ofcom’s Illegal Content Judgements Guidance highlights the following key points:
The regulator also stresses the need for providers to ensure a sufficient understanding of UK law, including, where necessary, the distinctions between its three legal jurisdictions.
Key requirements on regulated services
The onus is now on providers to detect and remove illegal content, as well as to reduce the likelihood of priority criminal content being made available online. Proposed measures include:
Ofcom also recommends that large or multi-risk platforms track and regularly report any evidence of new and increasing illegal harms online, including increased user complaints or referrals from law enforcement.
Notice and takedown
While Section 10 of the Act creates a duty on service providers to use proportionate systems and processes to swiftly remove illegal content once they are alerted to it by a person or become aware of it in some other way, its practical operation remains unclear. Neither the Act nor its associated guidance touches upon the existing regime under the Electronic Commerce (EC Directive) Regulations 2002 (the Regulations). In particular, Regulation 19 provides that an information society service provider (ISSP) cannot be found liable for storing unlawful data where it lacked actual knowledge that the data was illegal or, upon learning of its illegal nature, acted expeditiously to remove or disable access to it.
In determining whether an ISSP has actual knowledge for these purposes, Regulation 22 requires the Court to take into account all matters which appear to it in the particular circumstances to be relevant and, among other things, to have regard to:
(a) whether the ISSP received a notice through a specified email address; and
(b) the extent to which any such notice included its sender’s full name and address, details of the location of the information and details of the unlawful nature of the activity or information in question.
The focus here is on facts or circumstances of which the host is actually aware, as opposed to facts of which the host could, or ought to, have been aware.
It’s also unclear what constitutes swift removal under the Act. Like expeditious removal under Regulation 19, however, this is likely to remain a fact-specific exercise which will vary on a case-by-case basis. This is reinforced by Ofcom’s recommendation that large and multi-risk services setting performance targets on content moderation should “balance the need to take relevant content moderation action swiftly against the importance of making accurate moderation decisions.” It remains to be seen how (and to what extent) this will impact upon enforcement action under the Act.
Outlook
Ofcom is expected to make additional announcements on formal enforcement action in the coming weeks. It has said that its first priority remains child protection, as well as ensuring providers introduce the backend protections required to fulfil their legal duties under the Act. Early enforcement will also focus on sites and apps that Ofcom considers may present particular risks of harm from illegal content because of their size and nature, including the UK’s largest search and U2U service providers.
For more information, please contact Ciaran O’Shiel, Partner or Niamh Flanagan, Solicitor.
With thanks to Martyn Doherty for his assistance in the preparation of this article.
Date published: 24 March 2025