Quick guide to Protection of Children Codes

Published: 7 May 2024
Last updated: 24 April 2025

Under the Online Safety Act, services that are likely to be accessed by children have duties to protect children online. 

On 24 April 2025 we published our Protection of Children Codes in draft form, as submitted to the Secretary of State. From 25 July 2025, provided the draft Protection of Children Codes complete the Parliamentary process, service providers will need to take the safety measures set out in the Codes, or use other effective measures, to protect children from harmful content.

This page gives a quick introduction to the safety measures recommended in the Protection of Children Codes.

Services must manage risks and protect children from encountering harmful content

Services that are likely to be accessed by children are required to carry out a children’s risk assessment and use proportionate safety measures to keep children safe online. You can read our quick guide to children's access assessments and quick guide to children's risk assessments which summarise the steps you need to take first.

Children’s safety duties for user-to-user services

If you are the provider of a user-to-user service that is likely to be accessed by children, you will need to take proportionate measures to effectively:

  • Prevent children of any age from encountering pornography, suicide, self-harm, and eating disorder content (primary priority content). If your service does not prohibit one or more kinds of primary priority content for all users, this involves using highly effective age assurance to prevent children from encountering such content where it is identified on your service.
  • Protect children in age groups judged to be at risk of harm from encountering other harmful content. This includes content that is abusive or incites hatred, bullying content, violent content, and content which encourages, promotes, or provides instructions for dangerous stunts and challenges, and for self-administering harmful substances (priority content), as well as other types of content that present a material risk of significant harm to an appreciable number of children in the UK (non-designated content).
  • Mitigate and manage the risks of harm to children in different age groups identified in your children’s risk assessment.
  • Mitigate the impact of harm to children in different age groups presented by content that is harmful to children.
  • Explain how you’ll do this in your terms of service.
  • Allow people to easily report content that is harmful to children and operate a complaints procedure.

Children’s safety duties for search services

If you are the provider of a search service that is likely to be accessed by children, you will need to take proportionate measures to effectively:

  • Minimise the risk of children of any age encountering the most harmful search content to children, namely pornography, suicide, self-harm, and eating disorder content (primary priority content).
  • Minimise the risk of children in age groups judged to be at risk of harm from encountering other harmful content. This includes content that is abusive or incites hatred, bullying content, violent content, and content which encourages, promotes, or provides instructions for dangerous stunts and challenges, and for self-administering harmful substances (priority content), as well as other types of content that present a material risk of significant harm to an appreciable number of children in the UK (non-designated content).
  • Mitigate and manage the risks of harm to children in different age groups identified in your children’s risk assessment.
  • Mitigate the impact of harm to children in different age groups presented by content that is harmful to children.
  • Explain how you’ll do this in a publicly available statement.
  • Allow people to easily report content that is harmful to children and operate a complaints procedure.

The Protection of Children Codes set out a range of proposed safety measures that apply to different services

You should decide for yourself how to meet the specific legal duties: you can implement the measures set out in Ofcom’s Codes that apply to your service, or you can take alternative measures. If you take alternative measures to the ones we recommend, you must keep a record of what you have done and of how you consider that those measures fulfil the relevant duties.

The Protection of Children Codes include measures in the following areas:

  • governance and accountability
  • terms of service and publicly available statements
  • age assurance on user-to-user services
  • content moderation for user-to-user services, and moderation for search services (which we call ‘search moderation’)
  • user reporting and complaints
  • recommender systems on user-to-user services
  • settings, functionalities, user support and user controls on user-to-user services
  • search features, functionalities, and user support

Your safety measures will depend on your service

The Act is clear that the safety measures you need to put in place should be proportionate. Different measures in the Codes would apply to different types of services based on factors such as:

  • the type of service you provide (including user-to-user or search);
  • the relevant functionalities and other characteristics of your service that have been shown to pose risks to children;
  • the number of users your service has; and
  • the outcome of your latest children’s risk assessment, and what risks have been identified in relation to content harmful to children.

Some measures will apply to all services. For example, naming an individual accountable for compliance with the children’s safety duties, ensuring your terms of service (or publicly available statements) are clear and accessible, having a user reporting and complaints process, and some content and search moderation measures.

Some measures apply to services that pose greater risks to children

Your children’s risk assessment must set out whether your service has a negligible, low, medium, or high risk for each kind of primary priority content, each kind of priority content, and non-designated content. These risk levels must accurately reflect the risks on your service, and we’ve provided guidance on this.

Some measures apply based on the specific risk level:

  • If your service is low or negligible risk for all kinds of content harmful to children, it is a ‘low risk service’ and a core set of measures applies.
  • If your service is medium or high risk for (at least) one kind of content harmful to children, further measures may apply. Some of these measures apply irrespective of which kind(s) of content harmful to children the risk relates to (e.g. some reporting and complaints measures), whereas other measures only apply if the risk relates to specific kinds of content harmful to children listed as relevant to the measure (such as suicide content, self-harm content, eating disorder content or bullying content).
  • If your service is medium or high risk for two or more kinds of content harmful to children, we call it a ‘multi-risk’ service, and further measures may apply.
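
To make the tiers above concrete, the sketch below shows, in Python, how the risk levels assigned by a children’s risk assessment map onto them. It is illustrative only and does not form part of the Codes: the function, the variable names and the example kinds of content are hypothetical.

```python
# Illustrative only: not part of the Codes. Maps the risk level a
# children's risk assessment assigns to each kind of content harmful
# to children onto the tiers described above.

RISK_LEVELS = {"negligible", "low", "medium", "high"}

def service_tier(risk_assessment: dict[str, str]) -> str:
    """Return the tier for a service given its per-kind risk levels."""
    for kind, level in risk_assessment.items():
        if level not in RISK_LEVELS:
            raise ValueError(f"unknown risk level for {kind!r}: {level!r}")

    # Kinds of content harmful to children assessed as medium or high risk.
    elevated = [kind for kind, level in risk_assessment.items()
                if level in ("medium", "high")]

    if not elevated:
        # Low or negligible risk for every kind: a 'low risk service'.
        return "low risk service: core measures apply"
    if len(elevated) >= 2:
        # Medium or high risk for two or more kinds: 'multi-risk'.
        return "multi-risk service: further measures may apply"
    return "medium/high risk for one kind: further measures may apply"

# Example: medium risk for suicide content plus high risk for bullying
# content makes this service multi-risk.
print(service_tier({
    "pornography": "negligible",
    "suicide content": "medium",
    "bullying content": "high",
}))
```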

Some measures apply to large services

Certain measures apply to large services (even if they are low risk), such as undertaking an annual review of risk management activities and providing training and materials to individuals working in content moderation.

In the Codes, we have defined a large service as a service which has an average user base of 7 million or more per month in the UK. This is equivalent to approximately 10% of the UK population. A user does not need to be registered with the service, or post anything. Just viewing content is enough to count as using that service. 
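
As a rough check of scale, the sketch below reproduces the threshold and the ‘approximately 10%’ comparison. It is illustrative only: the 7 million figure is from the Codes, but the population figure is an approximation used solely for the comparison.

```python
# Illustrative only: the 7 million threshold is from the Codes; the UK
# population figure is an approximation used to show the ~10% comparison.

LARGE_SERVICE_THRESHOLD = 7_000_000    # average monthly UK users
UK_POPULATION_APPROX = 68_000_000      # assumed population, for illustration

def is_large_service(avg_monthly_uk_users: int) -> bool:
    # All users count, registered or not: viewing content is enough.
    return avg_monthly_uk_users >= LARGE_SERVICE_THRESHOLD

print(f"{LARGE_SERVICE_THRESHOLD / UK_POPULATION_APPROX:.0%}")  # ~10%
print(is_large_service(7_500_000))  # True: large service measures apply
print(is_large_service(2_000_000))  # False
```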

Age assurance measures

Some of the measures involve using highly effective age assurance to work out if users are adults or children, so that appropriate safety measures can be targeted at children, while respecting adults’ rights to access legal content.

In some cases, providers of user-to-user services will need to use highly effective age assurance to prevent children from accessing the entire service. This is the case when the principal purpose of a service is to host or disseminate a kind of primary priority content that is harmful to children, such as pornography, or priority content that is harmful to children, such as violent content.

In other cases, providers of user-to-user services need to use highly effective age assurance to target safety measures at children, protecting them from exposure to harmful content that the provider has identified through content moderation or that appears in content recommender feeds.
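
The two patterns can be summarised as follows. The sketch below is illustrative only: the function names are hypothetical, and ‘highly effective age assurance’ itself (for example, photo-ID matching or facial age estimation) is a separate process the sketch assumes has already run.

```python
# Illustrative only: hypothetical names sketching the two age assurance
# patterns described above. Whether a user is a verified adult is assumed
# to come from a separate highly effective age assurance process.

def may_access_service(principal_purpose_is_harmful: bool,
                       user_is_verified_adult: bool) -> bool:
    """Whole-service gating: where a service's principal purpose is to
    host or disseminate primary priority content (e.g. pornography) or
    priority content such as violent content, only users assessed to be
    adults may access the service at all."""
    if principal_purpose_is_harmful:
        return user_is_verified_adult
    return True

def should_filter_for_user(user_is_verified_adult: bool,
                           content_identified_as_harmful: bool) -> bool:
    """Targeted protection: on other services, content identified as
    harmful (through content moderation or in recommender feeds) is
    withheld from children, while adults retain access to legal
    content."""
    return content_identified_as_harmful and not user_is_verified_adult

print(may_access_service(True, user_is_verified_adult=False))            # False
print(should_filter_for_user(False, content_identified_as_harmful=True)) # True
```

Which pattern applies depends on your service’s principal purpose and on the outcome of your children’s risk assessment.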
