If children are likely to access all, or part of, an online service you provide, you have duties under the Online Safety Act to protect children from harmful content.
All service providers in scope of the Act must establish whether their service is likely to be accessed by children by completing a children’s access assessment.
If you establish that your service is likely to be accessed by children, you must comply with duties to:
1. Carry out a children's risk assessment
2. Put in place protections for children on your service
3. Comply with the record-keeping and review duties for these activities
You have three months to complete a children’s risk assessment after launching a service to which the protection of children duties apply, and you will need to keep that risk assessment up to date. You also need to complete a new risk assessment before making a significant change to your service.
If you conclude that your service is not likely to be accessed by children, you will need to carry out another children’s access assessment in a year (or less in some cases).
Our resources to help you comply with the protection of children rules
- Our Online Safety Act regulatory documents page includes a full list of documents and guidance related to compliance with the protection of children duties, including Ofcom’s children's risk assessment guidance and Protection of Children Codes of Practice.
- Our digital toolkit will guide you through how to comply with the protection of children rules. The toolkit will provide you with specific compliance recommendations for your service based on your answers to a series of questions. It will help you to:
- identify and assess the risks relevant to your service
- find recommended safety measures to address these risks
- use templates and checklists, if you need them
Children's risk assessment duties
The purpose of the children’s risk assessment is to improve your understanding of the risk of harm to children on your service.
It will help you understand how children could encounter harmful content, how your service’s user base, features and other characteristics could increase the risk of this happening, and what safety measures you need to put in place to protect children.
Your assessment must accurately reflect the risks on your service based on relevant information and evidence. You also need to keep it up to date.
You need to keep a written record of every risk assessment you carry out. You can use the protection of children duties record-keeping template (ODT, 148 KB) to make a record of your risk assessment.
Additional steps for Category 1 and 2A services
Ofcom expects to publish the register of categorised services in July 2026. The providers of services falling into Category 1 or Category 2A will have additional duties relating to their risk assessments.
These duties will provide transparency as to how relevant services view the levels of risk they pose to users in the UK.
Children's safety duties
The children's safety duties, and those relating to reporting and complaints, require online services to use proportionate safety measures to keep children safe online. It’s about making sure you have the right measures in place to protect children from harm that could take place on your service.
If you are the provider of a user-to-user service that is likely to be accessed by children, you will need to take proportionate measures to effectively:
- Prevent children of any age from encountering pornography, suicide, self-harm, and eating disorder content (primary priority content). If your service does not prohibit one or more kinds of primary priority content for all users, you must use highly effective age assurance to prevent children from encountering such content where it is identified on your service.
- Protect children in age groups judged to be at risk of harm from encountering other harmful content online. This includes content that is abusive or incites hatred, bullying content, violent content, and content which encourages, promotes, or provides instructions for dangerous stunts and challenges, and self-administering harmful substances (priority content), as well as other types of content that present a material risk of significant harm to an appreciable number of children in the UK (non-designated content).
- Mitigate and manage the risks of harm to children in different age groups identified in your children’s risk assessment, using the best available information about the users on your service.
- Mitigate the impact of harm to children in different age groups presented by content that is harmful to children.
- Explain how you’ll do this in your terms of service.
- Allow people to easily report content that is harmful to children and operate a complaints procedure.
If you are the provider of a search service that is likely to be accessed by children, you will need to take proportionate measures to effectively:
- Minimise the risk of children of any age encountering the search content that is most harmful to children, namely pornography, suicide, self-harm, and eating disorder content (primary priority content).
- Minimise the risk that children in age groups judged to be at risk of harm from other harmful content encounter it. This includes content that is abusive or incites hatred, bullying content, violent content, and content which encourages, promotes, or provides instructions for dangerous stunts and challenges, and self-administering harmful substances (priority content), as well as other types of content that present a material risk of significant harm to an appreciable number of children in the UK (non-designated content).
- Mitigate and manage the risks of harm to children in different age groups identified in your children’s risk assessment.
- Mitigate the impact of harm to children in different age groups presented by content that is harmful to children.
- Explain how you’ll do this in a publicly available statement.
- Allow people to easily report content that is harmful to children and operate a complaints procedure.
You can decide for yourself how to meet your legal duties. You can apply the measures in Ofcom’s Codes of Practice that apply to your service, or you can take alternative measures. If you take alternative measures to the ones we recommend, you must keep a record of what you have done and how you consider it fulfils the relevant duties. In doing so, you must consider the importance of protecting users’ rights to freedom of expression and of protecting child users from breaches of relevant privacy laws.
Implementing safety measures
The Protection of Children Codes set out a range of measures for user-to-user and search services.
The Protection of Children Codes of Practice include measures in the following areas for user-to-user services:
- governance and accountability
- terms of service
- age assurance
- content moderation
- user reporting and complaints
- recommender systems
- settings, functionalities, and user support
The Protection of Children Codes of Practice include measures in the following areas for search services:
- governance and accountability
- content moderation (we call this ‘search moderation’)
- user reporting and complaints
- search features, functionalities, and user support
- publicly available statements
Your safety measures will depend on your service
The Act is clear that the safety measures you need to put in place should be proportionate. We recommend different measures for different types of services based on factors such as:
- the type of service you provide (including user-to-user or search);
- the number of users your service has (size);
- the outcome of your latest children’s risk assessment, and what risks have been identified in relation to content harmful to children; and
- the relevant functionalities and other characteristics of your service that have been shown to pose risks to children (for example whether you have a content recommender system).
Some measures will apply to all services. For example, naming an individual accountable for compliance with the children’s safety duties, ensuring your terms of service (or publicly available statements) are clear and accessible, having a user reporting and complaints process, and some content and search moderation measures.
Your children’s risk assessment must cover the level of risk of child users encountering each kind of content harmful to children on your service. You must assign a risk level (negligible, low, medium, or high) to each kind of content harmful to children. These risk levels must accurately reflect the risks on your service.
The measures that we recommend will be based on the risk levels assigned:
- If your service is low or negligible risk for all kinds of content harmful to children, it is a ‘low risk service’ and a core set of measures apply.
- If your service is medium or high risk for (at least) one kind of content harmful to children, further measures may apply. Some of these measures apply irrespective of which kind(s) of content harmful to children the risk level relates to (e.g. some reporting and complaints measures), whereas other measures only apply if the risk level relates to specific kinds of content harmful to children listed as relevant to the measure (such as suicide content, self-harm content, eating disorder content or bullying content).
- If your service is medium or high risk for two or more kinds of content harmful to children, we call it a ‘multi-risk’ service and further measures may apply.
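The risk-profile rules above can be sketched as a short decision function. This is an illustrative sketch only, not an official tool: the function name, input format, and category labels are assumptions, and Ofcom’s Codes of Practice remain the authority on which measures attach to which risk profile.

```python
# Illustrative sketch of the risk-profile rules described above.
# Names and the input format are hypothetical, not from Ofcom.

RISK_LEVELS = ("negligible", "low", "medium", "high")

def classify_service(risk_levels: dict[str, str]) -> str:
    """Classify a service from the risk level assigned to each kind
    of content harmful to children (e.g. "suicide content": "medium")."""
    for level in risk_levels.values():
        if level not in RISK_LEVELS:
            raise ValueError(f"unknown risk level: {level}")

    # Kinds of content assessed as medium or high risk
    elevated = [kind for kind, level in risk_levels.items()
                if level in ("medium", "high")]

    if not elevated:
        return "low risk service"    # core set of measures applies
    if len(elevated) >= 2:
        return "multi-risk service"  # further measures may apply
    return "single-risk service"     # further measures may apply
```

For example, a service assessed as medium risk for two kinds of content harmful to children would come out as a multi-risk service, while one with only negligible and low levels would be a low risk service.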
Certain measures apply to large services (even if they are low risk), such as undertaking an annual review of risk management activities and provision of training and materials to individuals working in content moderation.
In the Codes of Practice, we have defined a large service as a service which has an average user base of 7 million or more per month in the UK. This is equivalent to approximately 10% of the UK population. A user does not need to be registered with the service, or post anything. Just viewing content is enough to count as using that service.
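The large-service threshold above reduces to a single comparison; the sketch below shows it with hypothetical names, purely as an illustration of the 7 million average monthly UK users figure.

```python
# Illustrative sketch of the "large service" threshold described above.
# The function name and input are hypothetical.

LARGE_SERVICE_THRESHOLD = 7_000_000  # average monthly UK users (~10% of UK population)

def is_large_service(avg_monthly_uk_users: int) -> bool:
    # A "user" includes anyone who views content on the service;
    # registering or posting is not required.
    return avg_monthly_uk_users >= LARGE_SERVICE_THRESHOLD
```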
Some of the measures involve using highly effective age assurance to determine which users are children, so that you can target appropriate safety measures at children, while respecting adults’ rights to access legal content.
In some cases, providers of user-to-user services will need to use highly effective age assurance to prevent children from accessing the entire service. This is where the principal purpose of a service is to host or disseminate a kind of primary priority content that is harmful to children, such as pornography, or priority content that is harmful to children, such as violent content. 'Principal purpose’ in this context refers to the main activity or objective of the service.
Other user-to-user service providers may need to use highly effective age assurance to target safety measures towards children to protect them from being exposed to harmful content they have identified through content moderation or in children’s recommender feeds.
Further steps you may need to take to comply
Providers of online services must comply with a range of duties under the Act.
In addition to complying with protection of children duties under the Act, you will need to carry out your illegal content risk assessment, and comply with your record-keeping, safety and related duties for illegal content. Visit the Illegal content duties under the Online Safety Act page for guidance to help you do this.
If you provide a service that publishes or displays pornographic material, you may have further duties to comply with. You can find out more about these on the Adults Only page.