
Protecting people from online suicide and self-harm material

Published: 5 March 2025
Last updated: 22 August 2025

In the latest in a series of explainers on specific online harms, Ofcom sets out what online service providers operating in the UK need to do to protect people from suicide and self-harm content.

Warning: This article contains distressing content relating to suicide and self-harm

A number of deaths in the UK have been linked with online material where detailed information is shared on methods of suicide and self-harm, or where suicidal and self-harm behaviours are actively encouraged. Adults and children may actively seek out this kind of content, with tragic results. But they may also encounter it accidentally or have it recommended to them by algorithms.

Ofcom’s research suggests that four per cent of UK internet users have seen online content promoting suicide in the last month, and children are more likely to see it than adults.

Online service providers must protect UK users 

Under the UK’s Online Safety Act, in-scope service providers have a duty to make their sites and apps safer by design and protect both children and adults from illegal and harmful content, including content that encourages or assists suicide and self-harm.    

On 16 December 2024, Ofcom published its first-edition codes of practice and guidance for illegal content. Regulated service providers then had until 16 March 2025 to carry out their illegal content risk assessments, and the illegal content safety duties came into force on 17 March 2025. These duties include removing illegal content quickly once a provider becomes aware of it, including illegal forms of suicide and self-harm content.

In our codes, we have set out specific measures that ‘user-to-user’ services can take to protect adults and children from illegal content. Some of them apply to all providers, and others to certain types of providers, or to the providers of larger or riskier services. These measures include: 

  • having content moderation systems and processes that enable sites to take down illegal suicide and self-harm material swiftly when they become aware of it
  • making sure content moderation functions are appropriately resourced and individuals working in moderation are trained to identify – and where relevant, take down – content including illegal suicide and self-harm material
  • allowing users to report illegal suicide and self-harm material through reporting and complaints processes that are easy to find, access and use
  • when testing their algorithms, checking whether and how design changes impact the risk of illegal content, including illegal suicide content, being recommended to users, and
  • setting clear and accessible terms and conditions explaining how users will be protected from illegal content, which includes suicide and self-harm material. 

In addition, providers of search services should:

  • take appropriate moderation action against illegal suicide or self-harm material when they become aware of it, which could include deprioritising its overall ranking or excluding it from search results, and
  • provide crisis prevention information in response to search queries regarding suicide methods or suicide generally, as sketched below.
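
As a purely illustrative aid, the sketch below shows one way a search service might surface crisis prevention information when a query appears to relate to suicide. The keyword matching, the CrisisResource structure and the resource details shown are assumptions made for this example; they are not taken from our codes, which do not prescribe any particular implementation.

```python
# Hypothetical sketch: attaching crisis prevention information to search
# results for suicide-related queries. Keyword list, data structures and
# helpline details are illustrative assumptions only.

from dataclasses import dataclass

# Terms suggesting a query relates to suicide or self-harm (illustrative;
# a real service would use far more sophisticated query classification).
CRISIS_QUERY_TERMS = {"suicide", "kill myself", "end my life", "self-harm"}


@dataclass
class CrisisResource:
    name: str
    contact: str
    note: str


# Samaritans details as listed at the end of this article.
SAMARITANS = CrisisResource(
    name="Samaritans",
    contact="116 123 (free, UK and Republic of Ireland)",
    note="Available 24 hours a day, every day of the year",
)


def is_crisis_query(query: str) -> bool:
    """Return True if the query appears to relate to suicide or self-harm."""
    q = query.lower()
    return any(term in q for term in CRISIS_QUERY_TERMS)


def build_results_page(query: str, organic_results: list[dict]) -> dict:
    """Assemble a results page, prepending crisis prevention information
    when the query appears to relate to suicide or self-harm."""
    page = {"query": query, "results": organic_results}
    if is_crisis_query(query):
        page["crisis_banner"] = {
            "name": SAMARITANS.name,
            "contact": SAMARITANS.contact,
            "note": SAMARITANS.note,
        }
    return page
```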

Further protections for children

Our regulation provides an additional layer of protection for children, given the vital importance of keeping them safe online from both harmful content and behaviour.

This is already a key area of focus in our illegal harms codes. For example, sites and forums with direct messaging functionality and a risk of grooming should implement safety defaults making sure children can only be contacted by people to whom they are already connected.

On 24 April 2025, we published our decisions on additional protections for children relating to content that is legal but harmful to them. Providers of services likely to be accessed by children had until 24 July 2025 to assess the risks to children from certain kinds of content designated in the Act as harmful to them. This includes content that promotes, encourages or provides instructions for suicide or self-harm. The children’s safety duties came into force on 25 July 2025, which means providers now have to implement measures to mitigate risks to children on their platforms. Our children’s codes for user-to-user services recommend that services:

  • design and operate recommender systems so that content likely to be suicide or self-harm content is excluded from children’s feeds
  • have in place content moderation systems and processes that ensure swift action is taken when they identify suicide or self-harm content to prevent children from seeing it – where services allow this kind of content, they should implement highly effective age assurance to secure this outcome
  • make sure their content moderation functions are adequately resourced and individuals working in moderation are trained to identify and action suicide or self-harm content in line with their internal content policies, and
  • signpost children who report, post, share or search for suicide or self-harm material online to appropriate support.

We recommend that search services should:

  • take appropriate moderation action against suicide or self-harm content when they become aware of it, which could include deprioritising its overall ranking, as well as excluding it from search results of child users via safe search settings; and
  • provide children with crisis prevention information in response to search requests relating to suicide, self-harm and eating disorders.

Some of these apply to all providers, and others to certain types of providers, or to the providers of larger or riskier services.

Proposed additional safety measures

On 30 June 2025 we published our Additional Safety Measures consultation, which sets out proposals to ask platforms to go further to keep users safe. These include proposals that some service providers should: 

  • assess whether proactive technology to detect certain kinds of content is available and meets specific criteria. This includes technology to detect illegal suicide content, and suicide and self-harm content which is harmful to children. Where such tools exist, they should use them;
  • enable real-time reporting of livestreams showing imminent harm, and ensure human moderators are available when livestreaming is active, and
  • design and operate their recommender systems so that content likely to be certain kinds of priority illegal content (including illegal suicide content) is excluded from users’ feeds.

These measures aim to tackle suicide and self-harm content at source and stop it from going viral.
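
As a rough, hypothetical sketch of the recommender-system proposal, the example below drops candidate feed items whose classifier score suggests they are likely to be priority illegal content, such as illegal suicide content, before they reach the ranking stage. The classifier, threshold and item fields are assumptions made for illustration and are not drawn from the consultation.

```python
# Hypothetical sketch: excluding content likely to be priority illegal
# content (including illegal suicide content) from recommender feeds.
# The classifier, threshold and item fields are illustrative assumptions.

from typing import Callable

# Probability above which an item is treated as likely priority illegal
# content (illustrative value only).
EXCLUSION_THRESHOLD = 0.5


def filter_candidates(
    candidates: list[dict],
    classify_priority_illegal: Callable[[dict], float],
) -> list[dict]:
    """Drop candidates whose estimated probability of being priority
    illegal content (e.g. illegal suicide content) exceeds the threshold,
    so they are never passed to the ranking stage of a user's feed."""
    kept = []
    for item in candidates:
        if classify_priority_illegal(item) >= EXCLUSION_THRESHOLD:
            # Excluded from the feed; a real system might also route the
            # item to content moderation for review.
            continue
        kept.append(item)
    return kept


# Example usage with a stub classifier that trusts an upstream moderation label.
if __name__ == "__main__":
    def stub_classifier(item: dict) -> float:
        return 0.9 if item.get("flagged_illegal_suicide") else 0.05

    feed_candidates = [
        {"id": 1, "flagged_illegal_suicide": False},
        {"id": 2, "flagged_illegal_suicide": True},
    ]
    # Item 2 is excluded before ranking.
    print(filter_candidates(feed_candidates, stub_classifier))
```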

Evidence is key to our work

Continuing to build our evidence base in this area is one of our top priorities, and we have a well-established programme of research that helps provide important insights and evidence to support our work. That includes hearing from people and organisations who specialise in supporting and protecting adults and children from suicide and self-harm.

We have also engaged directly with those with lived experiences of online harms as well as coroners conducting inquests. We will continue – and increase – this type of engagement to ensure such perspectives are incorporated into our work.   

Taking action 

We expect all providers to assess the risks associated with their service, and to take steps to protect their users from illegal and harmful content. Where it appears that a provider is not taking steps to protect its users from harmful content and there is a risk of serious harm to users – especially children – we won’t hesitate to take enforcement action. We opened an investigation into a suicide forum shortly after the illegal harms duties came into force.

Where we identify compliance failures, we can issue significant financial penalties, require providers to make specific changes, and – in the most serious cases – apply to the courts to block sites in the UK.   

If you are struggling to cope, please call Samaritans for free on 116 123 (UK and the Republic of Ireland) or contact other sources of support, such as those listed on the NHS help for suicidal thoughts web page. Support is available 24 hours a day, every day of the year, providing a safe place for you, whoever you are and however you are feeling.