The UK Video-Sharing Platforms regime was repealed on 25 July 2025.
Four years of online safety regulation

Ofcom started regulating UK-established Video-Sharing Platforms (VSPs) in 2020.
In 2023, the Online Safety Act 2023 designated Ofcom as the UK’s online safety regulator for a much broader set of online platforms, including VSPs. In July 2025 the UK’s VSP regime was brought to an end, and all in-scope VSPs are now regulated in full under the Online Safety Act.
The VSP regime
The VSP regime was set out in Part 4B of the Communications Act 2003 and stemmed from the European Audiovisual Media Services Directive 2018. The requirements under Part 4B came into effect on 1 November 2020, and Ofcom completed its implementation of the regime when it published guidance to assist providers in complying with the new duties in October 2021.
The legislation required VSPs to protect all users from videos containing material likely to incite violence or hatred against particular groups and from content which would be considered a criminal offence under laws relating to terrorism; child sexual abuse material; and racism and xenophobia. VSPs were required to protect under 18s from material which might impair their physical, mental or moral development. Standards around advertising also had to be upheld.
In order to protect the general public and under 18s from harmful videos, VSPs had to take appropriate measures, which may have included:
- Having, and effectively implementing, terms and conditions for harmful material
- Having, and effectively implementing, flagging, reporting or rating mechanisms
- Applying appropriate age assurance and/or parental control measures to protect under 18s
- Establishing easy-to-use complaints processes
- Providing media literacy tools and information
In 2021, we laid the foundation for regulating VSPs in the UK through a series of key consultations and publications:
- March: We published the VSP Notification Guidance to help providers determine:
- Whether they qualified as a VSP under the Communications Act 2003;
- If they fell within UK jurisdiction; and
- When and how to notify Ofcom of their service.
- May: We consulted on the proposal to designate the Advertising Standards Authority (ASA) as the co-regulator for VSP-controlled advertising.
- October:
- We released a comprehensive Guidance and a Statement for VSP providers, outlining measures to protect users from harmful content.
- We published Ofcom's plan and approach for regulating VSPs – websites and apps, established in the UK, that allow users to share video. This document set out what we would do over the course of the regime to raise standards in user protection and address areas of poor compliance, and outlined the priority areas of focus for the first 12 months of the regime.
- We launched a simplified industry guide with key compliance dates and requirements.
- December:
- We confirmed the ASA’s role as co-regulator for VSP advertising through our final statement.
- We also initiated a new phase of supervisory engagement by writing to all notified VSPs, sharing detailed plans for the year ahead.
In 2022, we deepened our understanding of how notified services protect users and began seeking improvements across the sector.
- March: We issued formal information requests to notified VSPs to assess:
- The measures they had in place to protect users from harm;
- How those measures were being implemented; and
- How platforms evaluated their effectiveness.
- October: A pivotal month in our regulatory journey:
- We published a survey of industry responses following the livestreamed terrorist attack in Buffalo, New York (14 May 2022), which tragically claimed ten lives. The report examined how platforms responded to the incident and the state of cross-industry collaboration on livestream moderation.
- We released Ofcom’s first annual VSP regulation report, summarising:
- Key findings from the first year of regulation (Oct 2021–Oct 2022);
- Our regulatory approach for the year ahead; and
- The state of user protection measures across platforms.
- We also published a comprehensive report on the UK VSP landscape, drawing on commissioned research to:
- Contextualise how providers apply protection measures;
- Present user perspectives on their experiences with VSPs.
- This was accompanied by three focused research reports.
By 2023, our understanding of notified services’ systems and processes had significantly deepened, and we began to see tangible improvements in several areas.
It was also a landmark year: we launched our first enforcement programme and initiated our first enforcement case under the VSP regime, marking a new phase in our regulatory oversight.
- January: We opened an enforcement programme into age assurance on notified adult platforms, and on non-notified adult platforms that may have fallen within scope of the VSP regime, to assess their age assurance methods and to understand the challenges they faced when considering age assurance measures.
- April: Snapchat updated the default in its sign-up age gate from 18 to 13, introducing more friction into the process. This followed regular engagement with Snapchat, in which Ofcom shared its concern that under 18s could sign up as adults without making any changes during the process.
- May:
- OnlyFans incorporated a specific CSAM reporting category after external research findings suggested that having a specific CSAM reporting category could help identify this type of illegal content for quicker takedown.
- Tapnet Ltd, provider of the VSP RevealMe, was fined £2,000 after failing to respond to a statutory request for information.
- August: We published a report shining a light on VSPs’ approaches to designing and implementing their terms and conditions to protect users, and highlighting what we considered to be examples of good practice.
- October:
- At the start of the crisis in Israel and Gaza, which increased the risk that people might encounter harmful online content, we worked closely with VSPs to make sure they were meeting their duty to have systems and processes in place to protect users and to anticipate and respond to the potential spread of harmful material.
- BitChute made some changes to its reporting and flagging mechanisms and to the content moderation processes which underpin these, as well as committing to further changes, in response to concerns following the 2022 Buffalo attack.
- November: We received compliance assurances from Tapnet, provider of RevealMe, to make significant changes to its access control and age assurance measures, as a result of Ofcom’s enforcement programme into age assurance measures on UK-established VSPs that specialise in adult content.
- December:
- We published a further report which considered the steps being taken by TikTok, Snap and Twitch – three of the most popular regulated VSPs for children – to meet the requirements to protect children from encountering videos that may impair their physical, mental or moral development.
- We also received compliance assurances from Kiwi Leisure, provider of AdmireMe, to make significant changes to its access control and age assurance measures.
In 2024, after the Online Safety Act (OSA) received Royal Assent, the VSP regime entered a transitional period, in preparation for repeal in 2025.
- January: We published a report summarising our approach to regulating VSPs, as well as our priorities for the remainder of the VSP regime until repeal.
- March:
- We received compliance assurances from Visional Media, provider of Xpanded, and Ampay Ltd, provider of SoSpoilt, to make significant changes to their access control and age assurance measures.
- Twitch made some important changes to protect under 18s from videos containing Restricted Material, after a period of compliance remediation following concerns that it may not have sufficient measures in place.
- May:
- Onevsp engaged constructively with Ofcom and made improvements to its terms of use to prohibit additional types of harmful material, following an investigation into concerns about the implementation and effectiveness of those terms.
- We published a paper setting out how a widely used evaluation framework could be applied to assess the impact and effectiveness of online safety measures, with an illustrative example of what such an evaluation might look like in practice.
- July: Ofcom fined TikTok £1.875 million for failing to accurately respond to a formal request for information about its parental controls safety feature.
- August: OnlyFans refined its user policies to make them clearer and more accessible in response to our 2023 user policies report, which identified key areas for enhancing VSPs’ policies, including readability, clarity on prohibited content and overall length.
- September: Our Economics Discussion Paper analysed the impact of changes that the video sharing platform Twitch made to its content classification labelling in 2023.
- October:
- Snapchat refined its user policies to make them clearer and more accessible in response to our 2023 user policies report, which identified key areas for enhancing VSPs’ policies, including readability, clarity on prohibited content and overall length.
- Twitch expanded its reporting functionality to include logged-out users to meet Ofcom’s VSP guidance, which requires that all users able to view content on a platform must have access to user reporting tools.
In 2025, VSPs regulated under Ofcom’s VSP regime started having to comply with some of their duties under the Online Safety regime.
Simultaneously, Ofcom continued to secure improvements under the VSP regime.
- February:
- Ofcom imposed a financial penalty of £7,000 on MintStars Ltd, provider of VSP MintStars, for failing to have sufficiently robust measures in place, or implement measures effectively, to protect children from accessing pornographic content.
- Twitch started to introduce changes to improve the readability of its user policies, making them clearer and more accessible.
- March: Ofcom fined the provider of OnlyFans, Fenix International Limited, £1.05 million for failing to accurately respond to formal requests for information about its age assurance measures on the platform.
- April:
- TikTok introduced additional models to improve its automated content moderation tools, including exploring ways to measure their efficacy.
- July:
- We closed our enforcement programme into age assurance measures across the adult VSP sector, which had been running since January 2023. The programme had a positive impact on compliance across the sector, securing assurances from four notified adult VSPs that age assurance measures to protect under 18s would be implemented.
- We also received assurances from non-adult VSPs on implementing age assurance. We will ensure these improvements are implemented under the Online Safety Act.