The Online Safety Act makes online businesses legally responsible for keeping UK users - especially children - safe online, even if their business is based outside the UK.
Ofcom, the UK’s independent regulator responsible for implementing the Act, provides guidance to help the online video games industry:
- find out how the rules apply to online video game services
- understand the risks games could present to users, especially children; and
- know what you need to do to comply
How the Online Safety Act applies
Online games can allow people of all ages to play, create, explore and express themselves. However, some features of online video game services can be exploited or misused, and games can become an unsafe environment, especially for children.
The Online Safety Act applies to certain types of online services, including those where users can interact with each other, or create, share or upload content. You can use our tool to find out if the rules apply to your service.
This includes services that:
- have links to the UK; and
- are a ‘user-to-user’ service – an internet service that enables users to generate, share or upload content (such as messages, images, videos, comments, audio) on the service that may be encountered by other users of the service. This includes services that enable user interactions.
On online video game services, this could take many different forms. Sometimes users may interact by creating or manipulating player profiles, avatars, objects, and the environments themselves, or by using voice and text chat (including, for example, team-based channels or open-world chats).
Online safety rules would also apply where games use matchmaking systems to connect users with each other, including strangers, through mechanisms such as populating lobbies or assigning players to teams, and where services enable livestreaming.
While the Act covers user-generated content on a user-to-user service, it does not cover content published by the provider of the service (with the exception of online pornography). On online video game services, provider-published content could include, for example, offline gameplay, original or additional game content developed and published by a studio, or the enforcement of PEGI age ratings.
The online safety risks in gaming
There is extensive evidence of the risks of online harm in games. Research set out in our Registers of Risks reports that:
- Ofcom’s Online Experiences Tracker has shown that nearly half of 13 to 17-year-olds report being highly concerned about trolling (47%) and one-off abusive behaviour or threats (45%) while playing games online, and over a third (37%) report being highly concerned about intentional harassment, known as ‘griefing’.
- NSPCC research on online grooming highlights “voice or text chat services built into online multiplayer games” as methods used by grooming perpetrators to approach children.
- Charities such as Catch-22 and the Children’s Society have supported child victims/survivors of criminal exploitation who have been recruited into trafficking via online video games, while the NSPCC helpline has supported children who have encountered suicide and self-harm content in a game environment.
- Ofcom media literacy research reports that online games were the third most likely place for ‘nasty or hurtful’ behaviour to occur among children aged 8-17 (12%, after social media at 16% and messaging apps at 15%).
The Online Safety Act aims to change this. It introduces new legal duties for providers of regulated services to assess the risk of harm from:
- illegal content; and
- certain kinds of content harmful to children.
Providers must then put in place measures to keep people safe online.
Illegal content
Ofcom’s research indicates that online video game services are likely to have an increased risk of harm related to:
- terrorism;
- child sexual exploitation and abuse, in particular grooming;
- hate offences; and
- offences relating to harassment, stalking, threats and abuse.
These are just some examples of illegal content that can appear on game services; you can read more in our Register of Risks for illegal content (PDF, 5.69 MB). Providers need to assess the risk of 17 kinds of illegal content.
Content harmful to children
For children, our research indicates that online video game services have an increased risk of harm related to:
- abuse and hate content;
- bullying content; and
- violent content.
Providers need to assess the risk of harm from 12 kinds of content harmful to children, and we provide guidance on what these are. You can read more about how children encounter harmful content on online video game services in our Children’s Register of Risks (PDF, 3.66 MB).
How to comply
It is your responsibility to comply with the law, but Ofcom has produced guidance and resources to help you. You can read these documents in full, or start with our guide for services, which will help you:
- check if the Online Safety Act applies to you;
- carry out your illegal content assessment and put in place protections under the safety and related duties;
- complete your children’s access assessment and, if applicable, carry out your children’s risk assessment and put in place protections under the children’s safety and related duties; and
- comply with the record-keeping and review duties for the activities above.
You can also raise queries and subscribe for updates on any changes to the regulations and what you need to do.