This glossary defines the terms used throughout Ofcom’s digital tools which help service providers understand how to comply with the illegal content and protection of children rules under the Online Safety Act.
A
Content which is abusive and which targets any of the following characteristics:
- race
- religion
- sex
- sexual orientation
- disability
- gender reassignment
The Online Safety Act 2023.
Technical mechanism which prevents users who have not been age assured, or who, having been age assured, did not meet the requirements of the age assurance process, from accessing a service (or part of it) or certain content.
Materials that are specifically designed to be accessible and understandable to all children permitted to use a service, and to the adults who care for them.
A collective term for age verification and age estimation.
A form of age assurance designed to estimate the age or age range of the user.
A form of age assurance designed to verify the exact age of the user.
User-to-user service functionality allowing users to create a user profile where their identity is unknown to an extent. This includes instances where a user's identity is unknown to other users; for example, through the use of aliases (pseudonymity). It also includes where a user’s identity may be unknown to a service, for example, services that do not require users to register by creating an account.
A complaint by an interested person where the provider of a search service takes or uses measures to comply with the illegal content safety duties, and those measures result in content relating to that interested person no longer appearing in search results or being given a lower priority in search results.
A complaint by a user about any of the following actions, if the action concerned has been taken by the provider on the basis that content generated, uploaded or shared by the user is illegal content:
a) the content being taken down
b) the user being given a warning
c) the user being suspended, banned or in any other way restricted from using the service.
Feature that allows audiovisual content to continue playing without input from the user.
B
A user-to-user functionality where: a) blocked users cannot send direct messages to the blocking user and vice versa; b) the blocking user will not encounter any content posted by blocked users on the service and vice versa; c) the blocking user and blocked user, if they were connected, will no longer be connected.
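As a rough illustration of these three conditions, the following Python sketch shows one way a service might enforce them. The class and method names are our own assumptions, not drawn from the Act or any code of practice.

```python
# Illustrative sketch only -- names and data structures are assumptions.

class BlockingService:
    def __init__(self):
        self.blocks = set()        # (blocker, blocked) pairs
        self.connections = set()   # frozensets of mutually connected users

    def block(self, blocker: str, blocked: str) -> None:
        self.blocks.add((blocker, blocked))
        # (c) any existing connection between the pair is severed
        self.connections.discard(frozenset((blocker, blocked)))

    def can_message(self, sender: str, recipient: str) -> bool:
        # (a) direct messages are prevented in both directions
        return (sender, recipient) not in self.blocks \
            and (recipient, sender) not in self.blocks

    def visible_posts(self, viewer: str, posts: list[tuple[str, str]]) -> list[tuple[str, str]]:
        # (b) neither party encounters the other's content
        return [(author, text) for author, text in posts
                if (viewer, author) not in self.blocks
                and (author, viewer) not in self.blocks]
```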
Any action that means that the content cannot be clearly seen by users. For example, this may be done by overlaying an image with a greyscale filter, accompanied by a content warning.
An umbrella term that refers to a software application or automated tool that has been programmed by a person to carry out a specific or predefined task without any human intervention.
Content targeted against a person which conveys a serious threat, is humiliating or degrading, or forms part of a campaign of mistreatment.
Business disruption measures are court orders which require third parties to withdraw services from, or block access to, a regulated service.
The way in which a business operates to achieve its goals. For the purposes of the Children’s Register of Risks, this includes a service’s revenue model and growth strategy.
C
The characteristics of a service include its functionalities, user base, business model, governance and other systems and processes. This is set out in section 98(11) of the Act.
Refers to offences specified in Schedule 6 to the Act, including offences related to CSAM and grooming. CSEA includes, but is not limited to, causing or inciting a child or young person to take part in sexual activities, sexual communication with a child, and the possession or distribution of indecent images.
A category of illegal child sexual exploitation and abuse (CSEA) content, including in particular indecent or prohibited images of children (including still and animated images, and videos, and including photographs, pseudo-photographs and non-photographic images such as drawings). CSAM also includes other material that includes advice about grooming or abusing a child sexually or which is an obscene article encouraging the commission of other child sexual exploitation and abuse offences; content which links or otherwise directs users to such material; or content which advertises the distribution or showing of CSAM.
The Children’s Register of Risks is a document setting out the findings of Ofcom’s own risk assessment for content harmful to children. It sets out detailed evidence on risk factors that we have used to inform the Children’s Risk Profiles.
The Children’s Risk Assessment Guidance is a guidance document to help providers of services likely to be accessed by children to comply with the children’s risk assessment duties, as set out in the Act.
Children’s Risk Profiles are the lists of different online safety risk factors published by Ofcom. They represent a selection of specific characteristics of online services such as user base, business models and functionalities that our Children’s Register of Risks indicates are most strongly linked to a risk of content harmful to children. They are not the same as the Risk Profiles for Illegal Harms.
The safety duties protecting children in sections 12 and 29 of the Act.
Under the Online Safety Act 2023 (the ‘Act’), Ofcom is required to prepare and issue Codes of Practice for providers of Part 3 services, describing measures recommended for the purpose of compliance with specified duties imposed on those providers by the Act.
User-to-user service functionality that allows users to reply to content, or post content in response to another piece of content posted on open channels of communication, visually accessible directly from the original content without navigating away from that content.
The size of the service in terms of capacity, the stage of service maturity and rate of growth in relation to users or revenue.
Anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description.
A means of restricting certain users' access to a particular piece of content on a service.
Content that is harmful to children is primary priority content, priority content or content which is an identified kind of non-designated content.
An algorithmic system which determines the relative ranking of an identified pool of content (that includes regulated user generated content) from multiple users on content feeds. Content is recommended based on factors that it is programmed to account for, such as popularity of content, characteristics of a user, or predicted engagement. References to content recommender systems do not include a content recommender system employed exclusively in the operation of a search functionality which suggests content to users in direct response to a search query, product recommender systems or network recommender systems.
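To make the ranking idea concrete, here is a minimal Python sketch of scoring a pool of content on programmed factors. The factor names and weighting are assumptions for illustration, not a description of any real service's system.

```python
# Illustrative sketch only -- factor names and weighting are assumptions.

def rank_feed(pool: list[dict], user_interests: set[str]) -> list[dict]:
    def score(item: dict) -> float:
        popularity = item["likes"] + 2 * item["shares"]             # popularity of content
        affinity = 1.0 if item["topic"] in user_interests else 0.1  # characteristics of a user
        return popularity * affinity                                # crude predicted-engagement proxy
    return sorted(pool, key=score, reverse=True)

feed = rank_feed(
    [{"id": 1, "topic": "sport", "likes": 120, "shares": 4},
     {"id": 2, "topic": "news", "likes": 300, "shares": 1}],
    user_interests={"sport"},
)
```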
User-to-user service functionality allowing users to assign a keyword or term to content that is shared.
Content which incites hatred against people:
- of a particular race, religion, sex or sexual orientation
- who have a disability
- who have the characteristic of gender reassignment
Refers to information provided by a search service in search results that typically contains the contact details of helplines and/or links to supportive information provided by a reputable organisation, to assist users experiencing a mental health crisis.
The repeated or continuous engagement by a perpetrator in behaviour towards a victim with whom they are personally connected, where that behaviour is controlling or coercive and has a serious effect on the victim, putting them in fear of violence or causing serious alarm or distress which has a substantial adverse effect on their usual day-to-day activities.
Harm that occurs when harmful content (primary priority content, priority content or non-designated content) is repeatedly encountered by a child, and/or when a child encounters harmful combinations of content. These combinations of content include encountering different types of harmful content (primary priority content, priority content or non-designated content), or a type of harmful content (primary priority content, priority content or non-designated content) alongside a kind of content that increases the risk of harm from primary priority content, priority content or non-designated content.
The sending of a photograph or film of genitals, intending the recipient will be caused alarm, distress or humiliation, or sending a photograph or film of genitals to obtain sexual gratification and being reckless as to whether the recipient will be caused alarm, distress or humiliation.
D
Content which encourages, promotes, or provides instructions for a challenge or stunt highly likely to result in serious injury to the person who does it or to someone else.
Dating services enable users to find and communicate with romantic or sexual partners.
A means for a Trusted Flagger (defined below) to report illegal content, for example an inbox, a web portal or another relevant mechanism for reporting.
Involves the removal of URLs (such as links to individual webpages) or domains (for example, entire websites) from a search index. This will prevent the webpage URLs from appearing in search results entirely.
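As a sketch of the mechanics, the following Python snippet treats the search index as a set of URLs and removes either a single page or a whole domain. This is an assumed, simplified model of an index, for illustration only.

```python
# Illustrative sketch only -- a search index reduced to a set of URLs.
from urllib.parse import urlparse

def deindex_url(index: set[str], url: str) -> None:
    """Remove a single webpage URL so it can never appear in results."""
    index.discard(url)

def deindex_domain(index: set[str], domain: str) -> None:
    """Remove an entire website from the index."""
    for url in list(index):
        if urlparse(url).netloc == domain:
            index.discard(url)
```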
User-to-user service functionality allowing a user to send and receive a message to one recipient at a time, and which can only be immediately viewed by that specific recipient.
A user-to-user service type describing services that generally allow users to send or post messages that can be read by the public or an open group of people.
Action taken by a search service which involves altering the ranking algorithm such that a particular piece of search content appears lower in the search results and is therefore less discoverable to users.
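In contrast to de-indexing, downranked content remains in the index. A minimal sketch (the penalty factor is an assumption):

```python
# Illustrative sketch only -- the penalty factor is an assumption.

def downrank(results: list[dict], flagged_urls: set[str], penalty: float = 0.2) -> list[dict]:
    for result in results:
        if result["url"] in flagged_urls:
            result["score"] *= penalty   # still indexed, just less discoverable
    return sorted(results, key=lambda r: r["score"], reverse=True)
```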
Search service type describing a subsection of general search services. Downstream general search services provide access to content from across the web, but they are distinct in that they obtain or supplement their search index from other general search services.
The supply or offer to supply of controlled drugs and/or psychoactive substances, and related offences.
E
Content which encourages, promotes, or provides instructions for an eating disorder or behaviours associated with an eating disorder.
User-to-user service functionality that allows users to send messages that are automatically deleted after they are viewed by the recipient, or after a prescribed period of time has elapsed.
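A minimal sketch of the two deletion triggers described above (first view, or a prescribed lifetime); the class is an assumed model, not any service's implementation:

```python
# Illustrative sketch only -- a message deleted after first view or after
# its prescribed lifetime, whichever comes first.
import time

class EphemeralMessage:
    def __init__(self, body: str, ttl_seconds: float):
        self.body = body
        self.expires_at = time.time() + ttl_seconds
        self.viewed = False

    def read(self) -> str | None:
        if self.viewed or time.time() > self.expires_at:
            return None          # message has been deleted
        self.viewed = True       # auto-delete after first view
        return self.body
```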
The sending of flashing images electronically with the intention of causing harm, where it is reasonably foreseeable that an individual with epilepsy would be among those who view it or where the sender believes that an individual they know or suspect to have epilepsy will or might view it.
Publicly available documents aimed at users of a service which provide an overview of a service’s rules about what content is allowed and what is not. These are often in the form of terms of service and/or community guidelines.
An offence under section 63 of the Criminal Justice and Immigration Act 2008 (possession of extreme pornographic images).
F
A service whose primary functionalities involve enabling users to (i) store digital content, including images and videos, on the cloud or dedicated server(s); and (ii) share access to that content through the provision of links (such as unique URLs or hyperlinks) that lead directly to the content for the purpose of enabling other users to encounter or interact with the content.
A wide variety of offences including, but not limited to, the purchase and sale of prohibited weapons, supplying firearms and imitation firearms to minors, and the purchase of firearms or ammunition without a certificate.
An offence under section 13 of the National Security Act 2023 (foreign interference).
A number of offences relating to fraud and financial services, such as, but not limited to, fraud by abuse of position, participating in a fraudulent business, or contravening the prohibition on carrying on regulated activity unless authorised or exempt.
For user-to-user services, functionalities include features that enable interaction between users. Functionalities for search services include features that enable users to search websites or databases, as well as features that make suggestions relating to users’ search requests. This is set out in sections 233(2-3) of the Act.
G
User-to-user service type describing services that allow users to interact within partially or fully simulated virtual environments.
Search service type describing services that enable users to search the internet and which derive search results from an underlying search index (developed by either the service or a third party).
AI models that can create text, images, audio and videos, typically in response to a user prompt.
This term refers to the structures that ensure the adequate oversight, accountability and transparency of decisions within a service which affect user safety. This is in relation to organisational structure as well as product and content governance.
An offence specified in any of paragraphs 5, 6, 11 or 12 of Schedule 6 to the Act.
User-to-user service functionality allowing users to send and receive messages through a closed channel of communication to more than one recipient at a time.
H
A range of offences such as, but not limited to, threats to kill, causing harassment, alarm or distress, causing fear of violence, and stalking.
Harm means physical or psychological harm.
As set out in the Act, harm can occur from isolated incidents of exposure, or from cumulative exposure. Cumulative harm arises when:
- harmful content (primary priority content, priority content or non-designated content) is repeatedly encountered by a child
- a child encounters harmful combinations of content - including encountering different types of harmful content, or a type of harmful content alongside a kind of content that increases the risk of harm from harmful content.
Harm can include circumstances of indirect harm, in which a group or individual are harmed, or the likelihood of harm is increased, as a consequence of another child seeing harmful content, which then affects their behaviours towards others.
Content which encourages a person to ingest, inject, inhale or in any other way self-administer a physically harmful substance or a substance in such a quantity as to be physically harmful.
Public order offences relating to stirring up hatred on the grounds of certain protected characteristics.
An age assurance process that is of such a kind and implemented in such a way that it is highly effective at correctly determining whether or not a particular user is a child.
Functionality providing direct access to another piece of data by clicking or tapping on specific content present on the service.
I
Illegal content is content that amounts to a relevant offence. Content consisting of certain words, images, speech or sounds amounts to a relevant offence if the:
- use of the words, images, speech or sounds amounts to a relevant offence
- possession, viewing or accessing of the content constitutes a relevant offence, or
- publication or dissemination of the content constitutes a relevant offence
A relevant offence is:
- a priority offence which is an offence specified in Schedule 5, 6, or 7 of the Online Safety Act and is broadly in line with the kinds of illegal content set out at the top of this page
- another offence that isn’t a priority offence, where the victim or intended victim is an individual or individuals, subject to certain specific exemptions set out in the legislation
Guidance about making illegal content judgements that Ofcom is required to produce under section 193 of the Act.
Search content that has been identified in the provider’s publicly available statement for the service as being subject to appropriate moderation action, where the provider is satisfied that illegal content is included within that kind of content (including but not limited to priority illegal content).
Content that has been assessed and identified as being in breach of the service’s terms of service, where the provider is satisfied that the terms in question prohibit the types of content that include illegal content (including but not limited to priority illegal content).
The duties in sections 10 and 27 of the Act.
Harms arising from illegal content and the commission and facilitation of priority offences.
CSAM in the form of photographs, videos, or visual images.
A design pattern in which a page loads content as a user scrolls down, allowing them to discover and view large amounts of content with no distinct end. This design pattern is typically associated with content recommender systems, where large volumes of personalised content are curated.
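Infinite scroll is typically backed by cursor-based pagination: each request returns a page of curated content plus a cursor for the next page, so the feed has no distinct end. A minimal sketch under that assumption:

```python
# Illustrative sketch only -- cursor-based pagination behind an endless feed.

def fetch_page(feed: list[dict], cursor: int = 0, page_size: int = 20):
    page = feed[cursor:cursor + page_size]
    next_cursor = cursor + len(page) if cursor + len(page) < len(feed) else None
    return page, next_cursor   # the client requests next_cursor as the user scrolls
```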
Information services are primarily focused on providing user-generated information to other users.
More detailed versions of external content policies which set out rules, standards or guidelines, including around what content is allowed and what is not, as well as providing a framework for how policies should be operationalised and enforced.
An offence of sharing or threatening to share intimate images or film.
L
A service with more than 7 million monthly active UK users.
User-to-user service functionality that allows users to simultaneously create and broadcast online streaming media in, or very close to, real time.
A service which the provider has not assessed as being medium or high risk in relation to any kind of content harmful to children in its risk assessment.
M
Online marketplaces and listing services allow users to buy and sell goods or services.
User-to-user service type describing services that are typically centred around the sending and receiving of messages that can only be viewed or read by a specific recipient or group of people.
Businesses that employ 1-9 full-time equivalent (FTE) staff.
When a service provider reviews and assesses content to determine whether it is harmful to children or not, or whether it is in breach of the terms of service or publicly available statement of the service, and takes appropriate action based on that determination. We use 'content moderation' when referring to user-to-user services, and 'search moderation' when referring to search services.
A scheme by which the provider of a service labels the user profile of a user who has made a payment to the provider of the service or some other person. Such schemes may be open to all users and payment may be regular or one-off. Users participating in the scheme may benefit from access to additional features on the service. The label to indicate that a user is participating in a monetised scheme may appear on that user's profile and/or any content they publish. Providers may or may not refer to such schemes as "verification" schemes.
A service is multi-risk if the provider has assessed the service as having medium or high risk of two or more specific kinds of content that is harmful to children.
Muting refers to a feature that enables a user to ‘mute’ another user. The muting user will not encounter any content posted by muted users on the service (unless the muting user visits the user profile of the muted user directly). The muted user is not aware that they have been muted and continues to encounter content posted by the muting user.
N
A functionality that by automated means, makes recommendations to connect with one or more specified users of the relevant service. Recommendations may include, but are not limited to, recommendations to connect with users that have similar interests, that are close geographically, that are associated with the same school or workplace, or that have a mutual connection.
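One of the signals mentioned above, mutual connections, can be computed from the connection graph alone. A minimal sketch (function and structure are our own assumptions):

```python
# Illustrative sketch only -- ranking candidate connections by the number of
# mutual connections they share with the user.

def recommend_by_mutuals(user: str, connections: dict[str, set[str]], limit: int = 5) -> list[str]:
    own = connections.get(user, set())
    counts: dict[str, int] = {}
    for friend in own:
        for candidate in connections.get(friend, set()):
            if candidate != user and candidate not in own:
                counts[candidate] = counts.get(candidate, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)[:limit]
```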
Non-designated content is a category of content that is harmful to children. It consists of content which is not primary priority content or priority content, of a kind which presents a material risk of significant harm to an appreciable number of children in the UK.
A scheme by which the provider of a service labels the user profile of a user to indicate to other users that they are notable. “Notable users” include but are not limited to politicians, celebrities, influencers, financial advisors, company executives, journalists, government departments and institutions, non-governmental organisations, financial institutions, media outlets, and companies. The label to indicate that a user is notable (for example a “tick” symbol) may appear on that user’s user profiles and/or any content they publish. Services may or may not refer to such schemes as “verification” schemes.
O
The process of live testing the operation of different variants of a content recommender system on a service across a control group and treatment groups comprised of users of the service. It involves the collection of data to produce metrics relating to certain identified factors, such as commercial or user safety.
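A common way to split users across a control group and treatment groups is deterministic hash-based bucketing, sketched below. The bucketing scheme is an assumption for illustration.

```python
# Illustrative sketch only -- the bucketing scheme is an assumption.
import hashlib

def assign_group(user_id: str, experiment: str, treatments: int = 1) -> str:
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % (treatments + 1)
    return "control" if bucket == 0 else f"treatment_{bucket}"
```

Metrics relating to the identified factors (for example engagement, or user safety reports) can then be aggregated per group and compared across recommender variants.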
P
Refers to a search service that falls within the definition of section 4 of the Act.
A user-to-user service, as defined in section 4 of the Act.
A notice requiring payment of a penalty in respect of a failure to comply with: the requirements of a confirmation decision; a Technology Notice; or a failure to pay a fee set under section 84 or Schedule 10 of the Act.
Content of such a nature that it is reasonable to assume that it was produced solely or principally for the purpose of sexual arousal.
For the purposes of the Act, pornographic content that is harmful to children specifically excludes content that consists only of text, or that consists of text accompanied only by identifying content (which may be text or another kind of content that is not itself pornographic), non-pornographic GIFs, emojis or other symbols, or any combination of these.
Services whose principal purpose is the hosting or dissemination of pornographic content and who host user-generated pornographic content. These services are subject to the risk assessment duties and the children’s safety duties. Pornography that is published or displayed by the provider of the service is subject to different duties set out in Part 5 of the Act and Ofcom has published separate guidance for providers subject to these duties.
User-to-user service functionality allowing users to upload and share content on open channels of communication.
An algorithmic functionality embedded in the search field of a search service. It operates by anticipating a user’s search query and suggesting possible related search requests (‘predictive search suggestions’), based on a variety of factors (including a user’s past queries and other user queries, locations, and trends) to help users make more relevant searches.
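As a minimal sketch, suggestions can be generated by matching the partial query against past queries ranked by frequency; this single signal is a stand-in for the variety of factors described above.

```python
# Illustrative sketch only -- real systems draw on many more signals
# (location, trends, the user's own history).
from collections import Counter

past_queries = Counter({"weather london": 50, "weather paris": 20, "west end shows": 35})

def suggest(prefix: str, limit: int = 3) -> list[str]:
    matches = [(q, n) for q, n in past_queries.items() if q.startswith(prefix.lower())]
    return [q for q, _ in sorted(matches, key=lambda m: -m[1])][:limit]

print(suggest("we"))   # ['weather london', 'west end shows', 'weather paris']
```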
Primary priority content is a category of content that is harmful to children. It consists of:
- pornographic content
- suicide content
- self-harm content
- eating disorder content
Priority content is a category of content that is harmful to children. It consists of:
- abusive content
- content which incites hatred
- bullying content
- violent content (provides instructions for)
- violent content (humans)
- violent content (animals or fictional creatures)
- dangerous stunts and challenges content
- harmful substances content
Content which amounts to a priority offence.
The offences set out in Schedules 5 (terrorism offences), 6 (CSEA offences) and 7 (priority offences) to the Act.
A collective term for three types of technology: content identification technology, user profiling technology, and behaviour identification technology (subject to certain exceptions), as defined in section 231 of the Act.
An offence under any of the following provisions of the Proceeds of Crime Act 2002:
- section 327 (concealing etc criminal property)
- section 328 (arrangements, facilitating, acquisition etc of criminal property)
- section 329 (acquisition, use, and possession of criminal property)
An all-encompassing term that includes any functionality, feature, tool, or policy that a service provides to enable users to interact with or use the service.
A group or organisation proscribed by the Secretary of State under section 3 of the Terrorism Act 2000.
Age; disability; gender reassignment; marriage and civil partnership; pregnancy and maternity; race; religion or belief; sex; and sexual orientation.
Any content that is published on a service by the service provider or someone acting on their behalf.
An internet service on which pornographic content (defined in the Act as ‘regulated provider pornographic content’) is published or displayed by the provider of the service.
A statement that search services are required to make available to members of the public in the UK, often detailing various information on how the service operates.
R
User-to-user service functionality allowing users to express a reaction, such as approval or disapproval, of content that is shared by other users, through dedicated features that can be clicked or tapped by users.
The entities that are recommended by Ofcom as trusted flaggers in relation to fraud.
The Record-Keeping and Review Guidance is designed to help service providers understand what is expected in relation to keeping written records of risk assessments and the measures taken to comply with the relevant duties and reviewing compliance with the relevant duties.
The assessment of the risks of harm from illegal content on user-to-user and search services that Ofcom is required to prepare under section 98 of the Act.
Content which amounts to a non-priority offence.
All priority offences and relevant non-priority offences.
User-to-user service functionality which allows users to re-share content that has already been shared by a user.
How a service generates income or revenue.
A service which enables users to create and view critical appraisals of people, businesses, products, or services.
Identifying and assessing the risk of harm to individuals from illegal content and content harmful to children, present on a Part 3 regulated service.
The guidance to assist services in complying with the risk assessment duties that Ofcom is required to produce under section 99 of the Act. Our risk assessment guidance has been published alongside our Statement.
A risk factor is a characteristic associated with the risk of one or more kinds of harm.
The possibility of individuals encountering harm on a Part 3 service.
Risk Profiles set out the risk factors (features and functionalities) that are most strongly linked to one or more kinds of illegal harm. Risk Profiles will help you to identify and record the relevant risk factors for your service.
S
Content that may be encountered in or via search results of a search service. It does not include paid-for advertisements, news publisher content, or content that reproduces, links to, or is a recording of, news publisher content.
Includes a service or functionality which enables a person to search some websites or databases but does not include a service which enables a person to search just one website or database.
A collection of URLs that are obtained by deploying crawlers to find content across the web, which is subsequently stored and organised.
In relation to a search service, this means content presented to a user of the service by operation of the search engine, in response to a search query made by a user.
An internet service that is, or includes, a search engine.
A process where the user is asked to provide their own age. This could be in the form of providing a date of birth to gain entry to a service or by ticking a box to confirm a user is over a minimum age threshold.
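The date-of-birth form reduces to a simple age calculation on an unverified input, as the sketch below shows; this is why self-declaration on its own is not a highly effective form of age assurance.

```python
# Illustrative sketch only -- the declared date of birth is unverified.
from datetime import date

def declared_age(dob: date, today: date | None = None) -> int:
    today = today or date.today()
    # subtract one if this year's birthday has not yet passed
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

meets_threshold = declared_age(date(2010, 6, 1)) >= 18
```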
Content which encourages, promotes or provides instructions for an act of deliberate self-injury.
A regulated user-to-user or search service.
The design of all the components that shape a user’s end-to-end experience of a service. These components can include the business model or decision-making structures, back-end systems and processes, the user interface, and off-platform interventions.
A characteristic that in general refers to the nature of the service. For example, social media services and messaging services.
An order that requires ‘ancillary providers’, such as search engines and payment services which facilitate the provision of the service, to take steps aimed at disrupting the non-compliant service’s business in the UK. These orders can also be made on a temporary (interim) basis.
Causing or inciting prostitution for gain, or controlling a prostitute for gain.
A business that employs 10-49 full-time equivalent (FTE) staff.
User-to-user service type describing services that connect users and enable them to build communities around common interests or connections.
Content which encourages, promotes or provides instructions for suicide.
These are actions taken by a service to mitigate the risk of harm arising from content harmful to children. This could include both human and automated moderation processes. This is set out in section 236 of the Act.
T
The duty under section 10(3)(b) of the Act for a user-to-user service to use proportionate systems and processes designed to swiftly take down any (priority or non-priority) illegal content when it becomes aware of it.
All documents comprising the contract for use of the service (or of part of it) by UK users.
A notice imposed under section 121(1) of the Act, to deal with terrorism content and CSEA content.
An offence specified in Schedule 5 to the Act, including but not limited to offences relating to proscribed organisations, encouraging terrorism, training and financing terrorism.
An entity which is a recommended trusted flagger and any other person:
- whom the provider has reasonably determined has expertise in a particular illegal harm; and
- for whom the provider has established a dedicated reporting channel
U
Shorthand for ‘user-to-user’ service, which means an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.
Offences relating to illegal entry, assisting unlawful immigration, or arranging or facilitating the travel of another person, or taking a relevant action, with a view to them being exploited.
Demographic make-up of the user base, including selected characteristics, intersectional dynamics and other relevant demographic factors.
Your user base is made up of the people who use your service. You may refer to them in various ways such as customers, clients, subscribers, visitors or similar terms. A user doesn't need to be registered with a service to be considered a user of that service. This is set out in section 227 of the Act.
Functionality type that comprises user-to-user service functionalities which allow users to communicate with one another, either synchronously or asynchronously. Includes communication across open and closed channels.
User-to-user service functionality that allows users to follow or subscribe to other users. Users must sometimes be connected in order to view all or some of the content that each user shares.
User-to-user service functionality allowing users to create online spaces that are often devoted to sharing content on a particular topic. User groups are generally closed to the public and require an invitation or approval from existing members to gain access. However, in some cases they may be open to the public.
Functionality type that comprises user-to-user service functionalities which allow users to identify themselves to other users.
Functionality type that comprises user-to-user service functionalities which allow users to find or encounter each other and establish contact.
User-to-user service functionality that is associated with a user account, that represents a collection of information shared by a user which may be viewed by other users of the service. This can include information such as username, biography, profile picture, etc., as well as user-generated content generated, shared or uploaded by the user using the relevant account.
User reports are a specific type of complaint about content, submitted through a reporting tool.
Content (a) that is (i) generated directly on the service by a user of the service, or (ii) uploaded to or shared on the service by a user of the service; and (b) which may be encountered by another user, or other users, of the service by means of the service.
User-to-user service functionality allowing users to search for user-generated content by means of a user-to-user service.
An internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.
V
Search service type describing services that enable users to search for specific topics, or products or services (such as flights or hotels) offered by third-party operators. Unlike general search services, they do not return search results based on an underlying search index. Rather, they may use an API or equivalent technical means to directly query selected websites or databases with which they have a contract, and to return search results to users.
User-to-user service type describing services that allow users to upload and share videos with the public.
Content which depicts real or realistic serious violence against an animal or fictional creature or depicts the real or realistic serious injury of an animal or fictional creature in graphic detail.
Content which depicts real or realistic serious violence against a person or depicts the real or realistic serious injury of a person in graphic detail.
Content which encourages, promotes or provides instructions for an act of serious violence against a person.