Content moderation and data protection
Latest updates - last updated 16 February 2024
16 February 2024 - This guidance was published.
In detail
- Why have you produced this guidance?
- Who’s it for?
- What does it cover?
- What doesn’t it cover?
- How do we use this guidance?
- How does this guidance relate to the OSA?
Why have you produced this guidance?
This guidance explains how data protection law applies when you use content moderation technologies and processes. It provides practical advice to help you comply with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018). Read it to understand the law and our recommendations for good practice.
It is not a comprehensive guide to data protection compliance. Where we have already covered a principle in our other guidance, we link to relevant further reading.
If you are processing children's personal information, you should conform to our Children's code. When we refer to a child, we mean anyone under the age of 18. The code is a statutory code of practice that sets out how information society services likely to be accessed by children can protect children's information rights online. It sets out 15 standards that you should implement if you are an information society service.
The Data Protection and Digital Information Bill was reintroduced to Parliament on 8 March 2023. When the Bill becomes law, it will amend elements of the DPA 2018 and the UK GDPR that are relevant to this guidance. We have written this guidance in line with the applicable law at the time of publication.
This guidance on content moderation is the first in a planned series of guidance products about online safety technologies.
This is part of our ongoing commitment to publishing guidance on online safety technologies, alongside our work to ensure regulatory coherence between the data protection and online safety regimes. We announced this in our 2022 joint statement with Ofcom on online safety and data protection.
Who’s it for?
This guidance is for organisations that use or are considering using content moderation. It is also for organisations that provide content moderation products and services. It is for both data controllers and processors.
The guidance is aimed at organisations that are carrying out content moderation to meet their obligations under the Online Safety Act 2023 (OSA). However, it also applies to organisations that carry out content moderation for other reasons.
Whether you are carrying out content moderation to comply with the OSA or for other purposes, you must comply with data protection law.
We expect that this guidance will be most relevant to trust and safety professionals. It will also be relevant to those in roles with a data protection compliance focus, such as data protection officers, general counsel, privacy-legal professionals and risk managers.
What does it cover?
It sets out how organisations deploying content moderation processes or providing content moderation services can comply with data protection law.
In this guidance we define ‘content moderation’ as:
- the analysis of user-generated content to assess whether it meets certain standards; and
- any action a service takes as a result of this analysis. (See the section ‘What do we mean by content moderation?’ for more information.)
This guidance focuses on moderation of user-generated content on user-to-user services. For the purposes of this guidance, we follow the definitions in the OSA.
Section 3(1) of the OSA defines a user-to-user service as:
“User-to-user service means an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.”
Section 55(3) of the OSA defines user-generated content as:
“User-generated content means content that is:
- (i) generated directly on the service by a user of the service, or (ii) uploaded to or shared on the service by a user of the service; and
- that may be encountered by another user, or other users, of the service by means of the service.”
This guidance discusses content moderation processes that are managed and administered by organisations. It applies to content moderation that is manual, partly automated and solely automated.
What doesn’t it cover?
This guidance sets out the requirements of data protection law where you process personal information in content moderation. It does not cover the following:
- Compliance with the specific obligations in the OSA. The regulator for the online safety regime is Ofcom. Please consult Ofcom's codes of practice and guidance for information about what you are required to do under the OSA.
- The use of behaviour identification and user profiling. In many cases, you may use content moderation alongside other systems and processes. This includes those that analyse the behaviour of users on a service, or build a profile of them to assess their characteristics. We plan to produce guidance on this in the future.
- Specific considerations that arise from on-device moderation, such as the application of the Privacy and Electronic Communications Regulations (PECR).
- The requirement in section 66 of the OSA for regulated services to report all detected and unreported Child Sexual Exploitation and Abuse (CSEA) content to the National Crime Agency (NCA). We will publish further data protection guidance about this when the regulations implementing the requirement are in place.
How do we use this guidance?
To help you to understand the law and good practice as clearly as possible, this guidance says what organisations must, should, and could do to comply.
Legislative requirements
- Must refers to legislative requirements.
Good practice
- Should does not refer to a legislative requirement, but to what we expect you to do to comply effectively with the law. You should do this unless there is a good reason not to. If you choose to take a different approach, you must be able to demonstrate that this approach also complies with the law.
- Could refers to an option or example that you could consider to help you to comply effectively. There are likely to be various other ways you could comply.
This approach only applies where indicated in our guidance. We will update other guidance in due course.
We plan to keep this guidance under review and update it where appropriate, for example to reflect Ofcom’s final online safety codes of practice and guidance.
How does this guidance relate to the OSA?
The OSA sets out rules for user-to-user and search services. These services have new duties to protect UK users by assessing and responding to risks of harm. This includes duties on user-to-user service providers to:
- use proportionate measures to prevent users from encountering certain types of illegal content; and
- use proportionate systems and processes to swiftly remove any illegal content after becoming aware of its presence on the service.
If a service is likely to be accessed by children, the OSA sets out duties for the protection of children. The OSA also includes specific duties for services that display or publish provider pornographic content.
Ofcom is the regulator for the OSA. It is responsible for implementing the regime and supervising and enforcing the online safety duties. Ofcom is publishing codes of practice and guidance that will provide more detail about the regime and explain how you can comply with your new duties.
The OSA sits alongside data protection law. Compliance with one does not necessarily mean compliance with the other. If you are carrying out content moderation that involves personal information, you must comply with data protection law.
Further reading
- Children's code, including the section on 'Services covered by this code'.
- Ofcom's OSA codes of practice and guidance.
- Online Safety Act 2023.