Profiling for automated decision-making
Profiling refers to any form of automated processing of personal data that assesses or predicts people’s behaviour, interests or characteristics. In the context of automated decision-making, services may use these profiles to:
- allow or restrict access to services (for example, age assurance); or
- enforce policies and community standards (for example, automated moderation of content).
The Children’s code Profiling standard outlines expectations for online services that are likely to be accessed by children and that profile child users for automated decision-making. The links below give examples and information on how this profiling impacts children’s rights under the UNCRC. We also offer code recommendations on how to positively support these rights and mitigate risks to them:
- Article 13: Freedom of expression
- Article 16: Protection of privacy
- Children’s code recommendations on profiling for automated decision-making
Article 13: Freedom of expression
Children have a right to seek, receive and impart information and ideas of all kinds, through any medium of their choice.
Profiling for automated decision-making can put this right at risk where services use automated moderation to assess children’s online speech against their community standards, or do so without adequate transparency and safeguards.
Article 16: Protection of privacy
Children have a right to be protected from arbitrary or unlawful interference with their privacy.
This right is put at risk by profiling that:
- infers children's personal information;
- is on by default;
- is used without adequate transparency or safeguards; or
- is not in the best interests of the child.
Children’s code recommendations on profiling for automated decision-making:
- Switch options which use profiling ‘off’ by default. Do this unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child. Examples of a compelling reason include:
- profiling to meet a legal or regulatory requirement (such as safeguarding);
- to prevent child sexual exploitation or abuse online; or
- for age assurance.
- Differentiate between the different types of profiling you use for different purposes, and offer a separate privacy setting for each purpose. Don’t bundle them into one consent notice or privacy setting (a minimal sketch of such a settings model follows this list).
- Provide information to child users at the point at which you activate any profiling. You should explain what happens to the child’s personal data and any risks that may arise from it. Also provide age-appropriate prompts to seek assistance from an adult. Do not activate the profiling if the child is uncertain or doesn’t understand.
- If profiling is on, ensure that you put appropriate measures in place to safeguard the child (in particular from inappropriate content). Such measures, illustrated in the moderation sketch after this list, could include:
- contextual tagging;
- robust reporting procedures; and
- elements of human moderation.
- Obtain assurances from, and carry out due diligence on, any third parties you share data with to perform profiling. You need to ensure they do not use children’s data in ways that conflict with children’s best interests.
- Introduce measures to ensure accuracy, avoid bias and explain your use of AI-based profiling.
- Consult the ICO if profiling of children carries a residual high risk that you cannot mitigate.
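To make the default-off, unbundled-settings and activation recommendations above concrete, here is a minimal sketch of how a service might model per-purpose profiling settings. It is a hypothetical illustration under stated assumptions, not code from the Children’s code: the purpose names, the `COMPELLING_REASONS` set, the assumption that those purposes qualify as compelling reasons, and the `activateProfiling` flow are all invented for illustration.

```typescript
// Hypothetical sketch: per-purpose profiling settings, off by default.
// Purpose names and the "compelling reason" set are illustrative assumptions,
// not terms defined by the Children's code.

type ProfilingPurpose =
  | "age-assurance"
  | "csea-prevention"
  | "safeguarding"
  | "content-recommendation"
  | "advertising";

const ALL_PURPOSES: ProfilingPurpose[] = [
  "age-assurance",
  "csea-prevention",
  "safeguarding",
  "content-recommendation",
  "advertising",
];

// Purposes for which the service has documented a compelling reason
// (taking account of the best interests of the child) may default to on;
// everything else defaults to off.
const COMPELLING_REASONS: ReadonlySet<ProfilingPurpose> = new Set([
  "age-assurance",
  "csea-prevention",
  "safeguarding",
]);

// Each purpose gets its own setting: nothing is bundled into one toggle.
function defaultSettings(): Map<ProfilingPurpose, boolean> {
  return new Map(
    ALL_PURPOSES.map((p): [ProfilingPurpose, boolean] => [
      p,
      COMPELLING_REASONS.has(p),
    ]),
  );
}

// Gate activation on an age-appropriate explanation being understood.
// `explanationAcknowledged` stands in for whatever UX confirms understanding;
// if the child is uncertain or does not understand, the setting stays off.
function activateProfiling(
  settings: Map<ProfilingPurpose, boolean>,
  purpose: ProfilingPurpose,
  explanationAcknowledged: boolean,
): boolean {
  if (!explanationAcknowledged) {
    return false; // do not activate; prompt the child to seek an adult's help
  }
  settings.set(purpose, true);
  return true;
}
```

The point of the sketch is structural: each purpose is a separate setting with its own default, and switching a non-default purpose on is a distinct, explained step rather than a bundled consent.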
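Similarly, the safeguarding measures above can be read as a simple moderation flow in which automated decisions are only taken when they are clear-cut, and everything else falls back to human moderation. Again this is a hypothetical sketch: the labels, the 0.9 confidence threshold and the review queue are assumptions, not requirements of the code.

```typescript
// Hypothetical sketch: automated moderation with human-in-the-loop safeguards.
// The labels, confidence threshold and queue are illustrative assumptions.

interface Classification {
  label: "ok" | "borderline" | "inappropriate"; // output of contextual tagging
  confidence: number; // 0..1, from an upstream classifier (assumed)
}

interface ModerationDecision {
  show: boolean;
  escalateToHuman: boolean; // an element of human moderation
}

function moderateForChild(c: Classification): ModerationDecision {
  if (c.label === "inappropriate" && c.confidence >= 0.9) {
    return { show: false, escalateToHuman: false }; // clear-cut: withhold
  }
  if (c.label === "ok" && c.confidence >= 0.9) {
    return { show: true, escalateToHuman: false }; // clear-cut: show
  }
  // Borderline or low-confidence content is withheld and escalated to a
  // human moderator rather than decided automatically.
  return { show: false, escalateToHuman: true };
}

// Robust reporting: a child-initiated report always reaches human review.
const humanReviewQueue: string[] = [];

function reportContent(contentId: string): void {
  humanReviewQueue.push(contentId);
}
```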