Profiling for content delivery and service personalisation
Profiling refers to any form of automated processing of personal data that assesses or predicts a person’s behaviour, interests or characteristics. In the context of content delivery and service personalisation, services use these profiles to suggest content and features that align with a user’s interests, capabilities and needs, and to promote different user experiences and service features accordingly.
The Children’s code Profiling standard outlines expectations for online services likely to be accessed by children that profile child users for content and service personalisation. The sections below give examples and information on how this profiling impacts children’s rights under the UNCRC. We also offer code recommendations on how to support these rights and mitigate risks to them:
- Article 6: Life, survival and development
- Article 17: Access to news media and information
- Article 19: Protection from violence, abuse and neglect
- Article 24: Access to health and health services
- Article 32: Protection from economic exploitation
- Article 33: Protection from drug abuse
- Children’s code recommendations on profiling for content delivery and service personalisation
Article 6: Life, survival and development
Children have an inherent right to life and survival. Their physical and emotional development should not be impeded.
Profiling for content delivery supports this right where the content promotes positive health behaviours or online safety tools. This right is at risk where profiling for content delivery exposes children to damaging content (for example age-inappropriate products, suicide and self-harm content or inaccurate health information).
Article 17: Access to news media and information
Children have a right to information from a diversity of digital media sources, and in particular those that promote their social wellbeing and general health.
Services can support this right by using profiling and personalised targeting to deliver news information in the best interests of the child. This right is at risk where profiling exposes children to information that is not in their best interests (for example misinformation), or that breaches regulatory standards (for example Ofcom’s broadcasting code).
Article 19: Protection from violence, abuse and neglect
Children have a right to be protected from all forms of physical or mental violence, abuse, maltreatment or exploitation.
This right is at risk where profiling and personalisation exposes children to violent or abusive content.
Article 24: Access to health and health services
Children have a right to the highest attainable standards of health, and access to health care information and services online.
Services can support this right where profiling and content personalisation promote public health messaging and advice. This right is at risk where this profiling exposes children to inaccurate health information.
Article 32: Protection from economic exploitation
Children have a right to be protected from economic exploitation of all forms.
This right is at risk where services use profiling to target children with adverts and revenue-generating service features (for example loot boxes or in-game purchases). Some services do this through on-by-default settings, without adequate transparency or safeguards. Profiling and personalisation can also pose risks to this right through targeted advertising of fraudulent or misrepresented products.
Services can support this right by using profiling and content personalisation to target content that promotes financial literacy.
Article 33: Protection from drug abuse
Children have a right to be protected from the illicit use of drugs and age-restricted substances.
This right is at risk where services use profiling to target age-restricted products at children (for example alcohol). Services can support this right by using personalised targeting to provide information that protects children from drug abuse.
Children’s code recommendations on profiling for content delivery and service personalisation:
- Switch options which use profiling ‘off’ by default. Do this unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child. Examples of a compelling reason include:
- profiling to meet a legal or regulatory requirement (such as safeguarding);
- to prevent child sexual exploitation or abuse online; or
- for age assurance.
- Always provide a privacy setting for behavioural advertising. This is advertising that you use to fund a service but that is not part of the core service.
- Differentiate between the types of profiling you carry out for different purposes. Offer a separate privacy setting for each of these purposes; don’t bundle them into one consent notice or privacy setting (the first sketch after this list illustrates one way to model this).
- Provide information to child users at the point at which you activate any profiling. You should explain what happens to the child’s personal data and any risks that may arise from it. Also provide age-appropriate prompts to seek assistance from an adult. Do not activate the profiling if the child is uncertain or does not understand.
- Provide options for children to tailor how content is personalised. This could include content controls.
- If profiling is on, ensure that you put appropriate measures in place to safeguard the child, in particular from inappropriate content (the second sketch after this list illustrates one such approach). Such measures could include:
- contextual tagging;
- algorithmic risk assessments;
- transparent information on how content is recommended;
- robust reporting procedures; and
- elements of human moderation.
- For behavioural advertising, follow the Committee of Advertising Practice (CAP) guidance on online behavioural advertising, which specifically covers advertising to children.
- For data-enabled delivery of online content to children, ensure the content does not breach Ofcom’s code of practice for broadcasters where it relates to people under 18.
- For data-enabled delivery of news, refer to the Independent Press Standards Organisation’s Editors’ Code of Practice provisions about reporting and children.
- Obtain assurances from, and carry out due diligence on, third parties you are sharing data with to perform profiling. You need to ensure they only use children’s data in ways that are in the children’s best interests.
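The default-off and per-purpose settings recommendations above can be illustrated with a minimal sketch. The TypeScript below is a hypothetical model, not part of the Children’s code or of any real service’s API: the purpose names, the ProfilingSettings type and the defaultSettingsForChild and setPurpose functions are all assumed for illustration. It shows each profiling purpose held as its own switch, everything off by default, with only purposes backed by a compelling reason (here, safeguarding and age assurance) enabled.

```typescript
// Hypothetical illustration only: the purpose names and types are assumptions,
// not taken from the Children's code or any real service.

// Each profiling purpose gets its own setting; purposes are never bundled.
type ProfilingPurpose =
  | "contentPersonalisation"
  | "behaviouralAdvertising"
  | "safeguarding"   // e.g. preventing child sexual exploitation or abuse
  | "ageAssurance";

type ProfilingSettings = Record<ProfilingPurpose, boolean>;

// Defaults for a child user: profiling is off unless there is a compelling
// reason (here, safeguarding and age assurance) for it to be on by default.
function defaultSettingsForChild(): ProfilingSettings {
  return {
    contentPersonalisation: false, // off by default; the child can switch it on
    behaviouralAdvertising: false, // off by default; always a separate setting
    safeguarding: true,            // compelling reason: legal/regulatory duty
    ageAssurance: true,            // compelling reason: establishing age
  };
}

// Each purpose is switched individually, never as a bundle.
function setPurpose(
  settings: ProfilingSettings,
  purpose: ProfilingPurpose,
  enabled: boolean
): ProfilingSettings {
  const updated = { ...settings };
  updated[purpose] = enabled;
  return updated;
}
```

In practice you would also need to justify and document any purpose enabled by default, and present each switch to the child in age-appropriate language.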
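The safeguarding measures listed under “If profiling is on” can be sketched in the same way. The hypothetical TypeScript below assumes a ContentItem shape, a set of blocked contextual tags and two risk thresholds, none of which come from the code itself; it shows contextual tagging and an algorithmic risk score gating a recommender’s output, with uncertain items routed to human moderation.

```typescript
// Hypothetical sketch: the content shape, tags and thresholds are assumptions
// made for illustration, not requirements from the Children's code.

interface ContentItem {
  id: string;
  tags: string[];    // contextual tags applied when content is ingested
  riskScore: number; // 0 (low risk) to 1 (high risk) from a risk model
}

const BLOCKED_TAGS = ["self-harm", "age-restricted", "gambling"];
const HUMAN_REVIEW_THRESHOLD = 0.4; // uncertain items go to human moderation
const BLOCK_THRESHOLD = 0.8;        // high-risk items are never recommended

type Decision = "recommend" | "humanReview" | "block";

// Decide, per item, whether it can be recommended to a child user.
function gateForChild(item: ContentItem): Decision {
  if (item.tags.some((tag) => BLOCKED_TAGS.includes(tag))) {
    return "block"; // contextual tagging catches known inappropriate content
  }
  if (item.riskScore >= BLOCK_THRESHOLD) {
    return "block"; // algorithmic risk assessment
  }
  if (item.riskScore >= HUMAN_REVIEW_THRESHOLD) {
    return "humanReview"; // element of human moderation
  }
  return "recommend";
}

// Filter a recommender's candidate list before it reaches the child.
function safeRecommendations(candidates: ContentItem[]): ContentItem[] {
  return candidates.filter((item) => gateForChild(item) === "recommend");
}
```

The tags and thresholds are placeholders; the point is the ordering: tagging and risk assessment sit between the recommender and the child, and blocked or escalated decisions should feed transparent information and robust reporting procedures.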
Consult with the ICO if there is a residual high risk of profiling of children that you can’t mitigate.