The ICO’s response to the Children’s code strategy call for evidence
Latest updates - 03 March 2025
03 March 2025 - This response was published
Introduction
Our Children’s code strategy set out that our focus in 2024/5 would be to drive further improvements in how children’s personal information is used by social media platforms (SMPs) and video sharing platforms (VSPs).
We launched a call for evidence (CfE) in August 2024 to request views and further evidence from stakeholders on two areas of our strategy:
- the use of children’s personal information in recommender systems; and
- the use of personal information of children under the age of 13.
We also sought views from respondents on the impacts of the code and their opinions on our approach, including any changes they would like to see.
The CfE ran for 11 weeks and closed on 18 October 2024. We received 20 responses across a number of sectors. 1
This document summarises the information we received from respondents, as well as our responses, according to the following thematic areas:
- Recommender systems
- Age assurance to identify under 13-year-olds
- Impacts of the code
- Views on the ICO’s approach
We are feeding these responses into our ongoing work on the strategy, as set out in the further update we published alongside this document.
Recommender systems
We sought evidence on the following:
- How children’s personal information is used in the design and operation of recommender systems used by SMPs and VSPs.
- Whether, and if so how, platforms are using children’s personal information to recommend content on SMPs and VSPs in ways that could lead to children spending extensive amounts of time on the platform.
- Good practice examples about processing children’s personal information in recommender systems on SMPs and VSPs.
- Whether using children’s information in recommender systems might have particular impacts on children belonging to specific groups, including children with protected characteristics. 2
11 out of 20 respondents provided their views on using children’s information in recommender systems. The following summarises key themes.
Commercial interests when operating recommender systems
Some respondents stated that service designers prioritise maximising time spent on a platform, user reach and activity when designing recommender systems. This results in extensive engagement, with children staying on platforms for longer periods of time. According to respondents, the purpose of maximising children’s time spent online is to boost revenue for services by monetising data processing through advertising and other means. They raised concerns that this is done at the expense of children’s safety, without children’s best interests in mind.
Several submissions pointed to public statements made by senior executives at major online platforms drawing a link between a platform’s revenue and the amount of time that users (including children) spend on it.
One response noted that heuristics (ie metrics that help companies track and measure user engagement on platforms) are also used to encourage engagement with videos, again to boost commercial revenue.
Lack of transparency about how recommender systems work
Some respondents advised that there is a lack of transparency from platforms about how their recommender systems work. One respondent reported that this prevents users from being informed about how their personal information is used for this purpose, and from understanding what tools and safety mechanisms are in place to help them manage this.
Another respondent argued that platforms were deliberately opaque about how their recommender systems work and that children are often subject to profiling and automated processing without adequate transparency or accountability. In their view, such practices place children at a higher risk of harm by providing a highly individualised service over which children have little control and little understanding of how their information is used.
One submission highlighted the importance of ensuring that the explanation that services provide about how they use personal information is understandable to children, and appropriately tailored to their needs and how they use online services.
Extensive data collection
Respondents noted the wide range of personal information about children that can be processed to inform content recommendations. This includes:
- information that a person has posted online;
- their general activity on a service;
- the history of their interactions with other users; and
- how long they dwell on particular content.
Several respondents highlighted the interconnected nature of recommender systems and targeted advertising. Both commonly rely on the same datasets, which makes it difficult to separate the two. They felt that effective analysis of both is necessary as the revenue generated from advertising influences what content is recommended to users.
Respondents raised general concerns that processing a wide range of information to inform recommender systems was excessive and not in the best interests of the child. More specific concerns were raised about the results of this data collection:
- Extensive data harvesting can create a ‘feedback loop’, where the more information is collected about a child, the more personalised their content feed becomes. Consequently, children are more likely to engage with the platform for extended periods of time, providing further opportunities for the service to collect information about them (a minimal illustrative sketch of this loop follows this list). For children with pre-existing mental health conditions, the result of this loop could be an increased likelihood that harmful content is recommended to them (eg material about self-harm or suicidal ideation).
- Data processing on platforms is used for a wide range of purposes, and recommendations may be informed by information collected from a range of services. This could mean that children are unaware that information they share online for unrelated reasons is also being used to inform content recommendation algorithms. A particular concern was raised that information shared by children for educational purposes was being used for content recommendations.
- Platforms are able to infer characteristics of a child, including age, gender, ethnicity and behavioural characteristics, even if the child has not knowingly volunteered this information. This process of inferring information about children was felt to be intrusive.
- Children’s information and their engagement behaviours, which may reveal psychological characteristics about them, are being used to shape content recommendations. This risks platforms creating comprehensive profiles of children without justification.
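To illustrate the feedback loop respondents described, the following minimal sketch (in Python) shows how logging each interaction back into a profile can progressively narrow what is recommended. It is a hypothetical toy model with invented names, not a description of any platform’s actual system.

```python
# Illustrative sketch only: a deliberately simplified model of the engagement
# 'feedback loop' described above. Names and logic are hypothetical and do not
# reflect any platform's actual recommender system.
from collections import Counter

def recommend(profile: Counter, catalogue: list[str]) -> str:
    """Recommend the topic the user has engaged with most, or a default item."""
    if profile:
        return profile.most_common(1)[0][0]
    return catalogue[0]

profile: Counter = Counter()
catalogue = ["sport", "music", "crafts"]

for _ in range(5):
    topic = recommend(profile, catalogue)
    # Each interaction is logged to the profile...
    profile[topic] += 1
    # ...and immediately shapes the next recommendation, narrowing the feed
    # around whatever the child engaged with first and inviting further engagement.

print(profile)  # Counter({'sport': 5}): the loop converges on a single interest
```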
Data protection harms
Respondents referred to a number of pieces of research on how recommender systems are not tailored to children’s developmental needs, and on their neurological impacts. 3 Respondents also highlighted several studies 4 which describe the harms that children and young people experience as a result of recommender systems (eg psychological and physical harms).
The responses highlighted two areas:
- Amplification of inappropriate content: Several respondents reported that recommender systems are leading users to inappropriate or illegal content. They cited research that demonstrates how some recommender systems amplify harmful content (eg violent or misogynistic content). This included an example of recommended accounts that were sharing real and AI-generated child sexual abuse material. 5 More specifically, concerns were raised that:
- teenage boys are likely to be recommended misogynistic content;
- teenage boys are more likely to be recommended violent content;
- recommender systems can promote extreme material or misinformation;
- inappropriate content can lead to users feeling worse about their appearance, and this particularly affects teenage girls; and
- content about self-harm and suicidal ideation can be promoted by recommender systems, and this particularly affects children who have pre-existing mental health conditions.
- Addictiveness and harmful design:
- Several respondents argued that SMPs and VSPs use addictive design features to keep users engaged. For example, endless scroll and autoplay features drive users to spend more time online, with one of the purposes being to serve more advertising.
- Another respondent explained that SMPs and VSPs use design features such as clicks, likes and shares to collect more information about children. This could then be used to encourage them to spend more and more time on a platform. The respondent compared this to how a slot machine promotes addictive behaviours. 6 They also explained how this process is linked to the release of dopamine in the brain, which gives children a feeling of ‘reward’ and encourages them to stay online. 7
- Some argued that these features disproportionately impact children and young people because of their stage of cognitive development. 8 They provided further examples of how recommender systems are not designed with children’s developmental needs in mind.
- The responses also highlighted that addictive service design can aggravate harmful behaviours and content online. For example, when recommender systems are combined with advertising business models, this can greatly increase the risk of children accessing harmful content. Standard 12 of the code explains that the profiling of children online should be ‘off’ by default, including for behavioural advertising. 9 Responses argued that the presence of behavioural advertising on a platform means that children are being treated in the same way as adults, so harmful content is more likely to be recommended to them.
Impacts on children with protected characteristics
Some respondents flagged the potential for discrimination against children with protected characteristics (such as race). 10 One respondent noted how these children are more at risk from the negative impacts of recommender systems.
Another respondent highlighted that a balance needs to be struck about how content is personalised in recommender systems for these children. On the one hand, children are at risk of being reduced to their protected traits, if there’s too much focus on these characteristics. On the other, providing a generic, non-personalised experience to children with protected characteristics may be viewed by some as being discriminatory. This is because a non-tailored experience may prevent these users from engaging with relevant content that resonates with their interests.
Good practice
While respondents did not put forward specific examples of existing good practice, several set out their own recommendations, including:
- not using behavioural profiling of children to inform recommender systems;
- prioritising children’s safety by focusing on providing relevant and credible content, rather than maximising the time children spend online;
- performing a Children’s rights impact assessment 11 to ensure that services respect and remedy a full range of children’s rights, and adopt a child-centred approach (including how recommender systems function);
- removing addictive features, such as endless scroll and autoplay, to prioritise safety over engagement; and
- introducing features to encourage healthier online behaviours, such as notification controls and allowing users to reset their feeds.
The ICO’s response
The responses reinforce our concerns about:
- the volume and range of personal information collected to profile children for content recommendations;
- the lack of transparency about how platforms use this information to make recommendations; and
- the potential for recommender systems to maximise children’s engagement at the expense of children’s privacy and wellbeing.
Our progress update, published alongside this document, sets out our concerns in this area and the action we are taking.
Age assurance to identify under 13-year-olds
We sought evidence on recent developments in age assurance technologies to support our work on the use of personal information of children under 13 years old. In particular:
- the effectiveness of profiling techniques to identify users under the age of 13 on a platform;
- the approaches SMPs and VSPs are taking to reduce the potential risk that under 13s try to circumvent age assurance 12; and
- innovative new age assurance practices.
18 of the 20 respondents answered our questions on age assurance. There were a number of common themes.
Profiling to identify users under the age of 13
A number of respondents noted that there is a lack of information about how profiling for age assurance works. Some raised concerns about profiling because it involves processing users’ information before their age has been established and therefore may expose underage users to harm.
One respondent considered profiling for the purpose of age assurance to be rudimentary compared to more sophisticated algorithms that are used for content recommendations and targeted advertising. This suggests there may be scope for improving the way profiling is used for this purpose.
Another respondent suggested there was potential to use profiling for the benefit of children, if done without bias. This is because it can be used to identify at-risk children based on their online behaviour and demographic. 13
New age assurance practices
The age assurance vendors who responded provided information about the technologies they supply. These include facial age estimation, email address age estimation and technology reliant on attribute and device-level data.
Some respondents also provided information on emerging developments, such as the use of hand movement and geometry to estimate age. 14
One respondent noted that a new age assurance system is in development in Germany. This uses a randomised number assigned to users to obtain proof of an age bracket from an authorised verification body (eg a bank or school administration). The user can then transmit the results of the check to the online service provider. The system relies on a ‘double blind’ approach - the platform does not have information about the identity of the person, nor the body which performed the verification process. At the same time, the verification body does not know which platform or service the person is trying to access.
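As an illustration of the information flow described in that response, the sketch below (in Python) models which party sees which pieces of information in a double-blind age check. It is a hypothetical toy based only on the description above; the names are invented, and a real scheme would rely on cryptographic attestation rather than plain data objects.

```python
# Illustrative sketch only: a toy model of the 'double blind' age check described
# above. All names are hypothetical; a real system would use signed, tokenised
# proofs rather than the plain objects shown here.
import secrets
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeBracketProof:
    token: str        # random number generated for the user
    age_bracket: str  # eg "13-17" or "18+"

class VerificationBody:
    """Eg a bank or school administration. It knows the user's identity and the
    token, but is never told which platform the proof will be presented to."""
    def issue_proof(self, user_identity: str, token: str) -> AgeBracketProof:
        age_bracket = self._look_up_age_bracket(user_identity)
        return AgeBracketProof(token=token, age_bracket=age_bracket)

    def _look_up_age_bracket(self, user_identity: str) -> str:
        return "13-17"  # placeholder for the body's own records

class Platform:
    """The online service. It receives only the token and age bracket - not the
    user's identity, and not which body performed the verification."""
    def accept(self, proof: AgeBracketProof) -> bool:
        return proof.age_bracket in {"13-17", "18+"}

# Flow: the user generates a random token, presents it to an authorised
# verification body, and forwards only the resulting proof to the platform.
token = secrets.token_hex(16)
proof = VerificationBody().issue_proof("jane.example", token)
print(Platform().accept(proof))  # True
```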
Some responses highlighted that market developments in age assurance are primarily focused on the 18+ market, and innovation for other age thresholds and brackets may only be driven by a regulatory need to do so.
Other points made about age assurance
- Several respondents raised concerns about the efficacy of self-declaration. 15 They agreed with our position that using self-declaration in isolation is inappropriate for services that are considered high-risk because it is unlikely to be accurate or effective. 16
- One respondent called for a noticeable improvement in the adoption of age assurance measures and highlighted the need to ensure measures have an appropriate level of technical accuracy to protect children online.
- Two respondents supported ensuring high privacy protections for all, for example by applying all the standards of the code to all users. They considered that this would remove the requirement for age assurance, which would also address concerns about privacy and potential circumvention.
The ICO’s response
On profiling to identify users under the age of 13, we agree with respondents that there is limited information about how profiling is currently used for age assurance and none of the responses we received provided further information on this. As set out in our progress update, we are writing to six platforms to better understand their approach and will consider next steps in light of the information we receive. 17
On developments in new age assurance technologies, we welcome the information provided and have followed up with those that referred to new age assurance technologies. This ensures that we remain informed about technology developments and understand any relevant data protection considerations.
On the point raised about applying the standards of the code to all users, the code is clear that organisations have a choice when implementing standard three on age-appropriate application. They can either assess the age of their users to a degree of certainty appropriate to the data protection risks on their platform, or apply all standards of the code to all users. If platforms choose the first option, we expect them to be able to explain why they deemed the age assurance measure(s) they have chosen to be the most appropriate and effective.
Our progress update, published alongside this document, sets out the action we have taken about the use of personal information of children under 13 and our next steps.
Impacts of the code
We are committed to making timely, informed and impactful decisions. We draw on evidence and insight and understand the impacts of our interventions to ensure that we are making a material difference.
We asked industry respondents for their views on any challenges or barriers to implementing changes due to the standards in the code. Two respondents provided examples of these, including:
- Initial challenges, when the code first came into force, in interpreting “likely to be accessed by children”. While the respondent noted that we had since provided clarification on this, they felt there had been a lack of practical application; for example, adult sites were absent from the research for the call for evidence and there had been a lack of enforcement in this area.
- One trade association indicated that their members had to spend resources (cash, time and opportunity cost) to ensure compliant implementation, including:
- further risk assessments (based on new guidance);
- developing new policies; and
- considering age assurance technologies.
- This respondent also reported increased complaints from consumers due to introducing new age gates.
We also asked respondents whether implementing the code had led to any benefits for their organisation or sector to date. Two respondents indicated that this question was not applicable, while one respondent (a trade association) reported benefits to their sector. That respondent stated that many services now have improved transparency around data collection and use, and that the process of implementing the code had been a helpful exercise for the industry in terms of:
- evaluating the application of GDPR for children’s data processing;
- refreshing and updating those standards; and
- encouraging the industry to consider what future technologies it might offer to further enhance privacy.
The ICO’s response
We will continue to monitor and review the impact of the strategy, alongside the information provided in the responses to this call for evidence. We will do this in line with the principles and approaches set out in our Impact assessment framework and Ex-post impact framework. This will include us engaging further with stakeholders affected by the strategy, where appropriate and proportionate.
Views on the ICO’s approach
We asked respondents for their views on our work to improve children’s privacy, including whether they support our current strategy.
We found that 85% of respondents were fully or partially supportive of the action we are taking to ensure SMPs and VSPs comply with the code. 18
Several respondents also provided positive feedback about our strategy.
They welcomed:
- our consultative approach;
- the strategy’s intention to raise awareness among children and their parents about children’s online privacy; and
- our regulatory efforts to protect children’s privacy and safety online.
Other stakeholders commented on the current scope of our work, our approach to regulatory action and the need for increased alignment with other regulators. We’ve summarised comments on these three areas and provided a response below.
The scope of the strategy
A number of respondents queried the scope of our strategy, suggesting that it should be broader. In particular:
- Some questioned why we had a particular focus on using information of children that are under 13 years of age and suggested that we should promote age-appropriate design across all ages instead.
- Some suggested that our current focus on SMPs and VSPs was too narrow and may be interpreted by some organisations as meaning that the code does not apply to other sectors:
- Some argued that we should address data protection issues that may not be immediately apparent in EdTech services, and provide effective remedies for children.
- Some argued that we should identify how children’s data protection applies in generative AI systems, and how AI companies should apply the children’s code.
- Others were concerned that we were not looking at adult services. They said that children were accessing these sites in significant numbers and, to date, we have not taken any substantive enforcement action on this.
In addition, one respondent called for us to revisit our Best interests framework. 19 This framework explains to online services how they can design their products with the best interests of the child in mind, in line with the requirements of the code and Article 3 of the United Nations Convention on the Rights of the Child. The response argued that online services are ’using and abusing’ this framework to place their business needs ahead of the rights of children. They argued that we should provide the framework and process for making a best interests of the child assessment, rather than this being the responsibility of platforms.
The ICO’s response
As noted in our August update, data protection law and our code seek to protect the personal information of children of all ages within the digital world so that they have an age-appropriate experience. 20 One of the areas of focus under our current strategy is the use of the information of children under 13 years old. This reflects the additional support needs of younger children due to their earlier stage of development, and the additional protection that UK data protection law provides for such users. 21 Whilst this forms one element of the strategy, our other four workstreams are relevant to the use of children’s personal information more generally.
Our work under the strategy is focused on SMPs and VSPs. We prioritised these services for 2024/5 because they are widely used by children and have been identified as presenting potential risks to them. 22
However, UK data protection law and the code continue to apply much more widely. They apply to all information society services likely to be accessed by children. 23 We already have ongoing work in the additional areas that some stakeholders thought we should look at:
- EdTech: We are currently undertaking a programme of audits focused on the development, provision and use of EdTech solutions in schools, to understand the privacy risks and potential non-compliance with data protection legislation. During the Data (Use and Access) Bill’s passage through parliamentary scrutiny, the government has committed to using secondary regulation-making powers to require the ICO to produce a code of practice on EdTech. We welcome this commitment and approach, which provides a vital opportunity to develop the exact scope of the code, drawing on insights from our existing work in this area and in consultation with government and stakeholders.
- Generative AI and children: We are also carrying out work on children’s information and AI, including:
- supporting the Department for Education on data protection matters relating to its generative AI work, including the upcoming Content Store, and responding to its Generative AI in education call for evidence;
- providing advice to stakeholders, including to organisations developing and deploying EdTech tools powered by Generative AI; and
- monitoring harms to children arising from the training and deployment of Generative AI tools.
- Adult services: Our strategy looks at how services that allow children on their platforms use children’s personal information. Platforms that provide adult-only services should focus on age assurance systems to prevent children from accessing their services. 24 We will continue to work alongside Ofcom on the use of age assurance to prevent under 18s from accessing adult services.
In terms of the suggestion that the ICO revise its Best interests framework, the existing framework already emphasises the importance of considering the rights of children. The Best interests of the child standard in the code sets out the need to consider the full range of children’s rights addressed by the UNCRC. Data protection law requires organisations to be accountable for their processing: they are responsible for complying with the law and with any frameworks applicable to them. We have developed a number of materials and practical tools to help organisations comply with the accountability principle, and it remains each organisation’s responsibility to demonstrate this compliance. This includes the requirement to consider the impact of processing on children, including benefits and risks, by completing a data protection impact assessment where necessary.
Regulatory action and transparency
Some respondents raised concerns about limited formal enforcement action since the implementation of the code. They called on us to use the full extent of our enforcement powers to ensure that platforms meet their obligations. 25
There were also calls for further transparency about the organisations we are assessing or investigating. They suggested that publishing information about any changes made by services as a result of our regulatory intervention may increase conformance among other platforms in scope of the code.
The ICO’s response
As set out when we launched our strategy, we have driven significant changes in how services use children’s information, using a range of tools. This included fining TikTok £12.7 million for a number of data protection breaches which included the failure to use children’s personal information lawfully.
Our March 2025 progress update sets out the further improvements we have delivered so far under the strategy and also notes that we currently have three live investigations open.
In terms of the transparency of our actions, we regularly publish updates and reports on the action we have taken to keep the public informed about how we uphold information rights. For children’s privacy, we have published two progress updates since publishing our strategy in April 2024: one in August 2024 and another in March 2025, alongside this document.
To provide further transparency of our work, our March 2025 update included a table summarising key results from our review of a sample of SMPs and VSPs.
We will continue to provide updates on our work as it progresses.
Regulatory alignment
Some respondents suggested that we should increase our alignment with other regulators and maintain a joined-up approach through the Digital Regulation Cooperation Forum (DRCF) and through continued cooperation with Ofcom.
There were also calls for us to be clearer about the different regulatory requirements between the data protection and online safety regimes (eg the ICO’s code and Ofcom’s online safety codes and guidance) to support compliance, particularly where thresholds differ.
The ICO’s response
The ICO and Ofcom have already published two joint statements which set out our vision for regulatory alignment and how we collaborate on the regulation of online services. 26
We have also set out how compliance across the two regimes can be achieved, particularly when regulatory requirements differ. For example, when the ICO published the Commissioner’s Opinion on age assurance to help organisations understand their obligations under the UK GDPR, we signposted organisations to the requirements of the Online Safety Act. 27
The Digital Regulation Cooperation Forum (DRCF) Workplan for 2024/2025 sets out the ICO’s and Ofcom’s continued commitment to take a consistent approach to online safety and data protection. We are currently finalising our plans for the next phase of our joint work which will be published as part of the DRCF workplan for 2025-2026. We will continue to work together to ensure our respective codes of practice and guidance are aligned and provide organisations with the clarity they need to comply with online safety and data protection law.
1 The types of organisations and (number of respondents) were: age assurance vendors (3); academics (2); people acting in their professional or private capacity (3); civil society (7); regulator (1); think tank (1); trade associations or industry bodies (2); and assessment body (1).
2 Protected characteristics (Equality and Human Rights Commission)
3 One example provided was this study, which discusses the neurological harms associated with recommender systems, particularly their ability to foster addictive behaviours and affect cognitive function.
4 One of the examples cited by respondents was this report, which describes the harms that can be experienced by children and young people due to recommender systems.
5 5Rights challenges Meta’s inaction on AI-generated CSAM
6 The submission argued that using a recommender system was similar to using a slot machine in the sense that they offer ‘variable reward schedules’, a learning process made famous by psychologist B.F. Skinner. Users of slot machines do not know when they will ‘hit it big’, so they keep playing. Similarly, users of recommender systems, it is argued, may keep scrolling or refreshing to get to the best content.
7 The submission argued that using recommender systems triggers the release of dopamine, which acts as a reward to the brain, and also triggers a dopamine craving, which encourages children to use them again in the future. Children, it was noted, are particularly susceptible to the release of dopamine.
8 Children face difficulties in regulating their time online and need more support to protect themselves; studies demonstrate that this can cause neurological, psychological and physical harms to them.
9 Profiling refers to any form of automated processing of personal information that analyses aspects of someone’s personality, behaviour, interests and habits to make predictions or decisions about them. See GDPR Article 4(4).
10 This refers to the protected characteristics defined under the Equality Act 2010: Protected characteristics. Respondents generally referred to this term broadly. However, some provided examples suggesting that marginalised groups (for example, by race or gender reassignment) are more likely to have their content removed on Instagram.
11 A respondent pointed to the following document Best Interests of the Child to set out what key elements a best interests of the child assessment may include.
12 Responses to our CfE did not provide examples of how SMPs and VSPs minimise the risk of circumvention of age assurance methods by users under 13 years old.
13 While there wasn’t more information provided in the response, we consider that information like this would enable safeguards or protections to be put in place.
14 Hand geometry age estimation uses statistical models or machine learning algorithms to analyse the size, shape and structure of hand bones and joints, estimating a person’s age from established growth patterns.
15 Self-declaration is where a user states their age but does not provide any evidence to confirm it; it can therefore be circumvented by children, sometimes with the support of their parents (see Ofcom ICO Age assurance report). Under the Online Safety Act 2023, self-declaration of age (without additional steps) is not to be regarded as age assurance.
16 The Commissioner does not consider that self-declaration on its own is an appropriate age assurance method for services that are considered high risk. However, it could be considered alongside other age assurance methods, if the combination is demonstrated to be effective. See the Commissioner’s Opinion for more information.
17 As part of this, we will be looking at the use of profiling for age assurance where users do not need to sign in to an account.
18 Ten respondents were fully supportive, seven were partially supportive, two did not respond and one responded ‘unsure/don’t know’.
19 Children’s code: best interests framework
20 3. Age appropriate application
21 Parental consent is required to process the personal information of children who are under the age of 13 when information society services are relying on consent as their lawful basis.
22 ICO sets out priorities to protect children's privacy online
23 An ‘information society service’ is “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.” This applies to most online services such as apps, search engines, social media platforms, online gaming platforms and content streaming services.
24 Age assurance for the Children’s code
25 One respondent also called for the government to set up an independent review of the ICO, because of concerns about a lack of enforcement and a lack of transparency about the challenges it faces about enforcement.
26 ICO and Ofcom strengthen partnership on online safety and data protection and A joint statement by Ofcom and the Information Commissioner’s Office on collaboration on the regulation of online services.
27 Age assurance for the Children’s code opinion - legislative framework