Annex: Table of observations from our review of a sample of social media and video sharing platforms

In August 2024, we published the high-level findings from our review of 34 social media platforms (SMPs) and video sharing platforms (VSPs). The review took place between March and May 2024, with follow-up testing on platforms undertaken as necessary.

To provide greater transparency of this work, we have identified key metrics that we could publish across the platforms reviewed. This annex explains the metrics and sets out the observations by platform in a table.

Our review

As noted in August, the SMPs and VSPs included in our review were selected based on: 

  • UK app download figures;
  • Ofcom research findings on the platforms that children use; and
  • whether their terms of service allow under 18s to view content and set up accounts.   

The platforms ranged in size and included both well-established platforms and newer services.

We created new user accounts using proxies for children of different ages for each platform, to replicate the sign-up processes that children would follow. We viewed default settings and options available to child users, as well as any privacy information provided. We did not interact with other users.

Metrics included in the comparison table

The metrics have been informed by our review of each platform’s account set-up process.1 They therefore cover ‘signed-in’ accounts on those platforms.2

As the code is not prescriptive, the observations provided for these metrics do not mean that these platforms necessarily conform with the code. 

  • Use of age assurance during account set-up (metrics a and b)

    During our review, we observed whether, and if so how, platforms checked the age of their users during account set-up. In particular, we observed whether services asked users to provide their age when setting up an account (metric a). We also observed whether they used other techniques to verify or estimate a user’s age during account set-up (metric b), for example by requesting official ID or by carrying out facial age estimation.

    The nature of our review meant that we did not capture any additional measures platforms may employ to detect underage users after account set-up. However, we are aware that some platforms have other age assurance methods in place post-sign-up (including machine learning), and these techniques have been a feature of our ongoing discussions with platforms about both signed-in and signed-out accounts.
  • Profiling children for advertising (metrics c and d)

    We looked at whether platforms showed adverts to children (metric c) and, if so, what information they used for advertising purposes. In particular, we looked at whether platforms used very limited data points such as age and high-level location data (eg the country) to help ensure that advertising is age-appropriate and jurisdiction specific, or whether they used additional information to target adverts shown to children (metric d).
  • Default privacy settings (metric e)

    We looked at whether children’s accounts include settings which are ‘high privacy’ by default. This included looking at whether the profiles themselves were private by default when children set up an account. We have also considered other ways that platforms may look to deliver high privacy by default, including:
    • enabling children to have public profiles that do not contain personal information;
    • providing settings to control the visibility and searchability of children’s content and associated profile information;
    • providing settings or safeguards to prevent messages from strangers; and
    • limiting the visibility of children’s personal information in profiles to ‘friends only’.
  • Default geolocation settings (metrics f and g)

    We observed whether the platforms publicly shared children’s geolocation information with other users by default (metric f). Where this was not the case, we also noted whether users could opt in to share their geolocation with other users or whether the service prevented them from doing so (metric g). Sometimes users choose to share their location voluntarily after account set-up. This is excluded from our findings below. 

While we originally included 34 platforms in our review, the table provides observations for 29 platforms. We have excluded six platforms from this table (though two of these platforms, Flickr and WeChat, remain under review)3 and we have added one new application, Wizz.

We shared our observations from the review with the platforms for factual confirmation prior to publication. Three platforms (BeFriend, Clubhouse and Frog) did not respond to confirm the information shared with them. 

We have updated our initial observations to reflect changes that platforms have made since we carried out our review. This includes any response to the letters we wrote to 11 organisations in August 2024 and our broader programme of regulatory scrutiny. We note where platforms have committed to make changes, but have not yet introduced them. We will continue to monitor these platforms to ensure that they implement the planned changes as expected.

We also note where we continue to scrutinise the approach platforms take in specific areas through compliance discussions and written enquiries.  

Our observations

The following table provides information for the metrics we reviewed by platform. The letters in brackets in the footnotes refer to the corresponding columns in this table.

The observations in the table do not mean that these platforms necessarily do or do not conform with the code. This is because the code is not prescriptive and platforms need to take a proportionate and risk-based approach, so that children have the best possible access to online services while minimising data collection and use by default.

Please note the table will not reflect any changes platforms have implemented subsequent to our review or confirmation process. 

The table uses the following columns:

  • Service
  • a) Does the platform ask users to state their age when setting up an account (self-declaration)?
  • b) Does the platform use any techniques to estimate or verify the age of the user during account set-up?
  • c) Are adverts shown to children?
  • d) Is targeted advertising based on minimal information categories (age/location)?
  • e) Does the platform ensure that children's profiles are high privacy by default?
  • f) Is a child's geolocation hidden from other users by default?
  • g) Can children opt in to share their geolocation with other users?
  • Additional information (the number of the relevant table footnote, where applicable)
Service | a | b | c | d | e | f | g | Additional information
BeFriend | Yes | No | Yes | N/A | No | Yes | Yes | 1
BeReal | Yes | No | Yes | N/A | Yes | Yes | Yes | 2
Clubhouse | Yes | No | No | N/A | Yes | Yes | No | N/A
Dailymotion | Yes | No | Yes | Yes | Yes | Yes | No | 3
Discord | Yes | No | Yes | No | No | Yes | No | 4
Facebook | Yes | No | Yes | Yes | Yes | Yes | Yes | N/A
Frog | Yes | No | Unknown | Unknown | No | Yes | Yes | 5
Hoop | Yes | No | No | N/A | No | Yes | No | 6
Imgur | No | No | Yes | Yes | No | Yes | No | 7
Instagram | Yes | No | Yes | Yes | Yes | Yes | Yes | 8
Odysee | Yes | No | No | N/A | Yes | Yes | No | N/A
Pinterest | Yes | No | No | N/A | Yes | Yes | No | N/A
Reddit | No | No | Yes | Unclear | No | Yes | No | 9
Sendit | Yes | No | No | N/A | Yes | Yes | Yes | 10
Snapchat | Yes | No | Yes | Yes | Yes | Yes | Yes | N/A
Soda | Yes | Yes | Yes | Unclear | Unclear | Yes | No | 11
Threads | Yes | No | No | N/A | Yes | Yes | Yes | N/A
TikTok | Yes | No | Yes | Yes | Yes | Yes | No | N/A
Triller | Yes | No | No | N/A | No | Yes | No | 12
Twitch | Yes | No | Yes | N/A | No | Yes | No | 13
Vero | No | No | No | N/A | No | Yes | Yes | 14
Viber | Yes | No | Yes | N/A | Yes | Yes | Yes | 15
Vimeo | No | No | No | N/A | Yes | Yes | No | 16
WeAre8 | Yes | No | Yes | Yes | No | Yes | No | 17
Wizz App | Yes | Yes | Yes | Unknown | Yes | Yes | Yes | 18
X | Yes | No | No | N/A | Yes | Yes | No | 19
YouTube | Yes | No | Yes | Yes | Yes | Yes | No | 20
YouTube Kids | Yes | Yes | Yes | Yes | N/A | Yes | No | 21
Yubo | Yes | Yes | Yes | N/A | Yes | Yes | Yes | 22

Table footnotes

1 (d) (e) We have sent written enquiries to BeFriend about their advertising practices and default privacy settings.

2 (f) We wrote to BeReal in August 2024 about their approach to geolocation settings. In response to our letter, they have stopped processing children’s precise geolocation data. This means that children only have the option of sharing approximate city-level location when posting.

3 (e) We wrote to Dailymotion in August 2024 about their default privacy settings. Dailymotion set out that children's profiles are private until the point at which they choose to upload videos. When their profiles are public, they include limited personal information. In response to our letter, Dailymotion has implemented new privacy and transparency measures, including providing additional notifications to children before they upload videos to remind them not to disclose personal information. Dailymotion also added warnings to remind children to be cautious when drafting video descriptions so that they do not share personal information. Dailymotion has also improved their guidance for children.

4 (b) Discord is currently reviewing their approach to age assurance in light of relevant legislation, including the Online Safety Act (OSA).
(d) Targeted advertising on Discord uses some behavioural data; however, it is off by default for all users and provided on an opt-in basis.
(e) We wrote to Discord in August 2024 about their default privacy settings. Discord has outlined a range of measures they use to support high privacy settings, including teen safety alerts and sensitive content filtering, which they consider operate in the best interests of the child. We continue to have compliance discussions about their approach to ensure high privacy by default.

5 (e) (f) We wrote to Frog about their default privacy and geolocation settings in August 2024. As Frog did not respond, we sent an information notice, compelling them to provide the information requested. We have been assessing their evidence, and our compliance assessment is ongoing.

6 (e) Hoop has stated that the profiles of under-18 users are only visible to other under-18s. Hoop has committed to ensure that, by default, children's profiles are private by the end of Q1 2025.

7 (a) We wrote to Imgur about the processing of personal information of children under 13 years old. Imgur did not respond to our letter, but did respond to a subsequent information notice. We have since opened an investigation into how Imgur processes the personal information of children in the UK and their use of age assurance.

8 (e) Instagram has introduced Teen Accounts, and users aged 13-15 need parental approval to amend default privacy settings.

9 (a) We have an ongoing investigation into how Reddit processes the personal information of children in the UK and their use of age assurance. Reddit has indicated that they are currently reviewing their approach to age assurance and plan to implement self-declaration in Q3.
(d) Reddit has indicated that some limited behavioural advertising is applied on the platform. Reddit has also indicated that they plan to review their approach to targeted advertising.
(e) While profiles are not private by default, Reddit has outlined that, upon reviewing their age assurance measures, they will also review their default settings for under-18 users.

10 (e) We wrote to Sendit about their default privacy settings in August 2024. They have set out the measures they have in place to protect users’ privacy. On the basis that Sendit delivers high privacy by default, we do not propose to take any further action, but we would revisit the position if relevant changes were made.
(f), (g) We also wrote to Sendit about their geolocation settings in August 2024. In response, they have stopped automatically populating users’ profiles with location information, providing further protections for children. They have also introduced new in-app location settings to make it easier for users to enable or disable location services.

11 (d) (e) We are following up with Soda on their approach to targeted advertising and default privacy settings.
(f) We wrote to Soda about their geolocation settings in August 2024. In response to our letter, Soda has removed country location information that was previously included in children's profiles. Users can no longer view this information on the platform. 

12 (c), (d) and (e) Compliance discussions with Triller about targeted advertising and default privacy settings are ongoing.

13 (d) Advertising is provided on a contextual basis, rather than through the use of profiling or targeting by other means. 
(e) We wrote to Twitch about their default privacy settings in August 2024. In response to our letter, Twitch has committed to change the default settings for teen users in the first half of 2025, so that, by default, users cannot make or share clips for teen streamers in the UK. Our compliance discussions with Twitch are ongoing.

14 (a) We wrote to Vero in August 2024. In response, Vero has committed to introducing age assurance measures by the end of June 2025.
(e) Vero has also committed to introducing new protections for children between 13-17 years of age by the end of June 2025. This includes a ‘safe mode’ which locks certain profile settings to the most privacy-friendly option by default, with a limit to what the user can change. We have sent Vero further written enquiries about their approach to default privacy settings.

15 (d) Following our intervention, Viber committed to turning off personalised advertising for 17-year-olds (previously this was only off by default for children under 16 years old).
(e) In addition, Viber has committed to extending privacy protections to all under-18 users, ensuring that only known contacts can add them to groups, by the end of Q1 2025 (this protection is currently in place for children aged 16 and under).

16 (a) We are following up with Vimeo on their approach to age assurance and applying the standards of our code. 
(e) Vimeo informed us of planned changes to their UK platform ensuring all new accounts after November 2024 will be private by default. These changes include ensuring that all users can only view videos that were uploaded by another user if they are provided access via a direct link to the video.

17 (d) WeAre8 has confirmed that, by May 2025, advertising to children aged 13-17 will only use minimal information categories (age, location, gender).
(e) WeAre8 has confirmed that they are making changes to ensure that children’s profiles are private by default by end of Q2 2025. These include ensuring that all children’s posts are private by default and that all other settings, including profile visibility settings, are set to the most private option when signing up.

18 (d) We are following up with Wizz App to understand the approach they take to targeted advertising. 

19 (c) Following our intervention, X stopped serving ads to under 18s.
(g) In addition, X has removed the ability for under 18s to opt in to geolocation sharing. However, at the time of publication, this change does not apply to ~1209 existing users who previously opted in to this feature.

20 These observations relate to a proxy account for a child over 13 years old using the “For my personal use” option during account set-up. They do not represent the results that would appear for a parent or carer setting up an account for a child under 13 years old. 

21 (b) YouTube Kids applies an additional layer of parental age verification.
(e) YouTube Kids account holders cannot create or upload content, and they cannot make any of their information visible to other users.

22 (d) Advertising is provided on a contextual basis, rather than through profiling or targeting by other means, and is limited to users aged 16+.
(g) Geolocation is completely disabled for users under 15 years old.

1 It was not possible to prepare simple metrics on recommender systems across the platforms.

2 As noted earlier, we also continue to consider platforms where users do not need to log into an account and which rely on consent as their lawful basis for processing at least some of their users’ personal information.

3 Whisper and Fruitlab have exited the UK market. We were not able to set up accounts with Flickr during our testing period, but we have recently retested this platform and are reviewing our observations and assessing whether follow-up is needed. We were unable to review WeChat, as account set-up requires a referral from an active user with an established account, as well as access to a Chinese payment account, but we have sent this platform written enquiries. BitChute and Hive Social were removed because they no longer fall within the scope of our review following changes to their terms of service.