ICO Focus on Children’s Data Processing 

In February we wrote about the Information Commissioner’s Office (ICO) issuing fines under the UK GDPR to two social media companies. Reddit was fined £14.47 million and MediaLab (owner of Imgur) was fined £247,590 for failing to implement age‑assurance measures and for processing children’s personal data in a way that potentially exposed them to harmful content. 

Safeguarding children’s privacy is a key enforcement priority for the ICO. The ICO’s investigation into TikTok (opened in March 2025) is still ongoing. It is considering how the platform uses the personal data of 13–17-year-olds in the UK to make recommendations to them and deliver suggested content to their feeds. This is in the light of growing concerns about social media and video sharing platforms using data generated by children’s online activity in their recommender systems, which could lead to them being served inappropriate or harmful content. The ICO is also investigating 17 other platforms, including Discord, Pinterest, and X, and has been in discussions with Meta and Snapchat over how they use children’s location data in their user map features. 

Safeguarding children’s privacy is also a duty of the ICO under the Online Safety Act, alongside Ofcom. Last week the ICO published an open letter to social media and video‑sharing platforms operating in the UK, calling on them to strengthen age assurance measures so young children cannot access services that are not designed for them. The letter sets out the ICO’s expectations about measures that platforms with a minimum age must implement, beyond relying on children to self-declare their ages (a check they can easily bypass). Instead, platforms should make use of the viable technology now readily available to enforce their own minimum ages and prevent underage children from accessing their services. The ICO has also written directly to platforms, starting with TikTok, Snapchat, Facebook, Instagram, YouTube and X, to ask them to demonstrate how their age assurance measures meet the ICO’s expectations. 

The Data (Use and Access) Act 2025, most of which came into force earlier this month, explicitly requires those who provide an online service likely to be used by children to take children’s needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.  

This and other developments relating to children’s data will be covered in our forthcoming workshop, Working with Children’s Data.

Children’s Privacy Failures Result in a £14.47m Fine for Reddit

Safeguarding children’s privacy is a key enforcement priority for the Information Commissioner’s Office (ICO). It is also one of its duties under the Online Safety Act, alongside Ofcom. 

In March 2025, the ICO announced three investigations looking into how TikTok, Reddit and Imgur (an image sharing and hosting platform) protect the privacy of their child users in the UK. The investigations into Imgur and Reddit specifically focussed on how the platforms use UK children’s personal data and their use of age assurance measures. 

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states: 

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.” 
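For readers who build consent flows, the Article 8 rule can be expressed as simple gating logic. The sketch below is purely illustrative (all names are hypothetical, and real age assurance and verification of parental responsibility are considerably more involved than a boolean flag):

```python
from dataclasses import dataclass

# UK GDPR Article 8(1): threshold below which a child cannot
# give their own consent to an information society service.
PARENTAL_CONSENT_AGE = 13

@dataclass
class ConsentRequest:
    user_age: int                     # age established via age assurance
    user_consented: bool              # the child's own consent
    parental_consent_verified: bool   # verified per Article 8(2)

def consent_is_valid(req: ConsentRequest) -> bool:
    """Return True only if the consent satisfies the Article 8 rule."""
    if req.user_age >= PARENTAL_CONSENT_AGE:
        # A child aged 13 or over may consent on their own behalf.
        return req.user_consented
    # Under 13: consent must be given or authorised by the holder of
    # parental responsibility, and the controller must make reasonable
    # efforts to verify this (Article 8(2)).
    return req.parental_consent_verified
```

The Reddit and Imgur cases turned precisely on the absence of any mechanism to establish `user_age` in the first place: without age assurance, a controller cannot know which branch of this rule applies.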

Earlier this month MediaLab.AI, Inc. (MediaLab), owner of Imgur, was fined £247,590 for processing children’s personal data in ways that breached the UK GDPR. Imgur’s terms of use stated that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any form of age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform. 

Yesterday the ICO announced that Reddit has now been fined £14.47m under the UK GDPR. The circumstances of the fine are very similar to MediaLab’s. In summary: 

  • Reddit’s terms of service prohibited children under 13 years of age from using its platform, but despite that it did not have measures in place to check the age of users accessing the platform until July 2025. 
  • The ICO’s estimates indicated that there were a large number of children under 13 on the platform, and Reddit did not have a lawful basis for processing their personal data. 
  • Reddit had not completed a Data Protection Impact Assessment focusing on the risks of using children’s personal data before January 2025, even though children between 13 and 18 were allowed to use the platform. 
  • By using the personal data of under-13s without a lawful basis, and without properly considering the risks to children more generally, Reddit put children at risk of exposure to inappropriate and harmful content on its platform. 

We are waiting for the ICO to publish the Monetary Penalty Notices in relation to Reddit and MediaLab. In the case of the latter, the ICO said at the time that it was still considering the redaction of personal and commercially confidential or sensitive information. 

The ICO’s investigation into TikTok is still ongoing. It is considering how the platform uses personal data of 13–17-year-olds in the UK to make recommendations to them and deliver suggested content to their feeds. This is in the light of growing concerns about social media and video sharing platforms using data generated by children’s online activity in their recommender systems, which could lead to them being served inappropriate or harmful content.  

The ICO is also investigating 17 other platforms, including Discord, Pinterest, and X, and has been in discussions with Meta and Snapchat over how they use children’s location data in their user map features. Watch this space! 

The Data (Use and Access) Act 2025, most of which came into force earlier this month, explicitly requires those who provide an online service likely to be used by children to take children’s needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.  

This and other developments relating to children’s data will be covered in our forthcoming workshop, Working with Children’s Data.

Image Hosting Platform Fined for Children’s Privacy Failures

Last week the Information Commissioner’s Office (ICO) issued its first UK GDPR fine of 2026. MediaLab.AI, Inc. (MediaLab), owner of image sharing and hosting platform Imgur, received a Monetary Penalty Notice of £247,590 for processing children’s personal data in ways that breached the UK GDPR.     

Safeguarding children’s privacy is a key enforcement priority for the ICO. In April 2023, it issued a £12.7 million fine to TikTok for a number of breaches of the UK GDPR, including failing to use children’s personal data lawfully. The following year, the ICO launched its Children’s code strategy to look closely at social media platforms and video sharing platforms. In December it published a progress report on the strategy, reporting good progress and announcing a ‘proactive supervision programme’ to drive improvements in the industry. Perhaps this latest fine is part of that programme.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states: 

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.” 

Imgur’s terms of use did state that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any form of age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform. 

In setting the £247,590 penalty amount, the ICO took into consideration the number of children affected by this breach, the degree of potential harm caused, the duration of the contraventions, and the company’s global turnover. It also considered MediaLab’s acceptance of the provisional findings set out in the Notice of Intent issued in September 2025 and its commitment to address the infringements if access to the Imgur platform in the UK is restored in the future. If MediaLab resumes processing the personal data of children in the UK (currently the Imgur site is not available in the UK) without implementing the measures it has committed to, the ICO may take further regulatory action. 

We are waiting for the Monetary Penalty Notice to be published. 
The ICO says it is still considering the redaction of personal and commercially confidential or sensitive information.  

This fine shows that the ICO’s spotlight is firmly on those processing children’s data. The Data (Use and Access) Act 2025, the key provisions of which came into force last Thursday, explicitly requires those who provide an online service likely to be used by children to take children’s needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on developments in GDPR, AI, cyber security and FOI.

This and other developments relating to children’s data will be covered in tomorrow’s online workshop, Working with Children’s Data. The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw.

ICO Reprimand for Children’s Services 

Yesterday, the ICO issued a reprimand to Birmingham Children’s Trust Community Interest Company after the personal information of a child was inappropriately disclosed to another family. 

The child protection and review department at Birmingham Children’s Trust Community Interest Company, which is owned by Birmingham City Council, was working with two neighbouring families when the data breach occurred. A child protection plan was disclosed to one family that contained both personal information and criminal allegations relating to a child from the neighbouring family. This information was included in error after being copied across from meeting minutes. 

The ICO investigation found that Birmingham Children’s Trust Community Interest Company did not have appropriate policies or sufficient practical guidance in place to ensure the security of personal information. This is a breach of Articles 5(1)(f), 32(1)(b) and 32(2) of the UK GDPR. 

The ICO has recommended that Birmingham Children’s Trust Community Interest Company should take further steps to ensure its compliance with data protection law, including: 

  • Implement a more granular approach to data protection and create a Standard Operating Procedure with regards to producing social care documents. 
  • Include a process for any social care product to be independently checked by someone other than the author prior to disclosure. 
  • Create and implement a corporate redaction policy, which ensures staff have the knowledge and tools to redact the product if necessary. 

Our GDPR Essentials e-learning course is ideal for organisations that wish to upskill their employees in data protection and data security. 

The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security issues. A number of governments have now taken a view that the video sharing platform represents an unacceptable risk that enables Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures. 

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from its platform. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8 the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used, and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is noticeable that this fine is less than half the amount (£27 million) set out in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently, this potential infringement was not included in the final amount of the fine.

We have been here before! In 2019 British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine, issued in 2020, was £20 million. Marriott International Inc was fined £18.4 million in 2020; much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced (from the notice to the final amount) suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to think about whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently an appeal against the ICO’s fine of £1.35 million issued to Easylife Ltd was withdrawn, after the parties reached an agreement whereby the amount of the fine was reduced to £250,000.

The Children’s Code

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed by Ireland’s Data Protection Commission on Meta), it is likely that more ICO regulatory action will follow. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off all day courses and special discounts on GDPR certificates