Safeguarding children’s privacy is a key enforcement priority for the Information Commissioner’s Office (ICO). It is also one of the ICO’s duties under the Online Safety Act, alongside Ofcom.
In March 2025, the ICO announced three investigations looking into how TikTok, Reddit and Imgur (an image sharing and hosting platform) protect the privacy of their child users in the UK. The investigations into Imgur and Reddit focused specifically on the platforms’ use of UK children’s personal data and their age assurance measures.
Article 8(1) of the UK GDPR sets out the general rule that when a Data Controller is offering “information society services” (e.g. social media apps and gaming sites) directly to a child, and is relying on consent as its lawful basis for processing, only a child aged 13 or over can give their own consent. For a child under 13, the Data Controller must obtain consent from whoever holds parental responsibility. Article 8(2) further states:
“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”
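In practice, a platform relying on consent has to apply Article 8 as a decision rule at sign-up: a self-declared (or otherwise assured) age of 13 or over means the child can consent; below 13, verified parental consent is required, and without it there is no lawful basis for processing on consent grounds. A minimal illustrative sketch of that rule follows; the function and return values are hypothetical, not any platform’s actual implementation, and real systems would also need the age assurance and verification steps Article 8(2) contemplates.

```python
# Illustrative sketch of the UK GDPR Article 8 consent rule.
# All names here are hypothetical; this is not a real compliance API.

UK_DIGITAL_AGE_OF_CONSENT = 13  # Article 8(1) as modified for the UK


def consent_basis(age: int, has_verified_parental_consent: bool = False) -> str:
    """Decide whose consent supports processing when a Data Controller
    offers an information society service directly to a child and relies
    on consent as its lawful basis."""
    if age >= UK_DIGITAL_AGE_OF_CONSENT:
        # Article 8(1): a child of 13 or over may consent for themselves.
        return "child may consent"
    if has_verified_parental_consent:
        # Article 8(2): the controller must make reasonable efforts to
        # verify that the holder of parental responsibility consented.
        return "parental consent obtained"
    # Under 13 with no parental consent: consent cannot be the lawful basis.
    return "no lawful basis"
```

For example, `consent_basis(12)` returns `"no lawful basis"`, which is in substance the position the ICO found Imgur and Reddit to be in for their under-13 users.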
Earlier this month, MediaLab.AI, Inc. (MediaLab), the owner of Imgur, was fined £247,590 for processing children’s personal data in ways that breached the UK GDPR. Imgur’s terms of use stated that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any age assurance measures to determine the age of Imgur users, and had no measures in place to obtain parental consent where children under 13 used the platform.
Yesterday the ICO announced that Reddit has been fined £14.47m under the UK GDPR. The circumstances of the fine are very similar to MediaLab’s. In summary:
- Reddit’s terms of service prohibited children under 13 from using its platform, but despite that it had no measures in place to check the age of users accessing the platform until July 2025.
- The ICO estimated that there were a large number of children under 13 on the platform, and Reddit did not have a lawful basis for processing their personal data.
- Before January 2025, Reddit had not completed a Data Protection Impact Assessment focusing on the risks of using children’s personal data, even though children aged 13 to 18 were allowed to use the platform.
- By using under-13s’ personal data without a lawful basis, and without properly considering the risks to children more generally, Reddit put children at risk of exposure to inappropriate and harmful content on its platform.
We are waiting for the ICO to publish the Monetary Penalty Notices in relation to Reddit and MediaLab. In the case of the latter, the ICO said at the time that it was still considering the redaction of personal and commercially confidential or sensitive information.
The ICO’s investigation into TikTok is still ongoing. It is considering how the platform uses personal data of 13–17-year-olds in the UK to make recommendations to them and deliver suggested content to their feeds. This is in the light of growing concerns about social media and video sharing platforms using data generated by children’s online activity in their recommender systems, which could lead to them being served inappropriate or harmful content.
The ICO is also investigating 17 other platforms, including Discord, Pinterest, and X, and has been in discussions with Meta and Snapchat over how they use children’s location data in their user map features. Watch this space!
The Data (Use and Access) Act 2025, most of which came into force earlier this month, explicitly requires those who provide an online service that is likely to be used by children to take children’s needs into account when deciding how to use their personal data.
Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.
This and other developments relating to children’s data will be covered in our forthcoming workshop, Working with Children’s Data.

