ICO Focus on Children’s Data Processing 

In February we wrote about the Information Commissioner’s Office (ICO) issuing fines under the UK GDPR to two social media companies. Reddit was fined £14.47 million and MediaLab (owner of Imgur) was fined £247,590 for failing to implement age‑assurance measures and for processing children’s personal data in a way that potentially exposed them to harmful content. 

Safeguarding children’s privacy is a key enforcement priority for the ICO. The ICO’s investigation into TikTok (opened in March 2025) is still ongoing. It is considering how the platform uses personal data of 13-17 year-olds in the UK to make recommendations to them and deliver suggested content to their feeds. This is in the light of growing concerns about social media and video sharing platforms using data generated by children’s online activity in their recommender systems, which could lead to them being served inappropriate or harmful content. The ICO is also investigating 17 other platforms including Discord, Pinterest, and X, and has been in discussions with Meta and Snapchat over how they use children’s location data in their user map features.  

Safeguarding children’s privacy is also a duty of the ICO under the Online Safety Act, alongside Ofcom. Last week the ICO published an open letter to social media and video‑sharing platforms operating in the UK, calling on them to strengthen age assurance measures so young children cannot access services that are not designed for them. The letter sets out the ICO’s expectations of the measures that platforms with a minimum age must implement, beyond relying on children to self-declare their ages (a check they can easily bypass). Instead, platforms should make use of the viable technology now readily available to enforce their own minimum ages and prevent underage children from accessing their services. The ICO has also written directly to platforms, starting with TikTok, Snapchat, Facebook, Instagram, YouTube and X, asking them to demonstrate how their age assurance measures meet the ICO’s expectations. 

The Data (Use and Access) Act 2025, most of which came into force earlier this month, explicitly requires those who provide an online service that is likely to be used by children to take their needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.  

This and other developments relating to children’s data will be covered in our forthcoming workshop, Working with Children’s Data.

Police Scotland Fined for Mishandling Alleged Victim’s Mobile Phone Data 

The Information Commissioner’s Office (ICO) has fined Police Scotland £66,000 and issued a Reprimand for serious failures in the handling of sensitive personal data. 

Detective Constable Lianne Gilbert, who has now waived her right to anonymity, made domestic abuse allegations, including serious sexual assault, against another officer in 2020. However, when a misconduct inquiry took place two years later, it emerged that data extracted from Ms Gilbert’s phone had been given to the accused officer, his lawyer and his Scottish Police Federation (SPF) representative. The extracted data ran to 40,000 pages, including 80,000 images, medical records and contact details of Ms Gilbert’s friends and family. Some of the images were of an intimate nature. 

Ms Gilbert has given her account to BBC Scotland News. She said: 

“It’s been absolutely horrific and very, very traumatic.” 

“At the time it happened I had a five-month-old baby. It’s really impacted my motherhood journey. At times I still feel quite numb.” 

It is important to note that the officer in question has not been charged with any offences against Ms Gilbert and the case remains live. 

UK GDPR Breaches 

The ICO investigation concluded that:  

a) Police Scotland failed to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk associated with the processing of personal data by the PSD for the purposes of compiling misconduct packs for disclosure as part of its investigations (Article 32(1) UK GDPR); 

b) These deficiencies put the personal data processed by the PSD at risk of unauthorised disclosure, in breach of the requirement to ensure appropriate security of personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage (Article 5(1)(f) UK GDPR); 

c) Police Scotland failed, at the time of the determination of the means of processing and at the time of the processing itself, to implement appropriate technical and organisational measures designed to implement data protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the UK GDPR and protect the rights of data subjects (Article 25(1)-(2) UK GDPR); 

d) Police Scotland failed to ensure that the personal data processed by the PSD when compiling misconduct packs for disclosure was adequate, relevant and limited to what was necessary in relation to the purposes for which it was processing that data (Article 5(1)(c) UK GDPR); and 

e) Police Scotland failed to inform the Commissioner of the personal data breach within 72 hours of becoming aware of the same (Article 33(1) UK GDPR). 

In assessing the fine amount, the ICO considered the seriousness of the incident, the sensitivity of the data involved and the impact on the affected person. It initially concluded that a £132,000 fine would be effective, proportionate and dissuasive. However, applying its controversial public sector approach to enforcement, it decided to reduce the amount by 50%. 

The Monetary Notice states that Police Scotland paid a sum of money (amount redacted) as compensation to Ms Gilbert. This may have been in anticipation of a civil claim by Ms Gilbert. Article 82 UK GDPR gives a data subject a right to compensation for material or non-material damage for any breach of the UK GDPR. Section 168 of the DPA 2018 confirms that “non-material damage” includes distress. There may be more claims to come; no doubt amongst the data extracted (and shared) from Ms Gilbert’s phone there will have been personal data related to third parties. 

Part 3 DPA Reprimand 

The related reprimand was issued under Part 3 of the Data Protection Act 2018 (law enforcement processing). Police Scotland is a competent authority under Part 3 and was, according to the ICO, processing Ms Gilbert’s data for law enforcement purposes when it extracted the data. The ICO found that Police Scotland had infringed sections 35 and 37 of the DPA by failing to ensure that: 

a) The bulk download of personal data on the mobile phone of the Data Subject was lawful and fair (section 35 DPA); and 

b) The personal data processed from the mobile phone download was adequate, relevant and not excessive in relation to the purposes for which it was processed (section 37 DPA). 

The ICO initially considered that a fine would be appropriate for these DPA breaches, and considered notifying Police Scotland of its intention to impose a fine of £78,750. However, once again, due to the revised approach to public sector enforcement it decided a reprimand was more appropriate. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.   

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop and our Law Enforcement Data Processing workshop.

Is Data Still ‘Personal’ if the Recipient Cannot Identify the Data Subject? 

Data protection practitioners know that the first question to ask when considering their organisation’s data protection obligations in relation to any data is: “Is it personal data?” 

The Court of Appeal recently handed down a decision which gives useful judicial guidance on the definition of ‘personal data’ under UK data protection law and the responsibility on organisations to keep personal data secure.    

DSG Retail Limited v The Information Commissioner [2026] EWCA Civ 140 concerns events from 2017 and 2018, when the old Data Protection Act 1998 (DPA 1998) was in force. As such, the judgment is persuasive rather than binding on UK courts deciding issues under the current law, namely the UK GDPR and the Data Protection Act 2018. 

The background to the case is that, in 2017, DSG Retail Limited (the parent company of Dixons and Currys PC World) (DSG) suffered a cyberattack targeting point-of-sale systems in all its shops. Over a nine-month period, attackers deployed malware to scrape transaction-level card data and attempted to exfiltrate the captured information. More than 5.6 million payment cards were affected, though for the majority the attackers obtained only the 16-digit payment card numbers and expiry dates (together referred to as ‘EMV data’). Crucially, the attackers did not obtain any information that could directly identify the cardholders. 

In 2020, the ICO fined DSG £500,000 for breach of the data security principle. This was the maximum fine under the DPA 1998. There then followed a series of appeals. The First Tier Tribunal (FTT) upheld the ICO’s findings but reduced the fine by half. 

The Upper Tribunal (UT), in setting aside the FTT’s decision, held that the data security principle under the DPA 1998 applies only to ‘personal data’, i.e. information about living, identifiable individuals. The data in question, EMV data, did not constitute ‘personal data’ from the attackers’ perspective because the attackers could not link it to specific individuals. As a result, the UT held that DSG did not have any security obligations with respect to such data. 

Following an appeal by the ICO, the Court of Appeal (CoA) has now overturned the UT’s ruling. The CoA held that the Data Controller (in this case DSG) is required to comply with the data security principle under the DPA 1998 with respect to data that is ‘personal’ from the perspective of the Data Controller, regardless of whether the data might not be personal ‘in the hands of’ or ‘from the perspective’ of any other person. 

The CoA considered it implausible that (absent an explicit statement) Parliament intended to limit the scope of the data security duty so that a Data Controller would have no obligation to protect some parts of the data provided by the Data Subject. The CoA also noted the potential consequences of a contrary reading: there would be no obligation for the Data Controller to protect data where a third party would be unable to identify the Data Subject from that data. In the Court’s view, third-party interference with data, even where the attacker is unable to identify the Data Subjects, can still be harmful. Moreover, the Court found it impractical to put Data Controllers in a position where, in determining their data security obligations, they would need to assess whether attackers could re-identify individuals via ‘jigsaw’ techniques. 

The case will now return to the FTT to apply the Court of Appeal’s interpretation of the law to the facts of the DSG cyberattack. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.  

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop.

Children’s Privacy Failures Result in a £14.47m Fine for Reddit

Safeguarding children’s privacy is a key enforcement priority for the Information Commissioner’s Office (ICO). It is also one of its duties under the Online Safety Act, alongside Ofcom. 

In March 2025, the ICO announced three investigations looking into how TikTok, Reddit and Imgur (an image sharing and hosting platform) protect the privacy of their child users in the UK. The investigations into Imgur and Reddit specifically focussed on how the platforms use UK children’s personal data and their use of age assurance measures. 

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states: 

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.” 

Earlier this month MediaLab.AI, Inc. (MediaLab), owner of Imgur, was fined £247,590 for processing children’s personal data in ways that breached the UK GDPR. Imgur’s terms of use stated that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any form of age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform. 

Yesterday the ICO announced that Reddit has now been fined £14.47m under the UK GDPR. The circumstances of the fine are very similar to MediaLab’s. In summary: 

  • Reddit’s terms of service prohibited children under 13 years of age from using its platform, but despite that it did not have measures in place to check the age of users accessing its platform until July 2025. 
  • The ICO’s estimates indicated that there were a large number of children under 13 on the platform and Reddit did not have a lawful basis for processing their personal data. 
  • Reddit had not completed a Data Protection Impact Assessment focusing on the risks of using children’s personal data before January 2025, even though children between 13 and 18 were allowed to use the platform. 
  • By using under-13-year-olds’ personal data without a lawful basis and without having properly considered the risks to children more generally, Reddit put children at risk of exposure to inappropriate and harmful content on its platform. 

We are waiting for the ICO to publish the Monetary Penalty Notices in relation to Reddit and MediaLab. In the case of the latter, the ICO said at the time that it was still considering the redaction of personal and commercially confidential or sensitive information. 

The ICO’s investigation into TikTok is still ongoing. It is considering how the platform uses personal data of 13–17-year-olds in the UK to make recommendations to them and deliver suggested content to their feeds. This is in the light of growing concerns about social media and video sharing platforms using data generated by children’s online activity in their recommender systems, which could lead to them being served inappropriate or harmful content.  

The ICO is also investigating 17 other platforms, including Discord, Pinterest, and X, and has been in discussions with Meta and Snapchat over how they use children’s location data in their user map features. Watch this space! 

The Data (Use and Access) Act 2025, most of which came into force earlier this month, explicitly requires those who provide an online service that is likely to be used by children to take their needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.  

This and other developments relating to children’s data will be covered in our forthcoming workshop, Working with Children’s Data.

Data Protection Complaints Procedure: New ICO Guidance 

The main changes to the UK data protection regime made by the Data (Use and Access) Act 2025 (DUA Act) came into force on Thursday 5th February 2026. One key provision, though, is due to commence on 19th June 2026: the requirement for Data Controllers to have a complaints procedure to handle data protection complaints. 

A new section 164A, inserted into the Data Protection Act 2018, requires Data Controllers to: 

  • give Data Subjects a way of making data protection complaints; 
  • acknowledge receipt of complaints within 30 days of receiving them; 
  • without undue delay, take appropriate steps to respond to complaints, including making appropriate enquiries, and keep Data Subjects informed; and 
  • without undue delay, tell Data Subjects the outcome of their complaints. 

Following a consultation, which closed in October last year, the ICO has published its guidance explaining the new requirements and informing Data Controllers of what they must, should and could do to comply.  

Data protection expert, and guest on the first Guardians of Data podcast, Jon Baines writes on his personal blog that in declining to suggest how long controllers should normally take to respond to data subject complaints, the ICO has missed an opportunity to provide regulatory clarity.  

Listen to the Guardians of Data Podcast for the latest news and views on developments in GDPR, AI, cyber security and FOI.

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop.  

The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact.

Children’s Image Hosting Platform Fined for Privacy Failures

Last week the Information Commissioner’s Office (ICO) issued its first UK GDPR fine of 2026. MediaLab.AI, Inc. (MediaLab), owner of image sharing and hosting platform Imgur, received a Monetary Penalty Notice of £247,590 for processing children’s personal data in ways that breached the UK GDPR.     

Safeguarding children’s privacy is a key enforcement priority for the ICO. In April 2023, it issued a £12.7 million fine to TikTok for a number of breaches of the UK GDPR, including failing to use children’s personal data lawfully. The following year, the ICO launched its Children’s code strategy to look closely at social media platforms and video sharing platforms. In December it published a progress report on the strategy, reporting good progress and including a ‘proactive supervision programme’ to drive improvements in the industry. Perhaps this latest fine is part of that ‘proactive supervision programme’.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states: 

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.” 

Imgur’s terms of use did state that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any form of age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform. 

In setting the £247,590 penalty amount, the ICO took into consideration the number of children affected by this breach, the degree of potential harm caused, the duration of the contraventions, and the company’s global turnover. It also considered MediaLab’s acceptance of the provisional findings set out in the Notice of Intent issued in September 2025 and its commitment to address the infringements if access to the Imgur platform in the UK is restored in the future. If MediaLab resumes processing the personal data of children in the UK (currently the Imgur site is not available in the UK) without implementing the measures it has committed to, the ICO may take further regulatory action. 

We are waiting for the Monetary Penalty Notice to be published. The ICO says it is still considering the redaction of personal and commercially confidential or sensitive information. 

This fine shows that the ICO’s spotlight is firmly on those processing children’s data. The Data (Use and Access) Act 2025, the key provisions of which came into force last Thursday, explicitly requires those who provide an online service that is likely to be used by children to take their needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on developments in GDPR, AI, cyber security and FOI.

This and other developments relating to children’s data will be covered in tomorrow’s online workshop, Working with Children’s Data. The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw.

Data (Use and Access) Act: Key Data Provisions In Force on Thursday

The Data (Use and Access) Act 2025 (Commencement No. 6 and Transitional and Saving Provisions) Regulations 2026 were made on 29th January 2026. They bring into force most of the amendments to the UK GDPR, PECR and the DPA 2018 made by The Data (Use and Access) Act 2025 (DUA Act). 

The amendments coming into force on Thursday (5th February 2026), amongst others, cover: 

  • New ‘Recognised legitimate interests’  
  • When time starts for dealing with subject access requests 
  • Automated Decision Making
  • Information to be provided to data subjects 
  • Safeguards for processing for research etc purposes 
  • International Data Transfers 
  • PECR and marketing 

You can read a summary of the amendments here.

DUA Act Workshop in Birmingham (Thursday 5th February 2026)

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop which is running online and in Birmingham.

Revised GDPR Handbook   

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.    

Password Manager Provider Fined £1.2m for GDPR Data Breach 

On 20th November 2025, the Information Commissioner’s Office (ICO) fined password manager provider LastPass UK Ltd £1.2 million following a 2022 data breach that compromised the personal data of up to 1.6 million UK users. 

Two security incidents occurred in August 2022: a hacker first gained access to a corporate laptop of an employee based in Europe, and then to a US-based employee’s personal laptop, on which the hacker implanted malware and captured the employee’s master password. The combined details from both incidents enabled the hacker to access LastPass’s backup database and take personal data which included customer names, emails, phone numbers, and stored website URLs. 

For a good analysis of what went wrong at LastPass and how to avoid such incidents, please read this blog. This is the seventh GDPR fine issued by the ICO in 2025; all have been in relation to cyber security incidents. In October, professional and outsourcing services company Capita received a £14 million fine following a cyber-attack which saw hackers gain access to 6.6 million people’s personal data, from pension and staff records to the details of customers of organisations Capita supports. In March an NHS IT supplier was fined £3 million, in April a £60,000 fine was issued to a law firm, and in June 23andMe, a US genetic testing company, was fined £2.31 million. 

The ICO has urged organisations to ensure internal security policies explicitly consider and address data breach risks. Where risks are identified, access should be restricted to specific user groups. The ICO website is a rich source of information detailing ways to improve practices, including Working from home – security checklist for employers, Data security guidance and Device security guidance.

Cyber Security Training 

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about cyber security. See also our Managing Personal Data Breaches Workshop. 

Revised GDPR Handbook   

The data protection landscape continues to evolve. With the passing of the Data (Use and Access) Act 2025, data protection practitioners need to ensure their materials reflect the latest changes to the UK GDPR, Data Protection Act 2018, and PECR.   

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.    

DUA Act Workshop in Birmingham 

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop which is running online and in Birmingham on 5th February 2026. 

Post Office Reprimand Following Horizon Data Breach 

You would think that the Post Office had learnt its lessons from the Horizon IT Scandal, and that it would have taken extra care to ensure that the victims of the UK’s most widespread miscarriage of justice were not further harmed by its actions in dealing with the aftermath. Not so, judging by the Information Commissioner’s Office (ICO) announcement on Tuesday. 

The ICO has issued a reprimand to Post Office Limited following an ‘entirely preventable’ data breach which resulted in the unauthorised disclosure of personal data belonging to hundreds of postmasters who were the victims of the Horizon IT scandal.  The breach occurred when the Post Office’s communications team mistakenly published an unredacted version of a legal settlement document on its corporate website. The document contained the names, home addresses and postmaster status of 502 people who were part of group litigation against the organisation. The document remained publicly accessible for almost two months in 2024, before being removed following notification from an external law firm. 

During its investigation, the ICO found that the Post Office failed to implement appropriate technical and organisational measures to protect people’s personal data. There was a lack of documented policies or quality assurance processes for publishing documents on the Post Office website, as well as insufficient staff training, with no specific guidance on information sensitivity or publishing practices.  

In the ‘good old days’ such a data breach would have attracted a substantial fine, especially considering the impact on the victims described by their lawyers (‘the shock and anxiety of this incident cannot help but compound all of the adverse harms suffered by our clients as a result of the wider Horizon scandal’). Remember when the ICO fined the Cabinet Office £500,000 for disclosing the postal addresses of the 2020 New Year Honours recipients online? 

But we are in a new age of GDPR ‘enforcement’! The ICO says it had initially considered imposing a fine of up to £1.094 million on Post Office Limited. However, it did not consider that the data protection infringements identified reached the threshold of ‘egregious’ under its public sector approach, so a reprimand has been issued instead. This approach, which was extended recently after a two-year trial, ‘prioritises early engagement and other enforcement tools such as warnings, reprimands, and enforcement notices, while issuing fines for only the most egregious breaches in the public sector’, so says the ICO. Not everyone agrees. The law firm Handley Gill has just published an analysis of the ICO’s public sector approach trial and the new version of it, essentially concluding that reprimands unaccompanied by enforcement notices won’t achieve the stated objective of driving up data protection standards in the public sector. 

The ICO highlights the following key lessons from this reprimand: 

  • Establish clear publication protocols: Sensitive documents should go through a formal review and approval process before being published online. A multi-step sign-off process can help prevent errors. 
  • Understand the data you handle: Every team, especially those handling public-facing content, must be trained to recognise personal information and assess its sensitivity in context. This includes understanding the reputational and emotional impact of disclosure. 
  • Centralise and classify documents: Use secure, shared repositories with clear access controls and classification labels. Avoid reliance on personal storage systems such as OneDrive and Google Drive. 
  • Define roles and responsibilities: Ensure that everyone involved in publishing content understands their role and the checks required before publication. 
  • Tailor training to the task: General data protection training is not enough. Teams need specific guidance on publishing protocols, data classification, and risk awareness.  

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. The new (2nd) edition of the UK GDPR Handbook has been published. It contains all the changes made by the Data (Use and Access) Act 2025. 

ICO Public Sector Enforcement Policy to Continue

Last month, the Information Commissioner’s Office (ICO) announced that it will continue its controversial approach to enforcement of the UK GDPR against public sector organisations.   

A trial of the approach was launched in June 2022, in an open letter to public authorities from John Edwards. In the letter Mr Edwards indicated that greater use would be made of the ICO’s wider powers, including warnings, reprimands and enforcement notices, with fines only issued in the most serious cases. This approach has seen much criticism levelled at the ICO. Opponents say that it reduces the importance of data protection and gives special treatment to the public sector.  

One example of the approach is the ICO’s action (or lack of it) over the Ministry of Defence’s Afghan data breach. This involved an MoD official mistakenly emailing a spreadsheet containing personal details of over 18,000 Afghan nationals who had applied to move to the UK under the Afghan Relocations and Assistance Policy. The breach was only discovered in August 2023, when excerpts of the data appeared on Facebook. By then, the damage was done. A new resettlement scheme for those on the leaked list was set up and has seen 4,500 Afghans arrive in the UK so far. The Afghan Relocation Route has cost £400m to date, and the Government has said it is expected to cost a further £450m. Despite the scale and sensitivity of the breach, the ICO decided not to take any regulatory action; not even a reprimand! In its press release, the ICO praised the MoD’s internal investigation and mitigation efforts, stating that “no further regulatory action is required at this time”. 

Following a review last year, and despite strong criticism of its enforcement track record, the ICO has now announced that it will continue its public sector enforcement approach. In his blog post, John Edwards said: 

“Fines in the public sector, particularly in local government, risk punishing the same people harmed by a breach by reducing budgets for vital services. They still have their place in some cases, but so do other enforcement tools.  

The review of our public sector approach trial reaffirmed that reprimands drive change and publishing them creates strong reputational incentives for compliance, while also offering other organisations valuable lessons from the mistakes of others… 

Focusing on a proactive approach of working with organisations to identify risks and implement improvements can influence sustainable change, protect public trust, and ensure taxpayer money is invested in prevention rather than punishment. The net benefit of this approach is higher data protection standards and faster remediation, backed by sanctions when necessary.” 

Following a consultation earlier this year, the ICO has also published a clearer definition of organisations in scope and the circumstances under which a fine may be issued.  

STOP PRESS: The law firm, Handley Gill, has just published an analysis of the ICO’s Public Sector Approach trial and the new version of it, essentially concluding that reprimands unaccompanied by enforcement notices won’t achieve the stated objective of driving up data protection standards in the public sector.

Revised GDPR Handbook  

The data protection landscape continues to evolve. With the Data (Use and Access) Act 2025 now in force, practitioners need to ensure their materials reflect the latest changes to the UK GDPR, Data Protection Act 2018, and PECR. 

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.   

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop.