Home Office Acknowledges Racial and Gender Bias in UK Police Facial Recognition Technology

Facial recognition is often sold as a neutral, objective tool. But recent admissions from the UK government show just how fragile that claim really is.

New evidence has confirmed that facial recognition technology used by UK police is significantly more likely to misidentify people from certain demographic groups. The problem is not marginal, and it is not theoretical. It is already embedded in live policing.

A Systematic Pattern of Error

Independent testing commissioned by the Home Office found that false-positive rates increase dramatically depending on ethnicity, gender, and system settings.

At lower operating thresholds — where the software is configured to return more matches — the disparity becomes stark. White individuals were falsely matched at a rate of around 0.04%. For Asian individuals, the rate rose to approximately 4%. For Black individuals, it reached about 5.5%. The highest error rate was recorded among Black women, who were falsely matched close to 10% of the time.

The data highlights a striking imbalance: relative to white individuals, Asian individuals were misidentified around 100 times more frequently and Black individuals nearly 140 times more frequently, while women faced error rates roughly double those of men.
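As a quick sanity check, these disparities can be derived directly from the percentages reported above with a few lines of arithmetic (a sketch; the rates are those cited from the Home Office-commissioned testing, and nothing else is assumed):

```python
# False-positive rates at lower operating thresholds, as reported in the
# Home Office-commissioned testing (values are percentages).
rates = {
    "white": 0.04,
    "asian": 4.0,
    "black": 5.5,
    "black_women": 10.0,
}

baseline = rates["white"]

# How many times more often each group is falsely matched than white individuals.
disparity = {group: rate / baseline for group, rate in rates.items()}

for group, factor in disparity.items():
    print(f"{group}: about {factor:.0f}x the white false-match rate")
```

On these figures the factor is roughly 100x for Asian individuals, about 138x for Black individuals, and about 250x for Black women.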

Why This Is Not an Abstract Risk

This technology is already in widespread use. Police forces rely on facial recognition to analyse CCTV footage, conduct retrospective searches across custody databases, and, in some cases, deploy live systems in public spaces.

The scale matters. Thousands of retrospective facial recognition searches are conducted each month. Even a low error rate, when multiplied across that volume, results in a significant number of people being wrongly flagged.
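To illustrate that multiplication, the sketch below assumes a hypothetical volume of 5,000 retrospective searches per month (the figure is ours for illustration; the reporting says only "thousands") and applies the reported false-positive rates:

```python
# The monthly volume is a hypothetical assumption for illustration only;
# reporting states merely that "thousands" of searches occur each month.
MONTHLY_SEARCHES = 5_000

def expected_false_flags(rate_percent: float, searches: int) -> float:
    """Expected number of people wrongly flagged, given a false-positive
    rate (expressed as a percentage) and a number of searches."""
    return searches * rate_percent / 100

# At the lowest reported rate (0.04%) this is about 2 people per month;
# at the ~5.5% rate reported for Black individuals it is 275 per month.
low = expected_false_flags(0.04, MONTHLY_SEARCHES)
high = expected_false_flags(5.5, MONTHLY_SEARCHES)
```

The point is not the exact numbers, which depend entirely on the assumed volume, but how quickly a percentage-point difference compounds at scale.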

A false match can lead to questioning, surveillance, or police intervention. Even if officers ultimately decide not to act, the encounter itself can be intrusive, distressing, and damaging. These effects do not disappear simply because a human later overrides the system.

Bias, Thresholds, and Operational Reality

For years, facial recognition vendors and public authorities argued that bias could be controlled through careful configuration. In controlled conditions, stricter thresholds reduce error rates. But operational pressures often incentivise looser settings that generate more matches, even at the cost of accuracy.

The government’s own findings now confirm what critics have long warned: fairness is conditional. Bias does not vanish; it shifts depending on how the system is used.

The data also shows that demographic impacts overlap. Women, older people, and ethnic minorities are all more likely to be misidentified, with compounded effects for those who sit at multiple intersections.

Expansion Amid Fragile Trust

Despite these findings, the government is consulting on proposals to expand national facial recognition capability, including systems that could draw on large biometric datasets such as passport and driving licence records.

Ministers have pointed to plans to procure newer algorithms and to subject them to independent evaluation. While improved testing and oversight are essential, they do not answer the underlying question: should surveillance infrastructure be expanded while known structural risks remain unresolved?

Civil liberties groups and oversight bodies have described the findings as deeply concerning, warning that transparency, accountability, and public confidence are being strained by the rapid adoption of opaque technologies.

This Is a Governance Issue, Not Just a Technical One

Facial recognition is not simply a question of software performance. It is a question of how power is exercised and how risk is distributed.

When automated systems systematically misidentify certain groups, the consequences fall unevenly. Decisions about who is stopped, questioned, or monitored start to reflect the limitations of technology rather than evidence or behaviour.

Once such systems become normalised, rolling them back becomes difficult. That is why scrutiny matters now, not after expansion.

If technology is allowed to shape policing, the justice system, and public space, it must be subject to the highest standards of accountability, fairness, and democratic oversight.

These and other developments in the use of artificial intelligence, surveillance, and automated decision-making will be examined in detail in our AI Governance Practitioner Certificate training programme, which provides a practical and accessible overview of how AI systems are developed, deployed, and regulated, with particular attention to risk, bias, and accountability.

Post Office Reprimand Following Horizon Data Breach 

You would think that the Post Office had learnt its lessons from the Horizon IT scandal, and that it would have taken extra care to ensure that the victims of the UK’s most widespread miscarriage of justice were not further harmed by its handling of the aftermath. Not so, judging by the Information Commissioner’s Office (ICO) announcement on Tuesday.

The ICO has issued a reprimand to Post Office Limited following an ‘entirely preventable’ data breach which resulted in the unauthorised disclosure of personal data belonging to hundreds of postmasters who were the victims of the Horizon IT scandal.  The breach occurred when the Post Office’s communications team mistakenly published an unredacted version of a legal settlement document on its corporate website. The document contained the names, home addresses and postmaster status of 502 people who were part of group litigation against the organisation. The document remained publicly accessible for almost two months in 2024, before being removed following notification from an external law firm. 

During its investigation, the ICO found that the Post Office failed to implement appropriate technical and organisational measures to protect people’s personal data. There was a lack of documented policies or quality assurance processes for publishing documents on the Post Office website, as well as insufficient staff training, with no specific guidance on information sensitivity or publishing practices.  

In the ‘good old days’ such a data breach would have attracted a substantial fine, especially considering the impact on the victims described by their lawyers (‘the shock and anxiety of this incident cannot help but compound all of the adverse harms suffered by our clients as a result of the wider Horizon scandal’). Remember when the ICO fined the Cabinet Office £500,000 for disclosing the postal addresses of 2020 New Year Honours recipients online?

But we are in a new age of GDPR ‘enforcement’! The ICO says it had initially considered imposing a fine of up to £1.094 million on Post Office Limited. However, it did not consider that the data protection infringements identified reached the threshold of ‘egregious’ under its public sector approach, so a reprimand has been issued instead. This approach, which was recently extended after a two-year trial, ‘prioritises early engagement and other enforcement tools such as warnings, reprimands, and enforcement notices, while issuing fines for only the most egregious breaches in the public sector’, so says the ICO. Not everyone agrees. The law firm Handley Gill has just published an analysis of the ICO’s public sector approach trial and its new version, essentially concluding that reprimands unaccompanied by enforcement notices won’t achieve the stated objective of driving up data protection standards in the public sector.

The ICO highlights the following key lessons from this reprimand: 

  • Establish clear publication protocols: Sensitive documents should go through a formal review and approval process before being published online. A multi-step sign-off process can help prevent errors. 
  • Understand the data you handle: Every team, especially those handling public-facing content, must be trained to recognise personal information and assess its sensitivity in context. This includes understanding the reputational and emotional impact of disclosure. 
  • Centralise and classify documents: Use secure, shared repositories with clear access controls and classification labels. Avoid reliance on personal storage systems such as OneDrive and Google Drive. 
  • Define roles and responsibilities: Ensure that everyone involved in publishing content understands their role and the checks required before publication. 
  • Tailor training to the task: General data protection training is not enough. Teams need specific guidance on publishing protocols, data classification, and risk awareness.  

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. The new (2nd) edition of the UK GDPR Handbook has been published. It contains all the changes made by the Data (Use and Access) Act 2025.

ICO Public Sector Enforcement Policy to Continue

Last month, the Information Commissioner’s Office (ICO) announced that it will continue its controversial approach to enforcement of the UK GDPR against public sector organisations.   

A trial of the approach was launched in June 2022, in an open letter to public authorities from John Edwards. In the letter Mr Edwards indicated that greater use would be made of the ICO’s wider powers, including warnings, reprimands and enforcement notices, with fines only issued in the most serious cases. This approach has seen much criticism levelled at the ICO. Opponents say that it reduces the importance of data protection and gives special treatment to the public sector.  

One example of the approach is the ICO’s action (or lack of it) in the Ministry of Defence’s Afghan data breach. This involved an MoD official mistakenly emailing a spreadsheet containing personal details of over 18,000 Afghan nationals who had applied to move to the UK under the Afghan Relocations and Assistance Policy. The breach was only discovered in August 2023, when excerpts of the data appeared on Facebook. By then, the damage was done. A new resettlement scheme for those on the leaked list was set up and has seen 4,500 Afghans arrive in the UK so far. The Afghan Relocation Route has cost £400m so far, and the Government has said it is expected to cost a further £450m. Despite the scale and sensitivity of the breach, the ICO decided not to take any regulatory action; not even a reprimand! In its press release, the ICO praised the MoD’s internal investigation and mitigation efforts, stating that “no further regulatory action is required at this time”.

Following a review last year, and despite strong criticism of its enforcement track record, the ICO has now announced that it will continue its public sector enforcement approach. In his blog post, John Edwards said:

“Fines in the public sector, particularly in local government, risk punishing the same people harmed by a breach by reducing budgets for vital services. They still have their place in some cases, but so do other enforcement tools.  

The review of our public sector approach trial reaffirmed that reprimands drive change and publishing them creates strong reputational incentives for compliance, while also offering other organisations valuable lessons from the mistakes of others… 

Focusing on a proactive approach of working with organisations to identify risks and implement improvements can influence sustainable change, protect public trust, and ensure taxpayer money is invested in prevention rather than punishment. The net benefit of this approach is higher data protection standards and faster remediation, backed by sanctions when necessary.” 

Following a consultation earlier this year, the ICO has also published a clearer definition of organisations in scope and the circumstances under which a fine may be issued.  


Revised GDPR Handbook  

  The data protection landscape continues to evolve. With the Data (Use and Access) Act 2025 now in force, practitioners need to ensure their materials reflect the latest changes to the UK GDPR, Data Protection Act 2018, and PECR.  

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.   

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop. 

Proposed Changes to the EU GDPR: Could we see more changes to the UK GDPR?

Yesterday the European Commission published its long-awaited Digital Omnibus Regulation Proposal and Digital Omnibus on AI Regulation Proposal. If approved, these proposals will mean significant changes to the EU GDPR and other EU legislation, and may even encourage the UK to further amend the UK GDPR.

The aim of the “Digital Omnibus” package is to ease administrative burdens for businesses across areas like privacy, cybersecurity and artificial intelligence. Although the EU GDPR is considered balanced and fit for purpose, “targeted changes” are proposed to address concerns, particularly from smaller companies. These include:

  • Clarification of Definitions: The definition of “personal data” is clarified. Information is not considered personal to a company if it does not possess means “reasonably likely” to be used to identify an individual.
  • Processing for AI Training: It is clarified that the processing of personal data for the development and training of AI systems can constitute a “legitimate interest” under certain conditions.
  • Simplified Reporting of Data Breaches: The reporting obligation to supervisory authorities is aligned with the threshold for notifying data subjects. A report is only required if there is a “high risk” to the rights and freedoms of natural persons. The deadline for reporting is extended to 96 hours.
  • Harmonization of Data Protection Impact Assessments (DPIA): National lists of processing operations requiring a DPIA (or not) are to be replaced by unified EU-wide lists to promote harmonisation.
  • Scientific Research: The conditions for data processing for scientific research purposes are clarified by defining “scientific research” and clarifying that this constitutes a legitimate interest.

The EU AI Act also faces a number of amendments, including simplifications for small and medium-sized enterprises and small mid-cap companies in the form of pared-back technical documentation requirements. Other measures involve sandboxes for real-world testing and to “reinforce the AI Office’s powers and centralise oversight of AI systems built on general-purpose AI models, reducing governance fragmentation”.

Both omnibus packages now have a long road ahead as they enter trilogue negotiations with the European Parliament and the Council of the European Union. Negotiations are expected to take at least several months to finalise.

Impact on the UK

The UK has already enacted its own package of amendments to the UK GDPR in the form of the Data (Use and Access) Act 2025 which received Royal Assent on 19th June 2025. The amendments are quite modest even before comparing them to the EU proposals above. 

A bolder list of amendments was contained in the Data Protection and Digital Information Bill, published in 2022 by the Conservative Government. This included proposals to amend the definition of personal data and to replace Data Protection Officers with Senior Responsible Individuals. The bill was later replaced by a diluted bill of the same name (the No. 2 Bill), only for that to be dropped in the Parliamentary “wash up” stage before the last General Election.

Could the EU reforms (if enacted) lead to the UK making more fundamental changes to the UK GDPR? We doubt it. The Labour Government has more pressing priorities and, with the passing of the DUA Act, it can say it has “done GDPR reform”. If we get a change in Government, then Reform and the Conservatives might target the UK GDPR as a way of reining in “pesky human rights laws”.

Data protection professionals need to assess the changes to the UK data protection regime made by the DUA Act. Our half day workshop will explore the Act in detail giving you an action plan for compliance. A revised UK GDPR Handbook is now available incorporating the changes made by the DUA Act. 

New Guidance on AI Risk Management

The development, procurement and deployment of AI systems involving the processing of personal data raises significant risks to data subjects’ fundamental rights and freedoms, including but not limited to privacy and data protection. The principle of accountability enshrined in the UK GDPR and the EU GDPR requires Data Controllers to identify and mitigate these risks, and to demonstrate how they did so. This is especially important for AI systems that are the product of intricate supply chains, often involving multiple actors processing personal data in different capacities.

The European Data Protection Supervisor (EDPS) has just released an important new guidance document to help organisations conduct data protection risk assessments when developing, procuring, or deploying AI systems.  It focuses on the risk of non-compliance with certain data protection principles for which the mitigation strategies that controllers must implement can be technical in nature – namely fairness, accuracy, data minimisation, security and data subjects’ rights. 

Key sections of the document address:

  • the risk management methodology according to ISO 31000:2018
  • the typical development lifecycle of AI systems as well as the different steps involved in their procurement 
  • the notions of interpretability and explainability 
  • an analytical framework for identifying and treating risks that may arise in AI systems, structured according to the data protection principles potentially affected. 

The EDPS has issued this guidance in his role as the data protection supervisory authority for EU institutions. However, it is a very useful document for any organisation deploying AI that requires guidance on how to systematically assess the risks from a data protection perspective.

Our AI Governance Practitioner Certificate course is designed to equip Information Governance professionals with the essential knowledge and skills to manage the risks of AI deployment within their organisations. This year, 50 delegates from a variety of backgrounds have successfully completed the course, giving great feedback.

The first course of 2026 starts on 8th January. Places are limited so book early to avoid disappointment. If you require an introduction to AI and information governance, please consider booking on our one day workshop.

ICO Enforcement Guidance Consultation Launched 

The Information Commissioner’s Office has launched a consultation on new guidance setting out how it approaches investigations and takes enforcement action. Among other things, the guidance explains:  

  • How the ICO decides whether to open an investigation and the other ways it may instead seek to resolve any concerns. 
  • What to expect from the ICO during an investigation. 
  • How it will use its information gathering powers, including new powers under the Data (Use and Access) Act 2025 to require people to answer questions and organisations to provide reports.  
  • How the ICO decides on the outcome of an investigation and use of its enforcement powers, such as warnings, reprimands, and enforcement and penalty notices. 
  • When it considers settlement with a reduced fine is appropriate and the process involved.  

The new guidance, once finalised, will sit alongside the ICO’s Data Protection Fining Guidance published last year. Together they will replace the statutory guidance currently set out in the Regulatory Action Policy.  

The Data (Use and Access) Act 2025 also includes provisions that will bring the ICO’s investigatory and enforcement powers under the Privacy and Electronic Communications Regulations 2003 (PECR) broadly into line with its powers under the data protection legislation.  While there remain some differences, the ICO proposes to generally take the same approach to the use of its powers in relation to PECR as set out in the draft guidance in relation to the data protection legislation.  

The consultation will run for 12 weeks until Friday 23 January 2026.   

In case you missed it… 

In October, Capita was fined £14 million following a cyber-attack in March 2023 which saw hackers gain access to 6.6 million people’s personal data: from pension and staff records to the details of customers of organisations Capita supports. For some people, this included details of criminal records and financial data. This and other recent cyber-attacks have increased the importance of cyber security training. We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. See also our Managing Personal Data Breaches Workshop.

Also in October, the BBC reported that Gregg Wallace, the former MasterChef presenter, has issued proceedings against the BBC and BBC Studios for failing to respond to his subject access requests (SARs) in accordance with the UK GDPR. Wallace was sacked by the BBC in July following an inquiry into alleged misconduct. As the saying goes, “Revenge is a dish best served cold!” Any BBC executives reading this (if you are not too busy at the moment) are advised to attend our How to Handle a Subject Access Request workshop. No doubt there will be a few more SARs to the BBC in the coming weeks…

The Information Commissioner, John Edwards, recently gave evidence to the House of Commons Science, Innovation and Technology Committee. Mr Edwards faced some tough questions about his response to the MoD’s Afghan data breach (see above). This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop.

Finally, there are only two FOI Practitioner Certificate courses left till Christmas! Full details of the course are below.

Staying Up to Date: The UK GDPR Handbook (2nd Edition) 

The data protection landscape continues to evolve. With the Data (Use and Access) Act 2025 now in force, practitioners need to ensure their materials reflect the latest changes to the UK GDPR, Data Protection Act 2018, and PECR.

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context.

This edition also covers the amendments made to Article 17 (right to erasure) under the Victims and Prisoners Act 2024, ensuring readers have a complete view of the current regime.

Act Now has included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. As before, the aim is clarity and usability, helping practitioners work confidently within a complex framework.

And for each handbook sold, £1 is donated to the Rainfall Foundation, supporting the reintegration of prison leavers into society; a reminder that compliance and community impact can go hand in hand.

If you’re revisiting your data protection resources this year, this updated edition is a good place to start. Order your copy here.

Information Commissioner Grilled in Parliament 

Last week the Information Commissioner, John Edwards, gave evidence to the House of Commons  Science, Innovation and Technology Committee.  

Mr Edwards faced some tough questions about his response to the Afghan data breach, in which a Ministry of Defence (MoD) official mistakenly emailed a spreadsheet containing personal details of over 18,000 Afghan nationals who had applied to move to the UK under the Afghan Relocations and Assistance Policy (ARAP). The breach was only discovered in August 2023, when excerpts of the data appeared on Facebook. By then, the damage was done. A new resettlement scheme for those on the leaked list was set up and has seen 4,500 Afghans arrive in the UK so far. The Afghan Relocation Route has cost £400m so far, and the Government has said it is expected to cost a further £450m.  

It’s fair to say that overall the committee was not impressed with the ICO’s approach and John Edwards’ answers to some of their questions. Kit Malthouse claimed that the Afghan data breach was dealt with through “a few unrecorded meetings and a handshake”.

Mr Edwards also answered questions about his wider remit. He slipped in that he has served a Notice of Intent on a social media company (Reddit), but did not give any details. If you missed the live session, you can still watch the recording; the Information Commissioner’s session starts at 9:46 on the recording here. If you prefer to read an account of his performance, the Independent covers it here.

FOI Practitioner Certificate: Final Two Courses for 2025 

There are only two FOI Practitioner Certificate courses left till Christmas!  

This foundation course is designed for those wishing to acquire detailed knowledge of the FOI and develop the practical skills to enable them to become a more effective FOI Officer.  The syllabus has been developed by FOI experts after analysing all the skills, knowledge and competencies required for the FOI Officer role. By the end of the course, you will be able to practically handle FOI requests, apply the exemptions and draft Refusal Notices. You will also be able to differentiate between FOI requests and requests under the Environmental Information Regulations. 

The course takes place over four days followed by an assessment. Our teaching style is based on practical and engaging workshops covering the theory alongside hands-on application using real life case studies and exercises. Personal tutor support throughout the course, detailed course materials and a comprehensive online resource lab, ensure the best opportunity for success. 

The FOI Learning Pathway  

The updated FOI Practitioner Certificate is part of our learning pathway for FOI Officers. Once it is completed, delegates can move on to the Intermediate FOI Certificate, which strengthens the foundations established by the FOI Practitioner Certificate. Topics include interpreting information requests, navigating data repositories for relevant information, handling vexatious requests and applying the exemptions. Time will also be spent discussing the historical development and transformative impact of FOI on transparency, accountability and citizen empowerment. International comparisons with the FOI Act will broaden delegates’ perspectives, while critically evaluating its impact and effectiveness will help them appreciate the importance of transparency and accountability. By the end of the course, delegates will have gained skills in, amongst other things, effectively interpreting information requests, assessing their scope, retrieving relevant information, overcoming challenges in organisational compliance, applying exemptions and crafting clear Refusal Notices.
 
If you would like a chat to discuss your suitability for any of our certificate courses, please get in touch. 

Prince Andrew: The Data Protection Angle 

Over the weekend, the Mail on Sunday piled more pressure on Prince Andrew.  

It alleged that he asked his police protection officer to investigate his accuser, Virginia Giuffre,  just before the newspaper published a photo of Ms Giuffre’s first meeting with the prince in February 2011. The Mail alleges that Prince Andrew gave the officer her date of birth and social security number. The Sunday Telegraph also claimed that he “sought to dig up dirt” on Ms Giuffre. 

Ms Giuffre, who took her own life earlier this year, said she was among the girls and young women sexually exploited by convicted sex offender Jeffrey Epstein and his wealthy circle. Prince Andrew has consistently denied all allegations against him. 

The Metropolitan Police said on Sunday, “We are aware of media reporting and are actively looking into the claims made.” Of course we don’t have detailed information about the circumstances around the latest allegations against Prince Andrew, but (if true) there is a possible breach of Section 170 of the Data Protection Act 2018 (DPA). This makes it a criminal offence for a person to knowingly or recklessly:

(a) obtain or disclose personal data without the consent of the controller,  

(b) procure the disclosure of personal data to another person without the consent of the controller, or  

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained. 

So if the latest allegations are true, Prince Andrew and/or his police protection officer at the time, could have committed a criminal offence under the DPA 2018. Unlike the other allegations against him, this offence does not carry a prison term; just a fine. Successive Information Commissioners have argued that a custodial sentence under S.170 would be a better deterrent (but to no avail).  

Will the Information Commissioner’s Office be knocking on Prince Andrew’s door? In June 2023, the ICO disclosed that, since 1st June 2018, 92 cases involving S.170 offences had been investigated by its Criminal Investigations Team. There have been a number of more recent S.170 prosecutions. These often involve people accessing or disclosing confidential information for financial gain.

Depending again on the circumstances, there may also be an offence under section 1 of the Computer Misuse Act 1990, which carries tougher sentences including a maximum of two years’ imprisonment on indictment. In July 2022, a woman who worked for Cheshire Police pleaded guilty to using police data systems to check up on ex-partners, and in August 2022 the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of customers’ personal data from vehicle repair garages to generate potential leads for personal injury claims.
