Dr. Malkiat Thiarai Joins the Act Now Team 

Act Now is delighted to welcome Dr. Malkiat Thiarai to our team of associates. 

Our associates play a vital role in delivering our mission: helping to create a more privacy-conscious world by educating IG professionals. At Act Now, we pride ourselves on providing training that is clear, practical, and jargon-free — making complex topics accessible and engaging. 

Every one of our associates brings extensive real-world experience from the information governance sector. This expertise enriches our courses and ensures they remain relevant, insightful, and highly rated by our delegates. We are excited to have Dr. Thiarai join us in continuing this tradition of excellence. 

Dr. Malkiat Thiarai has worked for Birmingham City Council for over 30 years and has led the information governance function for over 20 years. He is currently the Head of Practice – Corporate Information Management and part of the council’s Digital and Technology Services multi-disciplinary leadership team. His role encompasses the duties of the Data Protection Officer as well as other aspects of information governance. He helps to improve the management of council data assets and provides strategic and operational leadership of information management. 

In 2021, Dr. Malkiat successfully completed a PhD in Urban Science at the University of Warwick. His research focused on understanding the challenges of, and capacity for, using personal data held within public sector organisations for research purposes, and on using that analysis to develop new models of service delivery centred on social care data whilst balancing individuals’ rights to privacy and a private life. He previously completed an LLM in Information Rights and Law as well as an MBA in Public Service. 

Dr. Malkiat will be developing new courses around his area of expertise and sitting on our curriculum and exam board. He will also be assisting our team to deliver everything from one-day workshops to advanced practitioner certificate courses. 

Ibrahim Hasan, Director of Act Now Training, said:  

“I am very pleased that Dr. Malkiat has joined our team. I have known Malkiat for over 25 years. I am confident that, with his strong academic background coupled with many years of experience working in IG, he will be a great addition to our team, developing innovative curricula to help foster a culture of responsible data usage, build public trust and drive positive change.” 

The Role of PACE in Local Authority Regulatory Investigations 

For local authority investigators, interviewing is at the heart of effective casework. Interviews aren’t just fact-finding conversations; they are a formal investigative tool with legal significance. The way you conduct them can determine whether your evidence stands up in court or during enforcement action. 

But good interviewing isn’t just about instinct or experience. It requires a clear understanding of the law, particularly the Police and Criminal Evidence Act 1984 (PACE), and a professional approach supported by structured techniques like the PEACE model (Planning and preparation, Engage and explain, Account, Closure, Evaluation). 

When interviews are handled lawfully and skilfully, they generate reliable evidence, support sound decision-making, and protect the public interest. When mishandled, they can result in inadmissible evidence, failed prosecutions, or reputational damage to your authority. 

PACE  

PACE isn’t just for the police. If your investigation might result in a criminal prosecution, PACE applies to you too. This includes interviews carried out under caution by local authority officers acting in their enforcement role, whether you’re interviewing a business owner suspected of misleading trading, a landlord accused of a housing offence or a shopkeeper breaching licensing conditions. 

PACE protects the rights of suspects and ensures fairness in the gathering of evidence. The key provisions every local authority investigator must know include: 

Caution 
You must caution a person before asking questions if you suspect them of an offence and intend to use their answers in evidence. The standard caution reads: 
“You do not have to say anything, but it may harm your defence if you do not mention when questioned something which you later rely on in court. Anything you do say may be given in evidence.” 

Using the wrong caution, or failing to use it when required, risks making the evidence inadmissible. 

Right to Legal Advice 
Under PACE, suspects have the right to free legal advice. You must make them aware of this right before the interview starts. Proceeding without making this clear can jeopardise your case. 

Recording Interviews (Code E) 
Local authority investigators must follow the rules for audio-recording interviews when interviewing suspects for indictable offences or where required by enforcement policy. Correct handling, sealing, and storage of recordings protect both you and the interviewee. 

Safeguarding Vulnerable People 
If the interviewee is under 18 or considered vulnerable (for reasons such as mental health or learning difficulties), an appropriate adult must be present during the interview. Failing to ensure this safeguard can invalidate the interview. 

Avoiding Oppression and Misconduct 
You must always act with integrity and fairness, even in difficult interviews. 

Any evidence obtained through threats, unfair pressure, or oppressive behaviour is likely to be excluded.  

PEACE 

While PACE sets the legal rules, PEACE (Planning and preparation, Engage and explain, Account, Closure, Evaluation) provides a practical structure for conducting effective, professional interviews in the regulatory enforcement context. 

Unlike formal suspect interviews under PACE, PEACE can also help structure fact-finding interviews with witnesses, business representatives, or those who may later become suspects. 

1. Planning and preparation: Successful interviews start long before you sit down with the interviewee. Good planning involves: 

  • Clarifying your interview objectives. 
  • Understanding the evidence you already have. 
  • Deciding whether a caution is required. 
  • Considering the need for legal advice or an appropriate adult. 
  • Structuring your questions logically. 

Inadequate planning often leads to missed opportunities, legal errors or unreliable evidence. 

2. Engage and explain: Your professional approach is crucial. This includes: 

  • Building rapport and explaining the purpose of the interview. 
  • Clarifying rights and procedures, including the right to legal advice and, if relevant, explaining the caution. 
  • Being neutral, objective, and professional throughout. 

Your approach can affect the cooperation of the interviewee and the credibility of the evidence obtained. Experienced interviewers report that a constructive rapport between interviewer and suspect correlates with the suspect providing more information. 

3. Account, clarification and challenge: This is the main part of the interview. It includes: 

  • Starting with open questions. 
  • Gaining the suspect’s account of what has happened in relation to the suspected offence. 
  • Gradually asking more specific questions. 
  • Using closed questions, if required, to obtain finer details. 

4. Closure: Closing an interview properly matters. You should: 

  • Summarise key points with the interviewee. 
  • Offer them the chance to clarify or add anything. 
  • Explain what will happen next in the investigation. 
  • Ensure all paperwork, recordings, and notes are accurate and complete. 

Closure isn’t just administrative; it helps protect the integrity of the investigation. 

5. Evaluation: After the interview, critically assess what happened. Ask yourself: 

  • Did I meet my objectives? 
  • Was the interview PACE compliant? 
  • Has new information come to light requiring further action? 
  • Are my records and recordings complete? 

The evaluation stage reinforces accountability and learning, helping you improve your practice and ensure evidential quality. 

Regulatory investigations often operate in complex legal and social environments. PACE protects the rights of individuals and the admissibility of evidence. PEACE helps you apply structure, professionalism, and investigative skill. Mastering both frameworks is key to investigative success. 

Training 

Interviewing is a professional skill, and like any skill, it needs regular practice and updating. 

  • PACE Training: Make sure you’re familiar with the latest Codes of Practice, particularly around cautions, legal rights, and vulnerable interviewees. 
  • PEACE Interview Skills: Keep refining your questioning techniques, planning, and post-interview evaluation. 
  • Scenario-Based Practice: Realistic training scenarios help bridge the gap between theory and practice. 

Regular training not only sharpens your skills but demonstrates your authority’s commitment to lawful and effective enforcement. 

Act Now has a range of customised in-house training courses on RIPA, PACE, investigations and interview techniques. Our associates include Naomi Mathews, a Senior Solicitor who was a co-ordinating officer for RIPA at a large local authority in the Midlands. Naomi has extensive experience in all areas of regulatory law and investigations. She has worked as a defence solicitor in private practice and as a prosecutor for the local authority in a range of regulatory matters including Trading Standards, Health and Safety and Environmental prosecutions. Naomi has higher rights of audience to present cases in the Crown Court. 

Get in touch if you would like a free 30-minute consultation to discuss your training needs. 

AI Governance Practitioner Certificate: First Cohort Successfully Completes Course 

Act Now is pleased to report that the first cohort of its new AI Governance Practitioner Certificate has successfully completed the course. 

This course is designed to equip Information Governance professionals with the essential knowledge and skills to navigate AI deployment within their organisations. As we detailed in our previous blog “What is the role of IG Professionals in AI Governance?”, IG professionals should be aware of how this technology works so that they can help to ensure that there is responsible deployment from an IG perspective, just as would be the case with any new technology.   

The first course ran over a four-week period in May and June, with ten delegates from the health sector in Wales. They all successfully completed the course assessment in July. 

The course was extremely well received by the delegates who complimented us on the scope of the syllabus and the delivery style: 

“I took a huge amount from the course which will help shape the development of processes for us internally in the coming months.” Dave Parsons, WASPI Code Manager (Wales Accord on the Sharing of Personal Information) 

“This was a superb course with a lot of information delivered at a carefully managed rate that encouraged discussion and reflection. Literacy in AI and its application is vital – without it we cannot comprehend the ever-changing level of IG threat and risk.” MA, Digital Health and Care Wales

“The training was very good. The instructor was also very knowledgeable about the subject.” HP, Digital Health and Care Wales

Cora Suckley, Information Governance Service Manager, Digital Health and Care Wales said: 

“The AI Governance Practitioner Certificate exceeded my expectations. The content was comprehensive and well-structured, successfully bridging the gap between technical AI concepts and essential governance frameworks. The course delved into responsible AI principles, risk management, compliance, policy and ethical considerations, equipping me with practical tools to navigate the evolving regulatory landscape. 

The instructor was excellent and made the sessions interactive, highly engaging and applicable, providing real-world examples. This course provides a solid foundation for implementing AI governance in a meaningful and effective way.” 

Two more cohorts are currently completing the course. The next course starts in September and has a few places left.  

Charity Receives £18,000 GDPR Fine

On Monday, a Scottish Charity (Birthlink) received a GDPR Monetary Penalty Notice of £18,000 after it destroyed approximately 4,800 personal records, up to ten percent of which may be irreplaceable. 

Birthlink is a charity specialising in post-adoption support and advice for people affected by adoption with a Scottish connection.
Since 1984 it has owned and maintained the Adoption Contact Register for Scotland. The Register allows adopted people, birth parents, birth relatives and relatives of an adopted person to register their details with the aim of being linked to and potentially reunited with family members. 

Key findings from the Information Commissioner’s Office (ICO) investigation include: 

  • Handwritten letters and photographs from birth parents were amongst the items destroyed 
  • Some people’s access to parts of their family histories and identities may have been permanently erased due to systematic data protection failures 
  • Poor records management means the true extent of the loss will never fully be known 
  • The charity had limited knowledge of its data protection obligations and lacked cost-effective, easy-to-implement policies and procedures which would likely have prevented the destruction 

Background 

In January 2021, Birthlink reviewed whether they could destroy ‘Linked Records’ as space was running out in the charity’s filing cabinets. ‘Linked Records’ are files of cases where people had already been linked with the person they sought and can include handwritten letters from birth parents, photographs, and copies of birth certificates.  

Following a February 2021 Board meeting, it was agreed no barriers to the destruction of records existed but that retention periods should apply to certain files and only replaceable records could be destroyed. Due to poor record keeping, it is estimated some records were destroyed on 15 April 2021 with a further 40 bags destroyed on 27 May 2021.  

In August 2023, following an inspection by the Care Inspectorate, the Birthlink Board became aware that irreplaceable items had in fact been destroyed as part of the overall record destruction. It reported the incident to the ICO. 

ICO Findings 

The ICO investigation found the following infringements of the UK GDPR: 

  1. Birthlink’s destruction of manual records containing personal data of approximately 4,800 of its service users, without authorisation or lawful basis (“Relevant Processing”), occurred as a result of its failure to implement appropriate organisational measures ensuring the security of the personal data contained in the records. In this regard, the ICO found that Birthlink contravened Articles 5(1)(f) and 32(1)-(2) of the UK GDPR (security). 
  2. A significant contributing factor leading to the Relevant Processing was Birthlink’s failure to demonstrate compliance with the data protection principles in accordance with Article 5(2) of the UK GDPR. Birthlink has accepted that it had a limited understanding of the UK GDPR at the time of the Relevant Processing, until around March 2023 when it introduced data protection training for its staff. 
  3. Despite acknowledging the high risk to affected service users arising from the Relevant Processing, Birthlink did not notify the ICO of the personal data breach until 8 September 2023. A delay of two years and five months represents a marked departure from the obligation to notify the ICO within 72 hours of becoming aware of a personal data breach, in accordance with Article 33(1) UK GDPR. 

Why a fine now? 

This fine comes two weeks after the catastrophic data breach involving the Ministry of Defence (MoD) was reported, following the High Court lifting a superinjunction. In February 2022, an MoD official mistakenly emailed a spreadsheet containing personal details of over 18,000 Afghan nationals who had applied to move to the UK under the Afghan Relocations and Assistance Policy (ARAP). The data breach also contained personal details of more than 100 British officials including those whose identities are most closely guarded; special forces and spies.  

Despite the scale and sensitivity of the MoD data breach, the ICO decided not to take any regulatory action; not even a reprimand! In its press release, the ICO praised the MoD’s internal investigation and mitigation efforts, stating that “no further regulatory action is required at this time”.  

The ICO has been heavily criticised for its inaction. The Commons Defence Committee said it would launch its own inquiry, and Dame Chi Onwurah, chair of the Commons Science, Innovation and Technology Committee, said she was writing to the Information Commissioner to push for an investigation. Following this, the Information Commissioner issued a further statement explaining the ICO’s approach.  

Of course, no one is suggesting that the Birthlink fine is an attempt by the ICO to move on from the MoD non-enforcement, but readers may at least wonder why a relatively small Scottish charity is fined whilst a large government department (which has been fined previously in similar circumstances) has faced no action at all.  

This case shows the importance of good records management in ensuring GDPR compliance. Our forthcoming workshop will help you implement records management best practice and understand how it can help manage the personal data lifecycle. 

First Commencement Order For the New Data (Use and Access) Act 2025

On 20th August 2025, some provisions of the Data (Use and Access) Act 2025 will come into force. 

The DUA Act received Royal Assent on 19th June 2025. It amends, rather than replaces, the UK GDPR as well as the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) and the Data Protection Act 2018. (You can read a summary of the Act here.)  

The only substantive amendment (Section 78) to the UK GDPR that came into force on 19th June inserted a new Article 15(1A), relating to subject access requests: 

“…the data subject is only entitled to such confirmation, personal data and other information as the controller is able to provide based on a reasonable and proportionate search for the personal data and other information described in that paragraph.” 

Other provisions of the Act will commence in stages, 2 to 12 months after Royal Assent, according to the Government’s recently published plans for commencement. 

The first commencement order, The Data (Use and Access) Act 2025 (Commencement No. 1) Regulations 2025, was made on 21st July. This does not bring into force any “show stoppers” (like Recognised Legitimate Interests, the new international transfer adequacy test or increased fines for PECR breaches). Rather, 20th August will see mostly technical provisions come into force, alongside new statutory objectives for the Information Commissioner’s Office when carrying out its functions, and provisions requiring the government to prepare a progress update and a report on copyright works and AI systems. The full list is here.

The government says that commencement of the main changes to data protection legislation (in Part 5 of the Act) will take place approximately 6 months after Royal Assent.  

Data protection professionals need to assess the changes to the UK data protection regime set out in the DUA Act. Our half-day workshop will explore the new Act in detail, giving you an action plan for compliance. 

When the RIPA Inspector Calls 

Every local authority using (or having the ability to use) covert surveillance, under the Regulation of Investigatory Powers Act 2000 (RIPA), should expect regular inspections by the Investigatory Powers Commissioner’s Office (IPCO). Typically, these are conducted every three years, though frequency may vary based on activity levels and past findings. These inspections are a key part of demonstrating lawful and proportionate use of surveillance powers.  

The Inspection Process 

IPCO inspections are now commonly conducted remotely, although on-site visits still occur when deemed necessary. You will usually be given advance notice and asked to submit key documents, including your RIPA policy, examples of authorisations (even if only historical), and training records. 

The inspection will generally follow this structure: 

  1. Document Review: The inspector will examine your authority’s policy and procedures to assess whether they reflect current law and Home Office Codes of Practice.  
  2. Case Sampling: Even if your authority hasn’t used RIPA powers in recent years, inspectors will want to see how you handle applications when they occur, or how you maintain readiness. If you have used powers, expect a thorough review of sample applications, authorisations, reviews, renewals and cancellations. 
  3. Interviews with Key Personnel: Typically, the inspector will speak with the Senior Responsible Officer (SRO), Authorising Officers and the RIPA Coordinator. They will be looking for a clear understanding of roles, responsibilities, and legal thresholds for authorisation. 
  4. Feedback and Report: The inspector will provide immediate feedback and later issue a formal report highlighting commendations, recommendations and any required actions. 

Common Inspection Findings 

As part of our provision of tailored in-house training, we read many IPCO inspection reports. The following is a list of common mistakes highlighted by IPCO. They are not attributable to any particular organisation.  

RIPA Forms 

  • Use of out-of-date forms 
  • No Unique Reference Number (URN)  
  • Not amending forms to show only the grounds available to the public authority (e.g. for councils, preventing or detecting crime)  
  • Pre-completed forms  
  • Use of cut-and-paste in boxes/repetitive narrative 

Authorisation Process  

  • Rubber stamping – no real thought given to authorisation  
  • Necessity, proportionality and collateral intrusion not fully understood/considered  
  • Likelihood of obtaining Confidential Information not fully considered 
  • Some ‘open source’ internet research is being conducted which may actually meet the criteria of Directed Surveillance and therefore require authorisation  
  • Confusion regarding reviews and renewals  
  • Lack of understanding of when a person is a CHIS 
  • Too many Authorising Officers 
  • Authorising Officers are not making adequate provision for destruction of product that is collateral intrusion or of no value to the operation  
  • Joint investigations without authorisation and/or record keeping 
  • Lack of robust management and quality assurance procedures  

Social Media 

  • Failing to consider the application of RIPA to social media monitoring 
  • Lack of understanding of when the Directed Surveillance and CHIS definitions are met 

Record Keeping  

  • Central records not compliant with the Code of Practice  
  • Inadequate monitoring, recording and audit of surveillance equipment  
  • Inadequate handling and storage of surveillance product/evidence 
 

Policies and Procedure Documents 

  • Inadequate/no RIPA policy  
  • Inadequate/out of date guidance document  
  • No CCTV protocol/procedure  

Preparing for an IPCO Inspection 

The key to a smooth inspection lies in preparation. This starts long before the inspection is announced: 

  1. Review and Update Your Policy Regularly: Your RIPA policy should be reviewed at least annually and whenever guidance or legislation changes. Make sure it is accessible to relevant staff and reflects current best practice. 
  2. Keep Your RIPA Registers in Order: Whether your authority uses an electronic register or paper records, they must be accurate and up to date. This includes entries for authorisations that were refused, cancelled or not proceeded with. 
  3. Prioritise Training (see below) 
  4. Test Your Processes: Carry out internal audits or mock inspections. Review recent authorisations (if any), check register completeness, and ensure all relevant staff understand their responsibilities. 
  5. Engage Your SRO: The SRO isn’t just a figurehead; they should champion compliance, oversee training provision, ensure policy updates, and actively monitor RIPA use within the authority. 
  6. Learn from Past Reports: If your authority has had previous inspections, review past reports and ensure all recommendations have been addressed. Be ready to explain what improvements have been made. 
  7. Stay Connected: Keep up with Home Office guidance, IPCO publications and professional networks. Sharing good practice with other local authorities can help avoid common pitfalls. 

Training and Awareness 

The last annual report (2023) published by IPCO states: 

“As a general rule, we encourage local authorities to ensure that authorising officers (AOs) and those members of staff engaged in investigative or enforcement roles, receive either classroom-based or online training from a trusted supplier on an annual or biennial basis.” 

When it comes to training, there is no one-size-fits-all solution. It should be tailored to the audience, their role, and how frequently they use surveillance powers. Consider: 

  • Initial Training for New Staff: Any officer designated as an Authorising Officer or investigator must receive formal RIPA training before undertaking the role. 
  • Refresher Training: Aim for annual refresher sessions. Even if you’ve had no activity, this keeps knowledge alive and demonstrates proactive governance. 
  • Wider Awareness Training: Consider regular briefings for investigative and enforcement teams so they understand when RIPA applies and how to seek authorisation. 

By embedding a culture of continual learning, maintaining robust policies and records, and keeping oversight active, you’ll not only pass your inspection with confidence but also ensure your authority upholds the highest standards of accountability and public trust. 

How We Can Help 

Act Now has a range of training solutions to help you raise RIPA awareness and prepare for IPCO inspections: 

  • RIPA Essentials: An e-learning course consisting of an animated video followed by an online quiz. In just 30 minutes your employees can learn about the main provisions of Part 2 of RIPA, including the different types of covert surveillance, the serious crime test and the authorisation process. The course also covers how RIPA applies to social media monitoring and how to handle the product of surveillance having regard to data protection.  
  • Online Workshops: Our RIPA workshops provide a thorough explanation of the RIPA requirements, processes and documentation to ensure compliance. Case studies and real-life examples help to embed the learning. 
  • In-House Training: We have RIPA experts who can deliver customised in-house training to your organisation, whether online or face to face. Our associates include Naomi Mathews, a Senior Solicitor and co-ordinating officer for RIPA at a large local authority in the Midlands. She is also the authority’s Data Protection Officer and Senior Responsible Officer for CCTV.  

Retail Under Siege Through AI-Enabled Cyber Attacks 

The UK retail sector has come under siege in 2025, with an unprecedented wave of cyber attacks. After the 2024 Ticketmaster breach, in which millions of users were affected, one would assume retailers had taken note. However, from Marks & Spencer to Louis Vuitton, companies large and small are grappling with relentless, tech-enhanced intrusions that threaten customer trust and digital resilience. These days it feels like an almost daily occurrence to receive an email from a company apologising for a data breach, and no retailer seems safe regardless of its size or stature. Sometimes it is a retailer you may not even have shopped with for years, at which point you may well be thinking, ‘What is their data retention policy?’ 
 
Below we take a look at some of the major breaches and attacks of 2025 and what you can do to protect your information online. 

High-Profile Retail Cyberattacks of 2025 

Here’s a snapshot of the most disruptive recent cyber incidents: 

  • Louis Vuitton UK (July 2025) – Data breach: customer contact details and purchase history stolen; phishing scams followed. 
  • Marks & Spencer (April 2025) – Ransomware: £3.8m/day in lost revenue; £700m wiped off market value; credential theft via a vendor. 
  • Harrods (May 2025) – Attempted breach: contained in real time; no confirmed data loss but serious operational disruption. 
  • Co-op UK (May 2025) – Ransomware: customer data compromised; back-office systems disabled. 
  • Peter Green Chilled (May 2025) – Ransomware: disrupted cold-chain deliveries to Tesco, Aldi and Waitrose. 
  • Victoria’s Secret (Spring 2025) – Web attack: e-commerce platform outage during a peak shopping period. 

These incidents underscore one clear truth: cybercrime is evolving, and no retailer, no matter its size or prestige, is immune. What is worrying is that even companies with vast resources remain extremely vulnerable. 

The Role of AI  

In many of these data breaches, AI was used by hackers to accelerate and deepen the damage. Their tactics included: 

  • Hyper-Personalised Phishing: AI-generated messages mimicked trusted communications, referencing recent purchases to trick recipients. Louis Vuitton customers received convincing fake discount offers. 
  • Credential Cracking and MFA Bypass: AI automated brute-force login attacks, while adversary-in-the-middle techniques stole session tokens to sidestep multi-factor authentication. 
  • Network Reconnaissance: Malicious bots used AI to scan retail systems, identify vulnerabilities, and map out supply chains for deeper impact. 
  • Autonomous Ransomware: Sophisticated strains like DragonForce adapted in real time to avoid detection and self-propagate through connected systems. 
  • Voice Phishing (Vishing): AI-generated voices impersonated IT staff to deceive employees into disclosing access credentials; a tactic especially potent in luxury retail. 

AI has supercharged cybercrime, making attacks faster, more targeted, and far harder to detect. With the emergence of ransomware as a service (RaaS) and dedicated leak sites (DLS), there is now a far more accessible marketplace for our data. 

How Consumers Can Protect Their Data 

While companies bear the financial burden of breaches, consumers often suffer the most, through stolen data, financial fraud, and disrupted services. Lessons for consumers include: 

  • Even luxury brands are vulnerable – don’t assume prestige equals protection. 
  • Cyberattacks are increasingly tailored based on what you buy, how often you shop, and where you live. 
  • Supply chains and vendor access are weak points; your data might be exposed even if the retailer itself isn’t directly breached. 

Whether you shop in-store or online, these simple steps can dramatically improve the security of your personal data: 

Digital Defence 

  • Use Strong, Unique Passwords: A password manager can help you avoid reuse and weak combinations. 
  • Enable Multi-Factor Authentication: Critical for accounts tied to payments or personal information. 
  • Monitor Your Financial Activity: Check bank statements and credit reports for irregularities. Set up alerts where possible. 
  • Be Phishing-Aware: Always verify communications by visiting the retailer’s official website. Don’t click suspicious links or download unexpected attachments. 
  • Don’t Save Your Payment Data: Avoid saving your payment and address details with online retailers wherever possible.  
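
To illustrate the “strong, unique passwords” advice above, here is a minimal Python sketch, using the standard library’s `secrets` module, of the kind of cryptographically secure password generation a password manager performs; the `generate_password` function and its character-class checks are illustrative assumptions, not a recommendation of any particular tool:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password using a cryptographically secure RNG."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        password = "".join(secrets.choice(alphabet) for _ in range(length))
        # Re-draw until the password contains at least one lowercase letter,
        # one uppercase letter, one digit and one symbol.
        if (any(c.islower() for c in password)
                and any(c.isupper() for c in password)
                and any(c.isdigit() for c in password)
                and any(c in string.punctuation for c in password)):
            return password

# A fresh, unique password for each account avoids the reuse problem
# exploited by credential-stuffing attacks.
print(generate_password())
```

Because each call produces an independent random string, no two accounts need ever share a password, which is precisely what defeats the credential-reuse attacks described earlier.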

Data Discipline 

  • Limit the Personal Data You Share: Don’t offer extra details to loyalty schemes or retailers unless absolutely necessary. 
  • Freeze Your Credit (If Breached): Prevent identity thieves from opening new accounts using your stolen details. 

Payment Hygiene 

  • Use Credit Cards Online: They offer better fraud protection and don’t expose your actual bank balance. In addition, you have certain buyer protections when buying on a credit card. 
  • Avoid Public Wi-Fi for Shopping: Use a VPN or shop from secure, private networks. 

The digital age has made shopping easier, but also riskier. Cybersecurity now requires a partnership between retailers and consumers. Companies must implement zero-trust architectures, AI-powered threat detection, and employee cyber-awareness training. Meanwhile, consumers should stay informed, cautious, and quick to respond when their personal data is at risk. 

According to a recent Stanford University study, human error accounted for 88% of data breaches, and a recent Accenture study found a 97% increase in cyber threats since the start of the Russia/Ukraine war.  
 
We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. 

The MoD Afghan Data Breach: Could the Information Commissioner have done more? 

On Tuesday, the High Court lifted a superinjunction that prevented scrutiny of one of the most serious personal data breaches involving a UK Government department. In February 2022, a Ministry of Defence (MoD) official mistakenly emailed a spreadsheet containing personal details of over 18,000 Afghan nationals who had applied to move to the UK under the Afghan Relocations and Assistance Policy (ARAP).  

The breach was only discovered in August 2023, when excerpts of the data appeared on Facebook. By then, the damage was done. A new resettlement scheme was set up for those on the leaked list and has seen 4,500 Afghans arrive in the UK so far. The Afghan Relocation Route has cost £400m to date, and the Government has said it is expected to cost a further £450m. Interestingly, the High Court heard in May 2024 that it could cost “several billions”. 

Shockingly, people whose details were leaked were only informed on Tuesday. A review of the incident carried out on behalf of the MoD found it was “highly unlikely” an individual would have been targeted solely because of the leaked data, which “may not have spread nearly as widely as initially feared”. On Wednesday though, the Defence Secretary said he was “unable to say for sure” whether anyone had been killed as a result of the data breach. The daughter of an Afghan translator whose details were leaked told the BBC that her whole family “panicked”.  

“No one knows where the data has been sent to – it could be sent to the Taliban, they could have their hands on it,” she said. Her grandmother, who is still in Afghanistan, is “completely vulnerable”, she added. 

This is not the first time the MoD has mishandled Afghan data. In December 2023, it was fined £350,000 for disclosing details of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. The MoD sent an email to a distribution list of Afghan nationals eligible for evacuation using the ‘To’ field, inadvertently disclosing personal information relating to 245 people. The email addresses could be seen by all recipients, and 55 people had thumbnail pictures on their email profiles. Two people ‘replied all’ to the entire list of recipients, one of them providing their location.  

ICO’s Response 

Despite the scale and sensitivity of the latest MoD data breach, the Information Commissioner’s Office (ICO) has decided not to take any regulatory action; no, not even a reprimand! In its press release, the ICO praised the MoD’s internal investigation and mitigation efforts, stating that “no further regulatory action is required at this time”. 

Compare this case to the data breach involving the Police Service of Northern Ireland (PSNI). Last year, the ICO fined the PSNI £750,000 after staff mistakenly divulged the surnames, initials and other data of 9,483 PSNI officers and staff in response to a Freedom of Information (FoI) request. The request, made via the WhatDoTheyKnow (WDTK) website, had asked the PSNI for a breakdown of all staff ranks and grades. But as well as publishing a table containing the number of people holding positions such as constable, a spreadsheet was included. The information was published on the WDTK website for more than two hours, leaving many fearing for their safety. 

In September last year it was announced that a mediation process involving the PSNI would take place to attempt to agree the amount of damages to be paid to up to 7,000 staff affected by the data breach. The final bill could be as much as £240m, according to previous reports. Compare that with the impact and cost of the latest MoD data breach. 

Other ICO enforcement actions in the past few years for security failures include: 

  • Cabinet Office (2020): Fined £500,000 for publishing the New Year Honours list online. Cause? A spreadsheet error. 
  • HIV Scotland (2021): Fined £10,000 when it sent an email to 105 people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.   
  • Mermaids (2021): Fined £25,000 for failing to implement an appropriate level of security to its internal email systems, which resulted in documents or emails containing personal data being searchable and viewable online by third parties through internet search engine results.  

In the MoD case, the ICO claims it considered the “critical need to share data urgently” and the MoD’s “steps to protect those most affected”. But urgency wasn’t the issue; it was negligence. The breach occurred during routine verification, not a crisis. Even more concerning, the ICO’s own guidance states that breaches involving unauthorised disclosure of sensitive data, especially where lives are at risk, should trigger enforcement action. 

This lack of action raises serious questions about the ICO’s independence and its willingness to challenge government departments. Even if it felt a fine was not appropriate, a report to Parliament (under Section 139(3) of the Data Protection Act 2018) would have highlighted the seriousness of the issues raised and consequently allowed MPs to scrutinise the MoD’s actions.  

This breach is a national scandal; not just for its scale, but for the lack of transparency, accountability, and regulatory action. If the UK is serious about data protection, it must demand more from its regulator. Otherwise, the next breach may be even worse and just as quietly buried. 

Yesterday, the Commons Defence Committee confirmed it would launch its own inquiry, and Dame Chi Onwurah, chair of the Commons Science, Innovation and Technology Committee, said the committee is writing to the Information Commissioner to push for an investigation. Watch this space! 

STOP PRESS: This afternoon the BBC reports that the data breach was much worse than previously thought: it contained personal details of more than 100 British officials, including those whose identities are most closely guarded – special forces and spies. Is an ICO U-turn incoming?


When AI Misses the Line: What Wimbledon 2025 Teaches Us About Deploying AI in the Workplace 

This year’s Wimbledon Tennis Championships are not just a showcase for elite athleticism but also a high-profile test of Artificial Intelligence. For the first time in the tournament’s 148-year history, all line calls across its 18 courts are made entirely by Hawk-Eye Live, an AI-assisted system that has replaced human line judges. This follows, amongst others, the semi-automated offside technology deployed in last year’s football Champions League after its success in the Qatar World Cup.  

The promise? Faster decisions, greater consistency, and reduced human error. 
The reality? Multiple malfunctions, public apologies, and growing mistrust among players and fans (not to mention losing the ‘best dressed officials’ in sport). 

What Went Wrong? 

  • System Failure Mid-Match: During a high-stakes women’s singles match between Anastasia Pavlyuchenkova and Sonay Kartal, the line-calling system was accidentally switched off for several points. No alerts were raised, and the match proceeded without accurate line judgments. Wimbledon officials later admitted human error was to blame, not the AI. 
  • Misclassification Errors: In the men’s quarter-final between Taylor Fritz and Karen Khachanov, Hawk-Eye incorrectly called a rally forehand a “fault,” apparently confusing it with a serve. Play was halted and the point was replayed, leaving fans and players confused and frustrated. 
  • User Experience Failures: Multiple players, including Emma Raducanu and Jack Draper, complained that some calls were “clearly wrong” and that the system’s announcements were too quiet to hear amid crowd noise. Some players called for the return of human line judges, citing a lack of trust in the technology.  

Lessons for AI and IG Professionals 

Wimbledon’s AI hiccup offers more than a headline; it surfaces deep issues around trust, oversight, and operational design that are relevant to any AI deployment in the workplace. Here are the key lessons: 

1. Automation ≠ Autonomy 

The Wimbledon system is not truly autonomous; it relies on human operators to activate it before each match. When staff forgot to do so, the AI didn’t intervene or alert anyone. This exposes a major pitfall: automated systems are only as reliable as their orchestration layers. 

Governance Principle: Ensure clear workflows and audit trails around when and how AI systems are initiated, paused, or overridden. Build in fail-safe triggers and status checks to prevent silent failures. 
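The principle can be sketched in code. This is a hypothetical illustration (the class and method names are ours, not Hawk-Eye’s) of a status check that fails loudly, with an audit trail, rather than letting an inactive system pass silently:

```python
# Hypothetical sketch of a fail-safe status check: the system refuses to
# produce judgments when it has not been activated, so a forgotten
# activation raises a loud error instead of a silent failure.
class LineCallSystem:
    def __init__(self):
        self.active = False
        self.audit_log = []  # supports the "audit trails" requirement

    def activate(self, operator: str) -> None:
        self.active = True
        self.audit_log.append(f"activated by {operator}")

    def judge(self, ball_position: float) -> str:
        if not self.active:
            # Fail loudly: never return a plausible-looking non-answer.
            raise RuntimeError("Line-calling system is not active")
        call = "OUT" if ball_position > 1.0 else "IN"
        self.audit_log.append(f"call: {call}")
        return call
```

A pre-match checklist would call `activate()` and verify the status before play begins; had Wimbledon’s workflow included such a gate, the Pavlyuchenkova–Kartal points could not have been played without the system running.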

2. Build in Redundancy and Exception Handling 

AI systems excel at pattern recognition in controlled environments but can fail spectacularly at edge cases. Wimbledon’s AI was likely trained on thousands of hours of ball trajectories – but it still confused a forehand rally shot with a serve under unusual conditions. 

Governance Principle: Plan for edge case management. When the AI encounters uncertainty, it should either defer to human review or trigger a fallback protocol.  
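A minimal sketch of this fallback pattern, assuming the model exposes a confidence score (the threshold value and function names here are illustrative, not drawn from any real line-calling system):

```python
# Hypothetical edge-case handling: when the model's confidence falls
# below a threshold, defer the decision to human review rather than
# emit a possibly wrong automated call.
CONFIDENCE_THRESHOLD = 0.90

def decide(label: str, confidence: float) -> str:
    """Return the automated label, or flag the case for human review."""
    if confidence < CONFIDENCE_THRESHOLD:
        return "DEFER_TO_HUMAN_REVIEW"
    return label

print(decide("fault", 0.97))  # confident: the automated call stands
print(decide("fault", 0.62))  # uncertain: escalate to a human
```

Applied to the Fritz–Khachanov incident, a low-confidence “fault” call would have been routed to an official instead of being announced.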

3. Usability is a Core Component of Accuracy 

Even when the AI was functioning correctly, players couldn’t always hear the line calls due to low audio volume. What good is a precise call if the user can’t perceive it? 

Governance Principle: Don’t separate accuracy from usability. A technically correct output must be understandable, accessible, and actionable to its end users. Invest in UI/UX design early in the AI lifecycle. 

4. Transparency Builds Trust 

Wimbledon’s initial response (vague statements and slow clarifications) only fuelled player frustration. Trust was eroded not just because of the error, but because of how it was handled. 

Governance Principle: When deploying AI, especially in high-stakes environments, build a culture of transparent accountability. Log decisions, explain anomalies, and communicate clearly when things go wrong. 

5. Hybrid Systems Are Often More Effective Than Pure AI 

While Wimbledon has fully replaced line judges with AI, there’s a strong case for a hybrid model. A combination of automated systems with empowered human oversight could preserve both accuracy and human judgment. 

Governance Principle: Consider augmented intelligence models, where AI supports rather than replaces human decision-makers. This ensures operational continuity and enables learning from both machine and human feedback. 

6. Respect Context and Culture 

Wimbledon isn’t just any tournament; it’s steeped in tradition, where human line judges are part of the spectacle. Removing them altered the tournament’s character, sparking emotional backlash from players and spectators alike. 

Governance Principle: Understand the organisational and cultural context where AI is deployed. Technology doesn’t operate in a vacuum. Change management, stakeholder engagement, and empathy are as important as algorithms. 

The problems with Wimbledon’s AI line-calling system are symptoms of incomplete design thinking. Whether you’re deploying AI in HR analytics, document classification, or customer service, the Wimbledon experience shows that trust isn’t just built on data; it’s built on reliability, clarity, and human-centred design. 

In a world increasingly mediated by automation, we must remember: AI doesn’t replace the need for governance; it raises the stakes for getting it right. And we just wish it had been around for the “Hand of God” goal!

Are you looking to enhance your career with an AI governance qualification? Our AI Governance Practitioner Certificate is designed to equip compliance professionals with the essential knowledge and skills to navigate this transformative technology while upholding the highest standards of data protection and information governance. The first course was fully booked, and we have added more dates.

The New Data (Use and Access) Act 2025 

The Data (Use and Access) Act 2025 received Royal Assent on 19th June 2025. It is important to note that the new Act will not replace current UK data protection legislation. Rather it will amend the UK GDPR as well as the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) and the Data Protection Act 2018. Most of these amendments will commence in stages, 2 to 12 months after Royal Assent. Exact dates for each measure will be set out in commencement regulations. 

The Bill was introduced into Parliament in October last year. It was trailed in the King’s Speech in July (under its old name, the “Digital Information and Smart Data Bill”), with His Majesty announcing that there would be “targeted reforms to some data laws that will maintain high standards of protection but where there is currently a lack of clarity impeding the safe development and deployment of some new technologies.” However, this statement of intent does not match the reality; many of the core provisions are a “cut and paste” of the Data Protection and Digital Information (No. 2) Bill (“DP Bill”), which was dropped by the Conservative Government in the Parliamentary “wash up” stage before last year’s snap General Election. 

Key Provisions 

Let’s examine the key provisions of the new Act.  

Smart Data: The Act retains the provisions from the DP Bill that will enable the creation of a legal framework for Smart Data. This involves companies securely sharing customer data, upon the customer’s (business or consumer) request, with authorised third-party providers (ATPs) who can enhance the customer data with broader, contextual ‘business’ data. These ATPs will provide the customer with innovative services to improve decision making and engagement in a market. Open Banking is the only current example of a regime that is comparable to a ‘Smart Data scheme’. The Act will give such schemes a statutory footing, from which they can grow and expand.  

Digital Identity Products: Just like its predecessor, the Act contains provisions aimed at establishing digital verification services including digital identity products to help people quickly and securely identify themselves when they use online services e.g. to help with moving house, pre-employment checks and buying age restricted goods and services. It is important to note that this is not the same as compulsory digital ID cards as some media outlets have reported. 

Research Provisions: The Act keeps the DP Bill’s provisions that clarify that companies can use personal data for research and development projects, as long as they follow data protection safeguards.  

Legitimate Interests: The Act retains the concept of ‘recognised legitimate interests’ under Article 6 of the UK GDPR: specific purposes for personal data processing, such as national security, emergency response, and safeguarding, for which Data Controllers will be exempt from conducting a full “Legitimate Interests Assessment” when processing personal data.  

Subject Access Requests: The Act makes it clear that Data Controllers only have to make reasonable and proportionate searches when someone asks for access to their personal data. 

Automated Decision Making: Like the DP Bill, the Act seeks to limit the right, under Article 22 of the UK GDPR, for a data subject not to be subject to automated decision making or profiling to only those cases where Special Category Data is used. Under new Article 22A, a decision would qualify as being “based solely on automated processing” if there was “no meaningful human involvement in the taking of the decision”. This could give the green light to companies to use AI techniques on personal data scraped from the internet for the purposes of pre-employment background checks. 

International Transfers: The Act maintains most of the DP Bill’s international transfer provisions. There will be a new approach to the test for adequacy applied by the UK Government to countries (and international organisations) and by Data Controllers when carrying out a Transfer Impact Assessment (TIA). The threshold for this new “data protection test” will be whether a jurisdiction offers protection that is “not materially lower” than under the UK GDPR. 

Health and Social Care Information: The Act maintains, without any changes, the provisions that establish consistent information standards for health and adult social care IT systems in England, enabling the creation of unified medical records accessible across all related services. 

PECR Changes: One of the most significant changes, copied from the DP Bill, is the increase in fines for breaches of PECR from £500,000 to UK GDPR levels, meaning organisations could face fines of up to £17.5m or 4% of global annual turnover (whichever is higher) for the most serious infringements. Other changes include allowing cookies to be used without consent for web analytics and to install automatic software updates, and extending the “soft opt-in” for electronic marketing to charities.  
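To see what “whichever is higher” means in practice, here is a quick worked example (the turnover figures are purely illustrative):

```python
# Illustrative arithmetic for the new PECR maximum fine:
# the higher of a flat £17.5m and 4% of global annual turnover.
def max_pecr_fine(global_turnover: float) -> float:
    return max(17_500_000, 0.04 * global_turnover)

print(max_pecr_fine(100_000_000))    # 4% is £4m, so the £17.5m figure applies
print(max_pecr_fine(1_000_000_000))  # 4% is £40m, which exceeds £17.5m
```

So for a smaller organisation the flat £17.5m cap bites, while for a multinational the turnover-based figure quickly becomes the higher ceiling.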

A full list of the changes to the UK data protection regime can be read on the ICO website.  

What is not in the new Act? 

Most of the controversial parts of the DP Bill have not made it into the Act. These include: 

  • Replacing the terms “manifestly unfounded” or “excessive” requests in Article 12 of the UK GDPR with “vexatious” or “excessive” requests. Explanations and examples of such requests would also have been included.  
  • Exempting all controllers and processors from the duty to maintain a ROPA, under Article 30, unless they are carrying out high risk processing activities.  
  • The “strategic priorities” mechanism, which would have allowed the Secretary of State to set binding priorities for the Information Commissioner. 
  • The requirements for the Information Commissioner to submit codes of practice to the Secretary of State for review and recommendations.  

The UK’s adequacy status under the EU GDPR now expires on 27th December following the recent announcement of a six-month extension. Whilst the EU will now commence a formal review of adequacy, nothing in the Act will jeopardise the free flow of personal data between the EU and the UK. The situation would perhaps have been different had the DP Bill made it on to the statute books.  

AI and Copyright 

Much of the delay to the Bill’s passage was caused by an issue which was not originally intended to be addressed in it: the use of copyright works to train AI. Like the monster plant in Little Shop of Horrors, AI has an insatiable appetite; for data, though, rather than food. AI applications need a constant supply of data to train (and improve) their output algorithms. This obviously concerns copyright holders such as musicians and writers, whose work may be used to train AI models to produce similar output without the former receiving any financial compensation. A number of copyright infringement lawsuits are set to hit the courts soon. Amongst them, Getty Images is suing Stability AI, accusing it of using Getty images to train its Stable Diffusion system, which can generate images from text inputs. Similar lawsuits have been launched in the US by novelists and news outlets. 

During the passage of the Bill through Parliament, there was strong disagreement between the Lords and the Commons over an amendment introduced by the crossbench peer and former film director Beeban Kidron. The amendment would have required AI developers to be transparent with copyright owners about using their material to train AI models. 400 British musicians, writers and artists, including Sir Paul McCartney, signed a letter urging the Government to adopt the amendment. They argued that failing to do so would mean them “giving away” their work to tech firms.  

In the end, Baroness Kidron dropped her amendment following its repeated rejection in the Commons. I expect this issue to rear its head again soon. The Government’s consultation on AI and copyright ended in February. Amongst other options, it proposes giving copyright holders the right to opt out of their works being used to train AI. However, the music industry believes such a measure would offer insufficient protection for copyright holders. In an interview with the BBC, Sir Elton John described the government as “absolute losers” and said he feels “incredibly betrayed” over its plans. 

Once the Government publishes its response to the copyright consultation, it will have to consider how to take the matter forward. Whether this comes in the form of a new copyright bill or an AI regulation bill, expect more parliamentary wrangling as well as celebrity interviews.  

Data protection professionals need to assess the changes to the UK data protection regime. Our half-day workshop will explore the new Act in detail, giving you an action plan for compliance.