Our 23rd Birthday! Celebrate with Us and Save on Training  

This month marks 23 years of Act Now Training. We delivered our first course in 2003 (on the Data Protection Act 1998!) at the National Railway Museum in York. Fast forward to today, and we deliver over 300 training days a year on AI, GDPR, records management, surveillance law and cyber security, supporting delegates across the UK and internationally, including the Middle East.

Our success comes from more than just longevity; we are trusted by clients across every sector, giving us a unique insight into the real-world challenges of information governance. That’s why our education-first approach focuses on practical skills, measurable impact, and lasting value for your organisation. 

Anniversary Offer: To celebrate, we are giving you a £50 discount on any one-day workshop if you book by 30th September 2025. Choose from our most popular sessions, like GDPR and FOI A to Z, or explore new topics like AI and Information Governance and Risk Management in IG.

Simply quote “23rd Anniversary” on your booking form to claim your discount.

The MoD Afghan Data Breach: Could the Information Commissioner have done more? 

On Tuesday, the High Court lifted a superinjunction that prevented scrutiny of one of the most serious personal data breaches involving a UK Government department. In February 2022, a Ministry of Defence (MoD) official mistakenly emailed a spreadsheet containing personal details of over 18,000 Afghan nationals who had applied to move to the UK under the Afghan Relocations and Assistance Policy (ARAP).  

The breach was only discovered in August 2023, when excerpts of the data appeared on Facebook. By then, the damage was done. A new resettlement scheme for those on the leaked list was set up and has seen 4,500 Afghans arrive in the UK so far. The Afghan Relocation Route has cost £400m to date, and the Government has said it is expected to cost a further £450m. Interestingly, the High Court in May 2024 heard it could cost “several billions”.

Shockingly, people whose details were leaked were only informed on Tuesday. A review of the incident carried out on behalf of the MoD found it was “highly unlikely” an individual would have been targeted solely because of the leaked data, which “may not have spread nearly as widely as initially feared”. On Wednesday though, the Defence Secretary said he was “unable to say for sure” whether anyone had been killed as a result of the data breach. The daughter of an Afghan translator whose details were leaked told the BBC that her whole family “panicked”.  

“No one knows where the data has been sent to – it could be sent to the Taliban, they could have their hands on it,” she said. Her grandmother, who is still in Afghanistan, is “completely vulnerable”, she added. 

This is not the first time the MoD has mishandled Afghan data. In December 2023, it was fined £350,000  for disclosing details of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. The MoD sent an email to a distribution list of Afghan nationals eligible for evacuation using the ‘To’ field, with personal information relating to 245 people being inadvertently disclosed. The email addresses could be seen by all recipients, with 55 people having thumbnail pictures on their email profiles.  
Two people ‘replied all’ to the entire list of recipients, with one of them providing their location.  
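Both MoD incidents turned on the same mechanical mistake: addressing a distribution list via the visible ‘To’ field, so every recipient could see every other recipient. The safest pattern is not even Bcc but one message per recipient, so no full list ever exists in the headers. A minimal sketch (hypothetical addresses, message construction only):

```python
from email.message import EmailMessage

def build_individual_messages(sender, recipients, subject, body):
    """Build one message per recipient so no recipient ever sees
    another's address (safer than relying on a Bcc field)."""
    messages = []
    for addr in recipients:
        msg = EmailMessage()
        msg["From"] = sender
        msg["To"] = addr  # only this single recipient is visible
        msg["Subject"] = subject
        msg.set_content(body)
        messages.append(msg)
    return messages

# Hypothetical addresses for illustration only.
msgs = build_individual_messages(
    "caseworker@example.gov.uk",
    ["applicant.a@example.com", "applicant.b@example.com"],
    "Update on your application",
    "Please see the attached guidance.",
)
```

Each message would then be handed to the mail server separately, so a ‘reply all’ can only ever go back to the sender.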

ICO’s Response 

Despite the scale and sensitivity of the latest MoD data breach, the Information Commissioner’s Office (ICO) has decided not to take any regulatory action; no, not even a reprimand! In its press release, the ICO praised the MoD’s internal investigation and mitigation efforts, stating that “no further regulatory action is required at this time”. 

Compare this case to the data breach involving the Police Service of Northern Ireland (PSNI). Last year, the ICO fined the PSNI £750,000 after staff mistakenly divulged the surnames of 9,483 PSNI officers and staff, their initials and other data in response to a Freedom of Information (FoI) request. The request, made via the WhatDoTheyKnow.com website, had asked the PSNI for a breakdown of all staff ranks and grades. But as well as publishing a table containing the number of people holding positions such as constable, a spreadsheet was included. The information was published on the WhatDoTheyKnow website for more than two hours, leaving many fearing for their safety.

In September last year it was announced that a mediation process involving the PSNI is to take place to attempt to agree the amount of damages to be paid to up to 7,000 staff impacted by the data breach. The final bill could be as much as £240m, according to previous reports. Compare that with the impact and cost of the latest MoD data breach.

Other ICO enforcement actions in the past few years for security failures include: 

  • Cabinet Office (2020): Fined £500,000 for publishing the addresses of New Year Honours recipients online. Cause? A spreadsheet error. 
  • HIV Scotland (2021): Fined £10,000 when it sent an email to 105 people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.   
  • Mermaids (2021): Fined £25,000 for failing to implement an appropriate level of security to its internal email systems, which resulted in documents or emails containing personal data being searchable and viewable online by third parties through internet search engine results.  

In the MoD case, the ICO claims it considered the “critical need to share data urgently” and the MoD’s “steps to protect those most affected”. But urgency wasn’t the issue; it was negligence. The breach occurred during routine verification, not a crisis. Even more concerning, the ICO’s own guidance states that breaches involving unauthorised disclosure of sensitive data, especially where lives are at risk, should trigger enforcement action. 

This lack of action raises serious questions about the ICO’s independence and willingness to challenge government departments. Even if it felt a fine was not appropriate, a report to Parliament (under Section 139(3) of the Data Protection Act 2018) would have highlighted the seriousness of the issues raised and consequently allowed MPs to scrutinise the MoD’s actions.

This breach is a national scandal; not just for its scale, but for the lack of transparency, accountability, and regulatory action. If the UK is serious about data protection, it must demand more from its regulator. Otherwise, the next breach may be even worse and just as quietly buried. 

Yesterday, the Commons Defence Committee confirmed it would launch its own inquiry, and Chi Onwurah, chair of the Commons Science, Innovation and Technology Committee, said the committee is writing to the Information Commissioner pushing for an investigation. Watch this space!

STOP PRESS: This afternoon the BBC reports that the data breach was much worse than previously thought: it contained personal details of more than 100 British officials, including those whose identities are most closely guarded – special forces and spies. Is an ICO U-turn incoming?

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about cyber security.

£2.31 Million GDPR Fine for Genetic Testing Company. But will the fine be paid? 

The Information Commissioner’s Office (ICO) has fined a US genetic testing company £2.31 million under the UK GDPR following a 2023 cyber-attack. 

23andMe provides genetic testing for, amongst other things, health purposes and ancestry tracing. In 2023 a hacker carried out a credential stuffing attack on the company’s platform, exploiting reused login credentials that were stolen in previous, unrelated data breaches. This resulted in unauthorised access to 155,592 UK residents’ personal data, potentially revealing sensitive data such as profile images, race, ethnicity, family trees and health reports. The type and amount of personal data accessed varied depending on the information included in a customer’s account.

The investigation into 23andMe revealed serious security failings at the time of the 2023 data breach. The company failed to implement appropriate authentication and verification measures, such as mandatory multi-factor authentication, secure password protocols, or unpredictable usernames. It also failed to implement appropriate controls over access to raw genetic data and did not have effective systems in place to monitor, detect, or respond to cyber threats targeting its customers’ sensitive information. 
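Credential stuffing succeeds precisely because people reuse passwords across services. Alongside mandatory multi-factor authentication, a common mitigation is to reject any password already known to appear in a breach corpus. A minimal sketch of that check, assuming a locally held deny-list of SHA-1 hashes (a real deployment would instead query a service such as Pwned Passwords via its k-anonymity range API):

```python
import hashlib

# Tiny illustrative deny-list: SHA-1 hashes of known-breached passwords.
BREACHED_SHA1 = {
    hashlib.sha1(pw.encode()).hexdigest().upper()
    for pw in ("password", "123456", "qwerty")
}

def password_is_breached(password: str) -> bool:
    """Return True if the password appears in the breach deny-list."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest in BREACHED_SHA1
```

Checking hashes rather than plaintext means the deny-list itself never stores usable passwords; the k-anonymity variant goes further and sends only the first five hex characters of the hash to the lookup service.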

The ICO also found that 23andMe’s response to the unfolding incident was inadequate. The hacker began their credential stuffing attack in April 2023, before carrying out their first period of intense credential stuffing activity in May 2023.
In August 2023, a claim of data theft affecting over 10 million users was dismissed as a hoax, despite 23andMe having conducted isolated investigations into unauthorised activity on its platform in July 2023. Another wave of credential stuffing followed in September 2023, but the company did not start a full investigation until October 2023, when a 23andMe employee discovered that the stolen data had been advertised for sale on Reddit. Only then did 23andMe confirm that a breach had occurred.  

What happens now? 

The ICO has made much of this penalty and the joint investigation conducted with the Office of the Privacy Commissioner of Canada. John Edwards, the Information Commissioner, said: 

“We carried out this investigation in collaboration with our Canadian counterparts, and it highlights the power of international cooperation in holding global companies to account. Data protection doesn’t stop at borders, and neither do we when it comes to protecting the rights of UK residents.” 

The fine comes after an ICO statement in March which said that a Notice of Intent of £4.59 million had been issued. An almost 50% reduction but, whatever the amount of the fine, the ICO is unlikely to see a penny.

In April 23andMe filed for bankruptcy in the US courts. On Friday it said that it had agreed to the sale of its assets to a non-profit biotech organisation led by its
co-founder and former chief executive. It said the purchase of the company would come with binding commitments to uphold existing policies and consumer protections, such as letting customers delete their accounts, genetic data and opt out of research.
A bankruptcy court is scheduled to hear the case for its approval on Wednesday. 

This case is also a good example of the extra-territorial reach of the UK GDPR under Article 3(2)(a). Although 23andMe is not established within the UK, it processes the personal data of the affected UK Data Subjects for the purposes of offering goods or services to those individuals.

This is the third fine issued by the ICO in 2025. In April a £60,000 fine was issued to a law firm and in March an NHS IT supplier was fined £3 million. Both also followed cyber-attacks.

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about cyber security. See also our Managing Personal Data Breaches Workshop.

What is the Role of IG Professionals in AI Governance? 

The rapid rise of AI deployment in the workplace brings a host of legal and ethical challenges. AI governance is essential to address these challenges and ensure AI systems are transparent, accountable, and aligned with organisational values.

AI governance requires a multidisciplinary approach involving, amongst others, IT, legal, compliance and industry specialists. IG professionals also possess a unique skill set that makes them key stakeholders in the governance process. Here’s why they should actively position themselves to play a key role in AI governance within their organisations. 

AI Governance is Fundamentally a Data Governance Issue 

At its core, AI is a data-driven technology. The fairness and reliability of AI models depend on the quality, accuracy, and management of data. If AI systems are trained on poor-quality or biased data, they can produce flawed and discriminatory outcomes. (See Amnesty International’s report into police data and algorithms.)  

IG professionals specialise in ensuring that data is accurate, well-structured, and fit for purpose. Without strong data governance, organisations risk deploying AI systems that amplify biases, make inaccurate predictions, or fail to comply with regulatory requirements. 

Regulatory and Compliance Expertise is Critical 

AI governance is increasingly being shaped by regulatory frameworks around the world. The EU AI Act and regulations and guidance from other jurisdictions highlight the growing emphasis on AI accountability, transparency, and risk management. 

IG professionals have expertise in interpreting legislation (such as GDPR, PECR and DPA amongst others) which positions them to help organisations navigate the complex legal landscape surrounding AI. They can ensure that AI governance frameworks comply with data protection principles, consumer rights, and ethical AI standards, reducing the risk of legal penalties and reputational damage. 

Managing AI Risks and Ensuring Ethical AI Practices 

AI introduces new risks, including algorithmic bias, privacy violations, security vulnerabilities, and explainability challenges. Left unchecked, these risks can undermine trust in AI and expose organisations to significant operational and reputational harm. 

IG professionals excel in risk management (after all, that is what DPIAs are about). They are trained to assess and mitigate risks related to data security, data integrity, and compliance, which directly translates to AI governance. By working alongside IT and ethics teams, they can help establish clear policies, accountability structures, and risk assessment frameworks to ensure AI is deployed responsibly.

Bridging the Gap Between IT, Legal, and Business Functions 

One of the biggest challenges in AI governance is the lack of alignment between different business functions. AI development is often led by technical teams, while compliance and risk management sit with legal and governance teams. Without effective collaboration, governance efforts can become fragmented or ineffective. 

IG professionals act as natural bridges between these groups. Their work already involves coordinating across departments to align data policies, privacy standards, and regulatory requirements. By taking an active role in AI governance, they can ensure cross-functional collaboration, helping organisations balance innovation with compliance. 

Addressing Data Privacy and Security Concerns 

AI often processes vast amounts of sensitive personal data, making privacy and security critical concerns. Organisations must ensure that AI systems comply with data protection laws, implement robust security measures, and uphold individuals’ rights over their data. 

IG and Data Governance professionals are well-versed in data privacy principles, data minimisation, encryption, and access controls. Their expertise is essential in ensuring that AI systems are designed and deployed with privacy-by-design principles, reducing the risk of data breaches and regulatory violations. 

AI Governance Should Fit Within Existing Frameworks 

Organisations already have established governance structures for data management, records retention, compliance, and security. Instead of treating AI governance as an entirely new function, it should be integrated into existing governance models. 

IG and Data Governance professionals are skilled at implementing governance frameworks, policies, and best practices. Their experience can help ensure that AI governance is scalable, sustainable, and aligned with the organisation’s broader data governance strategy. 

Proactive Involvement Prevents Being Left Behind 

If IG professionals do not step up, AI governance may be driven solely by IT, data science, or business teams. While these functions bring valuable expertise, they may overlook regulatory, ethical, and risk considerations. Fundamentally, as IG professionals, our goal is to ensure organisations are using data and any new technology responsibly. 

So we are not saying that IG and DP professionals should become the new AI overlords. But by proactively positioning themselves as key stakeholders in AI governance, IG and Data Governance professionals ensure that organisations take a holistic approach – one that balances innovation, compliance, and risk management. Waiting to be invited to the AI governance conversation risks being sidelined in decisions that will have long-term implications for data governance and organisational risk. 

Final Thoughts 

To reiterate, AI governance should not be the sole responsibility of IG and Data Governance professionals – it requires a collaborative, cross-functional approach. However, their expertise in data integrity, privacy, compliance, and risk management makes them essential players in the AI governance ecosystem. 

As organisations increasingly rely on AI-driven decision-making, IG and Data Governance professionals must ensure that these systems are accountable, transparent, and legally compliant. By stepping up now, they can shape the future of AI governance within their organisations and safeguard them from regulatory, ethical, and operational pitfalls. 

Our new six-module AI Governance Practitioner Certificate will empower you to understand AI’s potential, address its challenges, and harness its power responsibly for the public benefit.

ICO Issues £60,000 GDPR Fine  

The Information Commissioner’s Office (ICO) has fined a Merseyside-based law firm £60,000 following a cyber-attack that led to highly sensitive personal data being published on the dark web. 

DPP Law Ltd (DPP) specialises in a number of areas of law, including crime and actions against the police. It suffered the cyber-attack in June 2022, which affected access to the firm’s IT systems for over a week. The hackers were able to move laterally across DPP’s network and take over 32GB of data. DPP only became aware of this after the National Crime Agency contacted the firm to advise that information relating to its clients had been posted on the dark web. DPP did not report the incident to the ICO until 43 days after it became aware of it.

The ICO found that DPP failed to put appropriate measures in place to ensure the security of personal data held electronically. This failure enabled the hackers to gain access to DPP’s network via an infrequently used administrator account which lacked multi-factor authentication (MFA), and to steal large volumes of data.

This is the second GDPR fine issued to a law firm. In March 2022, the ICO issued a fine of £98,000 to Tuckers Solicitors LLP. The fine followed a ransomware attack on the firm’s IT systems in August 2020. The attacker encrypted 972,191 files, of which 24,712 related to court bundles. 60 of those were exfiltrated by the attacker and released on the dark web. 

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about cyber security. See also our Managing Personal Data Breaches Workshop.

Supporting Careers in Data Protection Through Apprenticeships 

In today’s digital landscape, data protection and information governance have become critical risk areas for organisations across all sectors. With increasing regulatory demands and evolving threats, the need for skilled professionals in this field has never been greater. Recognising this growing skills gap, Damar Training, with the support of Act Now Training, launched its innovative Data Protection and Information Governance Apprenticeship programme in late 2022, quickly establishing itself as the leading provider in England.

The programme was developed through extensive consultation with employers, including members of the apprenticeship Trailblazer Group, to ensure it would be commercially attractive, impactful, and of the highest quality. This collaborative approach has led to excellent engagement from employers and individuals, with 243 apprentices starting the programme to date, making Damar the largest provider of this apprenticeship standard in England.

A Flexible, Comprehensive Learning Journey

What sets Damar’s apprenticeship apart is its thoughtfully designed modular structure, with carefully sequenced six-week blocks of learning that cater to diverse learning styles and organisational needs. The gradual layering of technical content and learning activity, designed with the assistance of Act Now Training, ensures that apprentices from both public and private sectors receive an outstanding foundation in the knowledge, skills, and behaviours required for success in data protection roles.

The delivery model combines self-directed learning through engaging online resources with regular one-to-one coaching visits and group coaching sessions.
Extended technical workshops (underpinned by Act Now’s expertise) and quarterly review meetings provide additional support, while dedicated forums allow apprentices to stay updated with the latest developments, engage with peers, and consult with coaches.

This comprehensive approach has yielded impressive results. With a retention rate of 68%, an achievement rate of 65%, and an EPA pass rate of 95% – all above national averages – the programme demonstrates exceptional quality, particularly remarkable for a relatively new offering.

Industry-Leading Expertise

A key strength of Damar’s apprenticeship is its partnership with Act Now, an
award-winning data protection consultancy. This collaboration ensures that the programme’s content remains at the cutting edge of industry developments, including emerging areas such as Artificial Intelligence regulation.

Sarah Murray, Data Protection Officer at ClearData, highlights this benefit: 

“One of the particular stand-outs for me is the workshops. With the content supported by
Act Now, who have such a good reputation in this field, the workshops really put all of the theory into real-life practice.”

Real-World Impact for Employers and Apprentices

The programme serves some of the UK’s major employers, including Heathrow Airport, National Express, the BBC, Auto Trader, Betfred, and Dunelm, alongside various NHS Trusts, universities, government departments, and local councils.

For apprentices, the transformation goes beyond technical knowledge. Many begin with only basic data protection skills and limited confidence. Through the programme, they develop not only technical expertise but also a deeper understanding of the “why” behind data protection practices and the confidence to advise others with authority.

This growth translates into tangible career progression, with 99% of apprentices experiencing positive outcomes – 53% remaining in their current roles with enhanced skills, 18% securing permanent positions, and 28% gaining promotions or additional responsibilities. Some have even become data protection officers with overall responsibility for their organisation’s data protection function.

Employers benefit from immediate practical impacts. Apprentices have improved information assurance audits at Lincoln University, created artificial intelligence policies for Norfolk and Waveney Integrated Care Board, and developed triage request processes for data protection requirements at The Christie NHS Foundation Trust.

Stacey Lawrence, Data Protection Manager at Manchester Airport, emphasises this value: 

“The impact that both apprentices have brought to Manchester Airport has been huge. They work on the front line, to manage all enquiries, data protection breaches, and individual rights requests, and without them we simply wouldn’t be able to do the really sterling work that we do every day.”

A Future-Focused Approach

Damar continues to evolve the programme based on feedback from coaches, apprentices, and employers. Recent improvements include enhanced EPA preparation sessions, now embedded into group coaching. The company maintains close ties with the trailblazer group and leverages Act Now’s expertise to stay ahead of legislative developments.

With another 22 apprentices due to commence in April, the programme’s growth trajectory remains strong. Many employers, including Manchester Airport Group and Nottingham University Hospitals, are returning for their second or third data protection apprentice – perhaps the strongest testament to the programme’s value.

For organisations seeking to strengthen their data protection capabilities and individuals looking to build rewarding careers in this critical field, Damar Training’s Data Protection and Information Governance Apprenticeship offers a proven pathway to success.

If you would like to learn more about the DP and IG Apprenticeship, please get in touch.

Transport for London Cyber Attack 

Transport for London (TfL) is currently dealing with a cyber attack that has targeted its computer systems. Sources within TfL have revealed that staff have been encouraged to work from home where possible, as the attack primarily affects the transport provider’s back-office systems at its corporate headquarters. TfL is collaborating closely with the National Crime Agency and the National Cyber Security Centre to respond to the incident. 

Shashi Verma, TfL’s Chief Technology Officer, said: 

“We have implemented several measures to address an ongoing cybersecurity incident within our internal systems. The security of our systems and customer data is of utmost importance, and we are continuously assessing the situation throughout this incident.”  

Mr Verma emphasised that, although a complete assessment is still underway, there is no current evidence of customer data being compromised. If it turns out that any personal data has been compromised, whether employee or customer data, TfL will of course need to consider reporting the matter to the Information Commissioner’s Office (ICO) as a personal data breach under Article 33 of the UK GDPR. Failure to do so could lead to TfL, as a statutory body, being fined up to £8.7 million. If the ICO investigates and finds a breach of the DP Principles (e.g. security), this could rise to £17.5 million.
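Those two figures come from the UK GDPR’s tiered penalty caps: the “standard maximum” (the higher of £8.7 million or 2% of worldwide annual turnover) applies to failures such as not reporting under Article 33, while the “higher maximum” (the higher of £17.5 million or 4%) applies to breaches of the principles. The calculation can be sketched as follows (the £1bn turnover figure is purely illustrative):

```python
def uk_gdpr_max_fine(turnover_gbp: float, higher_tier: bool) -> float:
    """UK GDPR maximum penalty: the greater of the fixed cap and the
    turnover-based cap for the relevant tier (Article 83 / DPA 2018)."""
    fixed_cap = 17_500_000 if higher_tier else 8_700_000
    turnover_pct = 0.04 if higher_tier else 0.02
    return max(fixed_cap, turnover_pct * turnover_gbp)
```

A statutory body with no commercial turnover is bound by the fixed caps, which is why £8.7m and £17.5m are the relevant ceilings for TfL; a multinational with, say, £1bn turnover would instead face the 4% cap of £40m at the higher tier.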

Back in the day, major cyber incidents involving personal data were sure to be the subject of an ICO fine. In 2020, British Airways and Marriott International were fined £20 million and £18.4 million respectively for breaches dating back to 2018. More recently the ICO has issued more reprimands, in line with its policy on public sector enforcement. It recently issued a reprimand to the Electoral Commission following the discovery that unspecified “hostile actors” had gained access to copies of the electoral registers from August 2021. On 26th June 2024, the ICO announced that it will review the two-year trial before making a decision on the public sector approach in the autumn.

This is not the first cyber attack on a major public service provider in the capital. Last month the ICO announced that it had issued a GDPR Notice of Intent of £6.09 million to NHS IT supplier Advanced. This comes after its finding that the company failed to adequately protect the personal data of 82,946 individuals, in breach of Article 32 of the UK GDPR. As a key IT and software provider for the NHS and other healthcare organisations across the country, Advanced often holds the role of Data Processor for many of its clients. The breach in question occurred during a ransomware attack in August 2022. Hackers exploited a vulnerability through a customer account that lacked multi-factor authentication, gaining access to multiple health and care systems operated by Advanced. The compromised data included phone numbers, medical records, and even details on how to access the homes of 890 individuals receiving at-home care.

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about cyber security. See also our Managing Personal Data Breaches Workshop.

Stolen NHS Patient Data Published on Dark Web

NHS England has now confirmed its patient data, managed by blood test management organisation Synnovis, was stolen in a ransomware attack on 3rd June. According to the BBC some of that data has been published on the dark web by the hackers. 

On 4th June 2024, the Independent reported that two major London hospital trusts had to cancel all non-emergency operations and blood tests due to a significant cyber attack. Both King’s College Hospital NHS Foundation Trust and Guy’s and St Thomas’ NHS Foundation Trust have seen their pathology systems compromised by malware.

Synnovis, the service provider responsible for blood tests, swabs, bowel tests, and other critical services for these hospitals, was targeted in this attack. The impact was widespread, affecting NHS patients across six London boroughs. 

It now transpires that Qilin, a Russian cyber-criminal group, shared almost 400GB of private information on their darknet site on Thursday night. A sample of the stolen data seen by the BBC includes patient names, dates of birth, NHS numbers and descriptions of blood tests. NHS England said in a statement that there is “no evidence” that test results have been published, but that “investigations are ongoing”.

The Information Commissioner’s Office said in a statement:

“While we are continuing to make enquiries into this matter, we recognise the sensitivity of some of the information in question and the worry this may have caused.

“We would urge anyone concerned about how their data has been handled to check our website for advice and support, as well as visiting NHS England’s website.”

We have two workshops coming up in September (Introduction to Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. See also our Managing Personal Data Breaches Workshop.

Stolen NHS Data Published on Dark Web

A large volume of NHS data has been published by a ransomware group on the dark web. This follows the recent cyber attack on NHS Dumfries and Galloway, when cyber criminals were able to access a significant amount of data including patient and
staff-identifiable information. Data relating to a small number of patients was released in March, and the cyber criminals had threatened that more would follow.

Reacting to the latest publication of data, NHS Dumfries and Galloway Chief Executive Julie White said: “This is an utterly abhorrent criminal act by cyber criminals who had threatened to release more data.

“We should not be surprised at this outcome, as this is in line with the way these criminal groups operate.

“Work is beginning to take place with partner agencies to assess the data which has been published. This very much remains a live criminal matter, and we are continuing to work with national agencies including Police Scotland, the National Cyber Security Centre and the Scottish Government.”

Mrs White added: “NHS Dumfries and Galloway is conscious that this may cause increased anxiety and concern for patients and staff, with a telephone helpline sharing the information hosted at our website available from tomorrow.

“Data accessed by the cyber criminals has now been published onto the
dark web – which is not readily accessible to most people.”

“Recognising that this is a live criminal matter, we continue to follow the very clear guidance being provided to us by national law enforcement agencies.”

NHS Dumfries and Galloway advised people to be alert for any attempts to access their work and personal data. It has also set up a helpline for anyone concerned about the attack and is working with police and other agencies as investigations continue.

In December last year, NHS Fife was formally reprimanded by the Information Commissioner’s Office (ICO) following an incident where an unauthorised individual accessed sensitive patient information.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. 

MOD Payroll Data Hacked

The government has raised concerns about a cyber attack on an armed forces payroll system, with indications pointing towards China as the suspected perpetrator. Defence Secretary Grant Shapps is set to address Members of Parliament today, although he is not expected to directly attribute blame to any specific party.
Instead, he is likely to emphasise the threat posed by cyber espionage activities conducted by hostile states.

The affected system, utilised by the Ministry of Defence (MoD), contains sensitive information such as names and bank details of armed forces personnel, with a few instances where personal addresses may also be included. Managed by an external contractor, the breach came to light in recent days, prompting government action, although there’s no evidence suggesting data was actually extracted from the system.

The investigation into the breach is still in its early stages and attributing responsibility can be a complex and time-consuming process. While official accusations may not be made immediately, suspicions are reportedly pointing towards China, given its history of targeting similar datasets.

Those impacted by the breach will receive communication from the government regarding the incident, with a focus on addressing potential fraud risks rather than immediate personal safety concerns.

At the time of writing it is not clear if the MoD has reported the data breach to the ICO as required by the UK GDPR. In December 2023, the MoD was fined £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. 

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security.