£2.31 Million GDPR Fine for Genetic Testing Company. But will the fine be paid? 

The Information Commissioner’s Office (ICO) has fined a US genetic testing company £2.31 million under the UK GDPR following a 2023 cyber-attack. 

23andMe provides genetic testing for, amongst other things, health purposes and ancestry tracing. In 2023 a hacker carried out a credential stuffing attack on the company’s platform, exploiting reused login credentials that had been stolen in previous, unrelated data breaches. This resulted in unauthorised access to 155,592 UK residents’ personal data, potentially revealing sensitive information such as profile images, race, ethnicity, family trees and health reports. The type and amount of personal data accessed varied depending on the information included in a customer’s account. 

The investigation into 23andMe revealed serious security failings at the time of the 2023 data breach. The company failed to implement appropriate authentication and verification measures, such as mandatory multi-factor authentication, secure password protocols, or unpredictable usernames. It also failed to implement appropriate controls over access to raw genetic data and did not have effective systems in place to monitor, detect, or respond to cyber threats targeting its customers’ sensitive information. 
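One of the simplest controls against credential stuffing is screening new or reset passwords against corpora of credentials exposed in earlier breaches, since reused credentials are exactly what such attacks exploit. The sketch below is purely illustrative (the hard-coded breached-password set and the function name are our own assumptions, not anything 23andMe deployed; in practice the corpus would come from a service such as Have I Been Pwned):

```python
import hashlib

# Hypothetical set of SHA-256 hashes of passwords known from earlier,
# unrelated breaches. In a real deployment this corpus would be sourced
# from a breach-monitoring service, not hard-coded.
BREACHED_HASHES = {
    hashlib.sha256(p.encode()).hexdigest()
    for p in ("password123", "qwerty", "letmein")
}

def password_is_breached(candidate: str) -> bool:
    """Return True if the candidate password appears in the known-breach
    corpus, so the sign-up or reset flow can reject it."""
    return hashlib.sha256(candidate.encode()).hexdigest() in BREACHED_HASHES

print(password_is_breached("qwerty"))                  # True -> reject
print(password_is_breached("a-long-unique-passphrase"))  # False -> allow
```

Screening like this complements, rather than replaces, mandatory multi-factor authentication: it reduces the pool of accounts a stuffing attack can open in the first place.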

The ICO also found that 23andMe’s response to the unfolding incident was inadequate. The hacker began their credential stuffing attack in April 2023, before carrying out their first period of intense credential stuffing activity in May 2023.
In August 2023, 23andMe dismissed as a hoax a claim of data theft affecting over 10 million users, despite having conducted isolated investigations into unauthorised activity on its platform in July 2023. Another wave of credential stuffing followed in September 2023, but the company did not start a full investigation until October 2023, when a 23andMe employee discovered that the stolen data had been advertised for sale on Reddit. Only then did 23andMe confirm that a breach had occurred.  

What happens now? 

The ICO has made much of this penalty and the joint investigation conducted with the Office of the Privacy Commissioner of Canada. John Edwards, the Information Commissioner, said: 

“We carried out this investigation in collaboration with our Canadian counterparts, and it highlights the power of international cooperation in holding global companies to account. Data protection doesn’t stop at borders, and neither do we when it comes to protecting the rights of UK residents.” 

The fine comes after an ICO statement in March which said that a Notice of Intent had been issued for £4.59 million. This is an almost 50% reduction but, whatever the amount of the fine, the ICO is unlikely to see a penny.  

In April 23andMe filed for bankruptcy in the US courts. On Friday it said that it had agreed to the sale of its assets to a non-profit biotech organisation led by its co-founder and former chief executive. It said the purchase of the company would come with binding commitments to uphold existing policies and consumer protections, such as letting customers delete their accounts and genetic data and opt out of research.
A bankruptcy court is scheduled to hear the case for its approval on Wednesday. 

This case is also a good example of the extraterritorial reach of the UK GDPR. Under Article 3(2)(a) UK GDPR, although 23andMe is not established within the UK, it processes the personal data of the affected UK Data Subjects for the purposes of offering goods or services to those individuals. 

This is the third fine issued by the ICO in 2025. In April a £60,000 fine was issued to a law firm and in March an NHS IT supplier was fined £3 million. Both also followed cyber-attacks.   

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. See also our Managing Personal Data Breaches Workshop. 

The Data (Use and Access) Bill Ready for the Statute Books 

The Data (Use and Access) Bill has cleared the final hurdle in Parliament and will soon become the Data (Use and Access) Act 2025 following Royal Assent.  

The new Act will amend the UK GDPR as well as PECR and the Data Protection Act 2018. The key changes are summarised in our blog post here. Most of these are not particularly controversial and were in the Data Protection and Digital Information Bill, which failed to make it through the Parliamentary “wash up” stage when the General Election was announced last year. 

Much of the delay to the passing of the Bill was caused by amendments proposed by Baroness Kidron in the House of Lords. She wanted more protection for artists whose data is often used to train AI models, especially Generative AI. Her amendment would have required developers to be transparent with copyright owners about using their material to train AI models. 400 British musicians, writers and artists signed a letter saying that the Government’s failure to adopt the amendment would mean them “giving away” their work to tech firms. In the end, following repeated rejections of her amendment in the House of Commons during the “ping pong” stage, Baroness Kidron decided to withdraw gracefully. Expect this issue to come up again when the Government eventually brings forward AI legislation, as mentioned in the King’s Speech. 

We expect most of the substantive provisions to come into force a few months after the Act receives Royal Assent. Plenty of time for us to update the UK GDPR Handbook.

Data protection professionals need to assess the changes to the UK data protection regime. A revised UK GDPR Handbook is now available incorporating the changes made by the DUA Act.

Responding to Format Requests Under FOI 

How does a public authority deal with an FOI request where the applicant requests information in a format in which it is not readily available? Section 11 of the Freedom of Information Act 2000 states: 

“(1) Where, on making his request for information, the applicant expresses a preference for communication by any one or more of the following means, namely— 

(a)the provision to the applicant of a copy of the information in permanent form or in another form acceptable to the applicant, 

(b)the provision to the applicant of a reasonable opportunity to inspect a record containing the information, and 

(c)the provision to the applicant of a digest or summary of the information in permanent form or in another form acceptable to the applicant, 

the public authority shall so far as reasonably practicable give effect to that preference.” 

“(2) In determining for the purposes of this section whether it is reasonably practicable to communicate information by particular means, the public authority may have regard to all the circumstances, including the cost of doing so.” 

A recent appeal decision of the Upper Tribunal sheds light on the meaning of “reasonably practicable” under section 11 and in particular whether it requires information to be disclosed in the requested format up to the point where it is no longer practicable. 

In Walawalkar v (1) Information Commissioner; (2) Maritime and Coastguard Agency [2025] UKUT 105 (AAC), Mr Walawalkar made an FOI request, on behalf of Liberty Investigates, for information to the Maritime and Coastguard Agency (MCA) in the following words: 

“Please can you provide me the following under the FOI Act: 

[1] A copy of the recorded audio of all calls between people at sea in the English Channel and HM Coastguard between 00:01am on 15 November 2021 and 23:59pm on 22 November 2021…Please provide as many of these recordings as is retrievable within the cost limit. 

[2] If retrievable within the cost limit, for each audio recording disclosed in response to point 1 – please specify which HM Coastguard control room handled the distress call (eg Dover Maritime Rescue Coordination Centre). 

[3] If retrievable within the cost limit, please also provide a transcript of audio recording of all calls requested in point 1. 

[4] For each call requested in point 1, please provide the HMCG GIN incident number it relates to.” 

The MCA held 55 such recordings. It estimated that transcribing them would require more than 41 hours of staff time, which it said was not reasonably practicable. Mr Walawalkar argued that section 11 involved a ‘sliding scale’ test: the MCA should provide transcripts of at least some of the information requested, up to the point that it was no longer reasonably practicable to do so.  

The Tribunal rejected Mr Walawalkar’s argument. It ruled that section 11(1) involves an ‘all or nothing’ test, which asks whether it was reasonably practicable for the MCA to provide all of the information in the preferred means (i.e. transcripts of all the audio calls falling within the request). The Tribunal agreed with the ICO, which had applied an ‘all or nothing’ approach, as had the First-tier Tribunal. 

Judge Wright explained: 

“The ICO referred in argument to “the information” being a unitary concept throughout FOIA. I think this is a helpful perspective.  The point may be tested by considering the application of section 12 of FOIA and its costs cap. Assuming the information would otherwise be disclosable under section 1 of FOIA, section 12 of FOIA only makes sense, in terms of calibrating the cost of complying with the request for information, if the section 12 estimate is based on the cost of providing all the information requested. Were it otherwise and section 12 involved a sliding scale of compliance, estimating the cost of complying on the basis of as much of the requested information up to the “appropriate limit”, section 12 would have no useful application as it would always oblige a public authority to comply with the request in respect of as much of the information requested up to the appropriate limit. That is not a tenable reading of section 12. It has no ‘sliding scale’ language within it. Moreover, on the face of it Parliament plainly intended that section 12 would apply so as to allow a public authority to refuse the request if complying with it exceeded the appropriate limit. A sliding scale (that is, as much of the requested information as is within the appropriate limit), is not consonant with that statutory intention.  The costs estimate in section 12 is about complying with “the request” and that is a request for (all) the information of the description specified in the request.” 
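Judge Wright’s contrast between the two readings can be sketched in code. The figures below are our own assumptions for illustration only (a 24-hour appropriate limit and 45 minutes per recording); they are not taken from the judgment, which concerned practicability under section 11 rather than the section 12 cost cap itself:

```python
# Hypothetical figures: the appropriate limit expressed in staff hours,
# and the estimated time to transcribe all recordings in the request.
APPROPRIATE_LIMIT_HOURS = 24
ESTIMATED_HOURS_ALL = 41

def all_or_nothing(estimate: float, limit: float) -> bool:
    """The reading the Upper Tribunal endorsed: the authority need only
    comply if providing ALL the requested information in the preferred
    form is reasonably practicable."""
    return estimate <= limit

def sliding_scale(per_item_hours: list[float], limit: float) -> int:
    """The rejected reading: supply as many items as fit under the cap."""
    total, supplied = 0.0, 0
    for hours in per_item_hours:
        if total + hours > limit:
            break
        total += hours
        supplied += 1
    return supplied

print(all_or_nothing(ESTIMATED_HOURS_ALL, APPROPRIATE_LIMIT_HOURS))  # False: refuse
print(sliding_scale([0.75] * 55, APPROPRIATE_LIMIT_HOURS))           # 32 recordings
```

As the judgment explains, the sliding-scale reading would leave section 12 with no useful application, since an authority could always be required to supply as much as fits under the cap.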

The ruling also restated the accepted principle that format requests should only be considered if and when no exemption from disclosure applies to the requested information. In addition, it rejected an argument that the format request is relevant to whether information is held for the purposes of section 1 of FOIA; the latter is a logically prior and separate issue. 

Are you looking to acquire detailed knowledge of FOI and develop the practical skills to enable you to become a more effective FOI Officer? 

Our FOI Practitioner Certificate has been developed by FOI experts after analysing all the skills, knowledge and competencies required for the FOI Officer role. By the end of the course, you will be able to practically handle FOI requests, apply the exemptions and draft Refusal Notices. You will also be able to differentiate between FOI requests and requests under the Environmental Information Regulations. 

Why Risk Management is Essential for IG Professionals 

GDPR compliance is very much about risk management. Throughout the UK and EU GDPR, Data Controllers are required to implement protective measures corresponding to the level of risk of their personal data processing activities. Consequently, risk management is a foundational skill which all data protection and information governance professionals need to develop.  

Risk in the UK GDPR 

Key provisions of the UK GDPR which mandate a risk-based approach include: 

Article 24 Responsibility of the Controller 

“Taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation. Those measures shall be reviewed and updated where necessary.” 

Article 25 Data Protection by Design and by Default 

“Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.” 

Article 32 Security of Processing 

“Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk,…” 

Article 33 Notification of a Personal Data Breach to the Commissioner 

“In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the Commissioner, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification under this paragraph is not made within 72 hours, it shall be accompanied by reasons for the delay.” 

Article 34 Communication of a Personal Data Breach to the Data Subject 

“When the personal data breach is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall communicate the personal data breach to the data subject without undue delay.” 

Article 35 Data Protection Impact Assessments (DPIAs) 

“Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.” 

Even where the word ‘risk’ is not explicitly used, the concept underpins a number of data protection principles in the UK (and EU) GDPR. For example: 

Accountability Principle  
Data Controllers must be able to demonstrate compliance. This involves documenting risk assessments, decisions, and mitigations; all of which are key components of risk management. 

Lawfulness, Fairness, and Transparency  
Fair and transparent processing demands that Data Controllers consider the potential impacts on data subjects; essentially, assessing and managing risks to data subjects’ rights. 

Data Minimisation and Purpose Limitation 
Ensuring that only necessary data is collected and processed inherently involves evaluating what is proportionate and appropriate, which are concepts rooted in risk assessment. 

Practical Skills DPOs and IG Officers Need 

Given the prominence of risk in the GDPR, DPOs and IG professionals should cultivate the following competencies: 

  • Risk Identification: Being able to recognise threats to data confidentiality, integrity, and availability; whether technical (e.g. cyberattacks) or organisational (e.g. poor access controls). 
  • Risk Analysis: Assessing the likelihood and potential impact of risks and understanding their relevance to the rights and freedoms of individuals. 
  • Risk Evaluation and Prioritisation: Comparing estimated risks against risk tolerance and legal thresholds (e.g. what constitutes ‘high risk’ under Article 35). 
  • Mitigation Planning: Developing and implementing controls to reduce risk to an acceptable level; whether through encryption, training, anonymisation, or policy development. 
  • Ongoing Monitoring: Risk is not static. DPOs must continuously monitor changes in technology, regulation, and business practices that may affect data risk profiles. 
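One common, though by no means mandated, way of operationalising these competencies is a simple risk register scored by likelihood and impact. The entries, the 1–5 scales and the escalation threshold below are illustrative assumptions only, not an ICO-prescribed methodology:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (severe impact on data subjects)

    @property
    def score(self) -> int:
        # A basic likelihood-times-impact score for prioritisation.
        return self.likelihood * self.impact

# Hypothetical risk register entries
register = [
    Risk("Credential stuffing against customer portal", 4, 5),
    Risk("Paper records left in shared office", 2, 3),
    Risk("Unencrypted laptop loss", 3, 4),
]

HIGH_RISK_THRESHOLD = 15  # e.g. triggers a DPIA or escalation to the board

# Prioritise: highest-scoring risks first, flagged for mitigation planning.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    flag = "HIGH" if risk.score >= HIGH_RISK_THRESHOLD else "monitor"
    print(f"{risk.score:>2}  {flag:<8} {risk.name}")
```

However the scoring is done, the point is the same: documented identification, analysis and prioritisation that can be evidenced under the accountability principle.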

For data protection and IG professionals, risk management is not a ‘nice-to-have’; it is a foundational skill.  

Interested in developing your risk management skills further? Consider enrolling on our new Risk Management in IG workshop. 

Information and Records Management Practitioner Certificate: Special Discount at IRMS25 

Effective information and records management is vital for all organisations. It ensures compliance with legal requirements, mitigates risks, preserves institutional memory and facilitates efficiency. It is even more vital in an age of AI as the foundation of any AI system, especially Generative AI, is data. AI algorithms rely on vast amounts of data to learn, make predictions, and generate insights. Therefore, the accuracy, completeness, and reliability of this data are paramount.  

Act Now Training is pleased to report on the success of the Information and Records Management Practitioner Certificate which has been completed by three cohorts since its launch last year. This certificate programme meets the need of information management professionals to equip themselves with practical skills to navigate the full information and records lifecycle. It is one of the outcomes of our work to develop a comprehensive IG skills and competency framework and thereby give delegates a curriculum so they can develop themselves and their careers in a structured way. This also ensures organisations have all the relevant skills within their teams to meet the gaps and needs of the organisation. 

Course Content and Format 

Our comprehensive course syllabus has been designed by leading records management specialists. By the end of the course, delegates will gain skills ranging from legal frameworks and terminology to data auditing, retention schedules, and digital preservation.  

Scott Sammons is the principal trainer on this course. Scott is a recognised expert on records management. He was previously the Chair of the Information and Records Management Society (2016-2020) and now leads the IRMS work on accreditation. Scott said: 

“Records management is essential good business practice as well as a key component of compliance with IG legislation such as GDPR and FOI. Using practical hands-on teaching methods, we aim to inspire delegates to implement records management best practice in their workplace.” 

The course is structured over four days, approximately one day per month, and can be undertaken online or in the classroom. Each day includes engaging discussions, exercises and case studies. Upon completion, delegates must submit a practical assessment within 30 days. Personal tutor support is provided throughout the course, together with comprehensive training materials. 

Discount at the IRMS Conference 2025  

Whether you are a records manager, Freedom of Information Officer or Data Protection Officer, this practitioner level certificate will teach you the theory of records management alongside practical hands-on application. The next course starts in July. Come and visit us at the IRMS Conference for a special discount voucher. Places are limited, so please book now to avoid disappointment. 

Visit Us at the IRMS Conference 2025  

We are excited to announce that Act Now Training will be exhibiting at the IRMS Conference (“The Peaky Path to Progress”) in Birmingham next week. 

If you are attending the conference, we invite you to stop by our exhibitor stand. Here is what awaits you (in addition to the visual delight of our special Peaky Blinders themed stand!): 

Training Course Vouchers – For IRMS Delegates Only! 
We are offering exclusive conference-only discounts on our most popular training courses. Whether you’re looking to upskill in AI, data ethics, records management or FOI compliance, we’ve got a course tailored for your goals. 

Exclusive Bags 

Last year’s bags were a must have for any fashion-conscious information governance professional. This year our bags have been designed with a Peaky Blinders theme. Our way of saying thank you for being part of the IRMS community. 

Expert Advice on Training Pathways 

Not sure which training track is right for you or your team? Want to develop your expertise in AI Governance? Our friendly team will be on hand to chat about your goals and help you map out the best learning path; whether you’re just starting or aiming for advanced certification.

Let’s Talk Learning 

This year’s conference theme is all about connection, innovation, and the future of information governance – and we are here to help you be at the forefront. Come and chat with us about how our training can support your professional development, boost your team’s capability, and help your organisation stay compliant and competitive. 

We can’t wait to meet you at #IRMS25!

Article 15 GDPR and “Meaningful Information” about Automated Decision-Making: What does this mean for AI? 

Article 15 of the EU and UK GDPR not only gives Data Subjects the right to obtain their personal data from the Data Controller but also the right to receive additional information about the processing. This includes: 

 “the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.” 

A recent ruling by the European Court of Justice (ECJ) sheds light on the concept of “meaningful information” and will have implications for those deploying AI systems. The case in question, C-203/22 Dun & Bradstreet Austria GmbH, concerns an Austrian mobile telecom operator. The company refused to enter into a contract with a customer due to their poor credit score. This decision was based on an automated credit evaluation provided by a third-party credit agency. 

The customer requested access to the information held by the credit agency so that they could understand the decision. The customer was dissatisfied with the disclosed information and so took legal action to demand further clarification on the logic behind the automated decision-making process. The core issue was whether the credit agency was obligated to provide more detailed information about the automated process under Article 15(1)(h) GDPR (as quoted above). The agency argued that doing so would expose trade secrets. However, the court ruled that it must provide “meaningful information about the logic involved” as required by GDPR. 

The Enforcement Court in Austria, tasked with enforcing the ruling, referred the following questions to the ECJ: 

  1. Does “meaningful information about the logic involved” require the controller to provide a comprehensive explanation of the procedures and principles used to come to a specific decision? 
  2. In cases where the controller argues that the requested information involves third-party data protected by the GDPR or trade secrets, is the controller obliged to submit the potentially protected information to supervisory authorities or courts for review? 

Meaningful Information 

In response to the first question, the ECJ confirmed that the phrase “meaningful information about the logic involved” fundamentally refers to all relevant details concerning the automated decision-making process. This includes an explanation of the procedures and principles used to arrive at the decision. 

While the ECJ made it clear that “meaningful information” does not require the disclosure of complex algorithms, it does require a sufficiently detailed explanation of the decision-making process. It emphasised that, in line with Articles 13(2)(f) and 14(2)(g) of the GDPR, which establish transparency requirements, the information must be clear, concise, and easily understandable. Data Subjects should be able to comprehend how their personal data is being processed. The right of access enshrined in Article 15 of the GDPR allows individuals to verify the accuracy and lawfulness of the processing of their personal data, which is a crucial safeguard under Article 22(3) that governs automated decision-making and profiling. 

Trade Secrets  

On the second question, the ECJ struck a delicate balance between Data Subjects’ right to access their data and the protection of third-party rights, such as trade secrets. It reiterated that while data protection is a fundamental right, it must be weighed against intellectual property protections as outlined in Recital 63 of the GDPR. 

The ECJ said that if providing access to personal data could violate the rights of third parties, such as revealing trade secrets, the controller must assess whether it is possible to disclose the information without infringing on third party rights. In cases of conflict, the issue must be referred to the relevant supervisory authority or court to decide on an appropriate solution. 

Importantly, the ECJ ruled that no Member State can impose a blanket ban on disclosing business or trade secrets, as doing so would undermine the GDPR’s requirement for a balanced approach to competing rights. In situations where access requests are contested, controllers are required to provide relevant information to supervisory authorities or courts, enabling an informed decision based on the principle of proportionality. 

So what are the implications of this ECJ ruling for AI systems? 

While the ruling specifically focusses on the EU GDPR, it underscores the growing importance of transparency in data processing practices, especially when implementing automated decision-making processes. Organisations using AI for automated decision-making must ensure transparency by providing data subjects with clear, understandable explanations of how decisions are made, even if complex algorithms are involved. Developers must design systems that can deliver “meaningful information” about the logic behind automated outcomes, while deployers must ensure this information is communicated effectively to individuals. Transparency is also a key theme of the recently enacted EU AI Act.

Act Now recently launched the AI Governance Practitioner Certificate. This course is designed to equip compliance professionals with the essential knowledge and skills to navigate this transformative technology being implemented within their organisations while upholding the highest standards of data protection and information governance. 

Footballers’ Objections to Data Processing: Red Card or Red Herring? 

“They play for us, not for the odds, 
They’re not just names for betting gods, 
If you want stats, you best be fair — 
Cos we stand with the players, everywhere!” 

Could this become a popular chant in football stadiums? It could, if a group of football players get their way.  

In the era of data-driven sports and digital fan engagement, betting and gaming companies increasingly rely on detailed player data to power their platforms.
From setting betting odds to fuelling fantasy leagues and live-match experiences, this data is central to the user experience. The data ranges from average goals-per-game for an outfield player to height, weight and passes during a game. Some of this data may be sold to the companies by clubs whilst other data may be collected by using public sources or by attending matches.  

Back in 2021, Ibrahim Hasan was interviewed by BBC Radio 4 when football players were threatening legal action against companies for the trade in their personal data. The players, led by former Cardiff City manager Russell Slade, sought compensation for the trading of their performance data over the past six years by various companies, as well as an annual fee for any future use. We were sceptical, at the time, about the legal basis of any potential claim and its likelihood of success (blog post here).   

The GDPR does give players rights over their personal data which allow them to exercise some element of control including the right to see what data is held about them, to object to its processing and to ask for it to be deleted. Last month, Computer Weekly reported that the Global Sports Data and Technology Group, of which Russell Slade is a director, has submitted objection requests, on behalf of the players they represent, to gaming, betting and data-processing companies over the use of their data. They are citing ethical concerns with how the data distribution can affect the players’ career prospects.  

Article 21 of the UK GDPR states: 

“The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) or (f) of Article 6(1), including profiling based on those provisions.” 

Clearly one of the legal bases upon which betting and gaming companies process players’ personal data is legitimate interests (Article 6(1)(f) of the UK GDPR), and so Article 21 is engaged. However, the second paragraph of Article 21 provides a reason for the companies to refuse the objection requests: 

“The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.” 

What could be “compelling legitimate grounds for the processing”? The companies might argue that their use of player data contributes to a larger economic ecosystem that ultimately benefits all stakeholders in football, including the players themselves. Their case could rest on the idea that engaging fans through betting and interactive gaming drives up interest in football. Increased viewership, in turn, boosts broadcasting revenues, club sponsorship deals, and the market value of football competitions; benefits that indirectly lead to higher wages and endorsement deals for the players. By providing platforms that stimulate engagement, betting companies help sustain and expand the financial health of the sport, from which players also profit. 

Let’s see where this goes. If court action follows, not only will the result have a big impact on the sports data industry but it could also lead to data protection themed chants on the terraces!  

This and other GDPR developments will be discussed in detail on our upcoming GDPR Update workshop. We have a few places left on our next GDPR Practitioner Certificate course starting on 29th May. 

New AI Governance Practitioner Certificate: Dates Published 

Act Now is pleased to publish the 2025 cohort dates for our new AI Governance Practitioner Certificate.  

This course is designed to equip Information Governance professionals with the essential knowledge and skills to navigate AI deployment within their organisations. As we detailed in our previous blog “What is the role of IG Professionals in AI Governance?”, AI implementation is already here. IG professionals should be aware of how this technology works so that they can help to ensure that there is responsible deployment from an IG perspective, just as would be the case with any new technology.  

The course is run over four days and the details of the upcoming cohorts and dates are below. The first cohort in May is already fully booked. 

May: 28th May, 29th May, 18th June, 19th June (Fully Booked) 

June: 27th June, 4th July, 11th July, 18th July  

July: 29th July, 5th August, 12th August, 19th August 

September: 18th September, 25th September, 2nd October, 9th October 

October: 29th October, 5th November, 12th November, 19th November 

We are currently offering a £100 discount for the month of May (for any cohort) on the published price of this course. Please quote the code “Art100” when booking.  

What is the Role of IG Professionals in AI Governance? 

The rapid rise of AI deployment in the workplace brings a host of legal and ethical challenges. AI governance is essential to address these challenges and to ensure AI systems are transparent, accountable, and aligned with organisational values. 

AI governance requires a multidisciplinary approach involving, amongst others, IT, legal, compliance and industry specialists. IG professionals also possess a unique skill set that makes them key stakeholders in the governance process. Here’s why they should actively position themselves to play a key role in AI governance within their organisations. 

AI Governance is Fundamentally a Data Governance Issue 

At its core, AI is a data-driven technology. The fairness and reliability of AI models depend on the quality, accuracy, and management of data. If AI systems are trained on poor-quality or biased data, they can produce flawed and discriminatory outcomes. (See Amnesty International’s report into police data and algorithms.)  

IG professionals specialise in ensuring that data is accurate, well-structured, and fit for purpose. Without strong data governance, organisations risk deploying AI systems that amplify biases, make inaccurate predictions, or fail to comply with regulatory requirements. 

Regulatory and Compliance Expertise is Critical 

AI governance is increasingly being shaped by regulatory frameworks around the world. The EU AI Act, together with regulations and guidance from other jurisdictions, highlights the growing emphasis on AI accountability, transparency, and risk management. 

IG professionals have expertise in interpreting legislation (such as GDPR, PECR and DPA amongst others) which positions them to help organisations navigate the complex legal landscape surrounding AI. They can ensure that AI governance frameworks comply with data protection principles, consumer rights, and ethical AI standards, reducing the risk of legal penalties and reputational damage. 

Managing AI Risks and Ensuring Ethical AI Practices 

AI introduces new risks, including algorithmic bias, privacy violations, security vulnerabilities, and explainability challenges. Left unchecked, these risks can undermine trust in AI and expose organisations to significant operational and reputational harm. 

IG professionals excel in risk management (after all, that is what DPIAs are about). They are trained to assess and mitigate risks related to data security, data integrity, and compliance, which directly translates to AI governance. By working alongside IT and ethics teams, they can help establish clear policies, accountability structures, and risk assessment frameworks to ensure AI is deployed responsibly. 

Bridging the Gap Between IT, Legal, and Business Functions 

One of the biggest challenges in AI governance is the lack of alignment between different business functions. AI development is often led by technical teams, while compliance and risk management sit with legal and governance teams. Without effective collaboration, governance efforts can become fragmented or ineffective. 

IG professionals act as natural bridges between these groups. Their work already involves coordinating across departments to align data policies, privacy standards, and regulatory requirements. By taking an active role in AI governance, they can ensure cross-functional collaboration, helping organisations balance innovation with compliance. 

Addressing Data Privacy and Security Concerns 

AI often processes vast amounts of sensitive personal data, making privacy and security critical concerns. Organisations must ensure that AI systems comply with data protection laws, implement robust security measures, and uphold individuals’ rights over their data. 

IG and Data Governance professionals are well-versed in data privacy principles, data minimisation, encryption, and access controls. Their expertise is essential in ensuring that AI systems are designed and deployed with privacy-by-design principles, reducing the risk of data breaches and regulatory violations. 

AI Governance Should Fit Within Existing Frameworks 

Organisations already have established governance structures for data management, records retention, compliance, and security. Instead of treating AI governance as an entirely new function, it should be integrated into existing governance models. 

IG and Data Governance professionals are skilled at implementing governance frameworks, policies, and best practices. Their experience can help ensure that AI governance is scalable, sustainable, and aligned with the organisation’s broader data governance strategy. 

Proactive Involvement Prevents Being Left Behind 

If IG professionals do not step up, AI governance may be driven solely by IT, data science, or business teams. While these functions bring valuable expertise, they may overlook regulatory, ethical, and risk considerations. Fundamentally, as IG professionals, our goal is to ensure organisations are using data and any new technology responsibly. 

So we are not saying that IG and DP professionals should become the new AI overlords. But by proactively positioning themselves as key stakeholders in AI governance, IG and Data Governance professionals ensure that organisations take a holistic approach – one that balances innovation, compliance, and risk management. Waiting to be invited to the AI governance conversation risks being sidelined in decisions that will have long-term implications for data governance and organisational risk. 

Final Thoughts 

To reiterate, AI governance should not be the sole responsibility of IG and Data Governance professionals – it requires a collaborative, cross-functional approach. However, their expertise in data integrity, privacy, compliance, and risk management makes them essential players in the AI governance ecosystem. 

As organisations increasingly rely on AI-driven decision-making, IG and Data Governance professionals must ensure that these systems are accountable, transparent, and legally compliant. By stepping up now, they can shape the future of AI governance within their organisations and safeguard them from regulatory, ethical, and operational pitfalls. 

Our new six module AI Governance Practitioner Certificate will empower you to understand AI’s potential, address its challenges, and harness its power responsibly for the public benefit.