Children’s Image Hosting Platform Fined For Privacy Failures

Last week the Information Commissioner’s Office (ICO) issued its first UK GDPR fine of 2026. MediaLab.AI, Inc. (MediaLab), owner of image sharing and hosting platform Imgur, received a Monetary Penalty Notice of £247,590 for processing children’s personal data in ways that breached the UK GDPR.     

Safeguarding children’s privacy is a key enforcement priority for the ICO. 
In April 2023, it issued a £12.7 million fine to TikTok for a number of breaches of the UK GDPR, including failing to use children’s personal data lawfully. The following year, the ICO launched its Children’s code strategy to look closely at social media and video sharing platforms. 
In December it published a progress report on the strategy, reporting good progress and announcing a ‘proactive supervision programme’ to drive improvements in the industry. Perhaps this latest fine is part of that programme.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering “information society services” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states: 

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.” 

Imgur’s terms of use did state that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform. 

In setting the £247,590 penalty amount, the ICO took into consideration the number of children affected by this breach, the degree of potential harm caused, the duration of the contraventions, and the company’s global turnover. It also considered MediaLab’s acceptance of the provisional findings set out in the Notice of Intent issued in September 2025 and its commitment to address the infringements if access to the Imgur platform in the UK is restored in the future. If MediaLab resumes processing the personal data of children in the UK (currently the Imgur site is not available in the UK) without implementing the measures it has committed to, the ICO may take further regulatory action. 

We are waiting for the Monetary Penalty Notice to be published. 
The ICO says it is still considering the redaction of personal and commercially confidential or sensitive information.  

This fine shows that the ICO’s spotlight is firmly on those processing children’s data. The Data (Use and Access) Act 2025, the key provisions of which came into force last Thursday, explicitly requires those who provide an online service that is likely to be used by children to take their needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on developments in GDPR, AI, cyber security and FOI.

This and other developments relating to children’s data will be covered in tomorrow’s online workshop, Working with Children’s Data. The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw.

New Guardians of Data Podcast: In Conversation with Jon Baines 

Act Now is pleased to bring you the first episode of a new podcast: Guardians of Data. This is a show where we explore the world of information law and information governance – from privacy and AI to cybersecurity and freedom of information. In each episode we will be speaking with experts and practitioners to unpack the big issues shaping the IG profession.

In information governance, there’s no substitute for learning from those who have walked the path before us. Experienced IG leaders bring a wealth of knowledge from years at the frontline of data protection and information rights – navigating challenges, overcoming obstacles, and shaping best practice along the way. By listening to their stories, we can all grow in confidence and prepare for the IG challenges of tomorrow. 

In the first episode, we are joined by one such IG leader: Jon Baines is a Senior Data Protection Specialist at Mishcon de Reya LLP, where he advises on complex data protection and FOI matters. Jon isn’t a lawyer in the traditional sense, yet he is listed in Legal 500 as a “Rising Star” in the Data Protection, Privacy and Cybersecurity category. Jon is the long-standing chair of the National Association of Data Protection and Freedom of Information Officers (NADPO). He is regularly sought for comment by specialist and national media and writes extensively on data protection matters. 

In our conversation, Jon shares his journey into IG, his advice for both new starters and seasoned professionals, and his perspective on the future of the profession. 

Listen via the player below, or on your preferred podcast app.
Available on Apple Podcasts, Spotify, and all major podcast platforms.

Data (Use and Access) Act: Key Data Provisions In Force on Thursday

The Data (Use and Access) Act 2025 (Commencement No. 6 and Transitional and Saving Provisions) Regulations 2026 were made on 29th January 2026. They bring into force most of the amendments to the UK GDPR, PECR and the DPA 2018 made by the Data (Use and Access) Act 2025 (DUA Act). 

The amendments coming into force on Thursday (5th February 2026), amongst others, cover: 

  • New ‘Recognised legitimate interests’  
  • When time starts for dealing with subject access requests 
  • Automated Decision Making
  • Information to be provided to data subjects 
  • Safeguards for processing for research etc. purposes 
  • International Data Transfers 
  • PECR and marketing 

You can read a summary of the amendments here.

DUA Act Workshop in Birmingham (Thursday 5th February 2026)

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half-day workshop, which is running online and in Birmingham.

Revised GDPR Handbook   

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.    

Do Tennis Players Have a Right to Privacy?

John McEnroe is remembered for his on-court outbursts almost as much as for his exquisite shot-making. “You cannot be serious!” is an instantly recognisable sporting catchphrase. When McEnroe was at the height of his career in the 1980s, tennis players’ behaviour was scrutinised almost exclusively through on-court broadcast cameras. What happened off court largely remained unseen. 

Today, tennis, alongside other elite sports, is an environment of continuous monitoring; players are filmed arriving, warming up, competing and exiting. Visibility is a structural feature of the modern sports industry, justified for enhancing fan engagement and serving security, integrity and officiating purposes. But where should the balance lie when such footage reveals players’ emotional states – be it anger, distress or vulnerability? 

This question came up this week when a tennis player, Coco Gauff, called for greater privacy after footage emerged of her smashing her racquet following her Australian Open quarter-final defeat. Crucially, the incident did not occur on court. Gauff was filmed in the players’ area by behind-the-scenes cameras, with the footage later broadcast on television and circulated widely on social media. Gauff said she had made a conscious effort to suppress her emotions until she believed she was away from public view, referencing a similar incident at the 2023 US Open when Aryna Sabalenka was filmed smashing her racquet after losing the final. Since 2019, the Australian Open has shown footage from the players’ zone beneath the Rod Laver Arena, including the gym, warm-up areas and corridors leading from locker rooms. Camera access in these spaces is more restricted at the other Grand Slams.  

Gauff is not alone in raising concerns about behind-the-scenes cameras. Six-time major champion Iga Świątek said this week players are being watched “like animals in the zoo” in Melbourne. Semi-finalist Jessica Pegula described the constant filming as an “invasion of privacy”, adding that players feel “under a microscope constantly”. The tournament organiser, Tennis Australia, responded by emphasising fan engagement, saying the cameras help create a “deeper connection” between players and audiences while insisting that player comfort and privacy remain a priority. 

From a legal perspective, this issue is not merely a matter of optics. Under modern data-protection regimes such as the GDPR and the Australian Privacy Act, video footage of identifiable athletes constitutes personal data. Where that footage reveals emotional states it becomes particularly sensitive. Organisers must therefore be able to justify not only collecting such footage, but retaining, broadcasting and amplifying it. That justification is relatively straightforward during live play, where filming is integral to the sport itself. It becomes much harder once the match has ended. Filming in player tunnels, medical areas or immediately after defeat may be defensible for security or safety reasons. But the retention and circulation of emotionally charged moments for entertainment value sits on far shakier legal ground.  

Players may agree to extensive filming as a condition of participation, but that agreement does not extinguish their broader privacy rights, particularly where footage is used in a way that is disproportionate, stigmatising or disconnected from its original purpose. This tension is becoming harder to ignore as governing bodies simultaneously emphasise mental health and player welfare while permitting practices that expose athletes’ most vulnerable moments to global audiences. 


This and other data protection developments will be discussed in detail at our forthcoming GDPR Update workshop. 

Who Guards Our Data? Responsibility, Trust, and the Reality of Data Protection 

Data protection is often framed as a question of compliance. Regulations, policies, and frameworks dominate much of the discussion. 

In practice, however, the most important questions are about responsibility, trust, and judgement. 

Every organisation that collects or uses personal data is, in effect, a custodian of that information. With that role comes an expectation: that personal data will be handled carefully, used appropriately, and respected as something that belongs to people, not systems. Meeting those expectations is rarely straightforward. 

Day-to-day data protection decisions are often made under pressure. They involve trade-offs, uncertainty, and situations where the law does not provide a simple or immediate answer. Legislation defines the boundaries, but it does not resolve every ethical or operational question organisations face. 

This is where many of the real challenges of data protection sit, in the grey areas between what is permitted and what is appropriate. 

Guardians of Data was created to explore this space. The podcast brings together people working in privacy and information governance to talk openly about the realities of responsible data use. Rather than focusing on theory or compliance checklists, the conversations centre on how decisions are made in real organisations, and how trust is maintained when handling personal data. 

Each episode is short and focused, examining judgement calls, ethical considerations, and the expectations placed on organisations entrusted with personal data. The aim is not to provide definitive answers, but to encourage thoughtful discussion about what good data stewardship looks like in practice. 

Guardians of Data is intended as a space for reflection and conversation for anyone navigating the responsibilities that come with using personal data in today’s digital environment.

Click below to listen to the podcasts.

Guardians of Data Podcast: Episode 5 

Act Now is pleased to bring you episode 5 of the Guardians of Data podcast.   In information governance, there is no substitute for learning from those who have walked the path before us. Experienced IG leaders bring a wealth of knowledge from years at the frontline of data protection and information rights – navigating challenges, overcoming obstacles and shaping best practice along the way. By sharing…

Guardians of Data Podcast: Episode 4

Act Now is pleased to bring you episode 4 of the Guardians of Data podcast. This is a show where we explore the world of information law and information governance; from privacy and AI to cybersecurity and freedom of information.   The topic of this episode is cyber security. Every week we read about organisations being hacked, held to ransom or their data being stolen. The BBC recently discovered,…

Guardians of Data Podcast: Episode 3

Act Now is pleased to bring you episode 3 of the Guardians of Data podcast. This is a show where we explore the world of information law and information governance – from privacy and AI to cybersecurity and freedom of information.   In the past few weeks, we have had a stark reminder of why transparency in public life is a democratic necessity. The US Government’s release of millions…


Password Manager Provider Fined £1.2m for GDPR Data Breach 

On 20th November 2025, the Information Commissioner’s Office (ICO) fined password manager provider, LastPass UK Ltd, £1.2 million following a 2022 data breach that compromised the personal data of up to 1.6 million UK users. 

Two security incidents occurred in August 2022, when a hacker gained access first to the corporate laptop of an employee based in Europe, and then to a US-based employee’s personal laptop, on which the hacker implanted malware that captured the employee’s master password. The combined detail from both incidents enabled the hacker to access LastPass’ backup database and take personal data including customer names, emails, phone numbers, and stored website URLs.  

For a good analysis of what went wrong at LastPass and how to avoid such incidents, please read this blog. This is the seventh GDPR fine issued by the ICO in 2025; all have been in relation to cyber security incidents. In October, professional and outsourcing services company Capita received a £14 million fine following a cyber-attack which saw hackers gain access to 6.6 million people’s personal data, from pension and staff records to the details of customers of organisations Capita supports. In March an NHS IT supplier was fined £3 million, in April a £60,000 fine was issued to a law firm, and in June 23andMe, a US genetic testing company, was fined £2.31 million.  

The ICO has urged organisations to ensure internal security policies explicitly consider and address data breach risks. Where risks are identified, access should be restricted to specific user groups. The ICO website is a rich source of information detailing ways to improve practices, including Working from home – security checklist for employers, Data security guidance and Device security guidance.

Cyber Security Training 

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. See also our Managing Personal Data Breaches Workshop. 

Revised GDPR Handbook   

The data protection landscape continues to evolve. With the passing of the Data (Use and Access) Act 2025, data protection practitioners need to ensure their materials reflect the latest changes to the UK GDPR, Data Protection Act 2018, and PECR.   

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.    

DUA Act Workshop in Birmingham 

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop which is running online and in Birmingham on 5th February 2026. 

Post Office Reprimand Following Horizon Data Breach 

You would think that the Post Office had learnt its lessons from the Horizon IT Scandal, and that it would of course have taken extra care to ensure that the victims of the UK’s most widespread miscarriage of justice were not further harmed by its handling of the aftermath. Not so, judging by the Information Commissioner’s Office (ICO) announcement on Tuesday.  

The ICO has issued a reprimand to Post Office Limited following an ‘entirely preventable’ data breach which resulted in the unauthorised disclosure of personal data belonging to hundreds of postmasters who were the victims of the Horizon IT scandal.  The breach occurred when the Post Office’s communications team mistakenly published an unredacted version of a legal settlement document on its corporate website. The document contained the names, home addresses and postmaster status of 502 people who were part of group litigation against the organisation. The document remained publicly accessible for almost two months in 2024, before being removed following notification from an external law firm. 

During its investigation, the ICO found that the Post Office failed to implement appropriate technical and organisational measures to protect people’s personal data. There was a lack of documented policies or quality assurance processes for publishing documents on the Post Office website, as well as insufficient staff training, with no specific guidance on information sensitivity or publishing practices.  

In the ‘good old days’ such a data breach would have attracted a substantial fine, especially considering the impact on the victims described by their lawyers (‘the shock and anxiety of this incident cannot help but compound all of the adverse harms suffered by our clients as a result of the wider Horizon scandal’). Remember when the ICO fined the Cabinet Office £500,000 for disclosing the postal addresses of the 2020 New Year Honours recipients online? 

But we are in a new age of GDPR ‘enforcement’! The ICO says it had initially considered imposing a fine of up to £1.094 million on Post Office Limited. However, it did not consider that the data protection infringements identified reached the threshold of ‘egregious’ under its public sector approach, and a reprimand was issued instead. This approach, which was recently extended after a two-year trial, ‘prioritises early engagement and other enforcement tools such as warnings, reprimands, and enforcement notices, while issuing fines for only the most egregious breaches in the public sector’, according to the ICO. Not everyone agrees. The law firm Handley Gill has just published an analysis of the ICO’s public sector approach trial and the new version of it, essentially concluding that reprimands unaccompanied by enforcement notices won’t achieve the stated objective of driving up data protection standards in the public sector. 

The ICO highlights the following key lessons from this reprimand: 

  • Establish clear publication protocols: Sensitive documents should go through a formal review and approval process before being published online. A multi-step sign-off process can help prevent errors. 
  • Understand the data you handle: Every team, especially those handling public-facing content, must be trained to recognise personal information and assess its sensitivity in context. This includes understanding the reputational and emotional impact of disclosure. 
  • Centralise and classify documents: Use secure, shared repositories with clear access controls and classification labels. Avoid reliance on personal storage systems such as OneDrive and Google Drive. 
  • Define roles and responsibilities: Ensure that everyone involved in publishing content understands their role and the checks required before publication. 
  • Tailor training to the task: General data protection training is not enough. Teams need specific guidance on publishing protocols, data classification, and risk awareness.  

This and other data protection developments will be discussed in detail at our forthcoming GDPR Update workshop. The new (2nd) edition of the UK GDPR Handbook has been published. It contains all the changes made by the Data (Use and Access) Act 2025. 

ICO Public Sector Enforcement Policy to Continue

Last month, the Information Commissioner’s Office (ICO) announced that it will continue its controversial approach to enforcement of the UK GDPR against public sector organisations.   

A trial of the approach was launched in June 2022, in an open letter to public authorities from John Edwards. In the letter Mr Edwards indicated that greater use would be made of the ICO’s wider powers, including warnings, reprimands and enforcement notices, with fines only issued in the most serious cases. This approach has seen much criticism levelled at the ICO. Opponents say that it reduces the importance of data protection and gives special treatment to the public sector.  

One example of the approach is the ICO’s action (or lack of it) in the Ministry of Defence’s Afghan data breach. This involved an MoD official mistakenly emailing a spreadsheet containing personal details of over 18,000 Afghan nationals who had applied to move to the UK under the Afghan Relocations and Assistance Policy. The breach was only discovered in August 2023, when excerpts of the data appeared on Facebook. By then, the damage was done. A new resettlement scheme for those on the leaked list was set up and has seen 4,500 Afghans arrive in the UK so far. The Afghan Relocation Route has cost £400m to date, and the Government has said it is expected to cost a further £450m. Despite the scale and sensitivity of the breach, the ICO decided not to take any regulatory action; not even a reprimand! In its press release, the ICO praised the MoD’s internal investigation and mitigation efforts, stating that “no further regulatory action is required at this time”.  

Following a review last year, and despite strong criticism of its enforcement track record, the ICO has now announced that it will continue its public sector enforcement approach. In his blog post, John Edwards said: 

“Fines in the public sector, particularly in local government, risk punishing the same people harmed by a breach by reducing budgets for vital services. They still have their place in some cases, but so do other enforcement tools.  

The review of our public sector approach trial reaffirmed that reprimands drive change and publishing them creates strong reputational incentives for compliance, while also offering other organisations valuable lessons from the mistakes of others… 

Focusing on a proactive approach of working with organisations to identify risks and implement improvements can influence sustainable change, protect public trust, and ensure taxpayer money is invested in prevention rather than punishment. The net benefit of this approach is higher data protection standards and faster remediation, backed by sanctions when necessary.” 

Following a consultation earlier this year, the ICO has also published a clearer definition of organisations in scope and the circumstances under which a fine may be issued.  

STOP PRESS: The law firm Handley Gill has just published an analysis of the ICO’s Public Sector Approach trial and the new version of it, essentially concluding that reprimands unaccompanied by enforcement notices won’t achieve the stated objective of driving up data protection standards in the public sector.

Revised GDPR Handbook  

The data protection landscape continues to evolve. With the Data (Use and Access) Act 2025 now in force, practitioners need to ensure their materials reflect the latest changes to the UK GDPR, Data Protection Act 2018, and PECR.  

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.   

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop. 

Proposed Changes to the EU GDPR: Could we see more changes to the UK GDPR?

Yesterday the European Commission published its long-awaited Digital Omnibus Regulation Proposal and Digital Omnibus on AI Regulation Proposal. If approved, these proposals will mean significant changes to the EU GDPR and other EU legislation and may even encourage the UK to further amend the UK GDPR. 

The aim of the “Digital Omnibus” package is to ease administrative burdens for businesses across areas like privacy, cybersecurity and artificial intelligence. Although the EU GDPR is considered balanced and fit for purpose, “targeted changes” are proposed to address concerns, particularly from smaller companies. These include:

  • Clarification of Definitions: The definition of “personal data” is clarified. Information is not considered personal to a company if it does not possess means “reasonably likely” to be used to identify an individual.
  • Processing for AI Training: It is clarified that the processing of personal data for the development and training of AI systems can constitute a “legitimate interest” under certain conditions.
  • Simplified Reporting of Data Breaches: The reporting obligation to supervisory authorities is aligned with the threshold for notifying data subjects. A report is only required if there is a “high risk” to the rights and freedoms of natural persons. The deadline for reporting is extended to 96 hours.
  • Harmonisation of Data Protection Impact Assessments (DPIAs): National lists of processing operations requiring a DPIA (or not) are to be replaced by unified EU-wide lists to promote harmonisation.
  • Scientific Research: The conditions for data processing for scientific research purposes are clarified by defining “scientific research” and clarifying that this constitutes a legitimate interest.

The EU AI Act also faces a number of amendments, including simplifications for small and medium-sized enterprises and small mid-cap companies in the form of pared-back technical documentation requirements. Other measures involve sandboxes for real-world testing and to “reinforce the AI Office’s powers and centralise oversight of AI systems built on general-purpose AI models, reducing governance fragmentation”.

Both omnibus packages now have a long road ahead as they enter trilogue negotiations with the European Parliament and the Council of the European Union. It is expected to take at least several months until negotiations are finalised. 

Impact on the UK

The UK has already enacted its own package of amendments to the UK GDPR in the form of the Data (Use and Access) Act 2025, which received Royal Assent on 19th June 2025. The amendments are quite modest, even before comparing them to the EU proposals above. 

A bolder list of amendments was contained in the Data Protection and Digital Information Bill published in 2022 by the Conservative Government. This included proposals to amend the definition of personal data and to replace Data Protection Officers with Senior Responsible Individuals. The bill was later replaced by a diluted bill of the same name (the No. 2 Bill), only for that to be dropped in the Parliamentary “wash-up” stage before the last General Election.

Could the EU reforms (if enacted) lead to the UK making more fundamental changes to the UK GDPR? We doubt it. The Labour Government has more pressing priorities and, with the passing of the DUA Act, it can say it has “done GDPR reform”. If we get a change in Government, then Reform and the Conservatives might target the UK GDPR as a way of reining in “pesky human rights laws”. 

Data protection professionals need to assess the changes to the UK data protection regime made by the DUA Act. Our half-day workshop will explore the Act in detail, giving you an action plan for compliance. A revised UK GDPR Handbook is now available incorporating the changes made by the DUA Act. 

New Guidance on AI Risk Management

The development, procurement and deployment of AI systems involving the processing of personal data raises significant risks to data subjects’ fundamental rights and freedoms, including but not limited to privacy and data protection. The principle of accountability enshrined in the UK GDPR and the EU GDPR requires Data Controllers to identify and mitigate these risks, as well as to demonstrate how they did so. This is especially important for AI systems that are the product of intricate supply chains often involving multiple actors processing personal data in different capacities.

The European Data Protection Supervisor (EDPS) has just released an important new guidance document to help organisations conduct data protection risk assessments when developing, procuring, or deploying AI systems.  It focuses on the risk of non-compliance with certain data protection principles for which the mitigation strategies that controllers must implement can be technical in nature – namely fairness, accuracy, data minimisation, security and data subjects’ rights. 

Key sections of the document address:

  • the risk management methodology according to ISO 31000:2018
  • the typical development lifecycle of AI systems as well as the different steps involved in their procurement 
  • the notions of interpretability and explainability 
  • an analytical framework for identifying and treating risks that may arise in AI systems, structured according to the data protection principles potentially affected. 

The EDPS has issued this guidance in his role as the data protection supervisory authority for EU institutions. However, it is a very useful document for any organisation that is deploying AI and requires guidance on how to systematically assess the risks from a data protection perspective. 

Our AI Governance Practitioner Certificate course is designed to equip Information Governance professionals with the essential knowledge and skills to manage the risks of AI deployment within their organisations. This year 50 delegates, from a variety of backgrounds, have successfully completed the course, giving great feedback.

The first course of 2026 starts on 8th January. Places are limited so book early to avoid disappointment. If you require an introduction to AI and information governance, please consider booking on our one-day workshop.