New AI Governance Practitioner Certificate: Dates Published 

Act Now is pleased to publish the 2025 cohort dates for our new AI Governance Practitioner Certificate.  

This course is designed to equip Information Governance professionals with the essential knowledge and skills to navigate AI deployment within their organisations. As we detailed in our previous blog “What is the role of IG Professionals in AI Governance?”, AI implementation is already here. IG professionals should be aware of how this technology works so that they can help to ensure that there is responsible deployment from an IG perspective, just as would be the case with any new technology.  

The course is run over four days and the details of the upcoming cohorts and dates are below. The first cohort in May is already fully booked. 

May: 28th May, 29th May, 18th June, 19th June (Fully Booked) 

June: 27th June, 4th July, 11th July, 18th July  

July: 29th July, 5th August, 12th August, 19th August 

September: 18th September, 25th September, 2nd October, 9th October 

October: 29th October, 5th November, 12th November, 19th November 

We are currently offering a £100 discount for the month of May (for any cohort) on the published price of this course. Please quote the code “Art100” when booking.  

What is the Role of IG Professionals in AI Governance? 

The rapid rise of AI deployment in the workplace brings a host of legal and ethical challenges. AI governance is essential to address these challenges and ensure AI systems are transparent, accountable, and aligned with organisational values. 

AI governance requires a multidisciplinary approach involving, amongst others, IT, legal, compliance and industry specialists. IG professionals also possess a unique skill set that makes them key stakeholders in the governance process. Here’s why they should actively position themselves to play a key role in AI governance within their organisations. 

AI Governance is Fundamentally a Data Governance Issue 

At its core, AI is a data-driven technology. The fairness and reliability of AI models depend on the quality, accuracy, and management of data. If AI systems are trained on poor-quality or biased data, they can produce flawed and discriminatory outcomes. (See Amnesty International’s report into police data and algorithms.)  

IG professionals specialise in ensuring that data is accurate, well-structured, and fit for purpose. Without strong data governance, organisations risk deploying AI systems that amplify biases, make inaccurate predictions, or fail to comply with regulatory requirements. 

Regulatory and Compliance Expertise is Critical 

AI governance is increasingly being shaped by regulatory frameworks around the world. The EU AI Act and regulations and guidance from other jurisdictions highlight the growing emphasis on AI accountability, transparency, and risk management. 

IG professionals have expertise in interpreting legislation (such as GDPR, PECR and DPA amongst others) which positions them to help organisations navigate the complex legal landscape surrounding AI. They can ensure that AI governance frameworks comply with data protection principles, consumer rights, and ethical AI standards, reducing the risk of legal penalties and reputational damage. 

Managing AI Risks and Ensuring Ethical AI Practices 

AI introduces new risks, including algorithmic bias, privacy violations, security vulnerabilities, and explainability challenges. Left unchecked, these risks can undermine trust in AI and expose organisations to significant operational and reputational harm. 

IG professionals excel in risk management (after all, that is what DPIAs are about). They are trained to assess and mitigate risks related to data security, data integrity, and compliance, which directly translates to AI governance. By working alongside IT and ethics teams, they can help establish clear policies, accountability structures, and risk assessment frameworks to ensure AI is deployed responsibly. 

Bridging the Gap Between IT, Legal, and Business Functions 

One of the biggest challenges in AI governance is the lack of alignment between different business functions. AI development is often led by technical teams, while compliance and risk management sit with legal and governance teams. Without effective collaboration, governance efforts can become fragmented or ineffective. 

IG professionals act as natural bridges between these groups. Their work already involves coordinating across departments to align data policies, privacy standards, and regulatory requirements. By taking an active role in AI governance, they can ensure cross-functional collaboration, helping organisations balance innovation with compliance. 

Addressing Data Privacy and Security Concerns 

AI often processes vast amounts of sensitive personal data, making privacy and security critical concerns. Organisations must ensure that AI systems comply with data protection laws, implement robust security measures, and uphold individuals’ rights over their data. 

IG and Data Governance professionals are well-versed in data privacy principles, data minimisation, encryption, and access controls. Their expertise is essential in ensuring that AI systems are designed and deployed with privacy-by-design principles, reducing the risk of data breaches and regulatory violations. 

AI Governance Should Fit Within Existing Frameworks 

Organisations already have established governance structures for data management, records retention, compliance, and security. Instead of treating AI governance as an entirely new function, it should be integrated into existing governance models. 

IG and Data Governance professionals are skilled at implementing governance frameworks, policies, and best practices. Their experience can help ensure that AI governance is scalable, sustainable, and aligned with the organisation’s broader data governance strategy. 

Proactive Involvement Prevents Being Left Behind 

If IG professionals do not step up, AI governance may be driven solely by IT, data science, or business teams. While these functions bring valuable expertise, they may overlook regulatory, ethical, and risk considerations. Fundamentally, as IG professionals, our goal is to ensure organisations are using data and any new technology responsibly. 

So we are not saying that IG and DP professionals should become the new AI overlords. But by proactively positioning themselves as key stakeholders in AI governance, IG and Data Governance professionals ensure that organisations take a holistic approach – one that balances innovation, compliance, and risk management. Waiting to be invited to the AI governance conversation risks being sidelined in decisions that will have long-term implications for data governance and organisational risk. 

Final Thoughts 

To reiterate, AI governance should not be the sole responsibility of IG and Data Governance professionals – it requires a collaborative, cross-functional approach. However, their expertise in data integrity, privacy, compliance, and risk management makes them essential players in the AI governance ecosystem. 

As organisations increasingly rely on AI-driven decision-making, IG and Data Governance professionals must ensure that these systems are accountable, transparent, and legally compliant. By stepping up now, they can shape the future of AI governance within their organisations and safeguard them from regulatory, ethical, and operational pitfalls. 

Our new six-module AI Governance Practitioner Certificate will empower you to understand AI’s potential, address its challenges, and harness its power responsibly for the public benefit.  

Act Now Launches AI Governance Practitioner Certificate for the Middle East 

The Middle East has emerged as a dynamic force in the global AI landscape, making substantial strides in AI deployment and initiatives. Examples include: 

  • In the UAE, the Advanced Technology Research Council has developed the Falcon series of large language models, which have been integrated into sectors like healthcare and adopted internationally.   
  • Saudi Arabia, under its Vision 2030, has invested $40 billion into AI development, focusing on smart cities and energy.  
  • Qatar is expanding its AI infrastructure with Ooredoo, a leading telecom company, investing QR2 billion to enhance its data centres.  

The Middle East’s commitment to AI innovation has seen a parallel focus on governance frameworks. The UAE AI Charter, introduced in 2024, outlines 12 guiding principles emphasising transparency, inclusivity and accountability in AI development.​ In Saudi Arabia, the Saudi Data and Artificial Intelligence Authority has issued AI Ethics Principles and Generative AI Guidelines, whilst the KSA government recently launched a consultation on the Global AI Hub Law. 

As AI technologies become increasingly integrated into societal frameworks, the role of governance professionals becomes paramount. Understanding AI governance is essential for several reasons: 

  • Ethical Oversight: AI systems must be developed and deployed ethically to prevent biases and ensure fairness. Governance professionals are instrumental in establishing frameworks that uphold ethical standards.​ 
  • Regulatory Compliance: With nations implementing AI-related regulations and guidelines, professionals must navigate these legal landscapes to ensure compliance and mitigate risks.​ 
  • Public Trust: Transparent and accountable AI practices foster public trust, which is crucial for the widespread adoption of AI technologies.​ 
  • Strategic Leadership: Professionals equipped with AI governance knowledge can lead initiatives that align technological advancements with societal values and objectives.​ 

Building the AI Skillset  

For compliance professionals in the Middle East, this is an opportune moment to acquire expertise in AI governance, ensuring that AI technologies are developed and deployed responsibly, ethically, and in alignment with the region’s strategic goals. There is also a professional development opportunity; compliance professionals can position themselves as forward-thinking leaders who can bridge the gap between law, ethics, and technology. 

With these objectives in mind, Act Now is pleased to launch our new AI Governance Practitioner Certificate (MENA). This course is designed to equip you with the essential knowledge and skills to navigate this transformative technology within your organisation while upholding the highest standards of data protection and information governance.   

In just six modules, this immersive course will empower you to understand AI’s potential, address its challenges, and harness its power responsibly for public benefit. From real-world case studies to the latest regulatory updates, you’ll gain a deep understanding of how to manage AI ethically, securely, and in compliance with emerging laws.    

By completing the course, you will gain the skills to:  

  1. Explain foundational AI concepts, including its technologies, applications, and key milestones in its evolution.  
  2. Identify real-world examples of AI risks and demonstrate an understanding of their legal and ethical dimensions.  
  3. Interpret the role of key legal and regulatory frameworks, such as the UAE and KSA PDPL, GDPR and the EU AI Act, in governing AI systems.  
  4. Evaluate organisational strategies for ensuring transparency, accountability, and fairness in AI development.  
  5. Propose ethical and compliance-focused solutions to mitigate AI risks while balancing innovation and regulatory adherence.  
  6. Apply course concepts to analyse case studies and participate in informed discussions about AI’s role in society and industry.  

This new course builds on the success of our UAE and KSA Data Protection Officer certificates. It will run in July and September, and we are also able to deliver it on a customised in-house basis. Please get in touch to learn more.  

ICO Issues £60,000 GDPR Fine  

The Information Commissioner’s Office (ICO) has fined a Merseyside-based law firm £60,000 following a cyber-attack that led to highly sensitive personal data being published on the dark web. 

DPP Law Ltd (DPP) specialises in a number of areas of law including crime and actions against the police. It suffered the cyber-attack in June 2022, which affected access to the firm’s IT systems for over a week. The hackers were able to move laterally across DPP’s network and exfiltrate over 32GB of data. DPP only became aware of this after the National Crime Agency contacted the firm to advise that information relating to its clients had been posted on the dark web. DPP did not report the incident to the ICO until 43 days after becoming aware of it. 

The ICO found that DPP failed to put appropriate measures in place to ensure the security of personal data held electronically. This failure enabled the hackers to gain access to DPP’s network via an infrequently used administrator account that lacked multi-factor authentication (MFA), and to steal large volumes of data. 

This is the second GDPR fine issued to a law firm. In March 2022, the ICO issued a fine of £98,000 to Tuckers Solicitors LLP. The fine followed a ransomware attack on the firm’s IT systems in August 2020. The attacker encrypted 972,191 files, of which 24,712 related to court bundles. 60 of those were exfiltrated by the attacker and released on the dark web. 

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. See also our Managing Personal Data Breaches Workshop. 

Nathan Bent Joins Act Now Training Team 

Act Now Training is pleased to announce that Nathan Bent has joined its team of associates. 

Nathan is a data protection specialist and a Certified Data Ethics Practitioner with the Open Data Institute (ODI). He has delivered over 40 courses in the past six months, covering Data Governance, Data Protection, Data Privacy, Data Security and Data Ethics. 

Throughout his career, Nathan has been recognised as a caring, joyful, and
people-oriented leader who is passionate about developing others and sharing knowledge. Nathan brings this passion and experience into every learning session, using practical, easy-to-understand examples, case studies, and real-life experiences to help each participant succeed. 

Nathan has worked as a Data Protection Officer, Data and Information Governance Manager, Chief Information Security Officer and Head of Data Governance and Technology in varying sectors, from Engineering and Energy to Social Housing and MedTech. Over the past 25 years, he has led, coached, and educated teams that deliver data management, complex data insights, forecasting, statistical analysis, big data, data visualisation, data ethics, and legal and med-tech systems. 

Nathan said: 

“I am so pleased to be joining the Act Now team whose values and ethos are so closely aligned to mine. I have a lifelong passion for learning and knowledge. My training sessions are known for being dynamic, full of enthusiasm and often filled with laughter. I have made it my mission over the years to make what are often dry or (dare I say) boring subjects, fun and engaging.” 

Nathan, our second recent appointment alongside Dr. Cedric Krummes, will help us continue to serve our clients and deliver training courses. He will conduct one-day workshops and our new AI Governance Practitioner Certificate. We warmly welcome Nathan to our team of dedicated and passionate trainers. 

Act Now Expands its Training Team 

At Act Now Training, we believe in building a privacy-conscious world. Our goal is to promote trust and respect for privacy, ensuring organisations embed data protection into their operations by default. Our team of associates break down complex legal concepts, making education accessible and empowering IG professionals to become leaders in their field. By fostering a culture of responsible data usage, we help build public trust and drive positive change. 

With new courses launching in 2025, including the very popular AI Governance Practitioner Certificate, we are seeing a surge in demand for in-house and public training courses. To service this demand, we are welcoming an additional information law expert to our team.  

Dr. Cedric Krummes has worked as a Data Protection Officer, Information Governance Manager, Information Security Officer, and Data Owner in various sectors. He has been teaching and training for over 20 years in a wide range of industries, countries, and languages. He has presented papers at international conferences, spoken on panels and facilitated workshops. Cedric can explain complex topics in an engaging, practical, and jargon-free way. He will be assisting our team to deliver everything from one-day workshops to advanced practitioner certificate courses. 

Ibrahim Hasan, Director of Act Now Training, said: 

“I am very pleased that Cedric has joined our team. His strong educational background, coupled with many years of experience working in IG, will help us continue to equip IG professionals with the knowledge and skills they need to navigate the rapidly evolving data protection and privacy landscape.” 

£3 Million Fine for NHS IT Supplier  

The Information Commissioner’s Office has announced today that it has issued a fine under the UK GDPR to an NHS IT supplier, in relation to a significant data breach in 2022. Following a Notice of Intent issued last year for £6.09 million, Advanced Computer Software Group Ltd has now been fined £3,076,320. The ICO found that the company failed to adequately protect the personal data of 79,404 individuals in breach of Article 32 of the UK GDPR.  

As a key IT and software provider for the NHS and other healthcare organisations across the country, Advanced often holds the role of Data Processor for many of its clients. The breach in question occurred during a ransomware attack in August 2022. Hackers exploited a vulnerability through a customer account that lacked multi-factor authentication, gaining access to multiple health and care systems operated by Advanced. The ICO investigation found that personal data belonging to 79,404 people was taken. This included phone numbers, medical records, and even details on how to access the homes of 890 individuals receiving at-home care. 

The cyber-attack caused widespread disruption, with NHS 111 services impacted and some GPs resorting to pen and paper as electronic systems went offline. At the time, doctors warned that it could take months to clear the backlog of paperwork created by the incident. 

The fine serves as a reminder that Data Processors, like Advanced, have a duty to implement robust technical and organisational measures to safeguard personal data. This includes regularly assessing risks, applying multi-factor authentication, and keeping systems updated with the latest security patches. Data Processors cannot shift the responsibility to Data Controllers; their GDPR security obligations are independent of those of the Data Controller. 

Like previous fines, this one was substantially reduced from the amount announced in the Notice of Intent. In 2019, British Airways faced a Notice of Intent for a £183 million fine due to a cybersecurity breach, but the actual fine issued in 2020 was reduced to £20 million. Similarly, Marriott International Inc.’s fine dropped from £99 million to £18.4 million after a 2019 Notice of Intent. What is interesting in this case is that the fine follows a “voluntary settlement” whereby Advanced acknowledged the ICO’s decision to impose a reduced fine and agreed to pay it without appealing.   

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. See also our Managing Personal Data Breaches Workshop. 

Supporting Careers in Data Protection Through Apprenticeships 

In today’s digital landscape, data protection and information governance have become critical risk areas for organisations across all sectors. With increasing regulatory demands and evolving threats, the need for skilled professionals in this field has never been greater. Recognising this growing skills gap, Damar Training, with the support of Act Now Training, launched its innovative Data Protection and Information Governance Apprenticeship programme in late 2022, quickly establishing itself as the leading provider in England.

The programme was developed through extensive consultation with employers, including members of the apprenticeship Trailblazer Group, to ensure it would be commercially attractive, impactful, and of the highest quality. This collaborative approach has led to excellent engagement from employers and individuals, with 243 apprentices starting the programme to date, making Damar the largest provider of this apprenticeship standard in England.

A Flexible, Comprehensive Learning Journey

What sets Damar’s apprenticeship apart is its thoughtfully designed modular structure, with carefully sequenced six-week blocks of learning that cater to diverse learning styles and organisational needs. The gradual layering of technical content and learning activity, designed with the assistance of Act Now Training, ensures that apprentices from both public and private sectors receive an outstanding foundation in the knowledge, skills, and behaviours required for success in data protection roles.

The delivery model combines self-directed learning through engaging online resources with regular one-to-one coaching visits and group coaching sessions.
Extended technical workshops (underpinned by Act Now’s expertise) and quarterly review meetings provide additional support, while dedicated forums allow apprentices to stay updated with the latest developments, engage with peers, and consult with coaches.

This comprehensive approach has yielded impressive results. With a retention rate of 68%, an achievement rate of 65%, and an EPA pass rate of 95% – all above national averages – the programme demonstrates exceptional quality, particularly remarkable for a relatively new offering.

Industry-Leading Expertise

A key strength of Damar’s apprenticeship is its partnership with Act Now, an
award-winning data protection consultancy. This collaboration ensures that the programme’s content remains at the cutting edge of industry developments, including emerging areas such as Artificial Intelligence regulation.

Sarah Murray, Data Protection Officer at ClearData, highlights this benefit: 

“One of the particular stand-outs for me is the workshops. With the content supported by
Act Now, who have such a good reputation in this field, the workshops really put all of the theory into real-life practice.”

Real-World Impact for Employers and Apprentices

The programme serves some of the UK’s major employers, including Heathrow Airport, National Express, the BBC, Auto Trader, Betfred, and Dunelm, alongside various NHS Trusts, universities, government departments, and local councils.

For apprentices, the transformation goes beyond technical knowledge. Many begin with only basic data protection skills and limited confidence. Through the programme, they develop not only technical expertise but also a deeper understanding of the “why” behind data protection practices and the confidence to advise others with authority.

This growth translates into tangible career progression, with 99% of apprentices experiencing positive outcomes – 53% remaining in their current roles with enhanced skills, 18% securing permanent positions, and 28% gaining promotions or additional responsibilities. Some have even become data protection officers with overall responsibility for their organisation’s data protection function.

Employers benefit from immediate practical impacts. Apprentices have improved information assurance audits at Lincoln University, created artificial intelligence policies for Norfolk and Waveney Integrated Care Board, and developed triage request processes for data protection requirements at The Christie NHS Foundation Trust.

Stacey Lawrence, Data Protection Manager at Manchester Airport, emphasises this value: 

“The impact that both apprentices have brought to Manchester Airport has been huge. They work on the front line, to manage all enquiries, data protection breaches, and individual rights requests, and without them we simply wouldn’t be able to do the really sterling work that we do every day.”

A Future-Focused Approach

Damar continues to evolve the programme based on feedback from coaches, apprentices, and employers. Recent improvements include enhanced EPA preparation sessions, now embedded into group coaching. The company maintains close ties with the trailblazer group and leverages Act Now’s expertise to stay ahead of legislative developments.

With another 22 apprentices due to commence in April, the programme’s growth trajectory remains strong. Many employers, including Manchester Airport Group and Nottingham University Hospitals, are returning for their second or third data protection apprentice – perhaps the strongest testament to the programme’s value.

For organisations seeking to strengthen their data protection capabilities and individuals looking to build rewarding careers in this critical field, Damar Training’s Data Protection and Information Governance Apprenticeship offers a proven pathway to success.

If you would like to learn more about the DP and IG Apprenticeship, please get in touch.

AI in Local Government: Navigating the Legal Issues 

Artificial Intelligence is revolutionising many sectors, and local government is no exception. Councils are increasingly integrating AI to enhance service delivery, optimise resource management, and engage with citizens. AI use cases include: 

  • Infrastructure Maintenance and Management: Blackpool Council uses AI for road maintenance through Project Amber, employing AI-powered satellite imagery to detect road damage and potholes.  
  • Public Engagement: Newham Council uses Chatbot Max, a multilingual chatbot, to assist residents with parking permits and penalty charge queries. The council says that in six months, the chatbot handled over 10,000 questions, saved 84 hours in call time, and generated £40,000 in savings.  
  • Crime Prevention and Detection: Wolverhampton Council has installed AI-powered CCTV cameras to crack down on fly-tippers. The cameras have 360-degree vision and can recognise when someone is fly-tipping, sending an immediate report to the Council. 
  • Predictive Analytics for Social Services: In 2018, Hackney Council trialled the Early Help Predictive System. By analysing data on debt, housing, unemployment, school attendance, and domestic violence, the AI system profiled families to determine their need for intervention. Although this pilot programme was dropped a year later, there are many other AI tools that aim to help cash-strapped councils speed up social work. One such tool is Magic Notes, which records social work meetings and emails the social worker a transcript, summary and suggested actions for inclusion in case notes. 

Expect many more AI use cases soon, as the public sector works to make good on the Prime Minister’s recent speech, in which he pledged that the Government will use AI’s power to “turbocharge” the economy and improve public services. 

Legal Considerations  

While AI offers numerous benefits, several legal issues have to be navigated to ensure responsible and lawful use. These include: 

Data Protection and Privacy: Where personal data is involved in training or deploying AI models, the GDPR applies. The transparency provisions and the requirement for a legal basis are of particular importance. In 2022, the Information Commissioner’s Office (ICO) issued a fine of more than £7.5 million to Clearview AI for GDPR breaches. This related to the way the company compiled its online database containing 20 billion images of people’s faces and data scraped from the internet. The company successfully appealed the fine, but the ICO, and other GDPR regulators in the EU, have issued clear warnings to AI companies to ensure they comply with the GDPR. 

Transparency and Explainability: The decision-making processes of AI systems can be opaque. Clear information about how AI systems operate and make decisions should be provided. The London Borough of Camden has co-created a Data Charter with residents to ensure clarity and accessibility regarding data use, including AI applications. They produced accessible communications and animated explainers to demystify AI processes for the public.  

Bias and Discrimination: AI systems trained on biased data can perpetuate existing inequalities. Last year, a black Uber Eats driver received a payout after “racially discriminatory” facial-recognition checks prevented him accessing the app to secure work. Councils must be vigilant in auditing AI algorithms to detect and mitigate biases. This involves regular assessments and adjustments to ensure AI applications promote fairness and equality. 

Intellectual Property and Copyright: The use of AI, especially Generative AI applications like ChatGPT, may involve the use of copyrighted materials, raising intellectual property concerns. In December, the Government launched a consultation on Copyright and Artificial Intelligence.  

Accountability and Liability: Determining liability when AI systems cause harm is a complex legal issue. Clear accountability frameworks must be established, ensuring that there is always human oversight of AI decisions. This includes defining who is responsible for AI-driven actions and implementing mechanisms for redress in cases of error. 

Regulatory Compliance: There is still no sign of an AI Bill, which was mentioned in the King’s Speech. However, there is plenty of AI guidance for the public sector. The recently published AI Playbook for the UK Government updates and expands on the Generative AI Framework for HMG. It aims to “help government departments and public sector organisations harness the power of a wider range of AI technologies safely, effectively, and responsibly.”  

The adoption of AI in local government presents a unique challenge especially for compliance professionals. By developing a deeper understanding of AI, they can play a leading role in addressing the legal and ethical dilemmas posed by emerging AI technologies as well as position themselves as forward-thinking leaders who can bridge the gap between law, ethics, and technology.  

Act Now recently launched the AI Governance Practitioner Certificate. This course is designed to equip compliance professionals with the essential knowledge and skills to navigate this transformative technology while upholding the highest standards of data protection and information governance.   

We are registering interest in this course which, subject to demand, will run in July, October and November. Register your interest now (no obligation).  

ICO Issues Reprimands to Scottish Councils for Subject Access Delays 

Last week the Information Commissioner’s Office (ICO) issued reprimands to two Scottish councils for repeatedly failing to respond to subject access requests (SARs) within the statutory timeframe under the UK GDPR. 

Many Scottish local authorities have seen an increase in SARs in the past few years, particularly in relation to the Redress Scotland scheme, which allows people who suffered abuse while in care to apply for redress using supporting documents such as their care record. This increase was reported as 67% between 2021 and 2024.  

In its press release, the ICO says it has supported local authorities to improve their SAR response times and this has led to a 75% improvement, with 13 local authorities reporting a compliance rate of 90% in 2023/24. However, two local authorities have been singled out for a reprimand. 

Why did the ICO not issue a fine? In June 2022, the ICO revised its approach to enforcement of the UK GDPR against public sector organisations, choosing to issue reprimands in most cases. Last summer, it announced a review of this approach following criticism that it was not effective in delivering GDPR compliance and that it was unfair to treat the public sector differently to other sectors. 

In December last year, the Commissioner issued a statement following publication of the review report. In short, he has decided to continue with his approach. He said: 

“Feedback from the review said that public authorities saw the publication of reprimands as effective deterrents, mainly due to reputational damage and potential impact on public trust, and how they can be used to capture the attention of senior leaders. Central government departments cited increased engagement and positive changes on the back of reprimands, particularly with our regular interaction with the government’s Chief Operating Officers Network. But wider public sector organisations displayed limited awareness, which means we must do more to share best practice and lessons learned.” 

The Commissioner also launched a consultation on the scope of the public sector enforcement approach and the factors and circumstances that would make it appropriate to issue a fine to a public authority. The deadline for responding to this consultation was 31st January 2025. We await its outcome.  

Enjoy reading our blog? Help us reach 10,000 subscribers by subscribing today!   

Our upcoming Handling SARs course can help you deal with complex subject access requests. Places are limited so book early to avoid disappointment.