Act Now Nominated for IRMS Supplier of the Year Award 

Act Now Training is pleased to announce that it has been nominated for the 2026 Information and Records Management Society (IRMS) awards. 

Each year the IRMS recognises excellence in the field of information management with its prestigious industry awards. These highly sought-after awards are presented at a glittering ceremony at the annual Conference following the Gala Dinner. 

Act Now has been nominated for the Supplier of the Year award which it previously won in 2021, 2022 and 2024. 

Voting is open to IRMS members until Wednesday 18th March 2026. 

If you are an IRMS member, you can log in to your account and vote for Act Now here. 

Thank you for your support! 

Survey of FOI Officers 

Dr Ben Worthy, an academic at Birkbeck College, is looking for Local Government FOI officers to complete another survey. 

The survey is part of a joint US/UK research project, funded by Democracy Fund, which started last year looking at FOI request burdens. The researchers now want to follow up on the use of AI to help make FOI requests. 

The survey should take around 5-10 minutes to complete.  An anonymised summary of the findings will be sent to anyone who wishes to see it.  

You can read more about the project here. The survey can be accessed here.

Guardians of Data Podcast Episode 2: The Grok AI Controversy with Lynn Wyeth 

Act Now is pleased to bring you episode 2 of a new podcast: Guardians of Data. This is a show where we explore the world of information law and information governance – from privacy and AI to cybersecurity and freedom of information. In each episode we speak with experts and practitioners to unpack the big issues shaping the IG profession. 

In the first episode, we were joined by Jon Baines, a Senior Data Protection Specialist at Mishcon de Reya LLP and the long-standing chair of NADPO. In a wide-ranging conversation, Jon shared his journey into IG, his advice for both new starters and seasoned professionals, and his perspective on the future of the profession. 

In Episode 2 we discuss the recent controversy around Grok AI. 

Grok, the AI chatbot developed by xAI and integrated into the social media platform X, has caught the attention of governments and regulators across the world after it was used to edit pictures of real women to show them in revealing clothes and suggestive poses. In the UK, Ofcom and the Information Commissioner’s Office have opened formal investigations, a significant step that signals how seriously AI-related risks are now being taken. 

This controversy raises fundamental questions about how AI systems are designed and overseen and about whether existing laws and board-level oversight are keeping pace. In episode 2, we unpack these issues with the help of Lynn Wyeth, an expert in AI, data protection and responsible technology.  

Listen via the player below, or on your preferred podcast app. 
Available on Apple Podcasts, Spotify, and all major podcast platforms.

Data Protection Complaints Procedure: New ICO Guidance 

The main changes to the UK data protection regime made by the Data (Use and Access) Act 2025 (DUA Act) came into force on Thursday 5th February 2026. One key provision, though, is not due to commence until 19th June 2026: the requirement for Data Controllers to have a complaints procedure to handle data protection complaints. 

A new section 164A, inserted into the Data Protection Act 2018, requires Data Controllers to: 

  • give Data Subjects a way of making data protection complaints; 
  • acknowledge receipt of complaints within 30 days of receiving them; 
  • without undue delay, take appropriate steps to respond to complaints, including making appropriate enquiries, and keep Data Subjects informed; and 
  • without undue delay, tell Data Subjects the outcome of their complaints. 
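By way of illustration only (section 164A sets a 30-day acknowledgment window but does not prescribe how controllers track it; the helper name, and the assumption that the period runs from the day of receipt, are ours), the deadline arithmetic is straightforward:

```python
from datetime import date, timedelta

def acknowledgment_deadline(received: date) -> date:
    """Latest date to acknowledge a data protection complaint,
    assuming the 30-day period runs from the day of receipt."""
    return received + timedelta(days=30)

# A complaint received on 19th June 2026 must be acknowledged
# by 19th July 2026 on this reading.
print(acknowledgment_deadline(date(2026, 6, 19)))  # 2026-07-19
```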

Following a consultation, which closed in October last year, the ICO has published its guidance explaining the new requirements and informing Data Controllers of what they must, should and could do to comply.  

Data protection expert, and guest on the first Guardians of Data podcast, Jon Baines writes on his personal blog that in declining to suggest how long controllers should normally take to respond to data subject complaints, the ICO has missed an opportunity to provide regulatory clarity.  

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop.  

The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact.

Cyber Security and Resilience Bill in Parliament 

On 12th November 2025, the Government introduced the Cyber Security and Resilience (Network and Information Systems) Bill in the House of Commons. This is an important development in the evolution of the UK’s cyber security regulation. The Bill is currently at the Committee stage.

The Bill was trailed in the King’s Speech of July 2024, and was followed by the Government publishing its Cyber security and resilience policy statement. The Bill is designed to update the existing Network and Information Systems Regulations 2018 to raise cyber resilience across key parts of the economy, and to give government and regulators more agile powers to respond to evolving threats. Amongst other things, it will expand cyber security regulation to cover more digital services and supply chains, and mandate increased incident reporting to improve the government’s response to cyber-attacks including where a company has been held to ransom. 

The Bill imposes new maximum penalties similar to GDPR levels. For more serious breaches, the maximum penalty is up to £17 million, or 4% of a regulated entity’s worldwide turnover, whichever is higher. For other breaches, the maximum penalty is up to £10 million, or 2% of a regulated entity’s worldwide turnover, whichever is higher. 
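As a rough sketch of the "whichever is higher" mechanism (the turnover figures below are hypothetical, and the final calculation will depend on the Bill as enacted and regulators' guidance):

```python
def max_penalty(worldwide_turnover: float, serious: bool) -> float:
    """Illustrative GDPR-style penalty cap.

    serious=True  -> up to £17m or 4% of worldwide turnover, whichever is higher
    serious=False -> up to £10m or 2% of worldwide turnover, whichever is higher
    """
    fixed_cap, pct = (17_000_000, 0.04) if serious else (10_000_000, 0.02)
    return max(fixed_cap, worldwide_turnover * pct)

# A firm with £1bn worldwide turnover: 4% (£40m) exceeds the £17m floor.
print(max_penalty(1_000_000_000, serious=True))   # 40000000.0
# A smaller firm: the fixed £10m cap is the higher figure.
print(max_penalty(50_000_000, serious=False))     # 10000000
```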

Key Provisions 

Expanded Regulatory Scope: The Bill will broaden the range of organisations and sectors under regulatory oversight, extending beyond essential services and digital providers to include a wider array of entities integral to national infrastructure. 

Enhanced Regulatory Powers: Regulators will receive increased authority to ensure compliance with cybersecurity standards, including proactive investigation capabilities and mechanisms for cost recovery to support their activities. 

Mandatory Incident Reporting: The Bill mandates comprehensive reporting of cyber incidents, notably ransomware attacks, to improve national threat assessment and response strategies. 

Supply Chain Security: The Bill introduces measures to strengthen supply chain security, granting regulators the power to designate ‘Critical Suppliers’ whose services are integral to public sector operations. 

Regulatory Oversight: The Information Commissioner’s Office will gain greater authority to investigate and enforce compliance among digital service providers, including those that supply technology to the public sector. The ICO recently published its response to the Bill. 

For a detailed analysis of the Bill, read this article by law firm Clifford Chance. 

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. See also our Managing Personal Data Breaches Workshop.

Children’s Image Hosting Platform Fined For Privacy Failures

Last week the Information Commissioner’s Office (ICO) issued its first UK GDPR fine of 2026. MediaLab.AI, Inc. (MediaLab), owner of image sharing and hosting platform Imgur, received a Monetary Penalty Notice of £247,590 for processing children’s personal data in ways that breached the UK GDPR.     

Safeguarding children’s privacy is a key enforcement priority for the ICO. In April 2023, it issued a £12.7 million fine to TikTok for a number of breaches of the UK GDPR, including failing to use children’s personal data lawfully. The following year, the ICO launched its Children’s code strategy to look closely at social media and video sharing platforms. In December it published a progress report on the strategy, reporting good progress and setting out a ‘proactive supervision programme’ to drive improvements in the industry. Perhaps this latest fine is part of that programme.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states: 

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.” 
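The Article 8(1) rule can be sketched as a simple age check (the function name is ours, and real age assurance is of course far more involved than a single comparison):

```python
def consent_basis(age: int) -> str:
    """UK GDPR Article 8(1): who can give consent when an
    information society service is offered directly to a child
    and consent is the lawful basis for processing."""
    if age >= 13:
        # A child aged 13 or over may consent themselves.
        return "child may consent themselves"
    # Under 13: consent must come from a holder of parental
    # responsibility, and under Article 8(2) the controller must
    # make reasonable efforts to verify it.
    return "parental consent required (verify per Article 8(2))"

print(consent_basis(13))  # child may consent themselves
print(consent_basis(9))   # parental consent required (verify per Article 8(2))
```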

Imgur’s terms of use did state that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any form of age assurance to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform. 

In setting the £247,590 penalty amount, the ICO took into consideration the number of children affected by this breach, the degree of potential harm caused, the duration of the contraventions, and the company’s global turnover. It also considered MediaLab’s acceptance of the provisional findings set out in the Notice of Intent issued in September 2025 and its commitment to address the infringements if access to the Imgur platform in the UK is restored in the future. If MediaLab resumes processing the personal data of children in the UK (currently the Imgur site is not available in the UK) without implementing the measures it has committed to, the ICO may take further regulatory action. 

We are waiting for the Monetary Penalty Notice to be published. The ICO says it is still considering the redaction of personal and commercially confidential or sensitive information. 

This fine shows that the ICO’s spotlight is firmly on those processing children’s data. The Data (Use and Access) Act 2025, the key provisions of which came into force last Thursday, explicitly requires those who provide an online service that is likely to be used by children to take their needs into account when deciding how to use their personal data. 

This and other developments relating to children’s data will be covered in tomorrow’s online workshop, Working with Children’s Data. The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw.

New Guardians of Data Podcast: In Conversation with Jon Baines 

Act Now is pleased to bring you the first episode of a new podcast: Guardians of Data. This is a show where we explore the world of information law and information governance – from privacy and AI to cybersecurity and freedom of information. In each episode we speak with experts and practitioners to unpack the big issues shaping the IG profession.

In information governance, there’s no substitute for learning from those who have walked the path before us. Experienced IG leaders bring a wealth of knowledge from years at the frontline of data protection and information rights – navigating challenges, overcoming obstacles, and shaping best practice along the way. By listening to their stories, we can all grow in confidence and prepare for the IG challenges of tomorrow. 

In the first episode, we are joined by one such IG leader: Jon Baines is a Senior Data Protection Specialist at Mishcon de Reya LLP, where he advises on complex data protection and FOI matters. Jon is not a lawyer in the traditional sense, yet he is listed in the Legal 500 as a “Rising Star” in the Data Protection, Privacy and Cybersecurity category. He is the long-standing chair of the National Association of Data Protection and Freedom of Information Officers (NADPO). He is regularly sought for comment by specialist and national media and writes extensively on data protection matters. 

In our conversation, Jon shares his journey into IG, his advice for both new starters and seasoned professionals and his perspective on the future of the profession. 

Listen via the player below, or on your preferred podcast app.
Available on Apple Podcasts, Spotify, and all major podcast platforms.

Data (Use and Access) Act: Key Data Provisions In Force on Thursday

The Data (Use and Access) Act 2025 (Commencement No. 6 and Transitional and Saving Provisions) Regulations 2026 were made on 29th January 2026. They bring into force most of the amendments to the UK GDPR, PECR and the DPA 2018 made by The Data (Use and Access) Act 2025 (DUA Act). 

The amendments coming into force on Thursday (5th February 2026) cover, amongst other things: 

  • New ‘Recognised legitimate interests’  
  • When time starts for dealing with subject access requests 
  • Automated Decision Making
  • Information to be provided to data subjects 
  • Safeguards for processing for research etc purposes 
  • International Data Transfers 
  • PECR and marketing 

You can read a summary of the amendments here. 

DUA Act Workshop in Birmingham (Thursday 5th February 2026)

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop which is running online and in Birmingham.

Revised GDPR Handbook   

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.    

Former Council Chief Executive Prosecuted under Section 77 FOI 

Section 77 of the Freedom of Information Act 2000 (FOI) makes it a criminal offence for a person to do anything with the intention of preventing the disclosure of information pursuant to an FOI request. The offence can be committed by any public authority and any person who is employed by, is an officer of, or is subject to the direction of a public authority. Regulation 19 of the Environmental Information Regulations 2004 creates an identical offence, albeit with slightly different provisions governing government departments. 

Last week the trial began of the former Chief Executive of Mid and East Antrim Borough Council, who has been charged with three offences relating to records kept by the council. Anne Donaghy faces three charges under section 77 FOI, namely: altering a record to prevent disclosure, attempting to alter records, and aiding and abetting the alteration of a record. Ms Donaghy denies the allegations and is contesting the charges. 

A BBC Spotlight programme previously reported that the charges were connected to alleged attempts to delete correspondence relating to the decision to withdraw council staff operating under the post-Brexit trade conditions known as the Northern Ireland Protocol. The staff, who were carrying out checks on goods arriving from Great Britain, were removed because of apparent threats from loyalist paramilitaries. 
It later emerged Ms Donaghy, who was chief executive at the time, had written to the Cabinet Office before the decision to remove staff was taken. She told the UK government graffiti had been directly targeting council staff working on checks. 
The then Agriculture Minister, Edwin Poots, subsequently withdrew inspectors performing the checks at ports in Northern Ireland. However, shortly afterwards, all staff returned to duties. The Police Service of Northern Ireland (PSNI) issued a threat assessment stating it had no information to support claims of loyalist paramilitaries threatening staff safety. 

Prosecutions under section 77 are extremely rare. The main reason for this is that there must be proof (‘beyond reasonable doubt’) of intent to destroy, conceal, deface etc. This may be difficult to do after the event.   

The only other section 77 prosecution was in March 2020. Nicola Young, a town clerk at Whitchurch Town Council, was fined £400 and ordered to pay £1,493 in costs following a guilty plea. In that case, a person had made an FOI request to the Council for a copy of an audio recording of a council meeting. They believed that the written minutes of the meeting had been fabricated and wanted to listen to the recording. Ms Young deliberately deleted the audio recording a few days later and then advised the requester that the audio file had been deleted as part of the council’s destruction policy. 

This and other FOI developments will be discussed in our forthcoming FOI workshops. If you are looking for a qualification in freedom of information, our FOI Practitioner Certificate is ideal. 

Do Tennis Players Have a Right to Privacy?

John McEnroe is remembered for his on-court outbursts almost as much as for his exquisite shot-making. “You cannot be serious!” is an instantly recognisable sporting catchphrase. When McEnroe was at the height of his career in the 1980s, tennis players’ behaviour was scrutinised almost exclusively through on-court broadcast cameras. What happened off court largely remained unseen. 

Today, tennis, alongside other elite sports, is an environment of continuous monitoring; players are filmed arriving, warming up, competing and exiting. Visibility is a structural feature of the modern sports industry, justified for enhancing fan engagement and serving security, integrity and officiating purposes. But where should the balance lie when such footage reveals players’ emotional states – be it anger, distress or vulnerability? 

This question came up this week when a tennis player, Coco Gauff, called for greater privacy after footage emerged of her smashing her racquet following her Australian Open quarter-final defeat. Crucially, the incident did not occur on court. Gauff was filmed in the players’ area by behind-the-scenes cameras, with the footage later broadcast on television and circulated widely on social media. Gauff said she had made a conscious effort to suppress her emotions until she believed she was away from public view, referencing a similar incident at the 2023 US Open when Aryna Sabalenka was filmed smashing her racquet after losing the final. Since 2019, the Australian Open has shown footage from the players’ zone beneath the Rod Laver Arena, including the gym, warm-up areas and corridors leading from locker rooms. Camera access in these spaces is more restricted at the other Grand Slams.  

Gauff is not alone in raising concerns about behind-the-scenes cameras. Six-time major champion Iga Świątek said this week players are being watched “like animals in the zoo” in Melbourne. Semi-finalist Jessica Pegula described the constant filming as an “invasion of privacy”, adding that players feel “under a microscope constantly”. Tournament organisers, Tennis Australia, responded by emphasising fan engagement, saying the cameras help create a “deeper connection” between players and audiences while insisting that player comfort and privacy remain a priority. 

From a legal perspective, this issue is not merely a matter of optics. Under modern data-protection regimes such as the GDPR and the Australian Privacy Act, video footage of identifiable athletes constitutes personal data. Where that footage reveals emotional states it becomes particularly sensitive. Organisers must therefore be able to justify not only collecting such footage, but retaining, broadcasting and amplifying it. That justification is relatively straightforward during live play, where filming is integral to the sport itself. It becomes much harder once the match has ended. Filming in player tunnels, medical areas or immediately after defeat may be defensible for security or safety reasons. But the retention and circulation of emotionally charged moments for entertainment value sits on far shakier legal ground.  

Players may agree to extensive filming as a condition of participation, but that agreement does not extinguish their broader privacy rights, particularly where footage is used in a way that is disproportionate, stigmatising or disconnected from its original purpose. This tension is becoming harder to ignore as governing bodies simultaneously emphasise mental health and player welfare while permitting practices that expose athletes’ most vulnerable moments to global audiences. 

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop.