AI Transcription Tools in Social Work Under Scrutiny 

Anyone remember Dragon Dictate? The first versions of this voice transcription software required users to spend hours training it (usually wearing a headset) by repeating stock phrases many times over. Even after full training, the transcription output was far from accurate. How technology has moved on, especially in the last few years, with the proliferation of AI. 

AI-powered transcription software has been rapidly adopted by public sector organisations, especially in local authority social work departments. Tools like Magic Notes and Microsoft Copilot are used by social workers to record conversations with children and families (e.g. interviews or assessments), transcribe spoken audio into text and generate summaries automatically. These “ambient scribes” listen in real time or process recordings, reducing the need for manual note-taking and allowing professionals to focus on interactions rather than documentation. However, the use of such tools, especially in sensitive contexts like social work, is not without risks, as was highlighted by a recent report.

Ada Lovelace Institute Report 

On 11th February 2026, the Ada Lovelace Institute published a report titled “Scribe and prejudice? Exploring the use of AI transcription tools in social care.” The report explored the dynamics of adoption and the impacts of AI transcription tools in adult and children’s social care across 17 local authorities in England and Scotland. Based on interviews with frontline social workers and managers, it highlighted serious risks that should be addressed by users.  

These include, amongst others: 

AI “Hallucinations”: The AI sometimes generates false information that wasn’t said in the recorded conversation. A prominent example involved an AI-generated summary incorrectly stating that a child had expressed suicidal ideation. This kind of error is especially dangerous in child protection or mental health contexts, where it could trigger unnecessary interventions or lead to flawed decisions about care. 

Gibberish, misrepresentations and other errors: AI-generated transcripts have included nonsense phrases, misspelled names, incorrect speaker attributions (especially in multi-person conversations), fabricated statements, irrelevant or foul language insertions, and overly formal or academic wording that doesn’t reflect normal social work language.

Bias and Harmful Stereotyping: Some outputs have reportedly promoted stereotypes or biased perceptions of individuals that weren’t present in the original recording. 

These issues echo broader AI concerns but are, of course, more serious in the context of social work records. Inaccuracies entering official care records could lead to incorrect decisions about a child’s safety, family support or adult care, potentially resulting in harm to vulnerable people, professional consequences for social workers or even legal liability.

Social workers generally bear full responsibility for reviewing and approving these AI outputs (the “human in the loop” safeguard), but practices vary widely according to the report. Some social workers spend minutes checking AI output whilst others spend hours. The report questions how effective this is in high-pressure frontline environments. There is also concern that over-reliance on summarisation features could erode professional judgment and the nuanced, interpretive nature of social work documentation. 

The report notes that in early 2025, one AI transcription tool was already in active use by 85 local authorities for social care. But the Ada Lovelace Institute criticises the “limited and light-touch” approaches to ethics, evaluation, testing, regulation, and risk mitigation so far. It has called for more robust safeguards, better guidance and thorough evaluation before wider use. 

Recommendations 

To ensure the safe and responsible use of AI transcription tools, the Institute urged the government to require local authorities to document their use of such tools through the ‘Algorithmic Transparency Reporting Standard.’ 

It also recommended that social care regulators and local authorities collaborate with relevant sector bodies to develop guidance on using AI transcription tools in statutory processes and formal proceedings, supported by clear accountability structures. 

The Institute added that: ‘To enable end-to-end accountability, regulators and professional bodies should review and revise rules and guidance on professional ethics for social workers and support social workers to collaborate with legal and advisory bodies around procedures for AI use in formal proceedings. An advisory board comprised of people with lived experience of drawing on care should be established to inform these actions.’ 

Further recommendations include: 

  • The UK government should extend its pilots of AI transcription tools to include various locations and public sector contexts. 
  • The UK government should set up a What Works Centre for AI in Public Services to generate and synthesise learnings from pilots and evaluations. 
  • A coalition of researchers, policymakers, civil society and community groups should collaborate on research on the systemic impacts of AI transcription tools. 
  • Local authorities should specify their outcomes and expected impact when procuring AI transcription tools to ensure a shared understanding among staff and users. 

The UK GDPR Angle 

The use of AI-powered transcription software will involve processing highly sensitive personal data, including audio recordings and derived transcripts/summaries of conversations involving vulnerable individuals. This triggers UK GDPR obligations, with heightened risks due to the sensitive nature of the data and the potential for harm if errors occur.

Local authorities and social care providers should integrate UK GDPR compliance into procurement, deployment, and ongoing use of AI transcription software. Key practical steps include: 

  • Conduct a DPIA:  Before rollout or expansion, complete a Data Protection Impact Assessment to assess all the risks (e.g., hallucinations affecting accuracy, bias in diverse accents/dialects, unauthorised access). Update DPIAs for new tools or features. Involve the organisation’s Data Protection Officer from the outset. 
  • Choose compliant tools and vendors: Prioritise tools with strong data protection (e.g. UK-hosted data, no unnecessary retention, robust security). Review vendor DPIAs, processor agreements, and compliance certifications.  
  • Establish clear consent and transparency processes: Inform service users upfront about recording, AI involvement, and data use (via privacy notices or verbal explanation). Document decisions and allow opt-outs where appropriate. 
  • Implement strong human oversight and review: Mandate thorough checks of all AI outputs before approving records. Train staff to detect inaccuracies, bias, or inappropriate content. Flag AI-generated sections (e.g. via watermarks or metadata) for transparency and future audits. 
  • Secure data handling and contracts: Use encrypted recording/uploading, limit data shared with tools and delete audio promptly after transcription. Ensure processor contracts (Article 28) specify UK GDPR compliance, audit rights and breach notification. 
  • Monitor, audit and train: Regularly audit tool use and outputs for compliance. Provide targeted training on UK GDPR risks (e.g. accuracy, breaches, bias). Track incidents (e.g. hallucinations) and report serious ones as breaches if required. 
  • Define boundaries for use: Establish consensus on when AI transcription is appropriate (or unacceptable).  

AI transcription offers clear benefits in reducing paperwork and freeing up social workers’ time for direct care. However, strong governance measures are needed to prevent dangerous inaccuracies from slipping into official records and to guard against biased or harmful decisions.

If you need to train your staff on the responsible use of AI, please get in touch to discuss our customised in-house training. The following public courses may also interest you:

AI and Information Governance: A one-day workshop examining the key data protection and IG issues when deploying AI solutions.

AI Governance Practitioner Certificate training programme: A four-day course providing a practical overview of how AI systems are developed, deployed, and regulated, with particular attention to risk, bias, and accountability.

Act Now Nominated for IRMS Supplier of the Year Award 

Act Now Training is pleased to announce that it has been nominated for the 2026 Information and Records Management Society (IRMS) awards. 

Each year the IRMS recognises excellence in the field of information management with their prestigious industry awards. These highly sought-after awards are presented at a glittering ceremony at the annual Conference following the Gala Dinner.  

Act Now has been nominated for the Supplier of the Year award which it previously won in 2021, 2022 and 2024. 

Voting is open to IRMS members until Wednesday 18th March 2026. 

If you are an IRMS member, you can log in to your account and vote for Act Now here.

Thank you for your support! 

Survey of FOI Officers 

Dr Ben Worthy, an academic at Birkbeck College, is looking for Local Government FOI officers to complete another survey. 

The survey is part of a joint US/UK research project, funded by Democracy Fund, which began last year and examines FOI request burdens. The researchers now want to follow up on the use of AI to help make FOI requests.

The survey should take around 5-10 minutes to complete.  An anonymised summary of the findings will be sent to anyone who wishes to see it.  

You can read more about the project here. The survey can be accessed here.

Guardians of Data Podcast Episode 2: The Grok AI Controversy with Lynn Wyeth 

Act Now is pleased to bring you episode 2 of our new podcast: Guardians of Data. This is a show where we explore the world of information law and information governance – from privacy and AI to cybersecurity and freedom of information. In each episode we speak with experts and practitioners to unpack the big issues shaping the IG profession.

In the first episode, we were joined by Jon Baines, a Senior Data Protection Specialist at Mishcon de Reya LLP and the long-standing chair of NADPO. In a wide-ranging conversation, Jon shared his journey into IG, his advice for both new starters and seasoned professionals, and his perspective on the future of the profession.

In Episode 2 we discuss the recent controversy around Grok AI. 

Grok, the AI chatbot developed by xAI and integrated into the social media platform X, has caught the attention of governments and regulators across the world after it was used to edit pictures of real women to show them in revealing clothes and suggestive poses. In the UK, Ofcom and the Information Commissioner’s Office have opened formal investigations, a significant step that signals how seriously AI-related risks are now being taken.

This controversy raises fundamental questions about how AI systems are designed and overseen and about whether existing laws and board-level oversight are keeping pace. In episode 2, we unpack these issues with the help of Lynn Wyeth, an expert in AI, data protection and responsible technology.  

Listen via the player below, or on your preferred podcast app. 
Available on Apple Podcasts, Spotify, and all major podcast platforms.

Data Protection Complaints Procedure: New ICO Guidance 

The main changes to the UK data protection regime made by the Data (Use and Access) Act 2025 (DUA Act) came into force on Thursday 5th February 2026. One key provision, though, is due to commence on 19th June 2026: the requirement for Data Controllers to have a procedure for handling data protection complaints.

A new section 164A, inserted into the Data Protection Act 2018, requires Data Controllers to:

  • give Data Subjects a way of making data protection complaints; 
  • acknowledge receipt of complaints within 30 days of receiving them; 
  • without undue delay, take appropriate steps to respond to complaints, including making appropriate enquiries, and keep Data Subjects informed; and 
  • without undue delay, tell Data Subjects the outcome of their complaints.

Following a consultation, which closed in October last year, the ICO has published its guidance explaining the new requirements and informing Data Controllers of what they must, should and could do to comply.  

Data protection expert, and guest on the first Guardians of Data podcast, Jon Baines writes on his personal blog that in declining to suggest how long controllers should normally take to respond to data subject complaints, the ICO has missed an opportunity to provide regulatory clarity.  

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop.  

The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact.

Cyber Security and Resilience Bill in Parliament 

On 12th November 2025, the Government introduced the Cyber Security and Resilience (Network and Information Systems) Bill in the House of Commons. This is an important development in the evolution of the UK’s cyber security regulation. The Bill is currently at the Committee stage.

The Bill was trailed in the King’s Speech of July 2024, and was followed by the Government publishing its Cyber security and resilience policy statement. The Bill is designed to update the existing Network and Information Systems Regulations 2018 to raise cyber resilience across key parts of the economy, and to give government and regulators more agile powers to respond to evolving threats. Amongst other things, it will expand cyber security regulation to cover more digital services and supply chains, and mandate increased incident reporting to improve the government’s response to cyber-attacks including where a company has been held to ransom. 

The Bill imposes new maximum penalties similar to GDPR levels. For more serious breaches, the maximum penalty is up to £17 million, or 4% of a regulated entity’s worldwide turnover, whichever is higher. For other breaches, the maximum penalty is up to £10 million, or 2% of a regulated entity’s worldwide turnover, whichever is higher. 
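The “whichever is higher” formulation means the applicable cap is simply the greater of the fixed sum and the turnover percentage. A minimal sketch of the two tiers described above (the turnover figures used in the example are hypothetical):

```python
def penalty_cap(worldwide_turnover: float, serious: bool) -> float:
    """Return the maximum penalty available under the Bill's two tiers,
    as described in the reporting above: the higher of a fixed sum and a
    percentage of worldwide turnover."""
    if serious:
        return max(17_000_000, 0.04 * worldwide_turnover)  # £17m or 4%
    return max(10_000_000, 0.02 * worldwide_turnover)      # £10m or 2%

# Hypothetical entity with £600m worldwide turnover:
print(penalty_cap(600_000_000, serious=True))   # 4% (£24m) exceeds £17m
print(penalty_cap(600_000_000, serious=False))  # 2% (£12m) exceeds £10m
```

For a smaller entity, the fixed sum dominates: at £100m turnover, 4% is only £4m, so the serious-breach cap stays at £17m.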

Key Provisions 

Expanded Regulatory Scope: The Bill will broaden the range of organisations and sectors under regulatory oversight, extending beyond essential services and digital providers to include a wider array of entities integral to national infrastructure.

Enhanced Regulatory Powers: Regulators will receive increased authority to ensure compliance with cybersecurity standards, including proactive investigation capabilities and mechanisms for cost recovery to support their activities.

Mandatory Incident Reporting: The Bill mandates comprehensive reporting of cyber incidents, notably ransomware attacks, to improve national threat assessment and response strategies.

Supply Chain Security: The Bill introduces measures to strengthen supply chain security, granting regulators the power to designate ‘Critical Suppliers’ whose services are integral to public sector operations.

Regulatory Oversight: The Information Commissioner’s Office will gain greater authority to investigate and enforce compliance among digital service providers, including those that supply technology to the public sector. The ICO recently published its response to the Bill.

For a detailed analysis of the Bill, read this article by law firm Clifford Chance. 

We have two workshops coming up (How to Increase Cyber Security in your Organisation and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in cyber security. See also our Managing Personal Data Breaches Workshop.

Children’s Image Hosting Platform Fined For Privacy Failures

Last week the Information Commissioner’s Office (ICO) issued its first UK GDPR fine of 2026. MediaLab.AI, Inc. (MediaLab), owner of image sharing and hosting platform Imgur, received a Monetary Penalty Notice of £247,590 for processing children’s personal data in ways that breached the UK GDPR.     

Safeguarding children’s privacy is a key enforcement priority for the ICO. In April 2023, it issued a £12.7 million fine to TikTok for a number of breaches of the UK GDPR, including failing to use children’s personal data lawfully. The following year, the ICO launched its Children’s code strategy to look closely at social media and video sharing platforms. In December it published a progress report on the strategy, reporting good progress and setting out a ‘proactive supervision programme’ to drive improvements in the industry. Perhaps this latest fine is part of that programme.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. a social media app or gaming site) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.” 

Imgur’s terms of use did state that children under 13 could only use the platform with parental supervision. However, the ICO investigation found that MediaLab did not implement any form of age assurance to determine the age of Imgur users, and had no measures in place to obtain parental consent where children under 13 used the platform.
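The Article 8 rule reduces to a simple decision: consent is only valid if the user is 13 or over and consenting for themselves, or if a holder of parental responsibility has given it. A minimal, hypothetical sketch of such a gate follows; real age assurance is of course far harder than asking for an age, which is exactly the gap the ICO identified here:

```python
def consent_valid(age: int, own_consent: bool, parental_consent: bool) -> bool:
    """Article 8 UK GDPR rule of thumb: a child aged 13+ may consent for
    themselves; an under-13 needs consent from a holder of parental
    responsibility. Illustrative only - not a compliance implementation."""
    if age >= 13:
        return own_consent
    return parental_consent

print(consent_valid(14, own_consent=True, parental_consent=False))  # True
print(consent_valid(12, own_consent=True, parental_consent=False))  # False
```

Article 8(2) then adds the harder requirement: the controller must make “reasonable efforts to verify” that the parental consent is genuine, taking available technology into account.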

In setting the £247,590 penalty amount, the ICO took into consideration the number of children affected by this breach, the degree of potential harm caused, the duration of the contraventions, and the company’s global turnover. It also considered MediaLab’s acceptance of the provisional findings set out in the Notice of Intent issued in September 2025 and its commitment to address the infringements if access to the Imgur platform in the UK is restored in the future. If MediaLab resumes processing the personal data of children in the UK (currently the Imgur site is not available in the UK) without implementing the measures it has committed to, the ICO may take further regulatory action. 

We are waiting for the Monetary Penalty Notice to be published. 
The ICO says it is still considering the redaction of personal and commercially confidential or sensitive information.  

This fine shows that the ICO’s spotlight is firmly on those processing children’s data. The Data (Use and Access) Act 2025, the key provisions of which came into force last Thursday, explicitly requires those who provide an online service that is likely to be used by children to take children’s needs into account when deciding how to use their personal data.

This and other developments relating to children’s data will be covered in tomorrow’s online workshop, Working with Children’s Data. The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw.

New Guardians of Data Podcast: In Conversation with Jon Baines 

Act Now is pleased to bring you the first episode of a new podcast: Guardians of Data. This is a show where we explore the world of information law and information governance – from privacy and AI to cybersecurity and freedom of information. In each episode we will be speaking with experts and practitioners to unpack the big issues shaping the IG profession.

In information governance, there’s no substitute for learning from those who have walked the path before us. Experienced IG leaders bring a wealth of knowledge from years at the frontline of data protection and information rights – navigating challenges, overcoming obstacles, and shaping best practice along the way. By listening to their stories, we can all grow in confidence and prepare for the IG challenges of tomorrow. 

In the first episode, we are joined by one such IG leader: Jon Baines is a Senior Data Protection Specialist at Mishcon de Reya LLP, where he advises on complex data protection and FOI matters. Jon isn’t a lawyer in the traditional sense, yet is listed in Legal 500 as a “Rising Star” in the Data Protection, Privacy and Cybersecurity category. Jon is the long-standing chair of the National Association of Data Protection and Freedom of Information Officers (NADPO). He is regularly sought for comment by specialist and national media and writes extensively on data protection matters.

In our conversation, Jon shares his journey into IG, his advice for both new starters and seasoned professionals and his perspective on the future of the profession. 

Listen via the player below, or on your preferred podcast app.
Available on Apple Podcasts, Spotify, and all major podcast platforms.

Data (Use and Access) Act: Key Data Provisions In Force on Thursday

The Data (Use and Access) Act 2025 (Commencement No. 6 and Transitional and Saving Provisions) Regulations 2026 were made on 29th January 2026. They bring into force most of the amendments to the UK GDPR, PECR and the DPA 2018 made by The Data (Use and Access) Act 2025 (DUA Act). 

The amendments coming into force on Thursday (5th February 2026), amongst others, cover: 

  • New ‘Recognised legitimate interests’  
  • When time starts for dealing with subject access requests 
  • Automated Decision Making
  • Information to be provided to data subjects 
  • Safeguards for processing for research etc purposes 
  • International Data Transfers 
  • PECR and marketing 

You can read a summary of the amendments here.

DUA Act Workshop in Birmingham (Thursday 5th February 2026)

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half day workshop which is running online and in Birmingham.

Revised GDPR Handbook   

The newly updated UK GDPR Handbook (2nd edition) brings these developments together in one practical reference. It includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact. Delegates on our future GDPR certificate courses will receive a complimentary copy of the UK GDPR Handbook as part of their course materials.    

Former Council Chief Executive Prosecuted under Section 77 FOI 

Section 77 of the Freedom of Information Act 2000 (FOI) makes it a criminal offence to alter, deface, block, erase, destroy or conceal a record with the intention of preventing the disclosure of information pursuant to an FOI request. The offence can be committed by a public authority and by any person who is employed by, is an officer of, or is subject to the direction of a public authority. Regulation 19 of the Environmental Information Regulations 2004 creates an identical offence, albeit with slightly different provisions governing government departments.

Last week the trial began of the former Chief Executive of Mid and East Antrim Borough Council, who has been charged with three offences relating to records kept by the council. Anne Donaghy faces three charges under section 77 FOI, namely: altering a record to prevent disclosure, attempting to alter records, and aiding and abetting the alteration of a record. Ms Donaghy denies the allegations and is contesting the charges.

A BBC Spotlight programme previously reported that the charges were connected to alleged attempts to delete correspondence relating to the decision to withdraw council staff operating under the post-Brexit trade conditions known as the Northern Ireland Protocol. The staff, who were carrying out checks on goods arriving from Great Britain, were removed because of apparent threats from loyalist paramilitaries. 
It later emerged Ms Donaghy, who was chief executive at the time, had written to the Cabinet Office before the decision to remove staff was taken. She told the UK government graffiti had been directly targeting council staff working on checks. 
The then Agriculture Minister, Edwin Poots, subsequently withdrew inspectors performing the checks at ports in Northern Ireland. However, shortly afterwards, all staff returned to their duties. The Police Service of Northern Ireland (PSNI) issued a threat assessment stating it had no information to support claims of loyalist paramilitaries threatening staff safety.

Prosecutions under section 77 are extremely rare. The main reason is that there must be proof, beyond reasonable doubt, of an intent to destroy, conceal, deface etc., which may be difficult to establish after the event.

The only other section 77 prosecution was in March 2020. Nicola Young, a town clerk at Whitchurch Town Council, was fined £400 and ordered to pay £1,493 in costs following a guilty plea. A person had made an FOI request to the council for a copy of an audio recording of a council meeting, believing that the written minutes of the meeting had been fabricated. Ms Young deliberately deleted the audio recording a few days later and then advised the requestor that the audio file had been deleted under the council’s destruction policy.

This and other FOI developments will be discussed in our forthcoming FOI workshops. If you are looking for a qualification in freedom of information, our FOI Practitioner Certificate is ideal.