Oral Disclosure of Personal Data: To GDPR or not to GDPR? 

Here’s a pub quiz question for you: “Can a Data Controller circumvent the requirements of data protection law by disclosing personal data verbally rather than in writing?” The answer was “Yes” under the old Data Protection Act 1998.
In Scott v LGBT Foundation Ltd [2020] WLR 62, the High Court rejected a claim that the LGBT Foundation had breached, amongst other things, the claimant’s data protection rights by disclosing information about him to a GP. The court held that the 1998 Act did not apply to purely verbal communications.

Nowadays, though, the answer to the above question is no: the oral disclosure of personal data amounts to “processing” as defined by Article 4(2) of the GDPR.
So said the Court of Justice of the European Union (CJEU), on 7th March 2024, in a preliminary ruling in the Endemol Shine Finland case.

The subject of the ruling is a television company which makes a number of reality TV shows in Finland. It had been organising a competition and asked the District Court of South Savo for information about possible criminal proceedings involving one of the competition participants. It requested the District Court to disclose the information orally rather than in writing. The District Court refused the request on the basis that there was no legitimate reason, under the Finnish law implementing Article 10 of the GDPR, for processing the criminal offence data.
On appeal, Endemol Shine Finland argued that the GDPR did not apply, as the oral disclosure of the information would not constitute processing of personal data under the GDPR. 

Article 4(2) GDPR defines “processing” as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means”. On the face of it, this covers oral disclosure. However, Article 2 states that the GDPR applies to processing of personal data “wholly or partly by automated means”, and to processing by non-automated means which “forms or is intended to form part of a filing system.” Article 4(6) GDPR defines “filing system” broadly, covering “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis”. 

The Finnish Court of Appeal requested a preliminary ruling from the CJEU on the meaning of Article 4(2) and whether the particular processing in this case came within the material scope of the GDPR under Article 2. The CJEU held that the concept of processing in Article 4(2) of the GDPR necessarily covered the oral disclosure of personal data. It said the wording of the Article made it apparent that the EU legislature intended to give the concept of processing a broad scope. The court pointed out that the GDPR’s objective was “to ensure a high level of protection of the fundamental rights and freedoms of natural persons” and that “circumventing the application of that regulation by disclosing personal data orally rather than in writing would be manifestly incompatible with that objective”. 

The CJEU went on to consider whether the oral processing of the data would fall within the material scope of the GDPR under Article 2. It held that it was clear from the request for a preliminary ruling that the personal data sought from the District Court of South Savo was contained in “a court’s register of persons”, which appeared to be a filing system within the meaning of Article 4(6), and therefore fell within the scope of the GDPR. 

UK Data Controllers should note that the wording of Articles 2 and 4 of the UK GDPR is the same as in the EU GDPR. So, whilst this ruling from the CJEU is not binding on UK courts, it would be wise to assume that picking up the phone and making an oral disclosure of personal data will not allow the UK GDPR to be circumvented.

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 


Transparency in Health and Social Care: New ICO Guidance 

Within the health and social care sector, new technologies that use large amounts of personal data are being used to support both direct care and secondary purposes, such as planning and research. An example is the use of AI to provide automated diagnoses based on medical imaging data from patients. 

Transparency is a key principle of UK Data Protection legislation. Compliance with the first data protection principle and Articles 13 and 14 of the UK GDPR ensures that data subjects are aware of how their personal data is used, allowing them to make informed choices about who they disclose their data to and how to exercise their data rights. 

On Monday the Information Commissioner’s Office (ICO) published new guidance to assist health and social care organisations in complying with their transparency obligations under the UK GDPR. It supplements existing ICO guidance on the principle of transparency and the right to be informed.

The guidance is aimed at all organisations, including those from the private and third sectors, that deliver health and social care services or process health and social care information. This includes local authorities, suppliers to the health and social care sector, universities using health information for research purposes, and others (e.g. fire service, police and education) that use health information for their own purposes. The guidance will help them to understand the definition of transparency and assess appropriate levels of transparency, as well as providing practical steps for developing effective transparency information. 

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.   

DP Bill: Updated Keeling Schedules Published 

The Data Protection and Digital Information Bill is currently at Committee stage in the House of Lords. If passed, it will make changes to UK data protection legislation including the UK GDPR.

The Government recently published updated versions of the Keeling Schedules showing potential changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

Whilst no doubt there will be further amendments, the schedules are worth studying for a clear picture of the impact of the Bill. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. 

Kate Middleton’s Medical Records: Can anyone go to jail?

Kate Middleton seems to be at the centre of much media (and social media) attention at present. In addition to speculation about her health and whereabouts, there has been much focus on, and analysis of, the now famous photoshopped Mother’s Day photo.

This week it was reported that employees at the private London Clinic in Marylebone, where Kate was a patient following abdominal surgery in January, attempted to view her medical records. Reportedly three employees have now been suspended.

The Health Minister, Maria Caulfield, told Sky News it was “pretty severe and serious stuff to be accessing notes that you don’t have permission to”. She also said police had been “asked to look at” whether staff at the clinic attempted to access the princess’s private medical records. 

If the reports are true and individuals are proven to have been “snooping”, what are the consequences? Firstly, this would normally be a matter for the Information Commissioner’s Office (ICO) to investigate rather than the police. Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly: 

(a) obtain or disclose personal data without the consent of the controller, 

(b) procure the disclosure of personal data to another person without the consent of the controller, or 

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained. 

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998 which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. In June 2023, the ICO disclosed that since 1st June 2018, 92 cases involving Section 170 offences were investigated by its Criminal Investigations Team.   

Consequences 

Section 170 is only punishable by way of a fine; perpetrators cannot be sent to prison. Although there is now no cap on the maximum fine, prosecutions have resulted in relatively low fines compared to the harm caused.  

Two recent prosecutions have involved employees accessing medical records.
In November 2023, Loretta Alborghetti was fined for illegally accessing the medical records of over 150 people. The offence took place whilst she worked as a medical secretary at Worcestershire Acute Hospitals NHS Trust. She was ordered to pay a total of £648. 

In August 2022, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust, pleaded guilty to accessing medical records of patients without a valid legal reason. An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off from going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000. 

Computer Misuse Act  

A Section 170 prosecution would have a much greater deterrent effect if the sanctions included a custodial sentence. Successive Information Commissioners have argued for this but to no avail.  

The relatively low fines have led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990, which carries tougher sentences including a maximum of 2 years’ imprisonment on indictment. In July 2022, a woman who worked for Cheshire Police pleaded guilty to using police data systems to check up on ex-partners, and in August the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of people’s personal information from vehicle repair garages to generate potential leads for personal injury claims. 

The ICO has now confirmed that it has received a personal data breach report from The London Clinic. If its investigation finds the Clinic did not comply with its security obligations under Article 5(1)(f) and Article 32 of the UK GDPR, it faces a possible maximum Monetary Penalty Notice of £17.5m or 4% of gross annual turnover (whichever is higher). This case highlights the importance of organisations ensuring adequate security measures around sensitive personal data, especially where the data relates to high profile individuals.  
 
This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next GDPR Practitioner Certificate. 

Conservative Party Challenged Over “Data Harvesting” 

In the run up to the General Election this year, political parties in the UK will face the challenge of effectively communicating their message to voters whilst at the same time respecting voters’ privacy. In the past few years, all parties have been accused of riding roughshod over data protection laws in their attempts to convince voters that they ‘have a plan’ or that ‘the country needs change’.  

In May 2017, the Information Commissioner’s Office (ICO) announced that it was launching a formal investigation into the use of data analytics for political purposes after allegations were made about the ‘invisible processing’ of people’s personal data and the micro-targeting of political adverts during the EU Referendum.
This culminated in a report to Parliament and enforcement action against Facebook, Emma’s Diary and some of the companies involved in the Vote Leave Campaign.  

In July 2018 the ICO published a report, Democracy Disrupted, which highlighted significant concerns about transparency around how people’s data was being used in political campaigning. The report revealed a complex ecosystem of digital campaigning with many actors. In 2019, the ICO issued assessment notices to seven political parties. It concluded: 

“The audits found some considerable areas for improvement in both transparency and lawfulness and we recommended several specific actions to bring the parties’ processing in compliance with data protection laws. In addition, we recommended that the parties implemented several appropriate technical and organisational measures to meet the requirements of accountability. Overall there was a limited level of assurance that processes and procedures were in place and were delivering data protection compliance.” 

In June 2021, the Conservative Party was fined £10,000 for sending marketing emails to 51 people who did not want to receive them. The messages were sent in the name of Boris Johnson in the eight days after he became Prime Minister in breach of the Privacy and Electronic Communications Regulations 2003 (PECR).  

The Tax Calculator 

The Good Law Project (GLP), a not-for-profit campaign organisation, is now challenging one aspect of the Conservative Party’s data collection practices. The party’s website contains an online tool which allows an individual to calculate the effect on them of recent changes to National Insurance contributions. However, GLP claims this tool is “a simple data-harvesting exercise” which breaches UK data protection laws in a number of ways. It says that a visit to the website automatically leads to the placement of non-essential cookies (related to marketing, analysis and browser tracking) on the visitor’s machine without consent. This is a breach of Regulation 6 of PECR.
GLP also challenges the gathering and use of website visitors’ personal data on the site, claiming that (amongst other things) it is neither fair, lawful nor transparent and thus a breach of the UK GDPR. 
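To illustrate what Regulation 6 requires in practice, here is a minimal Python/Flask sketch of consent-gated cookie placement (the cookie names and route are hypothetical and have nothing to do with the party’s actual site): strictly necessary cookies can be set on a first visit, but non-essential analytics or marketing cookies are only set once the visitor has positively indicated consent, for example via a consent banner.

# Minimal sketch: consent-gated cookies (hypothetical names, for illustration only)
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/tax-calculator")
def tax_calculator():
    resp = make_response("<h1>Tax calculator</h1>")
    # Strictly necessary cookies (e.g. a session cookie) do not need prior consent.
    resp.set_cookie("session_id", "example-session", httponly=True, secure=True)
    # Non-essential cookies (marketing, analytics, tracking) need prior consent under
    # Regulation 6 of PECR; here consent is assumed to be recorded by a banner in a
    # hypothetical "cookie_consent" cookie.
    if request.cookies.get("cookie_consent") == "accepted":
        resp.set_cookie("analytics_id", "example-visitor", secure=True)
    return resp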

Director of GLP, Jo Maugham, has taken the first formal step in legal proceedings against the Conservative Party. The full proposed claim is set out in the GLP’s Letter Before Action. The Conservative Party has issued a response arguing that they have acted lawfully and that: 

  • They did obtain consent for the placement of cookies. (GLP disagrees and has now made a 15-page complaint to the ICO.) 
  • They have agreed to change their privacy notice. (GLP is considering whether to ask the court to make a declaration of illegality, claiming that the Tories “have stated publicly that it was lawful while tacitly admitting in private that it is not.”) 
  • They have agreed to the request by GLP to stop processing Jo Maugham’s personal data where that processing reveals his political opinions.  

Following a subject access request, Mr Maugham received 1,384 pages of personal data held about him. GLP claim he is being profiled and believe that such profiling is unlawful. They have instructed barristers with a view to taking legal action.

This is one to watch. If the legal action goes ahead, the result will have implications for other political parties. In any event, in election year, we are already seeing that all political parties’ data handling practices are going to be under the spotlight.

George Galloway

George Galloway’s landslide win in the Rochdale by-election last week has led to scrutiny of his party’s processing of Muslim voters’ data. In his blog post, Jon Baines discusses whether the Workers Party of Britain (led by Mr Galloway) has been processing Special Category Data in breach of the UK GDPR. In the run up to the by-election, the party had sent different letters to constituents based, it seems, on their religion (or perhaps inferring their religion from their names). If this is what it did then, even if the inference is wrong, the party has been processing Special Category Data, which requires a lawful basis under Article 9 of the UK GDPR.

In 2022, the ICO issued a fine of £1,350,000 to Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical conditions and then, without their consent, targeting them with health-related products. Following an appeal by Easylife, the ICO later reduced the fine to £250,000, but the legal basis of the decision still stands. Will the ICO investigate George Galloway?

The DP Bill

The Data Protection and Digital Information Bill is currently at Committee stage in the House of Lords. It will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). Some of the changes will make it easier for political parties to use the personal data of voters and potential voters without the usual GDPR safeguards.
For example, political parties could, in the future, rely on “legitimate interests” (as an Article 6 lawful basis) to process personal data without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These include personal data being processed for the purpose of “democratic engagement”. The Bill will also amend PECR so that political parties will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest.

As the General Election approaches, and with trust in politics and politicians at a low, all parties need to ensure that they are open, transparent and accountable about how they use voters’ data.  

Our workshop, How to do Marketing in Compliance with GDPR and PECR, is suitable for those advising political parties and any organisation which uses personal data to reach out to potential customers and service users. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.

Act Now Nominated for IRMS Supplier of the Year Award 

Act Now Training is pleased to announce that it has been nominated once again for the 2024 Information and Records Management Society (IRMS) awards. 

Each year the IRMS recognises excellence in the field of information management with their prestigious Industry Awards. These highly sought-after awards are presented at a glittering ceremony at the annual Conference following the Gala Dinner.  

Act Now has been nominated for the Supplier of the Year award which it won in 2021 and 2022. 

Voting is open to IRMS members until Friday 15th March 2024. 

You can vote for Act Now here: https://irms.org.uk/news/666165/Vote-now-for-the-IRMS-Awards-2024.htm  

Thank you for your support! 

The MoD GDPR Fine: The Dangers of Email 

Inadvertent disclosure of personal data on email systems has been the subject of a number of GDPR enforcement actions by the Information Commissioner’s Office (ICO) in the past few years. In 2021, the transgender charity Mermaids was fined £25,000 for failing to keep the personal data of its users secure. The ICO found that Mermaids failed to implement an appropriate level of security to its internal email systems, which resulted in documents or emails containing personal data being searchable and viewable online by third parties through internet search engine results. 

Failure to use blind carbon copy (BCC) correctly in emails is one of the top data breaches reported to the ICO every year. Last year the Patient and Client Council (PCC) and the Executive Office were the subject of ICO reprimands for disclosing personal data in this way. In October 2021, HIV Scotland was issued with a £10,000 GDPR fine when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.  

The latest GDPR fine was issued in December 2023, although the Monetary Penalty Notice has only just been published on the ICO website. The ICO has fined the Ministry of Defence (MoD) £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. 

On 20th September 2021, the MoD sent an email to a distribution list of Afghan nationals eligible for evacuation using the ‘To’ field, with personal information relating to 245 people being inadvertently disclosed. The email addresses could be seen by all recipients, with 55 people having thumbnail pictures on their email profiles.
Two people ‘replied all’ to the entire list of recipients, with one of them providing their location. 

The original email was sent by the team in charge of the UK’s Afghan Relocations and Assistance Policy (ARAP), which is responsible for assisting the relocation of Afghan citizens who worked for or with the UK Government in Afghanistan.
The data disclosed, should it have fallen into the hands of the Taliban, could have resulted in a threat to life. 

Under the UK GDPR, organisations must have appropriate technical and organisational measures in place to avoid disclosing people’s information inappropriately. ICO guidance makes it clear that organisations should use bulk email services, mail merge, or secure data transfer services when sending any sensitive personal information electronically. The ARAP team did not have such measures in place at the time of the incident and was relying on ‘blind carbon copy’ (BCC), which carries a significant risk of human error. 
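As a rough illustration of the “mail merge” style approach the ICO points to, the Python sketch below (using placeholder addresses and SMTP details) sends each recipient a separate message, so no address ever appears in a shared ‘To’ field and a mistyped BCC cannot expose the whole list.

# Illustrative sketch only: one message per recipient, so addresses are never shared.
import smtplib
from email.message import EmailMessage

recipients = ["applicant1@example.com", "applicant2@example.com"]  # placeholder addresses

with smtplib.SMTP("smtp.example.org", 587) as smtp:  # placeholder SMTP server
    smtp.starttls()
    smtp.login("sender@example.org", "app-password")  # placeholder credentials
    for address in recipients:
        msg = EmailMessage()
        msg["From"] = "sender@example.org"
        msg["To"] = address  # a single recipient per message
        msg["Subject"] = "Update on your application"
        msg.set_content("Dear applicant, ...")
        smtp.send_message(msg)  # other recipients never see this address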

The ICO, taking into consideration the representations from the MoD, reduced the fine from a starting amount of £1,000,000 to £700,000 to reflect the action the MoD took following the incidents and recognising the significant challenges the ARAP team faced. Under the ICO’s public sector approach, the fine was further reduced to £350,000.  

Organisations must have appropriate policies and training in place to minimise the risks of personal data being inappropriately disclosed via email. To avoid similar incidents, the ICO recommends that organisations should: 

  1. Consider using other secure means to send communications that involve large amounts of data or sensitive information. This could include using bulk email services, mail merge, or secure data transfer services, so information is not shared with people by mistake.  
  2. Consider having appropriate policies in place and training for staff in relation to email communications.  
  3. For non-sensitive communications, organisations that choose to use BCC should do so carefully to ensure personal email addresses are not shared inappropriately with other customers, clients, or other organisations. 

More on email best practice can be found in the ICO’s email and security guidance.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Facial Recognition to Monitor Attendance: ICO Takes Action

Employers have always had a keen interest in monitoring and tracking employees.
In 2017, we addressed this topic in a blog post focusing on the GDPR implications of employee monitoring using GPS trackers and similar devices. Recent advances in surveillance technology, particularly facial recognition, have not only streamlined employee monitoring but have also rendered it more cost-effective and, concurrently, more intrusive. A good example is this video of a coffee shop using facial recognition technology (FRT) and AI to monitor employee productivity. 

In 2022, the TUC warned that employee surveillance technology and AI risk “spiralling out of control” without stronger regulation to protect employees. It warned that, left unchecked, these technologies could lead to widespread discrimination, work intensification and unfair treatment. Earlier this year the French Data Protection Regulator, CNIL, fined Amazon €32m (£27m) under the GDPR for “excessive” surveillance of its workers. The CNIL said Amazon France Logistique, which manages warehouses, recorded data captured by workers’ handheld scanners. It found Amazon tracked activity so precisely that it led to workers having to potentially justify every break. 

Employee surveillance is now primarily regulated in the UK by the UK GDPR.
As with all activities involving the processing of personal data, the surveillance must be fair, lawful and transparent. The Human Rights Act and the Regulation of Investigatory Powers Act may also apply (see our earlier blog post for more detail on these laws).  

On 23rd February 2024, the Information Commissioner’s Office (ICO) issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance. The ICO’s investigation found that Serco Leisure and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities for the purpose of attendance checks and subsequent payment for their time.  

Serco Leisure will not be appealing against the notices; a wise decision! As readers will know, they had to have a lawful basis for processing employees’ data under Article 6 of the UK GDPR, as well as a condition under Article 9, as they were processing Special Category Data (Biometric Data). Consent was not an option due to the imbalance of power between employer and employee. In the words of the Commissioner:  

“Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy. There is no clear way for staff to opt out of the system, increasing the power imbalance in the workplace and putting people in a position where they feel like they have to hand over their biometric data to work there.” 

Serco tried to rely on Article 6(1)(b) and Article 6(1)(f) as lawful bases for processing the employees’ personal data. In relation to Article 6(1)(b) (contractual necessity) it argued that the processing of attendance data was necessary to ensure employees are paid correctly for the time they have worked. The ICO ruled that although recording attendance times may be necessary for Serco to fulfil its obligations under employment contracts, it does not follow that the processing of biometric data is necessary to achieve this purpose, especially when less intrusive means could be used to verify attendance. These included radio-frequency identification cards or fobs, or manual sign-in and sign-out sheets. Serco had failed to demonstrate why these less intrusive methods were not appropriate. They did assert that these methods are open to abuse, but did not provide evidence of widespread abuse, nor explain why other measures, such as disciplinary action against employees found to be abusing the system, had not been considered appropriate.  

Regarding Serco’s reliance on Article 6(1)(f) (legitimate interests), the ICO said that it will not apply if a controller can reasonably achieve the same result in another less intrusive way. As discussed above, Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system is a necessity, rather than simply a further benefit to Serco. The ICO also said: 

“In applying the balancing test required to rely on legitimate interests, Serco has failed to give appropriate weight to the intrusive nature of biometric processing or the risks to data subjects.” 

In relation to Article 9, the ICO said that Serco had again failed to demonstrate that the processing of biometric data is “necessary” for Serco to process Special Category Data for the purpose of employment attendance checks or to comply with the relevant laws identified by Serco in their submissions.  

The Enforcement Notices not only instruct Serco Leisure and the trusts to stop all processing of biometric data for monitoring employees’ attendance at work, but also require them to destroy all biometric data that they are not legally obliged to retain. This must be done within three months of the notices being issued. 

This enforcement action coincided with the ICO publishing new guidance for all organisations that are considering using people’s biometric data. The guidance outlines how organisations can comply with data protection law when using biometric data to identify people. Last year, the ICO also published guidance on monitoring employees and called on organisations to consider both their legal obligations and their employees’ rights to privacy before they implement any monitoring. 

This is the first time the ICO has taken enforcement action against an employer to stop it processing the biometric data of staff. It will serve as a warning to organisations that adopt biometric technology simply because it is cheap and easy to use, without considering the legal implications.  

Our CCTV Workshop will also examine the use of facial recognition technology. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

AI Regulation and the EU AI Act  

2024 is going to be the year of AI regulation. As the impact of AI increases in our daily lives, governments and regulatory bodies globally are grappling with the need to establish clear guidelines and standards for its responsible use. 

ChatGPT 

Ask people about AI and many will talk about AI-powered chatbots like ChatGPT and Gemini, Google’s replacement for Bard. The former currently has around 180.5 million users, who generated 1.6 billion visits in December 2023. However, with great popularity comes increased scrutiny, as well as privacy and regulatory challenges. 

In March 2023, Italy became the first Western country to block ChatGPT when its data protection regulator (Garante Per La Protezione Dei Dati Personali) cited privacy concerns. Garante’s communication to OpenAI, owner of ChatGPT, highlighted the lack of a suitable legal basis for the collection and processing of personal data for the purpose of training the algorithms underlying ChatGPT, the potential to produce inaccurate information about individuals, and child safety concerns. In total, Garante said that it suspected ChatGPT to be breaching Articles 5, 6, 8, 13 and 25 of the EU GDPR. 

ChatGPT was made accessible in Italy four weeks after the above decision, but Garante launched a “fact-finding activity” at the time. This culminated in a statement on 31st January 2024, in which it said it “concluded that the available evidence pointed to the existence of breaches of the provisions contained in the EU GDPR [General Data Protection Regulation]”. The cited breaches are essentially the same as the provisional findings discussed above, focusing on the mass collection of users’ data for training purposes and the risk that younger users may be exposed to inappropriate content. OpenAI has 30 days to respond with a defence. 

EU AI Act 

Of course there is more to AI than ChatGPT, and some would say much more beneficial use cases. Examples include the ability to match drugs to patients, numerous stories of major cancer research breakthroughs, as well as the ability for robots to perform major surgery. But there are downsides too, including bias, lack of transparency, and failure to take account of the ethical implications. 

On 2nd February 2024, EU member states unanimously reached an agreement on the text of the harmonised rules on artificial intelligence, the so-called “Artificial Intelligence Act” (AI Act). The final draft of the Act will be adopted by the European Parliament in a plenary vote in April and will come into force in 2025, with a two-year transition period.  

The main provisions of the Act can be read here. They do not differ much from the previous draft, which we discussed on our previous blog here. In summary, the AI Act sets out comprehensive rules for AI applications, including a risk-based system to address potential threats to health and safety, and human rights. The Act will ban some AI applications which pose an “unacceptable risk” (e.g. real-time remote biometric identification systems, like facial recognition) and impose strict obligations on others considered “high risk” (e.g. AI in EU-regulated product safety categories such as cars and medical devices). These obligations include adherence to data governance standards, transparency rules, and the incorporation of human oversight mechanisms.  

Despite Brexit, UK businesses and entities engaged in AI-related activities will still be affected by the Act if they intend to operate within the EU market. The Act will have an extra-territorial reach, just like the EU GDPR.  

UK Response 

The UK Government’s own decisions on how to regulate AI will be influenced by the EU’s approach. An AI White Paper, entitled “A pro-innovation approach to AI regulation”, was published in March last year. The paper sets out the UK’s preference not to place AI regulation on a statutory footing but to make use of “regulators’ domain-specific expertise to tailor the implementation of the principles to the specific context in which AI is used.”  

The government’s long-awaited follow-up to the AI White Paper was published last week. 

Key takeaways are: 

  • The government’s proposals for regulating AI still revolve around empowering existing regulators to create tailored, context-specific rules that suit the ways the technology is being used in the sectors they scrutinise, i.e. no legislation yet (regulators have been given until 30th April 2024 to publish their AI plans). 
     
  • The government generally reaffirmed its commitment to the whitepaper’s proposals, claiming this approach to regulation will ensure the UK remains more agile than “competitor nations” while also putting it on course to be a leader in safe, responsible AI innovation. 
     
  • It will though consider creating “targeted binding requirements” for select companies developing highly capable AI systems. 
     
  • It also committed to conducting regular reviews of potential regulatory gaps on an ongoing basis: “We remain committed to the iterative approach set out in the whitepaper, anticipating that our framework will need to evolve as new risks or regulatory gaps emerge.” 
     

According to Michelle Donelan, Secretary of State for Science, Innovation and Technology, the UK’s approach to AI regulation has already made the country a world leader in both AI safety and AI development. 
 

“AI is moving fast, but we have shown that humans can move just as fast,” she said. “By taking an agile, sector-specific approach, we have begun to grip the risks immediately, which in turn is paving the way for the UK to become one of the first countries in the world to reap the benefits of AI safely.” 

Practical Steps 

Last year, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants are subjected to any harms or financial detriment as a result of the use of algorithms. It did though emphasise a number of practical steps that local authorities and central government can take when using AI: 

  • Take a data protection by design and default approach 
  • Be transparent with people about how you are using their data by regularly reviewing privacy policies
  • Identify the potential risks to people’s privacy by conducting a Data Protection Impact Assessment

In January 2024, the ICO launched a consultation series on Generative AI, examining how aspects of data protection law should apply to the development and use of the technology. It is expected to issue more AI guidance later in 2024. 

Join our Artificial Intelligence and Machine Learning, How to Implement Good Information Governance workshop for hands-on insights, key resource awareness, and best practices, ensuring you’re ready to navigate AI complexities fairly and lawfully.

The Data Protection and Digital Information Bill: Where are we now? 

The Data Protection and Digital Information Bill is currently at Committee stage in the House of Lords. It will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). It is expected to be passed in May and will probably come into force after a short transitional period.  

The current Bill is not substantially different to the previous version, whose passage through Parliament was paused in September 2022 so ministers could engage in “a co-design process with business leaders and data experts” and move away from the “one-size-fits-all” approach of the European Union’s GDPR.  

The Same 

Many of the proposals in the new Bill are the same as contained in the previous Bill. These include: 

  • Amended Definition of Personal Data: This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world.

  • Vexatious Data Subject Requests: The terms “manifestly unfounded” or “excessive” requests, in Article 12 of the UK GDPR, will be replaced with “vexatious” or “excessive” requests. Explanation and examples of such requests will also be included. 

  • Data Subject Complaints: Data Controllers will be required to acknowledge receipt of Data Subject complaints within 30 days and respond substantively “without undue delay”. The ICO will be entitled not to accept a complaint if a Data Subject has not made a complaint to the controller first. 

  • Data Protection Officer: The obligation for some controllers and processors to appoint a Data Protection Officer (DPO) will be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals will be required to designate a senior manager as a “Senior Responsible Individual”.  

  • Data Protection Impact Assessments: These will be replaced by leaner and less prescriptive “Assessments of High-Risk Processing.”  

  • International Transfers: There will be a new approach to the test for adequacy applied by the UK Government to countries (and international organisations) and when Data Controllers are carrying out a Transfer Impact Assessment or TIA. The threshold for this new “data protection test” will be whether a jurisdiction offers protection that is “not materially lower” than under the UK GDPR. (For more detail see also our forthcoming International Transfers webinar). 
  • The Information Commission: The Information Commissioner’s Office will transform into the Information Commission; a corporate body with a chief executive. 

  • PECR: Cookies will be allowed to be used without consent for the purposes of web analytics and to install automatic software updates. Furthermore, non-commercial organisations (e.g. charities and political parties) will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest. Finally, there will be an increase in the fines from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher).  

The Changes 

The main changes are summarised below: 

  • Scientific Research: The definition of scientific research is amended so that it now includes research for the purposes of commercial activity. This expands the circumstances in which processing for research purposes may be undertaken, providing a broader consent mechanism and exemption to the fair processing requirement. 
  • Legitimate Interests: The Previous Bill proposed that businesses could rely on legitimate interests (Article 6 lawful basis) without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These “recognised” legitimate interests cover purposes for processing such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement.  The new Bill, whilst keeping the above changes, introduces a non-exhaustive list of cases where organisations may rely on the “legitimate interests” legal basis, including for the purposes of direct marketing, transferring data within the organisation for administrative purposes and for the purposes of ensuring the security of network and information systems; although a balancing exercise still needs to be conducted in these cases.  
  • Automated Decision Making: The Previous Bill clarified that its proposed restrictions on automated decision-making under Article 22 UK GDPR should only apply to decisions that are a result of automated processing without “meaningful human involvement”. The new Bill states that profiling will be a relevant factor in the assessment as to whether there has been meaningful human involvement in a decision.  
  • Records of Processing Activities (ROPA): The Previous Bill streamlined the required content of ROPAs. The new Bill exempts all controllers and processors from the duty to maintain a ROPA unless they are carrying out high risk processing activities.  
  • Subject Access: Clause 12 of the Bill, introduced at the House of Commons Report Stage, amends Article 12 of the UK GDPR (and the DPA 2018) so that Data Controllers are only obliged to undertake a reasonable and proportionate search for information requested under the right of access.  

Adequacy 

Although the Government states that the new Bill is “a new system of data protection”, it still retains the UK GDPR’s structure and fundamental obligations. Organisations that are already compliant with the UK GDPR will not be required to make any major changes to their systems and processes.  

The EU conducts a review of adequacy with the UK every four years; the next adequacy decision is due on 27th June 2025. Some commentators have suggested that the changes may jeopardise the UK’s adequate status and so impact the free flow of data between the UK and EU. Defend Digital Me, a civil liberties organisation, has claimed that the Bill would, among other things, weaken data subjects’ rights, water down accountability requirements, and reduce the independence of the ICO.  

Other Parts of the Bill 

The Bill would also: 

  • establish a framework for the provision of digital verification services to enable digital identities to be used with the same confidence as paper documents. 
     
  • increase fines for nuisance calls and texts under PECR. 

  • update the PECR to cut down on ‘user consent’ pop-ups and banners. 

  • allow for the sharing of customer data, through smart data schemes, to provide services such as personalised market comparisons and account management. 
  • reform the way births and deaths are registered in England and Wales, enabling the move from a paper-based system to registration in an electronic register.
  • facilitate the flow and use of personal data for law enforcement and national security purposes. 

  • create a clearer legal basis for political parties and elected representatives to process personal data for the purposes of democratic engagement. 

Reading the Parliamentary debates on the Bill, it seems that the Labour party has no great desire to table substantial amendments to the Bill. Consequently, it is expected that the Bill will be passed in a form similar to the one now published.  

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. Dive into the issues discussed in this blog and secure your spot now.