YMCA Fined for HIV Email Data Breach 

Another day and another ICO fine for a data breach involving email! The Central Young Men’s Christian Association (the Central YMCA) of London has been issued with a Monetary Penalty Notice of £7,500 after emails intended for people on an HIV support programme were sent to 264 email addresses using CC instead of BCC, revealing the addresses to all recipients. This resulted in 166 people being identifiable or potentially identifiable. A formal reprimand has also been issued.

Failure to use blind carbon copy (BCC) correctly in emails is one of the top data breaches reported to the ICO every year. In December 2023, the ICO fined the Ministry of Defence (MoD) £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. Again, the failure to use blind carbon copy when sending email was a central cause of the data breach. 

Last year the Patient and Client Council (PCC) and the Executive Office were the subject of ICO reprimands for disclosing personal data in this way. In October 2021, HIV Scotland was issued with a £10,000 GDPR fine when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.  

Organisations must have appropriate policies and training in place to minimise the risks of personal data being inappropriately disclosed via email. To avoid similar incidents, the ICO recommends that organisations should: 

  1. Consider using other secure means to send communications that involve large amounts of data or sensitive information. This could include using bulk email services, mail merge, or secure data transfer services, so information is not shared with people by mistake (a simple mail-merge style sketch follows this list).  
  2. Consider having appropriate policies in place and training for staff in relation to email communications.  
  3. For non-sensitive communications, organisations that choose to use BCC should do so carefully to ensure personal email addresses are not shared inappropriately with other customers, clients, or other organisations. 
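
As a rough illustration of the first recommendation, here is a minimal mail-merge style sketch in Python. It is not taken from the ICO guidance; the SMTP host, sender address and recipient list are hypothetical placeholders. The point is simply that each person receives their own individual message, so there is no CC or BCC field that could expose other recipients’ addresses.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical placeholder values for illustration only.
SMTP_HOST = "smtp.example.org"
SENDER = "support@example.org"

recipients = [
    ("Alex", "alex@example.com"),
    ("Sam", "sam@example.com"),
]

def send_individual_emails(recipients):
    """Send one message per recipient so no one can see anyone else's address."""
    with smtplib.SMTP(SMTP_HOST) as smtp:
        for name, address in recipients:
            msg = EmailMessage()
            msg["From"] = SENDER
            msg["To"] = address  # a single recipient; no CC or BCC field used at all
            msg["Subject"] = "Programme update"
            msg.set_content(f"Dear {name},\n\nPlease find this week's update below.")
            smtp.send_message(msg)

if __name__ == "__main__":
    send_individual_emails(recipients)
```

A bulk email or secure transfer service will handle this separation (and delivery failures) for you, but the underlying principle is the same: individual addressing removes the opportunity for the CC/BCC mix-ups described above.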

More on email best practice can be found in the ICO’s email and security guidance.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Apprentice Case Study – Meet Evie

In 2022, Act Now Training teamed up with Damar Training to support their delivery of the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme, with the first group due to undertake their end-point assessment. We caught up with Manchester Airport Group apprentice Evie Scott and her manager to get their thoughts on the programme so far.

Evie left college in summer 2022 after A Levels and a BTEC. She wanted to continue learning but in a more hands-on environment:

“In my final year of college, my tutor helped me create a LinkedIn account and I found the Data Protection and Information Governance Practitioner apprenticeship opportunity at Manchester Airport Group. Having previously visited the airport on a school trip I found the range of jobs there fascinating, so I started looking into their apprenticeship opportunities and how they could benefit my career.”

Evie applied successfully for the role of apprentice Data Protection and Information Governance Practitioner at Manchester Airport Group (MAG). Over a year into her job, she is finding the programme engaging and is developing new skills and perspectives that she can apply at work:

“I really enjoy the fact that the apprenticeship programme is challenging yet engaging. I enjoy the further reading aspect as it allows me to gain a greater insight into topics and offers different viewpoints and perspectives which I try adopting into my work.”

Charlotte Lewendon-Jones, Head of Data Protection and Privacy at MAG, has over 30 years’ experience in Information Governance. She was part of the trailblazer group of employers that helped develop the Data Protection and Information Governance Practitioner apprenticeship.

Charlotte manages the Data Protection and Compliance Team at MAG. When MAG advertised its data protection apprenticeship opportunities in summer 2022, it was overwhelmed by the level of interest. This was testament, Charlotte believes, to the quality of the apprenticeship itself and to MAG’s commitment to its wider apprenticeship programme. On the impact of the apprentices so far, she comments:

“The apprentices are confident and bring a fresh viewpoint to the team which brings huge improvements. When the apprentices go on training sessions, I challenge them on some of our processes to see what they have learnt, find ways in which we can do better and support their learning journey.”

About Evie, Charlotte adds:

“Considering Evie didn’t have any experience in data protection and information governance, I feel she’s done really well. Her training started in September 2022 and I’ve seen her confidence grow. Her approach and attitude to work are excellent, she’s gaining great experience, asking fewer questions and making more informed decisions based on her experience and what she’s learnt.”

Finally, we asked Evie how she feels the apprenticeship will impact her moving forward:

“When I apply what I have learnt so far to my workload or tasks I have an appreciation for why things are done in a certain way. I feel the further I get into my apprenticeship, the more it will continue to influence my everyday tasks, benefit the organisation and help me in my job role.”

“At Damar, we believe in the power of apprenticeships to benefit business and transform lives. We see it every day across the thousands of supportive employers, apprentices and workplace supervisors that we are proud to partner with.”

You can read about the experience of another apprentice (Natasha) here.

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

Oral Disclosure of Personal Data: To GDPR or not to GDPR? 

Here’s a pub quiz question for you: “Can a Data Controller circumvent the requirements of data protection law by disclosing personal data verbally rather than in writing?” The answer was “Yes” under the old Data Protection Act 1998. In Scott v LGBT Foundation Ltd [2020] WLR 62, the High Court rejected a claim that the LGBT Foundation had breached, amongst other things, the claimant’s data protection rights by disclosing information about him to a GP. The court held that the 1998 Act did not apply to purely verbal communications.  

Nowadays though, the answer to the above question is no; the oral disclosure of personal data amounts to “processing” as defined by Article 4(2) of the GDPR. So said the Court of Justice of the European Union (CJEU), on 7th March 2024, in a preliminary ruling in the Endemol Shine Finland case.

The subject of the ruling is a television company which makes a number of reality TV shows in Finland. It had been organising a competition and asked the District Court of South Savo for information about possible criminal proceedings involving one of the competition participants. It requested the District Court to disclose the information orally rather than in writing. The District Court refused the request on the basis that there was no legitimate reason for processing the criminal offence data under Finnish law implementing Article 10 of the GDPR. On appeal, Endemol Shine Finland argued that the GDPR did not apply, as the oral disclosure of the information would not constitute processing of personal data under the GDPR. 

Article 4(2) GDPR defines “processing” as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means”. On the face of it, this covers oral processing. However, Article 2 states that GDPR applies to processing of personal data “wholly or partly by automated means”, and processing by non-automated means which “forms or is intended to form part of a filing system.” Article 4(6) GDPR defines “filing system” broadly, covering “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis”. 

The Finnish Court of Appeal requested a preliminary ruling from the CJEU on the meaning of Article 4(2) and whether the particular processing in this case came within the material scope of the GDPR under Article 2. The CJEU held that the concept of processing in Article 4(2) of the GDPR necessarily covered the oral disclosure of personal data. It said the wording of the Article made it apparent that the EU legislature intended to give the concept of processing a broad scope. The court pointed out that the GDPR’s objective was “to ensure a high level of protection of the fundamental rights and freedoms of natural persons” and that “circumventing the application of that regulation by disclosing personal data orally rather than in writing would be manifestly incompatible with that objective”. 

The CJEU went on to consider whether the oral processing of the data would fall within the material scope of the GDPR under Article 2. It held that it was clear from the request for a preliminary ruling that the personal data sought from the District Court of South Savo is contained in “a court’s register of persons” which appeared to be a filing system within the meaning of Article 4(6), and therefore fell within the scope of the GDPR. 

UK Data Controllers should note that the wording of Article 4 and Article 2 of the UK GDPR is the same as in the EU GDPR. So whilst this ruling from the CJEU is not binding on UK courts, it would be wise to assume that picking up the phone and making an oral disclosure of personal data will not allow the UK GDPR to be circumvented.   

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 


Transparency in Health and Social Care: New ICO Guidance 

Within the health and social care sector, new technologies that use large amounts of personal data are being used to support both direct care and secondary purposes, such as planning and research. An example is the use of AI to provide automated diagnoses based on medical imaging data from patients. 

Transparency is a key principle of UK Data Protection legislation. Compliance with the first data protection principle and Articles 13 and 14 of the UK GDPR ensures that data subjects are aware of how their personal data is used, allowing them to make informed choices about who they disclose their data to and how to exercise their data rights. 

On Monday the Information Commissioner’s Office (ICO) published new guidance to assist health and social care organisations in complying with their transparency obligations under the UK GDPR. It supplements existing ICO guidance on the principle of transparency and the right to be informed.

The guidance is aimed at all organisations, including those from the private and third sectors, that deliver health and social care services or process health and social care information. This includes local authorities, suppliers to the health and social care sector, universities using health information for research purposes and others (e.g. fire services, police and education) that use health information for their own purposes. The guidance will help them to understand the definition of transparency and assess appropriate levels of transparency, as well as providing practical steps for developing effective transparency information. 

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.   

DP Bill: Updated Keeling Schedules Published 

The Data Protection and Digital Information Bill is currently at the Committee stage in the House of Lords. If passed, it will make changes to UK data protection legislation, including the UK GDPR.

The Government recently published updated versions of Keeling Schedules showing potential changes to the UK GDPR, the Data Protection Act 2018 and Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

Whilst no doubt there will be further amendments, the schedules are worth studying for a clear picture as to the impact of the Bill. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. 

Kate Middleton’s Medical Records: Can anyone go to jail?

Kate Middleton seems to be at the centre of much media (and social media) attention at present. In addition to speculation about her health and whereabouts, there has been much focus on, and analysis of, the now famous photoshopped Mother’s Day photo.

This week it was reported that employees at the private London Clinic in Marylebone, where Kate was a patient following abdominal surgery in January, attempted to view her medical records. Reportedly three employees have now been suspended.

The Health Minister, Maria Caulfield, told Sky News it was “pretty severe and serious stuff to be accessing notes that you don’t have permission to”. She also said police had been “asked to look at” whether staff at the clinic attempted to access the princess’s private medical records. 

If the reports are true and individuals are proven to have been “snooping”, what are the consequences? Firstly, this would normally be a matter for the Information Commissioner’s Office (ICO) to investigate rather than the police. Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly: 

(a) obtain or disclose personal data without the consent of the controller, 

(b) procure the disclosure of personal data to another person without the consent of the controller, or 

(c) after obtaining personal data, retain it without the consent of the person who was the controller in relation to the personal data when it was obtained. 

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998 which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. In June 2023, the ICO disclosed that since 1st June 2018, 92 cases involving Section 170 offences were investigated by its Criminal Investigations Team.   

Consequences 

Section 170 is only punishable by way of a fine; perpetrators cannot be sent to prison. Although there is now no cap on the maximum fine, prosecutions have resulted in relatively low fines compared to the harm caused.  

Two recent prosecutions have involved employees accessing medical records. In November 2023, Loretta Alborghetti was fined for illegally accessing the medical records of over 150 people. The offence took place whilst she worked as a medical secretary at Worcestershire Acute Hospitals NHS Trust. She was ordered to pay a total of £648. 

In August 2022, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust, pleaded guilty to accessing medical records of patients without a valid legal reason. An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off from going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000. 

Computer Misuse Act  

A Section 170 prosecution would have a much greater deterrent effect if the sanctions included a custodial sentence. Successive Information Commissioners have argued for this but to no avail.  

The relatively low fines have led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990, which carries tougher sentences, including a maximum of two years’ imprisonment on indictment. In July 2022, a woman who worked for Cheshire Police pleaded guilty to using police data systems to check up on ex-partners, and in August of the same year the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of people’s personal information from vehicle repair garages to generate potential leads for personal injury claims. 

The ICO has now confirmed that it has received a personal data breach report from The London Clinic. If its investigation finds that the Clinic did not comply with its security obligations under Article 5(1)(f) and Article 32 of the UK GDPR, it faces a possible maximum Monetary Penalty Notice of £17.5m or 4% of gross annual turnover (whichever is higher). This case highlights the importance of organisations ensuring adequate security measures around sensitive personal data, especially where the data relates to high profile individuals.  
 
This and other data protection developments will be discussed in detail in our forthcoming GDPR Update workshop. There are only 3 places left on our next GDPR Practitioner Certificate. 

Conservative Party Challenged Over “Data Harvesting” 

In the run up to the General Election this year, political parties in the UK will face the challenge of effectively communicating their message to voters whilst at the same time respecting voters’ privacy. In the past few years, all parties have been accused of riding roughshod over data protection laws in their attempts to convince voters that they ‘have a plan’ or that ‘the country needs change’.  

In May 2017, the Information Commissioner’s Office (ICO) announced that it was launching a formal investigation into the use of data analytics for political purposes after allegations were made about the ‘invisible processing’ of people’s personal data and the micro-targeting of political adverts during the EU Referendum.
This culminated in a report to Parliament and enforcement action against Facebook, Emma’s Diary and some of the companies involved in the Vote Leave Campaign.  

In July 2018 the ICO published a report, Democracy Disrupted, which highlighted significant concerns about transparency around how people’s data was being used in political campaigning. The report revealed a complex ecosystem of digital campaigning with many actors. In 2019, the ICO issued assessment notices to seven political parties. It concluded: 

“The audits found some considerable areas for improvement in both transparency and lawfulness and we recommended several specific actions to bring the parties’ processing in compliance with data protection laws. In addition, we recommended that the parties implemented several appropriate technical and organisational measures to meet the requirements of accountability. Overall there was a limited level of assurance that processes and procedures were in place and were delivering data protection compliance.” 

In June 2021, the Conservative Party was fined £10,000 for sending marketing emails to 51 people who did not want to receive them. The messages were sent in the name of Boris Johnson in the eight days after he became Prime Minister in breach of the Privacy and Electronic Communications Regulations 2003 (PECR).  

The Tax Calculator 

The Good Law Project (GLP), a not for profit campaign organisation, is now challenging one aspect of the Conservative Party’s data collection practices. The party’s website contains an online tool which allows an individual to calculate the effect on them of recent changes to National Insurance contributions. However, GLP claims this tool is “a simple data-harvesting exercise” which breaches UK data protection laws in a number of ways. It says that a visit to the website automatically leads to the placement of non-essential cookies (related to marketing, analysis and browser tracking) on the visitor’s machine without consent, in breach of Regulation 6 of PECR. GLP also challenges the gathering and use of website visitors’ personal data on the site, claiming that (amongst other things) it is neither fair, lawful nor transparent and is thus a breach of the UK GDPR. 

Director of GLP, Jo Maugham, has taken the first formal step in legal proceedings against the Conservative Party. The full proposed claim is set out in the GLP’s Letter Before Action. The Conservative Party has issued a response arguing that they have acted lawfully and that: 

  • They did obtain consent for the placement of cookies. (GLP disagrees and has now made a 15-page complaint to the ICO.) 
  • They have agreed to change their privacy notice. (GLP is considering whether to ask the court to make a declaration of illegality, claiming that the Tories “have stated publicly that it was lawful while tacitly admitting in private that it is not.”) 
  • They have agreed to the request by GLP to stop processing Jo Maugham’s personal data where that processing reveals his political opinions.  

Following a subject access request, Mr Maugham received 1,384 pages of personal data held about him. GLP claim he is being profiled and believe that such profiling is unlawful. They have instructed barristers with a view to taking legal action.

This is one to watch. If the legal action goes ahead, the result will have implications for other political parties. In any event, in an election year, we are already seeing that all political parties’ data handling practices are going to be under the spotlight.

George Galloway

George Galloway’s landslide win in the Rochdale by-election last week has led to scrutiny of his party’s processing of Muslim voters’ data. In his blog post, Jon Baines discusses whether the Workers Party of Britain (led by Mr Galloway) has been processing Special Category Data in breach of the UK GDPR. In the run up to the by-election, the party had sent different letters to constituents based, it seems, on their religion (or perhaps on a religion inferred from their name). If this is what it did then, even if the inference is wrong, the party has been processing Special Category Data, which requires a condition for processing under Article 9 of the UK GDPR.
In 2022, the ICO issued a fine of £1,350,000 to Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical conditions and then, without their consent, targeting them with health-related products. Following an appeal by Easylife, the ICO later reduced the fine to £250,000, but the legal basis of the decision still stands. Will the ICO investigate George Galloway?

The DP Bill

The Data Protection and Digital Information (No.2) Bill is currently at the Committee stage in the House of Lords. It will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). Some of the changes will make it easier for political parties to use the personal data of voters and potential voters without the usual GDPR safeguards. For example, political parties could, in the future, rely on “legitimate interests” (as an Article 6 lawful basis) to process personal data without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These include personal data being processed for the purpose of “democratic engagement”. The Bill will also amend PECR so that political parties will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest.

As the General Election approaches, and with trust in politics and politicians at a low, all parties need to ensure that they are open, transparent and accountable about how they use voters’ data.  

Our workshop, How to do Marketing in Compliance with GDPR and PECR, is suitable for those advising political parties and any organisation which uses personal data to reach out to potential customers and service users. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.

Act Now Nominated for IRMS Supplier of the Year Award 

Act Now Training is pleased to announce that it has been nominated once again for the 2024 Information and Records Management Society (IRMS) awards. 

Each year the IRMS recognises excellence in the field of information management with their prestigious Industry Awards. These highly sought-after awards are presented at a glittering ceremony at the annual Conference following the Gala Dinner.  

Act Now has been nominated for the Supplier of the Year award, which it won in 2021 and 2022. 

Voting is open to IRMS members until Friday 15th March 2024. 

You can vote for Act Now here: https://irms.org.uk/news/666165/Vote-now-for-the-IRMS-Awards-2024.htm  

Thank you for your support! 

The MoD GDPR Fine: The Dangers of Email 

Inadvertent disclosure of personal data on email systems has been the subject of a number of GDPR enforcement actions by the Information Commissioner’s Office (ICO) in the past few years. In 2021, the transgender charity Mermaids was fined £25,000 for failing to keep the personal data of its users secure. The ICO found that Mermaids failed to implement an appropriate level of security to its internal email systems, which resulted in documents or emails containing personal data being searchable and viewable online by third parties through internet search engine results. 

Failure to use blind carbon copy (BCC) correctly in emails is one of the top data breaches reported to the ICO every year. Last year the Patient and Client Council (PCC) and the Executive Office were the subject of ICO reprimands for disclosing personal data in this way. In October 2021, HIV Scotland was issued with a £10,000 GDPR fine when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.  

The latest GDPR fine was issued in December 2023, although the Monetary Penalty Notice has only just been published on the ICO website. The ICO has fined the Ministry of Defence (MoD) £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. 

On 20th September 2021, the MoD sent an email to a distribution list of Afghan nationals eligible for evacuation using the ‘To’ field, with personal information relating to 245 people being inadvertently disclosed. The email addresses could be seen by all recipients, with 55 people having thumbnail pictures on their email profiles.
Two people ‘replied all’ to the entire list of recipients, with one of them providing their location. 

The original email was sent by the team in charge of the UK’s Afghan Relocations and Assistance Policy (ARAP), which is responsible for assisting the relocation of Afghan citizens who worked for or with the UK Government in Afghanistan.
The data disclosed, had it fallen into the hands of the Taliban, could have resulted in a threat to life. 

Under the UK GDPR, organisations must have appropriate technical and organisational measures in place to avoid disclosing people’s information inappropriately. ICO guidance makes it clear that organisations should use bulk email services, mail merge, or secure data transfer services when sending any sensitive personal information electronically. The ARAP team did not have such measures in place at the time of the incident and was relying on ‘blind carbon copy’ (BCC), which carries a significant risk of human error. 

The ICO, taking into consideration the representations from the MoD, reduced the fine from a starting amount of £1,000,000 to £700,000 to reflect the action the MoD took following the incidents and recognising the significant challenges the ARAP team faced. Under the ICO’s public sector approach, the fine was further reduced to £350,000.  

Organisations must have appropriate policies and training in place to minimise the risks of personal data being inappropriately disclosed via email. To avoid similar incidents, the ICO recommends that organisations should: 

  1. Consider using other secure means to send communications that involve large amounts of data or sensitive information. This could include using bulk email services, mail merge, or secure data transfer services, so information is not shared with people by mistake.  
  2. Consider having appropriate policies in place and training for staff in relation to email communications.  
  3. For non-sensitive communications, organisations that choose to use BCC should do so carefully to ensure personal email addresses are not shared inappropriately with other customers, clients, or other organisations. 

More on email best practice can be found in the ICO’s email and security guidance.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Facial Recognition to Monitor Attendance: ICO Takes Action

Employers have always had a keen interest in monitoring and tracking employees.
In 2017, we addressed this topic in a blog post focusing on the GDPR implications of employee monitoring using GPS trackers and similar devices. Recent advances in surveillance technology, particularly facial recognition, have not only streamlined employee monitoring but have also rendered it more cost-effective and, concurrently, more intrusive. A good example is this video of a coffee shop using facial recognition technology (FRT) and AI to monitor employee productivity. 

In 2022, the TUC warned employee surveillance technology and AI risks “spiralling out of control” without stronger regulation to protect employees. It warned that, left unchecked, these technologies could lead to widespread discrimination, work intensification and unfair treatment. Earlier this year the French Data Protection Regulator, CNIL, fined Amazon  €32m (£27m) under the GDPR for “excessive” surveillance of its workers. The CNIL said Amazon France Logistique, which manages warehouses, recorded data captured by workers’ handheld scanners. It found Amazon tracked activity so precisely that it led to workers having to potentially justify every break. 

Employee surveillance is now primarily regulated in the UK by the UK GDPR.
As with all activities involving the processing of personal data, the surveillance must be fair, lawful and transparent. The Human Rights Act and the Regulation of Investigatory Powers Act may also apply (see our earlier blog post for more detail on these laws).  

On 23rd February 2024, the Information Commissioner’s Office (ICO) issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance. The ICO’s investigation found that Serco Leisure and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities for the purpose of attendance checks and subsequent payment for their time.  

Serco Leisure will not be appealing against the notices; a wise decision! As readers will know, it needed a lawful basis for processing employees’ data under Article 6 of the UK GDPR, as well as a condition under Article 9, as it was processing Special Category Data (Biometric Data). Consent was not an option due to the imbalance of power between employer and employee. In the words of the Commissioner:  

“Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy. There is no clear way for staff to opt out of the system, increasing the power imbalance in the workplace and putting people in a position where they feel like they have to hand over their biometric data to work there.” 

Serco tried to rely on Article 6(1)(b) and Article 6(1)(f) as lawful bases for processing the employees’ personal data. In relation to Article 6(1)(b) (contractual necessity), it argued that the processing of attendance data was necessary to ensure employees are paid correctly for the time they have worked. The ICO ruled that although recording attendance times may be necessary for Serco to fulfil its obligations under employment contracts, it does not follow that the processing of biometric data is necessary to achieve this purpose, especially when less intrusive means could be used to verify attendance. These included radio-frequency identification cards or fobs, or manual sign-in and sign-out sheets. Serco had failed to demonstrate why these less intrusive methods were not appropriate. It asserted that these methods are open to abuse, but did not provide evidence of widespread abuse, nor explain why other measures, such as disciplinary action against employees found to be abusing the system, had not been considered appropriate.  

Regarding Serco’s reliance on Article 6(1)(f) (legitimate interests), the ICO said that it will not apply if a controller can reasonably achieve the same result in another less intrusive way. As discussed above, Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system is a necessity, rather than simply a further benefit to Serco. The ICO also said: 

“In applying the balancing test required to rely on legitimate interests, Serco has failed to give appropriate weight to the intrusive nature of biometric processing or the risks to data subjects.” 

In relation to Article 9, the ICO said that Serco had again failed to demonstrate that the processing of biometric data is “necessary” for Serco to process Special Category Data for the purpose of employment attendance checks or to comply with the relevant laws identified by Serco in their submissions.  

The Enforcement Notices not only instruct Serco Leisure and the trusts to stop all processing of biometric data for monitoring employees’ attendance at work, but also require them to destroy all biometric data that they are not legally obliged to retain. This must be done within three months of the notices being issued. 

This enforcement action coincided with the ICO publishing new guidance for all organisations that are considering using people’s biometric data. The guidance outlines how organisations can comply with data protection law when using biometric data to identify people. Last year, the ICO also published guidance on monitoring employees and called on organisations to consider both their legal obligations and their employees’ rights to privacy before implementing any monitoring. 

This is the first time the ICO has taken enforcement action against an employer to stop it processing the biometric data of staff. It will serve as a warning to organisations that use biometric technology simply because it is cheap and easy to deploy, without considering the legal implications.  

Our CCTV Workshop will also examine the use of facial recognition technology. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.