Facial Recognition to Monitor Attendance: ICO Takes Action

Employers have always had a keen interest in monitoring and tracking employees.
In 2017, we addressed this topic in a blog post focusing on the GDPR implications of employee monitoring using GPS trackers and similar devices. Recent advances in surveillance technology, particularly facial recognition, have not only streamlined employee monitoring but have also made it more cost-effective and, at the same time, more intrusive. A good example is this video of a coffee shop using facial recognition technology (FRT) and AI to monitor employee productivity.

In 2022, the TUC warned that employee surveillance technology and AI risk “spiralling out of control” without stronger regulation to protect employees. It warned that, left unchecked, these technologies could lead to widespread discrimination, work intensification and unfair treatment. Earlier this year the French data protection regulator, the CNIL, fined Amazon €32m (£27m) under the GDPR for “excessive” surveillance of its workers. The CNIL said Amazon France Logistique, which manages warehouses, recorded data captured by workers’ handheld scanners. It found Amazon tracked activity so precisely that workers could potentially have to justify every break.

Employee surveillance is now primarily regulated in the UK by the UK GDPR.
As with all activities involving the processing of personal data, the surveillance must be fair, lawful and transparent. The Human Rights Act and the Regulation of Investigatory Powers Act may also apply (see our earlier blog post for more detail on these laws).

On 23rd February 2024, the Information Commissioner’s Office (ICO) issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using FRT and fingerprint scanning to monitor employee attendance. The ICO’s investigation found that Serco Leisure and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities for the purpose of attendance checks and subsequent payment for their time.

Serco Leisure will not be appealing against the notices; a wise decision! As readers will know, it had to have a lawful basis for processing employees’ data under Article 6 of the UK GDPR, as well as a condition under Article 9, as it was processing Special Category Data (Biometric Data). Consent was not an option due to the imbalance of power between employer and employee. In the words of the Commissioner:

“Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy. There is no clear way for staff to opt out of the system, increasing the power imbalance in the workplace and putting people in a position where they feel like they have to hand over their biometric data to work there.” 

Serco tried to rely on Article 6(1)(b) and Article 6(1)(f) as lawful bases for processing the employees’ personal data. In relation to Article 6(1)(b) (contractual necessity), it argued that the processing of attendance data was necessary to ensure employees are paid correctly for the time they have worked. The ICO ruled that although recording attendance times may be necessary for Serco to fulfil its obligations under employment contracts, it does not follow that the processing of biometric data is necessary to achieve this purpose, especially when less intrusive means could be used to verify attendance, such as radio-frequency identification cards or fobs, or manual sign-in and sign-out sheets. Serco had failed to demonstrate why these less intrusive methods were not appropriate. It asserted that these methods are open to abuse, but did not provide evidence of widespread abuse, nor explain why other measures, such as disciplinary action against employees found to be abusing the system, had not been considered appropriate.

Regarding Serco’s reliance on Article 6(1)(f) (legitimate interests), the ICO said that it will not apply if a controller can reasonably achieve the same result in another less intrusive way. As discussed above, Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system is a necessity, rather than simply a further benefit to Serco. The ICO also said: 

“In applying the balancing test required to rely on legitimate interests, Serco has failed to give appropriate weight to the intrusive nature of biometric processing or the risks to data subjects.”

In relation to Article 9, the ICO said that Serco had again failed to demonstrate that processing biometric data was “necessary” for the purpose of employment attendance checks, or to comply with the relevant laws identified by Serco in its submissions.

The Enforcement Notices not only instruct Serco Leisure and the trusts to stop all processing of biometric data for monitoring employees’ attendance at work, but also require them to destroy all biometric data that they are not legally obliged to retain. This must be done within three months of the notices being issued. 

This enforcement action coincided with the ICO publishing new guidance for all organisations that are considering using people’s biometric data. The guidance outlines how organisations can comply with data protection law when using biometric data to identify people. Last year, the ICO also published guidance on monitoring employees and called on organisations to consider both their legal obligations and their employees’ rights to privacy before implementing any monitoring.

This is the first time the ICO has taken enforcement action against an employer to stop it processing the biometric data of staff. It will serve as a warning to organisations that adopt biometric technology simply because it is cheap and easy to use, without considering the legal implications.

Our CCTV Workshop will also examine the use of facial recognition technology, and we have just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.

Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics, including automated fingerprint identification systems, for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this as far back as 2014. Now a company called CRB Cunninghams has introduced facial recognition technology to allow schools to offer children the ability to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said:

“It’s normalising biometric identity check for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created, for the first time, an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case, schools must provide a reasonable alternative means of accessing the service, i.e. paying for school meals in the present case.
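
To make the POFA rule concrete, here is a minimal sketch of the decision logic in Python (the names are ours and purely illustrative, not drawn from any statutory guidance):

```python
from dataclasses import dataclass

@dataclass
class ConsentStatus:
    parental_consent: bool  # written consent from at least one parent
    child_objects: bool     # the child objects or refuses to participate

def may_process_biometrics(status: ConsentStatus) -> str:
    # POFA: processing requires written parental consent AND no objection
    # from the child; otherwise the school must offer a reasonable
    # alternative (e.g. another way to pay for meals).
    if status.parental_consent and not status.child_objects:
        return "processing permitted"
    return "not permitted: offer a reasonable alternative"

# A parent consents but the child refuses:
print(may_process_biometrics(ConsentStatus(parental_consent=True, child_objects=True)))
# -> not permitted: offer a reasonable alternative
```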

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data, and there is a legal prohibition on anyone processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Explicit consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing.

In 2019, the Swedish Data Protection Authority fined an education authority SEK 200,000 (approximately €20,000) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that the authority had breached Article 5, by processing students’ personal data in a manner that was more intrusive and encompassed more personal data than necessary for the specified purpose (monitoring of attendance); Article 9; and Articles 35 and 36, by failing to fulfil the requirements for an impact assessment and failing to carry out prior consultation with the Swedish DPA.

The French regulator (the CNIL) has also raised concerns about a facial recognition trial commissioned by the Provence-Alpes-Côte d’Azur Regional Council, which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” The CNIL also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India, the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology, it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy-related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

GDPR: One Year on

The General Data Protection Regulation (GDPR) and the Data Protection Act 2018 came into force on 25th May 2018 with much fanfare. It was the biggest change to data protection law in 20 years and, with the GDPR carrying a maximum fine of €20 million or 4% of annual worldwide turnover (whichever is higher), the marketing hype, emails and myths came thick and fast.
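
As a quick illustration of how that fining ceiling works, here is a toy calculation in Python (not legal advice; the function name is ours):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    # Upper tier of GDPR fines: the higher of EUR 20m or 4% of turnover.
    return max(20_000_000, 0.04 * annual_turnover_eur)

# For a business with EUR 1bn turnover, 4% (EUR 40m) exceeds the EUR 20m
# floor, so the theoretical maximum is EUR 40m.
print(f"{max_gdpr_fine(1_000_000_000):,.0f}")  # 40,000,000
```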

There has been no avalanche of massive fines under GDPR. According to a progress report by the European Data Protection Board (EDPB), Supervisory Authorities from 11 EEA countries imposed a total of €55,955,871 in fines. This is not a large amount when you consider that it includes a €50 million fine on Google issued by the French National Data Protection Commission (CNIL). That fine followed complaints from two privacy groups who argued, amongst other things, that Google did not have a valid legal basis to process the personal data of the users of its services, particularly for ads personalisation purposes, as it was in effect forcing users to consent.

EDPB figures also show:

  • 67% of Europeans have heard of GDPR
  • Over 89,000 data breaches have been logged by the EEA Supervisory Authorities. 63% of these have been closed and 37% are ongoing
  • There have been 446 cross border investigations by Supervisory Authorities

Despite the warnings of data Armageddon, year one of GDPR has mostly been a year of learning for Data Controllers and one of raising awareness for Supervisory Authorities. The Information Commissioner’s Office (ICO) in the UK has produced a GDPR progress report in which it highlights increased public awareness. In March it surveyed Data Protection Officers; 64% stated that they either agreed or strongly agreed with the statement ‘I have seen an increase in customers and service users exercising their information rights since 25 May 2018’.

The ICO has not issued any fines yet but has used its other enforcement powers extensively. It has issued 15 Assessment Notices and 11 Information Notices in conjunction with various investigations, including into data analytics for political purposes, political parties, data brokers, credit reference agencies and others. Two Enforcement Notices have been issued, against a data broking company and HMRC respectively (read our blog), as well as warnings and reprimands across a range of sectors including health, central government, criminal justice, education, retail and finance. (25/6/19 STOP PRESS – Enforcement Notices were served on 25th June, under the 1998 and 2018 Data Protection Acts, on the Metropolitan Police for sustained failures to comply with individuals’ rights in respect of subject access requests.)

The ICO is planning to produce four new codes of practice in 2019 under GDPR. Here are the dates for your diary:

  • A new Data Sharing code. A draft code for formal consultation is expected to be launched in June 2019 and the final version laid before Parliament in the autumn.
  • A new Direct Marketing code to ensure that all activities are compliant with the GDPR, DPA 2018 and the Privacy and Electronic Communications Regulations (PECR). A formal consultation on this will be launched in June 2019 with a view to finalising the code by the end of October.
  • A Data Protection and Journalism code. A formal consultation on this will be launched in June 2019 with a view to laying the final version before Parliament in the summer.
  • A code of practice on political campaigning. The code will apply to all organisations who process personal data for the purpose of political campaigning, i.e. activity relating to elections or referenda. A draft will be published for consultation in July 2019.

Year two of GDPR will no doubt see more enforcement action by the ICO, including the first fines. According to its progress report, though, it will continue to focus on its regulatory priorities: cyber security; AI, big data and machine learning; web and cross-device tracking for marketing purposes; children’s privacy; use of surveillance and facial recognition; data broking; the use of personal information in political campaigns; and Freedom of Information compliance.

Finally, depending on whether there is a Brexit deal, we may see some changes to GDPR via the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019, which came into force in March this year.

More on these and other developments will be in our GDPR Update webinar and full day workshop presented by Ibrahim Hasan. For those seeking a GDPR qualification, our highly popular practitioner certificate is the best option. Read our testimonials here.
