Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics, including automated fingerprint identification systems, for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this way back in 2014. Now a company called CRB Cunninghams has introduced facial recognition technology that allows children to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said:

“It’s normalising biometric identity checks for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools for the first time. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case, schools must provide a reasonable alternative means of accessing the service, i.e. paying for school meals in the present case.
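
To make that consent logic concrete, here is a minimal sketch in Python (a hypothetical helper, not anything prescribed by the Act or used by schools) of the decision a school has to make before processing a child’s biometric data:

```python
def may_process_biometrics(parental_consents, child_objects):
    """Sketch of the POFA 2012 rule for pupils under 18.

    parental_consents: list of booleans, one per parent, True where that
    parent has given written consent.
    child_objects: True if the child objects or refuses to participate.

    At least one parent must consent in writing, and even then the
    child's own objection overrides it. Where this returns False, the
    school must offer a reasonable alternative (e.g. a non-biometric
    way to pay for meals).
    """
    return any(parental_consents) and not child_objects


# One parent consents but the child refuses, so biometric processing is
# not permitted and an alternative means of paying must be provided.
print(may_process_biometrics([True, False], child_objects=True))  # False
```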

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data and there is a legal prohibition on anyone processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Explicit consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing.

In 2019 the Swedish Data Protection Authority fined an education authority SEK 200,000 (approximately 20,000 Euros) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that the authority had breached Article 5, by processing students’ personal data in a manner that was more intrusive as regards personal integrity, and which encompassed more personal data, than was necessary for the specified purpose (monitoring of attendance); Article 9; and Articles 35 and 36, by failing to fulfil the requirements for an impact assessment and failing to carry out prior consultation with the Swedish DPA.

The French regulator (CNIL) has also raised concerns about a facial recognition trial, commissioned by the Provence-Alpes-Côte d’Azur Regional Council, which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” It also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India, the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology, it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy-related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

GDPR: One Year On


The General Data Protection Regulation (GDPR) and the Data Protection Act 2018 came into force on 25th May 2018 with much fanfare. Together they were the biggest change to data protection law in 20 years and, with GDPR carrying a maximum fine of 20 million Euros or 4% of annual worldwide turnover (whichever is higher), the marketing hype, emails and myths came thick and fast.
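
As a rough illustration of how that maximum fine cap works, here is a minimal sketch in Python (with hypothetical turnover figures, not legal advice):

```python
def max_gdpr_fine_eur(annual_worldwide_turnover_eur):
    """Upper tier of GDPR Article 83(5): 20 million Euros or 4% of
    annual worldwide turnover, whichever is higher."""
    return max(20_000_000, 0.04 * annual_worldwide_turnover_eur)


# A small firm is capped at the 20 million Euro floor; a business
# turning over 2 billion Euros faces a cap of 80 million Euros.
print(f"{max_gdpr_fine_eur(5_000_000):,.0f}")      # 20,000,000
print(f"{max_gdpr_fine_eur(2_000_000_000):,.0f}")  # 80,000,000
```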

There has been no avalanche of massive fines under GDPR. According to a progress report by the European Data Protection Board (EDPB), Supervisory Authorities from 11 EEA countries imposed a total of €55,955,871 in fines. This is not a large amount when you consider that it includes a 50 million Euro fine on Google issued by the French National Data Protection Commission (CNIL). That fine followed complaints from two privacy groups which argued, amongst other things, that Google did not have a valid legal basis to process the personal data of the users of its services, particularly for ads personalisation purposes, because it was in effect forcing users to consent.

EDPB figures also show:

  • 67% of Europeans have heard of GDPR
  • Over 89,000 data breaches have been logged by the EEA Supervisory Authorities. 63% of these have been closed and 37% are ongoing
  • There have been 446 cross-border investigations by Supervisory Authorities

Despite the warnings of data armageddon, year one of GDPR has mostly been a year of learning for Data Controllers and one of raising awareness for Supervisory Authorities. The Information Commissioner’s Office (ICO) in the UK has produced a GDPR progress report in which it highlights increased public awareness. In March it surveyed Data Protection Officers: 64% either agreed or strongly agreed with the statement ‘I have seen an increase in customers and service users exercising their information rights since 25 May 2018’.

The ICO has not issued any fines yet but has used its other enforcement powers extensively. It has issued 15 Assessment Notices and 11 Information Notices in conjunction with various investigations, including into data analytics for political purposes, political parties, data brokers, credit reference agencies and others. Two Enforcement Notices have been issued, against a data broking company and HMRC respectively (read our blog), and warnings and reprimands have been given across a range of sectors including health, central government, criminal justice, education, retail and finance. (25/6/19 STOP PRESS – Enforcement notices were served on 25th June, under the 1998 and 2018 Data Protection Acts, on the Metropolitan Police for sustained failures to comply with individuals’ rights in respect of subject access requests.)

The ICO is planning to produce four new codes of practice in 2019 under GDPR. Here are the dates for your diary:

  • A new Data Sharing code. A draft code for formal consultation is expected to be launched in June 2019 and the final version laid before Parliament in the autumn.
  • A new Direct Marketing code to ensure that all activities are compliant with the GDPR, DPA 2018 and the Privacy and Electronic Communications Regulations (PECR). A formal consultation on this will be launched in June 2019 with a view to finalising the code by the end of October.
  • A Data Protection and Journalism code. A formal consultation on this will be launched in June 2019 with a view to laying the final version before Parliament in the summer.
  • A code of practice on political campaigning. The code will apply to all organisations who process personal data for the purpose of political campaigning, i.e. activity relating to elections or referenda. A draft will be published for consultation in July 2019.

Year 2 of GDPR will no doubt see more enforcement action by the ICO, including the first fines. According to its progress report though, it will continue to focus on its regulatory priorities: cyber security; AI, Big Data and machine learning; web and cross-device tracking for marketing purposes; children’s privacy; use of surveillance and facial recognition; data broking; the use of personal information in political campaigns; and Freedom of Information compliance.

Finally, depending on whether there is a Brexit deal, we may see some changes to GDPR via the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019, which came into force in March this year.

More on these and other developments will be in our GDPR Update webinar and full day workshop presented by Ibrahim Hasan. For those seeking a GDPR qualification, our highly popular practitioner certificate is the best option. Read our testimonials here.
