Electronic Tagging of Migrants: Enforcement Notice Published by ICO

On 1st March 2024, the Information Commissioner’s Office (ICO) announced that it had issued an Enforcement Notice and a warning to the Home Office for failing to sufficiently assess the privacy risks posed by the electronic monitoring of people arriving in the UK via unauthorised means. (Strangely, the actual text of the notice and warning was only published recently, three weeks after the ICO press release.)
The decision comes as a result of Privacy International’s complaint (filed in August 2022) against the Home Office policy. The civil liberties pressure group alleged widespread and significant breaches of privacy and data protection law.  

The ICO had been in discussion with the Home Office since August 2022 on its pilot to place ankle tags on, and track the GPS location of, up to 600 migrants who arrived in the UK and were on immigration bail. The purpose of the pilot was to test whether electronic monitoring is an effective way to maintain regular contact with asylum claimants, while reducing the risk of absconding, and to establish whether it is an effective alternative to detention. 

The ICO found that the Home Office had failed to conduct a Data Protection Impact Assessment (DPIA) for the pilot that satisfied the requirements of Article 35 of the UK GDPR. Amongst other things, the Home Office had failed to sufficiently assess the privacy intrusion of the continuous collection of people’s location information.
It was also found to have breached the Accountability Principle (Article 5(2)) by failing to demonstrate its compliance with Article 5(1), in particular:  

  • Article 5(1)(a) Lawfulness: the Home Office identified the lawful basis for the processing as Article 6(1)(e), and for Special Category Data as Article 9(2)(g) and Schedule 1, paragraph 6 of the DPA 2018. However, it did not demonstrate (in either its DPIA or its staff guidance) that the processing was necessary and proportionate for these purposes, including why less privacy-intrusive methods could not meet its objectives.  
  • Article 5(1)(a) Fairness and Transparency: the Home Office’s privacy notice(s) did not demonstrate compliance with minimum transparency requirements, as set out in Articles 12 and 13. It failed to provide clear and easily accessible information to the people being tagged about what personal information is being collected, how it will be used, how long it will be kept for, and who it will be shared with. The privacy information was not set out clearly in one place, was inconsistent and contained gaps. 
  • Article 5(1)(c) Data Minimisation: the Home Office’s draft DPIA and guidance for staff did not demonstrate that data minimisation will be considered and actioned when requesting access to the personal data produced by the electronic tags. 

John Edwards, the Information Commissioner, said: 

“It’s crucial that the UK has appropriate checks and balances in place to ensure people’s information rights are respected and safeguarded. This is even more important for those who might not even be aware that they have those rights. 

“This action is a warning to any organisation planning to monitor people electronically – you must be able to prove the necessity and proportionality of tracking people’s movements, taking into consideration people’s vulnerabilities and how such processing could put them at risk of further harm. This must be done from the outset, not as an afterthought.” 

The Enforcement Notice orders the Home Office to update its internal policies, access guidance and privacy information in relation to the data retained from the pilot scheme. The ICO has also issued a formal warning stating that any future processing by the Home Office on the same basis will be in breach of data protection law and will attract enforcement action.  

Surveillance is a hot topic for the ICO at present. Last month, the ICO issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance.  

The Enforcement Notice and warning are important reading for anyone who wishes to understand how to complete a compliant and meaningful DPIA. The Data Protection and Digital Information Bill is currently at the Committee stage in the House of Lords. Amongst other things, it will replace the DPIA provisions in the UK GDPR with leaner and less prescriptive “Assessments of High-Risk Processing”.  

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Kate Middleton’s Medical Records: Can Anyone Go to Jail?

Kate Middleton seems to be at the centre of much media (and social media) attention at present. In addition to speculation about her health and whereabouts, there has been much focus on, and analysis of, the now famous photoshopped Mother’s Day photo.

This week it was reported that employees at the private London Clinic in Marylebone, where Kate was a patient following abdominal surgery in January, attempted to view her medical records. Reportedly three employees have now been suspended.

The Health Minister, Maria Caulfield, told Sky News it was “pretty severe and serious stuff to be accessing notes that you don’t have permission to”. She also said police had been “asked to look at” whether staff at the clinic attempted to access the princess’s private medical records. 

If the reports are true and individuals are proven to have been “snooping”, what are the consequences? Firstly, this would normally be a matter for the Information Commissioner’s Office (ICO) to investigate rather than the police. Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly: 

(a) obtain or disclose personal data without the consent of the controller, 

(b) procure the disclosure of personal data to another person without the consent of the controller, or 

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained. 

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998 which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. In June 2023, the ICO disclosed that since 1st June 2018, 92 cases involving Section 170 offences were investigated by its Criminal Investigations Team.   

Consequences 

Section 170 is only punishable by way of a fine; perpetrators cannot be sent to prison. Although there is now no cap on the maximum fine, prosecutions have resulted in relatively low fines compared to the harm caused.  

Two recent prosecutions have involved employees accessing medical records.
In November 2023, Loretta Alborghetti was fined for illegally accessing the medical records of over 150 people. The offence took place whilst she worked as a medical secretary at Worcestershire Acute Hospitals NHS Trust. She was ordered to pay a total of £648. 

In August 2022, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust, pleaded guilty to accessing medical records of patients without a valid legal reason. An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off from going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000. 

Computer Misuse Act  

A Section 170 prosecution would have a much greater deterrent effect if the sanctions included a custodial sentence. Successive Information Commissioners have argued for this but to no avail.  

The relatively low fines have led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990, which carries tougher sentences, including a maximum of two years’ imprisonment on indictment. In July 2022, a woman who worked for Cheshire Police pleaded guilty to using the police data systems to check up on ex-partners. In August, the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of people’s personal information from vehicle repair garages to generate potential leads for personal injury claims. 

The ICO has now confirmed that it has received a personal data breach report from The London Clinic. If its investigation finds the Clinic did not comply with its security obligations under Article 5(1)(f) and Article 32 of the UK GDPR, it faces a possible maximum Monetary Penalty Notice of £17.5m or 4% of gross annual turnover (whichever is higher). This case highlights the importance of organisations ensuring adequate security measures around sensitive personal data, especially where the data relates to high-profile individuals.  
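For readers who want to see how that cap works, it is simply the greater of two figures. Below is a minimal illustrative sketch in Python; the turnover figure is hypothetical, not The London Clinic’s:

```python
def uk_gdpr_max_penalty(annual_turnover_gbp: float) -> float:
    """Higher-tier statutory maximum under the UK GDPR:
    the greater of £17.5m or 4% of annual turnover."""
    return max(17_500_000.0, 0.04 * annual_turnover_gbp)

# Hypothetical turnover of £600m, for illustration only
print(f"£{uk_gdpr_max_penalty(600_000_000):,.0f}")  # £24,000,000
```

The cap only sets the ceiling; any actual penalty would reflect the seriousness of the breach and any aggravating or mitigating factors.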
 
This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next GDPR Practitioner Certificate. 

Facial Recognition to Monitor Attendance: ICO Takes Action

Employers have always had a keen interest in monitoring and tracking employees.
In 2017, we addressed this topic in a blog post focusing on the GDPR implications of employee monitoring using GPS trackers and similar devices. Recent advances in surveillance technology, particularly facial recognition, have not only streamlined employee monitoring but have also rendered it more cost-effective and, concurrently, more intrusive. A good example is this video of a coffee shop using facial recognition technology (FRT) and AI to monitor employee productivity. 

In 2022, the TUC warned employee surveillance technology and AI risks “spiralling out of control” without stronger regulation to protect employees. It warned that, left unchecked, these technologies could lead to widespread discrimination, work intensification and unfair treatment. Earlier this year the French Data Protection Regulator, CNIL, fined Amazon €32m (£27m) under the GDPR for “excessive” surveillance of its workers. The CNIL said Amazon France Logistique, which manages warehouses, recorded data captured by workers’ handheld scanners. It found Amazon tracked activity so precisely that workers potentially had to justify every break. 

Employee surveillance is now primarily regulated in the UK by the UK GDPR.
As with all activities involving the processing of personal data, the surveillance must be fair, lawful and transparent. The Human Rights Act and the Regulation of Investigatory Powers Act may also apply (see our earlier blog post for more detail on these laws).  

On 23rd February 2024, the Information Commissioner’s Office (ICO) issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance. The ICO’s investigation found that Serco Leisure and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities for the purpose of attendance checks and subsequent payment for their time.  

Serco Leisure will not be appealing against the notices; a wise decision! As readers will know they had to have a lawful basis for processing employees’ data under Article 6 of the UK GDPR as well as Article 9 as they were processing Special Category Data (Biometric Data). Consent was not an option due to the imbalance of power between employer and employee. In the words of the Commissioner:  

“Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy. There is no clear way for staff to opt out of the system, increasing the power imbalance in the workplace and putting people in a position where they feel like they have to hand over their biometric data to work there.” 

Serco tried to rely on Article 6(1)(b) and Article 6(1)(f) as lawful bases for processing the employees’ personal data. In relation to Article 6(1)(b) (contractual necessity), it argued that the processing of attendance data was necessary to ensure employees are paid correctly for the time they have worked. The ICO ruled that although recording attendance times may be necessary for Serco to fulfil its obligations under employment contracts, it does not follow that the processing of biometric data is necessary to achieve this purpose, especially when less intrusive means could be used to verify attendance. These included radio-frequency identification cards or fobs, or manual sign-in and sign-out sheets. Serco had failed to demonstrate why these less intrusive methods were not appropriate. It asserted that they are open to abuse, but did not provide evidence of widespread abuse, nor explain why other measures, such as disciplinary action against employees found to be abusing the system, had not been considered appropriate.  

Regarding Serco’s reliance on Article 6(1)(f) (legitimate interests), the ICO said that it will not apply if a controller can reasonably achieve the same result in another less intrusive way. As discussed above, Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system is a necessity, rather than simply a further benefit to Serco. The ICO also said: 

“In applying the balancing test required to rely on legitimate interests, Serco has failed to give appropriate weight to the intrusive nature of biometric processing or the risks to data subjects.” 

In relation to Article 9, the ICO said that Serco had again failed to demonstrate that the processing of biometric data is “necessary” for Serco to process Special Category Data for the purpose of employment attendance checks or to comply with the relevant laws identified by Serco in their submissions.  

The Enforcement Notices not only instruct Serco Leisure and the trusts to stop all processing of biometric data for monitoring employees’ attendance at work, but also require them to destroy all biometric data that they are not legally obliged to retain. This must be done within three months of the notices being issued. 

This enforcement action coincided with the ICO publishing new guidance for all organisations that are considering using people’s biometric data. The guidance outlines how organisations can comply with data protection law when using biometric data to identify people. Last year, the ICO also published guidance on monitoring employees and called on organisations to consider both their legal obligations and their employees’ rights to privacy before they implement any monitoring. 

This is the first time the ICO has taken enforcement action against an employer to stop it processing the biometric data of staff. It will serve as a warning to organisations that adopt biometric technology simply because it is cheap and easy to use, without considering the legal implications.  

Our CCTV Workshop will also examine the use of facial recognition technology. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Data Protection Bill Faces Scrutiny: Commissioner Calls for Tighter Safeguards 

In a recent development, the Information Commissioner has weighed in on the debate surrounding the Data Protection and Digital Information Bill (DPDI Bill), legislation aimed at modernising data protection in the UK. While acknowledging the government’s efforts to strengthen the independence of the Information Commissioner’s Office (ICO) and update data protection practices, the Commissioner’s response highlights significant concerns, particularly around the use of personal data in social security contexts. We wrote a detailed breakdown on our blog here.

The response, detailed and thorough, applauds the government’s amendments to the bill, recognising their potential to enhance the ICO’s autonomy and bring data protection practices up to date with the digital age. However, the Commissioner expresses reservations about the adequacy of safeguards in the current draft of the bill, especially in terms of personal data handling for social security purposes. 

The Commissioner’s concern primarily revolves around the need for more precise language in the bill. This is to ensure that its provisions are fully aligned with established data protection principles, thereby safeguarding individual rights.
The response suggests that the current wording might be too broad or vague, potentially leading to misuse or overreach in the handling of personal data. 

Importantly, the Commissioner has provided detailed technical feedback for further improvements to the bill. This feedback indicates a need for scrutiny of, and adjustments to, the bill to ensure that it not only meets its intended purpose but also robustly protects the rights of individuals. 

While the Commissioner supports the bill’s overarching aim to enhance the UK’s data protection regime, the emphasis is clearly on the necessity of refining the bill.
This is to ensure it strikes the right balance between enabling data use for public and economic benefits and protecting individual privacy rights. 

The response from the Information Commissioner is a significant moment in the ongoing development of the DPDI Bill. It underscores the complexity and importance of legislating in the digital age, where data plays a crucial role in both the economy and personal privacy. 

As the bill progresses, the government and legislators should consider the Commissioner’s input. The balance they strike in the final version of the bill will be a key indicator of the UK’s approach to data protection in a rapidly evolving digital landscape. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. Dive into the issues discussed in this blog and secure your spot now.

Clearview AI Wins Appeal Against GDPR Fine 

Last week a Tribunal overturned a GDPR Enforcement Notice and a Monetary Penalty Notice issued to Clearview AI, an American facial recognition company. In Clearview AI Inc v The Information Commissioner [2023] UKFTT 00819 (GRC), the First-Tier Tribunal (Information Rights) ruled that the Information Commissioner had no jurisdiction to issue either notice, on the basis that the GDPR/UK GDPR did not apply to the personal data processing in issue.  

Background 

Clearview is a US-based company which describes itself as the “World’s Largest Facial Network”. Its online database contains 20 billion images of people’s faces, together with data scraped from publicly available information on the internet and social media platforms all over the world. It allows customers to upload an image of a person to its app; the person is then identified by the app checking against all the images in the Clearview database.  
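At a technical level, the identification step described above is typically a nearest-neighbour search over face “embeddings” (fixed-length vectors produced by a deep learning model). The sketch below illustrates that general technique only; it is not Clearview’s actual code, and the embedding model, similarity threshold and top-five cut-off are all assumptions:

```python
import numpy as np

def identify(query_vec: np.ndarray, db_vecs: np.ndarray,
             db_ids: list, threshold: float = 0.6) -> list:
    """Rank stored face embeddings by cosine similarity to the
    query embedding and return the closest candidate identities."""
    db_norm = db_vecs / np.linalg.norm(db_vecs, axis=1, keepdims=True)
    q_norm = query_vec / np.linalg.norm(query_vec)
    sims = db_norm @ q_norm                   # cosine similarities
    top = np.argsort(sims)[::-1][:5]          # five best matches
    return [(db_ids[i], float(sims[i])) for i in top if sims[i] >= threshold]

# Toy usage: random vectors stand in for real face embeddings
rng = np.random.default_rng(0)
db = rng.normal(size=(1_000, 128))
ids = [f"person_{i}" for i in range(1_000)]
print(identify(db[42], db, ids))  # person_42 matches itself with similarity 1.0
```

The point to note is that once an image is reduced to an embedding and indexed, every future query can be matched against it, which helps explain why regulators treat such databases as high-risk.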

In May 2022 the ICO issued a Monetary Penalty Notice of £7,552,800 to Clearview for breaches of the GDPR, including failing to use the information of people in the UK in a way that is fair and transparent. Although Clearview is a US company, the ICO ruled that the UK GDPR applied because of Article 3(2)(b) (territorial scope). It concluded that Clearview’s processing activities “are related to… the monitoring of [UK residents’] behaviour as far as their behaviour takes place within the United Kingdom.” 

The ICO also issued an Enforcement Notice ordering Clearview to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems (see our earlier blog for more detail on these notices). 

The Judgement  

The First-Tier Tribunal (Information Rights) has now overturned the ICO’s Enforcement Notice and Monetary Penalty Notice against Clearview. It concluded that although Clearview did carry out data processing related to monitoring the behaviour of people in the UK (Article 3(2)(b) of the UK GDPR), the ICO did not have jurisdiction to take enforcement action or issue a fine. Both the GDPR and UK GDPR provide that acts of foreign governments fall outside their scope; it is not for one government to seek to bind or control the activities of another sovereign state. However, the Tribunal noted that the ICO could have taken action under the Law Enforcement Directive (Part 3 of the DPA 2018 in the UK), which specifically regulates the processing of personal data in relation to law enforcement. 

Learning Points 

While the Tribunal’s judgement in this case reflects the specific circumstances, some of its findings are of wider application: 

  • The term “behaviour” (in Article 3(2)(b)) means something about what a person does (e.g. location, relationship status, occupation, use of social media, habits) rather than something that merely identifies or describes them (e.g. name, date of birth, height, hair colour).  

  • The term “monitoring” not only comes up in Article 3(2)(b) but also in Article 35(3)(c) (when a DPIA is required). The Tribunal ruled that monitoring includes tracking a person at a fixed point in time as well as on a continuous or repeated basis.

  • In this case, Clearview was not monitoring UK residents directly as its processing was limited to creating and maintaining a database of facial images and biometric vectors. However, Clearview’s clients were using its services for monitoring purposes and therefore Clearview’s processing “related to” monitoring under Article 3(2)(b). 

  • A provider of services like Clearview may be considered a joint controller with its clients where both determine the purposes and means of processing. In this case, Clearview was a joint controller with its clients because it imposed restrictions on how clients could use the services (i.e. only for law enforcement and national security purposes) and determined the means of processing when matching query images against its facial recognition database.  

Data Scraping 

The ruling is not a green light for data scraping, where publicly available data, usually from the internet, is collected and processed by companies, often without the Data Subject’s knowledge. The Tribunal ruled that this is an activity to which the UK GDPR can apply. In its press release reacting to the ruling, the ICO said: 

“The ICO will take stock of today’s judgment and carefully consider next steps.
It is important to note that this judgment does not remove the ICO’s ability to act against companies based internationally who process data of people in the UK, particularly businesses scraping data of people in the UK, and instead covers a specific exemption around foreign law enforcement.” 

This is a significant ruling from the First-Tier Tribunal which has implications for the extra-territorial effect of the UK GDPR and the ICO’s powers to enforce it. It merits an appeal by the ICO to the Upper Tribunal. Whether this happens depends very much on the ICO’s appetite for a legal battle with a tech company with deep pockets.  

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop.  

The ICO’s Tougher FOI Enforcement Policy 

By Martin Rosenbaum 

Last month the Information Commissioner’s Office announced it was issuing another two Enforcement Notices against public authorities with extreme backlogs of FOI and EIR requests; the Ministry of Defence and the Environment Agency. From the published notices it is clear that both authorities had consistently failed to tackle their excessive delays, despite extensive discussions over many months with the ICO. 

The ICO also issued Practice Recommendations, a lower level of sanction, to three authorities with a poor track record on FOI; Liverpool Council, Tower Hamlets Council and the Medicines and Healthcare Products Regulatory Agency. This brings the total of Enforcement Notices in the past year or so to six, and the number of Practice Recommendations to 12.
As Warren Seddon, the ICO’s Director of FOI, proclaimed in his blog on the subject, both these figures exceed the numbers previously issued by the ICO in the entire 17 years since the FOI Act came into force. 

From my point of view, as a frequent requestor, this is good news.
For requestors, the ICO’s current activity represents a welcome tougher stance on FOI regulation adopted by Seddon and also the Commissioner, John Edwards, since the latter took over at the start of last year.  

Under the previous Commissioner Elizabeth Denham, any strategic enforcement regarding FOI and failing authorities had dwindled to nothing. The experience of requestors was that the FOI system was beset by persistent lengthy delays, both from many authorities and also at the level of ICO complaints.  

The ICO’s Decision Notices would frequently comment on obstruction and incompetence from certain public bodies, as I reported when I was a BBC journalist, but without the regulator then making any serious systematic attempt to change the culture and operations of these authorities.
Under Denham the ICO had also ceased its previous policy of regularly and publicly revealing a list of authorities it was ‘monitoring’ due to their inadequate processing of FOI requests. Although this was in any case a weaker step than issuing formal enforcement notices and practice recommendations, in some cases it did have a positive effect.
Working at the BBC at the time I saw how, when the BBC was put into monitoring by the ICO, it greatly annoyed the information rights section, who brought in extra resources and made sure the BBC was released from it at the first opportunity.  

On the other hand, other public authorities with long-lasting deficiencies, such as the Home Office and the Metropolitan Police, were kept in ICO monitoring repeatedly, without improving significantly and without further, more effective action being taken against them.  

The ICO’s FOI team has also made important progress in the past year in rectifying its own defects in processing complaints, speeding things up and tackling its backlog. This led to a rapid rush of decision notices.
One result is that delay has been shifted further up the system, as the First-tier Tribunal has been struggling to cope with a concomitant increase in the number of decisions appealed. I understand that the proportion of decisions appealed did not change, although I don’t know if the balance between requestor appeals and authority appeals has altered. 

Another consequence has been that decision notices now tend to be shorter than they used to be, especially those which support the stance of the public authority and thus require less interventionist argument from the ICO. Requestors may need to be reassured that the pressure on ICO staff for speedier decisions does not mean that finely balanced cases end up predominantly being decided on the side of the authority.  

More generally, I gather there is some concern within the ICO about its decisions under sections 35 and 36 of FOI (policy formulation and free and frank advice): that some staff have got into a pattern of dismissing requestors’ arguments without properly considering the specific circumstances which may favour disclosure. 

As part of its internal operational changes, a few months ago the ICO introduced a procedure for prioritisation amongst appeals and expediting selected ones. I have seen the evidence of this myself.  A complaint I made in April was prioritised and allocated to a case worker within six weeks and then a decision notice served within another six weeks (although sadly my case was rejected). All done within three months.  

On the other hand a much older appeal that I submitted to the ICO in May 2022 has extraordinarily still not even been allocated to a case worker 15 months later, from what I have been told. This is partly because it relates to the Cabinet Office, which accounts for a large proportion of the ICO’s oldest casework and has been allowed a longer period of time to work through old cases.  

It is interesting to note that the ICO does not proactively tell complainants that their case has been prioritised, even when they have specifically argued it should be at the time of submitting their complaint.
The ICO wants to avoid its staff getting sucked in to disputes about which appeals merit prioritisation. If you want to know whether your case has been prioritised, you have to ask explicitly, and then you will be told. 

The ICO has not yet officially released any statistics about the impact of its new prioritisation policy. However I understand that in the first three months about 60 cases were prioritised and allocated to a case officer to investigate within a month or so. This is a smaller number than might have been expected.  

Around 80 percent of these were prioritised in line with the criterion for the importance of the public interest involved in the issue. And about 60 percent of decisions to prioritise reflected the fact that the requestor was in a good position to disseminate further any information received, possibly as a journalist or campaigner. 

In most of the early decision notices for prioritised complaints the ICO has backed the authority and ruled against disclosure. So if you are a requestor, the fact that the ICO has decided to prioritise your appeal certainly does not mean that it has reached a preliminary decision that you are right.  

Martin Rosenbaum is the author of Freedom of Information: A practical guidebook. The book is aimed at requestors and provides thorough guidance on the workings of the law, how best to frame requests and how to challenge refusals. It will also be valuable to FOI officers and others who want a better understanding of the perspective of requestors. In the book Martin passes on the benefits of all the expertise and experience he acquired during 16 years as the leading specialist in BBC News in using FOI for journalism. 

Act Now Launches New UAE DP Officer Certificate 

Act Now Training is pleased to announce the launch of the new UAE Data Protection Officer Certificate.  

Data Protection law in the Middle East has seen some rapid developments recently. The UAE recently enacted a federal law to comprehensively regulate the processing of personal data in all seven emirates. This will sit alongside current data protection laws regulating businesses in the various financial districts such as the Dubai International Financial Centre (DIFC) Data Protection Law No. 5 of 2020 and the Abu Dhabi Global Market (ADGM) Data Protection Regulations 2021. In addition there are several sector specific laws in the UAE which address personal privacy and data security. Saudi Arabia, Bahrain and Qatar also now have comprehensive data protection laws.   

These laws require a fundamental assessment of the way Middle East businesses handle personal data, from collection through to storage, disclosure and destruction. With enhanced rights for individuals and substantial fines for non-compliance, no business can afford to ignore the new requirements. 

Act Now’s UAE Data Protection Officer Certificate has been developed following extensive discussions with our clients and partners in the UAE and builds on our experience of delivering training and consultancy in the region. The course focuses on the essential knowledge required by DPOs to successfully navigate the UAE data protection landscape. The course will also help DPOs to develop the skills required to do their job better.
These include interpreting the data protection principles in a practical context, drafting privacy notices, undertaking DPIAs and reporting data breaches. 

The course teaching style is based on four practical and engaging workshops covering theory alongside hands-on application using case studies that equip delegates with knowledge and skills that can be used immediately. Delegates will also have personal tutor support throughout the course and access to a comprehensive revised online resource lab. 

Ibrahim Hasan, director of Act Now Training, said: 

“I am really pleased to be launching this new UAE DPO certificate course. This is an exciting time for data protection law in the Middle East. Act Now is committed to contributing to the development of the DPO function in the region.” 

If you would like to discuss your suitability for this course, please get in touch. It can also be delivered as an in house option.

The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security issues. A number of governments have now taken a view that the video sharing platform represents an unacceptable risk that enables Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures. 

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”
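In engineering terms, Article 8 amounts to an age gate in the sign-up flow. Here is a minimal sketch of that logic, assuming a UK-only service relying on consent; the verification step is left abstract because Article 8(2) requires only “reasonable efforts”, which in practice might mean email confirmation or similar checks:

```python
UK_DIGITAL_CONSENT_AGE = 13  # Article 8(1) UK GDPR

def consent_route(age: int) -> str:
    """Decide who must give consent where an information society
    service relies on consent (Article 6(1)(a)) as its lawful basis."""
    if age >= UK_DIGITAL_CONSENT_AGE:
        return "child may give their own consent"
    # Under 13: consent must be given or authorised by the holder of
    # parental responsibility, and the controller must make reasonable
    # efforts to verify this (Article 8(2)).
    return "obtain and verify parental consent"

print(consent_route(12))  # obtain and verify parental consent
```

Of course, the ICO’s criticism of TikTok was not the gate itself but the lack of adequate checks behind it: self-declared ages are easily falsified, which is why the regulator expected further steps to identify and remove underage users.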

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from its platform. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8 the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used, and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is noticeable that this fine is less than half the amount (£27 million) set out in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently, this potential infringement was not included in the final amount of the fine.

We have been here before! In July 2019 British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine, issued in July 2020, was £20 million. Marriott International Inc was fined £18.4 million in 2020, much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced (from the notice to the final amount) suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to think about whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently an appeal against the ICO’s fine of £1.35 million issued to Easylife Ltd was withdrawn, after the parties reached an agreement whereby the amount of the fine was reduced to £250,000.

The Children’s Code

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed by Ireland’s Data Protection Commission on Meta), it is likely that more ICO regulatory action will follow. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates

Experian’s GDPR Appeal: Lawfulness, Fairness, and Transparency

On 20th February 2023, the First-Tier (Information Rights) Tribunal (FTT) overturned an Enforcement Notice issued against Experian by the Information Commissioner’s Office (ICO). 

This case relates to Experian’s marketing arm, Experian Marketing Services (EMS) which provides analytics services for direct mail marketing companies. It obtains personal data from three types of sources; publicly available sources, third parties and Experian’s credit reference agency (CRA) business. The company processes this personal data to build profiles about nearly every UK adult. An individual profile can contain over 400 data points. The company sells access to this data to marketing companies that wish to improve the targeting of their postal direct marketing communications. 

The ICO issued an Enforcement Notice against Experian in April 2020, alleging several GDPR violations, namely Art. 5(1)(a) (Principle 1: lawfulness, fairness and transparency), Art. 6(1) (Lawfulness of processing) and Art. 14 (Information to be provided where personal data have not been obtained from the data subject). 

Fair and Transparent Processing: Art 5(1)(a) 

The ICO criticised Experian’s privacy notice for being unclear and for not emphasising the “surprising” aspects of Experian’s processing. It ordered Experian to: 

  • Provide an up-front summary of Experian’s direct marketing processing. 
  • Put “surprising” information (e.g. regarding profiling via data from multiple sources) on the first or second layer of the notice. 
  • Use clearer and more concise language. 
  • Disclose each source and use of data and explain how data is shared, providing examples.  

The ICO also ordered Experian to stop using credit reference agency data (CRA data) for any purpose other than those requested by Data Subjects. 

Lawful Processing: Arts. 5(1)(a) and 6(1) 

All processing of personal data under the GDPR requires a legal basis. Experian processed all personal data held for marketing purposes on the basis of its legitimate interests, including personal data that was originally collected on the basis of consent. Before relying on legitimate interests, controllers must conduct a “legitimate interests assessment” to weigh the benefits of the processing against its risks. Experian had done this, but the ICO said the company had got the balance wrong. It ordered Experian to: 

  • Delete all personal data that had been collected via consent and was subsequently being processed on the basis of Experian’s legitimate interests. 
  • Stop processing personal data where an “objective” legitimate interests assessment revealed that the risks of the processing outweigh the benefits. 
  • Review the GDPR compliance of all third parties providing Experian with personal data. 
  • Stop processing any personal data that has not been collected in a GDPR-compliant way. 

Transparency: Art. 14 

Art. 14 GDPR requires controllers to provide notice to data subjects when obtaining personal data from a third-party or publicly available source. Experian did not provide such notices, relying on the exceptions in Art. 14. 

Where Experian had received personal data from third parties, it said that it did not need to provide a notice because “the data subject already has the information”. It noted that before a third party sent Experian personal data, the third party would provide Data Subjects with its own privacy notice. That privacy notice would contain links to Experian’s privacy notice.
Where Experian had obtained personal data from a publicly available source, such as the electoral register, it claimed that to provide a notice would involve “disproportionate effort”. 

The ICO did not agree that these exceptions applied to Experian, and ordered it to: 

  • Send an Art. 14 notice to all Data Subjects whose personal data had been obtained from a third-party source or (with some exceptions) a publicly available source. 
  • Stop processing personal data about Data Subjects who had not received an Art. 14 notice. 

The FTT Decision  

The FTT found that Experian committed only two GDPR violations: 

  • Failing to provide an Art. 14 notice to people whose data had been obtained from publicly available sources. 
  • Processing personal data on the basis of “legitimate interests” where that personal data had been originally obtained on the basis of “consent” (by the time of the hearing, Experian had stopped doing this). 

The FTT said that the ICO’s Enforcement Notice should have given more weight to:  

  • The costs of complying with the corrective measures. 
  • The benefits of Experian’s processing. 
  • The fact that Data Subjects would (supposedly) not want to receive an Art. 14 notice. 

The FTT overturned most of the ICO’s corrective measures. The only new obligation on Experian is to send Art. 14 notices in future to some people whose data comes from publicly available sources. 

FTT on Transparency 

Experian had improved its privacy notice before the hearing, and the FTT was satisfied that it met the Art. 14 requirements. It agreed that Experian did not need to provide a notice to Data Subjects where it had received their personal data from a third party. The FTT said that “…the reasonable data subject will be familiar with hyperlinks and how to follow them”.
People who wanted to know about Experian’s processing had the opportunity to learn about it via third-party privacy notices. 

However, the FTT did not agree with Experian’s reliance on the “disproportionate effort” exception. In future, Experian will need to provide Art. 14 notices to some Data Subjects whose personal data comes from publicly available sources. 

FTT on Risks of Processing 

An ICO expert witness claimed that Experian’s use of CRA data presented a risk to Data Subjects. The witness later admitted he had misunderstood this risk. The FTT found that Experian’s use of CRA data actually decreased the risk of harm to Data Subjects. For example, Experian used CRA data to “screen out” data subjects with poor credit history from receiving marketing about low-interest credit cards. The FTT found that this helped increase the accuracy of marketing and was therefore beneficial. As such, the FTT found that the ICO had not properly accounted for the benefits of Experian’s processing of CRA data. 

The ICO’s Planned Appeal 

The FTT’s decision focuses heavily on whether Experian’s processing was likely to cause damage or distress to Data Subjects. Because the FTT found that the risk of damage was low, Experian could rely on exceptions that might not have applied to riskier processing.  

The ICO has confirmed that it will appeal the decision. There are no details yet of its arguments, but it may claim that the FTT took an excessively narrow interpretation of privacy harms. 

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next Advanced Certificate in GDPR Practice.  

£4.4 Million GDPR Fine for Construction Company 

This month the UK Information Commissioner’s Office has issued two fines and one Notice of Intent under GDPR. 

The latest fine is three times more than that imposed on Easylife Ltd on 5th October. Yesterday, Interserve Group Ltd was fined £4.4 million for failing to keep personal information of its staff secure.  

The ICO found that the Berkshire-based construction company failed to put appropriate security measures in place to prevent a cyber-attack, which enabled hackers to access the personal data of up to 113,000 employees through a phishing email. The compromised data included personal information such as contact details, national insurance numbers, and bank account details, as well as special category data including ethnic origin, religion, details of any disabilities, sexual orientation, and health information. 

The Phishing Email 

In March 2020, an Interserve employee forwarded a phishing email, which was not quarantined or blocked by Interserve’s IT system, to another employee who opened it and downloaded its content. This resulted in the installation of malware onto the employee’s workstation. 

The company’s anti-virus quarantined the malware and sent an alert, but Interserve failed to thoroughly investigate the suspicious activity. If they had done so, Interserve would have found that the attacker still had access to the company’s systems. 

The attacker subsequently compromised 283 systems and 16 accounts, as well as uninstalling the company’s anti-virus solution. Personal data of up to 113,000 current and former employees was encrypted and rendered unavailable. 

The ICO investigation found that Interserve failed to follow up on the original alert of suspicious activity, used outdated software systems and protocols, provided inadequate staff training and carried out insufficient risk assessments, which ultimately left it vulnerable to a cyber-attack. Consequently, Interserve had breached Article 5 and Article 32 of the GDPR by failing to put appropriate technical and organisational measures in place to prevent the unauthorised access of people’s information. 

Notice of Intent 

Interestingly, in this case the Notice of Intent (the precursor to the fine) was also for £4.4 million, i.e. no reductions were made by the ICO despite Interserve’s representations. Compare this to the ICO’s treatment of two much bigger companies which also suffered cyber security breaches. In July 2019, British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine was reduced to £20 million in July 2020. In November 2020, Marriott International Inc was fined £18.4 million, much lower than the £99 million set out in the original notice. 

The Information Commissioner, John Edwards, has warned that companies are leaving themselves open to cyber-attack by ignoring crucial measures like updating software and training staff: 

“The biggest cyber risk businesses face is not from hackers outside of their company, but from complacency within their company. If your business doesn’t regularly monitor for suspicious activity in its systems and fails to act on warnings, or doesn’t update software and fails to provide training to staff, you can expect a similar fine from my office. 

Leaving the door open to cyber attackers is never acceptable, especially when dealing with people’s most sensitive information. This data breach had the potential to cause real harm to Interserve’s staff, as it left them vulnerable to the possibility of identity theft and financial fraud.” 

We have been here before. On 10th March the ICO fined Tuckers Solicitors LLP £98,000 following a ransomware attack on the firm’s IT systems in August 2020. The attacker had encrypted 972,191 files, of which 24,712 related to court bundles. 60 of those were exfiltrated by the attacker and released on the dark web.   

Action Points  

Organisations need to strengthen their defences and have plans in place, not just to prevent a cyber-attack but also for what to do when one takes place. Here are our top tips: 

  1. Conduct a cyber security risk assessment and consider an external accreditation through Cyber Essentials. 
  2. Ensure your employees know the risks of malware/ransomware and follow good security practice. At the time of the cyber-attack, one of the two Interserve employees who received the phishing email had not undertaken data protection training. (Our GDPR Essentials e-learning is a very cost-effective solution which contains a specific module on keeping data safe.)  
  3. Have plans in place for a cyber security breach. See our Managing Personal Data Breaches workshop.  
  4. Earlier in the year, the ICO worked with the NCSC to remind organisations not to pay a ransom in the event of a cyber-attack, as it does not reduce the risk to individuals and is not considered a reasonable step to safeguard data. For more information, take a look at the ICO ransomware guidance or visit the NCSC website to learn about mitigating a ransomware threat via their business toolkit.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 21st November.