CJEU’s FT v. DW Ruling: Navigating Data Subject Access Requests 

In the landmark case FT v. DW (Case C-307/22), the Court of Justice of the European Union (CJEU) delivered a ruling that sheds light on the intricacies of data subject access requests under the EU General Data Protection Regulation (GDPR). The dispute began when DW, a patient, sought a first free copy of their dental medical records from FT, a dentist, citing concerns about possible malpractice. FT declined the request, relying on German law, which requires patients to pay for copies of their medical records. The ensuing legal dispute made its way through the German courts, eventually reaching the CJEU, which had to consider three pivotal questions. These are detailed below. 

Question 1: The Right to a Free Copy of Personal Data 

The first question was whether the GDPR requires healthcare providers to give patients a cost-free copy of their personal data, irrespective of the request’s motive; in DW’s case the request appeared to be made with potential litigation in mind. Examining Articles 12(5) and 15(3) of the GDPR, read in the light of Recital 63, the CJEU concluded that the first copy of personal data must be provided free of charge and that individuals need not disclose their reasons for such requests, highlighting the GDPR’s overarching principle of transparency. 

Question 2: Economic Considerations Versus Rights under the GDPR 

The second question concerned the interaction of the GDPR with pre-existing national laws that protect the economic interests of data controllers, such as healthcare providers. The CJEU assessed whether Article 23(1)(i) of the GDPR could uphold a national rule that imposes a fee for the first copy of personal data. The court found that while Article 23(1)(i) can apply to laws pre-dating the GDPR, it does not justify charging for the first copy of personal data, thus prioritising the rights of individuals over the economic interests of data controllers. 

Question 3: Extent of Access to Medical Records 

The final issue addressed the extent of access to personal data, particularly whether it encompasses the entire medical record or merely a summary. The CJEU clarified that according to Article 15(3) of the GDPR, a “copy” entails a complete and accurate representation of the personal data, not merely a physical document or an abridged version. This means that a patient is entitled to access the full spectrum of their personal data within their medical records, ensuring they can fully verify and understand their information. 

Conclusion 

The CJEU’s decision in FT v. DW reaffirms the GDPR’s commitment to data subject rights and offers a helpful interpretation of the regulation. It confirms the right of individuals to a free first copy of their personal data for any purpose, rejects the imposition of fees by national law for such access, and establishes the right to a complete reproduction of the personal data contained within medical records. The judgement adds that, even though the term ‘copy’ is used, the data provided must be complete, as well as contextual and intelligible, as required by Article 12(1) of the GDPR. 

We will be examining the impact of this on our upcoming Handling SARs course as well as looking at the ruling in our GDPR Update course. Places are limited so book early to avoid disappointment.

Clearview AI Wins Appeal Against GDPR Fine 

Last week a Tribunal overturned a GDPR Enforcement Notice and a Monetary Penalty Notice issued to Clearview AI, an American facial recognition company. In Clearview AI Inc v The Information Commissioner [2023] UKFTT 00819 (GRC), the First-Tier Tribunal (Information Rights) ruled that the Information Commissioner had no jurisdiction to issue either notice, on the basis that the GDPR/UK GDPR did not apply to the personal data processing in issue.  

Background 

Clearview is a US-based company which describes itself as the “World’s Largest Facial Network”. Its online database contains 20 billion images of people’s faces, together with data scraped from publicly available sources on the internet and social media platforms all over the world. Customers can upload an image of a person to its app; the app then identifies the person by checking the image against all the images in the Clearview database.  

In May 2022 the ICO issued a Monetary Penalty Notice of £7,552,800 to Clearview for breaches of the GDPR, including failing to use the information of people in the UK in a way that is fair and transparent. Although Clearview is a US company, the ICO ruled that the UK GDPR applied because of Article 3(2)(b) (territorial scope). It concluded that Clearview’s processing activities “are related to… the monitoring of [UK residents’] behaviour as far as their behaviour takes place within the United Kingdom.” 

The ICO also issued an Enforcement Notice ordering Clearview to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems. (See our earlier blog for more detail on these notices.) 

The Judgement  

The First-Tier Tribunal (Information Rights) has now overturned the ICO’s Enforcement Notice and Monetary Penalty Notice against Clearview. It concluded that although Clearview did carry out data processing related to monitoring the behaviour of people in the UK (Article 3(2)(b) of the UK GDPR), the ICO did not have jurisdiction to take enforcement action or issue a fine. Both the GDPR and UK GDPR provide that acts of foreign governments fall outside their scope; it is not for one government to seek to bind or control the activities of another sovereign state. However, the Tribunal noted that the ICO could have taken action under the Law Enforcement Directive (implemented in the UK by Part 3 of the DPA 2018), which specifically regulates the processing of personal data for law enforcement purposes. 

Learning Points 

While the Tribunal’s judgement in this case reflects the specific circumstances, some of its findings are of wider application: 

  • The term “behaviour” (in Article 3(2)(b)) means something about what a person does (e.g., location, relationship status, occupation, use of social media, habits) rather than information that merely identifies or describes them (e.g., name, date of birth, height, hair colour).  

  • The term “monitoring” comes up not only in Article 3(2)(b) but also in Article 35(3)(c) (when a DPIA is required). The Tribunal ruled that monitoring includes tracking a person at a fixed point in time as well as on a continuous or repeated basis.

  • In this case, Clearview was not monitoring UK residents directly as its processing was limited to creating and maintaining a database of facial images and biometric vectors. However, Clearview’s clients were using its services for monitoring purposes and therefore Clearview’s processing “related to” monitoring under Article 3(2)(b). 

  • A provider of services like Clearview may be considered a joint controller with its clients where both determine the purposes and means of processing. In this case, Clearview was a joint controller with its clients because it imposed restrictions on how clients could use the services (i.e., only for law enforcement and national security purposes) and determined the means of processing when matching query images against its facial recognition database.  

Data Scraping 

The ruling is not a green light for data scraping, the practice of collecting and processing publicly available data, usually from the internet, often without the data subject’s knowledge. The Tribunal ruled that this is an activity to which the UK GDPR can apply. In its press release, reacting to the ruling, the ICO said: 

“The ICO will take stock of today’s judgment and carefully consider next steps.
It is important to note that this judgment does not remove the ICO’s ability to act against companies based internationally who process data of people in the UK, particularly businesses scraping data of people in the UK, and instead covers a specific exemption around foreign law enforcement.” 

This is a significant ruling from the First-Tier Tribunal which has implications for the extra-territorial effect of the UK GDPR and the ICO’s powers to enforce it. It merits an appeal by the ICO to the Upper Tribunal. Whether this happens depends very much on the ICO’s appetite for a legal battle with a tech company with deep pockets.  

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop.  

Act Now in Dubai: Season 2 

On the 2nd and 3rd October 2023, the UAE held the first ever privacy and data protection law conference; a unique event organised by the Dubai International Financial Centre (DIFC) and data protection practitioners in the Middle East. The conference brought together data protection and security compliance professionals from across the world to discuss the latest developments in the Middle East data protection framework.  

Data Protection law in the Middle East has seen some rapid developments recently. The UAE has enacted its first federal law to comprehensively regulate the processing of personal data in all seven emirates. Once in force (expected to be early next year), this will sit alongside the current data protection laws regulating businesses in the various UAE financial districts, such as the Dubai International Financial Centre (DIFC) Data Protection Law No. 5 of 2020 and the Abu Dhabi Global Market (ADGM) Data Protection Regulations 2021. Jordan, Oman, Bahrain and Qatar also have comprehensive data protection laws. Currently, what is causing most excitement in the Middle East data protection community is Saudi Arabia’s Personal Data Protection Law (PDPL), which came into force on 14th September 2022.  

The conference agenda covered various topics including the interoperability of data protection laws in the GCC, unlocking data flows in the region, smart cities, the use of facial recognition and data localisation. The focus of day 2 was on AI and machine learning. There were some great panels on this topic discussing AI standards, transparency and the need for regulation.   

Speakers included the UAE Minister for AI, His Excellency Omar Sultan Al Olama, as well as leading data protection lawyers and practitioners from around the world. Elisabeth Denham, former UK Information Commissioner, also addressed the delegates alongside data protection regulators from across the region. Act Now’s director, Ibrahim Hasan, was invited to take part in a panel discussion to share his experience of GDPR litigation and enforcement action in the UK and EU and what lessons can be drawn for the Middle East. 

Alongside Ibrahim, the Act Now team were at the conference to answer delegates’ questions about our UAE and KSA training programmes. Act Now has delivered training extensively in the Middle East to a wide range of delegates, including representatives of the telecommunications, legal and technology sectors. We were pleased to see that there was a lot of interest in our courses, especially our DPO certificates.  

Following the conference, Ibrahim was invited to deliver a guest lecture to law students at Middlesex University Dubai. This is the biggest university in Dubai with over 4500 students from over 118 countries. Ibrahim talked about the importance of Data Protection law and job opportunities in the information governance profession. He was pleasantly surprised by the students’ interest in the subject and their willingness to consider IG as an alternative career path. A fantastic end to a successful trip. Our thanks to the conference organisers, particularly Lori Baker at the DIFC Commissioner’s Office, and our friends at Middlesex University Dubai for inviting us to address the students.  

Now is the time to train your staff in the new data protection laws in the Middle East. We can deliver online as well as face to face training. All of our training starts with a free analysis call to ensure you have the right level and most appropriate content for your organisation’s needs. Please get in touch to discuss your training or consultancy needs.  

All Go for UK to US Data Transfers 

On 10th July 2023, the European Commission adopted its adequacy decision under Article 45 of the GDPR for the EU-US Data Privacy Framework (DPF). It concluded that the United States ensures an adequate level of protection, comparable to that of the European Union, for personal data transferred from the EU to US companies under the new framework. This means that personal data can flow safely from the EU to US companies participating in the Framework, without the need for additional data protection safeguards under the GDPR. 

The question then is: what about transfers from the UK to the US, which were not covered by the above? The Data Protection (Adequacy) (United States of America) Regulations 2023 (SI 2023/1028) will come into force on 12th October 2023. From that date, a transfer of personal data from the UK to a US entity which has self-certified to the EU-US Data Privacy Framework and its UK extension, and which will abide by the EU-US Data Privacy Framework Principles, will be deemed to offer an adequate level of protection for personal data and will be lawful in accordance with Article 45(1) of the UK GDPR.  

Currently, data transfers from the UK to the US under the UK GDPR must either be based on a safeguard, such as standard contractual clauses or binding corporate rules, or fall within the scope of a derogation under Article 49 UK GDPR. 

UK Data Controllers need to update privacy policies and document their own processing activities as necessary to reflect any changes in how they transfer personal data to the US. 

The new EU-US Data Privacy Framework will be discussed in detail in our forthcoming International Transfers workshop. 

Act Now Launches New UAE DP Officer Certificate 

Act Now Training is pleased to announce the launch of the new UAE Data Protection Officer Certificate.  

Data Protection law in the Middle East has seen some rapid developments recently. The UAE recently enacted a federal law to comprehensively regulate the processing of personal data in all seven emirates. This will sit alongside current data protection laws regulating businesses in the various financial districts such as the Dubai International Financial Centre (DIFC) Data Protection Law No. 5 of 2020 and the Abu Dhabi Global Market (ADGM) Data Protection Regulations 2021. In addition there are several sector specific laws in the UAE which address personal privacy and data security. Saudi Arabia, Bahrain and Qatar also now have comprehensive data protection laws.   

These laws require a fundamental assessment of the way Middle East businesses handle personal data from collection through to storage, disclosure and destruction. With enhanced rights for individuals and substantial fines for non-compliance no business can afford to ignore the new requirements. 

Act Now’s UAE Data Protection Officer Certificate has been developed following extensive discussions with our clients and partners in the UAE and builds on our experience of delivering training and consultancy in the region. The course focuses on the essential knowledge required by DPOs to successfully navigate the UAE data protection landscape. It will also help DPOs to develop the skills required to do their job better, including interpreting the data protection principles in a practical context, drafting privacy notices, undertaking DPIAs and reporting data breaches. 

The course teaching style is based on four practical and engaging workshops covering theory alongside hands-on application using case studies that equip delegates with knowledge and skills that can be used immediately. Delegates will also have personal tutor support throughout the course and access to a comprehensive revised online resource lab. 

Ibrahim Hasan, director of Act Now Training, said: 

“I am really pleased to be launching this new UAE DPO certificate course. This is an exciting time for data protection law in the Middle East. Act Now is committed to contributing to the development of the DPO function in the region.” 

If you would like to discuss your suitability for this course, please get in touch. It can also be delivered as an in-house option.

Privacy Concerns Raised Over Adoption Records on Genealogy Website 

Last week, the names and details of individuals adopted over the past century were found to be accessible on Scotland’s People, the genealogy website run by the National Records of Scotland (NRS). The exposure of these records, alongside other recent data breaches, has ignited a debate about privacy and security.

Upon being alerted by a concerned mother, who discovered her adopted child’s details on the website, the NRS acted promptly, removing the information within 36 hours. The mother detailed her experience in an interview with BBC Scotland News. She highlighted the potential risk of the website inadvertently enabling individuals to discern the adopted child’s new surname. This revelation is alarming, especially as many adoptive parents opt to retain the first names of their children.

A deeper look at the website’s database revealed that the platform held information on adoptions dating as far back as 1909, with the most recent entries from 2022. Nick Hobbs, the acting Children’s Commissioner in Scotland, said that the exposed data could be in violation of both the European Convention on Human Rights and the United Nations Convention on the Rights of the Child, both of which enshrine the right to privacy.

While the NRS responded by temporarily removing the records from the site, they highlighted their statutory responsibility to maintain open and searchable registers. They also stressed that this incident did not constitute a personal data breach. Nonetheless, as a precautionary measure, they informed the Information Commissioner’s Office (ICO) about the concerns raised.

The ICO, in its statement, underscored the importance of sensitive personal data being managed in accordance with data protection laws. They clarified that while the NRS did notify them, they had not received a formal breach report.  

This incident serves as a poignant reminder of the complexities of balancing transparency and privacy in the digital age. As the debate around personal data continues to evolve, it underscores the need for stringent measures and vigilance in the handling of sensitive information, especially when it pertains to vulnerable groups. It is paramount that organisations maintain robust data governance practices to prevent potential breaches and safeguard individual rights. 

We have two workshops coming up in September (Introduction to Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. 

Ibrahim Hasan’s BBC Radio Ulster Interview about the PSNI Data Breach 

Today, Ibrahim Hasan gave an interview to BBC Radio Ulster about the Police Service of Northern Ireland’s (PSNI) recent data breach. In response to an FOI request, the PSNI shared the names of all officers and staff, where they were based and their roles. Listen below. More about the PSNI and the Electoral Commission data breaches here.

We have two workshops coming up in September (Introduction to Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. Our Data Mapping workshop is proving very popular with IG and DP Officers who wish to develop this skill.

The Electoral Commission and PSNI: One Day, Two Data Breaches!

Yesterday two major data breaches were reported in the public sector. Both have major implications for individuals’ privacy. They are also a test for the Information Commissioner’s Office’s (ICO) approach to the use of its enforcement power.

In the morning, the Electoral Commission revealed, in a public notice issued under Articles 33 and 34 of the UK GDPR, that it had been the victim of a “complex cyber-attack” potentially affecting millions of voters. It only discovered in October last year that unspecified “hostile actors” had managed to gain access, from August 2021, to copies of the electoral registers. Hackers also broke into its emails and “control systems”.

The Commission said the information it held at the time of the attack included the names and addresses of people in the UK who registered to vote between 2014 and 2022. This includes those who opted to keep their details off the open register, which is not accessible to the public but can be purchased. The data accessed also included the names, but not the addresses, of overseas voters.  

The Commission said it is difficult to predict exactly how many people could be affected, but it estimates the register for each year contains the details of around 40 million people. It has warned people to watch out for unauthorised use of their data. The ICO has issued a statement saying it is currently making enquiries into the incident.

And then late last night, and perhaps even more worrying for those involved, the Police Service of Northern Ireland apologised for a data breach affecting thousands of officers. In response to a Freedom of Information (FOI) request, the PSNI mistakenly divulged information on “every police officer and member of police staff”, a senior officer said. The FOI request, made via the WhatDoTheyKnow.com website, had asked the PSNI for a breakdown of all staff ranks and grades. But as well as a table containing the number of people holding positions such as constable, the published response included a spreadsheet. This contained the surnames of more than 10,000 individuals, their initials and other data, but did not include any private addresses. The information was published on the WhatDoTheyKnow website for more than two hours.

The ICO has just issued a statement about the PSNI data breach. A few years ago such data breaches would attract large fines. In 2021 the Cabinet Office was fined £500,000 (later reduced to £50,000) for publishing the postal addresses of the 2020 New Year Honours recipients online. In June 2022 John Edwards, the Information Commissioner, announced a new approach towards the public sector, aimed at reducing the impact of fines on the sector. This centred on issuing reprimands rather than fines. Since then no public sector organisation has been fined, despite some very serious data breaches. In May 2023, Thames Valley Police (TVP) was issued with a reprimand after an ICO investigation found that TVP had inappropriately disclosed contextual information that led to suspected criminals learning the address of a witness (the data subject). As a result of this incident, the data subject moved address, and the impact and risk to the data subject remains high. Many data protection experts have expressed concern about the public sector’s special treatment. In relation to yesterday’s data breaches, anything other than serious enforcement action will lead to further questions for the ICO. 

The scale of the PSNI data breach is huge. The release of the names exposes individuals who are regularly targeted by terrorist groups. Had the breach included addresses, it would have been even more serious. Both these breaches are going to test the ICO’s public sector enforcement policy.

Ibrahim Hasan has given an interview to BBC Radio Ulster about the PSNI data breach. Listen here.

We have two workshops coming up in September (Introduction to Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. Our Data Mapping workshop is proving very popular with IG and DP Officers who wish to develop this skill.

Facial Recognition CCTV Cameras in Every Store?

The Observer recently reported that Home Office officials have developed covert plans to lobby the Information Commissioner’s Office (ICO) in an effort to hasten the adoption of contentious facial recognition technology in high street stores and supermarkets. Critics argue that such technology raises concerns about bias and data privacy.

Despite these objections, the Home Office appears to be pushing for the adoption of facial recognition in stores. The minutes of the recent meeting, obtained under the Freedom of Information Act, appear to show Home Office officials agreeing to write to the ICO praising the merits of facial recognition technology in combating “retail crime”. This ignores critics who claim the technology violates human rights and is biased, particularly against darker-skinned people.

Police minister Chris Philp, senior Home Office officials, and the commercial company Facewatch came to an agreement on the covert strategy on 8th March 2023 during a meeting held behind closed doors. Facewatch provides facial recognition cameras to help retailers combat shoplifting. It has courted controversy and was investigated by the ICO earlier this year following a complaint by Big Brother Watch.

Despite finding multiple UK GDPR violations on 28th March, the ICO told Facewatch it would take no further action. The ICO said it “welcomed” remedial steps that Facewatch had taken, or would take, to address the above violations. Those remedial steps have been redacted from public information about the case.

Facial recognition technology has faced extensive criticism and scrutiny, leading the European Union to consider a ban on its use in public spaces through the upcoming Artificial Intelligence Act. However, the UK’s Data Protection and Digital Information (No.2) Bill proposes to eliminate the government-appointed Surveillance Camera Commissioner role and the requirement for a surveillance camera code of practice.

Our forthcoming CCTV workshop is ideal for those who want to explore the GDPR and privacy issues around all types of CCTV cameras, including drones and body worn cameras. Our Advanced Certificate in GDPR Practice is a practical, scenario-based course designed to help delegates gain the confidence to tackle complex GDPR issues in a methodical way.

Council Loses High Court Damages Claim for Misuse of Personal Data 

A recent High Court judgment highlights the importance of data controllers treating personal data in their possession with care and in accordance with their obligations under the General Data Protection Regulation (GDPR). Failure to do so will also expose them to a claim in the tort of misuse of private information.

The Facts

In Yae Bekoe v London Borough of Islington [2023] EWHC 1668 (KB) the claimant, Mr Bekoe, had an informal arrangement with his neighbour to manage and rent out flats on her behalf, with the income intended to support her care needs. In 2015, Islington Council initiated possession proceedings against Mr Bekoe. During the proceedings, the council submitted evidence to the court, including details of Mr Bekoe’s bank accounts, mortgage accounts, and balances, providing a snapshot of his financial affairs at that time. Some of this information, it appears, was held internally by the council and disclosed by one department to another for the purpose of investigating “fraud”, while other information was obtained via a court application for disclosure against the bank and Mr Bekoe.

Subsequently, Mr Bekoe filed a claim against Islington Council, alleging the misuse of his private information and a breach of the GDPR. Amongst other things, he argued that the council obtained his private information without any legal basis. Mr Bekoe also claimed that the council failed to comply with its obligations under the GDPR in responding to his Subject Access Request (SAR); he made the request at the start of the legal proceedings, but the council’s response was delayed. Mr Bekoe also claimed that the council was responsible for additional GDPR infringements, including failing to disclose further data and destroying his personal data in the form of the legal file which related to ongoing proceedings.

The Judgement

The judge awarded Mr Bekoe damages of £6,000, taking into account the misuse of private information, the loss of control over that information, and the distress caused by the breaches of the GDPR. He ruled that the information accessed went beyond what was necessary to demonstrate property-related payments. Regarding the breaches of the GDPR, the judge concluded that: 

  • The council significantly breached the GDPR by delaying the effective response to the subject access request for almost four years. 
  • There was additional personal data belonging to Mr. Bekoe held by the council that had not been disclosed, constituting a breach of the GDPR. 
  • While the specifics of the lost or destroyed legal file were unclear, there was a clear failure to provide adequate security for Mr. Bekoe’s personal data, breaching the GDPR. 
  • Considering the inadequate response to the subject access request, the loss or destruction of the legal file, and the failure to ensure adequate security for further personal data, the council breached Mr. Bekoe’s GDPR rights under Articles 5 (data protection principles), 12 (transparency), and 15 (right of access). 
     

The Lessons

Whilst this High Court decision is highly fact-specific and not binding on other courts, it demonstrates the importance of ensuring there is a sound legal basis for accessing personal data and of properly responding to subject access requests. Not only do individuals have the right to seek compensation for breaches of the UK GDPR, including failures to respond to subject access requests, but the Information Commissioner’s Office (ICO) can also take regulatory action, which may include issuing reprimands or fines. Indeed, last September the ICO announced it was acting against seven organisations for delays in dealing with Subject Access Requests (SARs). These included government departments, local authorities, and a communications company. 

This and other GDPR developments will be discussed in our forthcoming GDPR Update workshop.