Lloyd v Google: What DPOs need to know

Last week, the UK Supreme Court handed down its much anticipated judgment in Lloyd v Google LLC [2021] UKSC 50. It is a significant case because it answers two important questions: (1) whether US-style class action lawsuits can be brought for data protection claims; and (2) whether damages can be claimed for mere “loss of control” of personal data where no actual damage has been suffered by data subjects. Had the Supreme Court answered “yes” to either question, Data Controllers would have faced much more costly data breach litigation.

The present case was brought by Richard Lloyd, a former director of consumer rights group Which?, who alleged that between 2011 and 2012, Google cookies collected data on health, race, ethnicity, sexuality and finance through Apple’s Safari web browser, even when users had chosen a “do not track” privacy setting on their phone. Mr Lloyd sought compensation under section 13 of the old Data Protection Act 1998.

Mr Lloyd sought to bring a claim in a representative capacity on behalf of 4 million consumers; a US-style “class action”. In the UK, such claims currently require consumers to opt in, which can be a lengthy and costly process. Mr Lloyd attempted to set a precedent for opt-out cases, meaning one representative could bring an action on behalf of millions without the latter’s consent. He sought to use Rule 19.6 of the Civil Procedure Rules, which allows an individual to bring such a claim where all members of the class have the “same interest” in the claim. Because Google is a US company, Mr Lloyd needed the permission of the English court to pursue his claim. Google won in the High Court, only for the decision to be overturned by the Court of Appeal. If Mr Lloyd had succeeded in the Supreme Court on appeal, it could have opened the floodgates to many more mass actions against tech firms (and other data controllers) for data breaches.

The Supreme Court found class actions impermissible in principle in the present case. It said that, in order to advance such an action on behalf of each member of the proposed represented class, Mr Lloyd had to prove that each of those individuals had both suffered a breach of their rights and suffered actual damage as a result of that breach. Mr Lloyd had argued that a uniform sum of damages could be awarded to each member of the represented class without having to prove any facts particular to that individual. In particular, he had argued that compensation could be awarded under the DPA 1998 for “loss of control” of personal data constituted by any non-trivial infringement by a data controller of any of the requirements of the DPA 1998.

The Supreme Court rejected these arguments for two principal reasons. Firstly, the claim was based only on section 13 of the DPA 1998, which states that “an individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage”. The court ruled that “damage” here means material damage, such as financial loss or mental distress, caused by unlawful processing of personal data in contravention of the DPA 1998 (i.e. simply infringing the DPA 1998 does not in itself constitute “damage”). Secondly, in order to recover compensation under section 13 of the DPA 1998, it is necessary to prove what unlawful processing (by Google) of personal data relating to each individual actually occurred. A representative claim could have been brought to establish whether Google was in breach of the DPA 1998 as a basis for pursuing individual claims for compensation, but not here, where Mr Lloyd was claiming the same amount of damages (£750) for each of the 4 million iPhone users.

This case was decided under the DPA 1998. Article 82(1) of the UK GDPR now sets out the right to compensation: “Any person who has suffered material or non-material damage as a result of an infringement of this Regulation shall have the right to receive compensation from the controller or processor for the damage suffered”. Given the similar wording to the DPA 1998, the outcome would likely have been the same had Mr Lloyd commenced his action post-GDPR.

The Lloyd v Google judgment means that those seeking to bring class-action data protection infringement compensation cases have their work cut out. However, claims under Article 82 can still be brought on an individual basis; in fact, the judgment seems to indicate that individual cases can have good prospects of success. There is more to come in this area. TikTok is facing a similar case, brought by former Children’s Commissioner Anne Longfield, which alleges that the video-sharing app used children’s data without informed consent.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have one place left on our Advanced Certificate in GDPR Practice course starting in January.

To Share or Not to Share; That is the Question! 

On 5th October 2021, the Data Sharing Code of Practice from the Information Commissioner’s Office came into effect for UK-based Data Controllers.

The code is not law, nor does it ‘enforce’ data sharing, but it does provide some useful steps to consider when sharing personal data, either as a one-off or as part of an ongoing arrangement. Data Protection professionals, and the staff in the organisations they serve, will still need to navigate a way through various pressures, frameworks and expectations on the sharing of personal data; case by case, framework by framework. A more detailed post on the contents of the code can be read here.

Act Now Training is pleased to announce a new full day ‘hands on’ workshop for Data Protection professionals on Data Sharing. Our expert trainer, Scott Sammons, will look at the practical steps to take, sharing frameworks and protocols, risks to consider, etc. Scott will also explore how, as part of your wider IG framework, you can establish a proactive support framework; making it easier for staff to understand their data sharing obligations and expectations, and driving down the temptation to use a ‘Data Protection duck out’ as the excuse for why something was, or was not, shared inappropriately.

Delegates will also be encouraged to bring a data sharing scenario to discuss with fellow delegates and the tutor. This workshop can also be customised and delivered to your organisation at your premises or virtually. Get in touch to learn more.

Law Enforcement Processing and the Meaning of “authorised by law”

In October, there was a decision in the Scottish courts which will be of interest to data protection practitioners and lawyers when interpreting Part 3 of the Data Protection Act 2018 (law enforcement processing) and, more generally, the UK GDPR.

The General Teaching Council for Scotland v The Chief Constable of the Police Service of Scotland could fairly be described as a skirmish about expenses (known as costs in other parts of the UK) in seven Petitions to the Court of Session by the General Teaching Council for Scotland (“GTCS”) against the Chief Constable of the Police Service of Scotland (“Police Scotland”). The Petitions essentially sought disclosure of information held by Police Scotland which the GTCS had requested, but which Police Scotland had refused to provide.

This case will be of interest to data protection practitioners for two reasons: (1) there is some consideration by Lord Uist of what “authorised by law” means in the context of processing personal data under Part 3 DPA 2018 for purposes other than law enforcement purposes; and (2) it contains a salutary reminder that, while advice from the Information Commissioner’s Office (ICO) can be useful, it can also be wrong, as well as a reminder of the responsibilities of data controllers in relation to their own decisions.

The GTCS is the statutory body responsible for the regulation of the teaching profession in Scotland. It is responsible for assessing the fitness of people applying to be added to the register of teachers in Scotland, as well as the continuing fitness of those already on the register. In furtherance of these functions, the GTCS had requested information from Police Scotland to assist it in fulfilling these duties. The information held by Police Scotland was processed by them for the law enforcement purposes; it thus fell within Part 3 of the DPA 2018. When Police Scotland refused to provide the information, the GTCS petitioned the Court of Session for orders requiring its release. Police Scotland did not oppose the Petitions, but argued that it should not be found liable for the GTCS’s expenses in bringing them. This was on the basis that it had not opposed the Petitions and could not have given the GTCS the information without the court’s order.

The ICO advice to Police Scotland

Police Scotland refused to supply the information without a court order on the basis that doing so would involve processing the personal data for a purpose other than the law enforcement purposes without the disclosure being authorised by law, in contravention of the second data protection principle under section 36 of the DPA 2018, which states:

“(1) The second data protection principle is that – (a) the law enforcement purpose for which personal data is collected on any occasion must be specified, explicit and legitimate, and (b) personal data so collected must not be processed in a manner that is incompatible with the purpose for which it was collected. 

(2) Paragraph (b) of the second data protection principle is subject to subsections (3) and (4). 

(3) Personal data collected for a law enforcement purpose may be processed for any other law enforcement purpose (whether by the controller that collected the data or by another controller) provided that – 

(a) the controller is authorised by law to process that data for the other purpose, and
(b) the processing is necessary and proportionate to that other purpose. 

(4) Personal data collected for any of the law enforcement purposes may not be processed for a purpose that is not a law enforcement purpose unless the processing is authorised by law.” 

Police Scotland was relying upon advice from the ICO. That advice was that Police Scotland “would require either an order of the court or a specific statutory obligation to provide the information”, otherwise Police Scotland would be breaching the requirements of the DPA 2018. A longer form of the advice provided by the ICO to Police Scotland may be found at paragraph 10 of Lord Uist’s decision.

The ICO’s advice to Police Scotland was in conflict with what the ICO said in its code of practice issued under section 121 of the DPA 2018. There the ICO said that “authorised by law” could be “for example, statute, common law, royal prerogative or statutory code”. 

Authorised by Law

Lord Uist decided that the position adopted by Police Scotland, and the advice given to them by the ICO, was “plainly wrong”, concluding that the disclosure of the information requested by the GTCS would have been authorised by law without a court order.

The law recognises the need to balance the public interest in the free flow of information to the police for criminal proceedings, which requires that information given in confidence is not used for other purposes, against the public interest in protecting the public by disclosing confidential information to regulatory bodies charged with ensuring professionals within their scope of responsibility are fit to continue practising. In essence, when the police are dealing with requests for personal data processed for law enforcement purposes by regulatory bodies, they must have regard to the public interest in ensuring that these regulatory bodies, which exist to protect the public, are able to carry out their own statutory functions.

Perhaps more significantly, the law also recognises that a court order is not required for such disclosures to be made to regulatory bodies. This meant that there was, at common law, a lawful basis upon which Police Scotland could have released the information requested by the GTCS to them. Therefore, Police Scotland would not have been in breach of section 36(4) of the DPA 2018 had they provided the information without a court order.

In essence, a lack of a specific statutory power to require information to be provided to it, or a specific statutory requirement on the police to provide the information, does not mean a disclosure is not authorised by law. It is necessary, as the ICO’s code of practice recognises, to look beyond statute and consider whether there is a basis at common law. 

Police Scotland was required by Lord Uist to meet the expenses of the GTCS in bringing the Petitions. This was because the Petitions had been necessitated by Police Scotland requiring a court order when none was required. Lord Uist was clear that Police Scotland had to take responsibility for their own decision; it was not relevant to consider that they acted on erroneous advice from the ICO.

This case serves as a clear reminder that, while useful, advice from the ICO can be wrong. The same, of course, applies in respect of the guidance published by the ICO. It can be a good starting point, but it should never be the end point. When receiving advice from the ICO it is necessary to think about that advice critically, especially where, as here, the advice contradicts other guidance published by the ICO. It is necessary to consider why there is a discrepancy and which is correct: the advice or the guidance?
It may, of course, be the case that both are actually incorrect.

The finding of liability for expenses is also a reminder that controllers are ultimately responsible for the decisions that they take in relation to the processing of personal data.
It is not good enough to effectively outsource that decision-making and responsibility to the ICO. Taking tricky questions to the regulator does not absolve the controller from considering the question itself, both before and after seeking the advice of the ICO.

Finally, this case may also be a useful and helpful reference point when considering whether something is “authorised by law” for the purposes of processing under Part 3 of the DPA 2018. It is, however, a first instance decision (the Outer House of the Court of Session being broadly similar in status to the High Court in England and Wales) and that ought to be kept in mind when considering it.

Alistair Sloan is a Devil (pupil) at the Scottish Bar; prior to commencing devilling he was a solicitor in Scotland and advised controllers, data protection officers and data subjects on a range of information law matters.

We have just announced a new full day workshop on Part 3 of the DPA 2018. See also our Part 3 Policy Pack.

GDPR Fine for Charity Email Blunder

A Scottish charity has been issued with a £10,000 monetary penalty notice following the inadvertent disclosure of personal data by email. 

On 18th October, HIV Scotland was found to have breached the security provisions of the UK GDPR, namely Articles 5(1)(f) and 32, when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk. 

The Information Commissioner’s Office (ICO) is urging organisations to revisit their bulk email practices after its investigation found shortcomings in HIV Scotland’s email procedures. These included inadequate staff training, incorrect methods of sending bulk emails by blind carbon copy (bcc) and an inadequate data protection policy. It also found that despite HIV Scotland’s own recognition of the risks in its email distribution and the procurement of a system which enables bulk messages to be sent more securely, it was continuing to use the less secure bcc method seven months after the incident.
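The safer practice the ICO points to, sending bulk messages so that no recipient can ever see another recipient's address, can be sketched in a few lines of code. The sketch below (hypothetical sender and addresses; a real mailing should use a properly configured bulk-email system) builds one separate message per recipient instead of a single message with a long bcc list, so a single mis-click can never expose the whole distribution list:

```python
from email.message import EmailMessage

def build_individual_messages(sender, subject, body, recipients):
    """Build one standalone message per recipient.

    Unlike a single bcc'd email, where one addressing mistake can
    reveal every address (and, as in the HIV Scotland case, imply
    sensitive facts about the recipients), each message here carries
    only that recipient's own address.
    """
    messages = []
    for addr in recipients:
        msg = EmailMessage()
        msg["From"] = sender
        msg["To"] = addr  # only this recipient's own address appears
        msg["Subject"] = subject
        msg.set_content(body)
        messages.append(msg)
    return messages

# Each message would then be sent separately, e.g. with
# smtplib.SMTP.send_message, rather than in one combined send.
```

This is only an illustration of the design principle (per-recipient messages, no shared address list); it says nothing about the system HIV Scotland actually procured.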

On the point of training, HIV Scotland confirmed to the ICO that employees are expected to complete the “EU GDPR Awareness for All” course on an annual basis. The ICO recommended that staff should receive induction training “prior to accessing personal data and within one month of their start date.” Act Now’s e-learning course, GDPR Essentials, is designed to teach employees about the key provisions of GDPR and how to keep personal data safe. The course is interactive, with a quiz at the end, and can be completed in just over 30 minutes. Click here to watch a preview.

HIV Scotland was also criticised for not having a specific policy on the secure handling of personal data within the organisation. It relied on its privacy policy, which was a public facing statement covering points such as cookie use and data subject access rights; this provided no guidance to staff on the handling of personal data and what they must do to ensure that it is kept secure. The Commissioner expects an organisation handling personal data to maintain policies regarding, amongst other things, confidentiality (see our GDPR policy pack).

This is an interesting case and one which will not give reassurance to the Labour Relations Agency in Northern Ireland, which had to apologise last week for sharing the email addresses and, in some cases, the names of more than 200 service users. The agency deals confidentially with sensitive labour disputes between employees and employers. It said it had issued an apology to recipients and was currently taking advice from the ICO.

Interestingly, the ICO also referenced in its ruling the fact that HIV Scotland had made a point of commenting on a similar error by another organisation 8 months prior. In June 2019, NHS Highland disclosed the email addresses of 37 people who were HIV positive. It is understood the patients in the Highlands were able to see their own and other people’s addresses in an email from NHS Highland inviting them to a support group run by a sexual health clinic. At the time, HIV Scotland described the breach as “unacceptable”.

The HIV Scotland fine is the second one the ICO has issued to a charity in the space of 4 months. On 8th July 2021, the transgender charity Mermaids was fined £25,000 for failing to keep the personal data of its users secure. The ICO found that Mermaids failed to implement an appropriate level of security to its internal email systems, which resulted in documents or emails containing personal data being searchable and viewable online by third parties through internet search engine results.

Charities need to consider these ICO fines very carefully and ensure that they have policies, procedures and training in place to avoid enforcement action by the ICO.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in January.

Labour Relations Agency Data Breach: Ibrahim Hasan’s BBC Interview

The Labour Relations Agency in Northern Ireland has apologised for sharing the email addresses and, in some cases the names, of more than 200 service users.

https://www.bbc.co.uk/news/uk-northern-ireland-58988092

Here is Ibrahim Hasan’s interview with BBC Radio Ulster:

More media interviews by Ibrahim here.

Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics, including automated fingerprint identification systems, for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this way back in 2014. Now a company called CRB Cunninghams has introduced facial recognition technology to allow schools to offer children the ability to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said:

“It’s normalising biometric identity check for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools for the first time. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case schools must provide a reasonable alternative means of accessing the service i.e. paying for school meals in the present case. 

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data and there is a legal prohibition on anyone processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Express consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing.

In 2019, the Swedish Data Protection Authority fined an education authority SEK 200,000 (approximately €20,000) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that there was a breach of Article 5, by processing students’ personal data in a manner that was more intrusive as regards personal integrity, and which encompassed more personal data, than necessary for the specified purpose (monitoring of attendance); of Article 9; and of Articles 35 and 36, by failing to fulfil the requirements for an impact assessment and failing to carry out prior consultation with the Swedish DPA.

The French regulator (CNIL) has also raised concerns about a facial recognition trial commissioned by the Provence-Alpes-Côte d’Azur Regional Council, and which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” CNIL also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India, the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology, it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Footballers’ Personal Data: Ibrahim Hasan’s BBC Interview

On Tuesday there was an interesting story in the media about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. 

The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.

Do footballers have rights in relation to this data? Can they use the GDPR to seek compensation for the use of their data?

On Tuesday, Ibrahim Hasan gave an interview to BBC Radio 4’s PM programme about this story. You can listen below:

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Ring Doorbells, Domestic CCTV and GDPR

The Daily Mail reports today that, “A female doctor is set to be paid more than £100,000 after a judge ruled that her neighbour’s Ring smart doorbell cameras breached her privacy in a landmark legal battle which could pave the way for thousands of lawsuits over the Amazon-owned device.”

Dr Mary Fairhurst, the Claimant, alleged that she was forced to move out of her home because the internet-connected cameras are so “intrusive”. She also said that the Defendant, Mr Woodard, had harassed her by becoming “aggressive” when she complained to him.

A judge at Oxford County Court ruled yesterday that Jon Woodard’s use of his Ring cameras amounted to harassment, nuisance and a breach of data protection laws. The Daily Mail goes on to say:

“Yesterday’s ruling is thought to be the first of its kind in the UK and could set precedent for more than 100,000 owners of the Ring doorbell nationally.”

Before Ring doorbell owners rush out to dismantle their devices, let’s pause and reflect on this story. This was not about one person using a camera to watch their house or protect their motorbike. The Defendant had set up a network of cameras around his property which could also be used to watch his neighbour’s comings and goings. 

Careful reading of the judgement leads one to conclude that the legal action brought by the Claimant was really about the use of domestic cameras in such a way as to make a neighbour feel harassed and distressed. She was primarily arguing for protection and relief under the Protection from Harassment Act 1997 and the civil tort of nuisance. Despite the Daily Mail’s sensational headline, the judgement does not put domestic CCTV camera or Ring doorbell owners at risk of paying out thousands of pounds in compensation (as long as they don’t use the cameras to harass their neighbours!). However, it does require owners to think about the legal implications of their systems. Let’s examine the data protection angle.

Firstly, the UK GDPR can apply to domestic CCTV and door camera systems. After all, the owners of such systems are processing personal data (images and even voice recordings) about visitors to their property as well as passers-by and others caught in the systems’ peripheral vision.  However, on the face of it, a domestic system should be covered by Article 2(2)(a) of the UK GDPR which says the law does not apply to “processing of personal data by an individual in the course of purely personal or household activity.” Recital 18 explains further:

“This Regulation does not apply to the processing of personal data by a natural person in the course of a purely personal or household activity and thus with no connection to a professional or commercial activity. Personal or household activities could include correspondence and the holding of addresses, or social networking and online activity undertaken within the context of such activities.”

The judge in this case concluded that the camera system, set up by the Defendant, had collected data outside the boundaries of his property and, in the case of one specific camera, “it had a very wide field of view and captured the Claimant’s personal data as she drove in and out of the car park.” This would take the system outside of the personal and household exemption quoted above, as confirmed by the Information Commissioner’s CCTV guidance:

“If you set up your system so it captures only images within the boundary of your private domestic property (including your garden), then the data protection laws will not apply to you.

But what if your system captures images of people outside the boundary of your private domestic property – for example, in neighbours’ homes or gardens, shared spaces, or on a public footpath or a street?

Then the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA18) will apply to you, and you will need to ensure your use of CCTV complies with these laws.”

Once a residential camera system comes under the provisions of the UK GDPR then of course the owner has to comply with all the Data Protection Principles including the obligation to be transparent (through privacy notices) and to ensure that the data processing is adequate, relevant and not excessive. Data Subjects also have rights in relation to their data including to see a copy of it and ask for it to be deleted (subject to some exemptions).

Judge Clarke said the Defendant had “sought to actively mislead the Claimant about how and whether the cameras operated and what they captured.” This suggests a breach of the First Principle (lawfulness and transparency). There were also concerns about the amount of data some of the cameras captured (the data minimisation principle).

Let’s now turn to the level of compensation which could be awarded to the Claimant. Article 82 of the UK GDPR does contain a free-standing right for a Data Subject to sue for compensation where they have suffered material or non-material damage, including distress, as a result of a breach of the legislation. However, the figure mentioned by the Daily Mail headline of £100,000 seems far-fetched even for a breach of harassment and nuisance laws, let alone GDPR on its own. The court will have to consider evidence of the duration of the breach and the level of damage and distress caused to the Claimant.

This judgement does not mean that Ring door camera owners should rush out to dismantle them before passing dog walkers make compensation claims. It does, though, require owners to think carefully about the siting of cameras, the adequacy of notices and the impact of their system on their neighbours’ privacy.

The Daily Mail story follows yesterday’s BBC website feature about footballers attempting to use GDPR to control use of their performance data (see yesterday’s blog and Ibrahim Hasan’s BBC interview). Early Christmas gifts for data protection professionals to help them highlight the importance and topicality of what they do!

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Ronaldo’s Data and GDPR: Who said data protection is boring?

There is an interesting story this morning on the BBC website about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.  

Now 850 players (Ed – I don’t know if Ronaldo is one of them but I could not miss the chance to mention my favourite footballer!), led by former Cardiff City manager Russell Slade, want compensation for the trading of their performance data over the past six years by various companies. They also want an annual fee from the companies for any future use. The data ranges from average goals-per-game for an outfield player to height, weight and passes during a game. 

BBC News says that an initial 17 major betting, entertainment and data collection firms have been targeted, but Slade’s Global Sports Data and Technology Group has highlighted more than 150 targets it believes have “misused” data. His legal team claim that the fact players receive no payment for the unlicensed use of their data contravenes the General Data Protection Regulation (GDPR). However, the precise legal basis of their claim is unclear. 

In an interview with the BBC, Slade said:

“There are companies that are taking that data and processing that data without the individual consent of that player.”

This suggests a claim for breach of the First Data Protection Principle (Lawfulness and Transparency). However, if the players’ personal data is provided by their clubs (e.g. height, weight, performance at training sessions), then it may be that players have already consented (and been recompensed for this) as part of their player contract. In any event, Data Protection professionals will know that consent is only one way in which a Data Controller can justify the processing of personal data under Article 6 of GDPR. Article 6(1)(f) allows processing where it:

“is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data… .”

Of course, this requires a balancing exercise considering the interests pursued by the clubs and data companies and the impact on individual players’ privacy. Some would argue that as far as public domain information is concerned, the impact on players’ privacy is minimal. However, “the interests or fundamental rights and freedoms of the data subject” also include reputational damage, loss of control and financial loss, all of which it could be argued result from the alleged unauthorised use of data.

The BBC article quotes former Wales international Dave Edwards, one of the players behind the move:

“The more I’ve looked into it and you see how our data is used, the amount of channels it’s passed through, all the different organisations which use it, I feel as a player we should have a say on who is allowed to use it.”

The above seems to suggest that the players’ argument is also about control of their personal data. The GDPR does give players rights over their data which allow them to exercise some element of control including the right to see what data is held about them, to object to its processing and to ask for it to be deleted. It may be that players are exercising or attempting to exercise these rights in order to exert pressure on the companies to compensate them.

Without seeing the paperwork, including the letters before action which have been served on the companies, we can only speculate about the basis of the claim at this stage. Nonetheless, this is an interesting case and one to watch. If the claim is successful, the implications could be far-reaching, extending well beyond football. Whatever happens, it will get data protection talked about on the terraces!

Ibrahim Hasan, solicitor and director of Act Now Training, has given an interview to BBC Radio 4’s PM programme about this story. You can listen again here (from 39 minutes onwards).


International Transfers under the UK GDPR: What next?

In August, the Information Commissioner’s Office (ICO) launched a public consultation on its much-anticipated draft guidance for international transfers of personal data and associated transfer tools. The aim of the consultation is to explore how to address the realities of the UK’s post-Brexit data protection regime.

Chapter 5 of the UK GDPR mirrors the international transfer arrangements of the EU GDPR. There is a general prohibition on organisations transferring personal data to a country outside the UK, unless they ensure that data subjects’ rights are protected. This means that, if there is no adequacy decision in respect of the receiving country, one of the safeguards set out in Article 46 of the UK GDPR must be built into the arrangement. These include standard contractual clauses (SCCs) and binding corporate rules. The former need to be included in a contract between the parties (data exporter and importer) and impose certain data protection obligations on both.

The Current Transfer Regime

Until recently, many UK organisations were using the EU’s approved SCCs with a few ICO-suggested amendments to fit the UK context. This was despite the fact that they needed updating in the light of the binding judgment of the European Court of Justice (ECJ) in a case commonly known as “Schrems II”. 

In this case the ECJ concluded that organisations that transfer personal data to the USA can no longer rely on the Privacy Shield Framework. They must now consider using the Article 49 derogations or SCCs. If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporters to make a complex assessment about the recipient country’s data protection legislation, and to put in place “additional measures” to those included in the SCCs. 

In the light of the above, the new EU SCCs were published in June. The European Data Protection Board has also published its guidance on the aforementioned required assessment entitled “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data”.

The Proposed UK Transfer Regime

Following Brexit, the UK is no longer part of the EU. Consequently, the UK has to develop its own international data transfer regime, including SCCs. The ICO is consulting on new guidance as well as a series of proposed international data transfer materials including:

A Transfer Risk Assessment (TRA) – Equivalent to the European Transfer Impact Assessment, this is designed to assist organisations to conduct risk assessments of their international personal data transfers, following the requirements set out in Schrems II. The TRA is not mandatory, as organisations are free to use their own methods to assess risk, but it does indicate the ICO’s expectations.

An International Data Transfer Agreement (IDTA) – Equivalent to the European SCCs, this is a contract that organisations can use when transferring data to countries not covered by adequacy decisions.

The Addendum – This is designed to be used alongside the European Commission SCCs, to allow them to be used to safeguard a transfer under the UK GDPR, instead of the IDTA. It makes limited amendments to the EU SCCs to make them work in a UK context. 

The deadline for responses to the consultation is 5.00pm on Thursday 7th October 2021. The ICO will then review the responses before issuing the finalised materials (on a date yet to be announced). Whatever the result of the consultation, organisations need to consider now which of their international data transfers will be affected and what resources will be required to implement the new regime.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop and international transfers webinar.

Our next online GDPR Practitioner Certificate course starts in October. We also have a classroom course starting in November in Manchester.