The British Library Hack: A Chapter in Ransomware Resilience

In a stark reminder of the persistent threat of cybercrime, the British Library has confirmed a data breach incident that has led to the exposure of sensitive personal data, with materials purportedly up for auction online. An October intrusion by a notorious cybercrime group targeted the library, which is home to an extensive collection, including over 14 million books.

Recently, the ransomware group Rhysida claimed responsibility, publicly displaying snippets of sensitive data, and announcing the sale of this information for a significant sum of around £600k to be paid in cryptocurrency.

While the group boasts about the data’s exclusivity and sets a firm bidding deadline (today 27th November 2023), the library has only acknowledged a leak of what seems to be internal human resources documents. It has not verified the identity of the attackers nor the authenticity of the sale items. The cyber attack has significantly disrupted the library’s operations, leading to service interruptions expected to span several months.

In response, the library has strengthened its digital defenses, sought expert cybersecurity assistance, and urged its patrons to update their login credentials as a protective measure. The library is working closely with the National Cyber Security Centre and law enforcement to investigate, but details remain confidential due to the ongoing inquiry.

The consequences of the attack have necessitated a temporary shutdown of the library’s online presence. Physical locations, however, remain accessible. Updates can be found on the British Library’s X (formerly Twitter) feed. The risk posed by Rhysida has drawn attention from international agencies, with recent advisories from the FBI and US cybersecurity authorities. The group has been active globally, with attacks on various sectors and institutions.

The British Library’s leadership has expressed appreciation for the support and patience from its community as it navigates the aftermath of the cyber attack.

What is a Ransomware Attack?

A ransomware attack is a type of malicious cyber operation where hackers infiltrate a computer system to encrypt data, effectively locking out the rightful users. The attackers then demand payment, often in cryptocurrency, for the decryption key. These attacks can paralyse organisations, leading to significant data loss and disruption of operations.

Who is Rhysida?

The Rhysida ransomware group first came to prominence in May 2023, following the emergence of its victim support chat portal hosted on the Tor network. The group describes itself as a “cybersecurity team” that highlights security flaws by targeting victims’ systems and spotlighting the supposed ramifications of the security issues involved.

How to prevent a Ransomware Attack?

Hackers are becoming increasingly sophisticated in the ways they target our personal data, as recent banking scams have shown. However, there are measures we can implement personally and within our organisations to reduce the risk of a ransomware attack.

  1. Avoid Unverified Links: Refrain from clicking on links in spam emails or unfamiliar websites. Hackers frequently disseminate ransomware via such links, which, when clicked, can initiate the download of malware. This malware can then encrypt your data and hold it for ransom​​.

  2. Safeguard Personal Information: It’s crucial to never disclose personal information such as addresses, NI numbers, login details, or banking information online, especially in response to unsolicited communications​​.

  3. Educate Employees: Increasing awareness among employees can be a strong defence. Training should focus on identifying and handling suspicious emails, attachments, and links. Additionally, having a contingency plan in the event of a ransomware infection is important​​.

  4. Implement a Firewall: A robust firewall can act as a first line of defence, monitoring incoming and outgoing traffic for threats and signs of malicious activity. This should be complemented with proactive measures such as threat hunting and active tagging of workloads​​.

  5. Regular Backups: Maintain up-to-date backups of all critical data. In the event of a ransomware attack, having these backups means you can restore your systems to a previous, unencrypted state without having to consider ransom demands (see the sketch below for a simple way to monitor backup freshness).

  6. Create Inventories of Assets and Data: Having inventories of the data and assets you hold allows you to have an immediate knowledge of what has been compromised in the event of an attack whilst also allowing you to update security protocols for sensitive data over time.

  7. Multi-Factor Authentication: Identifying legitimate users in more than one way ensures that you are only granting access to those intended. 

These are some of the strategies organisations can use as part of a more comprehensive cybersecurity programme to significantly reduce the risk of falling victim to a ransomware attack.
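As a practical illustration of point 5, below is a minimal Python sketch of a backup-freshness check. The directory path, file pattern and 24-hour threshold are invented for the example rather than taken from any guidance; in practice a check like this would usually sit inside existing monitoring tooling.

```python
# Minimal sketch: flag stale backups so gaps are spotted before an incident.
# The path, file pattern and threshold are illustrative assumptions only.
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional

BACKUP_DIR = Path("/backups/nightly")   # hypothetical backup location
MAX_AGE = timedelta(hours=24)           # alert if the newest backup is older than this


def newest_backup_age(backup_dir: Path) -> Optional[timedelta]:
    """Return the age of the most recent backup file, or None if none exist."""
    files = [p for p in backup_dir.glob("*.bak") if p.is_file()]
    if not files:
        return None
    newest = max(files, key=lambda p: p.stat().st_mtime)
    return datetime.now() - datetime.fromtimestamp(newest.stat().st_mtime)


if __name__ == "__main__":
    age = newest_backup_age(BACKUP_DIR)
    if age is None:
        print("WARNING: no backup files found")
    elif age > MAX_AGE:
        print(f"WARNING: newest backup is {age} old - investigate before relying on it")
    else:
        print(f"OK: newest backup is {age} old")
```

A check like this complements, rather than replaces, periodic test restores; only a successful restore proves that a backup would actually get an organisation back up and running after a ransomware incident.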

Join us on our workshops “How to Increase Cyber Security in your Organisation” and “Cyber Security for DPOs”, where we discuss all of the above and more, helping you create the right foundations for cyber resilience within your organisation.

UK Biobank’s Data Sharing Raises Alarm Bells

An investigation by The Observer has uncovered that the UK Biobank, a repository of health data from half a million UK citizens, has been sharing information with insurance companies. This development contravenes the Biobank’s initial pledge to keep this sensitive data out of the hands of insurers, a promise that was instrumental in garnering public trust at the outset. UK Biobank has since responded to the article, calling it “disingenuous” and “extremely misleading”.

A Promise Made, Then Modified 

The UK Biobank was set up in 2006 as a goldmine for scientific discovery, offering researchers access to a treasure trove of biological samples and associated health data. With access priced between £3,000 and £9,000, the resource has supported research that has been nothing short of revolutionary. However, the foundations of this scientific jewel are now being questioned.

When the project was first announced, clear assurances were given that data would not be made available to insurance companies, mitigating fears that genetic predispositions could be used discriminatorily in insurance assessments. These assurances appeared in the Biobank’s FAQs and were echoed in parliamentary discussions. 

Changing Terms Amidst Grey Areas 

The Biobank contends that while it strictly regulates data access, allowing only verified researchers to delve into its database, those researchers can include commercial entities such as insurance firms if the research is deemed to be in the public interest. The boundaries of what constitutes “health-related” and “public interest” are now under scrutiny.

However, according to the Observer investigation, evidence suggests that this nuance (commercial entities conducting health-related research) was not clearly communicated to participants, especially given the categorical assurances given previously. UK Biobank categorically denies this and has shared its consent form and information leaflet in support of its position.

Data Sharing: The Ethical Quandary 

This breach of the original promise has raised the ire of experts in genetics and data privacy, with Prof Yves Moreau highlighting the severity of the breach of trust. The concern is not just about the sharing of data but about the integrity of consent given by participants. The Biobank’s response indicates that the commitments made were outdated and that the current policy, which includes sharing anonymised data for health-related research, was made clear to participants upon enrolment. 

The Ripple Effect of Biobank’s Data Policies 

Further complicating matters is the nature of the companies granted access. Among them are ReMark International, a global insurance consultancy, Lydia.ai, a Canadian “insurtech” firm that wants to give people “personalised and predictive health scores”, and Club Vita, a longevity data analytics company. These companies have utilised Biobank data for projects ranging from disease prediction algorithms to assessing longevity risk factors. The question this raises is how one can ensure that such work is in fact in the public interest. Do we simply take a commercial entity’s word for it? UK Biobank says all research conducted is “consistent with being health-related and in the public interest” and that it has an expert data access committee to decide on any complex issues, but who checks the ethics of the ethics committee? The problems with this kind of self-regulation are self-evident.

The Fallout and the Future 

This situation has led to a broader conversation about the ethical use of volunteered health data and the responsibility of custodians like the UK Biobank to uphold public trust. As technology evolves and the appetite for data grows across industries, the mechanisms of consent and transparency may need to be revisited.  The Information Commissioner’s Office is now considering the case, spotlighting the crucial need for clarity and accuracy in how organisations manage and utilise sensitive personal information. 

As the UK Biobank navigates these turbulent waters, the focus shifts to how institutions like it can maintain the delicate balance between facilitating scientific progress and safeguarding the privacy rights of individuals who contribute their personal data for the greater good. For the UK Biobank, regaining the trust of its participants and the public is now an urgent task, one that will require more than just a careful review of policies but a reaffirmation of its commitment to ethical stewardship of the data entrusted to it. 

Take a look at our highly popular Data Ethics Course. Places fill up fast, so if you would like to learn more about this fascinating area, book your place now.

Saudi Arabia’s First Ever DP Law Comes into Force 

Today (14th September 2023), Saudi Arabia’s first ever data protection law comes into force. Organisations doing business in the Middle East need to carefully consider the impact of the new law on their personal data processing activities. They have until 13th September 2024 to prepare and become fully compliant. 

Background 

The Personal Data Protection Law (PDPL) of Saudi Arabia was implemented by Royal Decree on 14th September 2021. It aims to regulate the collection, handling, disclosure and use of personal data. It will initially be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA), which has published implementing regulations. PDPL was originally due to come fully into force on 23rd March 2022. However, in November 2022, SDAIA published proposed amendments which were passed after public consultation.

Following a consultation period, we also now have the final versions of the Implementing Regulations and the Personal Data Transfer Regulations; both expand on the general principles and obligations outlined in the PDPL (as amended in March 2023) and introduce new compliance requirements for data controllers. 

More Information  

Summary of the new law: https://actnowtraining.blog/2022/01/10/the-new-saudi-arabian-federal-data-protection-law/  

Summary of the Regulations: https://actnowtraining.blog/2023/07/26/data-protection-law-in-saudi-arabia-implementing-regulation-published/  

Action Plan 

13th September 2024 is not far away. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so could lead to enforcement action and also reputational damage. The following should be part of an action plan for compliance: 
 

  1. Training the organisation’s management team to understand the importance of PDPL, the main provisions and changes required to systems and processes.
  2. Training staff at all levels to understand PDPL and how it will impact their role.
  3. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed.
  4. Reviewing how records management and information risk is addressed within the organisation.
  5. Drafting Privacy Notices to ensure they set out the minimum information that should be included.
  6. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification.
  7. Drafting policies and procedures to deal with Data Subjects’ rights, particularly requests for subject access, rectification and erasure.
  8. Appointing and training a Data Protection Officer.
     

Act Now in Saudi Arabia 

Act Now Training can help your business prepare for the new law. We have delivered training extensively in the Middle East to a wide range of delegates, including representatives of the telecommunications, legal and technology sectors. We have experience in helping organisations in territories where a new law of this type has been implemented.

Now is the time to train your staff in the new law. Through our KSA privacy programme, we offer comprehensive and cost-effective training, from one-hour awareness-raising webinars to full-day workshops and DPO certificate courses.

To help deliver this and other courses, Suzanne Ballabás, an experienced Middle East-based data protection specialist, recently joined our team of associates. We can deliver online or face-to-face training. All of our training starts with a FREE analysis call to ensure you have the right level and most appropriate content for your organisation’s needs. Please get in touch to discuss your training or consultancy needs.

Click on the Link Below to see our full Saudi Privacy Programme.

International Transfers Breach Results in Record GDPR Fine for Meta

The transfer of personal data between the EU and US is an ongoing legal and political saga. The latest development is yesterday’s largest ever GDPR fine of €1.2bn (£1bn) issued by Ireland’s Data Protection Commission (DPC) to Facebook’s owner, Meta Ireland. The DPC ruled that Meta infringed Article 46 of the EU GDPR in the way it transferred personal data of its users from Europe to the US.

The Law 

Chapter 5 of the EU GDPR mirrors the international transfer arrangements of the UK GDPR. There is a general prohibition on organisations transferring personal data to a country outside the EU, unless they ensure that data subjects’ rights are protected. This means that, if there is no adequacy decision in respect of the receiving country, one of the safeguards set out in Article 46 must be built into the arrangement. These include standard contractual clauses (SCCs) and binding corporate rules.
The former need to be included in a contract between the parties (data exporter and importer) and impose certain data protection obligations on both. 

The Problem with US Transfers 

In 2020, in a case commonly known as “Schrems II”, the European Court of Justice (ECJ) concluded that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal mechanism to ensure GDPR compliance. They must consider using the Article 49 derogations or SCCs. If using the latter, whether for transfers to the US or other countries, the ECJ placed the onus on the data exporters to make a complex assessment of the recipient country’s data protection and surveillance legislation, and to put in place “additional supplementary measures” to those included in the SCCs. The problem with the US is that it has far-reaching surveillance laws which give law enforcement agencies access to personal data without adequate safeguards (according to the ECJ in Schrems). Therefore any additional measures must address this possibility and build in safeguards to protect data subjects.

In the light of the above, the new EU SCCs were published in June 2021.
The European Data Protection Board has also published its guidance on the aforementioned required assessment, entitled “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data”. Meta’s use of the new EU SCCs and its “additional supplementary measures” were the focus of the DPC’s attention when issuing its decision.

The Decision 

The DPC ruled that Meta infringed Article 46(1) of GDPR when it continued to transfer personal data from the EU/EEA to the US following the ECJ’s ruling in Schrems II. It found that the measures used by Meta did not address the risks to the fundamental rights and freedoms of data subjects that were identified in Schrems; namely the risk of access to the data by US law enforcement.  

The DPC ruled that Meta should: 

  1. Suspend any future transfer of personal data to the US within five months of the date of the DPC’s decision;
  2. Pay an administrative fine of €1.2 billion; and,
  3. Bring its processing operations in line with the requirements of GDPR, within five months of the date of the DPC’s decision, by ceasing the unlawful processing, including storage, in the US of personal data of EEA users transferred in violation of GDPR.

Meta has said that it will appeal the decision and seek a stay of the ruling before the Irish courts. Its President of Global Affairs, Sir Nick Clegg, said:

“We are therefore disappointed to have been singled out when using the same legal mechanism as thousands of other companies looking to provide services in Europe. 

“This decision is flawed, unjustified and sets a dangerous precedent for the countless other companies transferring data between the EU and US.” 

The Future of US Transfers 

The Information Commissioner’s Office told the BBC that the decision “does not apply in the UK” but said it had “noted the decision and will review the details in due course”. The wider legal ramifications on data transfers from the UK to the US can’t be ignored. 

Personal data transfers are also a live issue for most UK Data Controllers, including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022, but it still requires a Transfer Risk Assessment as well as supplementary measures where privacy risks are identified.

On 25th March 2022, the European Commission and the United States announced that they had agreed in principle on a new Trans-Atlantic Data Privacy Framework. The final agreement is expected to be in place in the summer of 2023 and will replace the Privacy Shield Framework. It is expected that the UK Government will strike a similar deal once the EU/US one is finalised. However, both are likely to be challenged in the courts.

The Meta fine is one of this year’s major GDPR developments, nicely timed within a few days of the fifth anniversary of GDPR. All organisations, whether in the UK or EU, need to carefully consider their data transfer mechanisms and ensure that they comply with Chapter 5 of GDPR in the light of the DPC’s ruling. A “wait and see” approach is no longer an option.

The Meta fine will be discussed in detail on our forthcoming International Transfers workshop. For those who want a one-hour summary of the UK international transfer regime, we recommend our webinar.

US Data Transfers and Privacy Shield 2.0 

On 14th December 2022, the European Commission published a draft ‘adequacy decision’, under Article 45 of the GDPR, endorsing a new legal framework for transferring personal data from the EU to the USA. Subject to approval by other EU institutions, the decision paves the way for “Privacy Shield 2.0” to be in effect by Spring 2023.

The Background

In July 2020, the European Court of Justice (ECJ) in “Schrems II” ruled that organisations that transfer personal data to the USA can no longer rely on the Privacy Shield Framework as a legal transfer tool, as it failed to protect the rights of EU data subjects when their data was accessed by U.S. public authorities. In particular, the ECJ found that US surveillance programmes are not limited to what is strictly necessary and proportionate as required by EU law and hence do not meet the requirements of Article 52 of the EU Charter of Fundamental Rights. Secondly, with regard to U.S. surveillance, EU data subjects lack actionable judicial redress and, therefore, do not have a right to an effective remedy in the USA, as required by Article 47 of the EU Charter.

The ECJ stated that organisations transferring personal data to the USA can still use the Article 49 GDPR derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporter to make a complex assessment  about the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs. 

Despite the Schrems II judgment, many organisations have continued to transfer personal data to the USA hoping that regulators will wait for a new Transatlantic data deal before enforcing the judgement.  Whilst the UK Information Commissioner’s Office (ICO) seems to have adopted a “wait and see” approach, other regulators have now started to take action. In February 2022, the French Data Protection Regulator, CNIL, ruled that the use of Google Analytics was a breach of GDPR due to the data being transferred to the USA without appropriate safeguards. This followed a similar decision by the Austrian Data Protection Authority in January. 

The Road to Adequacy

Since the Schrems ruling, replacing the Privacy Shield has been a priority for EU and US officials. In March 2022, it was announced that a new Trans-Atlantic Data Privacy Framework had been agreed in principle. In October, the US President signed an executive order giving effect to the US commitments in the framework. These include commitments to limit US authorities’ access to data exported from the EU to what is necessary and proportionate under surveillance legislation, to provide data subjects with rights of redress relating to how their data is handled under the framework regardless of their nationality, and to establish a Data Protection Review Court for determining the outcome of complaints.

Schrems III?

The privacy campaign group, noyb, of which Max Schrems is Honorary Chairman, is not impressed by the draft adequacy decision. It said in a statement:

“…the changes in US law seem rather minimal. Certain amendments, such as the introduction of the proportionality principle or the establishment of a Court, sound promising – but on closer examination, it becomes obvious that the Executive Order oversells and underperforms when it comes to the protection of non-US persons. It seems obvious that any EU “adequacy decision” that is based on Executive Order 14086 will likely not satisfy the CJEU. This would mean that the third deal between the US Government and the European Commission may fail.”

Max Schrems said: 

“…As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again – in flagrant breach of our fundamental rights.”

The draft adequacy decision will now be reviewed by the European Data Protection Board (EDPB) and the European Member States. From the above statements it seems that if Privacy Shield 2.0 is finalised, a legal challenge against it is inevitable.

UK to US Data Transfers 

Personal data transfers are also a live issue for most UK Data Controllers, including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. At present, the use of such services usually involves a complicated Transfer Risk Assessment (TRA) and execution of standard contractual clauses. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022 but it still requires a TRA as well as supplementary measures where privacy risks are identified.

Good news may be round the corner for UK data exporters. The UK Government is also in the process of making an adequacy decision for the US. We suspect it will strike a similar deal once the EU/US one is finalised.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Our next online GDPR Practitioner Certificate course, starting on 10th January, is fully booked. We have places on the course starting on 19th January. 

We Are Hiring!

Are you a surveillance law expert with a proven track record of delivering practical and engaging training on Part 2 of the Regulation of Investigatory Powers Act 2000 (RIPA) and/or its Scottish equivalent (RIPSA)?

Due to increased demand, Act Now Training is recruiting trainers to join its team of experts who deliver in-house and external surveillance training courses throughout the UK and online. These range from one-hour webinars to full-day courses and aim to help local authority staff practically apply the legislation and prepare for Commissioner inspections.

With more courses planned for 2022, including some new ones, we need trainers who enjoy the challenge of explaining difficult concepts in a practical jargon-free way.

We have opportunities for full-time trainers as well as those who wish to add an extra “string to their bow” without leaving their day job. You do not have to be a lawyer; indeed, our current team includes an ex-police officer and a data protection officer. What is important is that you have practical experience of working with surveillance legislation, have enthusiasm for the subject and want to deliver innovative training (not “death by PowerPoint”) to a range of audiences.

If you think you have what it takes to become an Act Now trainer, please get in touch with your CV explaining your RIPA/RIPSA knowledge and experience. A full privacy policy can be read on our website.

The New Saudi Arabian Federal Data Protection Law 

The Middle East is fast catching up with Europe when it comes to data protection law. The Kingdom of Saudi Arabia (KSA) has enacted its first comprehensive national data protection law to regulate the processing of personal data. This is an important development alongside the passing of the new UAE Federal DP law. It also opens up opportunities for UK and EU data protection professionals, especially as these new laws are closely aligned with the EU General Data Protection Regulation (GDPR) and the UK GDPR.

The KSA Personal Data Protection Law (PDPL) was passed by Royal Decree M/19 of 9/2/1443H on 16 September 2021, approving Resolution No. 98 dated 7/2/1443H (14 September 2021). The detailed Executive Regulations are expected to be published soon and will give more details about the new law. The law will be effective from 23rd March 2022, following which there will be a one-year implementation period.

Enforcement 

PDPL will initially be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA). The Executive Regulations will set out the administrative penalties that can be imposed on organisations for breaches. Expect large fines for non-compliance alongside other sanctions. PDPL could mirror the GDPR, which allows the regulator to impose a fine of up to 20 million Euros or 4% of gross annual turnover, whichever is higher. PDPL also contains criminal offences which carry a term of imprisonment of up to 2 years and/or a fine of up to 3 million Saudi Riyals (approximately £566,000). Affected parties may also be able to claim compensation.

Territorial Scope

PDPL applies to all organisations that are processing personal data in the KSA irrespective of whether the data relates to Data Subjects living in the KSA. It also has an “extra-territorial” reach by applying to organisations based abroad who are processing personal data of Data Subjects resident in the KSA. Interestingly, unlike the UAE Federal DP law, PDPL does not exempt government authorities from its application although there are various exemptions from certain obligations where the data processing relates to national security, crime detection, statutory purposes etc.

Notable Provisions

PDPL mirrors GDPR’s underlying principles of transparency and accountability and empowers Data Subjects by giving them rights in relation to their personal data. We set out below the notable provisions including links to previous GDPR blog posts for readers wanting more detail, although more information about the finer points of the new law will be included in the forthcoming Executive Regulations. 

  • Personal Data – PDPL applies to the processing of personal data, which is defined very broadly to include any data which identifies a living individual. However, unlike GDPR, Article 2 of PDPL includes within its scope the data of a deceased person if it identifies them or a family member.
  • Registration – Article 23 requires Data Controllers (organisations that collect personal data and determine the purpose for which it is used and the method of processing) to register on an electronic portal that will form a national record of controllers.
  • Lawful Bases – Like the UAE Federal DP law, PDPL makes consent the primary legal basis for processing personal data. There are exceptions including, amongst others, if the processing achieves a “definite interest” of the Data Subject and it is impossible or difficult to contact the Data Subject.
  • Rights – Data Subjects are granted various rights in Articles 4, 5 and 7 of the PDPL which will be familiar to GDPR practitioners. These include the right to information (similar to Art 13 of GDPR), rectification, erasure and Subject Access. All these rights are subject to similar exemptions found in Article 23 of GDPR.
  • Impact Assessments – Article 22 requires (what GDPR Practitioners call) “DPIAs” to be undertaken in relation to any new high risk data processing operations. This will involve assessing the impact of the processing on the risks to the rights of Data Subjects, especially their privacy and confidentiality.
  • Breach Notification – Article 20 requires organisations to notify the regulator, as well as Data Subjects, if they suffer a personal data breach which compromises Data Subjects’ confidentiality, security or privacy. The timeframe for notifying will be set by the Executive Regulations.
  • Records Management – Organisations will have to demonstrate compliance with PDPL by keeping records. There is a specific requirement in Article 3 to keep records similar to a Record of Processing Activities (ROPA) under GDPR.
  • International Transfers – Like other data protection regimes, PDPL imposes limitations on the international transfer of personal data outside of the KSA. There are exceptions; further details will be set out in the Executive Regulations.
  • Data Protection Officers – Organisations (both controllers and processors) will need to appoint at least one officer to be responsible for compliance with PDPL. The DPO can be an employee or an independent service provider and does not need to be located in the KSA. 
  • Training – After 23 March 2022, Data Controllers will be required to hold seminars for their employees to familiarise them with the new law.

Practical Steps

Organisations operating in the KSA, as well as those who are processing the personal data of KSA residents, need to assess the impact of PDPL on their data processing activities. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so could lead not only to enforcement action but also reputational damage. The following should be part of an action plan for compliance:

  1. Training the organisation’s management team to understand the importance of PDPL, the main provisions and changes required to systems and processes. 
  2. Training staff at all levels to understand PDPL and how it will impact on their role.
  3. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed.
  4. Reviewing how records management and information risk  is addressed within the organisation.
  5. Drafting Privacy Notices to ensure they set out the minimum information that should be included.
  6. Reviewing information security policies and procedures in the light of the new more stringent security obligations particularly breach notification.
  7. Drafting policies and procedures to deal with Data Subjects’ rights, particularly requests for subject access, rectification and erasure.
  8. Appointing and training a Data Protection Officer.

Act Now Training can help your organisation prepare for PDPL. We are running a webinar on this topic soon and can also deliver more detailed in-house training. Please get in touch to discuss your training needs. We are in Dubai and Abu Dhabi from 16th to 21st January 2022 and would be happy to arrange a meeting.

Ronaldo’s Data and GDPR: Who said data protection is boring?

There is an interesting story this morning on the BBC website about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.  

Now 850 players (Ed – I don’t know if Ronaldo is one of them but I could not miss the chance to mention my favourite footballer!), led by former Cardiff City manager Russell Slade, want compensation for the trading of their performance data over the past six years by various companies. They also want an annual fee from the companies for any future use. The data ranges from average goals-per-game for an outfield player to height, weight and passes during a game. 

BBC News says that an initial 17 major betting, entertainment and data collection firms have been targeted, but Slade’s Global Sports Data and Technology Group has highlighted more than 150 targets it believes have “misused” data. His legal team claim that the fact players receive no payment for the unlicensed use of their data contravenes the General Data Protection Regulation (GDPR). However, the precise legal basis of their claim is unclear. 

In an interview with the BBC, Slade said:

“There are companies that are taking that data and processing that data without the individual consent of that player.”

This suggests a claim for breach of the First Data Protection Principle (Lawfulness and Transparency). However, if the players’ personal data is provided by their clubs e.g., height, weight, performance at training sessions etc. then it may be that players have already consented (and been recompensed for this) as part of their player contract. In any event, Data Protection professionals will know that consent is only one way in which a Data Controller can justify the processing of personal data under Article 6 of GDPR. Article 6(1)(f) allows processing where it:

“is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data… .”

Of course, this requires a balancing exercise considering the interests pursued by the clubs and data companies and the impact on individual players’ privacy. Some would argue that as far as public domain information is concerned, the impact on players’ privacy is minimal. However, “the interests or fundamental rights and freedoms of the data subject” also include reputational damage, loss of control and financial loss, all of which, it could be argued, result from the alleged unauthorised use of data.

The BBC article quotes former Wales international Dave Edwards, one of the players behind the move:

“The more I’ve looked into it and you see how our data is used, the amount of channels its passed through, all the different organisations which use it, I feel as a player we should have a say on who is allowed to use it.”

The above seems to suggest that the players’ argument is also about control of their personal data. The GDPR does give players rights over their data which allow them to exercise some element of control including the right to see what data is held about them, to object to its processing and to ask for it to be deleted. It may be that players are exercising or attempting to exercise these rights in order to exert pressure on the companies to compensate them.

Without seeing the paperwork, including the letters before action which have been served on the companies, we can only speculate about the basis of the claim at this stage. Nonetheless, this is an interesting case and one to watch. If the claim is successful, the implications could have far-reaching effects beyond football. Whatever happens, it will get data protection talked about on the terraces!

Ibrahim Hasan, solicitor and director of Act Now Training, has given an interview to BBC Radio 4’s PM programme about this story. You can listen again here (from 39 minutes onwards).

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Covid-19, GDPR and Temperature Checks


Emma Garland writes…

Many countries have now been in some form of lockdown for a considerable length of time. As some of the lockdown measures are slowly being eased, one of the possible solutions to prevent a “second wave” is the implementation of temperature checks in shops and workplaces. This involves placing a thermometer on an individual’s forehead. Of course if the temperature is recorded or there is another way the individual can be identified, it will involve processing health data. Care must be taken to consider the GDPR and privacy implications.

Apple reopened stores across Germany on 11th May with extra safety procedures, including temperature checks and social distancing. It is now facing a probe by a regional German data protection regulator into whether its plan to take the temperature of its store customers violates GDPR.

The benefits of temperature checks are self-evident. By detecting members of the public or staff who have a high temperature, and not permitting them to enter the store or workplace, staff have less risk of close contact with people who may have COVID 19. However, temperature checks are just one small part of stopping the spread of COVID 19 and can be intrusive. What is the lawful basis for processing such data? Art 6(1)(d) of GDPR allows processing where it:

“…is necessary in order to protect the vital interests of the data subject or of another natural person”

Of course “data concerning health” is also Special Category Data and requires an Article 9 condition to ensure it is lawful. Is a temperature check necessary to comply with employment obligations, for medical diagnosis or for reasons of public health?

All conditions under Articles 6 and 9 must satisfy the test of necessity. There are many causes of a high temperature, not just COVID 19. There have also been doubts over the accuracy of temperature readings: they take skin temperature, which can vary from core temperature, and do not account for the incubation phase of the disease, where people may be asymptomatic.

ICO Guidance

The Information Commissioner’s Office (ICO) has produced guidance on workplace testing which states:

“Data protection law does not prevent you from taking the necessary steps to keep your staff and the public safe and supported during the present public health emergency.
But it does require you to be responsible with people’s personal data and ensure it is handled with care.”

The ICO suggests that “legitimate interests” or “public task” could be used to justify the processing of personal data as part of a workplace testing regime. The former will require a Legitimate Interests Assessment, where the benefit of the data to the organisation is balanced against the risks to the individual. In terms of Article 9, the ICO suggests the employment condition, supplemented by Schedule 1 of the Data Protection Act 2018. The logic here is that employment responsibilities extend to compliance with a wide range of legislation, including health and safety.

More generally, the ICO says that technology which could be considered privacy intrusive should have a high justification for usage. It should be part of a well thought out plan, which ensures that it is an appropriate means to achieve a justifiable end. Alternatives should also have been fully evaluated. The ICO also states:

“If your organisation is going to undertake testing and process health information, then you should conduct a DPIA focussing on the new areas of risk.”

A Data Protection Impact Assessment should map the flow of the data, including collection, usage, retention and deletion, as well as the associated risks to individuals. Some companies are even using thermal cameras as part of COVID 19 testing. The Surveillance Camera Commissioner (SCC) and the ICO have worked together to update the SCC DPIA template, which is specific to surveillance systems.
As shops begin to open and the world establishes post COVID 19 practices, many employers and retailers will be trying to find their “new normal”. People will also have to decide what they are comfortable with. Temperature checks should be part of a considered approach, evaluating all the regulatory and privacy risks.

Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.

The NHS COVID 19 Contact Tracing App: Part 4 Questions about Data Retention and Function Creep


The first three blog posts in this series have raised many issues about the proposed NHS COVID19 Contact Tracing App (COVID App) including the incomplete DPIA and lack of human rights compliance. In this final post we discuss concerns about how long the data collected by the app will be held and what it will be used for.

From the DPIA and NHSX communications it appears that the purpose of the COVID App is not just to be part of a contact tracing alert system. The app’s Privacy Notice states:

“The information you provide, (and which will not identify you), may also be used for different purposes that are not directly related to your health and care. These include:

  • Research into coronavirus 
  • Planning of services/actions in response to coronavirus
  • Monitoring the progress and development of coronavirus

Any information provided by you and collected about you will not be used for any purpose that is not highlighted above.”

“Research”

Article 89 of the GDPR allows Data Controllers to process personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, subject to appropriate safeguards set out in Section 19 of the Data Protection Act 2018.

NHSX has said that one of the “appropriate safeguards” to be put in place is anonymisation or de-identification of the users’ data, but only if research purposes can be achieved without the use of personal data. However, even anonymised data can be pieced back together to identify individuals, especially where other datasets are matched.

The Open Rights Group says:

“Claims such as ‘The App is designed to preserve the anonymity of those who use it’ are inherently misleading, yet the term has been heavily relied upon by the authors of the DPIA. On top of that, many statements leave ambiguities…”
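To make the re-identification concern above concrete, here is a toy Python sketch using entirely invented records (the names, postcode areas and results are made up and have nothing to do with the NHSX app or any real dataset). It shows how a dataset containing no names can still be re-identified by joining it to a separate dataset on shared quasi-identifiers.

```python
# Toy sketch: linking two datasets on quasi-identifiers can re-identify
# records that contain no direct identifiers. All data below is invented.

# "Anonymised" research extract: no names, but quasi-identifiers remain.
research_extract = [
    {"postcode_area": "LS1", "birth_year": 1978, "sex": "F", "test_result": "positive"},
    {"postcode_area": "M4",  "birth_year": 1990, "sex": "M", "test_result": "negative"},
]

# A separately available dataset that does contain names (e.g. a public register).
public_register = [
    {"name": "A. Example", "postcode_area": "LS1", "birth_year": 1978, "sex": "F"},
    {"name": "B. Example", "postcode_area": "M4",  "birth_year": 1990, "sex": "M"},
]

QUASI_IDS = ("postcode_area", "birth_year", "sex")


def key(record: dict) -> tuple:
    """The combination of quasi-identifiers used to link the two datasets."""
    return tuple(record[k] for k in QUASI_IDS)


# Index the register by quasi-identifier combination, then look up each
# "anonymous" row; a unique match re-identifies the individual.
register_index: dict = {}
for person in public_register:
    register_index.setdefault(key(person), []).append(person["name"])

for row in research_extract:
    matches = register_index.get(key(row), [])
    if len(matches) == 1:
        print(f"{matches[0]} -> {row['test_result']}")
```

This linkage risk is essentially why de-identified data is generally still treated as personal data unless re-identification is no longer reasonably likely, and why the “appropriate safeguards” promised for the app need to go well beyond simply stripping out names.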

There are also legitimate concerns about “function creep”. What exactly does “research into coronavirus” mean? Matthew Gould, the chief executive of NHSX, told MPs the app will evolve over time:

“We need to level with the public about the fact that when we launch it, it will not be perfect and that, as our understanding of the virus develops, so will the app. We will add features and develop the way it works.”

Whilst speaking to the Science and Technology Committee, Gould stated that “We’ve been clear the data will only ever be used for the NHS.” This does not rule out the possibility of private companies getting this data as NHS Data Processors.

Data Retention

Privacy campaigners are also concerned about the length of time the personal data collected by the app will be held, both for contacts and for people who have coronavirus. The DPIA and Privacy Notice do not specify a data retention period:

“In accordance with the law, personal data will not be kept for longer than is necessary. The exact retention period for data that may be processed relating to COVID-19 for public health reasons has yet to be set (owing to the uncertain nature of COVID-19 and the impact that it may have on the public).

In light of this, we will ensure that the necessity to retain the data will be routinely reviewed by an independent authority (at least every 6 months).”

So, at the time of writing, COVID App users have no idea how long their data will be kept for, nor exactly what for, nor which authority will review it “every six months.” Interestingly the information collected by the wider NHS Test and Trace programme is going to be kept by Public Health England for 20 years. Who is to say this will not be the case for COVID App users’ data?

Interestingly, none of the 15 risks listed in the original DPIA relating to the COVID App trial (see the second blog in this series) include keeping data for longer than necessary or the lawful basis for retaining it past the pandemic, or what it could be used for in future if more personal data is collected in updated versions of the app. As discussed in the third blog in this series, the Joint Human Rights Committee drafted a Bill which required defined purposes and deletion of all of the data at end of the pandemic. The Secretary of State for Health and Social Care, Matt Hancock, quickly rejected this Bill.

The woolly phrase “personal data will not be kept for longer than is necessary” and the fact NHSX admit that the COVID App will evolve in future and may collect more data, gives the Government wriggle room to retain the COVID App users’ data indefinitely and use it for other purposes. Could it be used as part of a government surveillance programme? Both India and China have made downloading their contact tracing app a legal requirement raising concerns of high tech social control.

To use the App or not?

Would we download the COVID App in its current form? All four blogs in this series show that we are not convinced that it is privacy or data protection compliant. Furthermore, there are worries about the wider NHS coronavirus test-and-trace programme. The speed at which it has been set up, concerns raised by people working in it and the fact that no DPIA has been done further undermine confidence in the whole set up. Yesterday we learnt that the Open Rights Group is to challenge the government over the amount of data collected and retained by the programme.

Having said all that, we leave it up to readers to decide whether to use the app. Some privacy experts have been more forthcoming with their views. Phil Booth of @medConfidential calls the Test and Trace programme a “mass data grab” and Paul Bernal, Associate Professor in Law at the University of East Anglia, writes that the Government’s approach – based on secrecy, exceptionalism and deception – means our civic duty may well be to resist the programme actively. Finally, if you need a third opinion, Jennifer Arcuri, CEO of Hacker House, has said she would not download the app because “there is no guarantee it’s 100 percent secure or the data is going to be kept secure.” Over to you, dear readers!

Will you be downloading the app? Let us know in the comments section below.

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.

