Saudi Arabia’s First Ever DP Law Comes into Force 

Today (14th September 2023), Saudi Arabia’s first ever data protection law comes into force. Organisations doing business in the Middle East need to carefully consider the impact of the new law on their personal data processing activities. They have until 13th September 2024 to prepare and become fully compliant. 

Background 

The Personal Data Protection Law (PDPL) of Saudi Arabia was implemented by Royal Decree on 14th September 2021. It aims to regulate the collection, handling, disclosure and use of personal data. It will initially be enforced by the Saudi Arabian Authority for Data and Artificial Intelligence (SDAIA), which has published implementing regulations. PDPL was originally due to come fully into force on 23rd March 2022; however, in November 2022, SDAIA published proposed amendments which were passed after public consultation. 

Following a consultation period, we also now have the final versions of the Implementing Regulations and the Personal Data Transfer Regulations; both expand on the general principles and obligations outlined in the PDPL (as amended in March 2023) and introduce new compliance requirements for data controllers. 

More Information  

Summary of the new law: https://actnowtraining.blog/2022/01/10/the-new-saudi-arabian-federal-data-protection-law/  

Summary of the Regulations: https://actnowtraining.blog/2023/07/26/data-protection-law-in-saudi-arabia-implementing-regulation-published/  

Action Plan 

13th September 2024 is not far away. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so could lead to enforcement action as well as reputational damage. The following should be part of an action plan for compliance: 
 

  1. Training the organisation’s management team to understand the importance of PDPL, the main provisions and changes required to systems and processes. 
  2. Training staff at all levels to understand PDPL and how it will impact their role. 
  3. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed. 
  4. Reviewing how records management and information risk is addressed within the organisation. 
  5. Drafting Privacy Notices to ensure they set out the minimum information that should be included. 
  6. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification. 
  7. Drafting policies and procedures to deal with Data Subjects’ rights, particularly requests for subject access, rectification and erasure. 
  8. Appointing and training a Data Protection Officer. 

Act Now in Saudi Arabia 

Act Now Training can help your business prepare for the new law.
We have delivered training extensively in the Middle East to a wide range of delegates including representatives of the telecommunications, legal and technology sectors. We have experience in helping organisations in territories where a new law of this type has been implemented.  

Now is the time to train your staff in the new law. Through our KSA privacy programme, we offer comprehensive and cost-effective training, from one-hour awareness-raising webinars to full day workshops and DPO certificate courses.  

To help deliver this and other courses, Suzanne Ballabás, an experienced Middle East-based data protection specialist, recently joined our team of associates. We can deliver online or face-to-face training. All of our training starts with a FREE analysis call to ensure you have the right level and most appropriate content for your organisation’s needs. Please get in touch to discuss your training or consultancy needs. 

Click on the link below to see our full Saudi Privacy Programme.

International Transfers Breach Results in Record GDPR Fine for Meta

Personal data transfers between the EU and US are an ongoing legal and political saga. The latest development is yesterday’s largest ever GDPR fine of €1.2bn (£1bn) issued by Ireland’s Data Protection Commission (DPC) to Facebook’s owner, Meta Ireland. The DPC ruled that Meta infringed Article 46 of the EU GDPR in the way it transferred personal data of its users from Europe to the US. 

The Law 

Chapter 5 of the EU GDPR mirrors the international transfer arrangements of the UK GDPR. There is a general prohibition on organisations transferring personal data to a country outside the EU, unless they ensure that data subjects’ rights are protected. This means that, if there is no adequacy decision in respect of the receiving country, one of the safeguards set out in Article 46 must be built into the arrangement. These include standard contractual clauses (SCCs) and binding corporate rules.
The former need to be included in a contract between the parties (data exporter and importer) and impose certain data protection obligations on both. 

The Problem with US Transfers 

In 2020, in a case commonly known as “Schrems II”, the European Court of Justice (ECJ) concluded that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal mechanism to ensure GDPR compliance. They must consider using the Article 49 derogations or SCCs. If using the latter, whether for transfers to the US or other countries, the ECJ placed the onus on the data exporters to make a complex assessment about the recipient country’s data protection and surveillance legislation, and to put in place “additional supplementary measures” to those included in the SCCs. The problem with the US is that it has stringent surveillance laws which give law enforcement agencies access to personal data without adequate safeguards (according to the ECJ in Schrems). Therefore any additional measures must address this possibility and build in safeguards to protect data subjects. 

In the light of the above, the new EU SCCs were published in June 2021.
The European Data Protection Board has also published its guidance on the aforementioned required assessment entitled “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data”. Meta’s use of the new EU SCCs and its “additional supplementary measures” were the focus of the DPC’s attention when issuing its decision. 

The Decision 

The DPC ruled that Meta infringed Article 46(1) of GDPR when it continued to transfer personal data from the EU/EEA to the US following the ECJ’s ruling in Schrems II. It found that the measures used by Meta did not address the risks to the fundamental rights and freedoms of data subjects that were identified in Schrems; namely the risk of access to the data by US law enforcement.  

The DPC ruled that Meta should: 

  1. Suspend any future transfer of personal data to the US within five months of the date of the DPC’s decision; 
  2. Pay an administrative fine of €1.2 billion; and, 
  3. Bring its processing operations in line with the requirements of GDPR, within five months of the date of the DPC’s decision, by ceasing the unlawful processing, including storage, in the US of personal data of EEA users transferred in violation of GDPR. 

Meta has said that it will appeal the decision and seek a stay of the ruling before the Irish courts. Its President of Global Affairs, Sir Nick Clegg, said:  

“We are therefore disappointed to have been singled out when using the same legal mechanism as thousands of other companies looking to provide services in Europe. 

“This decision is flawed, unjustified and sets a dangerous precedent for the countless other companies transferring data between the EU and US.” 

The Future of US Transfers 

The Information Commissioner’s Office told the BBC that the decision “does not apply in the UK” but said it had “noted the decision and will review the details in due course”. The wider legal ramifications on data transfers from the UK to the US can’t be ignored. 

Personal data transfers are also a live issue for most UK Data Controllers including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022 but it still requires a Transfer Risk Assessment as well as supplementary measures where privacy risks are identified.  

On 25th March 2022, the European Commission and the United States announced that they had agreed in principle on a new Trans-Atlantic Data Privacy Framework. The final agreement is expected to be in place in summer 2023 and will replace the Privacy Shield Framework. It is expected that the UK Government will strike a similar deal once the EU/US one is finalised. However, both are likely to be challenged in the courts. 

The Meta fine is one of this year’s major GDPR developments, nicely timed within a few days of the 5th anniversary of GDPR. All organisations, whether in the UK or EU, need to carefully consider their data transfer mechanisms and ensure that they comply with Chapter 5 of GDPR in the light of the DPC’s ruling. A “wait and see” approach is no longer an option.  

The Meta fine will be discussed in detail on our forthcoming International Transfers workshop. For those who want a one-hour summary of the UK International Transfer regime, we recommend our webinar. 

US Data Transfers and Privacy Shield 2.0 

On 14th December 2022, the European Commission published a draft ‘adequacy decision’, under Article 45 of the GDPR, endorsing a new legal framework for transferring personal data from the EU to the USA. Subject to approval by other EU institutions, the decision paves the way for “Privacy Shield 2.0” to be in effect by Spring 2023.

The Background

In July 2020, the European Court of Justice (ECJ), in “Schrems II”, ruled that organisations that transfer personal data to the USA can no longer rely on the Privacy Shield Framework as a legal transfer tool as it failed to protect the rights of EU data subjects when their data was accessed by U.S. public authorities. In particular, the ECJ found that US surveillance programs are not limited to what is strictly necessary and proportionate as required by EU law and hence do not meet the requirements of Article 52 of the EU Charter of Fundamental Rights. Secondly, with regard to U.S. surveillance, EU data subjects lack actionable judicial redress and, therefore, do not have a right to an effective remedy in the USA, as required by Article 47 of the EU Charter.

The ECJ stated that organisations transferring personal data to the USA can still use the Article 49 GDPR derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporter to make a complex assessment about the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs. 

Despite the Schrems II judgment, many organisations have continued to transfer personal data to the USA hoping that regulators will wait for a new Transatlantic data deal before enforcing the judgement.  Whilst the UK Information Commissioner’s Office (ICO) seems to have adopted a “wait and see” approach, other regulators have now started to take action. In February 2022, the French Data Protection Regulator, CNIL, ruled that the use of Google Analytics was a breach of GDPR due to the data being transferred to the USA without appropriate safeguards. This followed a similar decision by the Austrian Data Protection Authority in January. 

The Road to Adequacy

Since the Schrems ruling, replacing the Privacy Shield has been a priority for EU and US officials. In March 2022, it was announced that a new Trans-Atlantic Data Privacy Framework had been agreed in principle. In October, the US President signed an executive order giving effect to the US commitments in the framework. These include commitments to limit US authorities’ access to data exported from the EU to what is necessary and proportionate under surveillance legislation, to provide data subjects with rights of redress relating to how their data is handled under the framework regardless of their nationality, and to establish a Data Protection Review Court for determining the outcome of complaints.

Schrems III?

The privacy campaign group, noyb, of which Max Schrems is Honorary Chairman, is not impressed by the draft adequacy decision. It said in a statement:

“…the changes in US law seem rather minimal. Certain amendments, such as the introduction of the proportionality principle or the establishment of a Court, sound promising – but on closer examination, it becomes obvious that the Executive Order oversells and underperforms when it comes to the protection of non-US persons. It seems obvious that any EU “adequacy decision” that is based on Executive Order 14086 will likely not satisfy the CJEU. This would mean that the third deal between the US Government and the European Commission may fail.”

Max Schrems said: 

“… As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again – in flagrant breach of our fundamental rights.”

The draft adequacy decision will now be reviewed by the European Data Protection Board (EDPB) and the European Member States. From the above statements it seems that if Privacy Shield 2.0 is finalised, a legal challenge against it is inevitable.

UK to US Data Transfers 

Personal data transfers are also a live issue for most UK Data Controllers including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. At present use of such services usually involves a complicated Transfer Risk Assessment (TRA) and execution of standard contractual clauses. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022 but it still requires a TRA as well as supplementary measures where privacy risks are identified. 

Good news may be round the corner for UK data exporters. The UK Government is also in the process of making an adequacy decision for the US. We suspect it will strike a similar deal once the EU/US one is finalised.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Our next online GDPR Practitioner Certificate course, starting on 10th January, is fully booked. We have places on the course starting on 19th January. 

We Are Hiring!

Are you a surveillance law expert with a proven track record of delivering practical and engaging training on Part 2 of the Regulation of Investigatory Powers Act 2000 (RIPA) and/or its Scottish equivalent (RIPSA)? 

Due to increased demand, Act Now Training is recruiting trainers to join its team of experts who deliver in-house and external surveillance training courses throughout the UK and online. These range from one hour webinars to full day courses and aim to help local authority staff practically apply the legislation and prepare for Commissioner inspections.

With more courses planned for 2022, including some new ones, we need trainers who enjoy the challenge of explaining difficult concepts in a practical jargon-free way.

We have opportunities for full time trainers as well as those who wish to add an extra “string to their bow” without leaving their day job. You do not have to be a lawyer and indeed our current team includes an ex-police officer and a data protection officer. What is important is that you have practical experience of working with surveillance legislation, have enthusiasm for the subject and want to deliver innovative training (not “death by PowerPoint”) to a range of audiences.

If you think you have what it takes to become an Act Now trainer, please get in touch with your CV explaining your RIPA/RIPSA knowledge and experience. A full privacy policy can be read on our website.

The New Saudi Arabian Federal Data Protection Law 

The Middle East is fast catching up with Europe when it comes to data protection law. The Kingdom of Saudi Arabia (KSA) has enacted its first comprehensive national data protection law to regulate the processing of personal data. This is an important development alongside the passing of the new UAE Federal DP law. It also opens up opportunities for UK and EU Data Protection professionals, especially as these new laws are closely aligned with the EU General Data Protection Regulation (GDPR) and the UK GDPR.

The KSA Personal Data Protection Law (PDPL) was passed by Royal Decree M/19 of 9/2/1443H on 16 September 2021, approving Resolution No. 98 dated 7/2/1443H (14 September 2021). The detailed Executive Regulations are expected to be published soon and will give more details about the new law. It will be effective from 23rd March 2022, following which there will be a one-year implementation period.

Enforcement 

PDPL will initially be enforced by the Saudi Arabian Authority for Data and Artificial Intelligence (SDAIA). The Executive Regulations will set out the administrative penalties that can be imposed on organisations for breaches. Expect large fines for non-compliance alongside other sanctions. PDPL could mirror the GDPR which allows the regulator to impose a fine of up to 20 million Euros or 4% of gross annual turnover, whichever is higher. PDPL also contains criminal offences which carry a term of imprisonment of up to 2 years and/or a fine of up to 3 million Saudi Riyals (approximately £566,000). Affected parties may also be able to claim compensation.
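For comparison, the GDPR cap mentioned above is simple “greater of” arithmetic. A minimal sketch (the turnover figures below are invented for illustration; PDPL’s own administrative penalties will be set by the Executive Regulations):

```python
def max_gdpr_fine_eur(annual_turnover_eur: float) -> float:
    """Upper limit of a GDPR Article 83(5) fine: the higher of
    EUR 20 million or 4% of gross annual worldwide turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A large company: the 4% limb applies.
print(max_gdpr_fine_eur(2_000_000_000))  # 80000000.0
# A small company: the fixed EUR 20m limb applies.
print(max_gdpr_fine_eur(10_000_000))     # 20000000.0
```

The point of the “whichever is higher” structure is that the cap scales with turnover for large organisations while keeping a meaningful floor for small ones.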

Territorial Scope

PDPL applies to all organisations that are processing personal data in the KSA irrespective of whether the data relates to Data Subjects living in the KSA. It also has an “extra-territorial” reach by applying to organisations based abroad who are processing personal data of Data Subjects resident in the KSA. Interestingly, unlike the UAE Federal DP law, PDPL does not exempt government authorities from its application, although there are various exemptions from certain obligations where the data processing relates to national security, crime detection, statutory purposes etc.

Notable Provisions

PDPL mirrors GDPR’s underlying principles of transparency and accountability and empowers Data Subjects by giving them rights in relation to their personal data. We set out below the notable provisions including links to previous GDPR blog posts for readers wanting more detail, although more information about the finer points of the new law will be included in the forthcoming Executive Regulations. 

  • Personal Data – PDPL applies to the processing of personal data which is defined very broadly to include any data which identifies a living individual. However, unlike GDPR, Article 2 of PDPL includes within its scope the data of a deceased person if it identifies them or a family member.
  • Registration – Article 23 requires Data Controllers (organisations that collect personal data and determine the purpose for which it is used and the method of processing) to register on an electronic portal that will form a national record of controllers. 
  • Lawful Bases – Like the UAE Federal DP law, PDPL makes consent the primary legal basis for processing personal data. There are exceptions including, amongst others, if the processing achieves a “definite interest” of the Data Subject and it is impossible or difficult to contact the Data Subject.
  • Rights – Data Subjects are granted various rights in Articles 4, 5 and 7 of the PDPL which will be familiar to GDPR practitioners. These include the right to information (similar to Art 13 of GDPR), rectification, erasure and Subject Access. All these rights are subject to exemptions similar to those found in Article 23 of GDPR.
  • Impact Assessments – Article 22 requires (what GDPR Practitioners call) “DPIAs” to be undertaken in relation to any new high risk data processing operations. This will involve assessing the impact of the processing on the risks to the rights of Data Subjects, especially their privacy and confidentiality.
  • Breach Notification – Article 20 requires organisations to notify the regulator, as well as Data Subjects, if they suffer a personal data breach which compromises Data Subjects’ confidentiality, security or privacy. The timeframe for notifying will be set by the Executive Regulations.
  • Records Management – Organisations will have to demonstrate compliance with PDPL by keeping records. There is a specific requirement in Article 3 to keep records similar to a Record of Processing Activities (ROPA) under GDPR.
  • International Transfers – Like other data protection regimes, PDPL imposes limitations on the international transfer of personal data outside of the KSA. There are exceptions; further details will be set out in the Executive Regulations.
  • Data Protection Officers – Organisations (both controllers and processors) will need to appoint at least one officer to be responsible for compliance with PDPL. The DPO can be an employee or an independent service provider and does not need to be located in the KSA. 
  • Training – After 23 March 2022, Data Controllers will be required to hold seminars for their employees to familiarise them with the new law.

Practical Steps

Organisations operating in the KSA, as well as those who are processing the personal data of KSA residents, need to assess the impact of PDPL on their data processing activities. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so will lead not just to enforcement action but also to reputational damage. The following should be part of an action plan for compliance:

  1. Training the organisation’s management team to understand the importance of PDPL, the main provisions and changes required to systems and processes. 
  2. Training staff at all levels to understand PDPL and how it will impact on their role.
  3. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed.
  4. Reviewing how records management and information risk is addressed within the organisation.
  5. Drafting Privacy Notices to ensure they set out the minimum information that should be included.
  6. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification.
  7. Drafting policies and procedures to deal with Data Subjects’ rights, particularly requests for subject access, rectification and erasure.
  8. Appointing and training a Data Protection Officer.

Act Now Training can help your organisation prepare for PDPL. We are running a webinar on this topic soon and can also deliver more detailed in house training. Please get in touch to discuss your training needs. We are in Dubai and Abu Dhabi from 16th to 21st January 2022 and would be happy to arrange a meeting.

Ronaldo’s Data and GDPR: Who said data protection is boring?

There is an interesting story this morning on the BBC website about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.  

Now 850 players (Ed – I don’t know if Ronaldo is one of them but I could not miss the chance to mention my favourite footballer!), led by former Cardiff City manager Russell Slade, want compensation for the trading of their performance data over the past six years by various companies. They also want an annual fee from the companies for any future use. The data ranges from average goals-per-game for an outfield player to height, weight and passes during a game. 

BBC News says that an initial 17 major betting, entertainment and data collection firms have been targeted, but Slade’s Global Sports Data and Technology Group has highlighted more than 150 targets it believes have “misused” data. His legal team claim that the fact players receive no payment for the unlicensed use of their data contravenes the General Data Protection Regulation (GDPR). However, the precise legal basis of their claim is unclear. 

In an interview with the BBC, Slade said:

“There are companies that are taking that data and processing that data without the individual consent of that player.”

This suggests a claim for breach of the First Data Protection Principle (Lawfulness and Transparency). However, if the players’ personal data is provided by their clubs e.g., height, weight, performance at training sessions etc. then it may be that players have already consented (and been recompensed for this) as part of their player contract. In any event, Data Protection professionals will know that consent is only one way in which a Data Controller can justify the processing of personal data under Article 6 of GDPR. Article 6(1)(f) allows processing where it:

“is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data… .”

Of course, this requires a balancing exercise considering the interests pursued by the clubs and data companies and the impact on individual players’ privacy. Some would argue that as far as public domain information is concerned, the impact on players’ privacy is minimal. However, “the interests or fundamental rights and freedoms of the data subject” also include reputational damage, loss of control and financial loss, all of which it could be argued result from the alleged unauthorised use of data.

The BBC article quotes former Wales international Dave Edwards, one of the players behind the move:

“The more I’ve looked into it and you see how our data is used, the amount of channels it’s passed through, all the different organisations which use it, I feel as a player we should have a say on who is allowed to use it.”

The above seems to suggest that the players’ argument is also about control of their personal data. The GDPR does give players rights over their data which allow them to exercise some element of control including the right to see what data is held about them, to object to its processing and to ask for it to be deleted. It may be that players are exercising or attempting to exercise these rights in order to exert pressure on the companies to compensate them.

Without seeing the paperwork, including the letters before action which have been served on the companies, we can only speculate about the basis of the claim at this stage. Nonetheless, this is an interesting case and one to watch. If the claim is successful, the implications could have far-reaching effects beyond football. Whatever happens it will get data protection being talked about on the terraces!

Ibrahim Hasan, solicitor and director of Act Now Training, has given an interview to BBC Radio 4’s PM programme about this story. You can listen again here (from 39 minutes onwards).

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Covid-19, GDPR and Temperature Checks

Emma Garland writes…

Many countries have now been in some form of lockdown for a considerable length of time. As some of the lockdown measures are slowly being eased, one of the possible solutions to prevent a “second wave” is the implementation of temperature checks in shops and workplaces. This involves placing a thermometer on an individual’s forehead. Of course, if the temperature is recorded or there is another way the individual can be identified, it will involve processing health data. Care must be taken to consider the GDPR and privacy implications.

Apple reopened stores across Germany on 11th May with extra safety procedures, including temperature checks and social distancing. It is now facing a probe by a regional German data protection regulator into whether its plan to take the temperature of its store customers violates GDPR.

The benefits of temperature checks are self-evident. By detecting members of the public or staff who have a high temperature, and not permitting them to enter the store or workplace, staff have less risk of close contact with people who may have COVID 19. Temperature checks are just one small part of stopping the spread of COVID 19 and can be intrusive. What is the lawful basis for processing such data? Art 6(1)(d) of GDPR allows processing where it is:

“…is necessary in order to protect the vital interests of the data subject or of another natural person”

Of course “data concerning health” is also Special Category Data and requires an Article 9 condition to ensure it is lawful. Is a temperature check necessary to comply with employment obligations, for medical diagnosis or for reasons of public health?

All conditions under Articles 6 and 9 must satisfy the test of necessity. There are many causes of a high temperature, not just COVID 19. There have also been doubts over the accuracy of temperature readings. The checks measure skin temperature, which can vary from core temperature, and do not account for the incubation phase of the disease, where people may be asymptomatic.

ICO Guidance

The Information Commissioner’s Office (ICO) has produced guidance on workplace testing which states:

“Data protection law does not prevent you from taking the necessary steps to keep your staff and the public safe and supported during the present public health emergency.
But it does require you to be responsible with people’s personal data and ensure it is handled with care.”

The ICO suggests that “legitimate interests” or “public task” could be used to justify the processing of personal data as part of a workplace testing regime. The former will require a Legitimate Interests Assessment, where the benefit of the data to the organisation is balanced against the risks to the individual. In terms of Article 9, the ICO suggests the employment condition, supplemented by Schedule 1 of the Data Protection Act 2018. The logic here is that employment responsibilities extend to compliance with a wide range of legislation, including health and safety.

More generally, the ICO says that technology which could be considered privacy intrusive should have a high justification for usage. It should be part of a well thought out plan, which ensures that it is an appropriate means to achieve a justifiable end. Alternatives should also have been fully evaluated. The ICO also states:

“If your organisation is going to undertake testing and process health information, then you should conduct a DPIA focussing on the new areas of risk.”

A Data Protection Impact Assessment should map the flow of the data including collection, usage, retention and deletion, as well as the associated risks to individuals.
Some companies are even using thermal cameras as part of COVID 19 testing.
The Surveillance Camera Commissioner (SCC) and the ICO have worked together to update the SCC DPIA template, which is specific to surveillance systems. 

As shops begin to open and the world establishes post COVID 19 practices, many employers and retailers will be trying to find their “new normal”. People will also have to decide what they are comfortable with. Temperature checks should be part of a considered approach evaluating all the regulatory and privacy risks.

Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.

The NHS COVID 19 Contact Tracing App: Part 4 Questions about Data Retention and Function Creep

The first three blog posts in this series have raised many issues about the proposed NHS COVID 19 Contact Tracing App (COVID App) including the incomplete DPIA and lack of human rights compliance. In this final post we discuss concerns about how long the data collected by the app will be held and what it will be used for.

From the DPIA and NHSX communications it appears that the purpose of the COVID App is not just to be part of a contact tracing alert system. The app’s Privacy Notice states:

“The information you provide, (and which will not identify you), may also be used for different purposes that are not directly related to your health and care. These include:

  • Research into coronavirus 
  • Planning of services/actions in response to coronavirus
  • Monitoring the progress and development of coronavirus

Any information provided by you and collected about you will not be used for any purpose that is not highlighted above.”

“Research”

Article 89 of the GDPR allows Data Controllers to process personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, subject to appropriate safeguards set out in Section 19 of the Data Protection Act 2018.

NHSX has said that one of the “appropriate safeguards” to be put in place is anonymisation or de-identification of the users’ data; but only if research purposes can be achieved without the use of personal data. However, even anonymised data can be pieced back together to identify individuals, especially where other datasets are matched.
The Open Rights Group says:

“Claims such as ‘The App is designed to preserve the anonymity of those who use it’ are inherently misleading, yet the term has been heavily relied upon by the authors of the DPIA. On top of that, many statements leave ambiguities…”
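A toy example may help show why such claims are misleading. The sketch below uses entirely hypothetical records and field names (the app’s real data model is not specified at this level of detail); it illustrates the general linkage-attack technique of joining an “anonymised” dataset to a public one on shared quasi-identifiers:

```python
# An "anonymised" dataset stripped of names still carries quasi-identifiers.
anonymised_health = [
    {"postcode_prefix": "S10", "age_band": "30-39", "status": "positive"},
]

# A second, public dataset shares the same quasi-identifiers plus names.
public_register = [
    {"name": "A. Example", "postcode_prefix": "S10", "age_band": "30-39"},
    {"name": "B. Example", "postcode_prefix": "LS1", "age_band": "60-69"},
]

# Joining on the shared attributes re-identifies the "anonymous" record.
for record in anonymised_health:
    matches = [p for p in public_register
               if p["postcode_prefix"] == record["postcode_prefix"]
               and p["age_band"] == record["age_band"]]
    if len(matches) == 1:  # a unique match singles out one individual
        print(f'{matches[0]["name"]} -> {record["status"]}')
```

The smaller the group sharing a combination of attributes, the more likely a unique match becomes; removing names alone does not prevent this.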

There are also legitimate concerns about “function creep”. What exactly does “research into coronavirus” mean? Matthew Gould, the chief executive of NHSX, told MPs the app will evolve over time:

“We need to level with the public about the fact that when we launch it, it will not be perfect and that, as our understanding of the virus develops, so will the app. We will add features and develop the way it works.”

Whilst speaking to the Science and Technology Committee, Gould stated that “We’ve been clear the data will only ever be used for the NHS.” This does not rule out the possibility of private companies getting this data as NHS Data Processors.

Data Retention

Privacy campaigners are also concerned about the length of time the personal data collected by the app will be held, both for contacts and for people who have coronavirus. The DPIA and Privacy Notice do not specify a data retention period: 

“In accordance with the law, personal data will not be kept for longer than is necessary. The exact retention period for data that may be processed relating to COVID-19 for public health reasons has yet to be set (owing to the uncertain nature of COVID-19 and the impact that it may have on the public).

In light of this, we will ensure that the necessity to retain the data will be routinely reviewed by an independent authority (at least every 6 months).”

So, at the time of writing, COVID App users have no idea how long their data will be kept, nor exactly what it will be used for, nor which authority will review it “every six months”. Interestingly, the information collected by the wider NHS Test and Trace programme is going to be kept by Public Health England for 20 years. Who is to say this will not be the case for COVID App users’ data?

Interestingly, none of the 15 risks listed in the original DPIA relating to the COVID App trial (see the second blog in this series) include keeping data for longer than necessary or the lawful basis for retaining it past the pandemic, or what it could be used for in future if more personal data is collected in updated versions of the app. As discussed in the third blog in this series, the Joint Human Rights Committee drafted a Bill which required defined purposes and deletion of all of the data at end of the pandemic. The Secretary of State for Health and Social Care, Matt Hancock, quickly rejected this Bill.

The woolly phrase “personal data will not be kept for longer than is necessary” and the fact NHSX admits that the COVID App will evolve in future and may collect more data give the Government wriggle room to retain the COVID App users’ data indefinitely and use it for other purposes. Could it be used as part of a government surveillance programme? Both India and China have made downloading their contact tracing apps a legal requirement, raising concerns of high-tech social control.

To use the App or not?

Would we download the COVID App in its current form? All four blogs in this series show that we are not convinced that it is privacy or data protection compliant. Furthermore, there are worries about the NHS’s wider coronavirus test-and-trace programme. The speed at which it has been set up, concerns raised by people working in it and the fact that no DPIA has been done further undermine confidence in the whole set up. Yesterday we learnt that the Open Rights Group is to challenge the government over the amount of data collected and retained by the programme.

Having said all that, we leave it up to readers to decide whether to use the app.
Some privacy experts have been more forthcoming with their views. Phil Booth of @medConfidential calls the Test and Trace programme a “mass data grab” and Paul Bernal, Associate Professor in Law at the University of East Anglia, writes that the Government’s approach – based on secrecy, exceptionalism and deception – means our civic duty may well be to resist the programme actively. Finally, if you need a third opinion, Jennifer Arcuri, CEO of Hacker House, has said she would not download the app because “there is no guarantee it’s 100 percent secure or the data is going to be kept secure.” Over to you, dear readers!

Will you be downloading the app? Let us know in the comments section below.

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.

The NHS COVID 19 Contact Tracing App: Part 3 The Human Rights Angle

Everyone will agree that the government needs to do everything it can to prevent the further spread of the Coronavirus and to “save lives” (except if your name is Dominic Cummings -Ed). However, there is much less consensus about what it should do, and this can be seen in the current debate about the proposal to roll out a contact tracing system and the NHS COVID App. This is the third in a series of blog posts where we examine the COVID App from different perspectives.

On 7 May 2020, the Parliamentary Joint Committee on Human Rights (PJCHR) published its report on the proposed contact tracing system and made a series of important recommendations to address its concerns about the compatibility of the scheme with data protection laws and the Human Rights Act 1998. After waiting for two weeks, the Secretary of State for Health, Matt Hancock, replied to the Committee rejecting its proposals as “unnecessary”! Let us examine those proposals in detail.

The Human Rights Considerations

Section 6 of the Human Rights Act 1998 makes it unlawful for any public authority (that includes the UK government and NHSX) to act in a way that is incompatible with a Convention right. Article 8(1) of the ECHR states that “Everyone has the right to respect for his private and family life, his home and his correspondence.” This is not an absolute right. Article 8(2) provides that an interference with the right to privacy may be justified if it:

“is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”

However, the government also has an obligation to protect the “right to life” enshrined in Article 2 of the ECHR. This means that if the NHS COVID App really can prevent the spread of the virus and save lives, then this is going to be a major consideration in deciding whether the interference with Article 8 is necessary and proportionate.

On 7 May the PJCHR published a report on the NHS COVID App which provides a very detailed assessment of some of the human rights implications of the “centralised” approach that the NHS has proposed. The overall conclusion of the report is that if the app is effective it could help pave the way out of current lockdown restrictions and help to prevent the spread of Coronavirus. However, it also concludes that the app, in its current form, raises “significant concerns regarding surveillance and the impact on other human rights which must be addressed first.”

How will the COVID App interfere with the right to privacy?

At first glance it would appear that the COVID App does not involve the transfer of any personal data. As explained in the first blog in this series, app users will be given a unique ID which will be made up of a set of random numbers and the first half of a person’s post code. The NHS website suggests that this ‘anonymises’ the information. However, as the Parliamentary Report notes, there are parts of England where fewer than 10,000 people live in a post code area and as little as 3 or 4 “bits” of other information could be enough to identify individuals. The report also notes that relying upon people self-reporting alone (without requiring confirmation that a person has tested positive for COVID 19) may carry the risk of false alerts, thereby impacting other people’s rights if they have to self-isolate unnecessarily.
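The Committee’s point about “bits” is simple arithmetic: each independent binary fact about a person roughly halves the pool of people they could be. A minimal sketch, assuming the report’s illustrative figure of roughly 10,000 people sharing a post code area (not real app data):

```python
import math

pool = 10_000  # people sharing the same half post code in the report's example

# Bits of information needed to single out one person in that pool.
print(math.log2(pool))  # ~13.3 bits

# Each extra bit (an attribute that splits the pool roughly in half)
# halves the number of remaining candidates.
for bits in range(5):
    print(f"{bits} extra bits -> ~{pool // 2**bits} candidates")
```

This is why a half post code plus a handful of other attributes (age range, household size, reported symptoms and dates) can shrink an “anonymous” pool far faster than intuition suggests.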

Necessary interference?

An interference with a person’s right to privacy under ECHR Article 8 may be justified under Article 8(2) if it is “in accordance with the law” and is “necessary” for the protection of “health” (see above).

To be in accordance with the law, the app must meet the requirements of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA). However, as noted below, the PJCHR believes that the “current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system”. The Committee’s recommendations in relation to this are considered below.

The remaining human rights consideration is whether the interference with people’s private lives is “necessary”. The answer to this depends on whether the use of the app will contribute to reducing the spread of COVID 19 and whether it will save lives.
This in turn depends on whether the app works and on the uptake of the app. 

Although it was reported that uptake of the app on the Isle of Wight has exceeded 50% of the population, this falls short of the 60% that the government had previously suggested was necessary for the app to be effective. It is also debatable whether it necessarily follows that the uptake will be the same on the mainland. If the app is not capable of achieving its objective of preventing the spread of the virus, then the interference with people’s privacy rights will not be proportionate and will not fulfil the requirement of necessity in Article 8(2).

Although many people will probably download the app without thinking about privacy issues (how often do any of us download apps without checking Privacy Notices?), many others may have some real privacy concerns, particularly after the recent media debates. This has not been helped by reports that Serco (the company contracted to train call centre staff for the contact tracing scheme) has accidentally shared the email addresses of 300 contact tracers. Or by the fact that in other parts of the world there is growing concern about the privacy issues related to the use of contact tracing apps. Uptake of the app may be adversely affected if people lack confidence in the way in which data is being processed and why, and in the light of above they may have concerns about data security.

Consequently, the PJCHR’s report includes a series of recommendations aimed at ensuring that “robust privacy protections” are put in place, as these are key to ensuring the effectiveness of the app.

Central to their recommendations was a proposal that the government introduce legislation to provide legal certainty about how personal data will be processed by the COVID App. Although individuals’ data protection rights are protected by the GDPR and DPA 2018, the Committee believes that it is “nearly impossible” for the public to understand what will happen to their data and also that it is necessary to turn government assurances about privacy into statutory obligations. The PJCHR sent a copy of their draft Bill to the Secretary of State, Matt Hancock. However, on 21 May Matt Hancock rejected that proposal on the basis that the existing law provides “the necessary powers, duties and protections” and that participation in contact tracing and use of the app is voluntary.
In contrast, the Australian government has passed additional new privacy protection legislation specifically aimed at the collection, use and disclosure of its COVIDSafe app data.

The Committee’s other recommendations are:

  1. The appointment of a Digital Contact Tracing Human Rights Commissioner to oversee the use, effectiveness and privacy protections of the app and any data associated with digital contact tracing. It calls for the Commissioner to have the same powers as the Information Commissioner. It would appear that Matt Hancock has also rejected this proposal on the basis that there is already sufficient governance in place.
  2. Particular safeguards for children under 18 to monitor children’s use, ensure against misuse and allow for interviews with parents where appropriate. It is noticeable that the Committee has set the age at 18.
  3. The app’s contribution to reducing the severity of the lockdown and to helping to prevent the spread of COVID 19 must be demonstrated and improved at regular intervals for the collection of the data to be reasonable. Therefore the Secretary of State for Health must review the operation of the app on a three-weekly basis and must report to Parliament every three weeks.
  4. Transparency. In the second of this series of blog posts, we noted some of the issues relating to the publication of the Data Protection Impact Assessment. The PJCHR calls for this to be made public as it is updated.
  5. Time limited. The data associated with the contact tracing app must be permanently deleted when it is no longer required and may not be kept beyond the duration of the health emergency. However these terms may be open to some interpretation.

Matt Hancock has written that he will respond to these other issues “in due course”.
It is unclear what this means, but it does not suggest any immediate response.

The Draft Bill

The PJCHR’s draft bill (rejected by Matt Hancock) proposed a number of important provisions, some of which are set out below.

The Bill specifically limited the purpose of the COVID App to:

  1. Protecting the health of individuals who are or may become infected with Coronavirus; and
  2. Preventing or controlling the spread of Coronavirus.

Additionally, it contained provisions that prohibited the use of centrally held data without specific statutory authorisation, and limited the amount of time that data could be held on a smartphone to 28 days, followed by automatic deletion unless a person had notified that they have COVID 19 or suspected COVID 19. It also prohibited “data reconstruction” in relation to any centrally held data. The fact that the Bill includes this seems to suggest an implicit recognition that the Unique IDs are not truly anonymous.

The ‘status’ of the NHS COVID App keeps changing and it remains to be seen when (and if) it will be rolled out. But the Northern Ireland Assembly has already announced it will be working with the Irish government to produce a coordinated response based on a decentralised model. It is reported to be doing this because of the difficulties and uncertainties surrounding the app, and the human rights issues arising from a centralised app.

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. We have 1 place left on the course starting on 11th June. 

Coronavirus and Police Use of Drones

The police have an important role to play in the current coronavirus lockdown. However, their actions must at all times be proportionate, transparent and (above all) lawful. Only yesterday, British Transport Police admitted they had wrongly charged a woman who was fined £660 under coronavirus legislation. Marie Dinou was arrested at Newcastle Central Station on Saturday after she refused to tell police why she needed to travel. A police and Crown Prosecution Service review said she was charged under the wrong part of the Coronavirus Act. The court will be asked to set the conviction aside.

This is not the only recent incident of the police overstepping the mark. By now most of us will have seen the story about a couple walking their dog in the Peak District. The video was filmed by a drone operated by the Derbyshire Police Drone Unit, and broadcast to the nation on BBC news. According to Derbyshire Police’s Twitter feed (which broadcast the same 90 second footage) the police force wanted to reinforce the government message of ‘stay at home’ and to point out this was not getting through, by effectively ‘shaming’ the couple who were captured on camera.

The video has sparked huge controversy in various circles including civil liberties campaign group Big Brother Watch and a leading member of the judiciary. According to the BBC, Big Brother Watch has described the move as ‘sinister and counter-productive’. Ex-Supreme Court Judge Lord Sumption has also been very critical.
On BBC Radio 4’s World at One, Lord Sumption made it clear that the police have no legal power to enforce Government Ministers’ ‘wishes’ and guidance about non-essential travel. Although the government has enacted the Coronavirus Act 2020, this does not give the police any powers to stop individuals from non-essential travel or walking in isolated places. Lord Sumption’s criticism is most tellingly summed up in the following quotation:

“This is what a police state is like, it is a state in which the government can issue orders or express preferences with no legal authority and the police will enforce ministers’ wishes.”

At Act Now we are not able to comment on whether the police have the powers to do this but we respectfully accept Lord Sumption’s view that they did not. Our concern is whether the filming and broadcasting of these individuals was GDPR compliant.
Our conclusion is that it was not.

The use of drones poses a privacy risk. The Police Force took the decision to process this personal data for their own purposes (“to get the message across”). They are therefore Data Controllers and must comply with the General Data Protection Regulation (GDPR) in relation to this processing. Images of individuals constitute personal data where it is possible to identify them from those images (GDPR Article 4(1)). It is entirely possible that the individuals captured in that Derbyshire police video could be identified by their clothing, hair colour and the presence of their dog.

Drones can be used to film people in many locations, often without the knowledge of those being filmed. In these circumstances, the processing of personal data must be lawful (GDPR Article 5 (1)). It is questionable which Article 6 basis the police could rely on here. Arguably processing is necessary for a ‘task carried out in the public interest’. However one would have to ask why it was necessary to film and broadcast these individuals. The police could not rely on ‘legitimate interests’ because this does not apply to processing carried out by public authorities in performance of their task (GDPR Article 6 (1)(f)).

Even if the police could identify a lawful basis, the next question is whether this processing is fair. The ICO guidance states that Data Controllers should only process data in ways that people would reasonably expect and not use it in ways that have unjustified adverse effects on them. I would argue that it is highly unlikely that anybody walking their dog in an isolated part of the Peak District would have any reasonable expectation that they would be secretly filmed by a drone and that their images would be broadcast to the nation in an attempt to shame them. So it seems highly unlikely that this processing is fair.

GDPR also requires transparency when processing personal data. This means data subjects should be made aware that their personal data is being processed and why.
The ‘normal’ transparency requirements (GDPR Articles 12-14) are less onerous for the police when they are processing personal data for law enforcement purposes under Part 3 of the Data Protection Act 2018. However, the police admitted themselves that the filming was for the purposes of ‘getting a message out’ and this does not fit easily within the definition of law enforcement purposes under S.31 DPA 2018. At best the police could try and argue that the processing was for the purposes of preventing threats to public security, but it is really difficult to see how this would succeed when it was just a couple walking their dog on an isolated stretch of path.

The police did not comply with the Information Commissioner’s tips on responsible drone use, in particular the advice about thinking carefully about sharing images on social media. The ICO cautions that drone users should avoid sharing images that could have unfair or harmful consequences. There is also little evidence that the Police had due regard to at least the first three guiding principles laid down in the Surveillance Camera Code of Practice or whether they conducted a Data Protection Impact Assessment.

On balance, the Derbyshire Police’s decision to film individuals taking a walk in an isolated area, in order to get a message across about not travelling unnecessarily was at best misguided, and at worst unlawful. The coronavirus is changing almost all aspects of our daily lives, and social distancing and self-isolating are the new norms. However, when the police take action it is still vital that they comply with their legal obligations in relation to the processing of personal data.

More on this and other developments in our FREE GDPR update webinar. Looking for a GDPR qualification from the comfort of your home office? Our GDPR Practitioner Certificate is now available as an online option.
