The last week has been a really busy one for our managing director and data protection expert, Ibrahim Hasan, with a frenzy of media interviews. Well, not quite a "frenzy", but three is a start!
Ibrahim was first interviewed on BBC Radio 5 live's Drive programme by Anna Foster. He spoke about the rules requiring restaurants and pubs to keep contact details of customers and the GDPR/DPA consequences if things go wrong. He emphasised the importance of business owners complying with data protection laws and educating their staff on their responsibilities.
Later in the day, Ibrahim had his first live television interview which was broadcast on BBC News 24 and BBC News Worldwide. He was asked about the new NHS Contact Tracing App and the privacy implications. He also talked about the consequences of misusing personal data. We are waiting to receive the recording of this interview. In the meantime, you can read the feedback on our social media channels (LinkedIn and Twitter). You can also read more about the previous version of the NHS Contact Tracing App in our blog.
Finally, on 18th September, Ibrahim appeared on BBC Radio Berkshire to talk about the same issue. This followed the case of a woman who was contacted by a bus driver asking her out on a date, using her Test and Trace details!
“A pint of beer and a packet of crisps, Sir? That’ll be £3.90 and your personal data please.”
The government is also intending to place an additional obligation on some businesses, such as restaurants and pubs. The guidance document states:
“The opening up of the economy following the COVID-19 outbreak is being supported by NHS Test and Trace. You should assist this service by keeping a temporary record of your customers and visitors for 21 days, in a way that is manageable for your business, and assist NHS Test and Trace with requests for that data if needed. This could help contain clusters or outbreaks.”
This new requirement to collect and store personal data, alongside encouraging or compelling customers to hand it over, clearly has data protection and privacy implications.
In a statement to the House of Commons on 23rd June 2020, Boris Johnson said, "We will work with the sector to make this manageable." Speaking to the Guardian newspaper the next day, the Information Commissioner's Office (ICO) said it was "assessing the potential data protection implications of this proposed scheme and is monitoring developments". With a week to go before the new rules come into force, both need to get a wriggle on!
Reaction to the Prime Minister's statement on social media was nothing if not predictable. People immediately started discussing which fake name they would use rather than hand over their personal data. Dominic Cummings and Matthew Hancock seem to be popular choices.
As we publish this blog, there have been no changes in legislation and no further emergency COVID-19 regulations. Nor have any changes to licensing laws been proposed in order to enforce the collection of this data.
So how can a restaurant manager or pub landlord justify collecting personal data in these circumstances? Let's consider the lawfulness conditions for processing under Article 6 of GDPR. If a business will not allow someone to dine or drink on its premises unless a name and address is recorded, it cannot use consent as its condition for processing; the customer is not freely giving their data, as they have no real choice if they want to use the premises. There is no contract between the parties at the stage of entering the premises. There is no statutory requirement in law to demand the data, nor any official authority for businesses to require it. And no one is going to die immediately if the data is not handed over, so vital interests cannot be used.
Unless emergency legislation is passed in the next week it appears businesses will have to rely on the “legitimate interests” condition under Article 6 to collect and process the personal data of customers.
If businesses decide it is in their legitimate interests to collect customer contact data, they also need to demonstrate fairness and transparency to meet the requirements of the first data protection principle. This brings us to Privacy Notices. A quick sampling of my local pubs showed that only 3 out of 10 currently have Article 13 compliant Privacy Notices on their websites. All three were part of national chains. The more local, independent pubs do not appear to have a Privacy Notice on their websites. How will these pubs explain to customers why they want their data and what they are going to do with it? Perhaps there will be signs to be read upon entering.
Security of the Data
One of the biggest risks to businesses is failing to keep this newly collected personal data secure, which could result in a data breach under GDPR. Under Article 32 the business needs to take appropriate organisational and technical measures to keep the data secure. Devices will need to be password protected, if not encrypted. Access will have to be controlled. New security policies and procedures will need to be put in place by next week.
In addition, all staff will need to be trained, quickly, in handling this newly collected data. Stories have already surfaced in New Zealand, after this system was introduced there, of female customers being harassed by staff who had taken their details from the contact list.
The government has said that businesses need to keep customer contact data for 21 days. This raises more questions for businesses to consider. How will this be implemented?
Do systems allow this retention period? How will paper records be disposed of securely? There’ll be a run on shredders soon!
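As a rough illustration of how a small venue might operationalise the 21 day retention period, here is a minimal sketch in Python, assuming the contact log is kept as a simple CSV file. The file name and column names (visit_date, name, phone) are our own invention for illustration, not anything prescribed by the guidance.

```python
import csv
from datetime import datetime, timedelta
from pathlib import Path

RETENTION_DAYS = 21  # retention period suggested in the government guidance
LOG_FILE = Path("customer_contact_log.csv")  # hypothetical file: visit_date,name,phone

def purge_expired_records(path: Path = LOG_FILE) -> None:
    """Rewrite the contact log, keeping only rows newer than 21 days."""
    cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
    with path.open(newline="") as f:
        rows = list(csv.DictReader(f))

    kept = [r for r in rows
            if datetime.fromisoformat(r["visit_date"]) >= cutoff]

    with path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["visit_date", "name", "phone"])
        writer.writeheader()
        writer.writerows(kept)

if __name__ == "__main__":
    purge_expired_records()  # run once a day, e.g. from a scheduled task
```

The point is simply that deletion after 21 days needs to happen automatically, rather than relying on someone remembering to do it.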
The government is also asking pubs and restaurants to use apps to enable customers to order at their tables, thus limiting contact with others. The Wetherspoons chain has had such an app for 'table service' for some time. We know the government likes apps, but they too need to be GDPR-compliant.
Furthermore, customers who are unwilling or unable to comply with the new requirements, whether because they object to the collection of their data, do not have ID documents, or are economically excluded because they do not have smartphones and/or bank accounts, face discrimination as they will be unable to access the social spaces that pubs and restaurants provide. There could be challenges against such measures on this basis.
Trust and Burden
Ultimately it will be down to individuals whether they care about their data more than they want a pint or a pie after three long months. It may be that they trust their local restaurant and landlord with their data. Some individuals will decide it's just not worth the hassle and risk for the sake of a socially distanced Sunday lunch.
Some small businesses may decide that the requirement to process customers' personal data in a GDPR compliant way is too much of a burden, considering they have 8 days to prepare on top of re-opening, getting staff back and trained, and making their premises COVID-secure.
Our GDPR Essentials e-learning course is designed to teach frontline staff essential GDPR knowledge in an engaging, fun and interactive way. In just over 30 minutes staff will learn about the key provisions of GDPR and how to keep personal data safe.
Many countries have now been in some form of lockdown for a considerable length of time. As some of the lockdown measures are slowly being eased, one of the possible solutions to prevent a “second wave” is the implementation of temperature checks in shops and workplaces. This involves placing a thermometer on an individual’s forehead. Of course if the temperature is recorded or there is another way the individual can be identified, it will involve processing health data. Care must be taken to consider the GDPR and privacy implications.
Apple reopened stores across Germany on 11th May with extra safety procedures, including temperature checks and social distancing. It is now facing a probe by a regional German data protection regulator into whether its plan to take the temperature of its store customers violates GDPR.
The benefits of temperature checks are self-evident. By detecting members of the public or staff who have a high temperature, and not permitting them to enter the store or workplace, staff have less risk of close contact with people who may have COVID 19. However, temperature checks are just one small part of stopping the spread of COVID 19 and can be intrusive. What is the lawful basis for processing such data? Art 6(1)(d) of GDPR allows processing which:
“…is necessary in order to protect the vital interests of the data subject or of another natural person”
Of course “data concerning health” is also Special Category Data and requires an Article 9 condition to ensure it is lawful. Is a temperature check necessary to comply with employment obligations, for medical diagnosis or for reasons of public health?
All conditions under Articles 6 and 9 must satisfy the test of necessity. There are many causes of a high temperature, not just COVID 19. There have also been doubts over the accuracy of temperature readings. Thermometers take skin temperature, which can vary from core temperature, and do not account for the incubation phase of the disease, during which people may be asymptomatic.
The Information Commissioner’s Office (ICO) has produced guidance on workplace testing which states:
“Data protection law does not prevent you from taking the necessary steps to keep your staff and the public safe and supported during the present public health emergency.
But it does require you to be responsible with people’s personal data and ensure it is handled with care.”
The ICO suggests that "legitimate interests" or "public task" could be used to justify the processing of personal data as part of a workplace testing regime. The former will require a Legitimate Interests Assessment, where the benefit of the data to the organisation is balanced against the risks to the individual. In terms of Article 9, the ICO suggests the employment condition, supplemented by Schedule 1 of the Data Protection Act 2018. The logic used here is that employment responsibilities extend to compliance with a wide range of legislation, including health and safety.
More generally, the ICO says that technology which could be considered privacy intrusive should have a high justification for its use. It should be part of a well thought out plan which ensures that it is an appropriate means to achieve a justifiable end. Alternatives should also have been fully evaluated. The ICO also states:
“If your organisation is going to undertake testing and process health information, then you should conduct a DPIA focussing on the new areas of risk.”
A Data Protection Impact Assessment should map the flow of the data including collection, usage, retention and deletion as well as the associated risks to individuals.
Some companies are even using thermal cameras as part of COVID 19 testing. The Surveillance Camera Commissioner (SCC) and the ICO have worked together to update the SCC DPIA template, which is specific to surveillance systems.
As shops begin to open and the world establishes post COVID 19 practices, many employers and retailers will be trying to find their "new normal". People will also have to decide what they are comfortable with. Temperature checks should be part of a considered approach that evaluates all the regulatory and privacy risks.
Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.
Everyone will agree that the government needs to do everything it can to prevent the further spread of the Coronavirus and to "save lives" (except if your name is Dominic Cummings - Ed). However, there is much less consensus about what it should do, and this can be seen in the current debate about the proposal to roll out a contact tracing system and the NHS COVID App. This is the third in a series of blog posts where we examine the COVID App from different perspectives.
On 7 May 2020, the Parliamentary Joint Committee on Human Rights (PJCHR) published its report on the proposed contact tracing system and made a series of important recommendations to address its concerns about the compatibility of the scheme with data protection laws and the Human Rights Act 1998. Two weeks later, the Secretary of State for Health, Matt Hancock, replied to the Committee, rejecting its proposals as "unnecessary". Let us examine those proposals in detail.
The Human Rights Considerations
Section 6 of the Human Rights Act 1998 makes it unlawful for any public authority (and that includes the UK government and NHSX) to act in a way that is incompatible with a Convention right. Article 8(1) of the ECHR states that "Everyone has the right to respect for his private and family life, his home and his correspondence." This is not an absolute right. Article 8(2) provides that an interference with the right to privacy may be justified if it:
“…is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”
However, the government also has an obligation to protect the "right to life" enshrined in Article 2 of the ECHR. This means that if the NHS COVID App really can prevent the spread of the virus and save lives, then this is going to be a major consideration in deciding whether the interference with Article 8 is necessary and proportionate.
On 7 May the Parliamentary Joint Committee on Human Rights (PJCHR) published a Report on the NHS COVID App and this provides a very detailed assessment of some of the human rights implications of the “centralised” approach that the NHS has proposed. The overall conclusion of the report is that if the app is effective it could help pave the way out of current lockdown restrictions and help to prevent the spread of Coronavirus. However, it also concludes that the app, in its current form, raises “significant concerns regarding surveillance and the impact on other human rights which must be addressed first.”
How will the COVID App interfere with the right to privacy?
At first glance it would appear that the COVID App does not involve the transfer of any personal data. As explained in the first blog in this series, app users will be given a unique ID made up of a set of random numbers and the first half of a person's postcode. The NHS website suggests that this 'anonymises' the information. However, as the Parliamentary Report notes, there are parts of England where fewer than 10,000 people live in a postcode area, and as few as 3 or 4 "bits" of other information could be enough to identify individuals. The report also notes that relying upon self-reporting alone (without requiring confirmation that a person has tested positive for COVID 19) carries the risk of false alerts, thereby impacting on other people's rights if they have to self-isolate unnecessarily.
An interference with a person’s right to privacy under ECHR Article 8 may be justified under Article 8(2) if it is “in accordance with the law” and is “necessary” for the protection of “health” (see above).
To be in accordance with the law, the app must meet the requirements of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA) (http://www.legislation.gov.uk/ukpga/2018/12/contents). However, as noted below, the PJCHR believes that the "current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system". The Committee's recommendations in relation to this are considered below.
The remaining human rights consideration is whether the interference with people's private lives is "necessary". The answer to this depends on whether the use of the app will contribute to reducing the spread of COVID 19 and whether it will save lives.
This in turn depends on whether the app works and on the uptake of the app.
Although it was reported that uptake of the app on the Isle of Wight has exceeded 50% of the population, this falls short of the 60% that the government had previously suggested was necessary for the app to be effective. It is also debatable whether it necessarily follows that the uptake will be the same on the mainland. If the App is not capable of achieving its objective of preventing the spread of the virus, then the interference with people's privacy rights will not be proportionate and will not fulfil the requirement of necessity in Article 8(2).
Although many people will probably download the app without thinking about privacy issues (how many of us actually check Privacy Notices before downloading an app?), many others may have real privacy concerns, particularly after the recent media debates. This has not been helped by reports that Serco (the company contracted to train call centre staff for the contact tracing scheme) accidentally shared the email addresses of almost 300 contact tracers, or by the fact that in other parts of the world there is growing concern about the privacy issues related to the use of contact tracing apps. Uptake of the app may be adversely affected if people lack confidence in the way in which data is being processed and why, and in the light of the above they may have concerns about data security.
Consequently, the PJCHR's report includes a series of recommendations aimed at ensuring that "robust privacy protections" are put in place, as these are key to ensuring the effectiveness of the app.
Central to its recommendations was a proposal that the government introduce legislation to provide legal certainty about how personal data will be processed by the COVID App. Although individuals' data protection rights are protected by the GDPR and DPA 2018, the Committee believes that it is "nearly impossible" for the public to understand what will happen to their data, and that it is necessary to turn government assurances about privacy into statutory obligations. The PJCHR sent a copy of its draft Bill to the Secretary of State, Matt Hancock. However, on 21 May Matt Hancock rejected the proposal on the basis that the existing law provides "the necessary powers, duties and protections" and that participation in contact tracing and use of the app is voluntary.
In contrast, the Australian government has passed new privacy protection legislation specifically aimed at the collection, use and disclosure of data from its COVIDSafe app.
The Committee’s other recommendations are:
The appointment of a Digital Contact Tracing Human Rights Commissioner to oversee the use, effectiveness and privacy protections of the app and any data associated with digital contact tracing. It calls for the Commissioner to have the same powers as the Information Commissioner. It would appear that Matt Hancock has also rejected this proposal on the basis that there is already sufficient governance in place.
Particular safeguards for children under 18 to monitor children's use, guard against misuse and allow for interviews with parents where appropriate. It is noticeable that the Committee has set the age at 18.
The app's contribution to reducing the severity of the lockdown and to helping to prevent the spread of COVID 19 must be demonstrated and improved at regular intervals for the collection of the data to be reasonable. Therefore the Secretary of State for Health must review the operation of the app and report to Parliament every three weeks.
Transparency. In the second of this series of blog posts, we noted some of the issues relating to the publication of the Data Protection Impact Assessment. The PJCHR calls for this to be made public as it is updated.
Time limited. The data associated with the contact tracing app must be permanently deleted when it is no longer required and may not be kept beyond the duration of the health emergency. However these terms may be open to some interpretation.
Matt Hancock has written that he will respond to these other issues “in due course”.
It is unclear what this means, but it does not suggest any immediate response.
The Draft Bill
The PJCHR’s draft bill (rejected by Matt Hancock) proposed a number of important provisions, some of which are set out below.
The Bill specifically limited the purpose of the COVID App to:
Protecting the health of individuals who are or may become infected with Coronavirus; and
Preventing or controlling the spread of Coronavirus.
Additionally, it contained provisions that prohibited the use of centrally held data without specific statutory authorisation and limited the amount of time that data could be held on a smartphone to 28 days, followed by automatic deletion unless a person has notified that they have COVID 19 or suspected COVID 19. It also prohibited "data reconstruction" in relation to any centrally held data. The fact that the Bill includes this seems to suggest an implicit recognition that the unique IDs are not truly anonymous.
The ‘status’ of the NHS COVID App keeps changing and it still remains to be seen when (and if) it will be rolled out. But the Northern Ireland Assembly has already announced it will be working with the Irish government to produce a coordinated response based on a decentralised model. It is reported to be doing this because of the difficulties and uncertainties surrounding the app, and the human rights issues arising from a centralised app.
Yesterday the Prime Minister said England will have a “world-beating” Covid 19 contact tracing system from June. Part of this system is the introduction of the NHS contact tracing app (“the Covid App”) which is currently being trialled on the Isle of Wight.
The app was initially meant to be launched across England in mid-May. Yesterday No.10 suggested this will be done "at a later date." Why the delay? Well, if you look at the recently published Data Protection Impact Assessment (DPIA) for the trial, it's obvious that much more work needs to be done. Here is our analysis of some of the issues raised. (If you are new to this subject we suggest you read the first blog in our series, which discussed how such apps work and the different models which can be used.)
Background to the DPIA
The start of the App project has not been auspicious; nor does it instil confidence in the people running it. How can the public, let alone privacy professionals, trust the government when they say that the app will respect their privacy?
The trial of the app started on the Isle of Wight before the Information Commissioner's Office (ICO) had been given sight of the DPIA. Although it has now seen a copy, the ICO is yet to give a formal opinion. Should the trial have gone ahead in this situation?
As demands grew to see the DPIA, NHSX published it as a .pdf document! However, embedded documents, including the all-important risk register, could not be accessed.
So much for transparency! A few days later the Word version of the DPIA was published, revealing all the documents, but there were typos and some names were not redacted. More importantly, those scrutinising it raised concerns that "high risks" in the original documentation had been listed as only "medium risks" in the risk register. NHSX quickly removed the Word document and only the .pdf version is now available (here). That the trial went ahead before all of the promised, finalised and accurate documentation had been released again does not engender faith in the app's ability to protect users' privacy.
An Ethics Advisory Board has been set up to oversee the Covid App project. In a letter to the Secretary of State for Health and Social Care, the Board spelt out the six principles it expected to be followed: value, impact, security and privacy, accountability, transparency and control.
Some members of the Board have since raised concerns with the press over how the Board's advice has been responded to. They were also unhappy not to have seen the final DPIA before being asked to comment.
Parliament’s Joint Committee on Human Rights has also been scrutinising the Covid App. It has said that it is not reassured that the app protects privacy and believes that it could be unlawful if the large amount of data gathered proved ineffectual. The Committee has even taken the unusual step of drafting a bill which would require all of the collected data to be deleted after the pandemic is over. (We will look at what data the NHS wants to keep for research purposes and why in our fourth and final blog in this series.)
These serious concerns, raised by experts and parliamentarians, will have a big impact on the public uptake of the app.
Privacy by Design
In line with Article 25 of the GDPR, the app's DPIA states that it was designed, and will continue to evolve, with Privacy by Design principles embedded. They include: collecting the minimal amount of data necessary; data not leaving the device without the permission of the user; users' identities being obscured; no third-party trackers; proximity data being deleted from users' phones when no longer required; users being able to delete the app and its data at any time; personal data not being kept for longer than is necessary in the central database; data in the central database not being available to those developing the app except in exceptional circumstances; and provision of any data from the central database being subject to a data protection impact assessment and the establishment of a legal basis for the disclosure.
The key part of any DPIA is the identification of risks and the mitigation that can be put in place to reduce them where possible. The documented risks in the Covid App DPIA include:
Transferring of data outside of the EEA
Misuse of information by those with access
Lack of adequate data processing agreements with relevant data processors
Lack of technical or organisational measures implemented to ensure appropriate security of the personal data
Personal data not being encrypted in transit and/or at rest
Lack of testing which would assess and improve the effectiveness of such technical and organisational measures
Inadequate or misleading transparency information
Misuse of reference code issued by app for test requests and results management
Malicious access to sonar backend by cyber-attack; extraction and re-identification of sonar backend data by combination with other data
Identification of infected individual due to minimal contact – e.g. isolated person with carer who is only contact
Malicious or hypochondriac incorrect self-diagnosis on app
Absence of controls over access to app by children
Lower than expected public trust at launch
Uncertainty about whether users will be able to exercise SRRs in relation to data held in the sonar backend
Uncertainty over retention of individual data items
It is surprising that the Covid App DPIA only identifies 15 risks in such a major project involving sharing Special Category Data. To assess all those risks as low to medium also casts doubts on the robustness of the risk assessments. Recently we heard that wide-ranging security flaws have been flagged by security researchers involved in the Isle of Wight pilot.
There also seems to be a lack of clarity about the data being processed by the app.
In response to the concerns raised, NHSX itself tweeted that the Covid App "does not track location or store any personal information."
This was quickly objected to by many from the data protection community who disagreed with both assertions and argued the app used pseudonymised data and trackers.
The ICO itself states on its website:
“Recital 26 (of the GDPR) makes it clear that pseudonymised personal data remains personal data and within the scope of the GDPR”.
The DPIA itself, however, does state that pseudonymised data will be used and that it is personal data. The mixed messages coming from NHSX will only continue to cause confusion and once again erode trust.
What is even more worrying is that there are some risks that have not been identified in the original DPIA:
There is a risk of function creep and of identification over time as more personal data is added or different research projects add in other identifiable data sets, along with the risk of interpreting smartphone users' movements and interactions.
Users’ rights for their data to be erased under Article 17 of the GDPR have been completely removed once the data is used for research purposes. We’ll explore this more in a later blog around research.
No decision has yet been made on retention periods for research. The data could be kept for too long and breach GDPR Principle 5 (storage limitation).
The collection of personal data could be unlawful as it may breach the Human Rights Act 1998. If the app does not prove effective, it is arguable that it is not necessary and proportionate for the purpose it was created. More on this in the third blog in this series.
It is also unclear how the NHS risk scoring algorithm works, as details have not been published. The Privacy Notice makes no mention of automated processing and is therefore not compliant with Article 13(2)(f) of the GDPR.
At this moment in time, there are still far too many questions and unaddressed concerns relating to the Covid App to reassure the public that they can download it in good faith, and know exactly what will happen to their data.
Feedback from the privacy community should result in a revised DPIA and further scrutiny. Only after all the questions and concerns have been addressed should the app be launched. Yesterday outsourcing firm Serco apologised after accidentally sharing the email addresses of almost 300 contact tracers. The company is training staff to trace cases of Covid-19 for the UK government!
This is the first in a series of four blog posts in which Susan Wolf and Lynn Wyeth take a closer look at the government's proposed NHS COVID 19 contact tracing app (COVID App) from different perspectives.
On 12 April 2020, the UK Government announced that NHSX, a unit of the NHS responsible for digital innovation, was developing a COVID 19 contact tracing app to help in its attempts to combat the coronavirus pandemic. A trial began on the Isle of Wight on 5 May. This could result in the app being improved before it is used more widely across the UK.
In this first blog we explain what the proposed app will look like, how it will work and how it compares with other contact tracing apps. This will be followed by an analysis of the data protection issues raised by the introduction of the app in the UK. The third blog will examine some of the wider privacy and Human Rights concerns and the fourth blog will look at more detailed issues relating to anonymisation, the use of the data for research purposes and the impact on data subjects' rights.
What is Contact Tracing?
Contact tracing has been used for many years throughout the world to enable public health organisations to try and identify who people with contagious diseases have been in contact with so that they can be warned that they may be at risk. It has traditionally involved a manual exercise of a health professional working with a diagnosed patient to try and establish who they may have been in close contact with during the infectious period of the disease. However, with the number of smart phone users worldwide surpassing 3.8 billion (more than half the world’s population) mobile phones can provide a much faster and more accurate tracing system.
What is a Contact Tracing App?
A contact tracing app is a smart phone application that automatically warns people if they have been in close contact with someone who later reports that they have COVID 19 symptoms or who has tested positive. App users are allocated a unique identifier that is transmitted by bluetooth signal from their phone. When they come into close contact with other app users, their unique IDs are exchanged, via bluetooth, between phones. The Telegraph newspaper neatly describes it as a form of "digital handshake."
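To make the "digital handshake" concrete, here is a minimal sketch in Python of two phones exchanging random identifiers and logging when they met. It illustrates the general technique only, not any particular app's implementation, and the class and field names are our own.

```python
import secrets
from datetime import datetime

class Phone:
    """Toy model of a contact tracing app: each phone broadcasts a random
    identifier and records the identifiers of phones it comes close to."""

    def __init__(self) -> None:
        self.my_id = secrets.token_hex(16)  # random ID, contains no personal data
        self.contacts = []                  # list of (other_id, time_seen) pairs

    def handshake(self, other: "Phone") -> None:
        """Simulate two phones within bluetooth range exchanging their IDs."""
        now = datetime.now()
        self.contacts.append((other.my_id, now))
        other.contacts.append((self.my_id, now))

# Usage: Alice's and Bob's phones pass within bluetooth range.
alice, bob = Phone(), Phone()
alice.handshake(bob)
print(bob.contacts)  # Bob's phone now holds Alice's random ID and the time of contact
```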
According to Wikipedia, 15 countries have developed a contact tracing app and many others are in the process of doing so.
The Different App Models
What happens to the information that is stored on a contact tracing app user's phone depends upon the type of app being used. In recent weeks it has become clear that these apps fall into two broad "types" and, according to the Guardian newspaper on 7th May 2020, the world is split between the so-called decentralised and centralised models. What basically differentiates the two models is the way in which the information stored on users' phones is processed and used to notify others.
The distinguishing feature of the "decentralised model" is that unique IDs are matched on a user's smart phone and are not transferred to any central server held by a government or private sector organisation. If a user tests positive for COVID 19 they "inform" the app, which would then identify and notify other app users who have been in close contact with them. The "match" takes place entirely on the user's smart phone.
When a contact receives a notification this too is entirely private to them. In other words, public health or government organisations are not notified that a user has been in proximity to an infected person. The general perception appears to be that the decentralised model is more "privacy friendly". According to the Parliamentary Joint Committee on Human Rights, the Information Commissioner's Office, privacy experts and organisations, as well as the European Parliament and the European Data Protection Board (EDPB), have indicated a preference for a decentralised approach.
Most decentralised models use the Apple and Google application programming interface ("API") which supports contact tracing. This is an important point because it allows the interoperability of bluetooth communication between Apple iPhones and Android phones. The former normally switch off the bluetooth function when the phone is locked; however, this API allows bluetooth to function even when an iPhone is locked, thus enabling contact tracing to operate at all times.
In contrast the “centralised model” involves the transfer of information from the users’ smartphones to a remote server operated by a government organisation or by the private sector on their behalf. The central server then determines who is at risk and who should be notified. The perception is that the centralised model is a less privacy friendly option. However it does allow for useful data to be transferred to a public health organisation and used for epidemiological purposes. A recent BBC article provides a useful graphic illustration of the differences between the two models.
The NHS COVID App
The UK NHS COVID App falls into the general category of "centralised" apps. It is still being piloted on the Isle of Wight and is currently the subject of considerable media and political debate.
Once it is finalised the app will be available for smart phone users to download from the Apple or Google stores. Take up will be voluntary. The information below is based on our current understanding of how the app will work, although this may change in the coming weeks.
Once the app is downloaded users need to provide the first half of their postcode but no other personal information. This will be used along with a random string of numbers to provide each user with their own unique ID. We are told that the first part of the postcode is necessary to enable the NHS to see where there are any COVID 19 hotspots.
When NHS COVID App users come into contact with other app users, their phones will exchange the unique IDs. The app uses bluetooth to measure the distance between people who have the app installed on their phones. The NHS website refers to this as "anonymous proximity information." However, it is debatable whether the unique ID is truly anonymised given the extremely high threshold for complete anonymity.
Once this information is stored on the phone nothing will happen for 28 days.
The information will be deleted unless the app user intervenes by notifying the NHS that they have COVID 19 symptoms or have tested positive. Alternatively app users can delete the app, and this will delete all of the data, although any data already transmitted to the NHS via notification will not be deleted by the app user.
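Putting the pieces above together, a rough sketch of how an app of this kind might compose its unique ID (the first half of the postcode plus random data), hold proximity records for 28 days, and upload them if the user reports symptoms could look like the following. The field names, the exact ID format and the upload payload are assumptions for illustration only; NHSX has not published its implementation.

```python
import secrets
from dataclasses import dataclass, field
from datetime import datetime, timedelta

RETENTION = timedelta(days=28)  # proximity data is held on the phone for 28 days

def make_installation_id(postcode_prefix: str) -> str:
    """Unique ID built from the first half of the postcode plus random data
    (the exact format used by the real app is an assumption here)."""
    return f"{postcode_prefix}-{secrets.token_hex(8)}"

@dataclass
class ProximityEvent:
    other_id: str
    seen_at: datetime

@dataclass
class AppState:
    my_id: str
    events: list = field(default_factory=list)

    def record_contact(self, other_id: str) -> None:
        self.events.append(ProximityEvent(other_id, datetime.now()))

    def purge_old_events(self) -> None:
        """Delete proximity records that are more than 28 days old."""
        cutoff = datetime.now() - RETENTION
        self.events = [e for e in self.events if e.seen_at >= cutoff]

    def report_symptoms(self) -> dict:
        """The 'centralised' step: the user's own ID and the IDs they have
        been near are sent to the central server for risk scoring."""
        return {"reporter": self.my_id,
                "contacts": [e.other_id for e in self.events]}

# Usage
app = AppState(my_id=make_installation_id("LS1"))
app.record_contact(make_installation_id("YO1"))
payload = app.report_symptoms()  # what would be uploaded to the central server
```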
It has been reported that Apple and Google have refused to make their API available to the NHS to support the use of the NHS app. It remains unclear what the current situation is regarding this.
As it currently stands (and to the best of our knowledge) the app has one central question: "How are you feeling today?" If the app user taps that they are feeling unwell, they are then asked whether they have a high temperature and a persistent cough. If a person indicates that they have both these symptoms, then they are prompted to select the date when the symptoms started.
The 'centralised' feature of this app is that if somebody reports that they are ill with COVID 19 or have symptoms, then the NHS will receive the unique ID of the person reporting that they are ill along with the unique IDs of all the other people they have come into proximity with. It is this transfer of data from the app user's phone to a remote server that makes the system 'centralised'.
However, it remains unclear whether notification is mandatory or voluntary. According to the NHS website, users can “allow the NHS COVID 19 app to inform the NHS”.
This wording suggests that this notification to the NHS is voluntary. If this is the case, then this raises some concerns about the value of the system since it would appear to depend upon voluntary notification. There are concerns that if people notify on the basis of symptoms alone it could result in over notification. In Germany the contact tracing app will only trigger alerts if users have tested positive for COVID 19.
On receipt of the information the NHS will use a “risk algorithm” to determine whether the people the user has come into contact with need to be notified. If it identifies that other users need to be notified, they will receive an alert.
The success of the app relies upon various factors including:
Sufficient take-up by members of the public. At the moment it looks like the app will be voluntary. It has been reported that government aides think that the app will need to be downloaded by 60% of the population in order to be effective.
Transport Secretary Grant Shapps said at the daily briefing on Thursday that more than 72,300 of the 140,000 residents on the Isle of Wight have downloaded the app.
The technology working (see above regarding the Apple and Google programming interface).
The willingness of members of the public to notify the app that they have tested positive or have COVID 19 symptoms. The former depends upon the availability of testing facilities and the fast turnaround of test results. In a letter to Health Secretary Matt Hancock, the chairman of the Royal College of GPs said long wait times were "undermining confidence" in the results.
The extent to which members of the public will be willing to install and use the app will no doubt depend on whether members of the public believe that the use of the app will help reduce the spread of the virus and save lives. But for others there will inevitably be concerns about the privacy implications of using the app. Some important questions need to be answered:
What will happen to the data after it has been used?
How long will it be held?
Is there a danger of the data being used for other purposes?
What if use of the app is made a condition of an "immunity passport"?
The answers to these questions will have a big impact on the extent to which the app complies with GDPR and Human Rights law. We will be looking at these questions in more detail in forthcoming blogs. Stay tuned!
During the current coronavirus pandemic, the health and social care sector as well as the emergency services are all providing an amazing service to those who are in need of urgent medical treatment. This will almost always require the sharing of personal data between organisations.
Even during a pandemic, it is important to note that GDPR still applies to ensure individuals' privacy is protected whilst vital services are provided. On 19th March 2020 the European Data Protection Board issued a statement on the processing of personal data in the context of the COVID 19 outbreak, in which it emphasised this point:
“Data protection rules (such as the GDPR) do not hinder measures taken in the fight against the coronavirus pandemic. The fight against communicable diseases is a valuable goal shared by all nations and therefore, should be supported in the best possible way.
It is in the interest of humanity to curb the spread of diseases and to use modern techniques in the fight against scourges affecting great parts of the world. Even so, the EDPB would like to underline that, even in these exceptional times, the data controller and processor must ensure the protection of the personal data of the data subjects.”
The first data protection principle in Article 5 (1) requires Data Controllers to process personal information “lawfully, fairly and in a transparent manner”. Processing personal data is only lawful if one or more of the six lawful bases listed in Article 6 (1) applies.
If a Data Controller processes personal data about a person’s health (which is a class of Special Category Data) then they must additionally identify one of the ten lawful bases set out in Article 9 (2). These are more detailed than those in Article 6, and are fleshed out further in Schedule 1 of the Data Protection Act 2018. However, there are some overlaps. For example ‘consent’ is a lawful basis in Article 6 (1)(a) and ‘explicit consent’ appears in Article 9(2)(a). Similarly ‘vital interests’ appears in both Articles 6 and 9, however there are differences between the two which we explore below.
Article 6 (1) (d) provides that the processing of personal data is lawful if the processing is necessary to protect the vital interests of the data subject or of another natural person. This raises three points for discussion.
What are vital interests?
When will processing be ‘necessary’?
When can it be used to protect the vital interests of ‘another natural person’?
GDPR Recital 46 specifically refers to processing for the monitoring of epidemics, and it seems this lawful basis is intended to be used in situations such as the current pandemic. But what about other interests? Are they vital?
During a recent GDPR workshop one delegate asked whether a person's financial interests could be classed as a 'vital interest' (after all, we all need money to live). The answer is no, because the word 'vital' is interpreted very narrowly. Recital 46 refers to processing that is "necessary to protect an interest which is essential for the life of the data subject or that of another natural person". The ICO's interpretation is that this generally only applies where it is necessary to protect someone's life.
Our Example. Sam becomes acutely ill at work and his employer phones the ambulance service. The employer gives the paramedics Sam's name and address. The employer can rely on the vital interests lawful basis to share this information. If the paramedics need access to Sam's health records, then the GP will be able to share them for the same reason but will additionally require an Article 9 lawful basis (see below).
However, in our view vital interests can also include situations where there is a risk of significant harm to life. Therefore if an elderly person is forced to self-isolate and depends upon a group of volunteers collecting their essential prescription medicines, then sharing that person’s name and address is arguably necessary to protect their vital interests.
The processing must be "necessary" in order to protect a person's vital interests. The key question is whether a Data Controller can reasonably protect a person's vital interests without the processing (sharing their personal data). If they can, then the processing will not be necessary; if they cannot, then it will be. In the above example, if the employer refused to give the paramedics Sam's name and address then this could potentially threaten their ability to offer him life-saving treatment. Therefore the sharing of Sam's personal data is necessary to protect Sam's vital interests.
Protecting the Vital Interests of Other Persons
Those familiar with the Data Protection Act 1998 will know that the lawful basis in Article 6(1)(d) is very similar to the one listed in paragraph 4 of Schedule 2 of the 1998 Act. Unlike the old DPA, the GDPR extends this lawful basis to processing that is necessary to protect the vital interests of "another natural person." However, Recital 46 cautions that "Processing of personal data based on the vital interest of another natural person should in principle take place only where the processing cannot be manifestly based on another legal basis".
Back to our example. When the paramedics take Sam away in the ambulance, they ask for the names of any employees he may have come into contact with because they are concerned for their health. Can the employer rely on Article 6(1)(d) to share their names? The answer is no if the employer can find an alternative lawful basis such as consent.
Consequently, as the ICO notes, the processing of one individual’s personal data to protect the vital interests of another is likely to happen only rarely. The ICO gives an example of the processing of a parent’s personal data to protect the vital interests of their child.
What about processing of personal data to save the lives of many others, for instance in a pandemic situation? Recital 46 suggests that this lawful basis may be used to process personal data for this purpose. But it also states that this basis should only be used where processing cannot be based on another legal basis. This could include "legal obligation" or "official authority".
Special Category Data
A Data Controller sharing health information (or any other Special Category Data) also needs to identify a lawful basis under Article 9 of GDPR. This allows processing if it "is necessary to protect the vital interests of the data subject or of another natural person where the data subject is physically or legally incapable of giving consent."
This basis is more rigorous than its counterpart in Article 6. It permits the processing of Special Category Data if the processing is necessary to protect the vital interests of the data subject or of another natural person, but only "where the data subject is physically or legally incapable of giving consent." This clearly allows medical practitioners to share health data in emergency medical situations where a patient is unable to consent.
If a patient is fit and able (physically and mentally) to give consent, then a Data Controller cannot rely on Article 9(2)(c).
Example: a volunteer group has compiled a database of the names and addresses of residents who need their prescriptions collecting. They share these names and addresses with volunteers. The group has asked volunteers to log details of any residents who have COVID 19 symptoms so that they can take steps to protect the lives of the volunteers. The group can only process this information if the person with symptoms explicitly consents to their information being shared (and they understand exactly why their information is being shared). If they are physically able to consent (or refuse to give consent) then the group cannot rely on the vital interests condition.
Although the temptation may be to assume that sharing health data is permissible in the circumstances, the vital interests condition in Article 9(2)(c) has its limits.
Volunteer groups may need to take steps to obtain consent from data subjects and be prepared to explain exactly why they want this information. Article 9 does provide further lawful conditions which may be relevant (Articles 9(2)(h) and (i)). We will consider the use of these in a future blog post.
Many established charities and recently formed volunteer groups are also now providing essential support services for those members of the community who are at risk, or vulnerable or in need. In order to do this these services may need to share personal data about such people, and often about their health. Whilst this is laudable, they too must be mindful of the GDPR implications. Our recent blog post about Covid 19 volunteer groups goes into more detail.
The outbreak of the coronavirus and the sad news about so many people dying every day is changing all our lives dramatically. As many of us try to come to terms with the 'new normal' of staying home and working from home, the implications of all of this from a data protection and privacy perspective are not likely to be at the forefront of our minds. However, this is also a period in which we face some of the greatest erosions of our basic freedoms in terms of legal restrictions (Health Protection (Coronavirus, Restrictions) (England) Regulations 2020). As more and more people, particularly those with COVID 19 symptoms, the old and the most vulnerable members of society, are forced to self-isolate, local volunteer groups are springing up to support them.
As the BBC noted on 23rd March 2020, more than a thousand volunteer groups have been set up to help the most vulnerable members of their community.
Even though these groups are doing this with the very best of intentions they still need to comply with data protection laws, specifically the General Data Protection Regulation (GDPR). However, as the Information Commissioner’s blog on this subject makes clear, GDPR does not stop these groups from processing and sharing personal data to support people. The ICO has published some general guidance on its approach during this period. It states that it will not “penalise organisations that… need to prioritise other areas or adapt their usual approach during this extraordinary period”. This appears to suggest that the ICO will not take any regulatory action against a volunteer group that is processing personal data to help others during the current crisis.
In this blog we consider why the GDPR applies, and what basic practical steps volunteer groups should take to ensure they do not fall foul of the legislation.
Most volunteer groups will hold at least two lists of people: a list of those who need help (such as the elderly or people at risk) and a list of volunteers. It is likely that the names of people needing help will be shared with those offering help, but they could equally be shared with emergency services if necessary.
The act of compiling lists of contact details and storing them on a PC or sharing them with group administrators falls squarely within the definition of ‘personal data’ and ‘processing’ in GDPR Article 4. This also means that the volunteer group becomes a Data Controller and must ensure that it complies with the GDPR. Volunteer groups cannot take advantage of the processing “in the course of a purely personal or household activity” exemption in Article 2(2)(c).
The personal data processed is likely to be limited to name, address and telephone number, but will almost certainly include Special Category Data (defined in GDPR Article 9) if any health information is recorded. Volunteer groups will need to be careful to ensure they only collect relevant personal data, otherwise they will breach the 'data minimisation principle' in GDPR Article 5(1)(c).
The most fundamental requirement of GDPR is that the processing of personal data is 'lawful, fair and transparent' (the first data protection principle in GDPR Article 5(1)(a)). Processing of personal data, even in these extraordinary times, is only lawful if the Data Controller has identified one of the lawful bases in Article 6. Consent is likely to be the most obvious lawful basis. However, people must know exactly what they are consenting to and they must understand why their personal data is being processed and who it might be shared with. Alternatively, the processing may be necessary for the legitimate interests of the Data Controller or other third parties (such as the people receiving help). Given the circumstances surrounding the compilation of local lists, and the difficulties in securing consent, this is likely to be the most flexible and useful lawful basis. However, it also requires groups to consider how the processing impacts the interests and fundamental freedoms of data subjects; in essence, to consider the reasonable expectations of data subjects. So, for example, a person who gives their name because they need help would not expect their name to be widely shared on social media. If a person's health and safety is at risk, then the volunteer group may be able to rely on the 'vital interests' condition in Article 6(1)(d).
Volunteer groups will inevitably collect some health data, which means that in addition to an Article 6 condition, they need to satisfy one of the lawful conditions in GDPR Article 9. The most obvious one, albeit limited, is if the processing (for instance, sharing) is necessary to protect a person's "vital interests", such as saving their life. However, this only applies where the data subject is physically incapable of giving consent. For example, if a volunteer knows that an elderly person is not responding to their calls and is concerned that they may be very ill, then they could share this information with the emergency services or the GP. Other possibilities (aside from explicit consent) could include Article 9(2)(i): "processing is necessary for reasons of public interest in the area of public health". The ICO blog suggests that 'safeguarding individuals' is a possibility, but it isn't clear what specific Article 9 condition it is referring to.
GDPR practitioners are likely to be very familiar with the transparency requirements in GDPR Articles 12-14. However, small volunteer groups are unlikely to have a website or the time and resources to draft detailed Privacy Notices. Although the ICO's blog suggests that it is best for groups to have a Privacy Notice (and even provides a link to a template), it also recognises that if this is going to delay vital support then groups can just speak to people. However, it cautions that they need to be clear, honest and open about what they are doing with people's personal data. Therefore groups may be well advised to produce a short statement, when they collect personal data, which briefly explains why they are collecting it and how they propose to use it.
We all hope that this crisis will be over very soon and we can get back to our normal lives. However, some volunteer groups may be tempted to continue offering a neighbourly support service. Although this is to be applauded, it raises data protection issues, specifically compliance with the other data protection principles listed in GDPR Article 5. The personal data was collected for a specific purpose and should not be used, after the crisis has ended, for other purposes unless those new purposes are compatible with the original purpose (the 'purpose limitation principle' in GDPR Article 5(1)(b)).
In any event, the personal data collected should not be kept for longer than is necessary for the purposes for which it was originally processed (the 'storage limitation principle' in GDPR Article 5(1)(e)). This means that, after the crisis, people who have supplied their contact and even health details have a right to expect that their personal data will be safely destroyed. Personal data must also be accurate and up to date (the 'accuracy principle' in GDPR Article 5(1)(d)). This is another reason for destroying the personal data once things get back to normal.
Even with limited resources, volunteer groups must take appropriate steps to protect personal information against any unauthorised or unlawful processing and against accidental loss (the 'integrity and confidentiality principle' in GDPR Article 5(1)(f)). Only a small number of people should have access to the data, and it should be securely stored. This is particularly important given that a lot of the data will concern vulnerable people.
Nobody is suggesting that volunteer groups become GDPR experts overnight, but they still need to ensure basic compliance with their GDPR obligations. The ICO has published guidance on its website and a useful set of Q&As.
More on this and other developments in our GDPR webinars. Looking for a GDPR qualification from the comfort of your home office? Our GDPR Practitioner Certificate is now available as an online option.