In March 2020, businesses found themselves having to quickly adapt to managing a remote workforce. The IT department felt the pressure to create the infrastructure to enable this and information security teams looked for ways to effectively monitor the network in the new world. Remote working brings with it a number of data protection and privacy challenges.
Challenge One – People
The number one cause of personal data breaches is people. It only takes a momentary lapse of concentration for a senior manager to email the salary and sickness absence details of their entire team to external clients, or for a very busy CEO to leave their laptop on a train.
There will always be an element of risk to handling personal data, but the acknowledgement of this with mitigation and management can drastically reduce the risk of a large-scale reportable data breach.
Understanding the following can all assist with the risk management strategy of an organisation:
How the workforce usually operates in the office versus how people may have to set up their working environment at home
How their emotions and mental health may be affected during these difficult times, and how this could impact their work
What employees need to retain some form of ‘normality’ for their remote working
Challenge Two – Technology
Many employees now work on laptops, and some office workers are used to the occasional day working from home. When this becomes a full-time arrangement for a large number of staff all at once, the technology supplied to employees is put to the test by the almost instantaneous move to remote working.
Managing data appropriately, and knowing what data is where, makes governance of risk far easier for those working in cyber security; it is often only once something goes wrong that unknown ways of working come to light!
Whilst working at home, it is far more tempting for employees to use personal devices, removable storage devices or their own personal drives to access data when easy access to what they need is restricted. Remote access to commonly used applications allows data to be retained in applications already approved by the organisation, maintaining visibility and reducing the risk of additional copies of data being generated or used inappropriately by staff.
Lockdown led a number of individuals to download video conferencing applications to keep in touch with family and friends. For some businesses, the use of video conferencing was not an option prior to March, but now most meetings occur across Teams, Skype or Zoom. The use of video conferencing brings with it many additional risks for a business and the security team must be satisfied that the exchanges within the application are protected by the required company standard.
The press has reported on several cases of “Zoom Bombing” whereby third parties invade organised meetings and cause disruption. The unwelcome guests have been reported to have shared distressing images or displayed inappropriate language to all attendees, some of which have led to police investigations.
Email traffic over the past twelve weeks has inevitably risen for all businesses as workers seek to connect with their colleagues. The amount of data being generated and shared has understandably increased and organisations need to consider this risk over the coming months as business approaches adapt to the new normal.
Inboxes tend to be the hardest data records to manage effectively, and ultimately the user needs to take ownership of the issue. Phishing emails are also one of the most common methods attackers use to compromise a system, so it is imperative users know what to look out for and how to report potential threats.
Awareness campaigns and an active push from managers for their staff to review their inboxes and ‘purge’ what they no longer need are good ideas.
Challenge Three – Paper
Some organisations still rely heavily on paper printouts to run their operations. With individuals now working from home, there needs to be a greater awareness amongst staff around how to appropriately handle paper records and most importantly, how to securely destroy them.
Where employees need to print out records, they should be advised how to manage these at home whilst the phased return to offices continues.
Challenge Four – Data Sharing
Without the option of walking over to someone’s desk to ask a question, people are using email and other communications platforms to deal with queries and share documents.
Data sharing can test the principle of data minimisation as human nature often leads people to share far more than is required for the purpose. Engaging with employees and reminding them of how they must take the time to anonymise data where possible, or remove the excess columns from a spreadsheet before sending it, could prove useful in combatting the problem.
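In practice, stripping a spreadsheet down to the columns the recipient actually needs can be automated rather than left to memory. A minimal sketch using Python's standard csv module (the function name and sample data are invented for illustration, not part of any organisation's tooling):

```python
import csv
import io

def minimise_columns(csv_text, allowed_columns):
    """Return CSV text containing only the columns needed for the purpose."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=allowed_columns, lineterminator="\n")
    writer.writeheader()
    for row in reader:
        # Copy across only the approved columns; everything else is dropped.
        writer.writerow({col: row[col] for col in allowed_columns})
    return out.getvalue()

staff_csv = (
    "name,email,salary,department\n"
    "A. Jones,a.jones@example.com,32000,Finance\n"
    "B. Patel,b.patel@example.com,41000,Legal\n"
)

# Share only what the recipient needs: names and departments, not salaries.
print(minimise_columns(staff_csv, ["name", "department"]))
```

Building the shared file from an explicit allow-list, rather than deleting columns by hand, means the sensitive fields never enter the outgoing copy in the first place.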
A recent example of where email communications can go horribly wrong is the disclosure of abuse survivors' details, where the sender of a monthly newsletter failed to anonymise the recipients' data before pressing send.
One way to manage and control the sharing within an organisation is to ensure the data protection policy has clear guidelines around company approved data sharing platforms. The key to keeping data sharing under control is to make the preferred method easy! If too much effort is required with granting external access to a sharing portal, uploading documents with passwords and then having to send links, people will stray and resort to the easier method of email attachments.
So as staff begin to return to work, here are some more practical tips to protect personal data:
Engage with staff to gain an understanding of how their ways of working have changed and what difficulties they are facing with data management.
Ensure that the company policies around remote working, data protection and information security are up-to-date and accessible to all.
Offer a remote IT helpdesk service for employees who are having difficulties operating their hardware or software from home, to prevent them resorting to their own devices.
Ensure staff are installing software updates onto their work devices.
Raise awareness of phishing emails and remind staff how to report them safely.
Secure cloud storage solutions should be in place and staff should know how to use them.
Communicate the data breach or incident management procedure to staff.
Account for any additional processing that has been required to take place over the past few months in the Record of Processing Activities.
Samantha Smith is a Data Protection Manager and qualified Solicitor with experience of data protection compliance projects across both public and private sectors.
Our GDPR Essentials e-learning course is designed to teach frontline staff essential GDPR knowledge in an engaging, fun and interactive way. In just over 30 minutes staff will learn about the key provisions of GDPR and how to keep personal data safe.
Along with pubs, restaurants and places of worship, many businesses have now re-opened after the lockdown and are requiring their staff to return to work. There has been a lot of guidance about how the physical aspect of premises can facilitate a safe return, but it is also important that employers do not forget the need for good data protection practice. Much of the process of leaving the office may have been done hastily, but many of the practices that are now established will be in place for a significant time to come.
In short, the principles are the same as they always have been. Data protection does not prevent employers from using personal data in a new way to ensure both the workplace and employees are safe. However, it is important that the risks associated with new personal data processing activities are recognised and addressed.
Whether an employer wants to create records of staff who are self-isolating, needs information to understand which staff are vulnerable, or intends to share staff data with the NHS, Data Protection Impact Assessments (DPIAs) are an important tool for planning purposes. They will help to clarify the specified aim, the information flow and the risks associated with the processing. The DPIA will require answers to questions such as: what do we want to achieve and what personal data do we need to do it? What systems are we going to use and who is responsible for the data? What are the risks to Data Subjects and how are we going to address them?
Communication is vital. The Information Commissioner’s Office (ICO) states in its blog: “Be clear, open and honest with staff about their data”. There might be changes in policy and procedure which have an impact on the processing of employee personal data. Employers should consider whether they need to update their privacy notices or even create additional ones.
Now is also a good time to think about physical premises and the impact on data security. If employers have implemented a one-way system, does this make it easier for someone to gain access to personal data?
Whatever measures are implemented during and after the pandemic, employees must still be able to exercise their data protection rights. If personal data is not clearly organised across systems, with logical steps in an information flow, then it might not be possible to comply with subject access requests.
Other important steps include amending the organisation’s Record of Processing Activity (RoPA) and the Information Asset Register. Retention periods must also be carefully considered. This is a time of uncertainty, which makes ‘just-in-case’ retention periods tempting, but they should be avoided. There is nothing wrong with telling people that information has been destroyed because it reached the end of the retention period for the specified purpose for which it was collected.
Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places left on the course starting on 6th August.
“A pint of beer and a packet of crisps, Sir? That’ll be £3.90 and your personal data please.”
For some businesses, such as restaurants and pubs, the government also intends to impose an additional obligation. The guidance document states:
“The opening up of the economy following the COVID-19 outbreak is being supported by NHS Test and Trace. You should assist this service by keeping a temporary record of your customers and visitors for 21 days, in a way that is manageable for your business, and assist NHS Test and Trace with requests for that data if needed. This could help contain clusters or outbreaks.”
This new requirement to collect and store personal data, alongside encouraging or compelling customers to hand it over, clearly raises data protection and privacy implications.
In a statement to the House of Commons on 23rd June 2020, Boris Johnson said, “We will work with the sector to make this manageable.” Speaking to the Guardian newspaper the next day, the Information Commissioner’s Office (ICO), said it was “assessing the potential data protection implications of this proposed scheme and is monitoring developments”. With a week to go before the new rules come into force, both need to get a wriggle on!
Reaction to the Prime Minister’s statement on social media was entirely predictable. People immediately started discussing which fake name they would use rather than hand over their personal data. Dominic Cummings and Matthew Hancock seem popular choices.
As we publish this blog, there have been no changes in legislation and no further emergency COVID-19 regulations. Nor have any changes to licensing laws been proposed in order to enforce the collection of this data.
So how can a restaurant manager or pub landlord justify collecting personal data in these circumstances? Let’s consider the lawfulness conditions under Article 6 of GDPR for processing data. If a business will not allow someone to dine or drink in its premises unless a name and address is recorded, they cannot use consent as their condition for processing. The customer is not freely giving their data as they have no real choice if they want to use the premises. There is no contract between the parties at the stage of entering the premises. There’s no statutory requirement in law to demand it or any official authority for businesses to require it. No-one is going to die immediately if the data is not handed over so vital interests cannot be used.
Unless emergency legislation is passed in the next week it appears businesses will have to rely on the “legitimate interests” condition under Article 6 to collect and process the personal data of customers.
If businesses decide it is in their legitimate interests to collect customer contact data, they also need to demonstrate fairness and transparency to meet the requirements of the first data principle. This brings us to Privacy Notices. A quick sampling of my local pubs showed only 3 out of the 10 currently have Article 13 compliant Privacy Notices on their websites. All three were part of national chains. The more local independent pubs do not appear to have a Privacy Notice on their website. How will these pubs explain to customers why they want their data and what they are going to do with it? Perhaps there will be signs to be read upon entering.
Security of the Data
One of the biggest risks to businesses is failing to keep this newly collected personal data secure, which could result in a data breach under GDPR. Under Article 32 the business needs to take appropriate organisational and technical measures to keep the data secure. Devices will need to be password protected, if not encrypted. Access will have to be controlled. New security policies and procedures will need to be put in place by next week.
In addition, all staff will need to be trained, quickly, in handling this newly collected data. Stories have already surfaced in New Zealand, where this system was introduced, of female customers being harassed by staff who had taken their details from the contact list.
The government has said that businesses need to keep customer contact data for 21 days. This raises more questions for businesses to consider. How will this be implemented?
Do systems allow this retention period? How will paper records be disposed of securely? There’ll be a run on shredders soon!
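The 21-day rule is, at heart, a retention schedule, and could be enforced with a simple scheduled purge. A minimal sketch, assuming contact records are held as a list of dicts with a collection timestamp (a hypothetical structure, not any official Test and Trace schema):

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 21  # retention period stated in the government guidance

def purge_expired(records, now=None):
    """Drop contact records older than the retention period.

    Each record is assumed to carry a 'collected_on' datetime; this layout
    is invented for illustration.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["collected_on"] >= cutoff]

now = datetime(2020, 7, 4)
records = [
    {"name": "M. Hancock", "collected_on": datetime(2020, 6, 10)},   # 24 days old
    {"name": "D. Cummings", "collected_on": datetime(2020, 6, 20)},  # 14 days old
]
kept = purge_expired(records, now=now)
print([r["name"] for r in kept])  # the 24-day-old record is dropped
```

Run daily (by cron or a scheduled task), a purge like this keeps the dataset within the stated period; paper records have no such shortcut, hence the shredders.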
The government is also asking pubs and restaurants to use apps to enable customers to order at their tables thus limiting contact with others. The Wetherspoons chain has had such an app for ‘table service’ for some time. We know the government likes apps but they too need to be GDPR-compliant.
Furthermore, customers who are unwilling or unable to comply with the new requirements, whether because they object to the collection of data, do not have ID documents, or are economically excluded because they do not have smartphones and/or bank accounts, face discrimination: they will be unable to access the social spaces that are pubs and restaurants. There could be challenges against such measures on this basis.
Trust and Burden
Ultimately it will be down to individuals as to whether they care about their data enough or would prefer a pint or a pie after 3 long months. It may be that they trust their local restaurant and landlord with their data. Some individuals will decide it’s just not worth the hassle and risk for the sake of a socially distanced Sunday lunch.
Some small businesses may decide that the requirement to process customers’ personal data in a GDPR-compliant way is too much of a burden, considering they have 8 days to prepare on top of re-opening, getting staff back and trained, and making their premises COVID-secure.
Many countries have now been in some form of lockdown for a considerable length of time. As some of the lockdown measures are slowly being eased, one of the possible solutions to prevent a “second wave” is the implementation of temperature checks in shops and workplaces. This involves placing a thermometer on an individual’s forehead. Of course, if the temperature is recorded, or there is another way the individual can be identified, it will involve processing health data. Care must be taken to consider the GDPR and privacy implications.
Apple reopened stores across Germany on 11th May with extra safety procedures, including temperature checks and social distancing. It is now facing a probe by a regional German data protection regulator into whether its plan to take the temperature of its store customers violates GDPR.
The benefits of temperature checks are self-evident. By detecting members of the public or staff who have a high temperature, and not permitting them to enter the store or workplace, staff are at less risk of close contact with people who may have COVID-19. But temperature checks are just one small part of stopping the spread of COVID-19, and they can be intrusive. What is the lawful basis for processing such data? Article 6(1)(d) of GDPR allows processing where it:
“…is necessary in order to protect the vital interests of the data subject or of another natural person”
Of course “data concerning health” is also Special Category Data and requires an Article 9 condition to ensure it is lawful. Is a temperature check necessary to comply with employment obligations, for medical diagnosis or for reasons of public health?
All conditions under Articles 6 and 9 must satisfy the test of necessity. There are many causes of a high temperature, not just COVID-19. There have also been doubts over the accuracy of temperature readings: they measure skin temperature, which can vary from core temperature, and do not account for the incubation phase of the disease, during which people may be asymptomatic.
The Information Commissioner’s Office (ICO) has produced guidance on workplace testing which states:
“Data protection law does not prevent you from taking the necessary steps to keep your staff and the public safe and supported during the present public health emergency.
But it does require you to be responsible with people’s personal data and ensure it is handled with care.”
The ICO suggests that “legitimate interests” or “public task” could be used to justify the processing of personal data as part of a workplace testing regime. The former will require a Legitimate Interests Assessment, where the benefit of the data to the organisation is balanced against the risks to the individual. In terms of Article 9, the ICO suggests the employment condition, supplemented by Schedule 1 of the Data Protection Act 2018. The logic here is that employment responsibilities extend to compliance with a wide range of legislation, including health and safety.
More generally, the ICO says that technology which could be considered privacy-intrusive should have a high justification for usage. It should be part of a well thought out plan, which ensures that it is an appropriate means to achieve a justifiable end. Alternatives should also have been fully evaluated. The ICO also states:
“If your organisation is going to undertake testing and process health information, then you should conduct a DPIA focussing on the new areas of risk.”
A Data Protection Impact Assessment should map the flow of the data including collection, usage, retention and deletion as well as the associated risks to individuals.
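One lightweight way to start that mapping is to record each processing activity as structured data, so flows can be reviewed, compared and kept alongside the RoPA. A minimal, hypothetical sketch (the fields and the temperature-check example are illustrative, not an official DPIA template):

```python
from dataclasses import dataclass, field

@dataclass
class DataFlow:
    """One processing activity in a DPIA: what is collected, by whom, and the risks."""
    purpose: str
    data_items: list
    collected_from: str
    used_by: str
    retention_days: int
    risks: list = field(default_factory=list)

temperature_checks = DataFlow(
    purpose="Screen staff entering the building during COVID-19",
    data_items=["name", "temperature reading", "date"],
    collected_from="Staff at the building entrance",
    used_by="Facilities / HR",
    retention_days=14,  # illustrative figure, not ICO guidance
    risks=["readings stored longer than needed", "access not restricted"],
)
print(temperature_checks.purpose)
```

Even a structure this simple forces the DPIA questions to be answered explicitly: every flow must name its purpose, its data items, its owner, its retention period and its risks before it can be recorded at all.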
Some companies are even using thermal cameras as part of COVID-19 testing. The Surveillance Camera Commissioner (SCC) and the ICO have worked together to update the SCC DPIA template, which is specific to surveillance systems.
As shops begin to open and the world establishes post-COVID-19 practices, many employers and retailers will be trying to find their “new normal”. People will also have to decide what they are comfortable with. Temperature checks should be part of a considered approach that evaluates all the regulatory and privacy risks.
Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places left on the course starting on 2nd July.
The first three blog posts in this series have raised many issues about the proposed NHS COVID19 Contact Tracing App (COVID App) including the incomplete DPIA and lack of human rights compliance. In this final post we discuss concerns about how long the data collected by the app will be held and what it will be used for.
From the DPIA and NHSX communications it appears that the purpose of the COVID App is not just to be part of a contact tracing alert system. The app’s Privacy Notice states:
“The information you provide, (and which will not identify you), may also be used for different purposes that are not directly related to your health and care. These include:
Research into coronavirus
Planning of services/actions in response to coronavirus
Monitoring the progress and development of coronavirus
Any information provided by you and collected about you will not be used for any purpose that is not highlighted above.”
Article 89 of the GDPR allows Data Controllers to process personal data for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, subject to appropriate safeguards set out in Section 19 of the Data Protection Act 2018.
NHSX has said that one of the “appropriate safeguards” to be put in place is anonymisation or de-identification of the users’ data; but only if research purposes can be achieved without the use of personal data. However, even anonymised data can be pieced back together to identify individuals especially where other datasets are matched. The Open Rights Group says:
“Claims such as ‘The App is designed to preserve the anonymity of those who use it’ are inherently misleading, yet the term has been heavily relied upon by the authors of the DPIA. On top of that, many statements leave ambiguities…”
There are also legitimate concerns about “function creep”. What exactly does “research into coronavirus” mean? Matthew Gould, the chief executive of NHSX, told MPs the app will evolve over time:
“We need to level with the public about the fact that when we launch it, it will not be perfect and that, as our understanding of the virus develops, so will the app. We will add features and develop the way it works.”
Whilst speaking to the Science and Technology Committee, Gould stated that “We’ve been clear the data will only ever be used for the NHS.” This does not rule out the possibility of private companies getting this data as NHS Data Processors.
Privacy campaigners are also concerned about the length of time the personal data collected by the app will be held, both for contacts and for people who have coronavirus. Neither the DPIA nor the Privacy Notice specifies a data retention period:
“In accordance with the law, personal data will not be kept for longer than is necessary. The exact retention period for data that may be processed relating to COVID-19 for public health reasons has yet to be set (owing to the uncertain nature of COVID-19 and the impact that it may have on the public).
In light of this, we will ensure that the necessity to retain the data will be routinely reviewed by an independent authority (at least every 6 months).”
So, at the time of writing, COVID App users have no idea how long their data will be kept, exactly what it will be used for, or which authority will review it “every six months”. Interestingly, the information collected by the wider NHS Test and Trace programme is going to be kept by Public Health England for 20 years. Who is to say this will not be the case for COVID App users’ data?
Interestingly, none of the 15 risks listed in the original DPIA relating to the COVID App trial (see the second blog in this series) include keeping data for longer than necessary or the lawful basis for retaining it past the pandemic, or what it could be used for in future if more personal data is collected in updated versions of the app. As discussed in the third blog in this series, the Joint Human Rights Committee drafted a Bill which required defined purposes and deletion of all of the data at end of the pandemic. The Secretary of State for Health and Social Care, Matt Hancock, quickly rejected this Bill.
The woolly phrase “personal data will not be kept for longer than is necessary”, and the fact that NHSX admits the COVID App will evolve and may collect more data, give the Government wriggle room to retain COVID App users’ data indefinitely and use it for other purposes. Could it be used as part of a government surveillance programme? Both India and China have made downloading their contact tracing apps a legal requirement, raising concerns about high-tech social control.
To use the App or not?
Would we download the COVID App in its current form? All four blogs in this series show that we are not convinced it is privacy or data protection compliant. Furthermore, there are worries about the wider NHS coronavirus test-and-trace programme. The speed at which it has been set up, concerns raised by people working in it, and the fact that no DPIA has been done further undermine confidence in the whole set-up. Yesterday we learnt that the Open Rights Group is to challenge the government over the amount of data collected and retained by the programme.
Having said all that, we leave it up to readers to decide whether to use the app.
Some privacy experts have been more forthcoming with their views. Phil Booth of @medConfidential calls the Test and Trace programme a “mass data grab” and Paul Bernal, Associate Professor in Law at the University of East Anglia, writes that the Government’s approach – based on secrecy, exceptionalism and deception – means our civic duty may well be to resist the programme actively. Finally if you need a third opinion, Jennifer Arcuri, CEO of Hacker House, has said she would not download the app because “there is no guarantee it’s 100 percent secure or the data is going to be kept secure.” Over to you dear readers!
Will you be downloading the app? Let us know in the comments section below.
Everyone will agree that the government needs to do everything it can to prevent the further spread of the Coronavirus and to “save lives” (except if your name is Dominic Cummings -Ed). However, there is much less consensus about what it should do, and this can be seen in the current debate about the proposal to roll out a contact tracing system and the NHS COVID App. This is the third in a series of blog posts where we examine the COVID App from different perspectives.
On 7 May 2020, the Parliamentary Joint Committee on Human Rights (PJCHR) published its report on the proposed contact tracing system and made a series of important recommendations to address its concerns about the compatibility of the scheme with data protection laws and the Human Rights Act 1998. After waiting for two weeks, the Secretary of State for Health, Matt Hancock, replied to the Committee, rejecting its proposals as “unnecessary”. Let us examine those proposals in detail.
The Human Rights Considerations
Section 6 of the Human Rights Act 1998 makes it unlawful for any public authority (including the UK government and NHSX) to act in a way that is incompatible with a Convention right. Article 8(1) of the ECHR states that “Everyone has the right to respect for his private and family life, his home and his correspondence.” This is not an absolute right. Article 8(2) provides that an interference with the right to privacy may be justified if it:
“…is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”
However, the government also has an obligation to protect the “right to life” enshrined in Article 2 of the ECHR. This means that if the NHS COVID App really can prevent the spread of the virus and save lives, then this is going to be a major consideration in deciding whether the interference with Article 8 is necessary and proportionate.
The PJCHR’s report provides a very detailed assessment of some of the human rights implications of the “centralised” approach the NHS has proposed. Its overall conclusion is that if the app is effective it could help pave the way out of current lockdown restrictions and help prevent the spread of Coronavirus. However, it also concludes that the app, in its current form, raises “significant concerns regarding surveillance and the impact on other human rights which must be addressed first.”
How will the COVID App interfere with the right to privacy?
At first glance it would appear that the COVID App does not involve the transfer of any personal data. As explained in the first blog in this series, app users will be given a unique ID made up of a set of random numbers and the first half of the person’s postcode. The NHS website suggests that this ‘anonymises’ the information. However, as the Parliamentary Report notes, there are parts of England where fewer than 10,000 people live in a postcode area, and as few as 3 or 4 “bits” of other information could be enough to identify individuals. The report also notes that relying on self-reporting alone (without requiring confirmation that a person has tested positive for COVID-19) carries the risk of false alerts, impacting other people’s rights if they have to self-isolate unnecessarily.
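The re-identification risk the report describes is essentially a k-anonymity problem: count how many people share each combination of supposedly anonymous attributes, and any combination held by only one person is unique and potentially re-identifiable. A minimal sketch, with entirely invented data:

```python
from collections import Counter

def group_sizes(records, quasi_identifiers):
    """Count how many records share each combination of quasi-identifier values."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    return Counter(keys)

def unique_records(records, quasi_identifiers):
    """Records whose quasi-identifier combination is unique are re-identifiable."""
    sizes = group_sizes(records, quasi_identifiers)
    return [r for r in records
            if sizes[tuple(r[q] for q in quasi_identifiers)] == 1]

# Illustrative data: a partial postcode plus age band and gender.
people = [
    {"postcode_prefix": "YO8", "age_band": "30-39", "gender": "F"},
    {"postcode_prefix": "YO8", "age_band": "30-39", "gender": "F"},
    {"postcode_prefix": "YO8", "age_band": "70-79", "gender": "M"},
]
risky = unique_records(people, ["postcode_prefix", "age_band", "gender"])
print(len(risky))
```

With only three people, one combination already appears exactly once; in a sparsely populated postcode area, each extra attribute shrinks the groups further until most of them are groups of one.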
An interference with a person’s right to privacy under ECHR Article 8 may be justified under Article 8(2) if it is “in accordance with the law” and is “necessary” for the protection of “health” (see above).
To be in accordance with the law, the app must meet the requirements of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA). However, as noted below, the PJCHR believes that the “current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system”. The Committee’s recommendations in relation to this are considered below.
The remaining human rights consideration is whether the interference with peoples’ private lives is “necessary”. The answer to this depends on whether the use of the app will contribute to reducing the spread of COVID 19 and whether it will save lives.
This in turn depends on whether the app works and on the uptake of the app.
Although it was reported that uptake of the app in the Isle of Wight has exceeded 50% of the population, this falls short of the 60% that the government had previously suggested was necessary for the app to be effective. It is also debatable whether it necessarily follows that the uptake will be the same on the mainland. If the App is not capable of achieving its objective of preventing the spread of the virus, then the interference with peoples’ privacy rights will not be proportionate and will not fulfil the requirement of necessity in Article 8(2).
Although many people will probably download the app without thinking about privacy issues (how often do any of us download apps without checking Privacy Notices?), many others may have some real privacy concerns, particularly after the recent media debates. This has not been helped by reports that Serco (the company contracted to train call centre staff for the contact tracing scheme) has accidentally shared the email addresses of 300 contact tracers. Or by the fact that in other parts of the world there is growing concern about the privacy issues related to the use of contact tracing apps. Uptake of the app may be adversely affected if people lack confidence in the way in which data is being processed and why, and in the light of above they may have concerns about data security.
Consequently, the PJCHR’s report includes a series of recommendations aimed at ensuring that “robust privacy protections” are put in place, as these are key to ensuring the effectiveness of the app.
Central to their recommendations was a proposal that the government introduce legislation to provide legal certainty about how personal data will be processed by the COVID App. Although individuals’ data protection rights are protected by the GDPR and DPA 2018, the Committee believes that it is “nearly impossible” for the public to understand what will happen to their data and also that it is necessary to turn government assurances about privacy into statutory obligations. The PJCHR sent a copy of their draft Bill to Secretary of State, Matt Hancock. However, on 21 May Matt Hancock rejected that proposal on the basis that the existing law provides “the necessary powers, duties and protections” and that participation in contact tracing and use of the app is voluntary.
In contrast, the Australian government has passed additional new privacy protection legislation specifically aimed at the collection, use and disclosure of its COVIDSafe app data.
The Committee’s other recommendations are:
The appointment of a Digital Contact Tracing Human Rights Commissioner to oversee the use, effectiveness and privacy protections of the app and any data associated with digital contact tracing. It calls for the Commissioner to have the same powers as the Information Commissioner. It would appear that Matt Hancock has also rejected this proposal on the basis that there is already sufficient governance in place.
Particular safeguards for children under 18 to monitor children’s use, ensure against misuse and allow for interviews with parents where appropriate. It is noticeable that the Committee has set the age at 18.
The app’s contribution to reducing the severity of the lockdown and to helping to prevent the spread of COVID 19 must be demonstrated and improved at regular intervals for the collection of the data to be reasonable. Therefore the Secretary of State for Health must review the operation of the app and report to Parliament every three weeks.
Transparency. In the second of this series of blog posts, we noted some of the issues relating to the publication of the Data Protection Impact Assessment. The PJCHR calls for this to be made public as it is updated.
Time limited. The data associated with the contact tracing app must be permanently deleted when it is no longer required and may not be kept beyond the duration of the health emergency. However, these terms may be open to some interpretation.
Matt Hancock has written that he will respond to these other issues “in due course”.
It is unclear what this means, but it does not suggest any immediate response.
The Draft Bill
The PJCHR’s draft bill (rejected by Matt Hancock) proposed a number of important provisions, some of which are set out below.
The Bill specifically limited the purpose of the COVID App to:
Protecting the health of individuals who are or may become infected with Coronavirus; and
Preventing or controlling the spread of Coronavirus.
Additionally it contained provisions that prohibited the use of centrally held data without specific statutory authorisation, and limited the amount of time that data could be held on a smart phone to 28 days, followed by automatic deletion unless a person has notified that they have COVID 19 or suspected COVID 19. It also prohibited “data reconstruction” in relation to any centrally held data. The fact that the Bill includes this seems to suggest an implicit recognition that the Unique IDs are not truly anonymous.
The ‘status’ of the NHS COVID App keeps changing and it still remains to be seen when (and if) it will be rolled out. But the Northern Ireland Assembly has already announced it will be working with the Irish government to produce a coordinated response based on a decentralised model. It is reported to be doing this because of the difficulties and uncertainties surrounding the app, and the human rights issues arising from a centralised app.
Yesterday the Prime Minister said England will have a “world-beating” Covid 19 contact tracing system from June. Part of this system is the introduction of the NHS contact tracing app (“the Covid App”) which is currently being trialled on the Isle of Wight.
The app was initially meant to be launched across England in mid-May. Yesterday No.10 suggested this will be done “at a later date.” Why the delay? Well if you look at the recently published Data Protection Impact Assessment (DPIA) for the trial it’s obvious that much more work needs to be done. Here is our analysis of some of the issues raised. (If you are new to this subject we suggest you read the first blog in our series which discussed how such apps work and the different models which can be used.)
Background to the DPIA
The start of the App project has not been auspicious; nor does it instil confidence in the people running it. How can the public, let alone privacy professionals, trust the government when they say that the app will respect their privacy?
The trial of the app started on the Isle of Wight before the Information Commissioner’s Office (ICO) had been given sight of the DPIA. Although they have now seen a copy, the ICO is yet to give a formal opinion. Should the trial have gone ahead in this situation?
As demands grew to see the DPIA, NHSX published it as a .pdf document! However, embedded documents, including the all-important risk register, could not be accessed.
So much for transparency! A few days later the Word version of the DPIA was published, revealing all the documents, but it contained typos and some names were not redacted. More importantly, those scrutinising it raised concerns that “high risks” in the original documentation had been listed as only “medium risks” in the risk register. NHSX quickly removed the Word document and only the .pdf version is now available (here). Again, for the trial to go ahead before all of the promised, finalised and accurate documentation had been released does not engender faith in the app’s ability to protect users’ privacy.
An Ethics Advisory Board has been set up to oversee the Covid App project. In a letter to the Secretary of State for Health and Social Care, the Board spelt out the six principles it expected to be followed: value, impact, security and privacy, accountability, transparency and control.
Some members of the Board have since voiced their concerns to the press over how the Board’s advice has been handled. They were also unhappy not to have seen the final DPIA before being asked to comment.
Parliament’s Joint Committee on Human Rights has also been scrutinising the Covid App. It has said that it is not reassured that the app protects privacy and believes that it could be unlawful if the large amount of data gathered proved ineffectual. The Committee has even taken the unusual step of drafting a bill which would require all of the collected data to be deleted after the pandemic is over. (We will look at what data the NHS wants to keep for research purposes and why in our fourth and final blog in this series.)
These serious concerns raised by experts and parliamentarians will have a big impact on the public uptake of the app.
Privacy by Design
In line with Article 25 of the GDPR, the app’s DPIA states that it was designed, and will continue to evolve, with Privacy by Design principles embedded. These include:
Collecting the minimal amount of data necessary
Data not leaving the device without the permission of the user
Users’ identities obscured to protect their identity
No third-party trackers
Proximity data deleted from users’ phones when no longer required
Users able to delete the app and its data at any time
Personal data not kept for longer than is necessary in the central database
Data in the central database not available to those developing the app other than in exceptional circumstances
Provision of any data from the central database subject to a data protection impact assessment and the establishment of a legal basis for the disclosure
The key part of any DPIA is the risks identified and the mitigation that can be put in place to reduce them where possible. The documented risks in the Covid App include:
Transferring of data outside of the EEA
Misuse of information by those with access
Lack of adequate data processing agreements with relevant data processors
Lack of technical or organisational measures implemented to ensure appropriate security of the personal data
Personal data not being encrypted in transit and/or at rest
Lack of testing which would assess and improve the effectiveness of such technical and organisational measures
Inadequate or misleading transparency information
Misuse of reference code issued by app for test requests and results management
Malicious access to the sonar backend by cyber-attack
Extraction and re-identification of sonar backend data by combination with other data
Identification of infected individual due to minimal contact – e.g. isolated person with carer who is only contact
Malicious or hypochondriac incorrect self-diagnosis on app
Absence of controls over access to app by children
Lower than expected public trust at launch
Uncertainty about whether users will be able to exercise SRRs in relation to data held in the sonar backend
Uncertainty over retention of individual data items
It is surprising that the Covid App DPIA only identifies 15 risks in such a major project involving sharing Special Category Data. To assess all of those risks as low to medium also casts doubt on the robustness of the risk assessments. Recently we heard that wide-ranging security flaws have been flagged by security researchers involved in the Isle of Wight pilot.
There also seems to be a lack of clarity about the data being processed by the app.
In response to the concerns raised, NHSX itself tweeted that the Covid App “does not track location or store any personal information.”
This was quickly objected to by many from the data protection community who disagreed with both assertions and argued the app used pseudonymised data and trackers.
The ICO itself states on its website:
“Recital 26 (of the GDPR) makes it clear that pseudonymised personal data remains personal data and within the scope of the GDPR”.
The DPIA itself, however, does state that pseudonymised data will be used and that it is personal data. The mixed messages coming from NHSX will only continue to cause confusion and once again erode trust.
What is even more worrying is that there are some risks that have not been identified in the original DPIA:
There is a risk that there could be function creep and identification over time as more personal data is added or different research projects add in other identifiable data sets, along with the risk of interpreting smartphone users’ movements and interactions.
Users’ rights for their data to be erased under Article 17 of the GDPR have been completely removed once the data is used for research purposes. We’ll explore this more in a later blog around research.
No decision has yet been made on retention periods for research. The data could be kept for too long and breach the GDPR Principle 5.
The collection of personal data could be unlawful as it may breach the Human Rights Act 1998. If the app does not prove effective, it is arguable that it is not necessary and proportionate for the purpose it was created. More on this in the third blog in this series.
It is also unclear how the NHS risk scoring algorithm works as details have not been published. The Privacy Notice makes no mention of automated processing and is therefore not compliant with Article 13 (2)(f) of the GDPR.
At this moment in time, there are still far too many questions and unaddressed concerns relating to the Covid App to reassure the public that they can download it in good faith, and know exactly what will happen to their data.
Feedback from the privacy community should result in a revised DPIA and further scrutiny. Only after all the questions and concerns have been addressed should the app be launched. Yesterday outsourcing firm Serco apologised after accidentally sharing the email addresses of almost 300 contact tracers. The company is training staff to trace cases of Covid-19 for the UK government!
This is the first in a series of four blog posts, in which Susan Wolf and Lynn Wyeth, take a closer look at the government’s proposed NHS COVID 19 contact tracing app (COVID App) from different perspectives.
On 12 April 2020, the UK Government announced that NHSX, a unit of the NHS responsible for digital innovation, was developing a COVID 19 contact tracing app to help in its attempts to combat the coronavirus pandemic. A trial began on the Isle of Wight on 5 May. This could result in the app being improved before it is used more widely across the UK.
In this first blog we explain what the proposed app will look like, how it will work and how it compares with other contact tracing apps. This will be followed by an analysis of the data protection issues raised by the introduction of the app in the UK. The third blog will examine some of the wider privacy and Human Rights concerns and the fourth blog will look at more detailed issues relating to anonymisation, the use of the data for research purposes and the impact on data subjects’ rights.
What is Contact Tracing?
Contact tracing has been used for many years throughout the world to enable public health organisations to try and identify who people with contagious diseases have been in contact with so that they can be warned that they may be at risk. It has traditionally involved a manual exercise of a health professional working with a diagnosed patient to try and establish who they may have been in close contact with during the infectious period of the disease. However, with the number of smart phone users worldwide surpassing 3.8 billion (more than half the world’s population) mobile phones can provide a much faster and more accurate tracing system.
What is a Contact Tracing App?
A contact tracing app is a smart phone application that automatically warns people if they have been in close contact with someone who later reports that they have COVID 19 symptoms or who has tested positive. App users are allocated a unique identifier that is transmitted by Bluetooth signal from their phone. When they come into close contact with other app users, their unique IDs are exchanged, via Bluetooth, between phones. The Telegraph Newspaper neatly describes it as a form of “digital handshake.”
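The idea of the “digital handshake” can be sketched in a few lines of Python. Everything here (the `Phone` class, `make_ephemeral_id`) is invented purely to illustrate two devices swapping random identifiers; it is not how any real app is implemented.

```python
import secrets

def make_ephemeral_id() -> str:
    """Generate a random identifier of the kind a tracing app might broadcast."""
    return secrets.token_hex(16)

class Phone:
    """A hypothetical app user's phone, logging the IDs it has 'heard'."""
    def __init__(self):
        self.my_id = make_ephemeral_id()
        self.heard = set()

    def handshake(self, other: "Phone"):
        # Both phones record each other's ID, as if exchanged over Bluetooth.
        self.heard.add(other.my_id)
        other.heard.add(self.my_id)

alice, bob = Phone(), Phone()
alice.handshake(bob)
assert bob.my_id in alice.heard and alice.my_id in bob.heard
```

Note that neither phone learns anything beyond a random token: the privacy questions discussed below turn on what happens to these tokens afterwards.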
According to Wikipedia, 15 countries have developed a contact tracing app and many others are in the process.
The Different App Models
What happens to the information that is stored on a contact tracing app user’s phone depends upon the type of app that is being used. In recent weeks it has become clear that contact tracing apps fall into two broad “types” and, according to the Guardian Newspaper on 7th May 2020, the world is split between the so-called decentralised and centralised models. What basically differentiates the two models is the way in which the information that is stored on users’ phones is processed and used to notify others.
The distinguishing feature of the “decentralised model” is that unique IDs are matched on a user’s smart phone and are not transferred to any central server held by a government or private sector organisation. If a user tests positive for COVID 19 they would “inform” the app, which would then identify and notify other app users who have been in close contact with them. The “match” takes place entirely on the user’s smart phone.
When a contact receives a notification this too is entirely private to them. In other words, public health or government organisations are not notified that a user has been in proximity to an infected person. The general perception appears to be that the decentralised model is more “privacy friendly”. According to the Parliamentary Joint Committee on Human Rights, the Information Commissioner’s Office, privacy experts and organisations, as well as the European Parliament and the European Data Protection Board (EDPB) have indicated a preference for a decentralised approach.
Most decentralised models use the Apple and Google application programming interface (“API”), which supports the contact tracing. This is an important point because it allows the interoperability of Bluetooth communication between Apple iPhones and Android phones. The former normally switch off the Bluetooth function when the phone is locked; however this API allows Bluetooth to function even when an iPhone is locked, thus enabling the contact tracing to operate at all times.
In contrast the “centralised model” involves the transfer of information from the users’ smartphones to a remote server operated by a government organisation or by the private sector on their behalf. The central server then determines who is at risk and who should be notified. The perception is that the centralised model is a less privacy friendly option. However it does allow for useful data to be transferred to a public health organisation and used for epidemiological purposes. A recent BBC article provides a useful graphic illustration of the differences between the two models.
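The essential difference between the two models is where the matching logic runs, and can be reduced to a sketch. The function names and data shapes below are our own invention for illustration; real apps are considerably more complex.

```python
# Decentralised: the match happens on the user's own phone.
def on_device_matches(my_contact_log: set, infected_ids: set) -> bool:
    """The phone downloads infected users' IDs and checks locally
    whether any appear in its own contact log. No upload occurs."""
    return bool(my_contact_log & infected_ids)

# Centralised: contact logs are uploaded and a server decides who to alert.
def server_side_alerts(uploaded_logs: dict, reporter: str) -> set:
    """uploaded_logs maps each user's ID to the set of IDs they have
    been near; the server sees everyone's proximity data."""
    return uploaded_logs.get(reporter, set())

# Decentralised check, entirely private to the user:
assert on_device_matches({"id-carol", "id-dan"}, {"id-carol"}) is True
# Centralised check, performed on data the server now holds:
assert server_side_alerts({"id-alice": {"id-carol"}}, "id-alice") == {"id-carol"}
```

The privacy trade-off described above follows directly: in the first function nothing leaves the phone, while in the second the server necessarily accumulates a social graph of who has been near whom.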
The NHS COVID App
The UK NHS COVID App falls into the general category of “centralised” apps. It is still being piloted in the Isle of Wight and is currently the subject of considerable media and political debate.
Once it is finalised the app will be available for smart phone users to download from the Apple or Google stores. Take up will be voluntary. The information below is based on our current understanding of how the app will work, although this may change in the coming weeks.
Once the app is downloaded users need to provide the first half of their postcode but no other personal information. This will be used along with a random string of numbers to provide each user with their own unique ID. We are told that the first part of the postcode is necessary to enable the NHS to see where there are any COVID 19 hotspots.
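As described, each user’s unique ID pairs a postcode district with random material. The actual ID scheme has not been published, so the format below is an assumption purely for illustration.

```python
import secrets

def make_install_id(postcode_prefix: str) -> dict:
    """Pair the first half of a postcode with a random token, as described.

    The real NHS app's ID format is not public; this structure is a guess
    used only to show why the ID is pseudonymous rather than anonymous.
    """
    return {"postcode_district": postcode_prefix,
            "token": secrets.token_hex(16)}

uid = make_install_id("PO30")  # hypothetical Isle of Wight district
assert uid["postcode_district"] == "PO30" and len(uid["token"]) == 32
```

Even in this toy form it is visible that the ID is not pure noise: the postcode district narrows each user down to a geographic area, which is relevant to the anonymisation debate discussed below.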
When NHS COVID App users come into contact with other app users their phones will exchange their unique IDs. The app can use Bluetooth to measure the distance between people who have the app installed on their phones. The NHS website refers to this as “anonymous proximity information.” However it is debatable whether the unique ID is truly anonymised given the extremely high threshold for complete anonymity.
Once this information is stored on the phone nothing will happen for 28 days.
The information will be deleted unless the app user intervenes by notifying the NHS that they have COVID 19 symptoms or have tested positive. Alternatively app users can delete the app, and this will delete all of the data, although any data already transmitted to the NHS via notification will not be deleted by the app user.
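The 28-day rule described above amounts to a simple retention filter. The sketch below assumes each proximity record is a (contact ID, date seen) pair; the app’s real storage format is not public.

```python
from datetime import date, timedelta

RETENTION = timedelta(days=28)  # the retention period described for the NHS app

def purge_expired(proximity_log: list, today: date) -> list:
    """Drop proximity records older than 28 days, as the app is said to do.

    Each record is a (contact_id, date_seen) tuple; anything the user has
    already notified to the NHS is outside this local log and unaffected.
    """
    return [(cid, seen) for cid, seen in proximity_log
            if today - seen <= RETENTION]

log = [("id-1", date(2020, 5, 1)), ("id-2", date(2020, 5, 30))]
kept = purge_expired(log, date(2020, 6, 1))
assert kept == [("id-2", date(2020, 5, 30))]  # the 31-day-old record is gone
```

This mirrors the point made in the text: local deletion is automatic, but it does nothing to erase data that has already been transmitted centrally.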
It has been reported that Apple and Google have refused to make their API available to the NHS to support the use of the NHS app. It remains unclear what the current situation is regarding this.
As it currently stands (and to the best of our knowledge) the app has one central question: “How are you feeling today?” If the app user taps that they are feeling unwell, they are then asked whether they have a high temperature and a persistent cough. If a person indicates that they have both these symptoms, then they are prompted to select a date when the symptoms started.
The ‘centralised’ feature of this app is that if somebody reports that they are ill with COVID 19 or have symptoms, then the NHS will receive the unique ID of the person reporting that they are ill along with the unique IDs of all the other people they have come into proximity with. It is this transfer of data from the app user’s phone to a remote server that makes this system ‘centralised’.
However, it remains unclear whether notification is mandatory or voluntary. According to the NHS website, users can “allow the NHS COVID 19 app to inform the NHS”.
This wording suggests that this notification to the NHS is voluntary. If this is the case, then this raises some concerns about the value of the system since it would appear to depend upon voluntary notification. There are concerns that if people notify on the basis of symptoms alone it could result in over notification. In Germany the contact tracing app will only trigger alerts if users have tested positive for COVID 19.
On receipt of the information the NHS will use a “risk algorithm” to determine whether the people the user has come into contact with need to be notified. If it identifies that other users need to be notified, they will receive an alert.
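Since the details of the NHS risk algorithm have not been published, any concrete scoring is necessarily speculative. The toy heuristic below (weighting contact duration and distance) merely illustrates the kind of server-side decision a centralised model performs; the field names, weights and threshold are all invented.

```python
def contacts_to_alert(report: dict, risk_threshold: float = 0.5) -> list:
    """Score each contact and return those above a (made-up) threshold.

    `report` is a dict: {"reporter": id, "contacts": [{"id": ...,
    "minutes": ..., "metres": ...}, ...]}. The real NHS scoring rules
    are unpublished; this heuristic is purely illustrative.
    """
    alerts = []
    for c in report["contacts"]:
        # Toy heuristic: longer and closer contacts score higher.
        score = min(1.0, c["minutes"] / 30) * (1.0 if c["metres"] < 2 else 0.3)
        if score >= risk_threshold:
            alerts.append(c["id"])
    return alerts

report = {"reporter": "id-9",
          "contacts": [{"id": "id-1", "minutes": 45, "metres": 1},
                       {"id": "id-2", "minutes": 5, "metres": 4}]}
assert contacts_to_alert(report) == ["id-1"]  # brief distant contact not alerted
```

The transparency concern raised later in this post is precisely that, unlike this open sketch, the real weights and thresholds are not public, so users cannot know why they were (or were not) alerted.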
The success of the app relies upon various factors including:
The sufficient take up by members of the public. At the moment it looks like the app will be voluntary. It has been reported that government aides think that the app will need to be downloaded by 60% of the population in order to be effective.
Transport Secretary Grant Shapps said at the daily briefing on Thursday that more than 72,300 out of 140,000 residents in the Isle of Wight have downloaded the app.
The technology working (see above regarding the Apple and Google programming interface).
The willingness of members of the public to notify the app that they have tested positive or have COVID 19 symptoms. The former depends upon the availability of testing facilities and the fast turnaround of test results. In a letter to Health Secretary Matt Hancock, the chairman of the Royal College of GPs said long wait times were “undermining confidence” in the results.
The extent to which members of the public will be willing to install and use the app will no doubt depend on whether members of the public believe that the use of the app will help reduce the spread of the virus and save lives. But for others there will inevitably be concerns about the privacy implications of using the app. Some important questions need to be answered:
What will happen to the data after it has been used?
How long will it be held?
Is there a danger of the data being used for other purposes?
What if use of the app is made a condition for an “immunity passport”?
The answers to these questions will have a big impact on the extent to which the app complies with GDPR and Human Rights law. We will be looking at these questions in more detail in forthcoming blogs. Stay tuned!
During the current coronavirus pandemic, the health and social care sector as well as the emergency services are all providing an amazing service to those who are in need of urgent medical treatment. This will almost always require the sharing of personal data between organisations.
Even during a pandemic, it is important to note that GDPR still applies to ensure individuals’ privacy is protected whilst vital services are provided. On 19 March 2020 the European Data Protection Board issued a statement on the processing of personal data in the context of the COVID 19 outbreak, in which it emphasised this point:
“Data protection rules (such as the GDPR) do not hinder measures taken in the fight against the coronavirus pandemic. The fight against communicable diseases is a valuable goal shared by all nations and therefore, should be supported in the best possible way.
It is in the interest of humanity to curb the spread of diseases and to use modern techniques in the fight against scourges affecting great parts of the world. Even so, the EDPB would like to underline that, even in these exceptional times, the data controller and processor must ensure the protection of the personal data of the data subjects.”
The first data protection principle in Article 5 (1) requires Data Controllers to process personal information “lawfully, fairly and in a transparent manner”. Processing personal data is only lawful if one or more of the six lawful bases listed in Article 6 (1) applies.
If a Data Controller processes personal data about a person’s health (which is a class of Special Category Data) then they must additionally identify one of the ten lawful bases set out in Article 9 (2). These are more detailed than those in Article 6, and are fleshed out further in Schedule 1 of the Data Protection Act 2018. However, there are some overlaps. For example ‘consent’ is a lawful basis in Article 6 (1)(a) and ‘explicit consent’ appears in Article 9(2)(a). Similarly ‘vital interests’ appears in both Articles 6 and 9, however there are differences between the two which we explore below.
Article 6 (1) (d) provides that the processing of personal data is lawful if the processing is necessary to protect the vital interests of the data subject or of another natural person. This raises three points for discussion.
What are vital interests?
When will processing be ‘necessary’?
When can it be used to protect the vital interests of ‘another natural person’?
GDPR Recital 46 specifically refers to processing for the monitoring of epidemics and it seems this lawful basis is intended to be used in situations such as the current pandemic. But what about other interests? Are they vital?
During a recent GDPR workshop one delegate asked whether a person’s financial interests could be classed as a ‘vital interest’ (after all, we all need money to live). The answer is no because the word ‘vital’ is interpreted very narrowly. Recital 46 refers to processing that is “necessary to protect an interest which is essential for the life of the data subject or that of another natural person”. The ICO’s interpretation of this is that this generally only applies where it is necessary to protect someone’s life.
Our Example. Sam becomes acutely ill at work and his employer phones the ambulance service. The employer gives the paramedics Sam’s name and address. The employer can rely on the vital interests lawful basis to share this information. If the paramedics need access to Sam’s health records, then the GP will be able to share them for the same reason but will additionally require an Article 9 lawful basis (see below).
However, in our view vital interests can also include situations where there is a risk of significant harm to life. Therefore if an elderly person is forced to self-isolate and depends upon a group of volunteers collecting their essential prescription medicines, then sharing that person’s name and address is arguably necessary to protect their vital interests.
The processing must be “necessary” in order to protect a person’s vital interests. The key question is whether a Data Controller can reasonably protect a person’s vital interests without the processing (sharing their personal data). If they can, then the processing will not be necessary. If they cannot, then it will be lawful. In the above example, if the employer refused to give the paramedics Sam’s name and address then this could potentially threaten their ability to offer him life-saving treatment. Therefore the sharing of Sam’s personal data is necessary to protect Sam’s vital interests.
Protecting the Vital Interests of Other Persons
Those familiar with the Data Protection Act 1998 will know that the lawful basis in Article 6 (1)(d) is very similar to the one listed in paragraph 4 of Schedule 2 of the 1998 Act. Unlike the old DPA, the GDPR extends this lawful basis to processing that is necessary to protect the vital interests of “another natural person.” However, Recital 46 cautions that “Processing of personal data based on the vital interest of another natural person should in principle take place only where the processing cannot be manifestly based on another legal basis”.
Back to our example. When the paramedics take Sam away in the ambulance, they ask for the names of any employees he may have come into contact with because they are concerned for their health. Can the employer rely on Article 6 (1) (d) to share their names? The answer is no if the employer can find an alternative lawful basis such as consent.
Consequently, as the ICO notes, the processing of one individual’s personal data to protect the vital interests of another is likely to happen only rarely. The ICO gives an example of the processing of a parent’s personal data to protect the vital interests of their child.
What about processing of personal data to save the lives of many others, for instance in a pandemic situation? Recital 46 suggest that this lawful basis may be used to process personal data for this purpose. But it also states that this basis should only be used where processing cannot be based on another legal basis. This could include “legal obligation” or “official authority”.
Special Category Data
A Data Controller sharing health information (or any other Special Category Data) also needs to identify a lawful basis under Article 9 of GDPR. This allows processing if it “is necessary to protect the vital interests of the data subject or of another natural person where the data subject is physically or legally incapable of giving consent.”
This basis is more rigorous than its counterpart in Article 6. It permits the processing of Special Category Data if the processing is necessary to protect the vital interests of the data subject or of another natural person but only “where the data subject is physically or legally incapable of giving consent.” This clearly allows medical practitioners to share health data in emergency medical situations where a patient is unable to consent to it.
If a patient is fit and able (physically and mentally) to give consent, then a Data Controller cannot rely on Article 9 (2)(c).
Example: a volunteer group has compiled a database of the names and addresses of residents who need their prescriptions collecting. They share these names and addresses with volunteers. The group has asked volunteers to log details of any residents who have COVID 19 symptoms in order that they can take steps to protect the lives of the volunteers. The group can only process this information if the person with symptoms explicitly consents to their information being shared (and they understand exactly why their information is being shared). If they are physically able to consent (or refuse to give consent) then the group cannot rely on the vital interests condition.
Although the temptation may be to assume that sharing health data is permissible in the circumstances, the vital interests’ condition in Article 9 (2) (c) has its limits.
Volunteer groups may need to take steps to obtain consent from data subjects and be prepared to explain exactly why they want this information. Article 9 does provide further lawful conditions which may be relevant (Article 9(2)(h) and (i)). We will consider the use of these in a future blog post.
Many established charities and recently formed volunteer groups are also now providing essential support services for those members of the community who are at risk, or vulnerable or in need. In order to do this these services may need to share personal data about such people, and often about their health. Whilst this is laudable, they too must be mindful of the GDPR implications. Our recent blog post about Covid 19 volunteer groups goes into more detail.