Yesterday the Prime Minister said England will have a “world-beating” Covid-19 contact tracing system from June. Part of this system is the introduction of the NHS contact tracing app (“the Covid App”), which is currently being trialled on the Isle of Wight.
The app was initially meant to be launched across England in mid-May. Yesterday No.10 suggested this will now happen “at a later date.” Why the delay? Well, if you look at the recently published Data Protection Impact Assessment (DPIA) for the trial, it’s obvious that much more work needs to be done. Here is our analysis of some of the issues raised. (If you are new to this subject, we suggest you read the first blog in our series, which discussed how such apps work and the different models that can be used.)
Background to the DPIA
The start of the app project has not been auspicious; nor does it instil confidence in the people running it. How can the public, let alone privacy professionals, trust the government’s assurances that the app will respect their privacy?
The trial of the app started on the Isle of Wight before the Information Commissioner’s Office (ICO) had been given sight of the DPIA. Although it has now seen a copy, the ICO has yet to give a formal opinion. Should the trial have gone ahead in this situation?
As demands grew to see the DPIA, NHSX published it as a .pdf document! However, embedded documents, including the all-important risk register, could not be accessed.
So much for transparency! A few days later the Word version of the DPIA was published, revealing all the embedded documents, but it contained typos and some names had not been redacted. More importantly, those scrutinising it raised concerns that “high risks” in the original documentation had been listed as only “medium risks” in the risk register. NHSX quickly removed the Word document and only the .pdf version is now available (here). That the trial went ahead before all of the promised documentation had been finalised and accurately released again does not engender faith in the app’s ability to protect users’ privacy.
An Ethics Advisory Board has been set up to oversee the Covid App project. In a letter to the Secretary of State for Health and Social Care, the Board spelt out the six principles it expected to be followed: value, impact, security and privacy, accountability, transparency and control.
Some members of the Board have since raised concerns with the press over how the Board’s advice has been responded to. They were also unhappy not to have seen the final DPIA before being asked to comment.
Parliament’s Joint Committee on Human Rights has also been scrutinising the Covid App. It has said that it is not reassured that the app protects privacy and believes that it could be unlawful if the large amount of data gathered proved ineffectual. The Committee has even taken the unusual step of drafting a bill which would require all of the collected data to be deleted after the pandemic is over. (We will look at what data the NHS wants to keep for research purposes and why in our fourth and final blog in this series.)
These serious concerns raised by experts and parliamentarians will have a big impact on public uptake of the app.
Privacy by Design
In line with Article 25 of the GDPR, the app’s DPIA states that it was designed, and will continue to evolve, with the Privacy by Design principles embedded. They include:

- collecting the minimal amount of data necessary;
- data not leaving the device without the permission of the user;
- users’ identities obscured to protect their privacy;
- no third-party trackers;
- proximity data deleted from users’ phones when no longer required;
- users can delete the app and its data at any time;
- personal data will not be kept for longer than is necessary in the central database;
- data in the central database will not be available to those developing the app, apart from in exceptional circumstances; and
- provision of any data from the central database will be subject to a data protection impact assessment and the establishment of a legal basis for the disclosure.
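To make two of these principles concrete, on-device storage of proximity data and its timely deletion, here is a minimal illustrative sketch. The class, retention period and field names are our assumptions for illustration only, not the NHSX implementation:

```python
from datetime import datetime, timedelta

# Hypothetical retention window for illustration, not the app's actual value
RETENTION = timedelta(days=28)

class ContactLog:
    """Illustrative on-device store of proximity events (not the NHSX design)."""

    def __init__(self):
        # (timestamp, pseudonymous broadcast id) pairs, held only on the device
        self.events = []

    def record(self, broadcast_id: str, seen_at: datetime):
        # Data minimisation: only the pseudonymous id and time are kept,
        # no location, no direct identifiers
        self.events.append((seen_at, broadcast_id))

    def prune(self, now: datetime):
        # Proximity data deleted once no longer required
        self.events = [(t, b) for (t, b) in self.events if now - t <= RETENTION]

    def delete_all(self):
        # The user can delete the app and its data at any time
        self.events.clear()
```

Under this model nothing leaves the device unless the user chooses to upload; pruning and full deletion are local operations.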
The key part of any DPIA is the identification of risks and the mitigation that can be put in place to reduce them where possible. The documented risks in the Covid App DPIA include:
- Transferring of data outside of the EEA
- Misuse of information by those with access
- Lack of adequate data processing agreements with relevant data processors
- Lack of technical or organisational measures implemented to ensure appropriate security of the personal data
- Personal data not being encrypted in transit or at rest
- Lack of testing which would assess and improve the effectiveness of such technical and organisational measures
- Inadequate or misleading transparency information
- Misuse of reference code issued by app for test requests and results management
- Malicious access to sonar backend by cyber-attack. Extraction and re-identification of sonar backend data by combination with other data
- Identification of infected individual due to minimal contact – e.g. isolated person with carer who is only contact
- Malicious or hypochondriac incorrect self-diagnosis on app
- Absence of controls over access to app by children
- Lower than expected public trust at launch
- Uncertainty about whether users will be able to exercise SRRs in relation to data held in the sonar backend
- Uncertainty over retention of individual data items
It is surprising that the Covid App DPIA identifies only 15 risks in such a major project involving the sharing of Special Category Data. Assessing all of those risks as low to medium also casts doubt on the robustness of the risk assessments. Recently we heard that wide-ranging security flaws have been flagged by security researchers involved in the Isle of Wight pilot.
There also seems to be a lack of clarity about the data being processed by the app.
In response to the concerns raised, NHSX itself tweeted that the Covid App “does not track location or store any personal information.”
This was quickly objected to by many from the data protection community who disagreed with both assertions and argued the app used pseudonymised data and trackers.
The ICO itself states on its website:
“Recital 26 (of the GDPR) makes it clear that pseudonymised personal data remains personal data and within the scope of the GDPR”.
The DPIA itself, however, does state that pseudonymised data will be used and that it is personal data. The mixed messages coming from NHSX will only continue to cause confusion and once again erode trust.
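To see why pseudonymised data remains personal data, consider a minimal sketch. Everything here (the identifier format, the key handling, the hashing scheme) is our assumption for illustration, not the app’s actual design:

```python
import hmac
import hashlib
import secrets

# Hypothetical key held by the data controller (illustrative only)
key = secrets.token_bytes(16)

def pseudonymise(identifier: str) -> str:
    # Replace a direct identifier with a keyed hash (HMAC-SHA256)
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

# The controller typically retains a mapping (or the key itself),
# so the pseudonym can still be linked back to the individual...
lookup = {}
pid = pseudonymise("943 476 5919")   # a made-up NHS-number-style identifier
lookup[pid] = "943 476 5919"

# ...which is exactly why Recital 26 treats pseudonymised data as personal
# data: re-identification remains possible with additional information.
assert lookup[pid] == "943 476 5919"
```

The pseudonym on its own looks meaningless, but as long as anyone holds the key or the mapping, the data is merely obscured, not anonymised, and the GDPR still applies in full.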
What is even more worrying is that there are some risks that have not been identified in the original DPIA:
- There is a risk that there could be function creep and identification over time as more personal data is added or different research projects add in other identifiable data sets, along with the risk of interpreting smartphone users’ movements and interactions.
- Users’ rights for their data to be erased under Article 17 of the GDPR have been completely removed once the data is used for research purposes. We’ll explore this more in a later blog around research.
- There is no mention of any risk associated with the Privacy and Electronic Communications (EC Directive) Regulations 2003/2426.
- No decision has yet been made on retention periods for research. The data could be kept for too long and breach GDPR Principle 5 (storage limitation).
- The collection of personal data could be unlawful as it may breach the Human Rights Act 1998. If the app does not prove effective, it is arguable that it is not necessary and proportionate for the purpose it was created. More on this in the third blog in this series.
- It is also unclear how the NHS risk scoring algorithm works, as details have not been published. The Privacy Notice makes no mention of automated processing and is therefore not compliant with Article 13(2)(f) of the GDPR.
At this moment in time, there are still far too many questions and unaddressed concerns relating to the Covid App to reassure the public that they can download it in good faith, and know exactly what will happen to their data.
Feedback from the privacy community should result in a revised DPIA and further scrutiny. Only after all the questions and concerns have been addressed should the app be launched. Yesterday outsourcing firm Serco apologised after accidentally sharing the email addresses of almost 300 contact tracers. The company is training staff to trace cases of Covid-19 for the UK government!
This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. We have 1 place left on the course starting on 11th June.