Act Now Training is pleased to announce that data protection specialist, Kirsty Squires, has joined its team of associates. Ibrahim Hasan, solicitor and director of Act Now Training, said:
“I am very pleased that Kirsty has joined our team. We were impressed with her ability to explain difficult subjects in a simple jargon-free way with a dash of humour. Her data protection knowledge coupled with real world experience will help us expand our consultancy services and deliver more online and classroom-based workshops to our clients.”
Kirsty is currently the Data Protection Officer for East Northamptonshire Council.
Her job includes developing and delivering bespoke practical training to staff, councillors and partners on subjects including handling complex subject access requests, redaction and publication of data, and GDPR essentials for councillors. Kirsty developed and implemented the council’s GDPR readiness plan, as well as tools for data flow mapping and ROPA/IAR development and maintenance. She continues to support the authority in developing its data protection culture and its response to COVID-19.
Kirsty’s previous roles in project management, transformation and change, and business analysis have often included managing data protection and information governance projects in the public sector. She has worked with a wide range of services including children’s services/social care, ICT security/PSN compliance, community development/safety, social housing, waste management, contracts and procurement and human resources in both the public and private sectors.
Kirsty also supports the development of information governance and data protection practices for the Future Northants Unitary programme. She leads on the training and policy development elements of information governance for the new unitary authorities in Northamptonshire. Kirsty said:
“I am delighted to be joining the Act Now family. Act Now has a great reputation as a leader in the field of information governance. I am particularly excited about adding to their range of interactive online workshops.”
Kirsty will be delivering a free webinar on Managing Data Protection Risk on 13th May 2020. Places are limited so please book early. For those looking for a GDPR qualification from the comfort of their home office, our GDPR Practitioner Certificate course is now available as an online option.
The question concerned a public authority that had commissioned a report from a third party, which it intended to use internally to evaluate various alternative courses of action. The involvement of the third party was limited to the writing and submission of the report. The authority received an EIR request for the report and was considering the application of the internal communications exception in regulation 12(4)(e) of the EIR.
Before exploring this exception, it is worth reminding ourselves of some key points, which should inform our thinking:
The EIR contains an explicit statement that public authorities must apply a presumption in favour of disclosure (EIR reg. 12(2)).
In some respects, the internal communications exception is the most difficult of the EIR exceptions to explain, and that is in part because of its sheer brevity:
“(4) …a public authority may refuse to disclose information to the extent that—
(a) – (d) (not relevant)
(e) the request involves the disclosure of internal communications.”
If we take these words at face value, this exception can apply to any communication within a public authority. In this respect it sits oddly with the stated purpose of the EU Directive, which is that public authorities must disseminate and make available environmental information to the widest possible extent. It also contrasts with the other, more narrowly defined exceptions. Furthermore, because the exception is “class based”, there is no need to demonstrate that any harm will occur by disclosing the information (unlike the exceptions in Regulation 12(5), which are “prejudice based”). However, all EIR exceptions require us to apply the public interest test. This requires us to think about the purpose of the exception and what public interest it seeks to protect. More on this later.
What is a communication?
According to the ICO guidance, the concept of a communication is broad and will encompass any information someone intends to communicate to others, or even places on file (including saving it on an electronic filing system) where others may consult it.
It will therefore include not only letters, memos, and emails, but also notes of meetings or any other documents if these are circulated or filed so that they are available to others.
Information recorded by an author solely for their own use is not likely to be regarded as a communication. The ICO also considers that any documents attached to a communication (for instance email attachments) will constitute communications, but that each should be considered separately in deciding whether it is “internal”. (It is also worth noting that the Aarhus Implementation Guide advises that the exception does not usually apply to factual material. The Guide is not legally binding, but the courts may use it as an aid to interpretation.)
What is “internal”?
The EU Directive does not define internal communications. However, regulation 12(8) states:
“For the purposes of paragraph (4)(e), internal communications includes communications between government departments.”
So, for example, communications between DEFRA and DBERR are capable of falling within this exception, but the following will not:
Communications between a government department and a non-departmental public body. The First-tier Tribunal (Information Rights) (the Tribunal) has held that communications between DEFRA and the Marine Management Organisation, which was deliberately established as a non-departmental public body rather than as a departmental one, were not internal communications (DEFRA v Information Commissioner EA/2012/0105).
Communications between a public authority and another public authority such as communications between a district and a county council (ICO Decision Notice FER0623080).
If a public authority uses its own staff to produce a report and circulates that internally in order to evaluate alternative courses of action or inform policy decision making, then the exception is available, subject to the public interest test.
However, it is common practice amongst many public authorities to engage the services of external experts and consultants to produce reports which are used to enable internal deliberation and decision making. Can the public authority apply the internal communications exception to withhold such reports? As is nearly always the case, there is no definitive answer, because the Tribunal has been reluctant to devise a standard test as to what amounts to an internal or external communication. The answer will largely depend upon the contextual and factual circumstances of the consultant’s engagement (see Secretary of State for Transport v Information Commissioner EA/2008/0052).
The extent to which an external consultant is “embedded” in an organisation will be a major factual and contextual consideration. In the above case Sir Rod Eddington (the retired British Airways CEO) was commissioned to work with the Department for Transport. He was not under a contract, nor was he paid. However, he had a designated office within the department and a business card with the departmental logo on it. He effectively led a team of civil servants and had confidential access to Ministers’ and senior civil servants’ views. Consequently, the Tribunal decided that his draft report was an internal communication and could be withheld under the Regulation 12(4)(e) exception, subject to the public interest test. In another case (Salford City Council v Information Commissioner EA/2015/027), the Tribunal accepted that communications between Salford City Council and a company called Urban Vision were “internal”. On the facts, the company had been formed as a joint venture between the Council, Capita and a construction company to carry out the Council’s planning, building control and highway functions, and the Council communicated with it in the same way as with its internal officers.
In contrast, in South Gloucestershire Council v Information Commissioner and Bovis Homes Ltd (EA/2009/0032) the Tribunal rejected the council’s arguments that a firm of external consultants was integrated into the council. It remarked that the facts in the Department for Transport case were “exceptional”. In this case the council had engaged the consultants in the ordinary way, by means of a contract. Although the consultants worked closely with the council, they were not “embedded” or “integrated” into the organisation. The consultants’ role was important precisely because they brought an independent view from outside.
Returning to the bulletin board question that prompted this blog, the answer must be that it depends on the nature of an organisation’s relationship with the author of the external report. If, as in the South Gloucestershire case, a public authority has commissioned an external consultant to prepare a report that provides an independent view on alternative courses of action, the report is unlikely to benefit from the internal communications exception.
If a consultant is seconded to work within a public authority, as a team player contributing to internal discussion, then there is a possibility that communications may be classed as internal.
The public interest test
The internal communications exception can be used to withhold a very wide range of information. The major constraint on its use lies in the application of the public interest test. The ICO guidance makes it clear that public interest arguments should be focussed on the protection of internal deliberations and decision-making processes. This is because the purpose of this exception is to protect a public authority’s need for a “private thinking space”. This provides a significant limit to the application of the exception. The ICO and the Tribunal will review the disputed “communications” to see whether they relate to the type of policy formulation or decision making that requires full and frank deliberation, for which a safe space is required. Consequently, the exception will not apply to communications that are primarily administrative in nature (see John Kuschnir v Information Commissioner EA/2014/0030).
Many of the cases applying the internal communications exception have concerned central government departments and the need for a safe space in relation to policy formulation. However, other types of organisation can use it too. For example, Basildon Council successfully used the exception to withhold internal communications between two council departments about a planning application. The Tribunal accepted there was a clear need for robust internal debate within a safe space during the period of the planning application, and for all issues to be explored without the full glare of publicity (Rodney Cole v Information Commissioner, EA/2014/0059).
The internal communications exception under EIR is complex. It is important for practitioners to read the ICO guidance and also keep up to date with the latest Tribunal decisions.
This course has been specifically designed to ensure delegates receive all the fantastic features of our location-based course but in a live online learning environment accessible from anywhere in the world. Class sizes are 50% smaller to ensure delegates receive all the attention and support they need to get the best out of the course.
The four days of training are split into three online sessions per day with slides, case studies and exercises. We have also built in one-to-one tutor time with Ibrahim Hasan at the end of each day to provide individual support.
Delegates will receive a comprehensive set of course materials in the post, including our very popular GDPR Handbook. Access to our updated online Resource Lab, which now includes over 20 hours of videos on key aspects of the syllabus, is also part of the course.
Act Now has over 17 years’ experience of designing and delivering online training.
In the run up to the launch of this new course we conducted some online sessions of our classroom-based GDPR certificate course. This is what delegates said:
I would recommend online as an option for this course. It was very interactive.
There was always opportunity to ask questions and chat with other candidates.
We had a bit of fun too which made it even better!
RP, Yorkshire Ambulance Service NHS Trust
I found the online part of the course informative and worthwhile. The speaker maximised delegates’ input and gave feedback on our views which was helpful.
The process of joining the online meetings was perfectly smooth and clear.
NB, Bromsgrove District & Redditch Borough Councils
Ibrahim Hasan said:
“These are difficult times when traditional face-to-face training is not an option.
Our online GDPR Practitioner Certificate ensures that the learning does not stop.
“We have used our experience of training over 6,000 information governance professionals to design an online course that is interactive, cost-effective and which addresses the challenges of an online learning environment. This course is a cost-effective alternative to classroom-based training without compromising on quality.”
Places are limited so be sure to book your place as soon as possible. With a great introductory price of £1,695 + VAT, a saving of £455, there is no better time to seize the opportunity.
The outbreak of the coronavirus, and the sad news about so many people dying every day, is changing all our lives dramatically. As many of us try to come to terms with the ‘new normal’ of staying home and working from home, the data protection and privacy implications of all this are not likely to be at the forefront of our minds. However, this is also a period in which we face some of the greatest erosions of our basic freedoms in terms of legal restrictions (the Health Protection (Coronavirus, Restrictions) (England) Regulations 2020). As more and more people, particularly those with COVID-19 symptoms, the old and the most vulnerable members of society, are forced to self-isolate, local volunteer groups are springing up to support them.
As the BBC noted on 23rd March 2020, more than a thousand volunteer groups have been set up to help the most vulnerable members of their community.
Even though these groups are doing this with the very best of intentions they still need to comply with data protection laws, specifically the General Data Protection Regulation (GDPR). However, as the Information Commissioner’s blog on this subject makes clear, GDPR does not stop these groups from processing and sharing personal data to support people. The ICO has published some general guidance on its approach during this period. It states that it will not “penalise organisations that… need to prioritise other areas or adapt their usual approach during this extraordinary period”. This appears to suggest that the ICO will not take any regulatory action against a volunteer group that is processing personal data to help others during the current crisis.
In this blog we consider why the GDPR applies, and what basic practical steps volunteer groups should take to ensure they do not fall foul of the legislation.
Most volunteer groups will hold at least two lists of people: a list of those who need help (such as the elderly or people at risk) and a list of volunteers. It is likely that the names of people needing help will be shared with those offering help, but they could equally be shared with the emergency services if necessary.
The act of compiling lists of contact details and storing them on a PC or sharing them with group administrators falls squarely within the definition of ‘personal data’ and ‘processing’ in GDPR Article 4. This also means that the volunteer group becomes a Data Controller and must ensure that it complies with the GDPR. Volunteer groups cannot take advantage of the processing “in the course of a purely personal or household activity” exemption in Article 2(2)(c).
The personal data processed is likely to be limited to name, address and telephone number, but it will include Special Category Data (defined in GDPR Article 9) if any health information is recorded. Volunteer groups will need to be careful to ensure they only collect relevant personal data; otherwise they will breach the ‘data minimisation principle’ in GDPR Article 5(1)(c).
The most fundamental requirement of the GDPR is that the processing of personal data is ‘lawful, fair and transparent’ (the first data protection principle in GDPR Article 5(1)(a)). Processing of personal data, even in these extraordinary times, is only lawful if the Data Controller has identified one of the lawful bases in Article 6. Consent is likely to be the most obvious lawful basis. However, people must know exactly what they are consenting to; they must understand why their personal data is being processed and who it might be shared with. Alternatively, the processing may be necessary for the legitimate interests of the Data Controller or other third parties (such as the people receiving help). Given the circumstances surrounding the compilation of local lists, and the difficulties in securing consent, this is likely to be the most flexible and useful lawful basis. However, it also requires groups to consider how the processing impacts the interests and fundamental freedoms of data subjects; in essence, to consider the reasonable expectations of data subjects. So, for example, a person who gives their name because they need help would not expect their name to be widely shared on social media. If a person’s health and safety is at risk, then the volunteer group may be able to rely on the ‘vital interests’ condition in Article 6(1)(d).
Volunteer groups will inevitably collect some health data, which means that in addition to an Article 6 condition, they need to satisfy one of the conditions in GDPR Article 9. The most obvious one, albeit limited, is where the processing (for instance, sharing) is necessary to protect a person’s “vital interests”, such as saving their life. However, this only applies where the data subject is physically incapable of giving consent. For example, if a volunteer knows that an elderly person is not responding to their calls and is concerned that they may be very ill, then they could share this information with the emergency services or the GP. Other possibilities (aside from explicit consent) could include Article 9(2)(i): “processing is necessary for reasons of public interest in the area of public health”. The ICO blog suggests that ‘safeguarding individuals’ is a possibility, but it is not clear which specific Article 9 condition it is referring to.
GDPR practitioners are likely to be very familiar with the transparency requirements in GDPR Articles 12–14. However, small volunteer groups are unlikely to have a website or the time and resources to draft detailed Privacy Notices. Although the ICO’s blog suggests that it is best for groups to have a Privacy Notice (and even provides a link to a template), it also recognises that if this is going to delay vital support then groups can just speak to people. However, it cautions that they need to be clear, honest and open about what they are doing with people’s personal data. Therefore groups may be well advised to produce a short statement when they collect personal data which briefly explains why they are collecting it and how they propose to use it.
We all hope that this crisis will be over very soon and we can get back to our normal lives. However, some volunteer groups may be tempted to continue offering a neighbourly support service. Although this is to be applauded, it raises data protection issues, specifically compliance with the other data protection principles listed in GDPR Article 5. The personal data was collected for a specific purpose and should not be used for other purposes after the crisis has ended, unless those new purposes are compatible with the original purpose (the ‘purpose limitation principle’ in GDPR Article 5(1)(b)).
In any event, the personal data collected should not be kept for longer than is necessary for the purposes for which it was originally processed (the ‘storage limitation principle’ in GDPR Article 5(1)(e)). This means that after the crisis, people who have supplied their contact and even health details have a right to expect that their personal data will be safely destroyed. Personal data must also be accurate and up to date (the ‘accuracy principle’ in GDPR Article 5(1)(d)). This is another reason for destroying the personal data once things get back to normal.
Even with limited resources, volunteer groups must take appropriate steps to protect personal information against unauthorised or unlawful processing and against accidental loss (the ‘integrity and confidentiality principle’ in GDPR Article 5(1)(f)). Only a small number of people should have access to the data, and it should be stored securely. This is particularly important given that much of the data will concern vulnerable people.
Nobody is suggesting that volunteer groups become GDPR experts overnight, but they still need to ensure basic compliance with their GDPR obligations. The ICO has published guidance on its website and a useful set of Q&As.
More on this and other developments in our GDPR webinars. Looking for a GDPR qualification from the comfort of your home office? Our GDPR Practitioner Certificate is now available as an online option.
Could a recent Supreme Court decision on information sharing lead to “terrorists” escaping justice? Part 3 of the Data Protection Act 2018 (DPA) regulates the processing of personal data for law enforcement purposes by Competent Authorities which includes, amongst others, government departments and the police.
The case of Elgizouli (Appellant) v Secretary of State for the Home Department (Respondent) [2020] UKSC 10 is interesting because it examines the application of the GDPR’s less well-known cousin to a complex situation involving the possible extradition of alleged terrorists to the United States. The Supreme Court ruled that the UK acted unlawfully by sharing personal data with the US that could lead to the execution of two British citizens accused of being part of an Islamic State murder squad known as “The Beatles”. Seven justices concluded that the Home Secretary’s 2018 decision breached Part 3 of the DPA.
Shafee Elsheikh and Alexander Kotey are currently in US custody in Iraq having been linked to 27 murders in Syria carried out by “The Beatles”. In June 2015, the US made a mutual legal assistance (MLA) request to the UK in relation to an investigation into the activities of that group. The then Home Secretary, Sajid Javid, requested an assurance that any information the UK supplied would not be used by the US, directly or indirectly, in a prosecution that could lead to the imposition of the death penalty on the two men. The US refused to provide this assurance and, in June 2018, Mr Javid agreed to provide the information anyway.
Elsheikh’s mother, Maha Elgizouli, challenged (by way of judicial review) the Home Secretary’s decision to share that information with the US, not to prevent him from being prosecuted and jailed, but to protect him from the death penalty. Her claim was dismissed by the High Court, which certified two legal questions of public importance for the Supreme Court to answer:
Whether it is unlawful for the Secretary of State to exercise his power to provide MLA so as to supply evidence to a foreign state that will facilitate the imposition of the death penalty in that state on the individual in respect of whom the evidence is sought.
Whether (and if so in what circumstances) it is lawful under Part 3 of the DPA, as interpreted in the light of relevant principles of EU data protection law, for law enforcement authorities in the UK to transfer personal data to law enforcement authorities abroad for use in capital criminal proceedings.
The Supreme Court allowed the appeal. Most of the Justices dismissed the challenge brought under the common law (question 1 above) to the Home Secretary’s decision, but they unanimously held that the decision failed to comply with Part 3 of the DPA (question 2). Data Protection professionals, especially those in law enforcement agencies, will be particularly interested in the court’s analysis of the rules relating to international transfers, as set out in Chapter 5 of the DPA.
Section 73 of the DPA, like Article 49 of the GDPR, prohibits transfers of personal data to a third country unless a number of conditions are met. The second condition is that the transfer:
“(a) is based on an adequacy decision (see section 74),
(b) if not based on an adequacy decision, is based on there being appropriate safeguards (see section 75), or
(c) if not based on an adequacy decision or on there being appropriate safeguards, is based on special circumstances (see section 76)”
The court noted that the transfer in question was not based on an adequacy decision; nor was it based on appropriate safeguards which are set out in Section 75(1):
“A transfer of personal data to a third country or an international organisation is based on there being appropriate safeguards where—
(a) a legal instrument containing appropriate safeguards for the protection of personal data binds the intended recipient of the data, or
(b) the controller, having assessed all the circumstances surrounding transfers of that type of personal data to the third country or international organisation, concludes that appropriate safeguards exist to protect the data.”
The lawfulness of the transfer therefore stands or falls on the “special circumstances” condition in section 73. This will only apply, according to section 76, if the transfer is necessary for any of the following five purposes:
“(a) to protect the vital interests of the data subject or another person,
(b) to safeguard the legitimate interests of the data subject,
(c) for the prevention of an immediate and serious threat to the public security of a member State or a third country,
(d) in individual cases for any of the law enforcement purposes, or
(e) in individual cases for a legal purpose.”
The court ruled that a transfer on the basis of special circumstances can only occur following an assessment of what is strictly necessary. Such an assessment was not made by the Home Secretary before sharing the information with the US. Hence the transfer was unlawful. Lord Carnwath said:
“The decision was based on political expediency, rather than consideration of strict necessity under the statutory criteria.”
Furthermore, in relation to the special circumstances gateway, section 76(2) states:
“Subsection (1)(d) and (e) do not apply if the controller determines that fundamental rights and freedoms of the data subject override the public interest in the transfer”.
Lady Hale found that these “fundamental rights and freedoms” include the rights protected by the European Convention on Human Rights, the most fundamental of which is the right to life. This points towards an interpretation of section 76(2) which, even if an assessment had been made, would not allow the transfer of personal data to facilitate a prosecution which could result in the death penalty for UK citizens.
So there you have it; a very careful analysis by the Supreme Court of the international transfer provisions under Part 3 of the DPA. There must now be a further court decision over what the UK must do to comply with the law, including potentially asking the US to return the shared information. This could lead to the two individuals in question avoiding extradition to the US where they would, if convicted, face the death penalty. Of course, the UK government can still bring them back to the UK to face justice.
This and other developments will be discussed in our forthcoming information law webinars. We have created a policy pack containing essential document templates to help you meet the requirements of Part 3 of the DPA 2018.
Responding to the Covid-19 pandemic is stretching our public services. Most obviously the NHS is diverting all the resources it can to meeting critical health needs. But local authorities are also struggling to maintain vital services in the face of unprecedented demands and staff who, if not already ill and self-isolating, are obliged to comply with social distancing measures. Other public authorities are facing logistical challenges in maintaining services and some are even having to put some staff on HMRC-funded furlough.
In such challenging circumstances, where does dealing with information requests under Freedom of Information and Data Protection laws sit in the scheme of priorities? Many authorities fortunate enough to have staff dedicated to handling FOI requests or data subject access requests will have re-tasked them to undertake more business-critical roles. Where staff have information request handling as only part of their role, other more pressing duties are likely to trump FOI and DP timescales. And where staff are working from home and access to premises is either discouraged or forbidden, manual records may remain inaccessible for weeks or months to come. Where requests are made by post, they may be delivered to offices which will not be staffed for some time.
The response of the Scottish Government has been robust. On 1 April 2020, the Scottish Parliament passed the Coronavirus (Scotland) Bill which, while retaining the statutory requirement to “respond promptly”, extends the timescale for responding to requests under the Freedom of Information (Scotland) Act 2002 from twenty to sixty working days. Moreover, Part 2 of Schedule 6 provides a mechanism for the Scottish Ministers to allow Scottish public authorities to extend the timescale, subject to providing written notice to the applicant, by a further forty working days, where the authority “determines that it is not reasonably practicable to respond to the request within the relevant period because of… (a) the volume and complexity of the information requested, or (b) the overall number of requests being dealt with by the authority at the time that the request is made.”
The emergency legislation also allows the Scottish Information Commissioner to find that a public authority has not failed in its duties under FOISA if he is satisfied that the failure to respond within timescales was due to the impact of coronavirus and reasonable in the circumstances. The Scottish Information Commissioner, for his part, is keen to remind public authorities that their duty to respond promptly remains, that the measures are temporary, and that they do not extend to the Environmental Information (Scotland) Regulations 2004 (EISR).
Of course, the Scottish Parliament cannot legislate with regard to data protection (where EU and UK legislation applies) nor can it amend the timescales for requests under the EISR as they implement the obligations of the Aarhus Convention. But as far as they can do so, the Scottish Government and Parliament have sought to relax the demands of information requests in the face of the pandemic.
For data subject access requests under the GDPR (or s 45 of the Data Protection Act 2018 where they relate to law enforcement processing) and requests under the Freedom of Information Act 2000, there is no relaxation of the law. This was despite calls to do so from some quarters, including the Local Government Association, which called on Parliament to include measures “temporarily relaxing the requirements on councils in regard to GDPR and FOI”. We rely instead on flexibility from the Information Commissioner as regulator.
While the UK Government did not take the opportunity of the Coronavirus Act to extend time limits (and would be unable to do so in any case with regard to GDPR, as we are still in the transition period), the ICO has made clear it will not penalise organisations who have made understandable decisions to prioritise other tasks. As it states on its website, “We are a reasonable and pragmatic regulator, one that does not operate in isolation from matters of serious public concern. Regarding compliance with information rights work when assessing a complaint brought to us during this period, we will take into account the compelling public interest in the current health emergency.”
Organisations should therefore be reassured that they are unlikely to face official censure or significant public criticism if they make reasonable decisions to prioritise other tasks to protect and serve the public ahead of normal levels of service for FOI requests and subject access requests. If your organisation, almost inevitably, is finding it difficult to meet the timescales at this difficult time, we would suggest you take a common-sense and measured approach:
Make a record of your decisions to re-allocate resources from handling information rights requests to other service-delivery priorities;
Document the practical challenges (such as inaccessibility of manual records or post, and unavailability of key colleagues) which mean that it is “reasonable in all the circumstances” that the organisation is not able to meet normal levels of performance;
Manage the expectations of applicants through your website and in your acknowledgements of requests and your automated email responses, and continue to communicate with applicants as far as you are able to do so;
At the point at which your organisation, and the rest of humanity, is beginning to recover from the Covid-19 emergency, develop and document an action plan for addressing any backlog of requests which has built up.
At Act Now, we are passionate about the importance of information rights: they are at the heart of our democracy and our human rights. But the right to life must take priority over others, and we would be the first to recognise that organisations and individuals must make decisions which put people first, particularly at a time of global emergency.
The police have an important role to play in the current coronavirus lockdown. However, their actions must at all times be proportionate, transparent and (above all) lawful. Only yesterday, British Transport Police admitted they had wrongly charged a woman who was fined £660 under coronavirus legislation. Marie Dinou was arrested at Newcastle Central Station on Saturday after she refused to tell police why she needed to travel. A police and Crown Prosecution Service review found she had been charged under the wrong part of the Coronavirus Act. The court will be asked to set the conviction aside.
This is not the only recent incident of the police overstepping the mark. By now most of us will have seen the story about a couple walking their dog in the Peak District. The video was filmed by a drone operated by the Derbyshire Police Drone Unit, and broadcast to the nation on BBC news. According to Derbyshire Police’s Twitter feed (which broadcast the same 90 second footage) the police force wanted to reinforce the government message of ‘stay at home’ and to point out this was not getting through, by effectively ‘shaming’ the couple who were captured on camera.
The video has sparked huge controversy in various circles, including the civil liberties campaign group Big Brother Watch and a leading member of the judiciary. According to the BBC, Big Brother Watch has described the move as ‘sinister and counter-productive’. Former Supreme Court Justice Lord Sumption has also been very critical.
On BBC Radio 4’s World at One, Lord Sumption made it clear that the police have no legal power to enforce Government Ministers’ ‘wishes’ and guidance about non-essential travel. Although the Government has enacted the Coronavirus Act 2020, this does not give the police any powers to stop individuals from making non-essential journeys or walking in isolated places. Lord Sumption’s criticism is most tellingly summed up in the following quotation:
“This is what a police state is like, it is a state in which the government can issue orders or express preferences with no legal authority and the police will enforce ministers’ wishes.”
At Act Now we do not express a view on whether the police have the powers to do this, but we respectfully note Lord Sumption’s view that they did not. Our concern is whether the filming and broadcasting of these individuals was GDPR compliant.
Our conclusion is that it was not.
The use of drones poses a privacy risk. The police force took the decision to process this personal data for its own purposes (“to get the message across”). It is therefore a Data Controller and must comply with the General Data Protection Regulation (GDPR) in relation to this processing. Images of individuals constitute personal data where it is possible to identify them from those images (GDPR Article 4(1)). It is entirely possible that the individuals captured in the Derbyshire police video could be identified by their clothing, hair colour and the presence of their dog.
Drones can be used to film people in many locations, often without the knowledge of those being filmed. In these circumstances, the processing of personal data must be lawful (GDPR Article 5 (1)). It is questionable which Article 6 basis the police could rely on here. Arguably processing is necessary for a ‘task carried out in the public interest’. However one would have to ask why it was necessary to film and broadcast these individuals. The police could not rely on ‘legitimate interests’ because this does not apply to processing carried out by public authorities in performance of their task (GDPR Article 6 (1)(f)).
Even if the police could identify a lawful basis, the next question is whether this processing is fair. The ICO guidance states that Data Controllers should only process data in ways that people would reasonably expect, and not use it in ways that have unjustified adverse effects on them. It is highly unlikely that anybody walking their dog in an isolated part of the Peak District would have any reasonable expectation that they would be secretly filmed by a drone and that their images would be broadcast to the nation in an attempt to shame them. So it seems highly unlikely that this processing is fair.
GDPR also requires transparency when processing personal data. This means data subjects should be made aware that their personal data is being processed and why.
The ‘normal’ transparency requirements (GDPR Articles 12-14) are less onerous for the police when they are processing personal data for law enforcement purposes under Part 3 of the Data Protection Act 2018. However, the police themselves admitted that the filming was for the purposes of ‘getting a message out’, and this does not fit easily within the definition of law enforcement purposes under s.31 DPA 2018. At best the police could try to argue that the processing was for the purposes of preventing threats to public security, but it is difficult to see how this argument would succeed when the footage showed just a couple walking their dog on an isolated stretch of path.
On balance, the Derbyshire Police’s decision to film individuals taking a walk in an isolated area, in order to get a message across about not travelling unnecessarily was at best misguided, and at worst unlawful. The coronavirus is changing almost all aspects of our daily lives, and social distancing and self-isolating are the new norms. However, when the police take action it is still vital that they comply with their legal obligations in relation to the processing of personal data.
The long-awaited decision in the Supreme Court appeal by Morrison Supermarkets was handed down yesterday. WM Morrison Supermarkets plc (Appellant) v Various Claimants (Respondents) [2020] UKSC 12 concerned an appeal by Morrisons against an earlier decision of the Court of Appeal. The latter had agreed with the previous High Court judgment that Morrisons was liable for the actions of its former employee, who stole and then maliciously posted the payroll details of his colleagues online before leaving his job. Employers will now breathe a big sigh of relief. The earlier decisions seemed to suggest that, no matter what precautions an employer takes, it would still be liable for the actions of its rogue employees.
Let’s look at the facts in a bit more detail before turning to the judgment.
In January 2014 a file containing personal details of almost 100,000 Morrisons’ employees was posted on a file sharing website and later a CD, containing a copy of the data, was received by three UK newspapers. The file contained names, addresses, gender, date of birth, home and mobile phone numbers, National Insurance numbers, bank sort codes, bank account numbers and salary details. None of the newspapers published the story and one of them informed Morrisons who called the police after having the file removed from the file sharing website.
Andrew Skelton, a senior IT auditor at Morrisons, who had previously been subject to disciplinary action for another matter, had been tasked with preparing the file for Morrisons’ auditors, when he decided to take his revenge. He was charged with various offences and later sentenced to eight years in prison.
Over 5,000 employees of Morrisons later brought a group legal action for damages. They argued that Morrisons was liable for Skelton’s malicious misuse of their personal data. The judge ruled that Morrisons had not breached the Data Protection Act 1998 (this case started before GDPR came into force) because they had adequate security in place to protect the data, in compliance with the then 7th Data Protection Principle. He ruled that Morrisons was not primarily to blame for the incident but it was vicariously liable for Skelton’s malicious actions as his employer. The judge took account of, amongst other things, the fact that Morrisons had selected Skelton for a trusted position which involved transferring the personal data to their auditors, KPMG. The Court of Appeal agreed.
The case was primarily about the employment law principle of “vicarious liability.” It aimed to answer the question: when is an employer liable for the actions of an employee who deliberately behaves in a way designed to harm their employer and others? Is such an employee still acting within the scope of their employment or “on a frolic of their own”? The facts of the case also meant that data protection officers and lawyers were watching with bated breath and asking, “Can an employer be legally responsible for data breaches caused entirely by their employee?”
The Supreme Court unanimously allowed Morrisons’ appeal. It ruled that whatever Skelton was doing when he disclosed his colleagues’ personal data, he was not acting “in the course of his employment”, and accordingly no vicarious liability could be imposed.
However, Morrisons lost on the argument that the Data Protection Act 1998 (DPA) operated so as to exclude vicarious liability. This principle can also be applied to the GDPR, so employers can “never say never” when it comes to vicarious liability for malicious data breaches by staff. It all depends on the facts of the breach.
This case only went as far as it did because the aggrieved employees failed to show, at first instance, that Morrisons was primarily liable for the data breach. If an employer fails to comply with its security obligations in a manner that is causally relevant to a rogue employee’s actions, it can still be exposed to primary liability under Article 32 of GDPR as well as the 6th Data Protection Principle.
Data Controllers and Processors need to consider doing the following:
Check your data protection and security policies and procedures. Who has access to personal data? Is it based on a need to know? Are they a trusted employee?
If you are like me, and currently self-isolating, then it is entirely possible that you are spending more time than usual browsing the internet, doing online shopping, buying books on your Kindle or watching movies on Amazon Prime. However, if you are looking for something educational (and food for thought) then I would recommend you take the time to watch the Panorama documentary “Amazon: What They Know About Us” screened on BBC 1 on 17th February 2020. You can draw your own conclusions, but for me the documentary made scary viewing and raised so many data protection issues that it made my head ache.
The programme charts the almost exponential growth of Amazon from 1994, when it was an online book seller, to the current position as ‘corporate superpower’.
According to Wikipedia, Amazon is now only the second company in history to reach a market cap of $1 trillion, and Jeff Bezos, Amazon’s Chief Executive and founder, is described as the richest person on the planet. Whilst a great deal of this is already well known, the programme sheds light on Amazon’s more recent entry into other markets, and it is these current and prospective ventures that are particularly concerning from a data protection and privacy perspective.
It’s all about the data
Right from the start, Amazon fully understood the value of personal data. Its mission to be the ‘earth’s most customer centric’ company sounds very positive. However such ‘altruistic’ ambitions disguise the company’s mission of turning our personal data into big bucks. As one commentator, a Harvard Business School Professor notes, users of Amazon are not in fact just customers, they are ‘sources of raw material’ and that raw material is the personal data that Amazon collects every time we interact with it.
So how does Amazon collect so much data?
As early as 1995 Amazon recognised that it could use the data supplied by its online purchasers, through their browsing history and online purchases, to predict what books, music or videos they might be interested in purchasing. Later it appointed computer scientists to use algorithms to record and track all this personal data to create ‘digital DNA profiles’ of customers. By selecting one individual customer it had the capacity to predict ‘everything about that person’ based on what that customer clicked and didn’t click (their click stream histories).
As Amazon expanded into Amazon Market Place it invited other sellers onto the platform, in order to become the “everything store”. Amazon used a standard agreement with third party sellers that enabled them to sell their products on the Amazon platform, but effectively gave Amazon the rights over the sellers’ customer data.
These agreements allowed Amazon to operate as both a retailer and a marketplace and to use customer data from third party sellers to secure a competitive advantage against them. In July 2019, the EU Competition Commission opened up an investigation into the possible anti-competitive behaviour of Amazon, which could result in a possible fine of up to 10% of its annual global turnover under EU competition rules.
Of course, anybody using the Amazon website is entitled to review the company’s Privacy Notice to see what personal data is collected and why it is processed.
However, even to my relatively trained eye this doesn’t really convey the full extent of how much personal data Amazon collects from people whenever they use an Amazon service. One privacy campaigner made a request to Amazon for details of her click stream history (as anyone can do under the right of access using Article 15 of the GDPR). She was shocked to discover that 100 purchases had generated 15,000 pieces of information about her, based on her click stream. Amazon were able to tell which days she had taken holidays, or was sick, or when she couldn’t sleep at night.
The sheer volume of personal data that Amazon collects and processes is demonstrated by the fact that Amazon operates a data warehouse called ‘Helix’ to analyse customers’ personal data ‘over the entirety of their lifetime’. It processes the data of hundreds of millions of people worldwide.
What about Alexa?
The BBC documentary also touches on one question that I have frequently heard people ask: ‘Can Alexa (Amazon’s voice assistant) listen to my conversations?’. The answer is yes. Amazon acknowledges that their workers can listen to anything that you say when the Amazon Echo’s blue light is on, and some of these private conversations are transcribed. If that’s disturbing, then Amazon’s ambitions for Alexa are even more worrying.
Amazon aspires for most things in the home to be Alexa enabled. This could result in the entire activity in the home being recorded. The more people interact with Alexa the more information that Amazon will be able to collect, or as one person said, it wants everything that people do in their homes to be ‘mic’d’ and recorded.
Coupled with this, the company has obtained a patent that will enable Alexa to embed certain ‘sniffer’ algorithms to identify ‘trigger words’, enabling Amazon to send direct marketing messages to Alexa users. Amazon says it has no current plans to do this, but equally it does not rule out the possibility. Commentators say that this increased data collection, particularly collecting data about people in their homes, will enable Amazon to start influencing and shaping people’s behaviour, and this constitutes a real threat to democracy and privacy.
Doorbells and Drones
In 2019 Amazon made nearly $12 billion profit and used some of that profit to buy into other lucrative markets that enable it to collect yet more data about people.
The BBC documentary charts the purchase of ‘Ring’, a manufacturer of smart video doorbells. These doorbells allow users to record anyone who comes to their door, and are marketed as a means of ensuring the security of people’s homes (see Ring UK). However, in practice they are most likely to capture images of friends, neighbours and people delivering goods. (Forgive me for being sceptical, but I wonder how many burglars or intruders are polite enough to ring first.) Amazon is known to have given 1,000 ‘Amazon Ring’ doorbells to three police forces in the UK, and these are being embraced by Suffolk Police for their crime-fighting potential. (Amazon may have provided free doorbells to other police forces but, in response to a BBC freedom of information request, only three police forces confirmed that they had received them.)
At this point you may be thinking that extra home security is a good thing. However, in America Amazon has created a ‘Ring Neighbours app’ that enables ‘ring’ users to share footage with others to create a digital neighbourhood scheme. This data is being shared with 913 US police forces who can obtain the data with the resident’s consent and without a warrant. There are concerns that the app may become available here in the UK.
According to Amazon, the Ring doorbells are not marketed as a surveillance device. However, Tony Porter, the Surveillance Camera Commissioner, considers that if the app were introduced into the UK it would change the dynamic of surveillance from a community form of reassurance to a state form of surveillance.
This clearly needs to be addressed by the Information Commissioner and through the General Data Protection Regulation. Tony Porter states that “we could end up living in a surveillance state.”
Then there is the Prime Air Drone; a delivery aerial drone equipped with cameras and sensors. Two weeks after its launch in 2019, Amazon was granted patent rights to allow it to use delivery drones for aerial home security. Amazon calls this ‘surveillance as a service’ and that the drone would be an ‘opt in’ service. However, even a fully consented opt in by subscribers of this service would not address the privacy issues of others who would inevitably be filmed by such drones. According to the Surveillance Commissioner, this could take us into a whole new area of unregulated territory and a shift into a surveillance state.
Save for some statements by the Surveillance Camera Commissioner, the documentary doesn’t address the data protection issues, in particular whether the activities of Amazon comply with the General Data Protection Regulation (GDPR). However, it quite clearly raises numerous questions about lawful and transparent processing and several other GDPR compliance issues.
Jeff Bezos’ take on this is that Amazon’s use of our data should be for us to decide. The implication is that if users aren’t happy then they don’t need to use Amazon’s services. However, as one former Amazon executive says, people “don’t necessarily see it as Big Brother if it is done carefully”, which probably reflects the fact that most people don’t really know the full extent of what is going on.