Law Firm Fined For GDPR Breach: What Went Wrong? 

On 10th March the Information Commissioner’s Office (ICO) announced that it had fined Tuckers Solicitors LLP £98,000 for a breach of GDPR.

The fine follows a ransomware attack on the firm’s IT systems in August 2020. The attacker encrypted 972,191 files, of which 24,712 related to court bundles. 60 of those were exfiltrated by the attacker and released on the dark web. Some of the files included Special Category Data. Clearly this was a personal data breach, not just because data was released on the dark web, but also because of the unavailability of personal data (through encryption by the attacker), which is also covered by the definition in Article 4 GDPR. Tuckers reported the breach to the ICO as well as to affected individuals through various means, including social media.

The ICO found that between 25th May 2018 (the date the GDPR came into force) and 25th August 2020 (the date on which Tuckers reported the personal data breach), Tuckers had contravened Article 5(1)(f) of the GDPR (the sixth Data Protection Principle, Security) as it failed to process personal data in a manner that ensured appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures. The ICO took as its starting point for calculating the penalty 3.25 per cent of Tuckers’ turnover to 30 June 2020. It could have been worse; the maximum fine for a breach of the Data Protection Principles is 4% of gross annual turnover.

In reaching its conclusions, the Commissioner gave consideration to Article 32 GDPR, which requires a Data Controller, when implementing appropriate security measures, to consider:

 “…the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons”.

What does “state of the art” mean? In this case the ICO considered, in the context of “state of the art”, relevant industry standards of good practice including the ISO 27000 series, the National Institute of Standards and Technology (“NIST”), the various guidance from the ICO itself, the National Cyber Security Centre (“NCSC”), the Solicitors Regulation Authority, Lexcel and NCSC Cyber Essentials.

The ICO concluded that there were a number of areas in which Tuckers had failed to comply with, and to demonstrate compliance with, the Security Principle. Its technical and organisational measures were, over the relevant period, inadequate in the following respects:

Lack of Multi-Factor Authentication (“MFA”)

MFA is an authentication method that requires the user to provide two or more verification factors to gain access to an online resource. Rather than just asking for a username and password, MFA requires one or more additional verification factors, such as a code from a fob or text message, which decreases the likelihood of a successful cyber-attack. Tuckers had not used MFA on its remote access solution, despite its own GDPR policy requiring it to be used where available.
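
By way of illustration only, the sketch below shows the kind of time-based one-time password (TOTP) check that sits behind an authenticator app or key fob. It is a minimal example written for this post, not code from the Tuckers case, and the shared secret is an invented placeholder.

```python
# Minimal TOTP (RFC 6238) sketch: the server and the user's fob/app share a
# secret; the six-digit code changes every 30 seconds, so a stolen password
# alone is not enough to log in.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current one-time code from the shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # current time step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def second_factor_ok(secret_b32: str, submitted: str) -> bool:
    """Check the code the user typed in as their second factor."""
    return hmac.compare_digest(totp(secret_b32), submitted)

# Example with a hypothetical secret: print the code an authenticator app would show now.
print(totp("JBSWY3DPEHPK3PXP"))
```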

Patch Management 

Tuckers told the ICO that part of the reason for the attack was the late application of a software patch to fix a vulnerability. In January 2020 this patch was rated as “critical” by the NCSC and others. However, Tuckers only installed it four months later. 

Failure to Encrypt Personal data

The personal data stored on the archive server that was subject to this attack had not been encrypted. The ICO accepted that encryption may not have prevented the ransomware attack. However, it would have mitigated some of the risks the attack posed to the affected data subjects, especially given the sensitive nature of the data.
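
As a purely illustrative sketch (not the firm’s actual setup), encrypting archived files at rest could look something like the following, assuming the widely used third-party Python cryptography package. The file handling and key management shown here are invented for the example; in practice the key would sit in a separate key store, not alongside the data.

```python
# Minimal "encryption at rest" sketch using the third-party "cryptography"
# package (pip install cryptography). Encryption would not have stopped
# ransomware re-encrypting the files, but exfiltrated copies would be
# unreadable without the key.
from pathlib import Path
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # hypothetical: keep this in a key vault, never next to the data
fernet = Fernet(key)

def encrypt_archive_file(path: Path) -> Path:
    """Replace a plaintext archive file with an encrypted copy."""
    encrypted = path.with_suffix(path.suffix + ".enc")
    encrypted.write_bytes(fernet.encrypt(path.read_bytes()))
    path.unlink()                # remove the plaintext original
    return encrypted

def read_archive_file(path: Path) -> bytes:
    """Decrypt an archived file for legitimate use."""
    return fernet.decrypt(path.read_bytes())
```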

Action Points 

Ransomware is on the rise. Organisations need to strengthen their defences and have plans in place; not just to prevent a cyber-attack but also to respond when one does take place:

  1. Conduct a cyber security risk assessment and consider an external accreditation through Cyber Essentials. The ICO noted that in October 2019, Tuckers was assessed against the Cyber Essentials criteria and found to have failed to meet crucial aspects. The fact that some 10 months later it had still not resolved this issue was, in the Commissioner’s view, sufficient to constitute a negligent approach to data security obligations.
  2. Make sure everyone in your organisation knows the risks of malware/ransomware and follows good security practice. Our GDPR Essentials e-learning solution contains a module on keeping data safe.
  3. Have plans in place for a cyber security breach. See our Managing Personal Data Breaches workshop.

More useful advice can be found in the ICO’s guidance note on ransomware and DP compliance.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in April.


First ICO GDPR Fine Reduced on Appeal


The first GDPR fine issued by the Information Commissioner’s Office (ICO) has been reduced by two thirds on appeal.

In December 2019, Doorstep Dispensaree Ltd, a company which supplies medicines to customers and care homes, was the subject of a Monetary Penalty Notice of £275,000 for failing to ensure the security of Special Category Data. Following an investigation, the ICO ruled that the company had left approximately 500,000 documents in unlocked containers at the back of its premises in Edgware. The ICO launched its investigation after it was alerted by the Medicines and Healthcare Products Regulatory Agency, which was carrying out its own separate enquiry into the company.

The unsecured documents included names, addresses, dates of birth, NHS numbers, medical information and prescriptions belonging to an unknown number of people.
The ICO held that this gave rise to infringements of GDPR’s security and data retention obligations. It also issued an Enforcement Notice after finding, amongst other things, that the company’s privacy notices and internal policies were not up to scratch.

On appeal, the First-tier Tribunal (Information Rights) ruled that the original fine of £275,000 should be reduced to £92,000. It concluded that 73,719 documents had been seized by the MHRA, and not approximately 500,000 as the ICO had estimated. The judge also held that 12,491 of those documents contained personal data and 53,871 contained Special Category Data.

A key learning point from this appeal is that data controllers cannot be absolved of responsibility for personal data simply because data processors breach contractual terms around security. The company argued that, by virtue of Article 28(1) of GDPR, its data destruction company (JPL) had become the data controller of the offending data because it was processing the data otherwise than in accordance with its instructions. In support of this argument it relied on its contractual arrangement with JPL, under which JPL was only authorised to destroy personal data in relation to DDL-sourced excess medication and equipment and had to do so securely and in good time.

The judge said:

“The issue of whether a processor arrogated the role of controller in this context must be considered by reference to the Article 5(2) accountability principle. This provides the controller with retained responsibility for ensuring compliance with the Article 5(1) data processing principles, including through the provision of comprehensive data processing policies. Although it is possible that a tipping point may be reached whereby the processor’s departure from the agreed policies becomes an arrogation of the controller’s role, I am satisfied that this does not apply to the facts of this case.” 

This case shows the importance of data controllers keeping a close eye on data processors especially where they have access to or are required to destroy or store sensitive data. Merely relying on the data processor contract is not enough to avoid ICO enforcement. 

Our GDPR Practitioner Certificate is our most popular certificate course, available both online and in the classroom. We have added more dates.

First GDPR Fine Issued to a Charity


On 8th July 2021, the Information Commissioner’s Office (ICO) fined the transgender charity Mermaids £25,000 for failing to keep the personal data of its users secure. In particular, this amounted to a breach of Articles 5(1)(f) and 32(1) and (2) of the GDPR. 

The ICO found that Mermaids failed to implement an appropriate level of organisational and technical security to its internal email systems, which resulted in documents or emails containing personal data, including in some cases relating to children and/or including in some cases special category data, being searchable and viewable online by third parties through internet search engine results.  

The ICO’s investigation began after it received a data breach report from the charity in relation to an internal email group it set up and used from August 2016 until July 2017 when it was decommissioned. The charity only became aware of the breach in June 2019. 

The ICO found that the group was created with insufficiently secure settings, leading to approximately 780 pages of confidential emails being viewable online for nearly three years. This led to personal data, such as names and email addresses, of 550 people being searchable online. The personal data of 24 of those people was sensitive as it revealed how the person was coping and feeling, with a further 15 classified as Special Category Data as mental and physical health and sexual orientation were exposed. 

The ICO’s investigation found Mermaids should have applied restricted access to its email group and could have considered pseudonymisation or encryption to add an extra layer of protection to the personal data it held.  
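
For readers wondering what pseudonymisation might look like in practice, here is a minimal, hypothetical sketch (nothing to do with Mermaids’ actual systems): direct identifiers are replaced with keyed hashes, and the key is held separately from the data, so a leaked email archive on its own does not identify the individuals concerned.

```python
# Hypothetical pseudonymisation sketch: replace direct identifiers with an
# HMAC tag. Without the secret key (kept in a separate key store), the tag
# cannot be linked back to a person.
import hashlib
import hmac

SECRET_KEY = b"illustrative-key-held-separately"   # invented for this example

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for a name or email address."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

# A stored record carries the pseudonym rather than the person's details.
record = {
    "user": pseudonymise("alex@example.org"),
    "note": "attended support session, July 2017",
}
print(record)
```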

During the investigation the ICO discovered Mermaids had a negligent approach towards data protection with inadequate policies and a lack of training for staff. Given the implementation of the UK GDPR as well as the wider discussion around gender identity, the charity should have revisited its policies and procedures to ensure appropriate measures were in place to protect people’s privacy rights. 

Steve Eckersley, Director of Investigations, said: 

“The very nature of Mermaids’ work should have compelled the charity to impose stringent safeguards to protect the often vulnerable people it works with. Its failure to do so subjected the very people it was trying to help to potential damage and distress and possible prejudice, harassment or abuse. 

“As an established charity, Mermaids should have known the importance of keeping personal data secure and, whilst we acknowledge the important work that charities undertake, they cannot be exempt from the law.” 

Up to April 2021, European data protection regulators had issued approximately €292 million worth of fines under GDPR. The greatest number of fines has been issued by Spain (212), Italy (67) and Romania (52) (source).  

Up to last week, the ICO had only issued four GDPR fines. Whilst fines are not the only GDPR enforcement tool, the ICO has faced criticism for its lack of GDPR enforcement compared to PECR.

The first ICO GDPR fine was issued back in December 2019 to a London-based pharmacy, Doorstep Dispensaree Ltd, which received a Monetary Penalty Notice of £275,000 for failing to ensure the security of Special Category Data. In November 2020, Ticketmaster had to pay a fine of £1.25m following a cyber-attack on its website which compromised millions of customers’ personal information. Other ICO fines include British Airways and Marriott, which concerned cyber security breaches.  

It remains to be seen if the Mermaids fine is the start of more robust GDPR enforcement action by the ICO. It will certainly be a warning to all Data Controllers, particularly charities, to ensure that they have up to date data protection policies and procedures.  

Act Now Training’s GDPR Essentials e-learning course is ideal for frontline staff who need to learn about data protection in a quick and cost-effective way. You can watch the trailer here. 

We only have two places left on our Advanced Certificate in GDPR Practice course starting in September.  

GDPR News Roundup

So much has happened in the world of data protection recently. Where to start?

International Transfers

In April, the European Data Protection Board’s (EDPB) opinions (GDPR and Law Enforcement Directive (LED)) on UK adequacy were adopted. The EDPB has looked at the draft EU adequacy decisions. It acknowledged that there is alignment between the EU and UK laws but also expressed some concerns. It has, though, issued a non-binding opinion recommending their acceptance. If adopted, the two adequacy decisions will run for an initial period of four years. More here.

Last month saw the ICO’s annual data protection conference go online due to the pandemic. Whilst not the same as a face-to-face conference, it was still a good event with lots of nuggets for data protection professionals, including the news that the ICO is working on bespoke UK standard contractual clauses (SCCs) for international data transfers. Deputy Commissioner Steve Wood said: 

“I think we recognise that standard contractual clauses are one of the most heavily used transfer tools in the UK GDPR. We’ve always sought to help organisations use them effectively with our guidance. The ICO is working on bespoke UK standard clauses for international transfers, and we intend to go out for consultation on those in the summer. We’re also considering the value to the UK for us to recognise transfer tools from other countries, so standard data transfer agreements, so that would include the EU’s standard contractual clauses as well.”

Lloyd v Google 

The much-anticipated Supreme Court hearing in the case of Lloyd v Google LLC took place at the end of April. The case concerns the legality of Google’s collection and use of browser generated data from more than 4 million iPhone users during 2011-12 without their consent. Following the two-day hearing, the Supreme Court will now decide, amongst other things, whether, under the DPA 1998, damages are recoverable for ‘loss of control’ of data without needing to identify any specific financial loss, and whether a claimant can bring a representative action on behalf of a group on the basis that the group have the ‘same interest’ in the claim and are identifiable. The decision is likely to have wide-ranging implications for representative actions, what damages can be awarded for and the level of damages in data protection cases. Watch this space!

Ticketmaster Appeal

In November 2020, the ICO fined Ticketmaster £1.25m for a breach of Articles 5(1)(f) and 32 GDPR (security). Ticketmaster appealed the penalty notice on the basis that there had been no breach of the GDPR; alternatively that it was inappropriate to impose a penalty, and that in any event the sum was excessive. The appeal has now been stayed by the First-tier Tribunal until 28 days after the pending judgment in a damages claim brought against Ticketmaster by 795 customers: Collins & Others v Ticketmaster UK Ltd (BL-2019-LIV-000007). 

Age Appropriate Design Code

This code came into force on 2 September 2020, with a 12 month transition period. The Code sets out 15 standards organisations must meet to ensure that children’s data is protected online. It applies to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have the best possible access to online services whilst minimising data collection and use.

With less than four months to go (2 September 2021) the ICO is urging organisations and businesses to make the necessary changes to their online services and products. We are planning a webinar on the code. Get in touch if interested.

AI and Automated Decision Making

Article 22 of GDPR provides protection for individuals against purely automated decisions with a legal or significant impact. In February, the Court of Amsterdam ordered Uber, the ride-hailing app, to reinstate six drivers who it was claimed were unfairly dismissed “by algorithmic means.” The court also ordered Uber to pay compensation to the sacked drivers.

In April, the EU Commission published a proposal for a harmonised framework on AI. The framework seeks to impose obligations on both providers and users of AI. Like the GDPR, the proposal includes fine levels and has extra-territorial effect. (Readers may be interested in our new webinar on AI and Machine Learning.)

Publicly Available Information

Just because information is publicly available does not mean companies have a free pass to use it without consequences. Data protection laws still have to be complied with. In November 2020, the ICO ordered the credit reference agency Experian Limited to make fundamental changes to how it handles personal data within its direct marketing services. The ICO found that significant ‘invisible’ processing took place, likely affecting millions of adults in the UK. It is ‘invisible’ because the individual is not aware that the organisation is collecting and using their personal data. Experian has lodged an appeal against the Enforcement Notice.

Interestingly, the Spanish regulator recently fined another credit reference agency, Equifax, €1m for several failures under the GDPR. Individuals complained about Equifax’s use of their personal data, which was publicly available. Equifax had also failed to provide the individuals with a privacy notice. 

Data Protection by Design

The Irish data protection regulator recently issued its largest domestic fine. The Irish Credit Bureau (ICB) was fined €90,000 after a change to the ICB’s computer code in 2018 resulted in 15,000 accounts having incorrect details recorded about their loans before the mistake was noticed. Amongst other things, the decision found that the ICB infringed Article 25(1) of the GDPR by failing to implement appropriate technical and organisational measures designed to implement the principle of accuracy in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects (aka DP by design and by default). 

Data Sharing 

The ICO’s Data Sharing Code of Practice provides organisations with a practical guide on how to share personal data in line with data protection law. Building on the code, the ICO recently outlined its plans to update its guidance on anonymisation and pseudonymisation, and to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing.

UK GDPR Handbook

The UK GDPR Handbook is proving very popular among data protection professionals.

It sets out the full text of the UK GDPR laid out in a clear and easy to read format. It cross references the EU GDPR recitals, which also now form part of the UK GDPR, allowing for a more logical reading. The handbook uses a unique colour coding system that allows users to easily identify amendments, insertions and deletions from the EU GDPR. Relevant provisions of the amended DPA 2018 have been included where they supplement the UK GDPR. To assist users in interpreting the legislation, guidance from the Information Commissioner’s Office, Article 29 Working Party and the European Data Protection Board is also signposted. Read what others have said:

“A very useful, timely, and professional handbook. Highly recommended.”

“What I’m liking so far is that this is “just” the text (beautifully collated together and cross-referenced Articles / Recital etc.), rather than a pundits interpretation of it (useful as those interpretations are on many occasions in other books).”

“Great resource, love the tabs. Logical and easy to follow.”

Order your copy here.

These and other GDPR developments will also be discussed in detail in our online GDPR update workshop next week.

The Marriott Data Breach Fine


The Information Commissioner’s Office (ICO) has issued a fine to Marriott International Inc for a cyber security breach which saw the personal details of millions of hotel guests being accessed by hackers. The fine does not come as a surprise as it follows a Notice of Intent issued in July 2019. The amount of £18.4 million though is much lower than the £99 million set out in the notice.  

The Data 

Marriott estimates that 339 million guest records worldwide were affected following a cyber-attack in 2014 on Starwood Hotels and Resorts Worldwide Inc. The attack, from an unknown source, remained undetected until September 2018, by which time the company had been acquired by Marriott.  

The personal data involved differed between individuals but may have included names, email addresses, phone numbers, unencrypted passport numbers, arrival/departure information, guests’ VIP status and loyalty programme membership number. The precise number of people affected is unclear as there may have been multiple records for an individual guest. Seven million guest records related to people in the UK. 

The Cyber Attack 

In 2014, an unknown attacker installed a piece of code known as a ‘web shell’ onto a device in the Starwood system giving them the ability to access and edit the contents of this device remotely. This access was exploited in order to install malware, enabling the attacker to have remote access to the system as a privileged user. As a result, the attacker would have had unrestricted access to the relevant device, and other devices on the network to which that account would have had access. Further tools were installed by the attacker to gather login credentials for additional users within the Starwood network.
With these credentials, the database storing reservation data for Starwood customers was accessed and exported by the attacker. 

The ICO acknowledged that Marriott acted promptly to contact customers and the ICO.
It also acted quickly to mitigate the risk of damage suffered by customers. However, it was found to have breached the Security Principle (Article 5(1)(f)) and Article 32 (Security of personal data). The fine only relates to the breach from 25 May 2018, when GDPR came into effect, although the ICO’s investigation traced the cyber-attack back to 2014. 

Data Protection Officers are encouraged to read the Monetary Penalty Notice as it not only sets out the reasons for the ICO’s conclusion but also the factors it has taken into account in deciding to issue a fine and how it calculated the amount.  

It is also essential that DPOs have a good understanding of cyber security. We have some places available on our Cyber Security for DPOs workshop in November. 

The Information Commissioner, Elizabeth Denham, said: 

“Personal data is precious and businesses have to look after it. Millions of people’s data was affected by Marriott’s failure; thousands contacted a helpline and others may have had to take action to protect their personal data because the company they trusted it with had not.”

“When a business fails to look after customers’ data, the impact is not just a possible fine, what matters most is the public whose data they had a duty to protect.” 

Marriott said in a statement:  

“Marriott deeply regrets the incident. Marriott remains committed to the privacy and security of its guests’ information and continues to make significant investments in security measures for its systems. The ICO recognises the steps taken by Marriott following discovery of the incident to promptly inform and protect the interests of its guests.”

Marriott has also said that it does not intend to appeal the fine, but this is not the end of the matter. It is still facing a civil class action in the High Court for compensation on behalf of all those affected by the data breach.  

This is the second highest GDPR fine issued by the ICO. On 16th October, British Airways was fined £20 million, also for a cyber security breach. (You can read more about the causes of cyber security breaches in our recent blog post.) The first fine was issued in December 2019 to Doorstep Dispensaree Ltd for a comparatively small amount of £275,000. 

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate is fully booked. We have added more courses. 

Act Now launches GDPR Policy Pack


The first fine was issued recently under the General Data Protection Regulation (GDPR) by the Austrian data protection regulator. Whilst relatively modest at 4,800 Euros, it shows that regulators are ready and willing to exercise their GDPR enforcement powers.

Article 24 of GDPR emphasises the need for Data Controllers to demonstrate compliance through measures to “be reviewed and updated where necessary”. This includes the implementation of “appropriate data protection policies by the controller.” This can be daunting especially for those beginning their GDPR compliance journey.

Act Now has applied its information governance knowledge and experience to create a GDPR policy pack containing essential documentation templates to help you meet the requirements of GDPR as well as the Data Protection Act 2018. The pack includes, amongst other things, template privacy notices as well as procedures for data security and data breach reporting. Security is a very hot topic after the recent £500,000 fine levied on Equifax by the Information Commissioner under the Data Protection Act 1998.

We have also included template letters to deal with Data Subjects’ rights requests, including subject access. The detailed contents are set out below:

  • User guide
  • Policies
    • Data Protection Policy
    • Special Category Data Processing (DPA 2018)
    • CCTV
    • Information Security
  • Procedures
    • Data breach reporting
    • Data Protection Impact Assessment template
    • Data Subject rights request templates
  • Privacy Notices
    • Business clients and contacts
    • Customers
    • Employees and volunteers
    • Public authority services users
    • Website users
    • Members
  • Records and Tracking logs
    • Information Asset Register
    • Record of Processing Activity (Article 30)
    • Record of Special Category Data processing
    • Data Subject Rights request tracker
    • Information security incident log
    • Personal data breach log
    • Data protection advice log

The documents are designed to be as simple as possible while meeting the statutory requirements placed on Data Controllers. They are available as an instant download (in Word format). Sequential file names make locating each document very easy.

Click here to read sample documents.

The policy pack gives a useful starting point for organisations of all sizes in both the public and private sectors. For only £149 plus VAT (special introductory price) it will save you hours of drafting time. Click here to buy now or visit our website to find out more.

Act Now provides a full GDPR Course programme including one day workshops, e learning, healthchecks and our GDPR Practitioner Certificate. 

GDPR and Employee Surveillance


The regulatory framework around employee surveillance is complex and easy to fall foul of. A few years ago, West Yorkshire Fire Service faced criticism when a 999 operator, who was on sick leave, found a GPS tracker planted on her car by a private detective hired by her bosses.

At present all employers have to comply with the Data Protection Act 1998 (DPA) when conducting surveillance, as they will be gathering and using personal data about living identifiable individuals. Part 3 of the Information Commissioner’s Data Protection Employment Practices Code (Employment Code) is an important document to follow to avoid DPA breaches. It covers all types of employee surveillance from video monitoring and vehicle tracking to email and Internet monitoring.

When the General Data Protection Regulation (GDPR) comes into force (25th May 2018) it will replace the DPA. The general rules applicable to employee monitoring as espoused by the DPA and the Employment Code will remain the same.  However there will be more for employers to do to demonstrate GDPR compliance.

Data Protection Impact Assessment

One of the main recommendations of the Employment Code is that employers should undertake an impact assessment before undertaking surveillance. This is best done in writing and should, amongst other things, consider whether the surveillance is necessary and proportionate to what is sought to be achieved.

Article 35 of GDPR introduces the concept of a Data Protection Impact Assessment (DPIA) (also known as a Privacy Impact Assessment) as a tool, which can help Data Controllers (in this case employers) identify the most effective way to comply with their GDPR obligations. A DPIA is required when the data processing is “likely to result in a high risk to the rights and freedoms of natural persons” (Article 35(1)). Employee surveillance is likely to be high risk according to the criteria set out by the Article 29 Working Party in its recently published draft data protection impact assessment guidelines.

The GDPR sets out the minimum features which must be included in a DPIA:

  • A description of the processing operations and the purposes, including, where applicable, the legitimate interests pursued by the Data Controller.
  • An assessment of the necessity and proportionality of the processing in relation to the purpose.
  • An assessment of the risks to individuals.
  • The measures in place to address risk, including security, and to demonstrate that the Data Controller is complying with GDPR.

Before doing a DPIA, the Data Protection Officer’s advice, if one has been designated, must be sought as well as the views (if appropriate) of Data Subjects or their representatives. In some cases the views of the Information Commissioner’s Office (ICO) may have to be sought as well. In all cases the Data Controller is obliged to retain a record of the DPIA.

Failure to carry out a DPIA when one is required can result in an administrative fine of up to 10 million Euros, or in the case of an undertaking, up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher.

Our recent blog post and forthcoming DPIA webinar will be useful for those conducting DPIAs.

Article 6 – Lawfulness

All forms of processing of personal data (including employee surveillance) have to be lawful by reference to the conditions set out in Article 6 of GDPR (equivalent to Schedule 2 of the DPA). One of these conditions is consent. Article 4(11) states:

‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her;

As discussed in our previous blog post, consent will be more difficult to achieve under GDPR. This is especially so for employers conducting employee surveillance. According to the Information Commissioner’s draft guidance on consent under GDPR:

“consent will not be freely given if there is imbalance in the relationship between the individual and the controller – this will make consent particularly difficult for public authorities and for employers, who should look for an alternative lawful basis.”

Employers (and public authorities) may well need to look for another condition in Article 6 to justify the surveillance. This could include where processing is necessary:

  • for compliance with a legal obligation to which the Data Controller is subject (Article 6(1)(c));
  • for the performance of a task carried out in the public interest or in the exercise of official authority vested in the Data Controller (Article 6(1)(e)); or
  • for the purposes of the legitimate interests pursued by the Data Controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child (Article 6(1)(f)).

Legitimate interests (Article 6(1)(f)) will be a favourite condition amongst employers as usually the surveillance will be done to prevent or detect crime or to detect or stop abuse of the employers’ resources e.g. vehicles, internet and email facilities etc.

Public Authorities

Article 6 states that the legitimate interests condition shall not apply to processing carried out by public authorities in the performance of their tasks. Herein lies a potential problem for, amongst others, local authorities, government departments, and quangos.

Such organisations will have to consider the applicability of the legal obligation and public interests/official authority conditions (Article 6(1)(c) and Article 6(1)(e) respectively). We can expect lots of arguments about what surveillance is in the public interest and when official authority is involved. If the surveillance involves a public authority using covert techniques or equipment to conduct the surveillance, it is easy to assume that Part 2 of the Regulation of Investigatory Powers Act 2000 (“RIPA”) applies and so the latter condition is met. However, the Investigatory Powers Tribunal has ruled in the past that not all covert surveillance of employees is regulated by RIPA (see C v The Police and the Secretary of State for the Home Department, 14th November 2006, No: IPT/03/32/H).

More detail on the RIPA and human rights angle to employee surveillance can be found in our blog post here. More on the DPA angle here.

We also have a specific blog post on the legal implications of social media monitoring as well as a forthcoming webinar.

Transparency

All Data Controllers, including employers, have an obligation to ensure that they are transparent about how they use employees’ information. Consideration will also have to be given as to what extent general information will have to be supplied to employees in respect of the employer’s surveillance activities (see our blog post on Privacy Notices).

Surveillance of employees can be a legal minefield. Our forthcoming webinar on GDPR and employee surveillance will be useful for personnel officers, lawyers, IT staff and auditors who may be conducting or advising on employee surveillance.

 

Act Now can help with your GDPR preparations. We offer a GDPR health check service and our workshops and GDPR Practitioner Certificate (GDPR.Cert) courses are filling up fast.

Ghost in the machine

By Paul Simpkins

Like any normal UK male I like to watch sport on TV. As the season all over Europe comes to a conclusion the titles and cups are being decided. Exactly the wrong time to take a holiday. Why?

Because despite Sky Go and BT allowing you to watch their products on your laptop or other device while you’re away from home, things stop working when you leave the UK. It’s nothing to do with Brexit. Your device works out that you’ve left and suddenly many services that you use frequently start to deny you access for the simple reason that you’re away from home. If you want to watch the destination of the titles and cups you have to hope that you can find a friendly bar with a TV and hope the locals aren’t supporting the team that is playing your team. You may have to consume alcohol and even sing sporting anthems badly but that’s part of the fun.

If you prefer to sit in the safety of your hotel room or rural gite or caravan there is another solution. Buy a wifi session. Your venue will probably sell you one for a few euros and you can watch in peace with a steaming cappuccino. Trouble is, your device may still not allow you to connect to UK channels as it will still think you’re away from home because your IP address identifies your location.

But there’s a solution for that as well. Buy an app that masks your IP address. I’ve used this one.


And it’s worked well. For free it will tell your computer sitting in Bordeaux that it’s really in Manchester so it will be able to watch iPlayer, Sky & BT without a problem. Yabba dabba doo!

Until recently when I purchased a month’s wifi from the site where I am currently staying. The company concerned is called Ozmosis.


It’s full of lovely pictures of people enjoying themselves on holiday (the sunglasses give it away) using their wifi on holiday parks throughout Europe. 8 million users no less. So I bought a month’s wifi from them.

When it came to the Champions League semi-finals I thought I’d watch. It took a while. You have to run Cyberghost and find out that only 2000 free places exist and they count down at about ten a second until, wow, you’re sorted, and watch the IP address emigrating from south west France to Manchester via a slow moving graphic, then eventually log on to BT Sport. Even then it often doesn’t work.

No problem. It was worth the effort. Until the following morning when you try to log on to the internet as usual. It doesn’t work. Suddenly it dawns on you, via a series of messages from Ozmosis, that they’ve identified a streaming service on your computer which violates their terms and conditions and they have terminated your wifi (after 6 of 31 days).

You ring the help line and you have to admit that you’ve been a naughty boy using an IP masking routine; apologise, delete it from your machine and they restore your wifi.

But then you think…

Who are they to say what I can do with their product? I buy it. It connects me to the internet. Can I watch porn channels with it? Can I hack health services all over Europe with it? If I buy product A that enables me to do many things, can the provider of product A stop me from doing B, C, D, E and F with their enabling product? 

If I bought a Kindle and loaded it with racist literature could Amazon stop me reading it?

If I bought a car and was told by the salesman that I couldn’t drive to Chipping Sodbury because they didn’t like the name.

If I bought a mobile phone but was limited in the numbers I could call?

(other off the wall examples sought by the author)

So there you are. I can buy wifi and perform normal functions like check my email or look at my bank account or whatsapp my auntie but not watch Atletico Madrid fail to beat Real Madrid without being penalised by a faceless sysadmin near Montpellier who cuts off a service I’ve paid for because I’m doing something they don’t like.  I have no other option on my campsite. Ozmosis have a monopoly.

OK, millions of people streaming a major football match might use a lot of bandwidth, but that’s what most European males on a campsite want to do. Saying in the T & Cs that you can’t do it makes buying the wifi worthless. Increase your capability, Ozmosis, or get out of the sector (but they’re making zillions of euros so they won’t do that).

I expect a torrent of abuse from normal people who live without watching big sporting events, but living in France for several weeks eating quality food and drinking cheap quality wine and beer while enjoying temperatures 10 degrees higher than the UK needs some mitigation, otherwise it would be Paradise Lost – but that’s another story.

RIPA and Communications Data: IOCCO Annual Report

In October 2015 the Prime Minister appointed Sir Stanley Burnton as the new Interception of Communications Commissioner replacing Sir Anthony May. Sir Stanley’s function is to keep under review the interception of communications and the acquisition and disclosure of communications data by public authorities under the Regulation of Investigatory Powers Act 2000 (RIPA).

Local authorities, as well as other agencies, have powers under Part I Chapter 2 of RIPA to acquire communications data from Communications Service Providers (CSPs). The definition of “communications data” includes information relating to the use of a communications service (e.g. phone, internet, post) but does not include the contents of the communication itself. It is broadly split into 3 categories: “traffic data” i.e. where a communication was made from, to whom and when; “service data” i.e. the use made of the service by any person e.g. itemised telephone records; “subscriber data” i.e. any other information that is held or obtained by a CSP on a person they provide a service to.

Some public authorities have access to all types of communications data e.g. the Police, the Ambulance Service and HM Revenue and Customs. Local authorities are restricted to subscriber and service use data and then only where it is required for the purpose of preventing or detecting crime or preventing disorder. For example, a benefit fraud investigator may be able to obtain an alleged fraudster’s mobile phone bill. As with other RIPA powers, e.g. Directed Surveillance under Part 2, there are forms to fill out and strict tests of necessity and proportionality to satisfy.

On 8th September 2016, Sir Stanley laid his 2015 annual report before Parliament. The report covers the period January to December 2015. Key findings around communications data powers include:

  • 761,702 items of communications data were acquired during 2015.
  • 48% of the items of communications data were traffic data, 2% service use information and 50% subscriber information.
  • 93.7% of the applications for communications data were made by police forces and law enforcement agencies, 5.7% by the intelligence agencies and 0.6% by local authorities and other public authorities.
  • Only 71 local authorities reported using these powers. The majority of these used them on less than 10 occasions.
  • Out of the 975 applications made by local authorities in 2015, Kent County Council made 107 of these whilst five councils made just 1 application each.

A big reason for the low use of these powers by local authorities is that, since 1st November 2012, they have had to obtain Magistrates’ approval for even the simplest communications data applications (e.g. mobile subscriber checks).

Another reason may be that, since December 2015, the Home Office has required councils to go through the National Anti Fraud Network (NAFN) to access communications data rather than make direct applications to CSPs. This has also made the internal SPoC (Single Point of Contact) role redundant. Consequently, the Commissioner no longer conducts inspections of individual local authorities, choosing to inspect NAFN instead.

In March 2015 a new Code of Practice for the Acquisition and Disclosure of Communications Data by public authorities came into force.  It contains several policy changes, which will require careful consideration.

When the Investigatory Powers Bill comes into force it will change the communications data access regime.  Read our blog and watch this space.

Do you make use of these powers and need refresher training? Act Now is running a live one hour webinar on this topic. We also offer a whole host of training in this area. Please visit our website to find out more!

The Investigatory Powers Bill: Implications for Local Authorities

 

The government’s controversial Draft Investigatory Powers Bill was published in early November. Amongst other things, the Bill:

  • Requires web and phone companies to store records of websites visited by every citizen for 12 months for access by police, security services and some public bodies.
  • Makes explicit in law for the first time the Security Services’ powers for the bulk collection of large volumes of personal communications data.
  • Makes explicit in law for the first time the powers of the Security Services and police to hack into and bug computers and phones. It also places a new legal obligation on companies to assist in these operations to bypass encryption.
  • Requires internet and phone companies to maintain “permanent capabilities” to intercept and collect the personal data passing over their networks. They will also be under a wider obligation to assist the security services and the police in the interests of national security.

Much has been written about the civil liberties implications of the new Bill, dubbed “the Snoopers’ Charter.” It has been criticised by the United Nations, the Opposition and civil liberties groups.

A Committee has been formed to consider the key issues raised by the Bill, including whether the powers sought are necessary, whether they are legal and whether they are workable and clearly defined. The Committee is now inviting written evidence to be received by 21st December 2015 (call for evidence).

Some of the questions the Committee are inviting evidence on include:

  • To what extent is it necessary for the security and intelligence services and law enforcement to have access to investigatory powers such as those contained in the draft Bill?
  • Are there sufficient operational justifications for undertaking targeted and bulk interception, and are the proposed authorisation processes for such interception activities appropriate and workable?
  • Should the security and intelligence services have access to powers that allow them to undertake targeted and bulk equipment interference? Should law enforcement also have access to such powers?

The Committee is due to report back by February 2016.

What will the effect be of the Investigatory Powers Bill on local authorities? Is it true that councils will be given powers to view citizens’ internet history (according to the Telegraph)? The answer is no.

Sam Lincoln has written an in-depth analysis of the bill, detailing and dissecting its various points. Please take a look here.

Sam has designed our RIPA E-Learning Package which is an interactive online learning tool, ideal for those who need a RIPA refresher before an OSC inspection. Our 2016 RIPA workshops will include an update on the Bill.
