YMCA Fined for HIV Email Data Breach 

Another day and another ICO fine for a data breach involving email! The Central Young Men’s Christian Association (the Central YMCA) of London has been issued with a Monetary Penalty Notice of £7,500 after emails intended for those on an HIV support programme were sent to 264 email addresses using CC instead of BCC, revealing the addresses to all recipients. As a result, 166 people were identifiable or potentially identifiable. A formal reprimand has also been issued.

Failure to use blind carbon copy (BCC) correctly in emails is one of the top data breaches reported to the ICO every year. In December 2023, the ICO fined the Ministry of Defence (MoD) £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. Again, the failure to use blind copy when sending email was a central cause of the data breach.

Last year the Patient and Client Council (PCC) and the Executive Office were the subject of ICO reprimands for disclosing personal data in this way. In October 2021, HIV Scotland was issued with a £10,000 GDPR fine when it sent an email to 105 people, including patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk. 

Organisations must have appropriate policies and training in place to minimise the risks of personal data being inappropriately disclosed via email. To avoid similar incidents, the ICO recommends that organisations should: 

  1. Consider using other secure means to send communications that involve large amounts of data or sensitive information. This could include using bulk email services, mail merge, or secure data transfer services, so information is not shared with people by mistake. 
  2. Consider having appropriate policies in place and training for staff in relation to email communications. 
  3. For non-sensitive communications, organisations that choose to use BCC should do so carefully to ensure personal email addresses are not shared inappropriately with other customers, clients, or other organisations. 
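The mail-merge approach in the first recommendation can be sketched in a few lines of Python: rather than putting every address in one To/CC (or even BCC) field, build and send one message per recipient, so no recipient can ever see another's address. This is an illustrative sketch only; the sender address, recipient list, and function name are placeholders, not part of any ICO guidance.

```python
from email.message import EmailMessage

# Placeholder sender address for illustration only.
SENDER = "support@example.org"

def build_individual_messages(subject: str, body: str,
                              recipients: list[str]) -> list[EmailMessage]:
    """Build one message per recipient so no recipient can see another's address.

    A single email with every address in To or CC exposes the whole list to
    everyone; sending individually (mail-merge style) removes that risk.
    """
    messages = []
    for address in recipients:
        msg = EmailMessage()
        msg["From"] = SENDER
        msg["To"] = address  # only this recipient's own address appears
        msg["Subject"] = subject
        msg.set_content(body)
        messages.append(msg)
    return messages

# Each message would then be sent separately, e.g. via smtplib.SMTP.send_message().
```

The design point is that the exposure risk is removed structurally, rather than depending on a human remembering to use the BCC field correctly each time.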

More on email best practice can be found in the ICO’s email and security guidance.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

The MoD GDPR Fine: The Dangers of Email 

Inadvertent disclosure of personal data on email systems has been the subject of a number of GDPR enforcement actions by the Information Commissioner’s Office (ICO) in the past few years. In 2021, the transgender charity Mermaids was fined £25,000 for failing to keep the personal data of its users secure. The ICO found that Mermaids failed to implement an appropriate level of security to its internal email systems, which resulted in documents or emails containing personal data being searchable and viewable online by third parties through internet search engine results. 

Failure to use blind carbon copy (BCC) correctly in emails is one of the top data breaches reported to the ICO every year. Last year the Patient and Client Council (PCC) and the Executive Office were the subject of ICO reprimands for disclosing personal data in this way. In October 2021, HIV Scotland was issued with a £10,000 GDPR fine when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.  

The latest GDPR fine was issued in December 2023, although the Monetary Penalty Notice has only just been published on the ICO website. The ICO has fined the Ministry of Defence (MoD) £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. 

On 20th September 2021, the MoD sent an email to a distribution list of Afghan nationals eligible for evacuation using the ‘To’ field, with personal information relating to 245 people being inadvertently disclosed. The email addresses could be seen by all recipients, with 55 people having thumbnail pictures on their email profiles.
Two people ‘replied all’ to the entire list of recipients, with one of them providing their location. 

The original email was sent by the team in charge of the UK’s Afghan Relocations and Assistance Policy (ARAP), which is responsible for assisting the relocation of Afghan citizens who worked for or with the UK Government in Afghanistan.
The data disclosed, should it have fallen into the hands of the Taliban, could have resulted in a threat to life. 

Under the UK GDPR, organisations must have appropriate technical and organisational measures in place to avoid disclosing people’s information inappropriately. ICO guidance makes it clear that organisations should use bulk email services, mail merge, or secure data transfer services when sending any sensitive personal information electronically. The ARAP team did not have such measures in place at the time of the incident and was relying on ‘blind carbon copy’ (BCC), which carries a significant risk of human error. 

The ICO, taking into consideration the representations from the MoD, reduced the fine from a starting amount of £1,000,000 to £700,000 to reflect the action the MoD took following the incidents and recognising the significant challenges the ARAP team faced. Under the ICO’s public sector approach, the fine was further reduced to £350,000.  

Organisations must have appropriate policies and training in place to minimise the risks of personal data being inappropriately disclosed via email. To avoid similar incidents, the ICO recommends that organisations should: 

  1. Consider using other secure means to send communications that involve large amounts of data or sensitive information. This could include using bulk email services, mail merge, or secure data transfer services, so information is not shared with people by mistake. 
  2. Consider having appropriate policies in place and training for staff in relation to email communications. 
  3. For non-sensitive communications, organisations that choose to use BCC should do so carefully to ensure personal email addresses are not shared inappropriately with other customers, clients, or other organisations. 

More on email best practice can be found in the ICO’s email and security guidance.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

HelloFresh fined by the ICO

The Information Commissioner’s Office (ICO) has fined food delivery company HelloFresh £140,000 for a campaign of 79 million spam emails and 1 million spam texts over a seven-month period.

HelloFresh, under its official name Grocery Delivery E-Services UK Limited, was found to have contravened regulation 22 of the Privacy and Electronic Communications Regulations 2003. 

Key points from this case include: 

  1. Inadequate Consent Mechanism: The opt-in statement used by HelloFresh did not specifically mention the use of text messages for marketing. While there was a mention of email marketing, it was ambiguously tied to an age confirmation statement, which could mislead customers into consenting. 
  2. Lack of Transparency: Customers were not properly informed that their data would continue to be used for marketing purposes for up to 24 months after they cancelled their subscriptions with HelloFresh. 
  3. Continued Contact Post Opt-Out: The ICO’s investigation revealed that HelloFresh continued to contact some individuals even after they had explicitly requested for the communications to stop. 
  4. Volume of Complaints: The investigation was triggered by numerous complaints, both to the ICO and through the 7726 spam message reporting service. 
  5. Substantial Fine: As a result of these findings, HelloFresh was fined £140,000. 

Andy Curry, Head of Investigations at the ICO, emphasised the severity of the breach, noting that HelloFresh failed to provide clear opt-in and opt-out information, leading to a bombardment of unwanted marketing communications. The ICO’s decision to impose a fine reflects their commitment to enforce the law and protect customer data rights. 

This case serves as a reminder of the importance of complying with data protection and electronic communications regulations, especially in terms of obtaining clear and informed consent for marketing communications.

Dive deeper into the realm of data protection with our UK GDPR Practitioner Certificate, offering crucial insights into compliance essentials highlighted in this blog. Limited spaces are available for our January cohort – book now to enhance your understanding and navigate data regulations with confidence. 

Clearview AI Wins Appeal Against GDPR Fine 

Last week a Tribunal overturned a GDPR Enforcement Notice and a Monetary Penalty Notice issued to Clearview AI, an American facial recognition company. In Clearview AI Inc v The Information Commissioner [2023] UKFTT 00819 (GRC), the First-Tier Tribunal (Information Rights) ruled that the Information Commissioner had no jurisdiction to issue either notice, on the basis that the GDPR/UK GDPR did not apply to the personal data processing in issue.  

Background 

Clearview is a US based company which describes itself as the “World’s Largest Facial Network”. Its online database contains 20 billion images of people’s faces and data scraped from publicly available information on the internet and social media platforms all over the world. It allows customers to upload an image of a person to its app; the person is then identified by the app checking against all the images in the Clearview database.  

In May 2022 the ICO issued a Monetary Penalty Notice of £7,552,800 to Clearview for breaches of the GDPR including failing to use the information of people in the UK in a way that is fair and transparent. Although Clearview is a US company, the ICO ruled that the UK GDPR applied because of Article 3(2)(b) (territorial scope). It concluded that Clearview’s processing activities “are related to… the monitoring of [UK residents’] behaviour as far as their behaviour takes place within the United Kingdom.” 

The ICO also issued an Enforcement Notice ordering Clearview to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems. (See our earlier blog for more detail on these notices.) 

The Judgement  

The First-Tier Tribunal (Information Rights) has now overturned the ICO’s Enforcement Notice and Monetary Penalty Notice against Clearview. It concluded that although Clearview did carry out data processing related to monitoring the behaviour of people in the UK (Article 3(2)(b) of the UK GDPR), the ICO did not have jurisdiction to take enforcement action or issue a fine. Both the GDPR and the UK GDPR provide that acts of foreign governments fall outside their scope; it is not for one government to seek to bind or control the activities of another sovereign state. However, the Tribunal noted that the ICO could have taken action under the Law Enforcement Directive (Part 3 of the DPA 2018 in the UK), which specifically regulates the processing of personal data in relation to law enforcement. 

Learning Points 

While the Tribunal’s judgement in this case reflects the specific circumstances, some of its findings are of wider application: 

  • The term “behaviour” (in Article 3(2)(b)) means something about what a person does (e.g., location, relationship status, occupation, use of social media, habits) rather than just identifying or describing them (e.g., name, date of birth, height, hair colour). 

  • The term “monitoring” not only comes up in Article 3(2)(b) but also in Article 35(3)(c) (when a DPIA is required). The Tribunal ruled that monitoring includes tracking a person at a fixed point in time as well as on a continuous or repeated basis.

  • In this case, Clearview was not monitoring UK residents directly as its processing was limited to creating and maintaining a database of facial images and biometric vectors. However, Clearview’s clients were using its services for monitoring purposes and therefore Clearview’s processing “related to” monitoring under Article 3(2)(b). 

  • A provider of services like Clearview may be considered a joint controller with its clients where both determine the purposes and means of processing. In this case, Clearview was a joint controller with its clients because it imposed restrictions on how clients could use the services (i.e., only for law enforcement and national security purposes) and determined the means of processing when matching query images against its facial recognition database. 

Data Scraping 

The ruling is not a green light for data scraping, the practice whereby publicly available data, usually from the internet, is collected and processed by companies, often without the Data Subject’s knowledge. The Tribunal ruled that this was an activity to which the UK GDPR could apply. In its press release reacting to the ruling, the ICO said: 

“The ICO will take stock of today’s judgment and carefully consider next steps.
It is important to note that this judgment does not remove the ICO’s ability to act against companies based internationally who process data of people in the UK, particularly businesses scraping data of people in the UK, and instead covers a specific exemption around foreign law enforcement.” 

This is a significant ruling from the First Tier Tribunal which has implications for the extra territorial effect of the UK GDPR and the ICO powers to enforce it. It merits an appeal by the ICO to the Upper Tribunal. Whether this happens depends very much on the ICO’s appetite for a legal battle with a tech company with deep pockets.  

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. 

The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security issues. A number of governments have now taken a view that the video sharing platform represents an unacceptable risk that enables Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures. 

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”
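The Article 8 rule can be expressed as a simple routing check: for users at or above the threshold age, the service may seek the child's own consent; below it, consent must come from a holder of parental responsibility. The sketch below is purely illustrative (the function name and structure are ours, not TikTok's or the ICO's), and real age verification under Article 8(2) is the hard part that the code cannot capture.

```python
from datetime import date

# UK GDPR Article 8(1) threshold for a child's own consent.
PARENTAL_CONSENT_AGE = 13

def consent_route(birth_date: date, today: date) -> str:
    """Return who must give consent for an information society service.

    'child' if the user is 13 or over and may consent themselves;
    'parental_responsibility_holder' if consent must come from a parent/guardian.
    """
    # Compute age in whole years, accounting for whether the birthday
    # has occurred yet this year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return "child" if age >= PARENTAL_CONSENT_AGE else "parental_responsibility_holder"
```

The ICO's criticism of TikTok was precisely that no reliable check of this kind was in place: the declared age could not be trusted, and "reasonable efforts" to verify it were not made.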

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from its platform. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8 the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used, and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is noticeable that this fine is less than half the amount (£27 million) in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently, this potential infringement was not included in the final amount of the fine.

We have been here before! In 2018 British Airways was issued with a Notice of Intent in the sum of £183 million but the actual fine in July 2020 was for £20 million. Marriott International Inc was fined £18.4 million in 2020; much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced (from the notice to the final amount) suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to think about whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently an appeal against the ICO’s fine of £1.35 million issued to Easylife Ltd was withdrawn, after the parties reached an agreement whereby the amount of the fine was reduced to £250,000.

The Children’s Code

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed by Ireland’s Data Protection Commission on Meta) it is likely that more ICO regulatory action will follow. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates

Mega GDPR Fines for Meta

On 4th January 2023, Ireland’s Data Protection Commission (DPC) announced the conclusion of two inquiries into the data processing operations of Meta Platforms Ireland Limited (“Meta Ireland”) in connection with the delivery of its Facebook and Instagram services. Not only does this decision significantly limit Meta’s ability to gather information from its users to tailor and sell advertising, it also provides useful insight into EU regulators’ view about how to comply with Principle 1 of GDPR i.e. the need to ensure personal data is “processed lawfully, fairly and in a transparent manner in relation to the data subject”(Article 5).

In decisions dated 31st December 2022, the DPC fined Meta Ireland €210 million and €180 million, relating to its Facebook and Instagram services respectively. The fines were imposed in connection with the company’s practice of monetising users’ personal data by running personalised adverts on their social media accounts. Information about a social media user’s digital footprint, such as what videos prompt them to stop scrolling or what types of links they click on, is used by marketers to get personalised adverts in front of people who are the most likely to buy their products. This practice helped Meta generate $118 billion in revenue in 2021.

The DPC’s decision was the result of two complaints from Facebook and Instagram users, supported by privacy campaign group NOYB, both of which raised the same basic issue: how Meta obtains legal permission from users to collect and use their personal data for personalised advertising. Article 6(1) of GDPR states that:

“Processing shall be lawful only if and to the extent that at least one of the following applies:

  (a) the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
  (b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;”

In advance of the GDPR coming into force on 25th May 2018, Meta Ireland changed the Terms of Service for its Facebook and Instagram services. It also flagged the fact that it was changing the legal basis on which it relies to process users’ personal data under Article 6 in the context of the delivery of the Facebook and Instagram services (including behavioural advertising). Having previously relied on the consent of users to the processing of their personal data, the company now sought to rely on the “contract” legal basis for most (but not all) of its processing operations. Existing and new users were required to click “I accept” to indicate their acceptance of the updated Terms of Service in order to continue using Facebook and Instagram. The services would not be accessible if users declined to do so.

Meta Ireland considered that, on accepting the updated Terms of Service, a contract was concluded between itself and the user. Consequently, the processing of the user’s personal data in connection with the delivery of its Facebook and Instagram services was necessary for the performance of this “contract”, which included the provision of personalised services and behavioural advertising. This, it claimed, provided a lawful basis by reference to Article 6(1)(b) of the GDPR.

The complainants contended that Meta Ireland was in fact still looking to rely on consent to provide a lawful basis for its processing of users’ data. They argued that, by making the accessibility of its services conditional on users accepting the updated Terms of Service, Meta Ireland was in fact “forcing” them to consent to the processing of their personal data for behavioural advertising and other personalised services. This was not real consent as defined in Article 4 of GDPR:

“any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her;” (our emphasis)

Following comprehensive investigations, consultation with other EU DP regulators (a process required by GDPR in such cases) and final rulings by the European Data Protection Board, the DPC made a number of findings; notably:

1. Meta Ireland did not provide clear information about its processing of users’ personal data, resulting in users having insufficient clarity as to what processing operations were being carried out on their personal data, for what purpose(s), and by reference to which of the six legal bases identified in Article 6. The DPC said this violated Articles 12 (transparency) and 13(1)(c) (information to be provided to the data subject) of GDPR. It also considered it to be a violation of Article 5(1)(a), which states that personal data must be processed lawfully, fairly and transparently.

2. Meta Ireland cannot rely on the contract legal basis for justifying its processing. The delivery of personalised advertising (as part of the broader suite of personalised services offered as part of the Facebook and Instagram services) could not be said to be necessary to perform the core elements of what was said to be a much more limited form of contract. The DPC adopted this position following a ruling by the EDPB, which agreed with other EU regulators’ representations to the DPC.

In addition to the fines, Meta Ireland has been directed to ensure its data processing operations comply with GDPR within a period of 3 months. It has said it will appeal; not surprising considering the decision has the potential to require it to make costly changes to its personalised advertising-based business in the European Union, one of its largest markets. 

It is important to note that this decision still allows Meta to use non-personal data (such as the content of a story) to personalise adverts or to ask users to give their consent to targeted adverts. However under GDPR users should be able to withdraw their consent at any time.  If a large number do so, it would impact one of the most valuable parts of Meta’s business. 

The forthcoming appeals by Meta will provide much-needed judicial guidance on the GDPR, in particular Principle 1. Given the social media giant’s deep pockets, expect this one to run and run.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

ICO Reprimand for Misuse of Children’s Data: A Proportionate Response or a Let Off?

Last week, the Department for Education received a formal reprimand from the Information Commissioner’s Office (ICO) over a “serious breach” of the GDPR involving the unauthorised sharing of up to 28 million children’s personal data. But the Department has avoided a fine, despite a finding of “woeful” data protection practices.

The reprimand followed the ICO’s investigation into the sharing of personal data stored on the Learning Records Service (LRS) database, for which the DfE is the Data Controller. LRS provides a record of pupils’ qualifications that education providers can access. It contains both personal and Special Category Data and at the time of the incident there were 28 million records stored on it. Some of those records would have pertained to children aged 14 and over. 

The ICO started its investigation after receiving a breach report from the DfE about the unauthorised access to the LRS database. The DfE had only become aware of the breach after an exposé in a national Sunday newspaper.

The ICO found that the DfE’s poor due diligence meant that it continued to grant Trustopia access to the database after the company advised the DfE that it was the new trading name of Edududes Ltd, which had been a training provider. Trustopia was in fact a screening company and used the database to provide age verification services to help gambling companies confirm customers were over 18. The ICO ruled that the DfE failed to:

  • protect against the unauthorised processing by third parties of data held on the LRS database for reasons other than the provision of educational services. Data Subjects were unaware of the processing and could not object or otherwise withdraw from this processing. Therefore the DfE failed to process the data fairly and lawfully in accordance with Article 5 (1)(a). 
  • have appropriate oversight to protect against unauthorised processing of personal data held on the LRS database and had also failed to ensure its confidentiality in accordance with Article 5 (1)(f). 

The ICO conducted a simultaneous investigation into Trustopia, during which the company confirmed it no longer had access to the database and the cache of data held in temporary files had been deleted. Trustopia was dissolved before the ICO investigation concluded and therefore regulatory action was not possible.

The DfE has been ordered to implement the following measures to improve its compliance: 

  1. Improve transparency around the processing of the LRS database so Data Subjects are aware and are able to exercise their Data Subject rights, in order to satisfy the requirements of Article 5 (1)(a) of the UK GDPR. 
  2. Review all internal security procedures on a regular basis to identify any additional preventative measures that can be implemented. This would reduce the risk of a recurrence of this type of incident and assist compliance with Article 5 (1)(f) of the UK GDPR. 
  3. Ensure all relevant staff are made aware of any changes to processes as a result of this incident, through effective communication and by providing clear guidance. 
  4. Complete a thorough and detailed Data Protection Impact Assessment, which adequately assesses the risk posed by the processing. This will enable the DfE to identify and mitigate the data protection risks to individuals. 

This investigation could, and many would say should, have resulted in a fine. However, in June 2022 John Edwards, the Information Commissioner, announced a new approach towards the public sector with the aim to reduce the impact of fines on the sector. Had this new trial approach not been in place, the DfE would have been issued with a fine of over £10 million. In a statement, John Edwards said:

“No-one needs persuading that a database of pupils’ learning records being used to help gambling companies is unacceptable. Our investigation found that the processes put in place by the Department for Education were woeful. Data was being misused, and the Department was unaware there was even a problem until a national newspaper informed them.

“We all have an absolute right to expect that our central government departments treat the data they hold on us with the utmost respect and security. Even more so when it comes to the information of 28 million children.

“This was a serious breach of the law, and one that would have warranted a £10 million fine in this specific case. I have taken the decision not to issue that fine, as any money paid in fines is returned to government, and so the impact would have been minimal. But that should not detract from how serious the errors we have highlighted were, nor how urgently they needed addressing by the Department for Education.”

The ICO also followed its new public sector enforcement approach when issuing a reprimand to NHS Blood and Transplant Service. In August 2019, the service inadvertently released untested development code into a live system for matching transplant list patients with donated organs. This error led to five adult patients on the non-urgent transplant list not being offered transplant livers at the earliest possible opportunity. The ICO said that, if the revised enforcement approach had not been in place, the service would have received a fine of £749,856. 

Some would say that the DfE has got off very lightly here and, given its past record, perhaps more stringent sanctions should have been imposed. Two years ago, the ICO criticised the DfE for secretly sharing children’s personal data with the Home Office, triggering fears it could be used for immigration enforcement as part of the government’s hostile environment policy. 

Many will question why the public sector merits this special treatment. It is not as if it has been the subject of a disproportionate number of fines. The first fine to a public authority was only issued in December 2021 (more than three and a half years after GDPR came into force) when the Cabinet Office was fined £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients online. This was recently reduced to £50,000 following a negotiated settlement of a pending appeal.

Compare the DfE reprimand with last month’s Monetary Penalty Notice in the sum of £1,350,000 issued to a private company, Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical conditions and then, without their consent, targeting them with health-related products. With austerity coming back with a vengeance, no doubt the private sector will question the favourable terms for the public sector. 

Perhaps the Government will come to the private sector’s rescue. Following the new DCMS Secretary of State’s speech last month, announcing a plan to replace the UK GDPR with a new “British data protection system” which cuts the “burdens” for British businesses, DCMS officials have said further delays to the Data Protection and Digital Information Bill are on the way. A new public consultation will be launched soon.

So far the EU is not impressed. A key European Union lawmaker has described meetings with the UK government over the country’s data protection reform plans as “appalling.” Italian MEP Fulvio Martusciello from the centre-right European People’s Party said his impression from the visit was that Britain is “giving in on privacy in exchange for business gain.”

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 21st November. 

£4.4 Million GDPR Fine for Construction Company 

This month the UK Information Commissioner’s Office has issued two fines and one Notice of Intent under GDPR. 

The latest fine is three times more than that imposed on Easylife Ltd on 5th October. Yesterday, Interserve Group Ltd was fined £4.4 million for failing to keep personal information of its staff secure.  

The ICO found that the Berkshire based construction company failed to put appropriate security measures in place to prevent a cyber-attack, which enabled hackers to access the personal data of up to 113,000 employees through a phishing email. The compromised data included personal information such as contact details, national insurance numbers, and bank account details, as well as special category data including ethnic origin, religion, details of any disabilities, sexual orientation, and health information. 

The Phishing Email 

In March 2020, an Interserve employee forwarded a phishing email, which was not quarantined or blocked by Interserve’s IT system, to another employee who opened it and downloaded its content. This resulted in the installation of malware onto the employee’s workstation. 

The company’s anti-virus quarantined the malware and sent an alert, but Interserve failed to thoroughly investigate the suspicious activity. If they had done so, Interserve would have found that the attacker still had access to the company’s systems. 

The attacker subsequently compromised 283 systems and 16 accounts, as well as uninstalling the company’s anti-virus solution. Personal data of up to 113,000 current and former employees was encrypted and rendered unavailable. 

The ICO investigation found that Interserve failed to follow up on the original alert of suspicious activity, used outdated software systems and protocols, and had a lack of adequate staff training and insufficient risk assessments, which ultimately left it vulnerable to a cyber-attack. Consequently, Interserve had breached Article 5 and Article 32 of GDPR by failing to put appropriate technical and organisational measures in place to prevent the unauthorised access of people’s information. 

Notice of Intent 

Interestingly, in this case the Notice of Intent (the precursor to the fine) was also for £4.4 million, i.e. no reduction was made by the ICO despite Interserve’s representations. Compare this to the ICO’s treatment of two much bigger companies which also suffered cyber security breaches. In July 2019, British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine was reduced to £20 million in July 2020. In November 2020, Marriott International Inc was fined £18.4 million, much lower than the £99 million set out in the original notice. 

The Information Commissioner, John Edwards, has warned that companies are leaving themselves open to cyber-attack by ignoring crucial measures like updating software and training staff: 

“The biggest cyber risk businesses face is not from hackers outside of their company, but from complacency within their company. If your business doesn’t regularly monitor for suspicious activity in its systems and fails to act on warnings, or doesn’t update software and fails to provide training to staff, you can expect a similar fine from my office. 

Leaving the door open to cyber attackers is never acceptable, especially when dealing with people’s most sensitive information. This data breach had the potential to cause real harm to Interserve’s staff, as it left them vulnerable to the possibility of identity theft and financial fraud.” 

We have been here before. On 10th March the ICO fined Tuckers Solicitors LLP £98,000 following a ransomware attack on the firm’s IT systems in August 2020. The attacker had encrypted 972,191 files, of which 24,712 related to court bundles. Sixty of those were exfiltrated by the attacker and released on the dark web. 

Action Points  

Organisations need to strengthen their defences and have plans in place; not just to prevent a cyber-attack, but also for what to do when one does take place. Here are our top tips: 

  1. Conduct a cyber security risk assessment and consider an external accreditation through Cyber Essentials. 
  2. Ensure your employees know the risks of malware/ransomware and follow good security practice. At the time of the cyber-attack, one of the two Interserve employees who received the phishing email had not undertaken data protection training. (Our GDPR Essentials e-learning course is a very cost-effective solution which contains a specific module on keeping data safe.)  
  3. Have plans in place for a cyber security breach. See our Managing Personal Data Breaches workshop.  
  4. Earlier in the year, the ICO worked with the NCSC to remind organisations not to pay a ransom in the event of a cyber-attack, as it does not reduce the risk to individuals and is not considered a reasonable step to safeguard data. For more information, take a look at the ICO ransomware guidance or visit the NCSC website to learn about mitigating a ransomware threat via their business toolkit.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 21st November.  

ICO Fines “World’s Largest Facial Network”

The Information Commissioner’s Office has issued a Monetary Penalty Notice of £7,552,800 to Clearview AI Inc for breaches of the UK GDPR. 

Clearview is a US based company which describes itself as the “World’s Largest Facial Network”. It allows customers, including the police, to upload an image of a person to its app, which is then checked against all the images in the Clearview database. The app then provides a list of matching images with links to the websites they came from. 

Clearview’s online database contains 20 billion images of people’s faces and data scraped from publicly available information on the internet and social media platforms all over the world. This service was used on a free trial basis by a number of UK law enforcement agencies. The trial was discontinued and the service is no longer being offered in the UK. However, Clearview has customers in other countries, so the ICO ruled that it is still processing the personal data of UK residents.

The ICO was of the view that, given the high number of UK internet and social media users, Clearview’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge. It found the company had breached the UK GDPR by:

  • failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way;
  • failing to have a lawful reason for collecting people’s information;
  • failing to have a process in place to stop the data being retained indefinitely;
  • failing to meet the higher data protection standards required for biometric data (Special Category Data);
  • asking for additional personal information, including photos, when asked by members of the public if they are on their database. This may have acted as a disincentive to individuals who wish to object to their data being collected and used.

The ICO has also issued an enforcement notice ordering Clearview to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.

The precise legal basis for the ICO’s fine will only be known when (hopefully not if) it decides to publish the Monetary Penalty Notice. The information we have so far suggests that it considered breaches of Article 5 (1st and 5th Principles – lawfulness, transparency and data retention), Article 9 (Special Category Data) and Article 14 (privacy notice), amongst others. (UPDATE – the notice has now been published here)

Whilst substantially lower than the £17 million Notice of Intent, issued in November 2021, this fine shows that the new Information Commissioner, John Edwards, is willing to take on at least some of the big tech companies. 

The ICO enforcement action comes after a joint investigation with the Office of the Australian Information Commissioner (OAIC). The latter also ordered the company to stop processing citizens’ data and delete any information it held. France and Italy have also sanctioned the company under the EU GDPR, as has Canada under its own privacy law. 

So what next for Clearview? The ICO has very limited means to enforce a fine against foreign entities.  Clearview has no operations or offices in the UK so it could just refuse to pay. This may be problematic from a public relations perspective as many of Clearview’s customers are law enforcement agencies in Europe who may not be willing to associate themselves with a company that has been found to have breached EU privacy laws. 

When the Italian DP regulator fined Clearview €20m (£16.9m) earlier this year, it responded by saying it did not operate in any way that brought it under the jurisdiction of the EU GDPR. Could it argue the same in the UK, where it also has no operations, customers or headquarters? Students of our UK GDPR Practitioner certificate course will know that the answer lies in Article 3(2), which sets out the extra-territorial effect of the UK GDPR:

This Regulation applies to the relevant processing of personal data of data subjects who are in the United Kingdom by a controller or processor not established in the United Kingdom where the processing activities are related to:

  1. the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the United Kingdom; or
  2. the monitoring of their behaviour as far as their behaviour takes place within the United Kingdom. [our emphasis]

Whilst clearly Clearview (no pun intended) is not established in the UK, the ICO is of the view it is covered by the UK GDPR due to Article 3(2). See the statement of the Commissioner, John Edwards:

“Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images. The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.”

If Clearview does appeal, we will hopefully receive judicial guidance about the territorial scope of the UK GDPR. 

UPDATE (19/10/22): Clearview’s appeal against the ICO’s £7.5 million fine is scheduled for 21-23 November in the First-tier Tribunal (Information Rights).

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September.


Law Firm Fined For GDPR Breach: What Went Wrong? 

On 10th March the Information Commissioner’s Office (ICO) announced that it had fined Tuckers Solicitors LLP £98,000 for a breach of GDPR.

The fine follows a ransomware attack on the firm’s IT systems in August 2020. The attacker had encrypted 972,191 files, of which 24,712 related to court bundles. Sixty of those were exfiltrated by the attacker and released on the dark web. Some of the files included Special Category Data. Clearly this was a personal data breach, not just because data was released on the dark web, but also because of the unavailability of personal data (through encryption by the attacker), which is also covered by the definition in Article 4 GDPR. Tuckers reported the breach to the ICO as well as to affected individuals through various means, including social media.

The ICO found that between 25th May 2018 (the date the GDPR came into force) and 25th August 2020 (the date on which Tuckers reported the personal data breach), Tuckers had contravened Article 5(1)(f) of the GDPR (the sixth Data Protection Principle, Security) as it failed to process personal data in a manner that ensured appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures. The ICO took as its starting point for calculating the fine 3.25 per cent of Tuckers’ turnover for the year ending 30 June 2020. It could have been worse; the maximum for a breach of the Data Protection Principles is 4% of gross annual turnover.

In reaching its conclusions, the Commissioner gave consideration to Article 32 GDPR, which requires a Data Controller, when implementing appropriate security measures, to consider:

 “…the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons”.

What does “state of the art” mean? In this case the ICO considered, in the context of “state of the art”, relevant industry standards of good practice including the ISO 27000 series, the National Institute of Standards and Technology (“NIST”), the various guidance from the ICO itself, the National Cyber Security Centre (“NCSC”), the Solicitors Regulation Authority, Lexcel and NCSC Cyber Essentials.

The ICO concluded that there were a number of areas in which Tuckers had failed to comply, and to demonstrate that it complied, with the Security Principle. Its technical and organisational measures were, over the relevant period, inadequate in the following respects:

Lack of Multi-Factor Authentication (“MFA”)

MFA is an authentication method that requires the user to provide two or more verification factors to gain access to an online resource. Rather than just asking for a username and password, MFA requires one or more additional verification factors, such as a code from a fob or text message, which decreases the likelihood of a successful cyber-attack. Tuckers had not used MFA on its remote access solution despite its own GDPR policy requiring it to be used where available. 

Patch Management 

Tuckers told the ICO that part of the reason for the attack was the late application of a software patch to fix a vulnerability. In January 2020 this patch was rated as “critical” by the NCSC and others. However, Tuckers only installed it four months later. 

Failure to Encrypt Personal data

The personal data stored on the archive server that was subject to this attack had not been encrypted. The ICO accepted that encryption may not have prevented the ransomware attack. However, it would have mitigated some of the risks the attack posed to the affected data subjects, especially given the sensitive nature of the data.

Action Points 

Ransomware is on the rise. Organisations need to strengthen their defences and have plans in place; not just to prevent a cyber-attack, but also for what to do when one does take place:

  1. Conduct a cyber security risk assessment and consider an external accreditation through Cyber Essentials. The ICO noted that in October 2019, Tuckers was assessed against the Cyber Essentials criteria and found to have failed to meet crucial aspects. The fact that some 10 months later it had still not resolved this issue was, in the Commissioner’s view, sufficient to constitute a negligent approach to data security obligations.
  2. Make sure everyone in your organisation knows the risks of malware/ransomware and follows good security practice. Our GDPR Essentials e-learning solution contains a module on keeping data safe.
  3. Have plans in place for a cyber security breach. See our Managing Personal Data Breaches workshop

More useful advice can be found in the ICO’s guidance note on ransomware and DP compliance.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in April.
