A New GDPR Fine and a New ICO Enforcement Approach

Since May 25th 2018, the Information Commissioner’s Office (ICO) has issued ten GDPR fines. The latest was issued on 30th June 2022 to Tavistock and Portman NHS Foundation Trust for £78,400. The Trust had accidentally revealed the email addresses of 1,781 adult gender identity patients when sending out an email.

This is the second ICO fine issued to a Data Controller in these circumstances. In 2021, HIV Scotland was fined £10,000 when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk. 

The latest fine was issued to Tavistock and Portman NHS Foundation Trust following an email sent in early September 2019. The Trust intended to run a competition inviting patients of the adult Gender Identity Clinic to provide artwork to decorate a refurbished clinic building. It sent two identical emails promoting the competition (one to 912 recipients, and the second to 869 recipients) before realising it had not Bcc’d the addresses.

It was clear from the content of the email that all the recipients were patients of the clinic, and there was a risk further personal details could be found by researching the email addresses. The Trust immediately realised the error and tried, unsuccessfully, to recall the emails. It wrote to all the recipients to apologise and informed the ICO later that day.

The ICO investigation found:

  • Two similar, smaller incidents had affected a different department of the same Trust in 2017. While that department had strengthened its processes as a result, the learning and changes were not implemented across the whole Trust.
  • The Trust was overly reliant on people following policy to prevent bulk emails being sent using the ‘To’ field in Outlook. There were no technical or organisational safeguards in place to prevent or mitigate this very predictable human error. The Trust has since procured specialist bulk email software and set a maximum ‘To’ recipient rule on the email server (a minimal illustration of such a safeguard is sketched after this list).
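By way of illustration only (this is our own sketch, not a description of the Trust’s actual software or server configuration), a safeguard of this kind can be as simple as a send wrapper that refuses to dispatch a message when too many addresses are visible in the ‘To’ or ‘Cc’ fields; the threshold and function name below are hypothetical.

```python
import smtplib
from email.message import EmailMessage
from email.utils import getaddresses

MAX_VISIBLE_RECIPIENTS = 5  # hypothetical policy threshold


def send_bulk_safely(msg: EmailMessage, smtp_host: str = "localhost") -> None:
    """Refuse to send a message whose visible recipient count breaks policy.

    Bulk mail should address recipients via Bcc (or dedicated bulk-email
    software) so that addresses are never disclosed to other recipients.
    """
    visible = getaddresses(msg.get_all("To", []) + msg.get_all("Cc", []))
    if len(visible) > MAX_VISIBLE_RECIPIENTS:
        raise ValueError(
            f"{len(visible)} visible recipients exceeds the limit of "
            f"{MAX_VISIBLE_RECIPIENTS}; move the addresses to Bcc."
        )
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)
```

The same effect can usually be achieved with a recipient-limit rule on the mail server itself, which appears to be the route the Trust has taken.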

The ICO reduced the fine issued to the Trust from £784,800 to £78,400 to reflect the ICO’s new approach to working more effectively with public authorities. This approach, which will be trialled over the next two years, was outlined in an open letter from the UK Information Commissioner John Edwards to public authorities. It will see more use of the Commissioner’s discretion to reduce the impact of fines on the public sector, coupled with better engagement including publicising lessons learned and sharing good practice. 

In practice, the new approach will mean an increased use of the ICO’s wider powers, including warnings, reprimands and enforcement notices, with fines only issued in the most serious cases. When a fine is considered, the decision notice will give an indication of the amount of the fine the case would have attracted. This will provide information to the wider economy about the levels of penalty others can expect from similar conduct. Additionally, the ICO will be working more closely with the public sector to encourage compliance with data protection law and prevent harms before they happen.

The ICO followed its new approach recently when issuing a reprimand to NHS Blood and Transplant Service. In August 2019, the service inadvertently released untested development code into a live system for matching transplant list patients with donated organs. This error led to five adult patients on the non-urgent transplant list not being offered transplant livers at the earliest possible opportunity. The service remedied the error within a week, and none of the patients involved experienced any harm as a result. The ICO says that, if the revised enforcement approach had not been in place, the service would have received a fine of £749,856.

The new approach will be welcome news to the public sector at a time of pressure on budgets. However, some have questioned why the public sector merits this special treatment. It is not as if it has been the subject of a disproportionate number of fines. The first fine to a public authority was only issued in December 2021 (more than three and a half years after GDPR came into force) when the Cabinet Office was fined £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients online. Perhaps the ICO is already thinking about the reform of its role following the DCMS’s response to last year’s GDPR consultation. It will be interesting to see if others, particularly the charity sector, lobby for similar treatment.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in September.

ICO Fines “World’s Largest Facial Network”

The Information Commissioner’s Office has issued a Monetary Penalty Notice of £7,552,800 to Clearview AI Inc for breaches of the UK GDPR. 

Clearview is a US based company which describes itself as the “World’s Largest Facial Network”. It allows customers, including the police, to upload an image of a person to its app, which is then checked against all the images in the Clearview database. The app then provides a list of matching images, with links to the websites they came from.

Clearview’s online database contains 20 billion images of people’s faces and data scraped from publicly available information on the internet and social media platforms all over the world. This service was used on a free trial basis by a number of UK law enforcement agencies. The trial was discontinued and the service is no longer being offered in the UK. However, Clearview has customers in other countries, so the ICO ruled that it is still processing the personal data of UK residents.

The ICO was of the view that, given the high number of UK internet and social media users, Clearview’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge. It found the company had breached the UK GDPR by:

  • failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way;
  • failing to have a lawful reason for collecting people’s information;
  • failing to have a process in place to stop the data being retained indefinitely;
  • failing to meet the higher data protection standards required for biometric data (Special Category Data); and
  • asking for additional personal information, including photos, when asked by members of the public if they are on their database. This may have acted as a disincentive to individuals who wish to object to their data being collected and used.

The ICO has also issued an enforcement notice ordering Clearview to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.

The precise legal basis for the ICO’s fine will only be known when (hopefully not if) it decides to publish the Monetary Penalty Notice. The information we have so far suggests that it considered breaches of Article 5 (1st and 5th Principles – lawfulness, transparency and data retention), Article 9 (Special Category Data) and Article 14 (privacy notice), amongst others.

Whilst substantially lower than the £17 million Notice of Intent, issued in November 2021, this fine shows that the new Information Commissioner, John Edwards, is willing to take on at least some of the big tech companies. 

The ICO enforcement action comes after a joint investigation with the Office of the Australian Information Commissioner (OAIC). The latter also ordered the company to stop processing citizens’ data and delete any information it held. France and Italy have also sanctioned the company under the EU GDPR, and Canada has taken similar action under its own privacy laws.

So what next for Clearview? The ICO has very limited means to enforce a fine against foreign entities. Clearview has no operations or offices in the UK, so it could just refuse to pay. This may be problematic from a public relations perspective, as many of Clearview’s customers are law enforcement agencies in Europe who may not be willing to associate themselves with a company that has been found to have breached EU privacy laws.

When the Italian DP regulator fined Clearview €20m (£16.9m) earlier this year, it responded by saying it did not operate in any way that brought it under the jurisdiction of the EU GDPR. Could it argue the same in the UK, where it also has no operations, customers or headquarters? Students of our UK GDPR Practitioner Certificate course will know that the answer lies in Article 3(2), which sets out the extra-territorial effect of the UK GDPR:

This Regulation applies to the relevant processing of personal data of data subjects who are in the United Kingdom by a controller or processor not established in the United Kingdom where the processing activities are related to:

  1. the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the United Kingdom; or
  2. the monitoring of their behaviour as far as their behaviour takes place within the United Kingdom. [our emphasis]

Whilst clearly Clearview (no pun intended) is not established in the UK, the ICO is of the view it is covered by the UK GDPR due to Article 3(2). See the statement of the Commissioner, John Edwards:

“Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images. The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.”

If Clearview does appeal, we will hopefully receive judicial guidance about the territorial scope of the UK GDPR.

UPDATE (26/5/22): The ICO has now published the Clearview MPN and EN. You can read them here.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September.


Cabinet Office Receives £500,000 GDPR Fine

The Information Commissioner’s Office (ICO) has fined the Cabinet Office £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients online.

The New Year Honours list is supposed to “recognise the achievements and service of extraordinary people across the United Kingdom.” However, in 2020 the media attention was on the fact that, together with the names of recipients, the Cabinet Office accidentally published their addresses: a clear breach of the General Data Protection Regulation (GDPR), particularly the sixth data protection principle and Article 32 (security).

The Honours List file contained the details of 1,097 people, including the singer Sir Elton John, cricketer Ben Stokes, the politician Iain Duncan Smith and the TV cook Nadiya Hussain. More than a dozen MoD employees and senior counter-terrorism officers, as well as Holocaust survivors, were also on the list, which was published online at 10.30pm on Friday 27th December 2019. After becoming aware of the data breach, the Cabinet Office removed the weblink to the file. However, the file was still cached and accessible online to people who had the exact webpage address.

The personal data was available online for a period of two hours and 21 minutes and it was accessed 3,872 times. The vast majority of people on the list had their house numbers, street names and postcodes published with their name. One of the lessons here: always have a second person check the data before pressing “publish”.

This is the first ever GDPR fine issued by the ICO to a public sector organisation, a stark contrast to the ICO’s fines under the DPA 1998, which began with a local authority. Affected individuals may also have a remedy of their own. Article 82(1) sets out the right to compensation:

“Any person who has suffered material or non-material damage as a result of an infringement of this Regulation shall have the right to receive compensation from the controller or processor for the damage suffered.”

It will be interesting to see how many of the affected individuals pursue a civil claim. 

(See also our blog post from the time the breach was reported.)

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have one place left on our Advanced Certificate in GDPR Practice course starting in January.

To Share or Not to Share; That is the Question! 


On 5th October 2021 the Data Sharing Code of Practice from the Information Commissioner’s Office came into effect for UK based Data Controllers.  

The code is not law, nor does it ‘enforce’ data sharing, but it does provide some useful steps to consider when sharing personal data, either as a one-off or as part of an ongoing arrangement. Data Protection professionals, and the staff in the organisations they serve, will still need to navigate a way through various pressures, frameworks and expectations on the sharing of personal data; case by case, framework by framework. A more detailed post on the contents of the code can be read here.

Act Now Training is pleased to announce a new full day ‘hands on’ workshop for Data Protection professionals on Data Sharing. Our expert trainer, Scott Sammons, will look at the practical steps to take, sharing frameworks and protocols, and the risks to consider. Scott will also explore how, as part of your wider IG framework, you can establish a proactive support framework, making it easier for staff to understand their data sharing obligations and expectations, and reducing the temptation to use a ‘Data Protection duck-out’ to explain why something was shared, or not shared, inappropriately.

Delegates will also be encouraged to bring a data sharing scenario to discuss with fellow delegates and the tutor. This workshop can also be customised and delivered to your organisation at your premises or virtually. Get in touch to learn more.


Law Enforcement Processing and the Meaning of “authorised by law”


In October, there was a decision in the Scottish courts which will be of interest to data protection practitioners and lawyers when interpreting Part 3 of the Data Protection Act 2018 (law enforcement processing) and, more generally, the UK GDPR.

The General Teaching Council For Scotland v The Chief Constable of The Police Service of Scotland could fairly be described as a skirmish about expenses (known as costs in other parts of the UK) in seven Petitions to the Court of Session by the General Teaching Council for Scotland (“GTCS”) against the Chief Constable of the Police Service of Scotland (“Police Scotland”). The petitions essentially sought disclosure of information, held by Police Scotland, to the GTCS which the GTCS had asked Police Scotland for, but which the latter had refused to provide. 

This case will be of interest to data protection practitioners for two reasons: (1) there is some consideration by Lord Uist as to what “authorised by law” means in the context of processing personal data under Part 3 DPA 2018 for purposes other than law enforcement purposes; and (2) it contains a salutary reminder that, while advice from the Information Commissioner’s Office (ICO) can be useful, it can also be wrong, as well as a reminder of the responsibilities of data controllers in relation to their own decisions.

The GTCS is the statutory body responsible for the regulation of the teaching profession in Scotland. They are responsible for assessing the fitness of people applying to be added to the register of teachers in Scotland, as well as the continuing fitness of those already on the register. To fulfil these functions, the GTCS had requested information from Police Scotland. The information held by Police Scotland was processed by them for the law enforcement purposes; it thus fell within Part 3 of the DPA 2018. In response, the GTCS petitioned the Court of Session for orders requiring Police Scotland to release the information. Police Scotland did not oppose the Petitions, but argued that it should not be found liable for the expenses of the GTCS in bringing the Petitions to the court. This was on the basis that it had not opposed them and that it could not have given the GTCS the information without the court’s order.

The ICO advice to Police Scotland

Police Scotland refused to supply the information without a court order, on the basis that to do so would be to process the personal data for a purpose other than the law enforcement purposes without the disclosure being authorised by law, in contravention of the second data protection principle under Section 36 of the DPA 2018, which states:

“(1) The second data protection principle is that – (a) the law enforcement purpose for which personal data is collected on any occasion must be specified, explicit and legitimate, and (b) personal data so collected must not be processed in a manner that is incompatible with the purpose for which it was collected. 

(2) Paragraph (b) of the second data protection principle is subject to subsections (3) and (4). 

(3) Personal data collected for a law enforcement purpose may be processed for any other law enforcement purpose (whether by the controller that collected the data or by another controller) provided that – 

(a) the controller is authorised by law to process that data for the other purpose, and
(b) the processing is necessary and proportionate to that other purpose. 

(4) Personal data collected for any of the law enforcement purposes may not be processed for a purpose that is not a law enforcement purpose unless the processing is authorised by law.” 

Police Scotland was relying upon advice from the ICO. That advice was that Police Scotland “would require either an order of the court or a specific statutory obligation to provide the information”, otherwise Police Scotland would be breaching the requirements of the DPA 2018. A longer form of the advice provided by the ICO to Police Scotland may be found at paragraph 10 of Lord Uist’s decision.

The ICO’s advice to Police Scotland was in conflict with what the ICO said in its code of practice issued under section 121 of the DPA 2018. There the ICO said that “authorised by law” could be “for example, statute, common law, royal prerogative or statutory code”. 

Authorised by Law

Lord Uist decided that the position adopted by Police Scotland, and the advice given to them by the ICO, was “plainly wrong”; concluding that the disclosure of the information requested by the GTCS would have been authorised by law without a court order.

The law recognises the need to balance the public interest in the free flow of information to the police for criminal proceedings, which requires that information given in confidence is not used for other purposes, against the public interest in protecting the public by disclosing confidential information to regulatory bodies charged with ensuring professionals within their scope of responsibility are fit to continue practising. In essence, when the police are dealing with requests for personal data processed for law enforcement purposes by regulatory bodies, they must have regard to the public interest in ensuring that these regulatory bodies, which exist to protect the public, are able to carry out their own statutory functions.

Perhaps more significantly, the law also recognises that a court order is not required for such disclosures to be made to regulatory bodies. This meant that there was, at common law, a lawful basis upon which Police Scotland could have released the information requested by the GTCS to them. Therefore, Police Scotland would not have been in breach of section 36(4) of the DPA 2018 had they provided the information without a court order.

In essence, a lack of a specific statutory power to require information to be provided to it, or a specific statutory requirement on the police to provide the information, does not mean a disclosure is not authorised by law. It is necessary, as the ICO’s code of practice recognises, to look beyond statute and consider whether there is a basis at common law. 

Police Scotland was required by Lord Uist to meet the expenses of the GTCS in bringing the Petitions. This was because the Petitions had been necessitated by Police Scotland requiring a court order when none was required. Lord Uist was clear that Police Scotland had to take responsibility for their own decision; it was not relevant to consider that they acted on erroneous advice from the ICO.

This case serves as a clear reminder that, while useful, advice from the ICO can be wrong. The same too, of course, applies in respect of the guidance published by the ICO. It can be a good starting point, but it should never be the end point. When receiving advice from the ICO it is necessary to think about that advice critically; especially where, as here, the advice contradicts other guidance published by the ICO. It is necessary to consider why there is a discrepancy and which is correct: the advice or the guidance?
It may, of course, be the case that both are actually incorrect.

The finding of liability for expenses is also a reminder that controllers are ultimately responsible for the decisions that they take in relation to the processing of personal data.
It is not good enough to effectively outsource that decision-making and responsibility to the ICO. Taking tricky questions to the regulator does not absolve the controller from considering the question itself, both before and after seeking the advice of the ICO.

Finally, this case may also be a useful and helpful reference point when considering whether something is “authorised by law” for the purposes of processing under Part 3 of the DPA 2018. It is, however, a first instance decision (the Outer House of the Court of Session being broadly similar in status to the High Court in England and Wales) and that ought to be kept in mind when considering it.

Alistair Sloan is a Devil (pupil) at the Scottish Bar; prior to commencing devilling he was a solicitor in Scotland and advised controllers, data protection officers and data subjects on a range of information law matters.

We have just announced a new full day workshop on Part 3 of the DPA 2018. See also our Part 3 Policy Pack.


Ticketmaster Fined £1.25m Over Cyber Attack


GDPR fines are like a number 65 bus. You wait for a long time and then three arrive at once. In the space of a month the Information Commissioner’s Office (ICO) has issued three Monetary Penalty Notices. The latest requires Ticketmaster to pay £1.25m following a cyber-attack on its website which compromised millions of customers’ personal information.  

The ICO investigation into this breach found a vulnerability in a third-party chatbot built by Inbenta Technologies, which Ticketmaster had installed on its online payments page. A cyber-attacker was able to use the chatbot to access customer payment details which included names, payment card numbers, expiry dates and CVV numbers. This had the potential to affect 9.4 million Ticketmaster customers across Europe, including 1.5 million in the UK.

As a result of the breach, according to the ICO, 60,000 payment cards belonging to Barclays Bank customers had been subjected to known fraud. Another 6,000 cards were replaced by Monzo Bank after it suspected fraudulent use. The ICO said these banks and others had warned Ticketmaster of suspected fraud. Despite these warnings, it took Ticketmaster nine weeks to start monitoring activity on its payments page.

The ICO found that Ticketmaster failed to: 

  • Assess the risks of using a chatbot on its payment page 
  • Identify and implement appropriate security measures to negate the risks 
  • Identify the source of suggested fraudulent activity in a timely manner 

James Dipple-Johnstone, Deputy Information Commissioner, said: 

“When customers handed over their personal details, they expected Ticketmaster to look after them. But they did not. 

Ticketmaster should have done more to reduce the risk of a cyber-attack. Its failure to do so meant that millions of people in the UK and Europe were exposed to potential fraud. 

The £1.25milllion fine we’ve issued today will send a message to other organisations that looking after their customers’ personal details safely should be at the top of their agenda.” 

In a statement, Ticketmaster said:  

“Ticketmaster takes fans’ data privacy and trust very seriously. Since Inbenta Technologies was breached in 2018, we have offered our full cooperation to the ICO.
We plan to appeal [against] today’s announcement.” 

Ticketmaster’s appeal will put the ICO’s reasoning and actions, when issuing fines, under judicial scrutiny. This will help GDPR practitioners faced with similar ICO investigations.   

Ticketmaster is also facing civil legal action by thousands of fraud victims. Kingsley Hayes, head of cyber-crime at law firm Keller Lenkner, which represents some of these victims, said: 

“While several banks tried to alert Ticketmaster of potential fraud, it took an unacceptable nine weeks for action to be taken, exposing an estimated 1.5 million UK customers.” 

Data Protection Officers are encouraged to read the Monetary Penalty Notice as it not only sets out the reasons for the ICO’s conclusion but also the factors it has taken into account in deciding to issue a fine and how it calculated the amount. This fine follows hot on the heels of the British Airways and Marriott fines which also concerned cyber security breaches. (You can read more about the causes of cyber security breaches in our recent blog post.) 

75% of fines issued by the ICO under GDPR relate to cyber security. This is a top regulatory priority for the ICO as well as supervisory authorities across Europe.
Data Protection Officers should place cyber security at the top of their learning and development plan for 2021.  

We have some places available on our forthcoming Cyber Security for DPOs workshop. This and other GDPR developments will be covered in our next online GDPR update workshop.

The ICO’s New Subject Access Guidance


GDPR has introduced some new Data Subject rights, including the right to erasure and data portability. The familiar right of Subject Access still remains, albeit with some additional obligations. Last week the Information Commissioner’s Office (ICO) published its long-awaited detailed guidance on the right of access, following a consultation exercise in December. The guidance provides some much needed clarification on key subject access issues Data Controllers have been grappling with since May 2018.

Reasonable Searches 

Sometimes Data Subjects make subject access requests with the aim of creating maximum work for the recipient. “I want to see all the documents you hold which have my name in them, including emails” is a common one. How much effort has to be made when searching for such information? The new guidance states that Controllers should make reasonable efforts to find and retrieve the requested information. However, they are “not required to conduct searches that would be unreasonable or disproportionate to the importance of providing access to the information.” Factors to consider when determining whether searches may be unreasonable or disproportionate are:

  • the circumstances of the request; 
  • any difficulties involved in finding the information; and 
  • the fundamental nature of the right of access. 

Thus there is no obligation to make every possible effort to find all instances of personal data on the Data Controller’s systems. However, the burden of proof is on Controllers to be able to justify why a search is unreasonable or disproportionate. 

Stopping the Clock 

Data Controllers have one month to respond to a subject access request. Normally this period starts from the day the request is received. Previously, the ICO guidance stated that the day after receipt counted as ‘day one’. The ICO revised its position last year following a Court of Justice of the European Union (CJEU) ruling.
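As a worked illustration of the calculation (our own sketch, not the ICO’s wording): with the day of receipt counting as day one, the deadline falls on the corresponding calendar date in the following month, or the last day of that month where no corresponding date exists. The third-party python-dateutil library handles that clamping; weekend and public holiday adjustments are left out for brevity.

```python
from datetime import date
from dateutil.relativedelta import relativedelta  # pip install python-dateutil


def sar_deadline(received: date) -> date:
    """One calendar month from the day of receipt (day of receipt = day one).

    relativedelta clamps to the last day of the month when there is no
    corresponding date, e.g. 31 January -> 28/29 February. If the result
    falls on a weekend or public holiday, the response is due on the next
    working day (not handled here).
    """
    return received + relativedelta(months=1)


print(sar_deadline(date(2020, 9, 3)))   # 2020-10-03
print(sar_deadline(date(2020, 1, 31)))  # 2020-02-29 (leap year)
```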

Data Controllers can ask the Data Subject to clarify their request, if it is unclear what they want, but this often leaves little time to meet the one month deadline. Having considered consultation responses, the ICO’s position now is that, where a request requires clarification, in certain circumstances the clock can be stopped whilst Controllers are waiting for the Data Subject’s response.

Manifestly Unfounded and Excessive 

Article 12(5) of GDPR allows Data Controllers to refuse a Data Subject request or charge a fee where it is “manifestly unfounded or excessive.” The burden of proving this is on the Controllers whose staff often struggle with these concepts. The ICO has now provided additional guidance on these terms. 

A request may be manifestly unfounded if: 

  • The individual clearly has no intention to exercise their right of access; or 
  • The request is malicious in intent and is being used to harass an organisation with no real purpose other than to cause disruption. For example, the individual: 
      • explicitly states, in the request itself or in other communications, that they intend to cause disruption; 
      • makes unsubstantiated accusations against you or specific employees which are clearly prompted by malice; 
      • targets a particular employee against whom they have some personal grudge; or 
      • systematically sends different requests to the Controller as part of a campaign, e.g. once a week, with the intention of causing disruption. 

To determine whether a request is manifestly excessive Data Controllers need to consider whether it is clearly or obviously unreasonable. They should base this on whether the request is proportionate when balanced with the burden or costs involved in dealing with the request. This will mean taking into account all the circumstances of the request, including: 

  • the nature of the requested information; 
  • the context of the request, and the relationship between the Controller and the individual; 
  • whether a refusal to provide the information or even acknowledge if the Controller holds it may cause substantive damage to the individual; 
  • the Controller’s available resources; 
  • whether the request largely repeats previous requests and a reasonable interval hasn’t elapsed; or 
  • whether it overlaps with other requests (although if it relates to a completely separate set of information it is unlikely to be excessive).  

The Fee 

What can be included when charging a fee for manifestly unfounded or excessive requests? The new guidance says Data Controllers can take into account the administrative costs of: 

  • assessing whether or not they are processing the information; 
  • locating, retrieving and extracting the information; 
  • providing a copy of the information; and 
  • communicating the response to the individual. 

A reasonable fee may include the costs of: 

  • photocopying, printing, postage and any other costs involved in transferring the information to the individual; 
  • equipment and supplies (e.g. discs, envelopes or USB devices). 

Staff time can also be included in the above, based on the estimated time it will take staff to comply with the specific request, charged at a reasonable hourly rate. In the absence of relevant regulations under the Data Protection Act 2018, the ICO encourages Data Controllers to publish their criteria for charging a fee and how they calculate it.
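As a purely hypothetical worked example of how such published criteria might translate into a figure (the hourly rate, staff times and supply costs below are invented for illustration and are not ICO figures):

```python
# Hypothetical fee calculation for a manifestly unfounded or excessive request.
HOURLY_RATE = 25.00  # assumed "reasonable" staff hourly rate (illustrative only)

staff_hours = {
    "assessing whether the information is processed": 0.5,
    "locating, retrieving and extracting": 3.0,
    "providing a copy": 0.5,
    "communicating the response": 0.5,
}
supplies = {"printing": 4.50, "postage": 2.10, "USB device": 6.00}

staff_cost = sum(staff_hours.values()) * HOURLY_RATE  # 4.5 hours x £25 = £112.50
supplies_cost = sum(supplies.values())                # £12.60
print(f"Total fee: £{staff_cost + supplies_cost:.2f}")  # Total fee: £125.10
```

Publishing a simple breakdown like this is one way of meeting the ICO’s suggestion that Controllers make their charging criteria transparent.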

Finally, the new ICO guidance emphasises the importance of preparation, particularly the need to have:

  • Training for employees to enable them to recognise subject access requests;  
  • Specific people appointed to deal with requests; 
  • Policies and procedures; and  
  • Technical systems in place to assist with the retrieval of requested information. 

Our Handling Subject Access Requests workshop is now available online. It covers all aspects of dealing with SARs, including identifying and applying exemptions. Looking for a GDPR qualification? Final places are left on our online GDPR Practitioner Certificate.

British Airways: Proposed GDPR Fine Likely to be Reduced


In July 2019, the Information Commissioner’s Office (ICO) signalled its intention to use its powers to issue Monetary Penalty Notices (fines) under the General Data Protection Regulation (GDPR). Two Notices of Intent were issued with much fanfare.

One of the Notices was issued to British Airways for the eye-watering sum of £183 million. This was the result of names, email addresses and credit card information being stolen by hackers from the BA website. According to the ICO’s statement at the time, the personal data of 500,000 customers was compromised in this incident.

Remember that this was a Notice of Intent and not a fine. After many months of delays and the coronavirus lockdown, we are now in a position to hazard a good guess as to the amount of the actual fine. Thanks to the reporting requirements for listed companies, it is very likely that British Airways will be fined much less than the £183 million announced a year ago, and could be fined as little as 10% of that amount.

On 31st July, IAG (British Airways’ parent company) issued its Interim Management Report for the six months ended 30th June 2020, which states:

“The exceptional charge of €22 million represents management’s best estimate of the amount of any penalty issued by the Information Commissioner’s Office (ICO) in the United Kingdom, relating to the theft of customer data at British Airways in 2018. The process is ongoing and no final penalty notice has been issued.”

It will be interesting to see what happens to the other Notice of Intent, relating to Marriott Hotels for £99 Million, as well as the ICO’s investigation into the more recent EasyJet data breach. Watch this space!

This and other GDPR developments will be covered in our new online GDPR update workshop. The lockdown is the perfect time to train your staff about GDPR and keeping data safe. With our GDPR Essentials e-learning course they can do this from the comfort of their own home.

 

Act Now Supporting Innovative Digital DPIA Project


Act Now Training is pleased to announce that it is supporting a new public sector collaboration to co-design and develop a digital approach to Data Protection Impact Assessments (DPIAs).

This innovative six-month project will help Data Controllers conducting DPIAs to ensure that a ‘Data Protection by Design and Default’ approach is embedded into the process. The project is also supported by the Information Commissioner’s Office, NHSX and the Information and Records Management Society.

Greater Manchester Combined Authority, the London Office of Technology and Innovation, Norfolk County Council and the University of Nottingham are leading the project which follows on from a successful alpha phase undertaken last year. A full project overview can be read here: https://cc2i.org.uk/digital-dpia/

Ibrahim Hasan, Director of Act Now Training, said:

“We are really pleased to be supporting this innovative new project alongside the Information Commissioner’s Office, NHSX and the IRMS. A digital DPIA solution will be a valuable tool to help DPOs ensure that privacy and data protection are at the heart of every new data driven project.”

Are you a public authority wishing to share in this exciting new project and shape the future of the Digital DPIA? Using a proven co-funding approach (similar to crowdfunding, but on a corporate level), the collective is actively looking for partners to join them in this cost-neutral project.

A webinar on the project and approach is being hosted on Wednesday 12th at 2pm. Led by Stephen Girling, Information Governance Project Manager at GMCA and Lianne Hawkins, Head of Service Design at Looking Local, this webinar will cover:

  • The background and outcomes of the original Digital DPIA alpha project undertaken by GMCA – including the headline business case
  • The benefits of a uniform approach to DPIAs across public sector
  • The work packages planned to deliver a digital DPIA solution
  • Partner benefits and their motivation to be part of this collaborative approach
  • Project partners timelines & what’s involved

We would encourage all our blog subscribers to register for the webinar here: http://bit.ly/2ScGdi2. A recording of the webinar will also be available; please email irene.zdziebko@cc2i.org.uk.

First Fine under GDPR


The Information Commissioner’s Office (ICO) has issued the first fine under GDPR to a London-based pharmacy. Doorstep Dispensaree Ltd has received a Monetary Penalty Notice of £275,000 for failing to ensure the security of Special Category Data.

The company, which supplies medicines to customers and care homes, left approximately 500,000 documents in unlocked containers at the back of its premises in Edgware. The documents included names, addresses, dates of birth, NHS numbers, medical information and prescriptions belonging to an unknown number of people. The ICO held that this gave rise to infringements of GDPR’s security and data retention obligations. Following a thorough investigation, the ICO also concluded that the company’s privacy notices and internal policies were not up to scratch.

The ICO launched its investigation into Doorstep Dispensaree after it was alerted to the insecurely stored documents by the Medicines and Healthcare Products Regulatory Agency, which was carrying out its own separate enquiry into the pharmacy. Steve Eckersley, Director of Investigations at the ICO, said:

“The careless way Doorstep Dispensaree stored special category data failed to protect it from accidental damage or loss. This falls short of what the law expects and it falls short of what people expect.”

Doorstep Dispensaree has also been issued with an enforcement notice, under Section 149 of the Data Protection Act 2018, due to the significance of the contraventions. It has three months to take the remedial steps set out in the notice.

Training seems to feature heavily in the ICO’s Enforcement Notice. GDPR requires all organisations to ensure that their employees are aware of their role in protecting personal data. How to do this without them spending valuable time away from the office or overspending the training budget?

GDPR Essentials is a new e learning course from Act Now Training designed to teach those working on the frontline essential GDPR knowledge in an engaging, fun and interactive way. In less than one hour employees will learn about the key provisions of GDPR and how to keep personal data safe. Click here to read more and watch a demo.

After issuing Notices of Intent to two high profile companies for millions of pounds (British Airways and Marriott), the Information Commissioner has finally issued an actual fine, albeit for a much lower amount and to a less well-known company. Data Controllers and Processors need to read the penalty notice carefully and ensure that they are not repeating the same mistakes as Doorstep Dispensaree Ltd.

These and other GDPR developments will be discussed in detail in our GDPR update workshop.
