2023 IRMS Awards: Act Now nominated in all three categories

Act Now Training is pleased to announce that it has been nominated for the 2023 Information and Records Management Society (IRMS) awards in all three categories. 

Each year the IRMS recognises excellence in the field of information management with their prestigious Industry Awards. These highly sought-after awards are presented at a glittering ceremony at the annual Conference following the Gala Dinner. In 2021 and 2022 Act Now won the Supplier of the Year award. We are hoping to make it three in a row!

For 2023 Act Now has been nominated for the following awards: 

  • Team of the Year
  • Innovation of the Year
  • Supplier of the Year

The first two nominations acknowledge our work with Damar Limited to deliver the new Data Protection and Information Governance Practitioner Level 4 Apprenticeship. Act Now Training has teamed up with Damar to produce materials and expert training to help ensure apprentices develop their skills and knowledge for a successful career in information governance.

Voting is open to IRMS members until Friday 24th March 2023.

The voting page is here: https://irms.org.uk/surveys/?id=2023Awards

Vote now for your favourite training company!

The New DP Reform Bill: What’s Changed?

On 8th March 2023, the UK Department for Science, Innovation and Technology (DSIT) published the Data Protection and Digital Information (No.2) Bill (“the new Bill”). If enacted, it will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

According to the DSIT press release, the Bill will result in a “new common-sense-led UK version of the EU’s GDPR [and will] reduce costs and burdens for British businesses and charities, remove barriers to international trade and cut the number of repetitive data collection pop-ups online.” It also claims that the reforms are “expected to unlock £4.7 billion in savings for the UK economy over the next 10 years.” How this figure has been calculated is not explained but we have been here before! Remember the red bus?

How did we get here?

This is the second version of a bill designed to reform the UK data protection regime. In July 2022, the Government published the Data Protection and Digital Information Bill (“the previous Bill”). This was paused in September 2022 so ministers could engage in “a co-design process with business leaders and data experts” and move away from the “one-size-fits-all” approach of the European Union’s GDPR. On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, then the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”. Another full consultation round was expected but never materialised.

The previous Bill has now been withdrawn. We will provide analysis and updates on the new Bill, as it progresses through Parliament, over the coming months. An initial summary of the key proposals, both old and new, is set out below:

What remains the same from the original bill?

Many of the proposals in the new Bill are the same as contained in the previous Bill. For a detailed analysis please read our previous blog post. Here is a summary:

  • Amended Definition of Personal Data: This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. 

  • Vexatious Data Subject Requests: The terms “manifestly unfounded” or “excessive” requests, in Article 12 of the UK GDPR, will be replaced with “vexatious” or “excessive” requests. Explanation and examples of such requests will also be included.

  • Data Subject Complaints: Data Controllers will be required to acknowledge receipt of Data Subject complaints within 30 days and respond substantively “without undue delay”. The ICO will be entitled not to accept a complaint if a Data Subject has not made a complaint to the controller first.

  • Data Protection Officer: The obligation for some controllers and processors to appoint a Data Protection Officer (DPO) will be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals will be required to designate a senior manager as a “Senior Responsible Individual”. 

  • Data Protection Impact Assessments: These will be replaced by leaner and less prescriptive “Assessments of High Risk Processing”. 

  • International Transfers: There will be a new approach to the test for adequacy applied by the UK Government to countries (and international organisations), and by Data Controllers when carrying out a Transfer Impact Assessment (TIA). The threshold for this new “data protection test” will be whether a jurisdiction offers protection that is “not materially lower” than under the UK GDPR. (For more detail see also our forthcoming International Transfers webinar).

  • The Information Commission: The Information Commissioner’s Office will transform into the Information Commission, a corporate body with a chief executive.

  • Business Data: The Secretary of State and the Treasury will be given the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. 

  • PECR: Cookies will be allowed to be used without consent for the purposes of web analytics and to install automatic software updates. Furthermore, non-commercial organisations (e.g. charities and political parties) will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest. Finally, there will be an increase to the fines from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher). 

What has changed?

The new Bill does not make any radical changes to the previous Bill; rather it clarifies some points and provides a bit more flexibility in other areas. The main changes are summarised below:

  • Scientific Research: The definition of scientific research is amended so that it now includes research for the purposes of commercial activity.
    This expands the circumstances in which processing for research purposes may be undertaken, providing a broader consent mechanism and exemption to the fair processing requirement.
  • Legitimate Interests: The previous Bill proposed that businesses could rely on legitimate interests (Article 6 lawful basis) without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These “recognised” legitimate interests cover purposes for processing such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement.  The new Bill, whilst keeping the above changes, introduces a non-exhaustive list of cases where organisations may rely on the “legitimate interests” legal basis, including for the purposes of direct marketing, transferring data within the organisation for administrative purposes and for the purposes of ensuring the security of network and information systems; although a balancing exercise still needs to be conducted in these cases. 

  • Automated Decision Making: The previous Bill clarified that its proposed restrictions on automated decision-making under Article 22 UK GDPR should only apply to decisions that are a result of automated processing without “meaningful human involvement”. The new Bill states that profiling will be a relevant factor in the assessment as to whether there has been meaningful human involvement in a decision. 
  • Records of Processing Activities (ROPA): The previous Bill streamlined the required content of ROPAs. The new Bill exempts all controllers and processors from the duty to maintain a ROPA unless they are carrying out high risk processing activities. 

The Impact

The EU conducts a review of adequacy with the UK every four years; the next adequacy decision is due on 27th June 2025. Some commentators have suggested that the changes may jeopardise the UK’s adequate status and so impact the free flow of data between the UK and EU. We disagree. Although the Government states that the new Bill is “a new system of data protection”, it still retains the UK GDPR’s structure and fundamental obligations. Some tinkering around the edges is not really going to have much of an impact (see the helpful redline version of the new Bill produced by the good people at Hogan Lovells). Organisations that are already compliant with the UK GDPR will not be required to make any major changes to their systems and processes. 

The new Bill has been introduced at the first reading stage. The second reading, due to be scheduled within the next few weeks, will be the first time the Government’s data protection reforms are debated in Parliament. We expect the Bill to be passed in a form similar to the one now published and to come into force later this year.

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next Advanced Certificate in GDPR Practice.

Rogue Employees and Personal Data

Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly:

(a) to obtain or disclose personal data without the consent of the controller,

(b) to procure the disclosure of personal data to another person without the consent of the controller, or

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained.

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998, which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. Two recent prosecutions highlight the willingness of the Information Commissioner’s Office (ICO) to use section 170 to make examples of individuals who seek to access or steal data from their employers for personal gain. 

In January, Asif Iqbal Khan pleaded guilty to stealing data of accident victims whilst working as a Customer Solutions Specialist for the RAC. Over a single month in 2019, the RAC had received 21 complaints from suspicious drivers who received calls from claims management companies following accidents in which the RAC had assisted.

A review of individuals that had accessed these claims found that Mr Khan was the only employee to have accessed all 21. An internal investigation later reported suspicious behaviour by Mr Khan, including taking photos of his computer screen with his phone. Executing a search warrant, the ICO seized two phones from Mr Khan and a customer receipt for £12,000. The phones contained photos of data relating to over 100 accidents.

Khan appeared at Dudley Magistrates Court in January 2023 where he pleaded guilty to two counts of stealing data in breach of Section 170 of the DPA 2018. He was fined £5,000 and ordered to pay a victim surcharge as well as court costs.

This is the second recent prosecution under Section 170. In August last year, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust pleaded guilty to accessing medical records of patients without a valid legal reason.

An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000.

Of course, a s.170 prosecution would have a much greater deterrent effect if the available sanctions included a custodial sentence. Successive Information Commissioners have argued for this but to no avail. This has led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990, which carries tougher sentences including a maximum of two years’ imprisonment on indictment. In July last year, a woman who worked for Cheshire Police pleaded guilty to using police data systems to check up on ex-partners, and in August the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of customers’ personal data from vehicle repair garages to generate potential leads for personal injury claims.

Employer Liability

If a disgruntled or rogue employee commits an offence under section 170, might their employer also be liable for the consequences?

In 2020, the Supreme Court ruled that as an employer, Morrisons Supermarket could not be held responsible when an employee, Andrew Skelton, uploaded a file containing the payroll data of thousands of Morrisons employees to a publicly accessible website as well as leaking it to several newspapers. The court decided that, whatever Skelton was doing when he disclosed his colleagues’ personal data, he was not acting “in the course of his employment”, and accordingly no vicarious liability could be imposed under the old Data Protection Act 1998.

However, Morrisons lost on the argument that the DPA 1998 operated so as to exclude vicarious liability completely. This principle can also be applied to the GDPR, so employers can “never say never” when it comes to vicarious liability for malicious data breaches by staff. It all depends on the facts of the breach.

This case only went as far as it did because the Morrisons employees failed to show, at first instance, that Morrisons was primarily liable for the data breach. If an employer fails to comply with its security obligations in a manner that is causally relevant to a rogue employee’s actions, it can still be exposed to primary liability under Article 32 of GDPR as well as the 6th Data Protection Principle, which both impose obligations to ensure the security of personal data. 

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next Advanced Certificate in GDPR Practice.

Mega GDPR Fines for Meta

On 4th January 2023, Ireland’s Data Protection Commission (DPC) announced the conclusion of two inquiries into the data processing operations of Meta Platforms Ireland Limited (“Meta Ireland”) in connection with the delivery of its Facebook and Instagram services. Not only does this decision significantly limit Meta’s ability to gather information from its users to tailor and sell advertising, it also provides useful insight into EU regulators’ view about how to comply with Principle 1 of GDPR i.e. the need to ensure personal data is “processed lawfully, fairly and in a transparent manner in relation to the data subject”(Article 5).

In decisions dated 31st December 2022, the DPC fined Meta Ireland €210 million and €180 million, relating to its Facebook and Instagram services respectively. The fines were imposed in connection with the company’s practice of monetising users’ personal data by running personalised adverts on their social media accounts. Information about a social media user’s digital footprint, such as what videos prompt them to stop scrolling or what types of links they click on, is used by marketers to get personalised adverts in front of the people most likely to buy their products. This practice helped Meta generate $118 billion in revenue in 2021.

The DPC’s decision was the result of two complaints from Facebook and Instagram users, supported by privacy campaign group NOYB, both of which raised the same basic issue: how Meta obtains legal permission from users to collect and use their personal data for personalised advertising. Article 6(1) of GDPR states that:

“Processing shall be lawful only if and to the extent that at least one of the following applies:

  (a) the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
  (b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;”

In advance of the GDPR coming into force on 25th May 2018, Meta Ireland changed the Terms of Service for its Facebook and Instagram services. It also flagged that it was changing the legal basis upon which it relied to process users’ personal data under Article 6 in the context of delivering its Facebook and Instagram services (including behavioural advertising). Having previously relied on the consent of users to the processing of their personal data, the company now sought to rely on the “contract” legal basis for most (but not all) of its processing operations. Existing and new users were required to click “I accept” to indicate their acceptance of the updated Terms of Service in order to continue using Facebook and Instagram. The services would not be accessible if users declined to do so.

Meta Ireland considered that, on accepting the updated Terms of Service, a contract was concluded between itself and the user. Consequently, the processing of the user’s personal data in connection with the delivery of its Facebook and Instagram services was necessary for the performance of this “contract”, which included the provision of personalised services and behavioural advertising. This, it claimed, provided a lawful basis by reference to Article 6(1)(b) of the GDPR.

The complainants contended that Meta Ireland was in fact still looking to rely on consent to provide a lawful basis for its processing of users’ data. They argued that, by making the accessibility of its services conditional on users accepting the updated Terms of Service, Meta Ireland was in fact “forcing” them to consent to the processing of their personal data for behavioural advertising and other personalised services. This was not real consent as defined in Article 4 of GDPR:

“any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her;” (our emphasis)

Following comprehensive investigations, consultation with other EU DP regulators (a process required by GDPR in such cases) and final rulings by the European Data Protection Board, the DPC made a number of findings; notably:

1. Meta Ireland did not provide clear information about its processing of users’ personal data, resulting in users having insufficient clarity as to what processing operations were being carried out on their personal data, for what purpose(s), and by reference to which of the six legal bases identified in Article 6. The DPC said this violated Articles 12 (transparency) and 13(1)(c) (information to be provided to the data subject) of GDPR. It also considered it to be a violation of Article 5(1)(a), which states that personal data must be processed lawfully, fairly and transparently.

2. Meta Ireland cannot rely on the contract legal basis for justifying its processing. The delivery of personalised advertising (as part of the broader suite of personalised services offered as part of the Facebook and Instagram services) could not be said to be necessary to perform the core elements of what was said to be a much more limited form of contract. The DPC adopted this position following a ruling by the EDPB, which agreed with other EU regulators’ representations to the DPC.

In addition to the fines, Meta Ireland has been directed to ensure its data processing operations comply with GDPR within a period of three months. It has said it will appeal, which is not surprising considering the decision has the potential to require it to make costly changes to its personalised advertising-based business in the European Union, one of its largest markets. 

It is important to note that this decision still allows Meta to use non-personal data (such as the content of a story) to personalise adverts or to ask users to give their consent to targeted adverts. However under GDPR users should be able to withdraw their consent at any time.  If a large number do so, it would impact one of the most valuable parts of Meta’s business. 

The forthcoming appeals by Meta will provide much needed judicial guidance on the GDPR, in particular Principle 1. Given the social media giant’s deep pockets, expect this one to run and run.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

UK GDPR Reform: Will There Be A New Consultation?

What is happening with the Government’s proposal for UK GDPR reform? Just like Donald Trump’s predicted “Red Wave” in the US Mid Term Elections, it’s turning out to be a bit of a ripple!

In July, the Boris Johnson Government published the Data Protection and Digital Information Bill. This was supposed to be the next step in its much publicised plans to reform the UK Data Protection regime following Brexit. The government projected it would yield savings for businesses of £1 billion over ten years. (Key provisions of the bill are summarised in our blog post here.)

On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”.

The Bill’s passage through Parliament was suspended. It seemed that drafters would have to go back to the drawing board to showcase even more “Brexit benefits”. There was even talk of another round of consultation. Remember the Bill is the result of an extensive consultation launched in September 2021 (“Data: A New Direction”).

Last week, Ibrahim Hasan attended the IAPP Conference in Brussels. Owen Rowland, Deputy Director at the DCMS, told the conference that the latest “consultation” on the stalled bill will begin shortly. However, he confirmed it will not be a full-blown public consultation:

“It’s important to clarify (the type of consultation). However, we are genuinely interested in continuing to engage with the whole range of stakeholders. Different business sectors as well as privacy and consumer groups,” Rowland said. “We’ll be providing details in the next couple of weeks in terms of the opportunities that we are going to particularly set up.”

The Bill may not receive a deep overhaul, but Rowland said he welcomes comments that potentially raise “amendments to (the existing proposal’s) text that we should make.” He added the consultation is being launched to avoid “a real risk” of missing important points and to make use of “opportunities we’re not fully utilising” to gain stakeholder insights.

Rowland went on to suggest that the DCMS would conduct some roundtables. If any of our readers are invited to the aforementioned tables (round or otherwise) do keep us posted. Will it make a difference to the content of the bill? We are sceptical but time will tell. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

AI and Data Protection: Is ‘Cortana’ such a problem?

‘AI’, or ‘Machine Learning’ as it is also known, is becoming more prevalent in the working environment. From ‘rogue algorithms’ upsetting GCSE gradings to Microsoft 365 judging you for only working on one document all day, we cannot escape the fact that there are more ‘automated’ services than ever before.  

For DPOs, records managers and IG officers, this poses some interesting challenges to the future of records, information and personal data.  

I was asked to talk about the challenges of AI and machine learning at a recent IRMS Public Sector Group webinar. In the session, titled ‘IRM challenges of AI & something called the metaverse’, we looked at a range of issues, some of which I’d like to touch on below. While I remain unconvinced the ‘metaverse’ is going to arrive any time soon, AI and algorithms are very much here and growing fast.  

From a personal data and privacy point of view, we know that algorithms guide our online lives, from what adverts we see to what posts come up in our feeds on Twitter, Facebook etc. Is it not creepy that these algorithms may know more about me than I do? And how far does that go before it has mass implications for everyday citizens? What happens if the ‘social algorithm’ works out your sexuality before you or your family have? I work with families that still to this day will abuse and cast out those who are LGBTQ+, so imagine the damage a ‘we thought you’d like this’ post could do.  

Interesting questions have been posed on Twitter about ‘deep fake’ videos and whether they are personal data. The answers are mixed and pose some interesting implications for the future. Can you imagine the impact if someone can use AI to generate a video of you doing something you are not meant to do? That’s going to take some doing to undo, by which time the damage is done. If you want to see this in action, I’d recommend watching Season 2 of ‘The Capture’ on BBC iPlayer. 

In an organisational context, if organisations are to use algorithms to help with workload and efficient services, it is simple logic that the algorithm must be up to scratch. As the Borg Queen (a cybernetic alien) in Star Trek: First Contact once said to Data (a self-aware android), “you are an imperfect being, created by an imperfect being. Finding your weakness is only a matter of time”. If anyone can find me a perfectly designed system that doesn’t have process issues, bugs and reliability issues, do let me know.  

Many data scientists and other leading data commentators like Cathy O’Neil frequently state that “algorithms are basically opinions embedded in code”. And opinions bring with them biases, shortcomings and room for error.  
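To make that point concrete, here is a deliberately crude, entirely hypothetical scoring sketch. The signals, weights and threshold are all invented; each one is a human judgement that the code then applies at scale:

```typescript
// A hypothetical, deliberately simplified scoring model. Every number in it
// is a design decision made by a human: which signals to use, how to
// weight them, and where to draw the approval line.
interface Applicant {
  yearsInCurrentJob: number;
  postcodeRiskBand: number; // 1 (low) to 5 (high): itself an opinion about places
  missedPayments: number;
}

// The weights are opinions. Why should postcode count for this much?
// Whoever chose these numbers has embedded a view of the world in the code.
function score(a: Applicant): number {
  return a.yearsInCurrentJob * 10 - a.postcodeRiskBand * 25 - a.missedPayments * 40;
}

// The threshold is another opinion; moving it changes who gets refused.
const APPROVAL_THRESHOLD = 0;

const applicant: Applicant = { yearsInCurrentJob: 3, postcodeRiskBand: 4, missedPayments: 1 };
console.log(score(applicant) >= APPROVAL_THRESHOLD ? "approved" : "refused");
```

Change the postcode weight or move the threshold and a different set of people is refused; the ‘opinion’ lives in those numbers.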

Now, that is not to say that these things do not have their advantages – they very much do. However, in order to get something good out of them you need to ensure good stuff goes into them and good stuff helps create them. Blindly building and feeding a machine just because it’s funky and new, as we have all seen time and again, always leads to trouble.  

Olu and I attended (me virtually and Olu in person) the launch of the AI Standards Hub by the Alan Turing Institute (and others). It is a fascinating initiative by the ATI and others, including the UK Government.  

Now why am I talking about this at the moment? Put simply, as I mentioned above, this technology is here and is not going anywhere. Take a look at this company offering live editing of your voice, and you may even find this conversation between a Google engineer and an AI quite thought-provoking and sometimes scary. If anything, AI is evolving at an ever-growing pace. Therefore information professionals from all over the spectrum need to be aware of how it works, how it can be used in their organisations, and how they can upskill to challenge and support it.  

In recent times the ICO has been publishing a range of guidance on this, including a relatively detailed guide on how to use AI while considering the data protection implications. While it’s not a user manual, it does give some key points to consider and steps to go through. 

Right Alexa, end my blog! Oh, I mean, Hey Siri, end my blog… Darn… OK Google…

If you are interested in learning more about the IRM and DP challenges of ‘AI’ and upskilling as a DPO, Records or Information Governance Manager, then check out Scott’s workshop, Artificial Intelligence and Machine Learning: How to Implement Good Information Governance. Book your place for 17th November now. 

ICO Reprimand for Misuse of Children’s Data: A Proportionate Response or a Let Off?

Last week, the Department for Education (DfE) received a formal reprimand from the Information Commissioner’s Office (ICO) over a “serious breach” of the GDPR involving the unauthorised sharing of up to 28 million children’s personal data. But the Department has avoided a fine, despite a finding of “woeful” data protection practices.

The reprimand followed the ICO’s investigation into the sharing of personal data stored on the Learning Records Service (LRS) database, for which the DfE is the Data Controller. LRS provides a record of pupils’ qualifications that education providers can access. It contains both personal and Special Category Data and at the time of the incident there were 28 million records stored on it. Some of those records would have pertained to children aged 14 and over. 

The ICO started its investigation after receiving a breach report from the DfE about the unauthorised access to the LRS database. The DfE had only become aware of the breach after an exposé in a national Sunday newspaper.

The ICO found that the DfE’s poor due diligence meant that it continued to grant database access to Trustopia after the company advised the DfE that it was the new trading name for Edududes Ltd, a former training provider. Trustopia was in fact a screening company and used the database to provide age verification services to help gambling companies confirm customers were over 18. The ICO ruled that the DfE failed to:

  • protect against the unauthorised processing by third parties of data held on the LRS database for reasons other than the provision of educational services. Data Subjects were unaware of the processing and could not object or otherwise withdraw from it. The DfE therefore failed to process the data fairly and lawfully in accordance with Article 5(1)(a). 
  • have appropriate oversight to protect against unauthorised processing of personal data held on the LRS database, and failed to ensure its confidentiality in accordance with Article 5(1)(f). 

The ICO conducted a simultaneous investigation into Trustopia, during which the company confirmed it no longer had access to the database and the cache of data held in temporary files had been deleted. Trustopia was dissolved before the ICO investigation concluded and therefore regulatory action was not possible.

The DfE has been ordered to implement the following measures to improve its compliance: 

  1. Improve transparency around the processing of the LRS database so Data Subjects are aware and are able to exercise their Data Subject rights, in order to satisfy the requirements of Article 5(1)(a) of the UK GDPR. 
  2. Review all internal security procedures on a regular basis to identify any additional preventative measures that can be implemented. This would reduce the risk of a recurrence of this type of incident and assist compliance with Article 5(1)(f) of the UK GDPR. 
  3. Ensure all relevant staff are made aware of any changes to processes as a result of this incident, by effective communication and by providing clear guidance. 
  4. Complete a thorough and detailed Data Protection Impact Assessment, which adequately assesses the risk posed by the processing. This will enable the DfE to identify and mitigate the data protection risks for individuals. 

This investigation could, and many would say should, have resulted in a fine. However, in June 2022 John Edwards, the Information Commissioner, announced a new approach towards the public sector with the aim of reducing the impact of fines on the sector. Had this new trial approach not been in place, the DfE would have been issued with a fine of over £10 million. In a statement, John Edwards said:

“No-one needs persuading that a database of pupils’ learning records being used to help gambling companies is unacceptable. Our investigation found that the processes put in place by the Department for Education were woeful. Data was being misused, and the Department was unaware there was even a problem until a national newspaper informed them.

“We all have an absolute right to expect that our central government departments treat the data they hold on us with the utmost respect and security. Even more so when it comes to the information of 28 million children.

“This was a serious breach of the law, and one that would have warranted a £10 million fine in this specific case. I have taken the decision not to issue that fine, as any money paid in fines is returned to government, and so the impact would have been minimal. But that should not detract from how serious the errors we have highlighted were, nor how urgently they needed addressing by the Department for Education.”

The ICO also followed its new public sector enforcement approach when issuing a reprimand to NHS Blood and Transplant Service. In August 2019, the service inadvertently released untested development code into a live system for matching transplant list patients with donated organs. This error led to five adult patients on the non-urgent transplant list not being offered transplant livers at the earliest possible opportunity. The ICO said that, if the revised enforcement approach had not been in place, the service would have received a fine of £749,856. 

Some would say that the DfE has got off very lightly here and, given its past record, perhaps more stringent sanctions should have been imposed. Two years ago, the ICO criticised the DfE for secretly sharing children’s personal data with the Home Office, triggering fears it could be used for immigration enforcement as part of the government’s hostile environment policy. 

Many will question why the public sector merits this special treatment. It is not as if it has been the subject of a disproportionate number of fines. The first fine to a public authority was only issued in December 2021 (more than three and a half years after GDPR came into force) when the Cabinet Office was fined £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients online. This was recently reduced to £50,000 following a negotiated settlement of a pending appeal.

Compare the DfE reprimand with last month’s Monetary Penalty Notice in the sum of £1,350,000 issued to a private company, Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical conditions and then, without their consent, targeting them with health-related products. With austerity coming back with a vengeance, no doubt the private sector will question the favourable terms for the public sector. 

Perhaps the Government will come to the private sector’s rescue. Following the new DCMS Secretary of State’s speech last month, announcing a plan to replace the UK GDPR with a new “British data protection system” which cuts the “burdens” for British businesses, DCMS officials have said further delays to the Data Protection and Digital Information Bill are on the way. A new public consultation will be launched soon.

So far the EU is not impressed. A key European Union lawmaker has described meetings with the U.K. government over the country’s data protection reform plans as “appalling.” Italian MEP Fulvio Martusciello from the center-right European People’s Party said his impression from the visit was that Britain is “giving in on privacy in exchange for business gain.”

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 21st November. 

Back To The Future For UK GDPR?

On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”. Just as we are all getting to grips with the (relatively new) UK GDPR, do we want more change and uncertainty? How did we get here? Let’s recap.

In July the Government, led by Boris Johnson (remember him?), published the Data Protection and Digital Information Bill. This was supposed to be the next step in its much publicised plans to reform the UK Data Protection regime following Brexit (remember that?). 

In the Government’s response to the September 2021 data protection reform consultation (“Data: A New Direction”) it said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” To achieve this, the new Bill proposed amendments to existing UK data protection legislation, in particular the UK GDPR. On further analysis, the bill was more of a “tinkering” with GDPR than a wholesale change; although the government projected it would yield savings for businesses of £1 billion over ten years. (Key provisions of the bill are summarised in our blog post here.)

Following the Bill, we had a new Prime Minister. Nadine Dorries at the DCMS was replaced by Michelle Donelan. The new bill’s passage in Parliament was suspended with a promise to re-introduce it. Now it seems that we could have a new piece of legislation altogether. 

The headline of Donelan’s speech was that the Truss Government would replace GDPR with “our own business- and consumer-friendly British data protection system”. She said it will be “co-design[ed] with business …” But the devil is in the detail, or lack thereof.  

Donelan’s speech also contained the usual compulsory myths about GDPR and tired data protection law clichés (the old ones are the best!). She regurgitated complaints highlighted by Oliver Dowden when he was at the DCMS:

“We’ve even had churches write to the department, pleading for us to do something, so that they can send newsletters out to their communities without worrying about breaching data rules.”

And plumbers and electricians are also finding GDPR a problem:

“No longer will our businesses be shackled by unnecessary red tape. At the moment, even though we have shortages of electricians and plumbers, GDPR ties them in knots with clunky bureaucracy.” 

So that’s why my electrician doesn’t turn up. He is busy drafting a GDPR compliant privacy notice!

Donelan even claimed that “researchers at Oxford University estimated that it has directly caused businesses to lose over 8% of their profits.” This is selective quoting at best. (Andy Crow has done a great post on this.)

Will this “growth” focused reform of UK data protection rules risk the UK’s adequacy status with the EU? It depends on the final text of the law “co-designed” by business. ; )

7th Nov 2022 Update: Today we hear that the EU is not impressed. A key European Union lawmaker has described meetings with the U.K. government over the country’s data protection reform plans as “appalling.” Italian MEP Fulvio Martusciello from the center-right European People’s Party said his impression from the visit was that Britain is “giving in on privacy in exchange for business gain.”

Data protection practitioners should not burn their UK GDPR Handbook just yet! If you have been following the events of the last few days, you might suspect that we may have a new Prime Minister soon and/or a General Election. This will mean that the pause button on data protection reform could be pressed again. To repeat the phrase of the past year, “We live in uncertain times!”

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 21st November. 

£1.35 Million GDPR Fine for Catalogue Retailer

On 5th October, the Information Commissioner’s Office (ICO) issued a GDPR Monetary Penalty Notice in the sum of £1,350,000 to Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical conditions and then, without their consent, targeting them with health-related products.

This latest ICO fine is interesting but not because of the amount involved. There have been much higher fines. In October 2020, British Airways was fined £20 million for a cyber security breach which saw the personal and financial details of more than 400,000 customers being accessed by hackers. This, like most of the other ICO fines, involved a breach of the security provisions of GDPR. In the Easylife fine, the ICO focussed on the more interesting GDPR provisions (from a practitioner’s perspective) relating to legal basis, profiling and transparency. 

The background to the fine is that a telemarketing company, which was conducting marketing calls for Easylife, was being investigated by the ICO for promoting funeral plans during the pandemic. This led to an investigation into Easylife itself, initially concerning potential contraventions of the Privacy and Electronic Communications Regulations (PECR). That investigation raised concerns about potential contraventions of GDPR, which the Commissioner then investigated separately.

The ICO investigation found that when a customer purchased a product from Easylife’s Health Club catalogue, the company would make assumptions about their medical condition and then market health-related products to them without their consent. For example, if a person bought a jar opener or a dinner tray, Easylife would use that purchase data to assume that person has arthritis and then call them to market glucosamine joint patches.

Special Category Data and Profiling

Article 4(4) of the GDPR defines profiling:
“‘profiling’ means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements;”

Out of 122 products in Easylife’s Health Club catalogue, 80 were considered to be ‘trigger products’. Once these products were purchased, Easylife would target the customers with a health-related item. The ICO found that significant profiling of customers was taking place. 
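The ICO’s notice does not reproduce Easylife’s systems, but rule-based profiling of this kind needs nothing more sophisticated than a lookup table. A hypothetical sketch, with product names, inferred conditions and follow-up offers invented for illustration:

```typescript
// Hypothetical sketch of trigger-product profiling. The products, inferred
// conditions and follow-up offers are illustrative, not Easylife's actual rules.
const triggerProducts: Record<string, { inferredCondition: string; followUpOffer: string }> = {
  "jar opener":  { inferredCondition: "arthritis", followUpOffer: "glucosamine joint patches" },
  "dinner tray": { inferredCondition: "arthritis", followUpOffer: "glucosamine joint patches" },
};

// Automated use of purchase history to predict an aspect of a person's
// health: exactly the kind of processing Article 4(4) describes as profiling.
function planFollowUpCalls(purchases: string[]): string[] {
  return purchases
    .filter((item) => item in triggerProducts)
    .map((item) => triggerProducts[item].followUpOffer);
}

console.log(planFollowUpCalls(["jar opener", "kettle"])); // ["glucosamine joint patches"]
```

However simple the rules, the output is a prediction about a person’s health, which is why the analysis below turns on Special Category Data.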

Easylife’s use of customer transaction data to infer that a customer probably had a particular health condition amounted to processing Special Category Data. Articles 6 and 9 of the GDPR provide that such data may not be processed unless both a lawful basis under Article 6 and a condition under Article 9 apply. The only relevant Article 9 condition in the context of Easylife’s health campaign was explicit consent. Easylife did not collect consent to process Special Category Data, instead relying on legitimate interests (based on its privacy notice) under Article 6. As a result, it had no lawful basis to process the data, in contravention of Articles 6 and 9 of the GDPR. 

Invisible Processing

Furthermore, the ICO concluded that ‘invisible’ processing of health data took place. It was ‘invisible’ because Easylife’s customers were unaware that the company was collecting and using their personal data for profiling/marketing purposes. In order to process this data lawfully, Easylife would have had to collect explicit consent from the customers and update its privacy policy to indicate that Special Category Data was to be processed by consent. Easylife’s omission to do this was a breach of Article 13(1)(c) of the GDPR.

John Edwards, UK Information Commissioner, said:

“Easylife was making assumptions about people’s medical condition based on their purchase history without their knowledge, and then peddled them a health product – that is not allowed.

The invisible use of people’s data meant that people could not understand how their data was being used and, ultimately, were not able to exercise their privacy and data protection rights. The lack of transparency, combined with the intrusive nature of the profiling, has resulted in a serious breach of people’s information rights.”

One other ICO monetary penalty notice has examined these issues in detail. In May 2022, Clearview AI was fined £7,552,800 following an investigation into its online database containing 20 billion images of people’s faces scraped from the internet. 

As Jon Baines pointed out (thanks Jon!), on the Jiscmail bulletin board, a large chunk of the online programmatic advertising market also profiles people and infers Special Category Data in the same way as Easylife. This was highlighted in the ICO’s 2019 report. The ICO said in January last year that it was resuming its Adtech investigation, but there has been very little news since then.

GDPR was not the only cause of Easylife’s woes. It was also fined £130,000 under PECR for making 1,345,732 direct marketing calls to people registered with the Telephone Preference Service (TPS).

This case also shows the importance of organisations only using telephone marketing companies that understand and comply with GDPR and PECR. If they do not, the ICO enforcement spotlight may also fall on the clients of such companies.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 25th October. 

The Data Protection and Digital Information Bill: A new UK GDPR?

In July the Government published the Data Protection and Digital Information Bill, the next step in its much publicised plans to reform the UK Data Protection regime following Brexit. 

In the Government’s response to the September 2021 consultation (“Data: A New Direction”) it said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” To achieve this, the new Bill proposes substantial amendments to existing UK data protection legislation; namely the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). There is no shiny new Data Protection Act 2022 or even a new colour for the UK GDPR! Perhaps a missed opportunity to showcase the benefits of Brexit! 

In addition to reforming core data protection law, the Bill deals with certification of digital identity providers, electronic registers of births and deaths and information standards for data-sharing in the health and adult social care system. The notable DP provisions are set out below.

Amended Definition of Personal Data

Clause 1 of the Bill limits the scope of personal data to:

  • where the information is identifiable by the controller or processor by reasonable means at the time of the processing; or
  • where the controller or processor ought to know that another person will likely obtain the information as a result of the processing and the individual will likely be identifiable by that person by reasonable means at the time of the processing.

This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. It could make it easier for organisations to achieve data anonymisation as they would no longer need to concern themselves with potential future identifiability, with the focus instead being on identifiability “at the time of the processing”. On the other hand, the change does not address the risk of indirect identification.

Vexatious Data Subject Requests

Article 12 of the UK GDPR allows controllers to refuse to comply with data subject rights requests (or charge a fee) when the requests are “manifestly unfounded” or “excessive”.  Clause 7 of the Bill proposes to replace this with “vexatious” or “excessive”. Examples of vexatious requests given in the Bill are those requests intended to cause distress, not made in good faith, or that are an abuse of process. All these could easily fit into “manifestly unfounded” and so it is difficult to understand the need for change here. 

Data Subject Complaints

Currently, the UK GDPR allows a data subject to complain to the Information Commissioner, but nothing expressly deals with whether or how they can complain to a controller. Clause 39 of the Bill would make provision for this and require the controller to acknowledge receipt of such a complaint within 30 days and respond substantively “without undue delay”. However, under clause 40, if a data subject has not made a complaint to the controller, the ICO is entitled not to accept the complaint.

Much was made about “privacy management programmes” in the Government’s June announcement. These are not expressly mentioned in the Bill but most of the proposals that were to have fallen under that banner are still there (see below).

Senior Responsible Individuals

As announced in June, the obligation for some controllers and processors to appoint a Data Protection Officer (DPO) is proposed to be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals, are required (by clause 14) to designate a senior manager as a “Senior Responsible Individual”. Just like the DPO, the SRI must be adequately resourced and cannot be dismissed for performing their tasks under the role. The requirement for them to be a senior manager (rather than just reporting to senior management, as current DPOs must) will cause problems for those organisations currently using outsourced DPO services.

ROPAs and DPIAs

The requirement for Records of Processing Activities (ROPAs) will also go. Clause 15 of the Bill proposes to replace it with a leaner “Record of Processing of Personal Data”.  Clause 17 will replace Data Protection Impact Assessments (DPIAs) with leaner and less prescriptive Assessments of High Risk Processing. Clause 18 ensures that controllers are no longer required, under Article 36 of the UK GDPR, to consult the ICO on certain high risk DPIAs.

Automated Decision Making

Article 22 of UK GDPR currently confers a “right” on data subjects not to be subject to automated decision making which produces legal effects or otherwise significantly affects them. Clause 11 of the Bill reframes Article 22 in terms of a positive right to human intervention. However, it would only apply to “significant” decisions, rather than decisions that produce legal effects or similarly significant effects. It is unclear whether this will make any practical difference. 

International Transfers 

The judgment of the European Court of Justice (ECJ) in “Schrems II” not only stated that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal transfer tool. It also said that in any international data transfer situation, whether to the USA or other countries, the data exporter needs to make a complex assessment of the recipient country’s data protection legislation to ensure that it adequately protects the data, especially from access by foreign security agencies (a Transfer Impact Assessment or TIA).  

The Bill amends Chapter 5 of the UK GDPR (international transfers) with the introduction of the “data protection test” for the above-mentioned assessment. This would involve determining if the standard of protection provided for data subjects in the recipient country is “not materially lower” than the standard of protection in the UK. The new test would apply both to the Secretary of State, when making “adequacy” determinations, and to controllers, when deciding whether to transfer data. The explanatory notes to the Bill state that the test would not require a “point-by-point comparison” between the other country’s regime and the UK’s. Instead, an assessment will be “based on outcomes i.e. the overall standard of protection for a data subject”. 

An outcomes-based approach will be welcomed by organisations which regularly transfer personal data internationally, especially where the data is of no practical interest to foreign security agencies. However, this proposed approach will attract the attention of the EU (see later). (See also our forthcoming International Transfers webinar.)

The Information Commission

Under clause 100 of the Bill, the Information Commissioner’s Office will transform into the Information Commission, a corporate body with a chief executive (presumably John Edwards, the current Commissioner). 

The Commission would have a principal function of overseeing data protection alongside additional duties such as to have regard to the desirability of promoting innovation; the desirability of promoting competition; the importance of the prevention, investigation, detection and prosecution of criminal offences; and the need to safeguard public security and national security. New powers for the Commission include an audit/assessment power (clause 35) to require a controller to appoint a person to prepare and provide a report and to compel individuals to attend for interviews (clause 36) in civil and criminal investigations.

The Bill also proposes to abolish the Surveillance Camera Commissioner and the Biometrics Commissioner.

Privacy and Electronic Communications (EC Directive) Regulations 2003 

Currently, under PECR, cookies (and similar technologies) can only be used to store or access information on end user terminal equipment without express consent where it is “strictly necessary” e.g. website security or proper functioning of the site. The Bill proposes allowing cookies to be used without consent for the purposes of web analytics and to install automatic software updates (see the GDPR enforcement cases involving Google Analytics). 
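In practice, the current rule means a site must gate its analytics behind an opt-in. A minimal browser-side sketch of that gating, in which the consent cookie name and the analytics URL are our own placeholders rather than any real API, might look like this:

```typescript
// Minimal sketch of PECR-style cookie gating in the browser. The cookie
// name and loadAnalytics() helper are invented for illustration.
function hasAnalyticsConsent(): boolean {
  // "Strictly necessary" cookies (e.g. session security) need no consent,
  // but analytics currently does, so check for a stored opt-in first.
  return document.cookie.split("; ").includes("analytics_consent=granted");
}

function loadAnalytics(): void {
  const tag = document.createElement("script");
  tag.src = "https://analytics.example.com/tag.js"; // placeholder URL
  document.head.appendChild(tag);
}

// Today: analytics only runs after an opt-in. If the Bill's change is
// enacted as drafted, this gate could lawfully be dropped for web analytics.
if (hasAnalyticsConsent()) {
  loadAnalytics();
}
```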

Another notable proposed change to PECR, involves extending “the soft opt-in” to electronic communications from organisations other than businesses. This would permit political parties, charities and other non-profits to send unsolicited email and SMS direct marketing to individuals without consent, where they have an existing supporter relationship with the recipient. 

Finally on PECR, the Bill proposes to increase the fines for infringement from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher). 
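The “whichever is higher” mechanic mirrors the UK GDPR’s own maximum and is simple to illustrate (the turnover figure below is made up):

```typescript
// Illustrative calculation of the proposed PECR maximum penalty:
// the greater of £17.5m and 4% of global annual turnover.
function maxPecrPenalty(globalAnnualTurnover: number): number {
  const FIXED_CAP = 17_500_000;
  return Math.max(FIXED_CAP, 0.04 * globalAnnualTurnover);
}

// For a business with £1bn global turnover, the ceiling is £40m, not £17.5m.
console.log(maxPecrPenalty(1_000_000_000)); // 40000000
```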

Business Data

The Bill would give the Secretary of State and the Treasury the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. “Customers” would not just be data subjects, but anyone who purchased (or received for free) goods, services or digital content from a trader in a consumer (rather than business) context. “Business data” would include information about goods, services and digital content supplied or provided by a trader. It would also include information about where those goods etc. are supplied, the terms on which they are supplied or provided, prices or performance and information relating to feedback from customers. Customers would potentially have a right to access their data, which might include information on the customer’s usage patterns and the price paid to aid personalised price comparisons. Similarly, businesses could potentially be required to publish, or otherwise make available, business data.

These provisions go much further than the existing data portability provisions in the UK GDPR, which do not guarantee provision of data in “real time”, do not cover wider contextual data, and do not apply where the customer is not an individual.

Adequacy?

The Bill is currently making its way through Parliament. The impact assessment reiterates that “the government’s view is that reform of UK legislation on personal data is compatible with the EU maintaining free flow of personal data from Europe.”  However, with the multiple amendments proposed in the Bill, the UK GDPR is starting to look quite different to the EU version. And the more the two regimes diverge, the more there is a risk that the EU might put a “spanner in the works” when the UK adequacy assessment is reviewed in 2024. Much depends on the balance struck in the final text of the Bill. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September. 
