The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security practices. A number of governments have now taken the view that the video sharing platform represents an unacceptable risk of enabling Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures.

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR sets out the general rule that, where a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child and is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”
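
For readers who build or audit consent-based sign-up flows, the Article 8 rule reduces to a simple decision. The following Python sketch is purely illustrative; the function and parameter names are our own invention, not drawn from the regulation or from TikTok’s systems:

```python
PARENTAL_CONSENT_AGE = 13  # Article 8(1) UK GDPR threshold

def consent_is_valid(user_age: int,
                     parental_consent_given: bool = False,
                     parental_consent_verified: bool = False) -> bool:
    """Illustrative Article 8 check for consent-based processing.

    A child of 13 or over may consent on their own behalf. For an
    under-13, consent must come from the holder of parental
    responsibility, and Article 8(2) requires the controller to make
    "reasonable efforts" to verify it, given available technology.
    """
    if user_age >= PARENTAL_CONSENT_AGE:
        return True
    return parental_consent_given and parental_consent_verified

# A 12-year-old's own consent is never sufficient:
print(consent_is_valid(12))                              # False
print(consent_is_valid(12, parental_consent_given=True,
                       parental_consent_verified=True))  # True
```

The ICO’s findings amount to saying that neither branch of this check was being applied reliably: ages were not established, and parental consent was neither sought nor verified.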

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform, and had failed to carry out adequate checks to identify and remove those underage users. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8, the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used and shared, in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is notable that this fine is less than half the £27 million proposed in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently, this potential infringement was not included in the final amount of the fine.

We have been here before! In 2019 British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine, issued in October 2020, was £20 million. Marriott International Inc was fined £18.4 million in 2020; much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced, from the notice to the final amount, suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to consider whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing the postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine, arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently, an appeal against the ICO’s fine of £1.35 million issued to Easylife Ltd was withdrawn after the parties reached an agreement whereby the amount of the fine was reduced to £250,000.

The Children’s Code

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed by Ireland’s Data Protection Commission on Meta), it is likely that more ICO regulatory action will follow.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates

Spring is around the corner, and what better way to celebrate than by learning something new? Act Now Training is offering a special Spring sale with 10% off all one-day courses until 21/04/23. Plus, we have some exciting discounts on our GDPR certificates!

Our one-day courses are designed to provide you with a comprehensive understanding of various information governance topics, including data protection, records management, FOI and information security. Whether you are a beginner or an experienced professional, our courses are tailored to meet your specific needs.

But that’s not all! We also have some exclusive discounts on our GDPR certificates. You can get a 10% discount on our NEW Intermediate Certificate in GDPR Practice course (valued at £195) and a mega £150 off our Advanced Certificate in GDPR Practice course.

Our Intermediate Certificate in GDPR Practice strengthens the foundations established by our GDPR Practitioner Certificate. Delegates will cover more challenging topics and gain a deeper awareness of the fundamental data protection principles. It is an excellent option for those with an established knowledge base and experience in data protection who wish to level up their knowledge and sharpen their skills.

Our Advanced Certificate in GDPR Practice course is perfect for those who want to take their GDPR knowledge to the next level. It covers the more complex aspects of GDPR and provides you with the practical skills needed to manage GDPR compliance effectively. You will learn how to break down complex, multi-faceted scenarios and analyse case law, MPNs, ICO reprimands and Enforcement Notices. This course is unlike any other; it challenges delegates with real-world complex scenarios and demands a much deeper level of knowledge and understanding.

Don’t miss this opportunity to enhance your information governance skills. To claim the discount, simply book your chosen course before 21/04/23 and enter the code SPRING10 at checkout; the relevant discount will be applied.

Act Now Launches New Intermediate Certificate in GDPR Practice 

Act Now Training is pleased to announce the launch of the Intermediate Certificate in GDPR Practice.  

For the past three years, we have been working on a skills and competency framework for DPOs. Alongside industry experts and education professionals, we have undertaken a thorough analysis of all the core skills and competencies required for the DPO role and how they map against our wider GDPR course curriculum. The latter has been designed to allow delegates to easily map their personal learning journey and ensure they have the requisite level of knowledge and skill for their role. 

Through this work, we have identified a need to further develop DPOs who have undertaken our GDPR Practitioner Certificate but who now wish to hone their DPO skills. Hence the emphasis in the intermediate certificate is on skills, as well as advanced knowledge, with delegates covering more challenging topics to gain a deeper awareness of the fundamental data protection principles.  

This new course is part of our ongoing commitment to encouraging the development of Information Governance as a recognised profession. Through our involvement in NADPO and the IRMS over the past 20 years, Act Now has been actively encouraging new entrants to the IG profession and providing quality training to assist in their learning and development. When the DP and IG Apprenticeship was launched last year, we became one of the first training companies to partner up with a leading apprenticeship provider to deliver specialist IG training and materials to apprentices.
This partnership has led to Damar recruiting over 100 apprentices and helping them lay the foundations for a successful career in IG.

Ibrahim Hasan, Course Director, said: 

“Having listened to the demand from our delegates and taken soundings from IG experts, we are excited to launch the Intermediate Certificate in GDPR Practice. This new course is a great option for those with an established knowledge base and experience in data protection who wish to level up their knowledge and sharpen their skills. It is also an ideal stepping stone for those contemplating our Advanced Certificate in GDPR Practice.” 

Content 

This new course will challenge delegates who have completed our GDPR Practitioner Certificate by looking at previously covered subjects in more depth and complexity. These include getting to grips with Principle 1 and interpreting lawfulness, fairness and transparency in the light of ICO and EU enforcement action. We also work through more complex subject access requests, applying exemptions and considering the practical aspects.

The course also covers new and topical data protection subjects such as the processing of children’s data, the use of AI and data ethics. We also take time to compare data protection laws around the world and consider the changes to the UK data protection regime proposed by the recently announced revised Data Protection and Digital Information Bill.

Format 

This course is set over three days (one day per week) and can be attended online or in the classroom. Each day is designed to develop delegates’ ability to understand and apply the UK DP law in a practical context using case studies and exercises. On completion of the course, delegates are required to complete a practical assessment within 30 days. 

Our teaching style is based on practical and engaging workshops covering theory alongside hands-on application. Delegates will have personal tutor support throughout the course and access to a comprehensive online resource lab ensuring they have the best opportunity for success. 

Who will this course benefit? 

This course is ideal for those who have completed our GDPR Practitioner Certificate who wish to sharpen their skills and knowledge before undertaking the Advanced Certificate in GDPR Practice. 

The first course is scheduled for April. We have a special introductory price for a limited period. If you would like a chat with Ibrahim to discuss your suitability for the course, please get in touch.  

Experian’s GDPR Appeal: Lawfulness, Fairness, and Transparency

On 20th February 2023, the First-Tier (Information Rights) Tribunal (FTT) overturned an Enforcement Notice issued against Experian by the Information Commissioner’s Office (ICO). 

This case relates to Experian’s marketing arm, Experian Marketing Services (EMS), which provides analytics services for direct mail marketing companies. It obtains personal data from three types of sources: publicly available sources, third parties and Experian’s credit reference agency (CRA) business. The company processes this personal data to build profiles about nearly every UK adult. An individual profile can contain over 400 data points. The company sells access to this data to marketing companies that wish to improve the targeting of their postal direct marketing communications.

The ICO issued an Enforcement Notice against Experian in April 2020, alleging several GDPR violations, namely Art. 5(1)(a) (Principle 1: lawfulness, fairness and transparency), Art. 6(1) (lawfulness of processing) and Art. 14 (information to be provided where personal data have not been obtained from the data subject).

Fair and Transparent Processing: Art 5(1)(a) 

The ICO criticised Experian’s privacy notice for being unclear and for not emphasising the “surprising” aspects of Experian’s processing. It ordered Experian to: 

  • Provide an up-front summary of Experian’s direct marketing processing. 
  • Put “surprising” information (e.g. regarding profiling via data from multiple sources) on the first or second layer of the notice. 
  • Use clearer and more concise language. 
  • Disclose each source and use of data and explain how data is shared, providing examples.  

The ICO also ordered Experian to stop using credit reference agency data (CRA data) for any purpose other than those requested by Data Subjects. 

Lawful Processing: Arts. 5(1)(a) and 6(1) 

All processing of personal data under the GDPR requires a legal basis. Experian processed all personal data held for marketing purposes on the basis of its legitimate interests, including personal data that was originally collected on the basis of consent. Before relying on legitimate interests, controllers must conduct a “legitimate interests assessment” to balance the benefits of the processing against the risks to Data Subjects. Experian had done this, but the ICO said the company had got the balance wrong. It ordered Experian to:

  • Delete all personal data that had been collected via consent and was subsequently being processed on the basis of Experian’s legitimate interests. 
  • Stop processing personal data where an “objective” legitimate interests assessment revealed that the risks of the processing outweigh the benefits. 
  • Review the GDPR compliance of all third parties providing Experian with personal data. 
  • Stop processing any personal data that has not been collected in a GDPR-compliant way. 

Transparency: Art. 14 

Art. 14 GDPR requires controllers to provide notice to data subjects when obtaining personal data from a third-party or publicly available source. Experian did not provide such notices, relying on the exceptions in Art. 14.

Where Experian had received personal data from third parties, it said that it did not need to provide a notice because “the data subject already has the information”. It noted that before a third party sent Experian personal data, the third party would provide Data Subjects with its own privacy notice. That privacy notice would contain links to Experian’s privacy notice.
Where Experian had obtained personal data from a publicly available source, such as the electoral register, it claimed that to provide a notice would involve “disproportionate effort”. 

The ICO did not agree that these exceptions applied to Experian, and ordered it to: 

  • Send an Art. 14 notice to all Data Subjects whose personal data had been obtained from a third-party source or (with some exceptions) a publicly available source. 
  • Stop processing personal data about Data Subjects who had not received an Art. 14 notice. 

The FTT Decision  

The FTT found that Experian committed only two GDPR violations: 

  • Failing to provide an Art. 14 notice to people whose data had been obtained from publicly available sources. 
  • Processing personal data on the basis of “legitimate interests” where that personal data had been originally obtained on the basis of “consent” (by the time of the hearing, Experian had stopped doing this). 

The FTT said that the ICO’s Enforcement Notice should have given more weight to:  

  • The costs of complying with the corrective measures. 
  • The benefits of Experian’s processing. 
  • The fact that Data Subjects would (supposedly) not want to receive an Art. 14 notice. 

The FTT overturned most of the ICO’s corrective measures. The only new obligation on Experian is to send Art. 14 notices in future to some people whose data comes from publicly available sources. 

FTT on Transparency 

Experian had improved its privacy notice before the hearing, and the FTT was satisfied that it met the Art. 14 requirements. It agreed that Experian did not need to provide a notice to Data Subjects where it had received their personal data from a third party. The FTT said that “…the reasonable data subject will be familiar with hyperlinks and how to follow them”.
People who wanted to know about Experian’s processing had the opportunity to learn about it via third-party privacy notices. 

However, the FTT did not agree with Experian’s reliance on the “disproportionate effort” exception. In future, Experian will need to provide Art. 14 notices to some Data Subjects whose personal data comes from publicly available sources. 

FTT on Risks of Processing 

An ICO expert witness claimed that Experian’s use of CRA data presented a risk to Data Subjects. The witness later admitted he had misunderstood this risk. The FTT found that Experian’s use of CRA data actually decreased the risk of harm to Data Subjects. For example, Experian used CRA data to “screen out” data subjects with poor credit history from receiving marketing about low-interest credit cards. The FTT found that this helped increase the accuracy of marketing and was therefore beneficial. As such, the FTT found that the ICO had not properly accounted for the benefits of Experian’s processing of CRA data. 

The ICO’s Planned Appeal 

The FTT’s decision focuses heavily on whether Experian’s processing was likely to cause damage or distress to Data Subjects. Because the FTT found that the risk of damage was low, Experian could rely on exceptions that might not have applied to riskier processing.  

The ICO has confirmed that it will appeal the decision. There are no details yet of its arguments, but it may claim that the FTT took an excessively narrow interpretation of privacy harms.

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next Advanced Certificate in GDPR Practice.

2023 IRMS Awards: Act Now nominated in all three categories

Act Now Training is pleased to announce that it has been nominated for the 2023 Information and Records Management Society (IRMS) awards in all three categories.

Each year the IRMS recognises excellence in the field of information management with their prestigious Industry Awards. These highly sought-after awards are presented at a glittering ceremony at the annual Conference following the Gala Dinner. In 2021 and 2022 Act Now won the Supplier of the Year award. We are hoping to make it three in a row!

For 2023 Act Now has been nominated for the following awards: 

  • Team of the Year
  • Innovation of the Year
  • Supplier of the Year

The first two nominations acknowledge our work with Damar Limited to deliver the new Data Protection and Information Governance Practitioner Level 4 Apprenticeship. Act Now Training has teamed up with Damar to produce materials and expert training to help ensure apprentices develop their skills and knowledge for a successful career in information governance.

Voting is open to IRMS members until Friday 24th March 2023.

The voting page is here: https://irms.org.uk/surveys/?id=2023Awards

Vote now for your favourite training company!

The New DP Reform Bill: What’s Changed?

On 8th March 2023, the UK Department for Science, Innovation and Technology (DSIT) published the Data Protection and Digital Information (No.2) Bill (“the new Bill”). If enacted, it will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

According to the DSIT press release, the Bill will result in a “new common-sense-led UK version of the EU’s GDPR [which] will reduce costs and burdens for British businesses and charities, remove barriers to international trade and cut the number of repetitive data collection pop-ups online.” It also claims that the reforms are “expected to unlock £4.7 billion in savings for the UK economy over the next 10 years.” How this figure has been calculated is not explained, but we have been here before! Remember the red bus?

How did we get here?

This is the second version of a bill designed to reform the UK data protection regime. In July 2022, the Government published the Data Protection and Digital Information Bill (“the previous Bill”). This was paused in September 2022 so ministers could engage in “a co-design process with business leaders and data experts” and move away from the “one-size-fits-all” approach of the European Union’s GDPR. On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, then the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”. Another full consultation round was expected but never materialised.

The previous Bill has now been withdrawn. We will provide analysis and updates on the new Bill, as it progresses through Parliament, over the coming months. An initial summary of the key proposals, both old and new, is set out below:

What remains the same from the original bill?

Many of the proposals in the new Bill are the same as those contained in the previous Bill. For a detailed analysis please read our previous blog post. Here is a summary:

  • Amended Definition of Personal Data: This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. 

  • Vexatious Data Subject Requests: The terms “manifestly unfounded” or “excessive” requests, in Article 12 of the UK GDPR, will be replaced with “vexatious” or “excessive” requests. Explanation and examples of such requests will also be included.

  • Data Subject Complaints: Data Controllers will be required to acknowledge receipt of Data Subject complaints within 30 days and respond substantively “without undue delay”. The ICO will be entitled not to accept a complaint if a Data Subject has not made a complaint to the controller first.

  • Data Protection Officer: The obligation for some controllers and processors to appoint a Data Protection Officer (DPO) will be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals will be required to designate a senior manager as a “Senior Responsible Individual”. 

  • Data Protection Impact Assessments: These will be replaced by leaner and less prescriptive “Assessments of High Risk Processing”. 

  • International Transfers: There will be a new approach to the test for adequacy applied by the UK Government to countries (and international organisations), and to the assessment Data Controllers make when carrying out a Transfer Impact Assessment (TIA). The threshold for this new “data protection test” will be whether a jurisdiction offers protection that is “not materially lower” than under the UK GDPR. (For more detail see our forthcoming International Transfers webinar.)

  • The Information Commission: The Information Commissioner’s Office will transform into the Information Commission; a corporate body with a chief executive.

  • Business Data: The Secretary of State and the Treasury will be given the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. 

  • PECR: Cookies will be allowed to be used without consent for the purposes of web analytics and to install automatic software updates. Furthermore, non-commercial organisations (e.g. charities and political parties) will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest. Finally, the maximum fine will increase from the current £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover, whichever is higher (see the illustration after this list).
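
To make the proposed “whichever is higher” PECR cap concrete, here is a minimal Python sketch. It is purely illustrative; the function name and the example turnover figure are ours, not taken from the Bill:

```python
def max_pecr_fine_gbp(global_annual_turnover_gbp: float) -> float:
    """Proposed PECR maximum penalty: the higher of a fixed £17.5m
    or 4% of global annual turnover (mirroring the UK GDPR cap)."""
    return max(17_500_000, 0.04 * global_annual_turnover_gbp)

# A business with £1bn global turnover would face a cap of £40m,
# because 4% of turnover exceeds the £17.5m fixed figure.
print(f"£{max_pecr_fine_gbp(1_000_000_000):,.0f}")  # £40,000,000
```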

What has changed?

The new Bill does not make any radical changes to the previous Bill; rather it clarifies some points and provides a bit more flexibility in other areas. The main changes are summarised below:

  • Scientific Research: The definition of scientific research is amended so that it now includes research for the purposes of commercial activity.
    This expands the circumstances in which processing for research purposes may be undertaken, providing a broader consent mechanism and exemption to the fair processing requirement.
  • Legitimate Interests: The previous Bill proposed that businesses could rely on legitimate interests (the Article 6 lawful basis) without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These “recognised” legitimate interests cover purposes such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement. The new Bill, whilst keeping the above changes, introduces a non-exhaustive list of cases where organisations may rely on the “legitimate interests” legal basis, including direct marketing, transferring data within the organisation for administrative purposes and ensuring the security of network and information systems; although a balancing exercise still needs to be conducted in these cases.

  • Automated Decision Making: The previous Bill clarified that its proposed restrictions on automated decision-making under Article 22 UK GDPR should only apply to decisions that are a result of automated processing without “meaningful human involvement”. The new Bill states that profiling will be a relevant factor in the assessment as to whether there has been meaningful human involvement in a decision. 

  • Records of Processing Activities (ROPA): The previous Bill streamlined the required content of ROPAs. The new Bill exempts all controllers and processors from the duty to maintain a ROPA unless they are carrying out high risk processing activities.

The Impact

The EU conducts a review of adequacy with the UK every four years; the next adequacy decision is due on 27th June 2025. Some commentators have suggested that the changes may jeopardise the UK’s adequate status and so impact the free flow of data between the UK and EU. We disagree. Although the Government states that the new Bill is “a new system of data protection”, it still retains the UK GDPR’s structure and fundamental obligations. Some tinkering around the edges is not really going to have much of an impact (see the helpful redline version of the new Bill produced by the good people at Hogan Lovells). Organisations that are already compliant with the UK GDPR will not be required to make any major changes to their systems and processes.

The new Bill has been introduced at the first reading stage. The second reading, due to be scheduled within the next few weeks, will be the first time the Government’s data protection reforms are debated in Parliament. We expect the Bill to be passed in a form similar to the one now published and to come into force later this year.

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next Advanced Certificate in GDPR Practice.

Rogue Employees and Personal Data

Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly:

(a) obtain or disclose personal data without the consent of the controller,

(b) procure the disclosure of personal data to another person without the consent of the controller, or

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained.

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998 which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. Two recent prosecutions highlight the willingness of the Information Commissioner’s Office (ICO) to use section 170 to make examples of individuals who seek to access/steal data from their employers for personal gain. 

In January, Asif Iqbal Khan pleaded guilty to stealing data of accident victims whilst working as a Customer Solutions Specialist for the RAC. Over a single month in 2019, the RAC had received 21 complaints from suspicious drivers who received calls from claims management companies following accidents in which the RAC had assisted.

A review of individuals that had accessed these claims found that Mr Khan was the only employee to access all 21. An internal investigation later reported suspicious behaviour by Mr Khan, including taking photos of his computer screen with his phone. When the ICO executed a search warrant, it seized two phones from Mr Khan and a customer receipt for £12,000. The phones contained photos of data relating to over 100 accidents.

Khan appeared at Dudley Magistrates Court in January 2023 where he pleaded guilty to two counts of stealing data in breach of Section 170 of the DPA 2018. He was fined £5,000 and ordered to pay a victim surcharge as well as court costs.

This is the second recent prosecution under Section 170. In August last year, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust, pleaded guilty to accessing medical records of patients without a valid legal reason.

An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000.

Of course, a S.170 prosecution would have a much greater deterrent effect if the available sanctions included a custodial sentence. Successive Information Commissioners have argued for this, but to no avail. This has led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990, which carries tougher sentences including a maximum of 2 years’ imprisonment on indictment. In July last year, a woman who worked for Cheshire Police pleaded guilty to using the police data systems to check up on ex-partners and, in August, the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of customers’ personal data from vehicle repair garages to generate potential leads for personal injury claims.

Employer Liability

If a disgruntled or rogue employee commits an offence under section 170, might their employer also be liable for the consequences?

In 2020, the Supreme Court ruled that as an employer, Morrisons Supermarket could not be held responsible when an employee, Andrew Skelton, uploaded a file containing the payroll data of thousands of Morrisons employees to a publicly accessible website as well as leaking it to several newspapers. The court decided that, whatever Skelton was doing when he disclosed his colleagues’ personal data, he was not acting “in the course of his employment”, and accordingly no vicarious liability could be imposed under the old Data Protection Act 1998.

However, Morrisons lost on the argument that the DPA 1998 operated so as to exclude vicarious liability completely. This principle can also be applied to the GDPR, so employers can “never say never” when it comes to vicarious liability for malicious data breaches by staff. It all depends on the facts of the breach.

This case only went as far as it did because the Morrisons employees failed to show, at first instance, that Morrisons was primarily liable for the data breach. If an employer fails to comply with its security obligations in a manner that is causally relevant to a rogue employee’s actions, it can still be exposed to primary liability under Article 32 of GDPR as well as the 6th Data Protection Principle which both impose obligations to ensure the security of personal data.

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next Advanced Certificate in GDPR Practice.

GDPR and AI: The Rise of the Machines

2023 is going to be the year of AI. In January, Microsoft announced a multibillion dollar investment in OpenAI, the company behind image generation tool Dall-E and the chatbot ChatGPT. The public sector is increasingly leveraging the power of AI to undertake administrative tasks and create more tailored services that meet users’ needs.

The term “artificial intelligence” or “AI” refers to the use of machines and computer systems that can perform tasks normally requiring human intelligence. The past few years have seen rapid progress in this field. Advancements in deep learning algorithms, cloud computing, and data storage have made it possible for machines to process and analyse large amounts of data quickly and accurately. AI’s ability to interpret human language means that virtual assistants, such as Siri and Alexa, can now understand and respond to complex spoken commands at lightning speed.

Local government is using AI to simplify staff scheduling, predict demand for services and even estimate the risk of an individual committing fraud. Healthcare providers are now able to provide automated diagnoses based on medical imaging data from patients, thereby reducing wait times.

The Risks

With any major technological advance there are potential risks and downsides. On Monday, ElevenLabs, an AI speech software company, said it had found an “increasing number of voice cloning misuse cases”. According to reports, hackers used the ElevenLabs software to create deepfake voices of famous people (including Emma Watson and Joe Rogan) making racist, transphobic and violent comments.

There are concerns about the impact of AI on employment and the future of work. In April 2021, the Court of Amsterdam ordered that Uber reinstate taxi drivers in the UK and Portugal who had been dismissed by “robo-firing”; the use of an algorithm to make a decision about dismissal with no human involvement. The Court concluded that Uber had made the decisions “based solely on automated processing” within the meaning of Article 22(1) of the GDPR. It was ordered to reinstate the drivers’ accounts and pay them compensation.

As well as ethical questions about the use of AI in decision-making processes that affect people’s lives, AI-driven algorithms may lead to unintended biases or inaccurate decisions if not properly monitored and regulated. In 2021 the privacy pressure group, NOYB, filed a GDPR complaint against Amazon, claiming that Amazon’s algorithm discriminates against some customers by denying them the opportunity to pay for items by monthly invoice.

There is also a risk that AI is deployed without consideration of the privacy implications. In May 2022, the UK Information Commissioner’s Office fined Clearview AI Inc more than £7.5 million for breaches of GDPR. Clearview’s online database contains 20 billion images of people’s faces, together with data scraped from publicly available information on the internet and social media platforms all over the world. The company, which describes itself as the “World’s Largest Facial Network”, allows customers, including the police, to upload an image of a person to its app, which then uses AI to check it against all the images in the Clearview database. The app then provides a list of matching images with links to the websites they came from.

Practical Steps

Recently, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants are subjected to any harms or financial detriment as a result of the use of algorithms. It did though emphasise a number of practical steps that local authorities and central government can take when using algorithms or AI:

1. Take a data protection by design and default approach

Data processed using algorithms, data analytics or similar systems should be reactively and proactively reviewed to ensure it is accurate and up to date. If a local authority decides to engage a third party to process personal data using algorithms, data analytics or AI, the authority remains responsible for assessing that the third party is competent to process personal data in line with the UK GDPR.

2. Be transparent with people about how you are using their data

Local authorities should regularly review their privacy policies, to ensure they comply with Articles 13 and 14, and identify areas for improvement. They should also bring any new uses of individuals’ personal data to their attention.

3. Identify the potential risks to people’s privacy

Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. 

In April 2021, the European Commission presented its proposal for a Regulation to harmonise rules for AI, also known as the “AI Act” of the European Union. Whilst there is still a long way to go before this proposal becomes legislation, it could create an impetus for the UK to further regulate AI.

Use of AI has enormous benefits. It does though have the potential to adversely impact people’s lives and deny their fundamental rights. As such, understanding the implications of AI technology and how to use it in a fair and lawful manner is critical for data protection and information governance officers.

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents focussing on GDPR as well as other information governance and records management issues. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

Mega GDPR Fines for Meta

On 4th January 2023, Ireland’s Data Protection Commission (DPC) announced the conclusion of two inquiries into the data processing operations of Meta Platforms Ireland Limited (“Meta Ireland”) in connection with the delivery of its Facebook and Instagram services. Not only does this decision significantly limit Meta’s ability to gather information from its users to tailor and sell advertising, it also provides useful insight into EU regulators’ view about how to comply with Principle 1 of GDPR i.e. the need to ensure personal data is “processed lawfully, fairly and in a transparent manner in relation to the data subject”(Article 5).

In decisions dated 31st December 2022, the DPC fined Meta Ireland €210 million and €180 million, relating to its Facebook and Instagram services respectively. The fines were imposed in connection with the company’s practice of monetising users’ personal data by running personalised adverts on their social media accounts. Information about a social media user’s digital footprint, such as what videos prompt them to stop scrolling or what types of links they click on, is used by marketers to get personalised adverts in front of people who are the most likely to buy their products. This practice helped Meta generate $118 billion in revenue in 2021.

The DPC’s decision was the result of two complaints from Facebook and Instagram users, supported by privacy campaign group NOYB, both of which raised the same basic issue: how Meta obtains legal permission from users to collect and use their personal data for personalised advertising. Article 6(1) of GDPR states that:

“Processing shall be lawful only if and to the extent that at least one of the following applies:

  (a) the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
  (b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;”

In advance of the GDPR coming into force on 25th May 2018, Meta Ireland changed the Terms of Service for its Facebook and Instagram services. It also flagged the fact that it was changing the legal basis upon which it relies to process users’ personal data under Article 6 in the context of delivering the Facebook and Instagram services (including behavioural advertising). Having previously relied on the consent of users to the processing of their personal data, the company now sought to rely on the “contract” legal basis for most (but not all) of its processing operations. Existing and new users were required to click “I accept” to indicate their acceptance of the updated Terms of Service in order to continue using Facebook and Instagram. The services would not be accessible if users declined to do so.

Meta Ireland considered that, on accepting the updated Terms of Service, a contract was concluded between itself and the user. Consequently, the processing of the user’s personal data in connection with the delivery of its Facebook and Instagram services was necessary for the performance of this “contract”, which included the provision of personalised services and behavioural advertising. This, it claimed, provided a lawful basis by reference to Article 6(1)(b) of the GDPR.

The complainants contended that Meta Ireland was in fact still looking to rely on consent to provide a lawful basis for its processing of users’ data. They argued that, by making the accessibility of its services conditional on users accepting the updated Terms of Service, Meta Ireland was in fact “forcing” them to consent to the processing of their personal data for behavioural advertising and other personalised services. This was not real consent as defined in Article 4 of GDPR:

“any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her;” (our emphasis)

Following comprehensive investigations, consultation with other EU DP regulators (a process required by GDPR in such cases) and final rulings by the European Data Protection Board, the DPC made a number of findings; notably:

1. Meta Ireland did not provide clear information about its processing of users’ personal data, resulting in users having insufficient clarity as to what processing operations were being carried out on their personal data, for what purpose(s), and by reference to which of the six legal bases identified in Article 6. The DPC said this violated Articles 12 (transparency) and 13(1)(c) (information to be provided to the data subject) of GDPR. It also considered it to be a violation of Article 5(1)(a), which states that personal data must be processed lawfully, fairly and transparently.

2. Meta Ireland cannot rely on the contract legal basis for justifying its processing. The delivery of personalised advertising (as part of the broader suite of personalised services offered as part of the Facebook and Instagram services) could not be said to be necessary to perform the core elements of what was said to be a much more limited form of contract. The DPC adopted this position following a ruling by the EDPB, which agreed with other EU regulators’ representations to the DPC.

In addition to the fines, Meta Ireland has been directed to ensure its data processing operations comply with GDPR within a period of 3 months. It has said it will appeal; not surprising considering the decision has the potential to require it to make costly changes to its personalised advertising-based business in the European Union, one of its largest markets. 

It is important to note that this decision still allows Meta to use non-personal data (such as the content of a story) to personalise adverts, or to ask users to give their consent to targeted adverts. However, under GDPR, users should be able to withdraw their consent at any time. If a large number do so, it would impact one of the most valuable parts of Meta’s business.

The forthcoming appeals by Meta will provide much needed judicial guidance on the GDPR, in particular Principle 1. Given the social media giant’s deep pockets, expect this one to run and run.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

US Data Transfers and Privacy Shield 2.0 

On 14th December 2022, the European Commission published a draft ‘adequacy decision’, under Article 45 of the GDPR, endorsing a new legal framework for transferring personal data from the EU to the USA. Subject to approval by other EU institutions, the decision paves the way for “Privacy Shield 2.0” to be in effect by Spring 2023.

The Background

In July 2020, the European Court of Justice (ECJ) ruled in “Schrems II” that organisations transferring personal data to the USA can no longer rely on the Privacy Shield Framework as a legal transfer tool, as it failed to protect the rights of EU data subjects when their data was accessed by U.S. public authorities. In particular, the ECJ found that US surveillance programs are not limited to what is strictly necessary and proportionate as required by EU law and hence do not meet the requirements of Article 52 of the EU Charter of Fundamental Rights. Secondly, with regard to U.S. surveillance, EU data subjects lack actionable judicial redress and, therefore, do not have a right to an effective remedy in the USA, as required by Article 47 of the EU Charter.

The ECJ stated that organisations transferring personal data to the USA can still use the Article 49 GDPR derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporter to make a complex assessment about the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs.

Despite the Schrems II judgment, many organisations have continued to transfer personal data to the USA, hoping that regulators will wait for a new Transatlantic data deal before enforcing the judgment. Whilst the UK Information Commissioner’s Office (ICO) seems to have adopted a “wait and see” approach, other regulators have now started to take action. In February 2022, the French Data Protection Regulator, CNIL, ruled that the use of Google Analytics was a breach of GDPR due to the data being transferred to the USA without appropriate safeguards. This followed a similar decision by the Austrian Data Protection Authority in January.

The Road to Adequacy

Since the Schrems ruling, replacing the Privacy Shield has been a priority for EU and US officials. In March 2022, it was announced that a new Trans-Atlantic Data Privacy Framework had been agreed in principle. In October, the US President signed an executive order giving effect to the US commitments in the framework. These include commitments to limit US authorities’ access to data exported from the EU to what is necessary and proportionate under surveillance legislation, to provide data subjects with rights of redress relating to how their data is handled under the framework regardless of their nationality, and to establish a Data Protection Review Court for determining the outcome of complaints.

Schrems III?

The privacy campaign group, noyb, of which Max Schrems is Honorary Chairman, is not impressed by the draft adequacy decision. It said in a statement:

“…the changes in US law seem rather minimal. Certain amendments, such as the introduction of the proportionality principle or the establishment of a Court, sound promising – but on closer examination, it becomes obvious that the Executive Order oversells and underperforms when it comes to the protection of non-US persons. It seems obvious that any EU “adequacy decision” that is based on Executive Order 14086 will likely not satisfy the CJEU. This would mean that the third deal between the US Government and the European Commission may fail.”

Max Schrems said: 

“…As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again – in flagrant breach of our fundamental rights.”

The draft adequacy decision will now be reviewed by the European Data Protection Board (EDPB) and the European Member States. From the above statements it seems that if Privacy Shield 2.0 is finalised, a legal challenge against it is inevitable.

UK to US Data Transfers 

Personal data transfers are also a live issue for most UK Data Controllers, including public authorities. Using an online meeting app, a cloud storage solution or even a simple text messaging service often involves a transfer of personal data to the US. At present, use of such services usually involves a complicated TRA and the execution of standard contractual clauses. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022, but it still requires a TRA as well as supplementary measures where privacy risks are identified.

Good news may be round the corner for UK data exporters. The UK Government is also in the process of making an adequacy decision for the US. We suspect it will strike a similar deal once the EU/US one is finalised.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Our next online GDPR Practitioner Certificate course, starting on 10th January, is fully booked. We have places on the course starting on 19th January. 
