Transparency in Health and Social Care: New ICO Guidance 

Within the health and social care sector, new technologies that use large amounts of personal data are being used to support both direct care and secondary purposes, such as planning and research. An example is the use of AI to provide automated diagnoses based on patients' medical imaging data. 

Transparency is a key principle of UK data protection legislation. Compliance with the first data protection principle and Articles 13 and 14 of the UK GDPR ensures that data subjects are aware of how their personal data is used, allowing them to make informed choices about who they disclose their data to and how to exercise their data rights. 

On Monday the Information Commissioner’s Office (ICO) published new guidance to help health and social care organisations comply with their transparency obligations under the UK GDPR. It supplements existing ICO guidance on the principle of transparency and the right to be informed.

The guidance is aimed at all organisations, including those in the private and third sectors, that deliver health and social care services or process health and social care information. This includes local authorities, suppliers to the health and social care sector, universities using health information for research purposes, and others (e.g. the fire service, police and education providers) that use health information for their own purposes. The guidance will help them to understand the definition of transparency and assess appropriate levels of transparency, as well as providing practical steps for developing effective transparency information. 

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.   

Apprentice Case Study – Meet Natasha

In 2022, Act Now Training teamed up with Damar to support their delivery of the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme with the first cohort about to undertake the end point assessment.

Data Protection and Information Governance Apprentice, Natasha Lock, is an integral part of the Governance and Compliance team at the University of Lincoln. With the Data Protection and Digital Information (No.2) Bill set to make changes to the UK data protection regime, Natasha talks to us about why this is a great area to work in and how the apprenticeship route has been particularly beneficial for her.

How did you get onto the apprenticeship?

“I was already working at the university as an Information Compliance Officer when the opportunity for a staff apprenticeship came up.

“The process was swift and straightforward, and I was enrolled on the Data Protection and Information Governance Apprenticeship within three months of enquiring.”

How has the apprenticeship helped you?

“I started with a good understanding of the UK Data Protection legislation but my knowledge has grown significantly, and now I’m coming to the end of my level 4 apprenticeship, I’ve gained so much more insight and my confidence has grown.

“As a university, we hold vast amounts of data. My apprenticeship is allowing me to solve the challenge of data retention and implement better measures to retain, destroy and archive information. I have developed a greater understanding of the legislative requirements we must adhere to as a public sector institute and how to reduce and assess data protection risks.

“I love the fact that I can study whilst still doing my job. The flexibility works for me because I can go through course materials at my own pace. I really feel like I have a brilliant work/life/study balance.

“The University of Lincoln and Damar Training have been fantastic in supporting me. I get along with my coach, Tracey, so well. She is very friendly and personable and has enabled my creativity to flow.

“The course is very interactive, and I’ve found the forums with other apprentices to be a very useful way of sharing knowledge, ideas and stories.

“I’m enjoying it so much and people have noticed that my confidence has grown. I wouldn’t have had that without doing this apprenticeship. I’ve now got my sights on doing a law degree or law apprenticeship in the future.”

Abi Slater, Information Compliance Manager at the University of Lincoln, said: “It has been great to see how much Natasha has developed over the course of the apprenticeship. I believe the apprenticeship has provided Natasha with the knowledge and skills required to advance in her data protection career and the support from her coach at Damar Training has been excellent.

“I would encourage anyone with an interest in data protection and information governance to consider this apprenticeship.”

Tracey Coetzee, Coach at Damar Training, said: “The Data Protection and Information Governance Apprenticeship was only approved by the Institute of Apprenticeships in 2022, and it’s delightful to see apprentices flourishing on the programme.

“From cyber security to managing data protection risks, this programme is upskilling participants and adding value to both private and public sector organisations and we’re thrilled to see the first cohort, including Natasha, approach the completion of their training.”

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

Kate Middleton’s Medical Records: Can anyone go to jail?

Kate Middleton seems to be at the centre of much media (and social media) attention at present. In addition to speculation about her health and whereabouts, there has been much focus on and analysis of the now famous photoshopped Mother’s Day photo.

This week it was reported that employees at the private London Clinic in Marylebone, where Kate was a patient following abdominal surgery in January, attempted to view her medical records. Reportedly three employees have now been suspended.

The Health Minister, Maria Caulfield, told Sky News it was “pretty severe and serious stuff to be accessing notes that you don’t have permission to”. She also said police had been “asked to look at” whether staff at the clinic attempted to access the princess’s private medical records. 

If the reports are true and individuals are proven to have been “snooping”, what are the consequences? Firstly, this would normally be a matter for the Information Commissioner’s Office (ICO) to investigate rather than the police. Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly: 

(a) obtain or disclose personal data without the consent of the controller, 

(b) procure the disclosure of personal data to another person without the consent of the controller, or 

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained. 

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998 which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. In June 2023, the ICO disclosed that since 1st June 2018, 92 cases involving Section 170 offences were investigated by its Criminal Investigations Team.   

Consequences 

Section 170 is only punishable by way of a fine; perpetrators cannot be sent to prison. Although there is now no cap on the maximum fine, prosecutions have resulted in relatively low fines compared to the harm caused.  

Two recent prosecutions have involved employees accessing medical records.

In November 2023, Loretta Alborghetti was fined for illegally accessing the medical records of over 150 people. The offence took place whilst she worked as a medical secretary at Worcestershire Acute Hospitals NHS Trust. She was ordered to pay a total of £648. 

In August 2022, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust, pleaded guilty to accessing medical records of patients without a valid legal reason. An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off from going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000. 

Computer Misuse Act  

A Section 170 prosecution would have a much greater deterrent effect if the sanctions included a custodial sentence. Successive Information Commissioners have argued for this but to no avail.  

The relatively low fines have led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990 which carries tougher sentences including a maximum of 2 years imprisonment on indictment. In July 2022, a woman who worked for Cheshire Police pleaded guilty to using the police data systems to check up on ex-partners and in August the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of people’s personal information from vehicle repair garages to generate potential leads for personal injury claims. 

The ICO has now confirmed that it has received a personal data breach report from The London Clinic. If its investigation finds that the Clinic did not comply with its security obligations under Article 5(1)(f) and Article 32 of the UK GDPR, it faces a possible maximum Monetary Penalty Notice of £17.5m or 4% of gross annual turnover (whichever is higher). This case highlights the importance of organisations ensuring adequate security measures around sensitive personal data, especially where the data relates to high-profile individuals.  
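The "higher maximum" penalty rule described above can be expressed as a simple calculation: the greater of £17.5m or 4% of gross annual turnover. A minimal sketch, with illustrative turnover figures (not real data for any organisation):

```python
# Sketch of the UK GDPR "higher maximum" penalty: the greater of
# £17.5m or 4% of gross annual turnover. Figures below are assumptions
# for illustration only.

def max_penalty(annual_turnover_gbp: float) -> float:
    """Return the statutory maximum fine for the given turnover."""
    return max(17_500_000, 0.04 * annual_turnover_gbp)

# For a £200m turnover, 4% is £8m, so the £17.5m figure applies.
print(max_penalty(200_000_000))    # → 17500000
# For a £1bn turnover, 4% is £40m, which exceeds £17.5m.
print(max_penalty(1_000_000_000))  # → 40000000.0
```

The actual penalty imposed in any case would, of course, be set well below this statutory ceiling according to the ICO's penalty guidance.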
 
This and other data protection developments will be discussed in detail at our forthcoming GDPR Update workshop. There are only 3 places left on our next GDPR Practitioner Certificate. 

Police Misuse of Body Worn Camera Footage 

Last week the BBC reported that police officers made offensive comments about an assault victim while watching body camera footage of her exposed body.  

The woman had been arrested by Thames Valley Police and placed in leg restraints before being recorded on body-worn cameras. While being transported to Newbury police station, she suffered a seizure which resulted in her chest and groin being exposed. A day later she was released without charge. 

A female officer later reviewed the body camera footage, which the force told Metro.co.uk was for ‘evidential purposes’ and ‘standard practice’. The BBC reports that three male colleagues joined her and made offensive comments about the victim.

The comments were brought to the attention of senior police officers by a student officer, who reported his colleagues for covering up the incident. The student officer was later dismissed, though the police said this was unrelated to the report. 

The policing regulator says Thames Valley Police should have reported the case for independent scrutiny. The force has now done so, following the BBC investigation. 

This is not the first time the BBC has highlighted such an issue. In September 2023 it revealed the findings of a two-year investigation. It obtained reports of misuse from Freedom of Information requests, police sources, misconduct hearings and regulator reports. It found more than 150 reports of camera misuse where there was a case to answer for misconduct, a recommendation for learning, or an upheld complaint. (You can watch Bodycam cops uncovered on BBC iPlayer) 

The most serious allegations include: 

  • Cases in seven forces where officers shared camera footage with colleagues or
    friends – either in person, via WhatsApp or on social media 

  • Images of a naked person being shared between officers on email and cameras used to covertly record conversations 

  • Footage being lost, deleted or not marked as evidence, including video, filmed by Bedfordshire Police, of a vulnerable woman alleging she had been raped by an inspector – the force later blamed an “administrative error” 

  • Switching off cameras during incidents, for which some officers faced no sanctions – one force said an officer may have been “confused”

Body worn cameras are widely used these days not just by police but also by council officers, train guards, security staff and parking attendants (to name a few). 

There is no all-encompassing law regulating body worn cameras. Of course, they are used to collect and process personal data and so will be subject to the UK GDPR. Where used covertly, they may also be subject to the Regulation of Investigatory Powers Act 2000.  

The Information Commissioner’s Office (ICO) provides comprehensive guidelines on the use of CCTV, which are largely considered to extend to body worn cameras (BWCs) used by security officers. There is a useful checklist on its website which recommends:  

  • Providing privacy information to individuals when using BWCs, such as clear signage, verbal announcements or lights/indicators on the device itself, and having readily available privacy policies. 
  • Training staff using BWCs to inform individuals that recording may take place if it is not obvious in the circumstances. 
  • Having appropriate retention and disposal policies in place for any footage that is collected. 
  • Having efficient governance procedures in place to be able to retrieve stored footage and process it for subject access requests or onward disclosures where required. 
  • Using technology which has the ability to efficiently and effectively blur or mask footage, if redaction is required to protect the rights and freedoms of any third parties. 

Our one-day CCTV workshop will teach you how to plan and implement a CCTV/BWC project including key skills such as completing a DPIA and assessing camera evidence.
Our expert trainer will answer all your questions including when you can use CCTV/BWC, when it can be covert and how to deal with a request for images.  
 
This workshop is suitable for anyone involved in the operation of CCTV, BWCs and drones including DPOs, investigators, CCTV operators, enforcement officers, estate managers and security personnel. 

The Hidden Reach of the Prevent Strategy: Beyond Counter-Terrorism Units

The UK government’s anti-radicalisation program, Prevent, is reportedly sharing the personal details of thousands of individuals more extensively than previously known. This sharing includes not just counter-terrorism units, but also airports, ports, immigration services, and officials at the Home Office and the Foreign, Commonwealth and Development Office (FCDO). Critics argue that such widespread data sharing could be illegal, as it involves moving sensitive personal data between databases without the consent of the individuals. 

A Metropolitan police document titled “Prevent case management guidance” indicates that Prevent details are also shared with the ports authority watchlist. This raises concerns that individuals may face increased scrutiny at airports or be subjected to counter-terrorism powers without reasonable suspicion. The document also mentions that foreign nationals may have their backgrounds checked by the FCDO and immigration services for any overseas convictions or intelligence. 

Furthermore, the Acro Criminal Records Office, which manages UK criminal records, is notified about individuals referred to Prevent, despite the program dealing with individuals who haven’t necessarily engaged in criminal behaviour.
Counter-terror police emphasise their careful approach to data sharing, which aims to protect vulnerable individuals. 

Prevent’s goal is to divert people from terrorism before they offend, and most people are unaware of their referral to the program. 95% of referrals result in no further action. A secret database, the National Police Prevent Case Management database, was previously disclosed in 2019, revealing the storage of details of those referred to Prevent. 

Newly disclosed information, obtained through a freedom of information request by the Open Rights Group (ORG), reveals that Prevent data is shared across various police databases, including the Police National Computer, specialised counter-terrorism and local intelligence systems, and the National Crime Agency. 

The sharing of this data was accidentally revealed due to a redaction error in a heavily edited Met document. Despite its sensitive nature, the ORG decided to make the document public. Sophia Akram of the ORG expressed concerns over the extent of the data sharing and potential harms, suggesting that it could be unfair and possibly unlawful. 

The guidance also indicates that data is retained and used even in cases where no further action is taken. There are concerns about the impact on young people’s educational opportunities, as Prevent requires public bodies like schools and the police to identify individuals at risk of extremism. 

Recent figures show thousands of referrals to Prevent, predominantly from educational institutions. From April 2022 to March 2023, a total of 6,817 individuals were directed to the Prevent program. Within this group, educational institutions were responsible for 2,684 referrals. Breaking down the referrals by age, there were 2,203 adolescents between the ages of 15 and 20, and 2,119 referrals involved children aged 14 or younger.

There are worries about the long-term consequences for children and young people referred to the program. Several cases have highlighted the intrusive nature of this data sharing and its potential impact on individuals’ lives, including cases in which students have missed out on a place at a sixth form college and others involving children as young as four years old.  

Prevent Watch, an organisation monitoring the program, has raised alarms about the data sharing, particularly its effect on young children. The FoI disclosures challenge the notion that Prevent is non-criminalising, as data on individuals, even those marked as ‘no further action’, can be stored on criminal databases and flagged on watchlists. 

Counter-terrorism policing spokespeople defend the program, emphasising its
multi-agency nature and focus on protecting people from harm. They assert that data sharing is carefully managed and legally compliant, aiming to safeguard vulnerable individuals from joining terror groups or entering conflict zones. 

Learn more about data sharing with our UK GDPR Practitioner Certificate. Dive into the issues discussed in this blog and secure your spot now.

The British Library Hack: A Chapter in Ransomware Resilience

In a stark reminder of the persistent threat of cybercrime, the British Library has confirmed a data breach incident that has led to the exposure of sensitive personal data, with materials purportedly up for auction online. An October intrusion by a notorious cybercrime group targeted the library, which is home to an extensive collection, including over 14 million books.

Recently, the ransomware group Rhysida claimed responsibility, publicly displaying snippets of sensitive data, and announcing the sale of this information for a significant sum of around £600k to be paid in cryptocurrency.

While the group boasts about the data’s exclusivity and sets a firm bidding deadline (today 27th November 2023), the library has only acknowledged a leak of what seems to be internal human resources documents. It has not verified the identity of the attackers nor the authenticity of the sale items. The cyber attack has significantly disrupted the library’s operations, leading to service interruptions expected to span several months.

In response, the library has strengthened its digital defences, sought expert cybersecurity assistance, and urged its patrons to update their login credentials as a protective measure. The library is working closely with the National Cyber Security Centre and law enforcement to investigate, but details remain confidential due to the ongoing inquiry.

The consequences of the attack have necessitated a temporary shutdown of the library’s online presence. Physical locations, however, remain accessible. Updates can be found on the British Library’s X (formerly Twitter) feed. The risk posed by Rhysida has drawn attention from international agencies, with recent advisories from the FBI and US cybersecurity authorities. The group has been active globally, with attacks on various sectors and institutions.

The British Library’s leadership has expressed appreciation for the support and patience from its community as it navigates the aftermath of the cyber attack.

What is a Ransomware Attack?

A ransomware attack is a type of malicious cyber operation where hackers infiltrate a computer system to encrypt data, effectively locking out the rightful users. The attackers then demand payment, often in cryptocurrency, for the decryption key. These attacks can paralyse organisations, leading to significant data loss and disruption of operations.

Who is Rhysida?

The Rhysida ransomware group first came to the fore in May 2023, following the emergence of its victim support chat portal hosted on the Tor network. The group identifies as a “cybersecurity team” that highlights security flaws by targeting victims’ systems and spotlighting the supposed ramifications of the security issues involved.

How to prevent a Ransomware Attack?

Hackers are becoming more and more sophisticated in the ways they target our personal data, as we have seen recently with banking scams. However, there are some measures we can implement personally and within our organisations to prevent a ransomware attack.

  1. Avoid Unverified Links: Refrain from clicking on links in spam emails or unfamiliar websites. Hackers frequently disseminate ransomware via such links, which, when clicked, can initiate the download of malware. This malware can then encrypt your data and hold it for ransom​​.

  2. Safeguard Personal Information: It’s crucial to never disclose personal information such as addresses, NI numbers, login details, or banking information online, especially in response to unsolicited communications​​.

  3. Educate Employees: Increasing awareness among employees can be a strong defence. Training should focus on identifying and handling suspicious emails, attachments, and links. Additionally, having a contingency plan in the event of a ransomware infection is important​​.

  4. Implement a Firewall: A robust firewall can act as a first line of defence, monitoring incoming and outgoing traffic for threats and signs of malicious activity. This should be complemented with proactive measures such as threat hunting and active tagging of workloads​​.

  5. Regular Backups: Maintain up-to-date backups of all critical data. In the event of a ransomware attack, having these backups means you can restore your systems to a previous, unencrypted state without having to consider ransom demands.

  6. Create Inventories of Assets and Data: Having inventories of the data and assets you hold allows you to have an immediate knowledge of what has been compromised in the event of an attack whilst also allowing you to update security protocols for sensitive data over time.

  7. Multi-Factor Authentication: Identifying legitimate users in more than one way ensures that you are only granting access to those intended. 

These are some strategies organisations can use as part of a more comprehensive cybersecurity protocol which will significantly reduce the risk of falling victim to a ransomware attack. 

Join us on our workshops “How to Increase Cyber Security in your Organisation” and “Cyber Security for DPOs”, where we discuss all of the above and more, helping you create the right foundations for cyber resilience within your organisation. 

UK Biobank’s Data Sharing Raises Alarm Bells

An investigation by The Observer has uncovered that the UK Biobank, a repository of health data from half a million UK citizens, has been sharing information with insurance companies. This development contravenes the Biobank’s initial pledge to keep this sensitive data out of the hands of insurers, a promise that was instrumental in garnering public trust at the outset. UK Biobank has since responded to the article, calling it “disingenuous” and “extremely misleading”. 

A Promise Made, Then Modified 

The UK Biobank was set up in 2006 as a goldmine for scientific discovery, offering researchers access to a treasure trove of biological samples and associated health data. With costs for access set between £3,000 and £9,000, the research derived from this data has been nothing short of revolutionary. However, the foundations of this scientific jewel are now being questioned. 

When the project was first announced, clear assurances were given that data would not be made available to insurance companies, mitigating fears that genetic predispositions could be used discriminatorily in insurance assessments. These assurances appeared in the Biobank’s FAQs and were echoed in parliamentary discussions. 

Changing Terms Amidst Grey Areas 

The Biobank contends that while it does strictly regulate data access, allowing only verified researchers to delve into its database, this includes commercial entities such as insurance firms if the research is deemed to be in the public interest. The boundaries of what constitutes “health-related” and “public interest” are now under scrutiny.   

However, according to the Observer investigation, evidence suggests that this nuance—commercial entities conducting health-related research—was not clearly communicated to participants, especially given the categorical assurances given previously. UK Biobank categorically denies this and has shared its consent form and information leaflet. 

Data Sharing: The Ethical Quandary 

This breach of the original promise has raised the ire of experts in genetics and data privacy, with Prof Yves Moreau highlighting the severity of the breach of trust. The concern is not just about the sharing of data but about the integrity of consent given by participants. The Biobank’s response indicates that the commitments made were outdated and that the current policy, which includes sharing anonymised data for health-related research, was made clear to participants upon enrolment. 

The Ripple Effect of Biobank’s Data Policies 

Further complicating matters is the nature of the companies granted access. Among them are ReMark International, a global insurance consultancy, Lydia.ai, a Canadian “insurtech” firm that wants to give people “personalised and predictive health scores”, and Club Vita, a longevity data analytics company. These companies have utilised Biobank data for projects ranging from disease prediction algorithms to assessing longevity risk factors. The question this raises is how one can ensure that such research is in fact in the public interest; do we simply take a commercial entity’s word for it? UK Biobank says all research conducted is “consistent with being health-related and in the public interest” and that it has an expert data access committee to decide on any complex issues, but who checks the ethics of the ethics committee? The problems with this kind of self-regulation are axiomatic. 

The Fallout and the Future 

This situation has led to a broader conversation about the ethical use of volunteered health data and the responsibility of custodians like the UK Biobank to uphold public trust. As technology evolves and the appetite for data grows across industries, the mechanisms of consent and transparency may need to be revisited.  The Information Commissioner’s Office is now considering the case, spotlighting the crucial need for clarity and accuracy in how organisations manage and utilise sensitive personal information. 

As the UK Biobank navigates these turbulent waters, the focus shifts to how institutions like it can maintain the delicate balance between facilitating scientific progress and safeguarding the privacy rights of individuals who contribute their personal data for the greater good. For the UK Biobank, regaining the trust of its participants and the public is now an urgent task, one that will require more than just a careful review of policies but a reaffirmation of its commitment to ethical stewardship of the data entrusted to it. 

Take a look at our highly popular Data Ethics Course. Places fill up fast, so if you would like to learn more about this fascinating area, book your place now. 

Saudi Arabia’s First Ever DP Law Comes into Force 

Today (14th September 2023), Saudi Arabia’s first ever data protection law comes into force. Organisations doing business in the Middle East need to carefully consider the impact of the new law on their personal data processing activities. They have until 13th September 2024 to prepare and become fully compliant. 

Background 

The Personal Data Protection Law (PDPL) of Saudi Arabia was implemented by Royal Decree on 14th September 2021. It aims to regulate the collection, handling, disclosure and use of personal data. It will initially be enforced by the Saudi Arabian Authority for Data and Artificial Intelligence (SDAIA), which has published the regulations discussed below. PDPL was originally going to come fully into force on 23rd March 2022. However, in November 2022, SDAIA published proposed amendments which were passed after public consultation.  

Following a consultation period, we also now have the final versions of the Implementing Regulations and the Personal Data Transfer Regulations; both expand on the general principles and obligations outlined in the PDPL (as amended in March 2023) and introduce new compliance requirements for data controllers. 

More Information  

Summary of the new law: https://actnowtraining.blog/2022/01/10/the-new-saudi-arabian-federal-data-protection-law/  

Summary of the Regulations: https://actnowtraining.blog/2023/07/26/data-protection-law-in-saudi-arabia-implementing-regulation-published/  

Action Plan 

13th September 2024 is not far away. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so could lead to enforcement action and also reputational damage. The following should be part of an action plan for compliance: 
 

  1. Training the organisation’s management team to understand the importance of PDPL, the main provisions and the changes required to systems and processes.  
  2. Training staff at all levels to understand PDPL and how it will impact their roles. 
  3. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed. 
  4. Reviewing how records management and information risk are addressed within the organisation. 
  5. Drafting privacy notices to ensure they set out the minimum information that should be included. 
  6. Reviewing information security policies and procedures in light of the new, more stringent security obligations, particularly breach notification. 
  7. Drafting policies and procedures to deal with data subjects’ rights, particularly requests for subject access, rectification and erasure. 
  8. Appointing and training a Data Protection Officer. 
     

Act Now in Saudi Arabia 

Act Now Training can help your businesses prepare for the new law.
We have delivered training extensively in the Middle East to a wide range of delegates, including representatives of the telecommunications, legal and technology sectors. We have experience in helping organisations in territories where a new law of this type has been implemented.  

Now is the time to train your staff in the new law. Through our KSA privacy programme, we offer comprehensive and cost-effective training, from one-hour awareness-raising webinars to full-day workshops and DPO certificate courses.  

To help deliver this and other courses, Suzanne Ballabás, an experienced Middle East-based data protection specialist, recently joined our team of associates. We can deliver training online or face to face. All of our training starts with a FREE analysis call to ensure you have the right level and most appropriate content for your organisation’s needs. Please get in touch to discuss your training or consultancy needs. 

Click on the link below to see our full Saudi Privacy Programme.

International Transfers Breach Results in Record GDPR Fine for Meta

Personal data transfers between the EU and US are an ongoing legal and political saga. The latest development is yesterday’s largest ever GDPR fine of €1.2bn (£1bn), issued by Ireland’s Data Protection Commission (DPC) to Facebook’s owner, Meta Ireland. The DPC ruled that Meta infringed Article 46 of the EU GDPR in the way it transferred personal data of its users from Europe to the US. 

The Law 

Chapter 5 of the EU GDPR mirrors the international transfer arrangements of the UK GDPR. There is a general prohibition on organisations transferring personal data to a country outside the EU, unless they ensure that data subjects’ rights are protected. This means that, if there is no adequacy decision in respect of the receiving country, one of the safeguards set out in Article 46 must be built into the arrangement. These include standard contractual clauses (SCCs) and binding corporate rules.
The former need to be included in a contract between the parties (data exporter and importer) and impose certain data protection obligations on both. 

The Problem with US Transfers 

In 2020, in a case commonly known as “Schrems II”, the European Court of Justice (ECJ) concluded that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal mechanism to ensure GDPR compliance. They must consider using the Article 49 derogations or SCCs. If using the latter, whether for transfers to the US or other countries, the ECJ placed the onus on data exporters to make a complex assessment about the recipient country’s data protection and surveillance legislation, and to put in place “additional supplementary measures” to those included in the SCCs. The problem with the US is that it has stringent surveillance laws which give law enforcement agencies access to personal data without adequate safeguards (according to the ECJ in Schrems). Therefore any additional measures must address this possibility and build in safeguards to protect data subjects. 

In the light of the above, the new EU SCCs were published in June 2021.
The European Data Protection Board has also published its guidance on the aforementioned required assessment, entitled “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data”. Meta’s use of the new EU SCCs and its “additional supplementary measures” were the focus of the DPC’s attention when issuing its decision. 

The Decision 

The DPC ruled that Meta infringed Article 46(1) of GDPR when it continued to transfer personal data from the EU/EEA to the US following the ECJ’s ruling in Schrems II. It found that the measures used by Meta did not address the risks to the fundamental rights and freedoms of data subjects that were identified in Schrems; namely the risk of access to the data by US law enforcement.  

The DPC ruled that Meta should: 

  1. Suspend any future transfer of personal data to the US within five months of the date of the DPC’s decision; 
  2. Pay an administrative fine of €1.2 billion; and 
  3. Bring its processing operations in line with the requirements of GDPR, within five months of the date of the DPC’s decision, by ceasing the unlawful processing, including storage, in the US of personal data of EEA users transferred in violation of GDPR. 

Meta has said that it will appeal the decision and seek a stay of the ruling before the Irish courts. Its President of Global Affairs, Sir Nick Clegg, said:  

“We are therefore disappointed to have been singled out when using the same legal mechanism as thousands of other companies looking to provide services in Europe. 

“This decision is flawed, unjustified and sets a dangerous precedent for the countless other companies transferring data between the EU and US.” 

The Future of US Transfers 

The Information Commissioner’s Office told the BBC that the decision “does not apply in the UK” but said it had “noted the decision and will review the details in due course”. The wider legal ramifications on data transfers from the UK to the US can’t be ignored. 

Personal data transfers are also a live issue for most UK Data Controllers, including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022, but it still requires a Transfer Risk Assessment (TRA) as well as supplementary measures where privacy risks are identified.  

On 25th March 2022, the European Commission and the United States announced that they had agreed in principle on a new Trans-Atlantic Data Privacy Framework. The final agreement is expected to be in place in summer 2023 and will replace the Privacy Shield Framework. It is expected that the UK Government will strike a similar deal once the EU/US one is finalised. However, both are likely to be challenged in the courts. 

The Meta fine is one of this year’s major GDPR developments, nicely timed within a few days of the 5th anniversary of GDPR. All organisations, whether in the UK or EU, need to carefully consider their data transfer mechanisms and ensure that they comply with Chapter 5 of GDPR in the light of the DPC’s ruling. A “wait and see” approach is no longer an option.  

The Meta fine will be discussed in detail on our forthcoming International Transfers workshop. For those who want a one-hour summary of the UK international transfer regime, we recommend our webinar. 

US Data Transfers and Privacy Shield 2.0 

On 14th December 2022, the European Commission published a draft ‘adequacy decision’, under Article 45 of the GDPR, endorsing a new legal framework for transferring personal data from the EU to the USA. Subject to approval by other EU institutions, the decision paves the way for “Privacy Shield 2.0” to be in effect by Spring 2023.

The Background

In July 2020, the European Court of Justice (ECJ) in “Schrems II” ruled that organisations that transfer personal data to the USA can no longer rely on the Privacy Shield Framework as a legal transfer tool, as it failed to protect the rights of EU data subjects when their data was accessed by U.S. public authorities. First, the ECJ found that US surveillance programs are not limited to what is strictly necessary and proportionate as required by EU law and hence do not meet the requirements of Article 52 of the EU Charter on Fundamental Rights. Second, with regard to U.S. surveillance, EU data subjects lack actionable judicial redress and, therefore, do not have a right to an effective remedy in the USA, as required by Article 47 of the EU Charter.

The ECJ stated that organisations transferring personal data to the USA can still use the Article 49 GDPR derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporter to make a complex assessment about the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs. 

Despite the Schrems II judgment, many organisations have continued to transfer personal data to the USA, hoping that regulators will wait for a new Transatlantic data deal before enforcing the judgment. Whilst the UK Information Commissioner’s Office (ICO) seems to have adopted a “wait and see” approach, other regulators have now started to take action. In February 2022, the French Data Protection Regulator, CNIL, ruled that the use of Google Analytics was a breach of GDPR due to the data being transferred to the USA without appropriate safeguards. This followed a similar decision by the Austrian Data Protection Authority in January. 

The Road to Adequacy

Since the Schrems ruling, replacing the Privacy Shield has been a priority for EU and US officials. In March 2022, it was announced that a new Trans-Atlantic Data Privacy Framework had been agreed in principle. In October, the US President signed an executive order giving effect to the US commitments in the framework. These include commitments to limit US authorities’ access to data exported from the EU to what is necessary and proportionate under surveillance legislation, to provide data subjects with rights of redress relating to how their data is handled under the framework regardless of their nationality, and to establish a Data Protection Review Court for determining the outcome of complaints.

Schrems III?

The privacy campaign group, noyb, of which Max Schrems is Honorary Chairman, is not impressed by the draft adequacy decision. It said in a statement:

“…the changes in US law seem rather minimal. Certain amendments, such as the introduction of the proportionality principle or the establishment of a Court, sound promising – but on closer examination, it becomes obvious that the Executive Order oversells and underperforms when it comes to the protection of non-US persons. It seems obvious that any EU “adequacy decision” that is based on Executive Order 14086 will likely not satisfy the CJEU. This would mean that the third deal between the US Government and the European Commission may fail.”

Max Schrems said: 

“… As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again – in flagrant breach of our fundamental rights.”

The draft adequacy decision will now be reviewed by the European Data Protection Board (EDPB) and the European Member States. From the above statements it seems that if Privacy Shield 2.0 is finalised, a legal challenge against it is inevitable.

UK to US Data Transfers 

Personal data transfers are also a live issue for most UK Data Controllers, including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. At present, use of such services usually involves a complicated TRA and execution of standard contractual clauses. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022, but it still requires a TRA as well as supplementary measures where privacy risks are identified. 

Good news may be around the corner for UK data exporters. The UK Government is also in the process of making an adequacy decision for the US. We suspect it will strike a similar deal once the EU/US one is finalised.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Our next online GDPR Practitioner Certificate course, starting on 10th January, is fully booked. We have places on the course starting on 19th January.