The Farage Bank Row: The Power of the GDPR Subject Access Right? 

Dame Alison Rose, the CEO of NatWest, resigned on Wednesday morning after being accused of leaking information on Nigel Farage’s bank account to the BBC. Following a GDPR subject access request, the ex-UKIP leader received information from the bank that contradicted its justification for downgrading his account. Some say that this incident highlights the power of data protection rights, while others argue that Dame Alison was forced to resign as a result of Mr Farage’s continued influence over the Government.
The truth is probably a mix of the two.

Background

In a Twitter post on 29th June, Mr Farage said his bank (which we now know to be Coutts) had decided to stop doing business with him. He said that a letter from the bank contained no explanation and he had then been told over the phone that it was a “commercial decision”. Mr Farage claimed he was being targeted because the “corporate world” had not forgiven him for Brexit.

On 4th July, a BBC report claimed that the real reason the bank did not want his custom was because Mr Farage did not have enough money in his accounts. Coutts requires clients to have at least £1m in investments or borrowing or £3m in savings. The BBC reported that Mr Farage’s political opinions were not a factor in the decision, but this turned out not to be the case. 

Mr Farage submitted a Subject Access Request (SAR) to Coutts. The response contained a 40-page document, published by the Daily Mail, detailing all of the evidence Coutts had accumulated about him to feed back to its Wealth Reputational Risk Committee. It revealed staff at the bank spent months compiling evidence on the “significant reputational risks of being associated with him”. It said continuing to have Mr Farage as a customer was not consistent with Coutts’ “position as an inclusive organisation” given his “publicly stated views”. Several examples were cited to flag concerns that he was “xenophobic and racist”, including his comparing Black Lives Matter protesters to the Taliban and his characterisation of the RNLI as a “taxi-service” for illegal immigrants.

On 24th July, the BBC issued an apology to Mr Farage. Its business editor Simon Jack also tweeted his apology, saying the reporting had been based on information from a “trusted and senior source” but “turned out to be incomplete and inaccurate”. This source later turned out to be Dame Alison. The Telegraph reported that Dame Alison sat next to Simon Jack at a charity dinner the day before the BBC story was published.

Dame Alison resigned after days of mounting pressure. The resignation was expected in the wake of briefings by Downing Street that she had lost the confidence of the Prime Minister and Chancellor. The Government owns a 38.6% stake in NatWest, the owner of Coutts.

The Data Protection Angle

The Information Commissioner, John Edwards, has issued a statement emphasising the importance of banks’ duty of confidentiality and the need for Coutts to be able to respond to Mr Farage’s complaint. Mr Edwards has also written to UK Finance to remind banks of their responsibilities regarding the information they hold.

It is arguable that Dame Alison, or more accurately Coutts as the Data Controller, breached the UK GDPR, which requires, amongst other things, that personal data be processed fairly, lawfully and in a transparent manner. That is assuming she disclosed personal data about a client to a journalist without consent or lawful authority. Dame Alison has said she did not reveal any personal financial information about Mr Farage, but admitted she had left Simon Jack “with the impression that the decision to close Mr Farage’s accounts was solely a commercial one.” She said she was wrong to respond to any question raised by the BBC about the case.

Has Dame Alison committed a criminal offence under S.170 of the DPA 2018, that of unlawfully disclosing personal data without the consent of the Data Controller? This is unlikely as, being the head of the bank, her views and those of the controller would in effect be the same. Were others in Coutts to argue otherwise, there are a number of “reasonable belief” defences available to her.

Many think this row is more about politics than confidentiality or banking. Labour MP Darren Jones has queried why the Prime Minister is intervening on one man’s bank account. He posted a string of other examples where he says the government has not intervened, going on to give his reasons for the Government’s stance.

The Power of Subject Access

Whatever you think of Nigel Farage’s political views, this incident shows that the subject access right is a powerful tool which can be used by individuals to discover the truth behind decisions which affect their lives and to challenge them.

Article 15 of the UK GDPR allows a data subject to receive all their personal data that is held by a Data Controller, subject to certain exemptions.
This does not just include official documentation but also emails, comments and any other recorded discussions, whether they are professionally expressed or not. Coutts has now apologised for some of the language used about Mr Farage, describing it as “deeply inappropriate”. A high-profile individual’s use of GDPR rights also reminds the general public of those same rights. The BBC reports that NatWest has now received hundreds of subject access requests from customers.

On the same day as Dame Alison announced her resignation, Sky News reported the story of a woman who alleges that she was drugged and sexually assaulted while being held in custody by Greater Manchester Police. Zayna Iman has obtained bodycam and CCTV footage which is supposed to cover the 40 hours from her arrest through her detention in police custody. Three hours of footage from that period are missing, which GMP has so far failed to supply or explain. Miss Iman’s allegations are the subject of an ongoing investigation and referral to the Independent Office for Police Conduct.

Back to the Nigel Farage case, and there is an irony here: Mr Farage was able to challenge the bank’s decision by using a right which originates in EU law, the UK GDPR being our post-Brexit version of the EU GDPR!

Our How to Handle a Subject Access Request workshop will help you navigate each stage and requirement of a Subject Access Request.

Data Protection Law in Saudi Arabia: Implementing Regulation Published  

On 11th July 2023, the much-anticipated Implementing Regulation for Saudi Arabia’s first ever data protection law was published in draft form for public consultation. The regulation is the final step towards the implementation of the new law, which will now officially come into force on 14th September 2023. Organisations will have until 13th September 2024 to become fully compliant. At the same time, the draft regulation on the transfer of personal data outside Saudi Arabia was published. With a very short deadline for comments (31st July 2023), organisations doing business in the Middle East need to carefully consider the impact of the new law on their personal data processing activities.

Background

The Personal Data Protection Law (PDPL) of Saudi Arabia was implemented by Royal Decree on 14th September 2021. It aims to regulate the collection, handling, disclosure and use of personal data. It will initially be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA), which has published the aforementioned regulations. PDPL was originally going to come fully into force on 23rd March 2022. However, in November 2022, SDAIA published proposed amendments which were passed after public consultation.

Key Points to Note

The Implementing Regulation and the Data Transfer Regulation provide further guidance and clarity regarding the application of the new law.  

As under the GDPR, Data Controllers in Saudi Arabia may now rely on “legitimate interests” as a lawful basis to process personal data; this does not apply to sensitive personal data, or processing that contravenes the rights granted under PDPL and its regulations. The Implementing Regulation states that, before processing personal data for legitimate interests, a Data Controller must conduct an assessment of the proposed processing and its impact on the rights and interests of the Data Subject.
No doubt guidance on this assessment will follow but for now the UK Information Commissioner’s website is a good starting point.

The Implementing Regulation also fleshes out the detail of the various Data Subject rights under PDPL including access, correction and destruction. More detail is also provided about consent as a lawful basis of processing and when it can be withdrawn. The obligations of a Data Controller when appointing a Data Processor are also addressed in detail. 

 The Implementing Regulation introduces some new elements into PDPL, including a reference to a Legal Guardian, the definition of “Actual Interest”, and a National Register of Controllers. According to Article 37, the Competent Authority (SDAIA) will also set the rules for licensing entities to issue accreditation certificates for Controllers and Processors. 

Certain areas of the new law still require clarity. For example, according to Article 34 of the Implementing Regulation, the Competent Authority (SDAIA) is expected to issue additional rules, including the circumstances under which a Data Protection Officer shall be appointed. Just like under the GDPR, PDPL permits data transfers outside of Saudi Arabia in certain circumstances and subject to various conditions, including to countries that have an appropriate level of protection for personal data which shall not be less than the level of protection established by PDPL. The Data Transfer Regulation covers, amongst other things, adequate countries and situations where, absent any adequacy decision, personal data may still be transferred outside of Saudi Arabia.

The Implementing Regulation is the final step towards the implementation of the new law. 13th September 2024 is not far away. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so will lead not just to enforcement action but also to reputational damage.
The following should be part of an action plan for compliance: 

  1. Training the organisation’s management team to understand the importance of PDPL, the main provisions and the changes required to systems and processes. 
  1. Training staff at all levels to understand PDPL and how it will impact their role. 
  1. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed. 
  1. Reviewing how records management and information risk are addressed within the organisation. 
  1. Drafting Privacy Notices to ensure they set out the minimum information that should be included. 
  1. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification. 
  1. Drafting policies and procedures to deal with Data Subjects’ rights, particularly requests for subject access, rectification and erasure. 
  1. Appointing and training a Data Protection Officer. 

The UAE Federal Law

In November 2021, the United Arab Emirates enacted its first comprehensive national data protection law to regulate the collection and processing of personal data. Federal Decree Law No. 45 of 2021 regarding the Protection of Personal Data was published by the Cabinet Office on 27th November 2021, but requires executive regulations to come into force. Whilst the two legal regimes are different, the UAE is likely to follow Saudi Arabia’s lead and publish its detailed Executive Regulations very soon.


Act Now in the Middle East  

Act Now Training can help your business prepare for PDPL and the UAE federal law. We have delivered training extensively in the Middle East to a wide range of delegates, including representatives of the telecommunications, legal and technology sectors. Check out our UAE privacy programme. To help deliver this and other courses, Suzanne Ballabás, an experienced Dubai-based data protection specialist, recently joined our team of associates. We can also deliver customised in-house training, both remotely and face to face. Please get in touch to discuss your training or consultancy needs.

Data Flow Mapping: An Essential Skill for Data Protection Professionals 

Among the essential skills for data protection professionals to develop is data flow mapping. In this blog post we explore the significance of this important skill and some useful tools to get started.

What is Data Flow Mapping? 

Data flow mapping is a systematic process that enables organisations to visualise the flow of personal data within their systems and networks.
It involves identifying the sources of data, the purposes for which it is processed, the entities with access to the data, and any transfers of data to third parties. By creating a visual representation of data flows, data protection professionals can gain a clear understanding of how personal data moves throughout the organisation and beyond. This knowledge is essential for effective risk assessment, Data Protection Impact Assessments (DPIAs) and compliance with other regulatory requirements. 
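To make this concrete, here is a minimal sketch in Python of how a mapped flow might be captured as a structured record. All the field names, systems and entries below are invented purely for illustration; real inventories will use whatever schema suits the organisation.

```python
from dataclasses import dataclass, field

@dataclass
class DataFlow:
    """One mapped flow of personal data (illustrative fields only)."""
    name: str                   # e.g. "New starter onboarding"
    source: str                 # where the data originates
    data_categories: list       # types of personal data involved
    purpose: str                # why the data is processed
    recipients: list = field(default_factory=list)            # who has access
    third_party_transfers: list = field(default_factory=list)  # external recipients

# A two-entry inventory built from hypothetical flows
inventory = [
    DataFlow("New starter onboarding", "HR portal",
             ["name", "address", "bank details"], "employment contract",
             recipients=["Payroll team"]),
    DataFlow("Web analytics", "Website forms",
             ["email", "browsing history"], "marketing",
             recipients=["Marketing team"],
             third_party_transfers=["US analytics vendor"]),
]

# Flows that send personal data to third parties become immediately visible
external = [f.name for f in inventory if f.third_party_transfers]
print(external)  # → ['Web analytics']
```

Even a simple structure like this makes the questions above (what data, from where, for what purpose, shared with whom) answerable at a glance.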

The Benefits of Data Flow Mapping 

Data flow mapping serves as a foundation for creating a comprehensive data inventory. It enables organisations to document all types of personal data they collect, process, store, and share. This inventory provides transparency and visibility into data processing activities, allowing for better management and control of personal data.  

The UK GDPR and the Data Protection Act 2018 impose strict obligations on organisations to protect personal data and ensure lawful processing.
Data flow mapping facilitates compliance by identifying areas where data protection measures need strengthening or adjustment.
It helps organisations determine whether they have a valid legal basis for processing personal data, obtain appropriate consents, and implement adequate security measures. Mapping data flows ensures compliance with the principles of lawfulness, fairness, and transparency, as well as data minimisation and purpose limitation. It will also assist in the production and maintenance of a Record of Processing Activity (ROPA) under Article 30 of the UK GDPR.  

Understanding the personal data landscape also helps organisations identify data subjects’ rights and obligations associated with each type of data. Data flow mapping enables organisations to respond effectively to data subject requests, such as access, rectification, and erasure.
By understanding the data flows, organisations can locate the relevant data and fulfil their obligations within the required timeframes.
This transparency empowers individuals to exercise their rights and fosters trust between organisations and data subjects. Furthermore, data flow mapping enhances transparency by providing a clear overview of how personal data is used and shared, enabling organisations to communicate their data processing practices accurately. 
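As a rough sketch of how a data flow map supports subject requests (again, the systems and categories here are made up for the example), an inventory can be queried for every system that must be searched when a particular type of data subject makes a request:

```python
# Hypothetical inventory: each entry records where a category of data
# subject's personal data sits (all names invented for illustration)
inventory = [
    {"system": "CRM",        "subjects": "customers", "data": ["name", "email"]},
    {"system": "Billing DB", "subjects": "customers", "data": ["address", "card token"]},
    {"system": "HR portal",  "subjects": "staff",     "data": ["salary", "sickness records"]},
]

def systems_holding(subject_type):
    """Return the systems to search when a subject of this type makes a request."""
    return [entry["system"] for entry in inventory
            if entry["subjects"] == subject_type]

print(systems_holding("customers"))  # → ['CRM', 'Billing DB']
```

The point is not the code itself but the discipline behind it: if the map is maintained, locating a subject's data within the statutory timeframe becomes a lookup rather than an investigation.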

In the event of a personal data breach or security incident, data flow mapping becomes a valuable asset for efficient incident response and management. It allows organisations to identify the affected data, assess the potential impact, and take appropriate measures to mitigate harm.
By understanding data flows, organisations can implement data breach response plans tailored to the specific types of data involved.
Proactive incident response minimises the risk of data breaches and ensures compliance with legal obligations, including notification requirements and remedial actions. 

A data flow map is a powerful tool for identifying potential risks and vulnerabilities in data processing activities. It assists in assessing the security measures in place, evaluating the legal basis for data processing, and ensuring that data transfers, particularly international transfers, comply with relevant regulations. By understanding the risks, organisations can implement appropriate safeguards and mitigation strategies to protect personal data from unauthorised access, loss, or misuse. 
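For instance, a simplified sketch with invented entries shows how a mapped inventory makes it trivial to flag international transfers that lack a documented safeguard:

```python
# Illustrative only: flag flows that leave the UK without a named safeguard
flows = [
    {"name": "Payroll",   "destination": "UK", "safeguard": None},
    {"name": "Analytics", "destination": "US", "safeguard": None},
    {"name": "Support",   "destination": "US", "safeguard": "IDTA + supplementary measures"},
]

risky = [f["name"] for f in flows
         if f["destination"] != "UK" and not f["safeguard"]]
print(risky)  # → ['Analytics']
```

A review of the flagged flows can then prioritise putting appropriate transfer mechanisms and supplementary measures in place.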

Effective data governance and accountability within organisations are greatly strengthened when data flow mapping is used. It promotes a holistic understanding of data processing activities, including the roles and responsibilities of the individuals involved. This knowledge facilitates the establishment of appropriate policies, procedures, and internal controls to protect personal data. It also enables organisations to demonstrate accountability by showing regulators, stakeholders, and customers that they have implemented the necessary measures to protect personal data and comply with legal requirements. 

Data Flow Mapping Tools 

While the process can be complex, there are several publicly available tools that can assist in simplifying data flow mapping. 

Lucidchart is a popular cloud-based diagramming tool. With its intuitive interface and drag-and-drop functionality, users can easily create visual representations of data flows. There are various templates and shapes specifically designed for data flow mapping, allowing organisations to quickly map out their data processing activities. Lucidchart also supports collaboration, enabling multiple team members to work together on data flow diagrams in real time. 

Microsoft Visio is a widely used diagramming tool that includes features for data flow mapping. It has an extensive library of shapes and templates and offers various connectors and layout options to ensure clear and comprehensive representations of data flows. Visio also allows for easy linking of data flow diagrams to relevant documentation and policies.
As part of the Microsoft Office suite, Visio integrates seamlessly with other Microsoft products, making it a convenient choice for organisations already using Microsoft solutions. 

draw.io is a free, open-source diagramming tool that offers an intuitive interface for creating data flow diagrams. Users can save their diagrams locally or in cloud storage platforms such as Google Drive and OneDrive. draw.io is highly customisable, allowing users to tailor their data flow diagrams to their specific needs. While it may not have as many advanced features as some other tools, draw.io remains a practical option for organisations seeking a free and straightforward solution for data flow mapping. 

Data flow mapping is a critical skill for data protection professionals in the UK. By mapping data flows, organisations can create comprehensive data inventories, identify and mitigate risks, facilitate compliance, respond to data subject requests, and manage data breaches effectively.
As data becomes increasingly valuable and personal privacy gains greater significance, mastering the skill of data flow mapping is an essential step toward maintaining trust, building robust data protection frameworks, and ensuring the security and integrity of personal data. Data protection professionals who acquire this skill will be well-equipped to navigate the complex landscape of data protection and play a crucial role in upholding individuals’ privacy rights in the digital age.  


Sharpen your data flow mapping skills by joining our next Data Flow Mapping workshop. By the end you will understand the key concepts of data flow mapping, the benefits of this work and how to develop and implement a data flow mapping process in your organisation.

New GDPR Adequacy Decision for the EU-US Data Privacy Framework 

On 10th July 2023, the European Commission adopted its adequacy decision under Article 45 of GDPR for the EU-US Data Privacy Framework (DPF). Thus ends years of uncertainty and legal risk for European organisations wishing to transfer personal data to the US. In May, Meta Ireland (the owner of Facebook) was the subject of the largest ever GDPR fine of €1.2bn (£1bn) when Ireland’s Data Protection Commission ruled that its US data transfers were not GDPR compliant. The new adequacy decision concludes that the United States ensures an adequate level of protection, comparable to that of the European Union, for personal data transferred from the EU to US companies under the new framework. Personal data can now flow safely from the EU to US companies participating in the Framework, without having to put in place additional data protection safeguards under the GDPR. 

The Journey to Adequacy 

In July 2020, the European Court of Justice (ECJ) ruled in “Schrems II” that organisations transferring personal data to the USA can no longer rely on the Privacy Shield Framework as a legal transfer tool, as it failed to protect the rights of EU data subjects when their data was accessed by US public authorities. In particular, the ECJ found that US surveillance programs are not limited to what is strictly necessary and proportionate, as required by EU law, and hence do not meet the requirements of Article 52 of the EU Charter on Fundamental Rights. Secondly, with regard to US surveillance, EU data subjects lack actionable judicial redress and therefore do not have a right to an effective remedy in the USA, as required by Article 47 of the EU Charter. The ECJ stated that organisations transferring personal data to the USA can still use the Article 49 GDPR derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporter to make a complex assessment of the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs. Since the Schrems ruling, replacing the Privacy Shield has been a priority for EU and US officials. In March 2022, it was announced that a new Trans-Atlantic Data Privacy Framework had been agreed in principle. The US President signed an executive order in October 2022, giving effect to the US commitments in the framework and paving the way for the European Commission to publish a draft adequacy decision on 14th December 2022. 


The Changes

The EU-U.S. Data Privacy Framework (DPF) introduces new binding safeguards to address all the concerns raised by the European Court of Justice in Schrems. This includes limiting access to EU data by US intelligence services to what is necessary and proportionate, and establishing a Data Protection Review Court (DPRC), to which EU individuals will have access. The new framework introduces significant improvements compared to the mechanism that existed under the Privacy Shield. For example, if the DPRC finds that data was collected in violation of the new safeguards, it will be able to order the deletion of the data. The new safeguards in the area of government access to data will complement the obligations that US companies importing data from the EU will have to subscribe to. EU individuals will also benefit from several redress avenues in case their data is wrongly handled by US companies. This includes free of charge independent dispute resolution mechanisms and an arbitration panel. 


The Mechanics 

Just like the old Privacy Shield, US companies can self-certify their participation in the DPF by committing to comply with a detailed set of privacy obligations. These include privacy principles such as purpose limitation, data minimisation and data retention, as well as specific obligations concerning data security and the sharing of data with third parties. The DPF will be administered by the US Department of Commerce, which will process applications for certification and monitor whether participating companies continue to meet the certification requirements. Compliance will be enforced by the US Federal Trade Commission. Many US companies remain self-certified to Privacy Shield standards; consequently, it is not going to be a difficult task for them to transition to the DPF. As far as EU organisations go, all they need to do now, before making a transfer of personal data to the US, is check that the organisation receiving their personal data is certified under the DPF. More information, including the self-certification process, is expected to be posted on the US Department of Commerce’s new Data Privacy Framework website.

Impact on Other Data Transfer Tools  

The safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) apply to all data transfers under the GDPR to companies in the US, regardless of the transfer mechanism used. These safeguards therefore also facilitate the use of other transfer tools, such as standard contractual clauses and binding corporate rules. This means that, when conducting a transfer impact assessment, a data controller can refer to the DPF adequacy decision as a conclusive finding by the European Commission that the two big protections introduced in the USA by the related Executive Order, suitable restrictions on government surveillance and suitable redress for EEA data subjects, apply to transfers under its SCCs. This makes any needed transfer impact assessment for the USA very straightforward. 
It is important to note that this adequacy decision only covers transfers of personal data from the EU to the US. The UK Government is also working on an adequacy finding for the US and this decision should expedite the process. 

The new EU-US Data Privacy Framework will be discussed in detail on our forthcoming International Transfers workshop.

Middle East Data Protection Specialist Joins the Act Now Team

Suzanne Ballabás

Act Now Training is pleased to announce that Suzanne Ballabás, an experienced Dubai based data protection specialist, has joined its team of associates.  

Suzanne is a privacy professional with over ten years of practical experience in implementing privacy practices across various international organisations, in addition to acting as a compliance officer for multiple regulated entities within the UAE’s financial districts of DIFC and ADGM.  

Previously, Suzanne held the position of Head of Data Protection in the Middle East for Waystone, where she managed data protection infrastructure for over 100 firms and served as the Data Protection Officer for various organisations, including Michael Page, DP World Financial Services, and Waystone. She played a crucial role in establishing Waystone’s data privacy practice in the Middle East and possesses extensive knowledge of data protection laws and regulations in the UAE.

Before her time in Dubai, Suzanne was based in London, working with the GDPR and rolling out the international privacy programme for the accountancy practice Baker Tilly.  

Suzanne is a law graduate and holds multiple IAPP privacy qualifications, including Certified Information Privacy Professional/Europe (CIPP/E), Certified Information Privacy Manager (CIPM) and Certified Information Privacy Technologist (CIPT). She also specialises in ADGM Compliance (Financial Services), Money Laundering Reporting and International Human Resource Management. 

Suzanne said: 

“I am really pleased to be joining the Act Now team. I’m excited to start working with them to help deliver their excellent courses and training programmes particularly those targeted at the fast developing Middle East data protection landscape.” 

This is an exciting time for privacy law in the Middle East. Alongside the UAE law, which is awaiting executive regulations, Saudi Arabia and a number of other jurisdictions have passed DP laws similar to the GDPR. 

Ibrahim Hasan said: 

“Act Now’s reputation is growing in the UAE as a provider of practical training on all aspects of data protection. With Suzanne’s appointment we will be able to service more clients through delivery of our flagship courses, such as the UAE DPO Certificate, as well as develop new courses tailored for the Middle East market, helping practitioners understand the latest trends and developments in data protection law in the UAE and the wider Middle East.”  

For the past five years, Act Now has delivered training extensively in the Middle East to a wide range of delegates, including representatives of the telecommunications, legal and technology sectors. Check out our UAE privacy programme. We can also deliver customised in-house training, both remotely and face to face. Please get in touch to discuss your training or consultancy needs.

International Transfers Breach Results in Record GDPR Fine for Meta

The transfer of personal data between the EU and US is an ongoing legal and political saga. The latest development is yesterday’s largest ever GDPR fine of €1.2bn (£1bn), issued by Ireland’s Data Protection Commission (DPC) to Facebook’s owner, Meta Ireland. The DPC ruled that Meta infringed Article 46 of the EU GDPR in the way it transferred the personal data of its users from Europe to the US. 

The Law 

Chapter 5 of the EU GDPR mirrors the international transfer arrangements of the UK GDPR. There is a general prohibition on organisations transferring personal data to a country outside the EU, unless they ensure that data subjects’ rights are protected. This means that, if there is no adequacy decision in respect of the receiving country, one of the safeguards set out in Article 46 must be built into the arrangement. These include standard contractual clauses (SCCs) and binding corporate rules.
The former need to be included in a contract between the parties (data exporter and importer) and impose certain data protection obligations on both. 

The Problem with US Transfers 

In 2020, in a case commonly known as “Schrems II”, the European Court of Justice (ECJ) concluded that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal mechanism to ensure GDPR compliance. They must consider using the Article 49 derogations or SCCs. If using the latter, whether for transfers to the US or other countries, the ECJ placed the onus on data exporters to make a complex assessment of the recipient country’s data protection and surveillance legislation, and to put in place “additional supplementary measures” to those included in the SCCs. The problem with the US is that it has stringent surveillance laws which give law enforcement agencies access to personal data without adequate safeguards (according to the ECJ in Schrems). Therefore any additional measures must address this possibility and build in safeguards to protect data subjects. 

In the light of the above, the new EU SCCs were published in June 2021.
The European Data Protection Board has also published guidance on the aforementioned required assessment, entitled “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data”. Meta’s use of the new EU SCCs and its “additional supplementary measures” were the focus of the DPC’s attention when issuing its decision. 

The Decision 

The DPC ruled that Meta infringed Article 46(1) of GDPR when it continued to transfer personal data from the EU/EEA to the US following the ECJ’s ruling in Schrems II. It found that the measures used by Meta did not address the risks to the fundamental rights and freedoms of data subjects that were identified in Schrems; namely the risk of access to the data by US law enforcement.  

The DPC ruled that Meta should: 

  1. Suspend any future transfer of personal data to the US within five months of the date of the DPC’s decision; 
  1. Pay an administrative fine of €1.2 billion; and, 
  1. Bring its processing operations in line with the requirements of GDPR, within five months of the date of the DPC’s decision, by ceasing the unlawful processing, including storage, in the US of personal data of EEA users transferred in violation of GDPR. 

Meta has said that it will appeal the decision and seek a stay of the ruling before the Irish courts. Its President of Global Affairs, Sir Nick Clegg, said:  

“We are therefore disappointed to have been singled out when using the same legal mechanism as thousands of other companies looking to provide services in Europe. 

“This decision is flawed, unjustified and sets a dangerous precedent for the countless other companies transferring data between the EU and US.” 

The Future of US Transfers 

The Information Commissioner’s Office told the BBC that the decision “does not apply in the UK” but said it had “noted the decision and will review the details in due course”. However, the wider legal ramifications for data transfers from the UK to the US cannot be ignored. 

Personal data transfers are also a live issue for most UK Data Controllers, including public authorities. Whether using an online meeting app, a cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. The new UK International Data Transfer Agreement (IDTA) came into force on 21st March 2022, but it still requires a Transfer Risk Assessment, as well as supplementary measures where privacy risks are identified.  

On 25th March 2022, the European Commission and the United States announced that they had agreed in principle on a new Trans-Atlantic Data Privacy Framework. The final agreement, which will replace the Privacy Shield Framework, is expected to be in place in summer 2023. It is expected that the UK Government will strike a similar deal once the EU/US one is finalised. However, both are likely to be challenged in the courts. 

The Meta fine is one of this year’s major GDPR developments, nicely timed within a few days of the fifth anniversary of GDPR. All organisations, whether in the UK or the EU, need to carefully consider their data transfer mechanisms and ensure that they comply with Chapter 5 of GDPR in the light of the DPC’s ruling. A “wait and see” approach is no longer an option.  

The Meta fine will be discussed in detail on our forthcoming International Transfers workshop. For those who want a one-hour summary of the UK international transfer regime, we recommend our webinar. 

The California Consumer Privacy Act (CCPA) and the CPRA: What’s Changed? 

Californian privacy law is about to change once again thanks to the California Privacy Rights Act (CPRA) which will become fully enforceable on 1st July 2023. 

The current law is set out in the California Consumer Privacy Act (CCPA) which has been in force since 1st July 2020. CCPA regulates the processing of California consumers’ personal data, regardless of where a company is located. It provides broader rights to consumers, and stricter compliance requirements for businesses, than any other US state or federal privacy law. 

Like the EU General Data Protection Regulation (GDPR), CCPA is about giving individuals control over how their personal data is used by organisations. It requires transparency about how such data is collected, used and shared. It gives Californian consumers various rights including the right to: 

  • Know and access the personal data being collected about them 
  • Know whether their personal data is being sold, and to whom 
  • Opt out of having their personal data sold 
  • Have their personal data deleted upon request 
  • Avoid discrimination for exercising their rights 

CPRA is not a new law; it amends the CCPA to give Californians even more control over their personal data.  The key provisions include: 

  • Changing the CCPA’s definition of Personal Information 
  • Creating a new data category called “Sensitive Personal Information” similar to Special Category Data under GDPR 
  • Changing the scope of the CCPA  
  • Adding new rights e.g. to correct inaccurate information and to limit use and disclosure of sensitive personal information 
  • Shifting the regulatory focus towards behavioural advertising  
  • Adding additional requirements for businesses (closely modelled on the GDPR Data Protection Principles), namely data minimisation, purpose limitation and storage limitation 
  • Expanding the CCPA’s current consent requirements to cover, amongst others, selling or sharing personal information after a user has already opted out, and selling or sharing the personal information of minors  

Whilst becoming fully enforceable on 1st July 2023, CPRA has a 12-month “lookback period”: its new rights apply to personal information collected from 1st January 2022. 

Until recently, CCPA did not have a dedicated regulator like the Information Commissioner in the UK. It was primarily enforced by the Office of the Attorney General through the courts, although there is a private right of action for a security breach. The courts can impose fines for breaches of CCPA depending on the nature of the breach: 

  • $2,500 for an unintentional and $7,500 for an intentional breach  
  • $100-$750 per incident per consumer, or actual damages, if higher – for damage caused by a security breach.  
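Because the security-breach damages apply per consumer per incident, exposure scales directly with the size of a breach. As a rough illustrative sketch (the incident size below is hypothetical, and this is arithmetic only, not legal advice):

```python
# Illustrative sketch of CCPA statutory damages exposure for a security
# breach: $100-$750 per consumer per incident (Civil Code figures quoted
# above). The 10,000-consumer incident below is a hypothetical example.

def breach_exposure(consumers: int,
                    per_consumer_min: float = 100.0,
                    per_consumer_max: float = 750.0) -> tuple[float, float]:
    """Return the (minimum, maximum) statutory damages range in dollars."""
    return consumers * per_consumer_min, consumers * per_consumer_max

low, high = breach_exposure(10_000)  # hypothetical incident size
print(f"${low:,.0f} - ${high:,.0f}")  # $1,000,000 - $7,500,000
```

Even a modest 10,000-consumer incident puts the statutory range into seven figures, before any actual damages are considered.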

CPRA establishes the California Privacy Protection Agency (CPPA) which has the authority to investigate potential breaches and violations, and to draft enforcement regulations. It has produced new CPRA Regulations providing rules on service provider contracts, dark patterns, and the recognition of “global opt-out” browser signals. 
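The “global opt-out” browser signals referenced in the CPRA Regulations are based on the Global Privacy Control (GPC) proposal, under which a participating browser attaches a `Sec-GPC: 1` request header. A minimal sketch of detecting that signal might look like the following (the shape of the headers mapping is an assumption for illustration; real request objects vary by framework):

```python
# Minimal sketch of recognising the Global Privacy Control ("global
# opt-out") browser signal. Per the GPC proposal, a participating browser
# sends the request header "Sec-GPC: 1". The plain dict of headers used
# here is a hypothetical stand-in for a real web framework's request.

def wants_opt_out(headers: dict[str, str]) -> bool:
    """True if the request carries a valid GPC opt-out signal."""
    # HTTP header names are case-insensitive, so normalise before lookup.
    normalised = {k.lower(): v.strip() for k, v in headers.items()}
    return normalised.get("sec-gpc") == "1"

# Usage examples:
assert wants_opt_out({"Sec-GPC": "1"})
assert not wants_opt_out({"User-Agent": "Mozilla/5.0"})
```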

While the CCPA fines and damages may appear relatively low, it is important to note that they apply per breach. A privacy incident can affect thousands or tens of thousands of consumers, in which case it could cost a company hundreds of thousands or even millions of dollars. In the first three years of the CCPA’s existence, 320 lawsuits were filed across 28 states, according to a report by Akin, a US law firm. It found that: 

  • More than 80% of CCPA lawsuits in 2022 corresponded to a breach notice filed with the California Attorney General’s Office, and businesses that report a data breach to the AG’s office have about a 15% chance of facing subsequent consumer litigation. 
  • Breaches affecting at least 100,000 people accounted for 56% of lawsuits in 2022 stemming from data breaches. 
  • Financial services companies accounted for 34% of cases in 2022, by far the highest rate of any industry. Medical services and software/technology each comprised 13%.

All US-based businesses, as well as those elsewhere processing Californian residents’ personal information, need to consider how CPRA will impact their data management and start the implementation process immediately. Recent media headlines about the exploitation of personal data by AI and social media companies mean people are more concerned than ever about what is happening to their data. 

Ibrahim Hasan will be speaking about the CCPA and CPRA at the MER Information Governance Conference in Chicago in May.  

Interested in US privacy law? Check out our US privacy programme 

Saudi Arabian Data Protection Law Update 

In September 2021, Saudi Arabia announced its first ever data protection law. The Personal Data Protection Law (PDPL) was implemented by Royal Decree M/19 of 9/2/1443H approving Resolution No. 98 dated 7/2/1443H (14th September 2021). PDPL will regulate the collection, handling, disclosure and use of personal data, and includes governance and transparency obligations. It will initially be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA). 

PDPL was originally going to come fully into force on 23rd March 2022. However, in November 2022, SDAIA published proposed amendments for public consultation. On 21st March 2023, some of these amendments were passed by the Saudi Council of Ministers. PDPL will now officially come into force on 14th September 2023, and organisations will have until 13th September 2024 to comply. Much of the detail of the new law will be set out in the Executive Regulations, which are still awaited, although a draft version was issued last year. 

The amendments to PDPL introduce several concepts that will align the new law more closely to the EU General Data Protection Regulation (GDPR) and the UK GDPR. These include: 

  • New Ground for Processing: As under the GDPR, Data Controllers may now rely on “legitimate interests” as a lawful basis to process personal data; this does not apply to sensitive personal data, or to processing that contravenes the rights granted under PDPL and its Executive Regulations.  
     
  • Easier International Transfers: Like other data protection regimes, PDPL imposes limitations on the international transfer of personal data outside the KSA. The strict prohibition on transfers outside Saudi Arabia has now been relaxed, and transfers no longer require approval from SDAIA. Data Controllers will need a specific purpose to transfer data outside the Kingdom, and transfers appear to be limited to territories that SDAIA determines as having an appropriate level of protection for personal data; this will be clarified further once SDAIA issues evaluation criteria for the purpose. The pending Executive Regulations should set out exemptions from this condition.  
     
  • Removal of Controller Registration Requirements: The original law required Data Controllers to register on an electronic portal that would form a national record. This provision has now been removed. However, SDAIA has the mandate to license auditors and accreditation entities and create a national register if it determines that it would be an appropriate tool and mechanism for monitoring the compliance of controllers. 
     
  • Data Breach Notification Relaxed: Notifications of personal data breaches to SDAIA are no longer required “immediately.” However, controllers must now notify data subjects when a breach threatens personal data or contravenes the data subject’s rights or interests. The pending regulations are expected to provide additional specificity, such as particular dates for notifying data breaches and threshold requirements.  
     
  • Criminal Offences Reduced: The penalty for breaching PDPL will be a warning or a fine of up to SAR 5,000,000 (approximately USD 1,333,000), which may be doubled for repeat offences. Criminal sanctions for violating PDPL’s data transfer restrictions have been removed. Only one criminal offence now remains, relating to the disclosure or publication of sensitive personal data in violation of the law.  
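The penalty arithmetic above can be sketched briefly. Note that the SAR/USD rate used (3.75, the riyal’s long-standing peg) is an assumption for illustration; actual fines would be set by the enforcing authority:

```python
# Illustrative sketch of the PDPL maximum fine, doubled for repeat
# offences. The SAR/USD conversion rate (3.75, the riyal's peg) is an
# assumption for illustration only.

SAR_PER_USD = 3.75
MAX_FINE_SAR = 5_000_000

def fine_cap_sar(repeat_offence: bool) -> int:
    """Maximum PDPL fine in SAR; doubled for repeat offences."""
    return MAX_FINE_SAR * 2 if repeat_offence else MAX_FINE_SAR

print(f"First offence cap: SAR {fine_cap_sar(False):,} "
      f"(~USD {fine_cap_sar(False) / SAR_PER_USD:,.0f})")
print(f"Repeat offence cap: SAR {fine_cap_sar(True):,}")
```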

Action Plan for Compliance 

Businesses established in Saudi Arabia, as well as those processing Saudi citizens’ personal data anywhere in the world, have sixteen months to prepare for PDPL. Considering that those covered by GDPR had two years, this is not a long time. Now is the time to put systems and processes in place to ensure compliance. Failure to do so will not just lead to enforcement action but also reputational damage.  

The following should be part of an action plan for compliance: 

  1. Raising awareness about PDPL at all levels. Our GDPR elearning course can be tailored for frontline staff. 
  2. Carrying out a data audit and reviewing how records management and information risk are addressed. 
  3. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification. 
  4. Revising privacy policies in the light of the more prescriptive transparency requirements.  
  5. Writing policies and procedures to deal with new and revised Data Subject rights such as Data Portability and Subject Access. 
  6. Appointing and training a Data Protection Officer.  

The new KSA data protection law is an important development in Middle East privacy law alongside the passing of the new UAE Federal DP law.
These laws, being closely aligned with the EU General Data Protection Regulation (GDPR) and the UK GDPR, open up exciting job opportunities for UK and EU Data Protection professionals. A quick scan of jobs sites shows a growing number of prospects. 

Act Now in the Middle East 

Act Now Training can help your business prepare for PDPL. We have delivered training extensively in the Middle East to a wide range of delegates, including representatives of the telecommunications, legal and technology sectors. Check out our UAE privacy programme. We can also deliver customised in-house training, both remotely and face to face.
Please get in touch to discuss your training or consultancy needs.  

Our new Intermediate Certificate in GDPR Practice includes a module on worldwide data protection laws. 

Exploring the Legal and Regulatory Challenges of AI and Chat GPT 

In our recent blog post, entitled “GDPR and AI: The Rise of the Machines”, we said that 2023 is going to be the year of Artificial Intelligence (AI). Events so far suggest that advances in the technology, as well as legal and regulatory challenges, are on the horizon.   

Generative AI, particularly large language models like ChatGPT, has captured the world’s imagination. ChatGPT registered 100 million monthly users in January, having only launched in November, making it the fastest-growing platform on record; TikTok took nine months to hit the same usage level. In March 2023 it recorded 1.6 billion user visits, a mind-boggling figure that shows the scale of the technology’s reach. There have already been some remarkable medical uses of generative AI, including matching drugs to patients, reported cancer research breakthroughs and robot-assisted surgery. 
 
However, it is important to take a step back and reflect on the risks of a technology that has made its own CEO “a bit scared” and caused the “Godfather of AI” to quit his job at Google. The regulatory and legal backlash against AI has already started. Recently, Italy became the first Western country to block ChatGPT, with the Italian DPA highlighting privacy concerns relating to the model. Other European regulators are reported to be looking into the issue too. In April the European Data Protection Board launched a dedicated task force on ChatGPT, saying the goal is to “foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities.” Elsewhere, Canada has opened an investigation into OpenAI following a complaint alleging that personal information was collected, used and disclosed without consent. 

The UK Information Commissioner’s Office (ICO) has expressed its own concerns. Stephen Almond, Director of Technology and Innovation at the ICO, said in a blog post: 

“Data protection law still applies when the personal information that you’re processing comes from publicly accessible sources…We will act where organisations are not following the law and considering the impact on individuals.”  

Wider Concerns 

ChatGPT suffered its first major personal data breach in March.
According to a blog post by OpenAI, the breach exposed payment-related and other personal information of 1.2% of ChatGPT Plus subscribers. But the concerns around AI and ChatGPT don’t stop at privacy law.   

An Australian mayor is considering a defamation suit against ChatGPT after it told users that he was jailed for bribery; in reality he was the whistleblower in the bribery case. Similarly, it falsely accused a US law professor of sexual assault. The Guardian reported recently that ChatGPT is making up fake Guardian articles. There are copyright concerns too: a number of songs that used AI to clone the voices of artists, including Drake and The Weeknd, have since been removed from streaming services after criticism from music publishers. There have also been fully AI-generated Joe Rogan podcast episodes featuring the OpenAI CEO and Donald Trump; they are worth a sample, and it is frankly scary how realistic they are. 

AI also poses a significant threat to jobs. A report by investment bank Goldman Sachs says it could replace the equivalent of 300 million full-time jobs. Our director, Ibrahim Hasan, recently gave his thoughts on this topic to BBC News Arabic. (You can watch him here. If you just want to hear Ibrahim “speak in Arabic” skip the video to 2min 48 secs!) 
 

EU Regulation 

With increasing concern about the future risks AI could pose to people’s privacy, their human rights or their safety, many experts and policy makers believe AI needs to be regulated. The European Union’s proposed legislation, the Artificial Intelligence (AI) Act, focuses primarily on strengthening rules around data quality, transparency, human oversight and accountability. It also aims to address ethical questions and implementation challenges in various sectors ranging from healthcare and education to finance and energy. 

The Act also envisages grading AI products according to how potentially harmful they might be and staggering regulation accordingly. For example, an email spam filter would be more lightly regulated than a tool designed to diagnose a medical condition, and some AI uses, such as social scoring by governments, would be prohibited altogether. 

UK White Paper 

On 29th March 2023, the UK government published a white paper entitled “A pro-innovation approach to AI regulation.” The paper sets out a new “flexible” approach to regulating AI which is intended to build public trust and make it easier for businesses to grow and create jobs. Unlike the EU there will be no new legislation to regulate AI. In its press release, the UK government says: 

“The government will avoid heavy-handed legislation which could stifle innovation and take an adaptable approach to regulating AI. Instead of giving responsibility for AI governance to a new single regulator, the government will empower existing regulators – such as the Health and Safety Executive, Equality and Human Rights Commission and Competition and Markets Authority – to come up with tailored, context-specific approaches that suit the way AI is actually being used in their sectors.” 

The white paper outlines the following five principles that regulators are to consider in order to facilitate the safe and innovative use of AI in their industries: 

  • Safety, Security and Robustness: applications of AI should function in a secure, safe and robust way where risks are carefully managed; 

  • Transparency and Explainability: organisations developing and deploying AI should be able to communicate when and how it is used, and explain a system’s decision-making process at a level of detail that matches the risks posed by the use of the AI; 

  • Fairness: AI should be used in a way which complies with the UK’s existing laws (e.g., the UK General Data Protection Regulation), and must not discriminate against individuals or create unfair commercial outcomes; 

  • Accountability and Governance: measures are needed to ensure there is appropriate oversight of the way AI is being used and clear accountability for the outcomes; and 

  • Contestability and Redress: people need to have clear routes to dispute harmful outcomes or decisions generated by AI. 

Over the next 12 months, regulators will be tasked with issuing practical guidance to organisations, as well as other tools and resources such as risk assessment templates, that set out how the above five principles should be implemented in their sectors. The government has said this could be accompanied by legislation, when parliamentary time allows, to ensure consistency among the regulators. 

Michelle Donelan MP, Secretary of State for Science, Innovation and Technology, considers that this light-touch, principles-based approach “will enable . . . [the UK] to adapt as needed while providing industry with the clarity needed to innovate.” However, this approach makes the UK an outlier in comparison to global trends. Many other countries are developing or passing special laws to address alleged AI dangers, such as the algorithmic rules imposed in China or the United States. Consumer groups and privacy advocates will also be concerned about the risks to society in the absence of detailed and unified statutory AI regulation.  

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents focussing on GDPR as well as other information governance and records management issues.  

The New Data Protection Bill: What it means for DP and the public sector

In March, the UK Department for Science, Innovation and Technology (DSIT) published the Data Protection and Digital Information (No.2) Bill. The Bill is now going through Parliament. If enacted, it will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

Our director, Ibrahim Hasan, recently took part in a webinar organised by eCase. In this 45-minute session, Ibrahim, alongside Data Protection experts Jon Baines of Mishcon de Reya and Lynn Wyeth of Leicester City Council, discusses the new Bill, including:

  • The key changes
  • The differences with the current regime
  • What the changes mean for the public sector
  • The challenges

To access the webinar recording click here (note: eCase email registration required).

The new Bill will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates
