The aim of the “Digital Omnibus” package is to ease administrative burdens for businesses across areas like privacy, cybersecurity and artificial intelligence. Although the EU GDPR is considered balanced and fit for purpose, “targeted changes” are proposed to address concerns, particularly from smaller companies. These include:
Clarification of Definitions: The definition of “personal data” is clarified. Information is not considered personal data in the hands of a company if that company does not possess means “reasonably likely” to be used to identify an individual.
Processing for AI Training: It is clarified that the processing of personal data for the development and training of AI systems can constitute a “legitimate interest” under certain conditions.
Simplified Reporting of Data Breaches: The reporting obligation to supervisory authorities is aligned with the threshold for notifying data subjects. A report is only required if there is a “high risk” to the rights and freedoms of natural persons. The deadline for reporting is extended to 96 hours.
Harmonization of Data Protection Impact Assessments (DPIA): National lists of processing operations requiring a DPIA (or not) are to be replaced by unified EU-wide lists to promote harmonisation.
Scientific Research: The conditions for data processing for scientific research purposes are clarified by defining “scientific research” and clarifying that this constitutes a legitimate interest.
The EU AI Act also faces a number of amendments, including simplifications for small and medium-sized enterprises and small mid-cap companies in the form of pared-back technical documentation requirements. Other measures involve sandboxes for real-world testing and to “reinforce the AI Office’s powers and centralise oversight of AI systems built on general-purpose AI models, reducing governance fragmentation”.
Both omnibus packages now have a long road ahead as they enter trilogue negotiations with the European Parliament and the Council of the European Union. Negotiations are expected to take at least several months to finalise.
Impact on the UK
The UK has already enacted its own package of amendments to the UK GDPR in the form of the Data (Use and Access) Act 2025 which received Royal Assent on 19th June 2025. The amendments are quite modest even before comparing them to the EU proposals above.
A bolder list of amendments was contained in the Data Protection and Digital Information Bill, published in 2022 by the Conservative Government. This included proposals to amend the definition of personal data and to replace Data Protection Officers with Senior Responsible Individuals. The bill was later replaced by a diluted bill of the same name (the No. 2 Bill), only for that to be dropped in the Parliamentary “wash up” stage before the last General Election.
Could the EU reforms (if enacted) lead to the UK making more fundamental changes to the UK GDPR? We doubt it. The Labour Government has more pressing priorities and, with the passing of the DUA Act, it can say it has “done GDPR reform”. If we get a change in Government, then Reform and the Conservatives might target the UK GDPR as a way of reining in “pesky human rights laws”.
Data protection professionals need to assess the changes to the UK data protection regime made by the DUA Act. Our half-day workshop will explore the Act in detail, giving you an action plan for compliance. A revised UK GDPR Handbook is now available incorporating the changes made by the DUA Act.
The EU AI Act comes into force today although not all the provisions will become enforceable straight away.
The Act sets out comprehensive rules for AI applications, including a risk-based system to address potential threats to health and safety, and human rights. It will ban certain AI applications that pose an “unacceptable risk,” including real-time and remote biometric identification systems such as facial recognition. Additionally, it will impose strict obligations on those considered “high risk,” encompassing AI used in EU-regulated product safety categories, for example, cars and medical devices. These obligations include adherence to data governance standards, transparency rules, and the incorporation of human oversight mechanisms.
February 2025: Chapters I (general provisions) & II (prohibited AI systems) will apply
August 2025: Chapter III Section 4 (notifying authorities), Chapter V (general purpose AI models), Chapter VII (governance), Chapter XII (penalties), and Article 78 (confidentiality) will apply, except for Article 101 (fines for General Purpose AI providers)
August 2026: the whole AI Act applies, except for Article 6(1) & corresponding obligations (one of the categories of high-risk AI systems)
August 2027: Article 6(1) & corresponding obligations apply.
In the UK, despite media reports, the King’s Speech did not include a bill to regulate AI. The King said that the government would “seek to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”. Expect a government consultation to be announced soon.
Our AI Act workshop will help you understand the new law in detail and its interaction with the UK’s objectives and strategy for AI regulation.
Here’s a pub quiz question for you: “Can a Data Controller circumvent the requirements of data protection law by disclosing personal data verbally rather than in writing?” The answer was “Yes” under the old Data Protection Act 1998. In Scott v LGBT Foundation Ltd [2020] WLR 62, the High Court rejected a claim that the LGBT Foundation had breached, amongst other things, the claimant’s data protection rights by disclosing information about him to a GP. The court held that the 1998 Act did not apply to purely verbal communications.
Nowadays though, the answer to the above question is no; the oral disclosure of personal data amounts to “processing” as defined by Article 4(2) of the GDPR. So said the Court of Justice of the European Union (CJEU), on 7th March 2024, in a preliminary ruling in the Endemol Shine Finland case.
The subject of the ruling was a television company which makes a number of reality TV shows in Finland. It had been organising a competition and sought information from the District Court of South Savo about possible criminal proceedings involving one of the competition participants. It requested the District Court to disclose the information orally rather than in writing. The District Court refused the request on the basis that there was no legitimate reason for processing the criminal offence data under the Finnish law implementing Article 10 of the GDPR. On appeal, Endemol Shine Finland argued that the GDPR did not apply, as the oral disclosure of the information would not constitute processing of personal data under the GDPR.
Article 4(2) GDPR defines “processing” as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means”. On the face of it, this covers oral processing. However, Article 2 states that GDPR applies to processing of personal data “wholly or partly by automated means”, and processing by non-automated means which “forms or is intended to form part of a filing system.” Article 4(6) GDPR defines “filing system” broadly, covering “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis”.
The Finnish Court of Appeal requested a preliminary ruling from the CJEU on the meaning of Article 4(2) and whether the particular processing in this case came within the material scope of the GDPR under Article 2. The CJEU held that the concept of processing in Article 4(2) of the GDPR necessarily covered the oral disclosure of personal data. It said the wording of the Article made it apparent that the EU legislature intended to give the concept of processing a broad scope. The court pointed out that the GDPR’s objective was “to ensure a high level of protection of the fundamental rights and freedoms of natural persons” and that “circumventing the application of that regulation by disclosing personal data orally rather than in writing would be manifestly incompatible with that objective”.
The CJEU went on to consider whether the oral processing of the data would fall within the material scope of the GDPR under Article 2. It held that it was clear from the request for a preliminary ruling that the personal data sought from the District Court of South Savo is contained in “a court’s register of persons” which appeared to be a filing system within the meaning of Article 4(6), and therefore fell within the scope of the GDPR.
UK Data Controllers should note the wording of Article 4 and Article 2 of the UK GDPR is the same as in the EU GDPR. So whilst this ruling from the CJEU is not binding on UK courts, it would be wise to assume that picking up the phone and making an oral disclosure of personal data will not allow the UK GDPR to be circumvented.
European representatives in Strasbourg recently concluded an extensive 37-hour discussion, resulting in the world’s first comprehensive framework for regulating artificial intelligence. This ground-breaking agreement, facilitated by European Commissioner Thierry Breton and Spain’s AI Secretary of State, Carme Artigas, is set to shape how social media and search engines operate, impacting major companies.
The deal, achieved after lengthy negotiations and hailed as a significant milestone, puts the EU at the forefront of AI regulation globally, surpassing the US, China, and the UK. The new legislation, expected to be enacted by 2025, involves comprehensive rules for AI applications, including a risk-based system to address potential threats to health, safety, and human rights.
Key components of the agreement include strict controls on AI-driven surveillance and real-time biometric technologies, with specific exceptions for law enforcement under certain circumstances. The European Parliament ensured a ban on such technologies, except in cases of terrorist threats, search for victims, or serious criminal investigations.
MEPs Brando Benifei and Dragoș Tudorache, who led the negotiations, emphasised the aim of developing an AI ecosystem in Europe that prioritises human rights and values. The agreement also includes provisions for independent authorities to oversee predictive policing and uphold the presumption of innocence.
Tudorache highlighted the balance struck between equipping law enforcement with necessary tools and banning AI technologies that could pre-emptively identify potential criminals. (Minority Report anyone?) The highest risk AI systems will now be regulated based on the computational power required for training, with GPT-4 being a notable example and the only technology fulfilling this criterion.
Some Key Aspects
The new EU AI Act delineates distinct regulations for AI systems based on their perceived level of risk, effectively categorising them into “Unacceptable Risk,” “High Risk,” “Generative AI,” and “Limited Risk” groups, each with specific obligations for providers and users.
Unacceptable Risk
AI systems deemed a threat to people’s safety or rights will be prohibited. This includes:
AI-driven cognitive behavioural manipulation, particularly targeting vulnerable groups, like voice-activated toys promoting hazardous behaviours in children.
Social scoring systems that classify individuals based on behaviour, socio-economic status, or personal characteristics.
Real-time and remote biometric identification systems, like facial recognition.
Exceptions exist, such as “post” remote biometric identification for serious crime investigations, subject to court approval.
High Risk
AI systems impacting safety or fundamental rights fall under high-risk, subdivided into:
AI in EU-regulated product safety categories, like toys, aviation, cars, medical devices, and lifts.
Specific areas requiring EU database registration, including biometric identification, critical infrastructure management, education, employment, public services access, law enforcement, migration control, and legal assistance.
High-risk AI systems must undergo pre-market and lifecycle assessments.
Generative AI
AI like ChatGPT must adhere to transparency protocols:
Disclosing AI-generated content.
Preventing generation of illegal content.
Publishing summaries of copyrighted data used in training.
Limited Risk
These AI systems require basic transparency for informed user decisions, particularly for AI that generates or manipulates visual and audio content, like deepfakes. Users should be aware when interacting with AI.
The legislation sets a precedent for future digital regulation. As we saw with the GDPR, Governments outside the EU used the legislation as a foundation for their own laws and many corporations adopted the same privacy standards from within Europe for their businesses worldwide for efficiency. This could easily happen in the case of the EU AI Act with governments using it as a ‘starter for ten’. It will be interesting to see how the legislation will cater for algorithmic biases found within current iterations of the technology from facial recognition technology to other automated decision making algorithms.
The UK did publish its AI White Paper in March of this year and says it follows a “Pro-Innovation” approach. However, it seems to have decided to go ‘face first’ before any legislation is passed with facial recognition software recently used in the Beyoncé gig, King Charles’ coronation and during the Formula One Grand Prix. For many, it is the impact of the decision making the software is formulating through the power of AI which is worrying. The ICO does have useful guides on the use of AI which can be found here.
As artificial intelligence technology rapidly advances, exemplified by Google’s impressive Gemini demo, the urgency for comprehensive regulation was becoming increasingly apparent. The EU has signalled its intent to avoid past oversights seen in the unchecked expansion of tech giants and be at the forefront of regulating this fascinating technology to ensure its ethical and responsible utilisation.
On 17th May 2022, the Council of the European Union adopted the Data Governance Act (DGA), or Regulation on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act) (2020/0340 (COD)) to give its full title. The Act aims to boost data sharing in the EU, allowing companies to have access to more data to develop new products and services.
The DGA will achieve its aims through measures designed to increase trust in relation to data sharing, creating new rules on the neutrality of data marketplaces and facilitating the reuse of public sector data. The European Commission says in its Questions and Answers document:
“The economic and societal potential of data use is enormous: it can enable new products and services based on novel technologies, make production more efficient, and provide tools for combatting societal challenges”.
Application
The DGA will increase the amount of data available for re-use within the EU by allowing public sector data to be used for purposes different than the ones for which it was originally collected. The Act will also create sector-specific data spaces to enable the sharing of data within a specific sector e.g. transport, health, energy or agriculture.
Data is defined as “any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audiovisual recording” that is held by public sector bodies and which is not subject to the Open Data Directive but is subject to the rights of others. Examples include data generated by GPS and healthcare data, which if put to productive use, could contribute to improving the quality of services. The Commission estimates that the Act could increase the economic value of data by up to €11 billion by 2028.
Each EU Member State will be required to establish a supervisory authority to act as a single information point providing assistance to governments. They will also be required to establish a register of available public sector data. The European Data Innovation Board (see later) will have oversight responsibilities and maintain a central register of available DGA Data.
The European Commission believes that, in order to encourage individuals to allow their data to be shared, they should trust the process by which such data is handled. To this end, the DGA creates data sharing service providers known as “data intermediaries”, which will handle the sharing of data by individuals, public bodies and private companies. The idea is to provide an alternative to the existing major tech platforms.
To uphold trust in data intermediaries, the DGA puts in place several protective measures. Firstly, intermediaries will have to notify public authorities of their intention to provide data-sharing services. Secondly, they will have to commit to the protection of sensitive and confidential data. Finally, the DGA imposes strict requirements to ensure the intermediaries’ neutrality. These providers will have to distinguish their data sharing services from other commercial operations and are prohibited from using the shared data for any other purposes.
Data Altruism
The DGA encourages data altruism. This is where data subjects (or holders of non-personal data) consent to their data being used for the benefit of society e.g. scientific research purposes or improving public services. Organisations who participate in these activities will be entered into a register held by the relevant Member State’s supervisory authority. In order to share data for these purposes, a data altruism consent form will be used to obtain data subjects’ consent.
The DGA will also create a European Data Innovation Board. Its mission will be to oversee the data sharing service providers (the data intermediaries) and provide advice on best practices for data sharing.
The UK
Brexit means that the DGA will not apply in the UK, although it clearly may affect UK businesses doing business in the EU. It remains to be seen whether the UK will take a similar approach, although it is notable that UK proposals for amending the GDPR include “amending the law to facilitate innovative re-use of data for different purposes and by different data controllers.”
The DGA will shortly be published in the Official Journal of the European Union and enter into force 20 days after publication. The new rules will apply 15 months thereafter. To further encourage data sharing, on 23 February 2022 the European Commission proposed a Data Act that is currently being worked on.
This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September.
This new course is specially designed for Data Protection Officers and privacy practitioners, based in the EU and internationally, whose role involves advising on the EU GDPR and associated privacy legislation. The content of the course has been developed after analysing all the knowledge, practical skills and competencies required for the EU DPO to successfully navigate the European data protection landscape.
This course builds on Act Now’s very popular UK GDPR Practitioner certificate course which has been attended by hundreds of DPOs throughout the UK and abroad since its launch in 2017. Our teaching style is based on practical and engaging workshops covering theory alongside hands-on application using case studies that equip delegates with knowledge and skills that can be used immediately. Personal tutor support throughout the course will ensure the best opportunity for success. Delegates will also receive a comprehensive set of course materials, including our very popular EU GDPR Handbook (RRP £34.99), as well as access to our online Resource Lab, which includes over 20 hours of videos on key aspects of the syllabus.
The EU GDPR Practitioner Certificate course takes place over four days (one day per week) and involves workshops, case studies and exercises. This is followed by a written assessment. Delegates are then required to complete a practical project (in their own time) to achieve the certificate. Whether delivered online or in the classroom, delegates will receive all the fantastic features of the course specifically tailored for each learning environment.
The EU GDPR Practitioner Certificate course builds on Act Now’s track record for delivering innovative and high quality practical training for information governance professionals:
In April 2014 we launched the highly successful Data Protection Practitioner Certificate, the first course to teach essential knowledge and practical DPO skills.
In May 2016 we launched the Foundation Certificate in Information Governance. This was the first fully online certificated course covering data protection, information security, freedom of information and records management.
In December 2018, we launched the FOI Practitioner Certificate, the first such course to use modern assessment methods rather than rote learning.
In April 2020, at the start of the Pandemic, we launched our new online GDPR Practitioner Certificate to enable DPOs working from home to gain a qualification.
In November 2020, we launched the Advanced Certificate in GDPR Practice, which is the first advanced qualification specifically designed for DPOs.
In November 2021 we won the Information and Records Management Society (IRMS) Supplier of the year award.
“We have looked at every aspect of this course to ensure it equips EU Data Protection Officers with the knowledge and skills they need to implement the EU GDPR in a practical way. Because of its emphasis on practical skills, and the success of our UK GDPR Practitioner certificate course, we are confident that this course will become the qualification of choice for current and future EU Data Protection Officers.”
On 25th March 2022, the European Commission and the United States announced that they have agreed in principle on a new Trans-Atlantic Data Privacy Framework. The final agreement will replace the Privacy Shield Framework as a mechanism for lawfully transferring personal data from the EEA to the US in compliance with Article 44 of the GDPR. As far as UK/US data transfers and compliance with the UK GDPR are concerned, it is expected that the UK Government will strike a similar deal once the EU/US one is finalised.
The need for a “Privacy Shield 2.0” arose two years ago, following the judgment of the European Court of Justice (ECJ) in “Schrems II” which stated that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal transfer tool. They must consider using the Article 49 derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporters to make a complex assessment about the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs. The problem with the US is that it has stringent surveillance laws which give law enforcement agencies access to personal data without adequate safeguards (according to the ECJ in Schrems).
Despite the Schrems II judgment, many organisations have continued to transfer personal data to the US, hoping that regulators will wait for a new deal before enforcing Article 44. Whilst the UK Information Commissioner’s Office (ICO) seems to still have a “wait and see” approach, others have started to enforce. In February 2022, the French Data Protection Regulator, CNIL, ruled that use of Google Analytics was a breach of the GDPR due to the data being transferred to the US without appropriate safeguards. This followed a similar decision by the Austrian Data Protection Authority in January.
Personal data transfers are also a live issue for most UK Data Controllers including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, which one does not involve a transfer of personal data to the US? At present use of such services usually involves a complicated TRA and execution of standard contractual clauses. In the UK, a new international data transfer agreement (IDTA) came into force on 21st March 2022 but it still requires a TRA as well as supplementary measures where privacy risks are identified.
So has the Trans-Atlantic Data Privacy Framework saved DPOs hours of work? Before you break open the bubbly, it is important to understand that this is just an agreement in principle. The parties will now need to draft legal documents to reflect the agreed principles. This will take at least a few months and will then have to be reviewed by the European Data Protection Board (EDPB), adding more time. And of course there is the strong possibility of a legal challenge, especially if the ECJ’s concerns about US surveillance laws are not addressed. Max Schrems said in a statement:
“We already had a purely political deal in 2015 that had no legal basis. From what you hear we could play the same game a third time now. The deal was apparently a symbol that von der Leyen wanted, but does not have support among experts in Brussels, as the US did not move. It is especially appalling that the US has allegedly used the war on Ukraine to push the EU on this economic matter.”
“The final text will need more time, once this arrives we will analyze it in depth, together with our US legal experts. If it is not in line with EU law, we or another group will likely challenge it. In the end, the Court of Justice will decide a third time. We expect this to be back at the Court within months from a final decision.”
“It is regrettable that the EU and US have not used this situation to come to a ‘no spy’ agreement, with baseline guarantees among like-minded democracies. Customers and businesses face more years of legal uncertainty.”
What should organisations do in the meantime? Our view is, if you have any choice in the matter, stick to personal data transfers to adequate countries i.e. those which have been deemed adequate by the UK/EU under Article 45. This will save a lot of time and head scratching conducting TRAs and executing SCCs. Where a transfer to the US or another non-adequate country is unavoidable, a suitable transfer mechanism has to be used as per Article 46. Of course, for genuine one-off transfers, the Article 49 derogations are worth considering.
In August, the Information Commissioner’s Office (ICO) launched a public consultation on its much-anticipated draft guidance for international transfers of personal data and associated transfer tools. The aim of the consultation is to explore how to address the realities of the UK’s post-Brexit data protection regime.
Chapter 5 of the UK GDPR mirrors the international transfer arrangements of the EU GDPR. There is a general prohibition on organisations transferring personal data to a country outside the UK, unless they ensure that data subjects’ rights are protected. This means that, if there is no adequacy decision in respect of the receiving country, one of the safeguards set out in Article 46 of the UK GDPR must be built into the arrangement. These include standard contractual clauses (SCCs) and binding corporate rules. The former need to be included in a contract between the parties (data exporter and importer) and impose certain data protection obligations on both.
The Current Transfer Regime
Until recently, many UK organisations were using the EU’s approved SCCs with a few ICO suggested amendments to fit the UK context. This was despite the fact that they needed updating in the light of the binding judgment of the European Court of Justice (ECJ) in a case commonly known as “Schrems II”.
In this case the ECJ concluded that organisations that transfer personal data to the USA can no longer rely on the Privacy Shield Framework. They must now consider using the Article 49 derogations or SCCs. If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporters to make a complex assessment about the recipient country’s data protection legislation, and to put in place “additional measures” to those included in the SCCs.
In the light of the above, the new EU SCCs were published in June. The European Data Protection Board has also published its guidance on the aforementioned required assessment entitled “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data”.
The Proposed UK Transfer Regime
Following Brexit, the UK is no longer part of the EU. Consequently, the UK has to develop its own international data transfer regime, including SCCs. The ICO is consulting on new guidance as well as a series of proposed international data transfer materials including:
A Transfer Risk Assessment (TRA) – Equivalent to the European Transfer Impact Assessment, this is designed to assist organisations to conduct risk assessments of their international personal data transfers, following the requirements set out in Schrems. The TRA is not mandatory, as organisations are also free to use their own methods to assess risk but does indicate the ICO’s expectations.
An International Data Transfer Agreement – Equivalent to the European SCCs, this is a contract that organisations can use when transferring data to countries not covered by adequacy decisions.
The Addendum – This is designed to be used alongside the European Commission SCCs, to allow them to be used to safeguard a transfer under the UK GDPR, instead of the IDTA. It makes limited amendments to the EU SCCs to make them work in a UK context.
The deadline for responses to the consultation is 5.00pm on Thursday 7th October 2021. The ICO will then review the responses before issuing the finalised materials (on a date yet to be announced). Whatever the result of the consultation, organisations need to consider now which of their international data transfers will be affected and what resources will be required to implement the new regime.
December 2020 Update: This post was originally titled “Brexit, Trade Deals and GDPR: What happens next?’ and published in September 2020. It was updated on 26th December 2020.
So finally the UK has completed a trade deal with the EU which, subject to formal approval by both sides, will come into force on 1st January 2021. The full agreement has now been published and answers a question troubling data protection officers and lawyers alike.
International Transfers
On 1st January 2021, the UK was due to become a third country for the purposes of international data transfers under the EU GDPR. This meant that the lawful transfer of personal data from the EU into the UK without additional safeguards (standard contractual clauses etc) being required would only have been possible if the UK achieved adequacy status and joined a list of 12 countries. This was proving increasingly unlikely before the deadline and would have caused major headaches for international businesses.
The problem has been solved, albeit temporarily. Pages 406 and 407 of the UK-EU Trade and Cooperation Agreement contain provisions entitled “Interim provision for transmission of personal data to the United Kingdom.” This allows the current transitional arrangement to continue i.e. personal data can continue to flow from the EU (plus Norway, Liechtenstein and Iceland) to the UK for four months, extendable to six months, as long as the UK makes no major changes to its data protection laws (see UK GDPR below). This gives time for the EU Commission to consider making an adequacy decision in respect of the UK, which could cut short the above period. Will the UK achieve adequacy during these 4-6 months? Whilst there is much for the EU to consider in such a short time, I suspect that pragmatism and economic factors will swing the decision in the UK’s favour.
These and other GDPR developments will be discussed in detail in our online GDPR update workshop.
Whilst staff are still working from home, what better time to train them on GDPR and keeping data safe? Our GDPR Essentials e-learning course can help you do this in less than 45 minutes.
It would be quite easy to dismiss the importance of this case. For starters, it involves a social media Data Controller. Secondly, it was decided under the ‘old’ 1995 Data Protection Directive rather than the General Data Protection Regulation (GDPR) 2016. Thirdly, it is a ruling of the CJEU, which may be thought to have no relevance after 31st December 2020, when the Brexit Transition Period ends and the UK GDPR comes into force.
Firstly, some basic observations:
The case is not just about Facebook. It concerns international transfers of personal data between organisations in the EEA and third countries, particularly the USA. Many public authorities do this too. For example, universities may share personal data of staff and students who teach or study abroad. Some NHS Trusts, using clinical devices sourced from the US, may transfer diagnostic and monitoring data back to the States.
Although the litigation started when the 1995 Data Protection Directive was in force, the CJEU makes it clear that the questions it had to consider must be answered in the light of the GDPR rather than the Directive.
The end of the Brexit Transition Period, on 31st December 2020, does nothing to invalidate the decision of the CJEU in this case. The UK GDPR contains the same provisions about international transfers as the EU GDPR.
The International Transfer Regime
To understand the judgment, it is worth recalling how the GDPR regulates the transfers of personal data from organisations within the EEA to those outside it. GDPR Article 44 lays down the general principles. Essentially, international transfers can only take place if they comply with the provisions of Articles 45-48 of GDPR. For the purpose of this blog the important provisions are Articles 45, 46 and 49.
Under Article 45, a transfer may take place where the European Commission has decided that the third country ensures an adequate level of protection (an “adequacy decision”). In the absence of an adequacy decision, a Data Controller (or Data Processor) can only make an international transfer if it has “appropriate safeguards” in place under Article 46. These include the use of standard contractual clauses which have been adopted by the European Commission. The Commission issued the Standard Contractual Clauses (SCC) Decision in 2010, which was amended in 2016.
Where a Data Controller is transferring personal data to a third country that is not covered by an adequacy decision and appropriate safeguards are not in place, it may still be able to make the transfer if it is covered by one of the “derogations” listed in Article 49. These include (but are not limited to) where the data subject has explicitly consented to the transfer; where the transfer is necessary for important reasons of public interest; or where the transfer is necessary for the performance of a contract between the data subject and the controller. For example, a local authority organising a visit to its twin city in China may rely on the consent of the councillors and officers before transferring their personal details to the Chinese organisers.
Where none of the derogations apply, a transfer may only take place where it is not repetitive, concerns only a limited number of data subjects and is necessary for the purposes of compelling legitimate interests of the Data Controller which are not overridden by the interests or rights of the data subject. In addition to these hurdles, the Data Controller must assess all the circumstances of the transfer and put suitable data protection safeguards in place. The European Data Protection Board (EDPB) has issued guidelines about the Article 49 derogations.
The Judgment
Max Schrems, an Austrian national, is a well-known campaigner against Facebook and its data processing activities. In 2013 he complained to the Irish Data Protection Commissioner, requesting her to prohibit Facebook Ireland (a subsidiary of Facebook Inc in the USA) from transferring his personal data to the USA. That complaint resulted in the Irish High Court referring the case to the CJEU, which ruled in “Schrems 1” that the EU-US Safe Harbour arrangement was invalid.
In 2015 Mr Schrems reformulated his complaint to the Irish Commissioner, claiming that under US law Facebook Inc was required to make the personal data (that had been transferred to it from Facebook Ireland) available to certain US law enforcement bodies, and that this personal data was used in the context of various monitoring programmes in a way that violated his privacy. He also argued that US law did not provide EU citizens with legal remedies and so the transfers were not lawful under the GDPR. Facebook Ireland argued that the transfer complied with the SCC Decision (i.e. they had standard EU clauses in place) and that this was sufficient to make the transfers lawful. At the time, the EU-US Privacy Shield had not been adopted.
The Irish Commissioner agreed with Mr Schrems but she asked the High Court to refer various questions to the CJEU for a “preliminary ruling” on the validity of the SCC Decision. Although the case was primarily about the SCC Decision, the Court considered it had the right to consider the validity of the Privacy Shield Framework too.
The judgment is an extremely important one for both private and public sector organisations despite the fact that reading it is a bit like wading through treacle! Here are the key points:
1. The CJEU declared that the EU-US Privacy Shield Decision (Decision 2016/1250) was invalid in its entirety, and so the Privacy Shield Framework for transferring data to the US could no longer be used. The Court held that any communication of personal data to a third party (such as the relevant security organisations in the US) was an interference with fundamental privacy rights which was neither lawful nor proportionate. The relevant US legislation did not place any limits on the powers of US authorities to process personal data for surveillance purposes. The Court also decided that the availability of a Privacy Shield Ombudsperson was not sufficient to guarantee that data subjects in the EU had the right to an effective legal remedy required by the GDPR.

2. The Court confirmed that the use of standard contractual clauses for international transfers is still lawful. Organisations can continue to incorporate these into their contractual arrangements with third country recipients. However, standard contractual clauses are inherently contractual in nature and therefore only bind the parties to the contract. They cannot bind the public authorities, including law enforcement agencies, in third countries. The clauses may therefore require, depending on the situation in the country concerned, the adoption of further supplementary measures to ensure compliance with the level of protection required by the GDPR.

3. The Court was clear that the responsibility in paragraph 2 above lies with the Data Controller in the EU and the recipient of the personal data to satisfy themselves, on a case-by-case basis, that the legislation of the third country enables the recipient to comply with the standard data protection clauses before personal data is transferred to that third country. If they are not able to guarantee the necessary protection, they or the competent supervisory authority (in the UK, the Information Commissioner’s Office) must suspend or end the transfer of personal data.

4. If a country, like the USA, has legislation in place that obliges recipients to share personal data with public authorities, Data Controllers must assess, on a case-by-case basis, whether that mandatory requirement goes beyond what is necessary in a democratic society to safeguard national security, defence and public security.
What next?
Organisations, including those in the public sector, that transfer personal data to the US can no longer rely on the Privacy Shield Framework. They must now consider using the Article 49 derogations or the standard contractual clauses. If using the latter, whether for transfers to the US or other countries, the onus is on the Data Controller to make a complex assessment of the recipient country’s data protection legislation, and to put in place “additional measures” beyond those included in the clauses. At the time of writing it is not clear how this assessment should be made or what additional measures will be needed. The European Data Protection Board (EDPB) has announced it will be looking into this.
The ICO has posted a general statement to the effect that organisations currently using the Privacy Shield should continue to do so until further notice. It seems likely that the ICO will grant a grace period during which organisations can implement alternative transfer mechanisms.