International Transfers Breach Results in Record GDPR Fine for Meta

Personal data transfers between the EU and US are an ongoing legal and political saga. The latest development is yesterday’s largest ever GDPR fine of €1.2bn (£1bn), issued by Ireland’s Data Protection Commission (DPC) to Facebook’s owner, Meta Ireland. The DPC ruled that Meta infringed Article 46 of the EU GDPR in the way it transferred the personal data of its users from Europe to the US. 

The Law 

Chapter 5 of the EU GDPR mirrors the international transfer arrangements of the UK GDPR. There is a general prohibition on organisations transferring personal data to a country outside the EU, unless they ensure that data subjects’ rights are protected. This means that, if there is no adequacy decision in respect of the receiving country, one of the safeguards set out in Article 46 must be built into the arrangement. These include standard contractual clauses (SCCs) and binding corporate rules.
The former need to be included in a contract between the parties (data exporter and importer) and impose certain data protection obligations on both. 

The Problem with US Transfers 

In 2020, in a case commonly known as “Schrems II”, the European Court of Justice (ECJ) concluded that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal mechanism to ensure GDPR compliance. They must instead consider using the Article 49 derogations or SCCs. If using the latter, whether for transfers to the US or other countries, the ECJ placed the onus on data exporters to make a complex assessment of the recipient country’s data protection and surveillance legislation, and to put in place “additional supplementary measures” beyond those included in the SCCs. The problem with the US is that it has stringent surveillance laws which give law enforcement agencies access to personal data without adequate safeguards (according to the ECJ in Schrems). Any additional measures must therefore address this possibility and build in safeguards to protect data subjects. 

In the light of the above, the new EU SCCs were published in June 2021.
The European Data Protection Board has also published its guidance on the aforementioned required assessment, entitled “Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data”. Meta’s use of the new EU SCCs and its “additional supplementary measures” were the focus of the DPC’s attention when issuing its decision. 

The Decision 

The DPC ruled that Meta infringed Article 46(1) of GDPR when it continued to transfer personal data from the EU/EEA to the US following the ECJ’s ruling in Schrems II. It found that the measures used by Meta did not address the risks to the fundamental rights and freedoms of data subjects that were identified in Schrems; namely the risk of access to the data by US law enforcement.  

The DPC ruled that Meta should: 

  1. Suspend any future transfer of personal data to the US within five months of the date of the DPC’s decision; 
  2. Pay an administrative fine of €1.2 billion; and, 
  3. Bring its processing operations in line with the requirements of GDPR, within five months of the date of the DPC’s decision, by ceasing the unlawful processing, including storage, in the US of personal data of EEA users transferred in violation of GDPR. 

Meta has said that it will appeal the decision and seek a stay of the ruling before the Irish courts. Its President of Global Affairs, Sir Nick Clegg, said:  

“We are therefore disappointed to have been singled out when using the same legal mechanism as thousands of other companies looking to provide services in Europe. 

“This decision is flawed, unjustified and sets a dangerous precedent for the countless other companies transferring data between the EU and US.” 

The Future of US Transfers 

The Information Commissioner’s Office told the BBC that the decision “does not apply in the UK” but said it had “noted the decision and will review the details in due course”. The wider legal ramifications for data transfers from the UK to the US can’t be ignored. 

Personal data transfers are also a live issue for most UK Data Controllers, including public authorities. Whether using an online meeting app, a cloud storage solution or a simple text messaging service, organisations often transfer personal data to the US. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022, but it still requires a Transfer Risk Assessment as well as supplementary measures where privacy risks are identified.  

On 25th March 2022, the European Commission and the United States announced that they had agreed in principle on a new Trans-Atlantic Data Privacy Framework. The final agreement is expected to be in place in summer 2023 and will replace the Privacy Shield Framework. The UK Government is expected to strike a similar deal once the EU/US one is finalised. However, both are likely to be challenged in the courts. 

The Meta fine is one of this year’s major GDPR developments, nicely timed within a few days of the 5th anniversary of GDPR. All organisations, whether in the UK or EU, need to carefully consider their data transfer mechanisms and ensure that they comply with Chapter 5 of GDPR in the light of the DPC’s ruling. A “wait and see” approach is no longer an option.  

The Meta fine will be discussed in detail on our forthcoming International Transfers workshop. For those who want a one-hour summary of the UK international transfer regime, we recommend our webinar. 

The California Consumer Privacy Act (CCPA) and the CPRA: What’s Changed? 

Californian privacy law is about to change once again thanks to the California Privacy Rights Act (CPRA) which will become fully enforceable on 1st July 2023. 

The current law is set out in the California Consumer Privacy Act (CCPA) which has been in force since 1st July 2020. CCPA regulates the processing of California consumers’ personal data, regardless of where a company is located. It provides broader rights to consumers, and stricter compliance requirements for businesses, than any other US state or federal privacy law. 

Like the EU General Data Protection Regulation (GDPR), CCPA is about giving individuals control over how their personal data is used by organisations. It requires transparency about how such data is collected, used and shared. It gives Californian consumers various rights including the right to: 

  • Know and access the personal data being collected about them 
  • Know whether their personal data is being sold, and to whom 
  • Opt out of having their personal data sold 
  • Have their personal data deleted upon request 
  • Avoid discrimination for exercising their rights 

CPRA is not a new law; it amends the CCPA to give Californians even more control over their personal data.  The key provisions include: 

  • Changing the CCPA’s definition of Personal Information 
  • Creating a new data category called “Sensitive Personal Information” similar to Special Category Data under GDPR 
  • Changing the scope of the CCPA  
  • Adding new rights e.g. to correct inaccurate information and to limit use and disclosure of sensitive personal information 
  • Changing the regulatory area of focus towards behavioural advertising  
  • Adding additional requirements for businesses (closely modelled on the GDPR Data Protection Principles), namely data minimisation, purpose limitation and storage limitation 
  • Expanding the CCPA’s current consent requirements to include where, amongst others, a business is selling or sharing personal information after a user has already opted out and when selling or sharing the personal information of minors  

Whilst becoming fully enforceable on 1st July 2023, CPRA has a 12-month “lookback period”, applying its new rights to personal data collected from 1st January 2022. 

Until recently, CCPA did not have a dedicated regulator like the Information Commissioner in the UK. It was primarily enforced by the Office of the Attorney General through the courts, although there is a private right of action for a security breach. The courts can impose fines for breaches of CCPA depending on the nature of the breach: 

  • $2,500 per unintentional breach and $7,500 per intentional breach  
  • $100-$750 per incident per consumer, or actual damages if higher, for damage caused by a security breach.  
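To put these per-consumer figures in context, a quick back-of-the-envelope calculation (a hypothetical illustration only, not legal advice) shows how statutory damages scale with the size of a breach:

```python
# Hypothetical illustration of how CCPA statutory damages for a security
# breach scale with the number of affected consumers, using the
# $100-$750 per-consumer range noted above. Not legal advice.

def ccpa_breach_exposure(consumers: int) -> tuple[int, int]:
    """Return the (low, high) statutory damages range in dollars."""
    per_consumer_min, per_consumer_max = 100, 750
    return consumers * per_consumer_min, consumers * per_consumer_max

low, high = ccpa_breach_exposure(10_000)
print(f"${low:,} to ${high:,}")  # $1,000,000 to $7,500,000
```

Even a mid-sized incident affecting 10,000 consumers can therefore reach seven-figure exposure before any fines are added.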

CPRA establishes the California Privacy Protection Agency (CPPA) which has the authority to investigate potential breaches and violations, and to draft enforcement regulations. It has produced new CPRA Regulations providing rules on service provider contracts, dark patterns, and the recognition of “global opt-out” browser signals. 

While the CCPA fines and damages may appear relatively low, it is important to note that they are per breach. A privacy incident can affect thousands or tens of thousands of consumers, in which case it could cost a company hundreds of thousands or even millions of dollars. In the first three years of the CCPA’s existence, 320 lawsuits have been filed in 28 states according to a report by Akin, a US law firm. It found that: 

  • More than 80% of CCPA lawsuits in 2022 corresponded to a breach notice filed with the California Attorney General’s Office, and businesses that report a data breach to the AG’s office have about a 15% chance of facing subsequent consumer litigation. 
  • Breaches affecting at least 100,000 people accounted for 56% of lawsuits in 2022 stemming from data breaches. 
  • Financial services companies accounted for 34% of cases in 2022, by far the highest rate of any industry. Medical services and software/technology each comprised 13%.

All US-based businesses, as well as those elsewhere processing Californian residents’ personal information, need to consider how CPRA will impact their data management and start the implementation process immediately. People are more concerned than ever about what is happening to their personal data, following recent media headlines about the exploitation of personal data by AI and social media companies. 

Ibrahim Hasan will be speaking about the CCPA and CPRA at the MER Information Governance Conference in Chicago in May.  

Interested in US privacy law? Check out our US privacy programme 

Saudi Arabian Data Protection Law Update 

In September 2021, Saudi Arabia announced its first ever data protection law. The Personal Data Protection Law (PDPL) was implemented by Royal Decree M/19 of 9/2/1443H approving Resolution No. 98 dated 7/2/1443H (14th September 2021). PDPL will regulate the collection, handling, disclosure and use of personal data and includes governance and transparency obligations. It will initially be enforced by the Saudi Arabian Authority for Data and Artificial Intelligence (SDAIA). 

PDPL was originally going to come fully into force on 23rd March 2022. However, in November 2022, SDAIA published proposed amendments for public consultation. On 21st March 2023, some of these amendments were passed by the Saudi Council of Ministers. PDPL will now officially come into force on 14th September 2023, and organisations will have until 13th September 2024 to comply. Much of the detail of the new law will be set out in the Executive Regulations, which are still awaited, although a draft version was issued last year. 

The amendments to PDPL introduce several concepts that will align the new law more closely to the EU General Data Protection Regulation (GDPR) and the UK GDPR. These include: 

  • New Ground for Processing: Like the GDPR, Data Controllers may now rely on “legitimate interests” as a lawful basis to process personal data; this does not apply to sensitive personal data, or processing that contravenes the rights granted under PDPL and its executive regulations.  
     
  • Easier International Transfers: Like other data protection regimes, PDPL imposes limitations on the international transfer of personal data outside the KSA. The strict prohibition on transfers outside Saudi Arabia has now been amended, and transfers no longer require approval from SDAIA. Data Controllers will need a specific purpose to transfer data outside the Kingdom, and transfers appear to be limited to territories that SDAIA determines as having an appropriate level of protection for personal data; this will be further clarified once SDAIA issues evaluation criteria for the purpose. The pending executive regulations should set out exemptions from this condition.  
     
  • Removal of Controller Registration Requirements: The original law required Data Controllers to register on an electronic portal that would form a national record. This provision has now been removed. However, SDAIA has the mandate to license auditors and accreditation entities and create a national register if it determines that it would be an appropriate tool and mechanism for monitoring the compliance of controllers. 
  • Data Breach Notification Relaxed: Notifications of personal data breaches to SDAIA are no longer required “immediately.” However, controllers must now notify data subjects when a breach threatens personal data or contravenes the data subject’s rights or interests. The pending regulations are expected to provide additional specificity, such as particular dates for notifying data breaches and threshold requirements.  
     
  • Criminal Offences Reduced: The penalties for breaching PDPL will be a warning or a fine of up to SAR 5,000,000 (USD 1,333,000) that may be doubled for repeat offences. Criminal sanctions for violating the PDPL’s data transfer restrictions have been removed. There now remains only one criminal offence in relation to the disclosure or publication of sensitive personal data in violation of the law.  

Action Plan for Compliance 

Businesses established in Saudi Arabia, as well as those processing Saudi citizens’ personal data anywhere in the world, have sixteen months to prepare for PDPL. Considering that those covered by GDPR had four years, this is not a long time. Now is the time to put systems and processes in place to ensure compliance. Failure to do so will not just lead to enforcement action but also reputational damage.  

The following should be part of an action plan for compliance: 

  1. Raising awareness about PDPL at all levels. Our GDPR elearning course can be tailored for frontline staff. 
  2. Carrying out a data audit and reviewing how records management and information risk is addressed. 
  3. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification. 
  4. Revising privacy policies in the light of the more prescriptive transparency requirements.  
  5. Writing policies and procedures to deal with new and revised Data Subject rights such as Data Portability and Subject Access. 
  6. Appointing and training a Data Protection Officer.  

The new KSA data protection law is an important development in Middle East privacy law alongside the passing of the new UAE Federal DP law.
These laws, being closely aligned with the EU General Data Protection Regulation (GDPR) and the UK GDPR, open up exciting job opportunities for UK and EU Data Protection professionals. A quick scan of jobs sites shows a growing number of prospects. 

Act Now in the Middle East 

Act Now Training can help your business prepare for PDPL. We have delivered training extensively in the Middle East to a wide range of delegates including representatives of the telecommunications, legal and technology sectors. Check out our UAE privacy programme. We can also deliver customised in-house training, both remotely and face to face.
Please get in touch to discuss your training or consultancy needs.  

Our new Intermediate Certificate in GDPR Practice includes a module on worldwide data protection laws. 

Exploring the Legal and Regulatory Challenges of AI and ChatGPT 

In our recent blog post, entitled “GDPR and AI: The Rise of the Machines”, we said that 2023 is going to be the year of Artificial Intelligence (AI). Events so far seem to suggest that advances in the technology, as well as legal and regulatory challenges, are on the horizon.   

Generative AI, particularly large language models like ChatGPT, has captured the world’s imagination. Having only launched in November, ChatGPT registered 100 million monthly users in January alone, setting the record for the fastest-growing platform; TikTok took nine months to hit the same usage level. In March 2023, it recorded 1.6 billion user visits, a mind-boggling figure that shows the scale of this technological advance. There have already been some remarkable medical uses of generative AI, including the ability to match drugs to patients, numerous reports of major cancer research breakthroughs, and robots performing major surgery. 
 
However, it is important to take a step back and reflect on the risks of a technology that has made its own CEO “a bit scared” and which has caused the “Godfather of AI” to quit his job at Google. The regulatory and legal backlash against AI has already started. Recently, Italy became the first Western country to block ChatGPT. The Italian DPA highlighted privacy concerns relating to the model. Other European regulators are reported to be looking into the issue too. In April the European Data Protection Board launched a dedicated task force on ChatGPT. It said the goal is to “foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities.” Elsewhere, Canada has opened an investigation into OpenAI following a complaint alleging that personal information was collected, used and disclosed without consent. 

The UK Information Commissioner’s Office (ICO) has expressed its own concerns. Stephen Almond, Director of Technology and Innovation at the ICO, said in a blog post: 

“Data protection law still applies when the personal information that you’re processing comes from publicly accessible sources…We will act where organisations are not following the law and considering the impact on individuals.”  

Wider Concerns 

ChatGPT suffered its first major personal data breach in March.
According to a blog post by OpenAI, the breach exposed payment-related and other personal information of 1.2% of ChatGPT Plus subscribers. But the concerns around AI and ChatGPT don’t stop at privacy law.   

An Australian mayor is considering a defamation suit against OpenAI after ChatGPT told users that he was jailed for bribery; in reality, he was the whistleblower in the bribery case. Similarly, it falsely accused a US law professor of sexual assault. The Guardian reported recently that ChatGPT is making up fake Guardian articles. There are copyright concerns too: a number of songs using AI to clone the voices of artists, including Drake and The Weeknd, have since been removed from streaming services after criticism from music publishers. There have also been full AI-generated Joe Rogan podcast episodes featuring the OpenAI CEO and Donald Trump. These podcasts are worth sampling; it is frankly scary how realistic they are. 

AI also poses a significant threat to jobs. A report by investment bank Goldman Sachs says it could replace the equivalent of 300 million full-time jobs. Our director, Ibrahim Hasan, recently gave his thoughts on this topic to BBC News Arabic. (You can watch him here. If you just want to hear Ibrahim “speak in Arabic” skip the video to 2min 48 secs!) 
 

EU Regulation 

With increasing concern about the future risks AI could pose to people’s privacy, their human rights or their safety, many experts and policy makers believe AI needs to be regulated. The European Union’s proposed legislation, the Artificial Intelligence (AI) Act, focuses primarily on strengthening rules around data quality, transparency, human oversight and accountability. It also aims to address ethical questions and implementation challenges in various sectors ranging from healthcare and education to finance and energy. 

The Act also envisages grading AI products according to how potentially harmful they might be, and staggering regulation accordingly. For example, an email spam filter would be more lightly regulated than a system designed to diagnose a medical condition, and some AI uses, such as social scoring by governments, would be prohibited altogether. 
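The tiered approach can be pictured as a simple classification. The tier names and example mappings below are an illustrative sketch of the idea, not the Act’s actual taxonomy:

```python
# Illustrative sketch of the risk-based grading envisaged by the proposed
# EU AI Act. Tier names and example systems are simplified assumptions
# for illustration, not the legislation's actual categories.
from enum import Enum

class RiskTier(Enum):
    MINIMAL = "minimal"          # light-touch regulation
    HIGH = "high"                # strict requirements before deployment
    PROHIBITED = "prohibited"    # banned outright

# Hypothetical mapping of example systems to tiers
EXAMPLE_SYSTEMS = {
    "email spam filter": RiskTier.MINIMAL,
    "medical diagnosis system": RiskTier.HIGH,
    "government social scoring": RiskTier.PROHIBITED,
}

for system, tier in EXAMPLE_SYSTEMS.items():
    print(f"{system}: {tier.value}")
```

The regulatory burden then scales with the tier, rather than applying uniform rules to every AI system.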

UK White Paper 

On 29th March 2023, the UK government published a white paper entitled “A pro-innovation approach to AI regulation.” The paper sets out a new “flexible” approach to regulating AI which is intended to build public trust and make it easier for businesses to grow and create jobs. Unlike the EU there will be no new legislation to regulate AI. In its press release, the UK government says: 

“The government will avoid heavy-handed legislation which could stifle innovation and take an adaptable approach to regulating AI. Instead of giving responsibility for AI governance to a new single regulator, the government will empower existing regulators – such as the Health and Safety Executive, Equality and Human Rights Commission and Competition and Markets Authority – to come up with tailored, context-specific approaches that suit the way AI is actually being used in their sectors.” 

The white paper outlines the following five principles that regulators should apply to facilitate the safe and innovative use of AI in their industries: 

  • Safety, Security and Robustness: applications of AI should function in a secure, safe and robust way where risks are carefully managed; 

  • Transparency and Explainability: organizations developing and deploying AI should be able to communicate when and how it is used and explain a system’s decision-making process in an appropriate level of detail that matches the risks posed by the use of the AI; 

  • Fairness: AI should be used in a way which complies with the UK’s existing laws (e.g., the UK General Data Protection Regulation), and must not discriminate against individuals or create unfair commercial outcomes; 

  • Accountability and Governance: measures are needed to ensure there is appropriate oversight of the way AI is being used and clear accountability for the outcomes; and 

  • Contestability and Redress: people need to have clear routes to dispute harmful outcomes or decisions generated by AI. 

Over the next 12 months, regulators will be tasked with issuing practical guidance to organisations, as well as other tools and resources such as risk assessment templates, that set out how the above five principles should be implemented in their sectors. The government has said this could be accompanied by legislation, when parliamentary time allows, to ensure consistency among the regulators. 

Michelle Donelan MP, Secretary of State for Science, Innovation and Technology, considers that this light-touch, principles-based approach “will enable . . . [the UK] to adapt as needed while providing industry with the clarity needed to innovate.” However, this approach does make the UK an outlier in comparison to global trends. Many other countries are developing or passing special laws to address alleged AI dangers, such as algorithmic rules imposed in China or the United States. Consumer groups and privacy advocates will also be concerned about the risks to society in the absence of detailed and unified statutory AI regulation.  

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents focussing on GDPR as well as other information governance and records management issues.  

The New Data Protection Bill: What it means for DP and the public sector

In March, the UK Department for Science, Innovation and Technology (DSIT) published the Data Protection and Digital Information (No.2) Bill. The Bill is now going through Parliament. If enacted, it will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

Our director, Ibrahim Hasan, recently took part in a webinar organised by eCase. In this 45 minute session Ibrahim, alongside Data Protection experts Jon Baines of Mishcon de Reya and Lynn Wyeth of Leicester City Council, discusses the new Bill including:

  • The key changes
  • The differences with the current regime
  • What the changes mean for the public sector
  • The challenges

To access the webinar recording click here (note: eCase email registration required).

The new Bill will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates

The State of US Privacy Law in 2023 

The United States is making substantial progress on privacy law. Six states have passed comprehensive data protection bills (with at least two more likely to follow), and five of these laws take effect during 2023.  

One of the most significant changes to US privacy law comes in the form of the California Privacy Rights Act (CPRA) which is fully enforceable from 1st July 2023. The CPRA makes several important amendments to the California Consumer Privacy Act (CCPA) which has been in force since 1st July 2020. 

Among other changes, the CPRA introduces a concept of “sensitive personal information”, which includes data about a consumer’s government ID numbers, account credentials, racial origin, religious beliefs, union membership, genetics, biometrics, health status, and more. It also provides several new rights for consumers, such as the right to correct inaccurate personal information and the right to limit the use and disclosure of sensitive personal information. 

CCPA’s “right to opt out” now explicitly allows consumers to refuse
“cross-contextual advertising”, which involves combining personal information from different websites or apps to target people with ads.
Most significantly, CPRA gives California its own privacy regulator, the California Privacy Protection Agency (CPPA). 

Beyond California 

Following in the footsteps of California, five US states have now passed broadly applicable privacy legislation: Virginia, Colorado, Connecticut, Utah and Iowa. 

These laws create new challenges for businesses operating in the US.
They introduce data protection concepts more familiar to organisations complying with the EU General Data Protection Regulation (GDPR).
More state privacy laws will likely take effect in coming years, with similar bills in Tennessee and Indiana awaiting governors’ signatures at the time of writing. Other bills, such as Washington’s as-yet unsigned My Health My Data Act, could also have a broad privacy impact. 

The new state laws generally apply across all sectors but only to businesses processing the personal data of at least 100,000 consumers, plus smaller companies that derive a given proportion of their revenue from selling personal data. Utah’s law also excludes any business generating under $25 million in annual revenues. But unlike the GDPR, the state laws contain carve-outs for data processing covered by sectoral laws, such as the Health Insurance Portability and Accountability Act (HIPAA) and the Gramm-Leach-Bliley Act (GLBA). 
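As a rough sketch, the applicability thresholds described above might be expressed as follows. The 50% revenue-share figure and the function name are illustrative assumptions, since the exact proportion (and the carve-outs) vary from state to state:

```python
# Rough sketch of the state-law applicability thresholds described above.
# The 50% revenue-share figure is an illustrative assumption; actual
# thresholds and sectoral carve-outs vary from state to state.

def state_privacy_law_applies(consumers_processed: int,
                              share_of_revenue_from_data_sales: float,
                              annual_revenue_usd: int,
                              utah: bool = False) -> bool:
    # Utah's law excludes businesses under $25m in annual revenue
    if utah and annual_revenue_usd < 25_000_000:
        return False
    # Large processors are covered, as are smaller companies earning a
    # significant share of revenue from selling personal data
    return (consumers_processed >= 100_000
            or share_of_revenue_from_data_sales >= 0.5)

print(state_privacy_law_applies(150_000, 0.0, 10_000_000))             # True
print(state_privacy_law_applies(150_000, 0.0, 10_000_000, utah=True))  # False
```

The same business can therefore fall inside one state's law and outside another's, which is exactly the patchwork problem a federal law would address.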

Consumer Rights 

Each of the new US state privacy laws provides new consumer rights, including:  

  • The right of access 
  • The right to delete 
  • The right to correct (except Utah and Iowa) 
  • The right to data portability 
  • The right to opt out of targeted advertising, the sale of personal data (except Iowa) and profiling that produces legal or similarly significant effects (except Iowa and Utah) 

Each law also imposes new rights around the processing of “sensitive data”, with Virginia, Colorado and Connecticut’s laws mandating
GDPR-style consent; and Iowa and Utah requiring controllers to offer consumers an opt-out prior to collection. 

The consumer rights provided under these US privacy laws are somewhat narrower than the GDPR’s “data subject rights”. Consumers unhappy with a controller’s response must exhaust an internal appeals process before complaining to the state’s Attorney General. Controllers must respond to consumers’ requests within 45 days (compared to one month in the EU) but, similarly to the GDPR, businesses must not charge a fee unless a request is “manifestly unfounded, excessive, or repetitive”. 

Controllers’ Obligations 

Drawing language from the GDPR, each of these new laws requires controllers to implement binding agreements with their processors.
Much like California’s “service provider contracts”, controllers under these other state laws must contractually require processors to submit to audits, impose similar contracts on any sub-processors, and not share data received from the controller (with limited exceptions). 

Virginia and Connecticut’s new privacy laws require controllers to conduct “data protection assessments” in certain circumstances, including before engaging in targeted advertising, selling personal data, processing sensitive data, and other risky activities. Privacy bills currently under consideration in Tennessee and Indiana contain a similar requirement. These provisions were clearly inspired by the GDPR’s Data Protection Impact Assessments and require businesses to balance the benefits that could flow from a processing activity against the risks to consumers and the public, considering any relevant safeguards. 

FTC Enforcement 

The new US state privacy laws will have a major impact on companies operating in the US. But perhaps equally significant is recent enforcement action by the Federal Trade Commission (FTC) under existing laws. 

In February, the FTC enforced the Health Breach Notification Rule against the drug discount provider GoodRx, issuing a $1.5 million civil penalty and permanently banning the company from sharing health information for advertising purposes. In March, the FTC also settled for $7.8 million with remote therapy provider BetterHelp under the FTC Act—a consumer protection law that BetterHelp allegedly violated by promising not to share personal information and then doing so via pixels and other trackers.
The FTC’s broad interpretation of “personal information” and “health information” in these cases—and its view that the unauthorised sharing of data with advertisers can be a “data breach”—suggests a trend of more robust privacy enforcement in the US. 

Towards a US Federal Privacy Law 

A comprehensive US federal privacy law could provide some clarity in this increasingly complicated patchwork of state and sectoral privacy laws.
For the past two years, President Biden has advocated new privacy measures in his State of the Union address—focusing primarily on children’s privacy, but with a broader call this year to limit how tech companies collect personal information about everyone in the US. 

A federal bill, the American Data Privacy and Protection Act (ADPPA), was introduced in the US House of Representatives last June. The ADPPA would apply to businesses and non-profits across all sectors, regardless of size.  

Among other provisions, the ADPPA would: 

  • Only allow the “reasonably necessary and proportionate” collection, use, and transfer of personal information. 
  • Require organisations to disclose how they collect, use, and share personal information. 
  • Provide consumers with rights to access, delete, and correct their personal information. 

The ADPPA would arguably impose much stricter requirements on businesses than the current tranche of state privacy laws. The bill failed to pass in last year’s legislative session. Opposition centred around the law’s potential to override state privacy laws, and the “private right of action”, which would allow individuals to sue non-compliant businesses. 

Biden’s call for improved privacy protections suggests that some version of the ADPPA could reappear in the US legislature this session. However, it is unclear whether the now Republican-controlled House will support a bill that significantly restricts business activity. 

Unless a federal law passes (and perhaps even if it does), businesses will continue to grapple with the various local and sectoral privacy laws being passed across many US states. Either way, a long era of lax US privacy regulation seems to be coming to an end. 


Ibrahim Hasan will be speaking about the CCPA and CPRA at the MER Information Governance Conference in Chicago in May.  

Interested in US privacy law? Check out our US privacy programme. 

Online Recruitment Firm Receives £130,000 PECR Fine

On 10th April 2023, the Information Commissioner’s Office (ICO) fined Join The Triboo Limited £130,000 for sending 107 million spam emails targeting jobseekers. The online recruitment firm was found to have breached the Privacy and Electronic Communications Regulations (PECR) by sending unsolicited emails to individuals without their consent.

PECR is a set of regulations which, amongst other things, governs the use of electronic communications (e.g. email, text message and automated calling systems) for direct marketing purposes. In most cases, the regulations require that individuals give their consent before receiving marketing messages, including those promoting job vacancies. When it comes to email, businesses cannot send unsolicited marketing messages to individuals unless they have obtained their consent to do so.

The UK General Data Protection Regulation (GDPR), which also applies to electronic communications involving personal data, defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.”

This means that businesses must provide individuals with clear and concise information about what data they are collecting, how they will use it, and who they will share it with. Individuals must then be given the option to give their consent, and this consent must be freely given and specific to the intended processing activity. Businesses must also provide individuals with the option to withdraw their consent at any time.

Join the Triboo Limited was found to have breached PECR by sending unsolicited emails to individuals without their consent. The emails were sent in bulk to individuals who had not signed up to receive job alerts from the firm, and the content of the emails did not provide individuals with clear and concise information about the firm’s processing activities.

Andy Curry, ICO Head of Investigations, said:

“It’s an issue many of us face – opening up our email inboxes and it being filled with emails we did not ask for or consent to. This shouldn’t just be considered a fact of life – it is against the law.

We provide advice and support to legitimate companies that want to comply with the law. Last year, we released updated direct marketing guidance to help those very businesses.

That is, however, not what was happening in this case. This company did not properly seek permission from the people it chose to bombard with spam emails. The company used job seeking websites as a key component in its unlawful campaign.

In taking this action, we say to the public that we will continue to be on your side and protect you, and we say to any other organisation operating outside of the law that we will pursue every case like this brought to us to the fullest extent.”

The ICO’s decision to fine this online recruitment firm serves as a reminder of the importance of complying with data protection laws. Compliance enables businesses to build trust with their customers and helps create a safer, more secure online environment for everyone.

Our forthcoming PECR and Marketing workshop will consider this and other developments in detail. 

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates. Limited time. Terms and Conditions apply. Book Now!

Data Protection Specialist Joins the Act Now Team 

Act Now Training is pleased to announce that Robert Bateman, a respected data protection specialist, has joined its team of associates.  

Robert is a trainer and author specialising in privacy, data protection, security, and AI. He is a respected voice on privacy and has been writing, researching, and leading conversations in the field since 2017. 

Throughout his career, Robert has interviewed some of the leading figures in privacy, including Max Schrems and Johnny Ryan. He has worked with dozens of high-profile privacy professionals and campaigners and has written about almost every aspect of data protection including GDPR Enforcement, AI and worldwide DP laws. 

Robert earned a post-graduate law degree in 2019 and a CIPP/E from the International Association of Privacy Professionals in 2021. His 2019 research on the compatibility of the UK’s “immigration exemption” in the Data Protection Act 2018 with the European Convention on Human Rights won the DMH Stallard Prize for Best Project. 

Ibrahim Hasan said: 

“I am very pleased that Robert has joined our team. He has a deep understanding of data protection law and an ability to explain it in clear and simple terms. In addition to delivering our course programme we will be working with Robert to develop advanced workshops to help practitioners understand the latest trends and developments in data protection law in the UK and internationally.”  

Robert said: 

“I’m excited to start working with Act Now to help deliver their excellent courses and training programmes. I’ve spent years watching, analysing and explaining almost every development in data protection and privacy, and I look forward to sharing my knowledge with delegates.” 


Act Now Launches Updated GDPR Practitioner Certificate  

Act Now Training is pleased to announce the launch of its updated GDPR Practitioner Certificate. This course has been running successfully for the past five years with excellent delegate reviews: 

“The course was very useful as an IG Officer. The trainer was knowledgeable and explained some complex aspects of the legislation using interesting examples and real life scenarios. The course materials and handbook are invaluable and I know I will reuse them in conjunction with my usual resources.” NC, Lincolnshire County Council  

“I would highly recommend this online course which was well structured and interactive. The course tutor was engaging and made a complex subject accessible. There was a good balance between understanding the legal framework and practical application. I learnt a great deal which will help me in my DPO role.” RS, London Councils  

Key features of the new course include an updated course curriculum, new exercises and more emphasis on helping delegates develop key DPO skills.  

Our Motivation  

This revised course is part of our ongoing commitment to encourage and assist new talent in the IG profession. Through our involvement in NADPO and the IRMS over the past 20 years, Act Now has been actively encouraging new entrants to the IG profession and providing quality training to assist in their learning and development. When the DP and IG Apprenticeship was launched last year, we became one of the first training companies to partner with a leading apprenticeship provider to deliver specialist IG training and materials to apprentices. This has led to our partner, Damar, recruiting over 100 apprentices and helping them lay the foundations for a successful career in IG.  

Course Content 

The course curriculum has been updated in the light of Act Now’s Skills and Competency Framework for DPOs. For the past three years we have been working on this framework, alongside industry experts and education professionals, by thoroughly analysing all the core skills and competencies required for the DPO role and how they map against our wider GDPR course curriculum.  

Completing the course will enable delegates to gain a thorough understanding of the UK GDPR and develop the skills required to do their job with greater ease and confidence. In addition to the main course topics such as principles, rights and enforcement, we have introduced new topics such as the ICO Accountability Framework. We also take time to consider the latest ICO enforcement action and the changes to the UK data protection regime proposed by the recently announced Data Protection and Digital Information Bill.

The course will also help delegates interpret the data protection principles in a practical context, draft privacy notices, undertake DPIAs and report data breaches. 

The course teaching style is based on four practical and engaging workshops covering theory alongside hands-on application using case studies that equip delegates with knowledge and skills that can be used immediately. Delegates will also have personal tutor support throughout the course and access to a comprehensive revised online resource lab. 

The DPO Learning Pathway 

The updated UK GDPR Practitioner Certificate is part of our learning pathway for Data Protection Officers. Once they have completed it, delegates can move on to the Intermediate Certificate in GDPR Practice, where the emphasis is on skills as well as advanced knowledge, with delegates covering more challenging topics to gain a deeper awareness of the fundamental data protection principles.  

Our premier certification is the Advanced Certificate in GDPR Practice, tailored for seasoned Data Protection Officers seeking to refine and expand their expertise. The course comprises a rigorous set of masterclasses that engage delegates in dissecting and interpreting intricate GDPR scenarios through compelling case studies. This immersive experience empowers participants with the skills and confidence needed to tackle even the most challenging Data Protection and Privacy scenarios they may encounter.

If you would like a chat to discuss your suitability for any of our certificate courses, please get in touch.  


AI and ChatGPT: Ibrahim Hasan on BBC News Arabic

2023 has so far been all about the rise of artificial intelligence (AI). Alongside the privacy issues, there have been concerns over AI’s potential risks, including its threat to jobs and the spreading of misinformation and bias. According to a report by investment bank Goldman Sachs, AI could replace the equivalent of 300 million full-time jobs. It could replace a quarter of work tasks in the US and Europe, but may also mean new jobs and a productivity boom. 

Our director, Ibrahim Hasan, recently gave his thoughts on AI, machine learning and ChatGPT to BBC News Arabic. You can watch here. If you just want to hear Ibrahim “speak in Arabic”, skip to 2 minutes 48 seconds. 

Friends in the UAE may be interested in our UAE privacy programme, which includes courses on UAE and Middle East data protection laws.

We have run many in-house courses, gap analysis and audit services for clients in the Middle East including the UAE, Saudi Arabia and Qatar. If you are interested in any of these services, please contact us here.

Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents, focusing on GDPR as well as other information governance and records management issues. 

