Google Analytics and GDPR Compliance: What next?

Google Analytics is a popular tool used by website owners across the world to observe and measure user engagement. In February 2022, the French Data Protection Regulator, the CNIL, ruled that use of Google Analytics was a breach of GDPR. This followed a similar decision by the Austrian Data Protection Authority in January.

Is a website owner processing personal data by making use of Google Analytics? On the face of it, the answer should be no. Google Analytics only collects information about website visitors, such as which pages they access and where they link from. The website owners do not see any personal data about visitors. However, Google does assign a unique user identification number to each visitor which it can use to potentially identify visitors by combining it with other internal resources (just think of the vast amount of information which is collected by Google’s other services). 

The fact that the above-mentioned French and Austrian decisions ruled that analytics information is personal data under GDPR does not in itself make the use of Google Analytics unlawful. Of course, website owners need to find a GDPR Article 6 condition for processing (lawfulness), but this is not an insurmountable hurdle. Legitimate interests is a possibility, although the UK Information Commissioner’s Office (ICO) holds the view that the use of analytics services is not “strictly necessary” in terms of the PECR cookie rules, and its own cookie banner adopts the express consent approach.

A bigger obstacle to the use of Google Analytics in Europe is the fact that website users’ personal data is being passed back to Google’s US servers. In GDPR terms that is a “restricted transfer” (aka an international transfer). Following the judgment of the European Court of Justice (ECJ) in “Schrems II”, such transfers have been problematic to say the least. In Schrems, the ECJ concluded that organisations that transfer personal data to the USA can no longer rely on the Privacy Shield Framework. They must consider using the Article 49 derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on data exporters to make a complex assessment of the recipient country’s data protection legislation, and to put in place “additional measures” beyond those included in the SCCs. The problem with the US is that it has stringent surveillance laws which give law enforcement agencies access to personal data without adequate safeguards (according to the ECJ in Schrems).

In France, the CNIL has ordered the website which was the subject of its ruling about Google Analytics to comply with the GDPR and “if necessary, to stop using this service under the current conditions”, giving it a deadline of one month to comply. The press release, announcing the decision, stated:

“Although Google has adopted additional measures to regulate data transfers in the context of the Google Analytics functionality, these are not sufficient to exclude the accessibility of this data for U.S. intelligence services.”

“There is therefore a risk for French website users who use this service and whose data is exported.”

The CNIL decision does leave the door open to continued use of Google Analytics, but only with substantial changes that would ensure only “anonymous statistical data” is transferred. It also suggests the use of alternative tools which do not involve a transfer outside the EU. Of course, the problem will be solved if there is a new agreement between the EU and US to replace the Privacy Shield. Negotiations are ongoing.

In the meantime, what can UK-based website owners do? Should they stop using Google Analytics? Some may decide to adopt a “wait and see” approach. The ICO has not really shown any appetite to enforce the Schrems decision, concentrating instead on alternative transfer tools, including the International Data Transfer Agreement which comes into force tomorrow. Perhaps a better way is to assess which services, not just analytics services, involve transfers to the US and switch to EU-based services instead.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop on Wednesday. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in April.

The Facebook Data Breach Fine Explained


On 24th October the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine available under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal, the fine might seem small beer for an organisation that is estimated to be worth over 5 billion US Dollars. Had the same facts played out after 25th May 2018, the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.

The Facts

In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company, Global Science Research (GSR), to operate the App in conjunction with FB from November 2013 to May 2015. The App was designed to, and was able to, obtain a significant amount of personal information from any FB user who used it, including:

  • Their public FB profile, date of birth and current city
  • Photographs they were tagged in
  • Pages they liked
  • Posts on their timeline and their news feed posts
  • Friends list
  • Facebook messages (there was evidence to suggest the App also accessed the content of the messages)

The App was also designed to, and was able to, obtain extensive personal data from the FB friends of the App’s users and anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, and nor did they give their consent.

The App was able to use the information that it collected about users, their friends and people who had messaged them in order to generate personality profiles. This information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controls the now infamous Cambridge Analytica).

In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and their friends that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.

Breach of the DPA

The Commissioner’s findings about the breach make for sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply, or ensure compliance, with their own FB Platform Policy, and were not aware of this until exposed by the Guardian newspaper in December 2015.

The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:

  1. Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of the App users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for processing.
  2. Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised: it was inconsistent with the basis on which FB allowed Dr Kogan access to personal data for which they were the data controller, and it breached the Platform Policy and the Undertaking. The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised and unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles, and that they failed to take reasonable steps to prevent such a contravention.

Breach of FB Platform Policy

Although the FB companies operated an FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they didn’t check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact, they failed to implement a system to carry out such a review. The use of the App was also found to breach the policy in a number of respects, specifically:

  • Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead, Dr Kogan and GSR were able to use it for their own purposes.
  • Personal data collected by the App should not have been sold or transferred to third parties. Dr Kogan and GSR had transferred the data to three companies.
  • In breach of the policy, the App sought permission from users to obtain personal data that it did not need.

The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. However, perhaps one of the worst indictments is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access rights to the Facebook Login. And the rest, as they say, is history.

Joint Data Controllers

The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times, joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by, or in relation to, the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.

The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within the scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.

The Use of Data Analytics for Political Purposes

The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies is likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned, and the Commissioner was unable to determine, on the basis of the information before her, whether FB was correct. However, she nevertheless concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, and who to share it with.

As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.

Susan Wolf will be delivering these upcoming workshops and the forthcoming FOI: Contracts and Commercial Confidentiality workshop which is taking place on the 10th December in London. 

Our 2019 calendar is now live. We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. 

Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.


iPhone -> abcPhone

By Paul Simpkins

First the joke

I had a friend who played in a band. When he got his new smart phone he put all his gigs for the next 12 months into his calendar with an alert set for the day before so he knew when he was needed and he could plan the rest of his life.

A few days later his fellow band members rang him from a venue saying “Where are you, we’re on stage in 3 hours…”.

He looked at his phone and found that almost all the dates he’d typed in hadn’t gone into his calendar. Only 8 were listed; the rest had disappeared. He dashed down to the phone shop and asked why, to which the teenage assistant replied “Sorry mate, you’ve only got an 8 gig phone” [groan…]

But do you really need a phone with massive capacity and hundreds of apps? Do you need two-level security, or thumbprint login, or many of the fancy apps that make your life so complicated (sorry, efficient)?

Is there a market for a simpler smartphone (maybe a dumbphone) that just has 8 key apps built in and no possibility of adding any more? We could call it the 8 app phone, to remind us of the old joke. There would only be one home screen, so we could call it… the screen.

Many old people don’t use 99% of the functionality of a smartphone. Yes, youngsters are in constant contact with every social media platform that exists and are forever uploading and viewing videos of their friends eating junk food in branded outlets while streaming Spotify tunes, but do we need all this connectivity?

This revolutionary concept crossed my mind this morning. I’d installed an update on my i-phone and instead of getting on with being my faithful companion my phone reverted to Hello Hola mode. All I had to do was set it up again and all my data would mysteriously flow back through the air to fill it up again. The problem was that I couldn’t remember my i-Cloud code as I’d bravely migrated to (see I can speak the lingo) two level authentication a few days ago. The phone wasn’t playing until it had the code. (I know I should have written it down on a yellow post it note but most of my reminders are in Notes on my phone). I also know that apple groupies will now be screaming “you stupid old git” at their screens and I acknowledge that I don’t know the front end of a universal serial bus from the back end but I’m happy in my own way. I just don’t see the point of unasked for updates that add on features I don’t think I’ll ever use. I’m often quite a few updates late and I still don’t know why I accepted this one so readily.

I went on the web and signed in with my Apple ID and it said no problem we’ll send a 6 letter code to your trusted device and you can type it in and you’ll be fine. Unfortunately my trusted device was the phone that the update had turned into a small door stop so the code I needed to unlock it was stopping at the door and not going in.

I rang Apple support and pointed out the problem and they ummed and ahhhed for 30 minutes before deciding that I had to take the SIM out of the phone, put it into another phone, set it up as a clone of my small doorstop, look in the text inbox, retrieve the code I’d been sent, take the SIM out, return it to my small doorstop and type in the code which would make my door stop suddenly metamorphose into a beautiful smartphone and fly off into the sunset.

The local phone shop refused to do it as it might lock the donor phone, so I went home to find an old iPhone. Soon I had no iCloud code and 2 locked phones.

Fortunately I also had a MacBook and an IT literate partner, and for 3 hours we trawled the web, switched off this, switched on that, reset the donor phone and, by trying every possible route through the Hello Hola roadblock, finally made it work. Then we saw 9 texts, each containing a 6 letter unlock code.

With feverish glee we put the SIM back where it belonged and tried to replicate the process. We did at one stage receive an email message saying that someone in Middlesbrough had tried to sign into my account, but ignored it as it was so obviously a ruse de guerre. (Heckmondwike yes, but Middlesbrough no way…). An hour later we’d made it. It involved changing an Apple ID password and several cups of coffee and a few cookies, but we made it. By now darkness had fallen and we were both too tired to actually use the phone.

Back to the brilliant idea. The next development for Apple after the i-phone should (obviously) be the j-phone. The J stands for “just a few things on the” phone. Essentials are phone, text, web, calendar, maps, settings, camera, contacts and nothing else. (There will be a focus group later to decide which 8 are essential). {We’ll make them big icons while we’re at it}. But let’s make it even simpler and, to save a lawsuit, just call it the abc-phone. Being as there’s no video or music or social media, this can be produced cheaply and sold only to anyone who can produce a bus pass or a senior rail card (with photo ID – we’re not letting any spotty youngsters in on the secret). There’ll be no real security on the phone – if someone pinches it there will be no value to sell on, and the user can just buy another.

Over to you Apple…


A grumpy old man.


Make 2017 the year you achieve a GDPR qualification. See our full day workshops and new GDPR Practitioner Certificate.





Who’s afraid of the big bad cloud?

By Frank Rankin

For when you first begin to undertake it, all that you find is a darkness, a sort of cloud of unknowing; you cannot tell what it is…

The Cloude of Unknowynge, Anonymous, 14th Century

When it comes to IT, “Cloud” is still a scary word for many organisations. The language doesn’t help – “Cloud” suggests an arrangement that is (literally) nebulous rather than the mature industry expected to be worth almost 200 billion dollars per year by the end of the decade[i]. The apprehension is largely expressed in terms of concerns about the robustness of security (let’s call those Principle 7 concerns) and the suspicion that cloud providers will store data willy-nilly on servers in far-off, non-European lands (we’ll call those Principle 8 concerns). But often these concerns are raised without any real attempt to explore what the risks are, or to look at the solutions and controls offered by cloud providers and others.

To be clear of our terms, let’s borrow from the US government definition: “Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” [ii]

In other words, computing capacity is purchased as a commodity from the supplier, in contrast to an organisation purchasing and managing its own servers or software. And that means a transfer of controls.

Models of cloud computing range from Software as a Service (SaaS) such as Office 365 and Google Docs, through Platform as a Service (PaaS) to Infrastructure as a Service (IaaS) where the purchaser buys virtualized capacity and runs its own software.

Depending on the flavour of Cloud service being used, the scale of that transfer of controls varies as the diagram below illustrates.

[Diagram: from Cloud Security and Privacy by Mather and Kumaraswamy]

And it is that transfer that is a source of nervousness for organisations. But it needn’t be. The cloud providers have invested heavily in information security and see good security as a market differentiator. Vendors such as Microsoft and Amazon Web Services advertise their certification to ISO 27001:2013 and other national and international standards, and provide (within reason) detailed descriptions of their security arrangements. It is up to us, as purchasers of cloud services, to make our own risk assessment with regard to our information assets, and to assess the adequacy of the offerings of the cloud vendors.

While using cloud does involve the transfer of controls, we should be honest enough to recognise whether this is likely to offer an improvement in the efficacy of those controls. To take one example, your own IT colleagues may be good and conscientious at applying software patches and updates, but it is unlikely that they can respond as timeously and consistently as the big cloud providers.

In making our assessments, we can be guided by resources such as the UK Government Cloud Security Principles against which suppliers listed on the G-Cloud are expected to self-assess.

Where the purchaser sees a need for further security controls in addition to the out-of-the-box cloud offerings, there is an extensive eco-system of third party vendors who specialise in add-on solutions for security, records management and other governance challenges around the cloud.

As long as the transfer of control is done transparently, and an organisation has clearly mapped out the locus for each required security control (on premise, core cloud offering or third-party solution) then you should be in a good position to assure yourself of the ongoing robustness of your information security on the cloud.

So much for Principle 7 of the Data Protection Act 1998.

The data protection concerns relate to the globalised nature of cloud provision. Perhaps in the early stages, the big cloud players in the USA didn’t always “get” European privacy concerns.

But the cloud providers have matured in their understanding of these issues.  That is why, for example, Microsoft offer European customers guarantees that their Office 365 or Azure solutions will be hosted within Europe (Dublin and Amsterdam at the moment with a U.K. data centre due to open shortly.) The larger vendors, such as Amazon, are happy to provide European customers with data processing agreements which incorporate the Model Clauses, and in some cases have received Article 29 Working Party approval of their contractual terms.

Think of the relationship between cloud customer and vendor as just like any of your existing relationships between data controller and data processor – only on a larger scope and scale.

And the shift in the EU General Data Protection Regulation (GDPR), under which data processors will be liable for data processing actions they take which go against or beyond the instructions of the data controller, should only increase the level of assurance for European cloud purchasers. (I am not going into Brexit here, but as our GDPR expert has explained here, GDPR is still relevant post-Brexit. More on the security requirements of GDPR here.)

A risk-based approach to assess the offerings of a cloud vendor should give assurance that the requirements of Principle 8 of the Data Protection Act 1998 are met.

Act Now is not in the business of promoting cloud providers – they do a good enough job of that themselves. But concerns around data protection and information security need not be a barrier to adopting cloud-based technology. Colleagues or stakeholders who argue that these issues are show-stoppers may have an incomplete understanding of the current state of play, or may have another agenda in mind.

So, in considering transferring information assets to the cloud, information governance practitioners should:

  • Carry out an information risk assessment, including a realistic understanding of threats and identifying the possible risks arising from keeping the data on the premises.
  • Make sure that information governance and security issues are “front-loaded” and made central to the procurement process: Many of the key controls and protection for the organisation have to be in the terms of the contract.
  • Understand the geographical location of the provider’s data centres and, where relevant, include contractual terms stating where your data must be held.
  • Survey the available third party security and governance add-on tools for the cloud, but be wary of the vendors’ claims and measure the value of their offerings against a realistic understanding of your specific risks.

Ultimately, whether to move to the cloud or not will be a decision for the wider business, but privacy and information security professionals can help to make that decision an informed one.

Frank Rankin is an information security, FOI and records management expert. Amongst other courses he is currently delivering our Practitioner Certificate in Freedom of Information (Scotland).




Information Governance in Health & Social Care Conference

Act Now is pleased to announce that it will be holding a major conference in the new year on the 24th of March entitled ‘Health Now – Information Governance in Health and Social Care – Where are we now?’ Speakers from the ICO, many areas of the NHS, NADPO and Act Now will be meeting in Leeds to discuss the future of information governance and patient care.

If you work in information governance, records management, data protection, freedom of information, IT, compliance, information and compliance management, data & information management then this is for you. Over 100 delegates are expected from Local and Central Government, Health and Social Care and associated sectors.

To download your advance copy of the conference flyer click here. With a delegate fee of only £199 we expect a high demand for places. Book Now for Health Now! See our other courses for the health and social care sector here.

Act Now Book Draw Week 6

The winner of this week’s Act Now Book Draw was Peter Dinsdale from Newcastle University.

Next week’s book is Gringras: The Laws of the Internet (3rd Edition) by Elle Todd.

The next draw will take place on Wednesday 4th April at 9am. Click here to enter the draw.

If you enter the draw and win, you give us permission to let others know that you have won (by email, on our website and on Twitter). If you do not want us to do this, please do not enter the draw. Any information we receive through this free draw will not be used for any other purpose.
