Act Now Launches New FOI Practitioner Certificate


FOI Certificate Banner

Act Now is pleased to announce the launch of its brand new FOI Practitioner Certificate.

This course is one of the first of its kind, delivered in the way that only Act Now can – with practical, on-the-ground skills to help you fulfil your role as an FOI Officer.

This new certificate course is ideal for those wishing to acquire detailed knowledge of FOI and related information access legislation (including EIR) in a practical context. It has been designed by leading FOI experts including Ibrahim Hasan and Susan Wolf – formerly a senior lecturer on the University of Northumbria’s LLM in Information Rights Law.

The course uses the same format as our very successful GDPR Practitioner Certificate. It takes place over four days (one day per week) and involves lectures, discussion and practical drafting exercises. This format has been extremely well received by the over 1,000 delegates who have completed the course. Time will also be spent at the end of each day discussing the issues delegates may face when implementing or advising on the FOI topics of the day.

The four teaching days are followed by an online assessment and a practical project to be completed within 30 days.

Why is this course different?

  • An emphasis on practical application of FOI rather than rote learning
  • Lots of real life case studies and exercises
  • An emphasis on drafting Refusal Notices
  • An online Resource Lab with links, guidance and over 5 hours of videos
  • Modern assessment methods rather than a closed book exam

Who should attend?

This course is suitable for anyone working within the public sector who needs to learn about FOI and related legislation in a practical context, as well as those with the requisite knowledge wishing to have it recognised through a formal qualification. It is most suitable for:

  • FOI Officers
  • Data Protection Officers
  • Compliance Officers
  • Auditors
  • Legal Advisers

Susan says:

“FOI and EIR are almost 14 years old. Since the Act and Regulations came into force there have been many legal developments and court decisions that have given practitioners a much greater understanding of the legal provisions and how they should be applied in practice. With this in mind, we have written this course to ensure that it equips public sector officers with all the necessary knowledge and skills they need to respond to freedom of information requests accurately and efficiently. This course, with its emphasis on the law in practice, will enable trainees to become more accomplished and confident FOI practitioners.”

Susan will share her vast experience gained through years of helping organisations comply with their information rights legislation obligations. This, together with a comprehensive set of course materials and guidance notes, will mean that delegates will not only be in a position to pass the course assessment but to learn valuable skills which they will be able to apply in their workplaces for years to come.

This new course builds on Act Now’s reputation for delivering practical training at an affordable price.

This new course widens the choice of qualifications for IG practitioners and advisers. Ibrahim Hasan (Director of Act Now Training) commented:

“We are pleased to be able to launch this new qualification. Because of its emphasis on practical skills, we are confident that it will become the qualification of choice for current and future FOI Officers and advisers.”

To learn more please visit our website.

All our courses can be delivered at your premises at a substantially reduced cost.
Contact us for more information.

The Facebook Data Breach Fine Explained



On 24th October the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine that she could under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal, the fine might seem small beer for an organisation that is estimated to be worth over 5 billion US Dollars. Had the same facts played out after 25th May 2018, the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.

The Facts

In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company, Global Science Research (GSR), to operate the App in conjunction with FB from November 2013 to May 2015. The App was designed to and was able to obtain a significant amount of personal information from any FB user who used it, including:

  • Their public FB profile, date of birth and current city
  • Photographs they were tagged in
  • Pages they liked
  • Posts on their timeline and their news feed posts
  • Friends list
  • Facebook messages (there was evidence to suggest the app also accessed the content of the messages)

The App was also designed to and was able to obtain extensive personal data from the FB friends of the App’s users and anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, nor did they give their consent.

The App was able to use the information that it collected about users, their friends and people who had messaged them to generate personality profiles. This information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controls the now infamous Cambridge Analytica).

Facebook Fine Graphic

In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and the friends of users that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.

Breach of the DPA

The Commissioner’s findings about the breach make sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply or ensure compliance with their own FB Platform Policy, and were not aware of this fact until exposed by the Guardian newspaper in December 2015.

The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:

  1. Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of the App’s users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for processing.
  2. Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised (it was inconsistent with the basis on which FB allowed Dr Kogan to obtain access to personal data for which they were the data controller; it breached the Platform Policy and the Undertaking). The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised and unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles, and that they failed to take reasonable steps to prevent such a contravention.

Breach of FB Platform Policy

Although the FB companies operated a FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they didn’t check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact, they failed to implement a system to carry out such reviews. It was also found that the use of the App breached the policy in a number of respects, specifically:

  • Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead, Dr Kogan and GSR were able to use it for their own purposes.
  • Personal data collected by the App should not have been sold to third parties. Dr Kogan and GSR had transferred the data to three companies.
  • The App requested permissions from users to obtain personal data that it did not need, in breach of the policy.

The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. However, perhaps one of the worst indictments is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11th December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access rights to Facebook Login. And the rest, as they say, is history.

Joint Data Controllers

The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times, joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.

The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.

The Use of Data Analytics for Political Purposes

The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies was likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned, and the Commissioner was unable to determine, on the basis of the information before her, whether FB was correct. However, she nevertheless concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, and with whom to share it.

As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.

Susan Wolf will be delivering the forthcoming FOI: Contracts and Commercial Confidentiality workshop, which is taking place on the 10th December in London.

Our 2019 calendar is now live. We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. 

Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.



Lloyd v Google – representative action for damages fails under the DPA 1998


As more individuals become aware of the way in which organisations such as Facebook and Google have used their personal data unlawfully, the prospect of litigation, and class actions, seems increasingly likely. However, the recent case of Lloyd v Google [2018] EWHC 2599 (QB) demonstrates that it doesn’t necessarily follow that a clear breach of data protection legislation will result in a successful claim for damages. The case shows that even if claimants can prove that there has been a breach of data protection legislation (now the GDPR and DPA 2018), they need to identify what harm the breach has caused and how the damage was caused by the breach. This will inevitably be a fact specific exercise.

The background – the Safari Workaround and the DoubleClick Ad cookie

The case concerned the use, by Google, of a cookie known as the “DoubleClick Ad cookie” between 2011 and 2012. Google allegedly used the cookie to secretly track the internet activity of iPhone users in the US and the UK. Ordinarily the Safari browser (developed by Apple) had a default setting that blocked the use of third-party cookies, such as the DoubleClick Ad cookie. However, Google was able to exploit certain exceptions to this default blockage and implement the so-called “Safari Workaround”, which enabled Google to set the cookie on an iPhone when the user used the Safari browser. This gave Google access to a huge amount of browser generated personal information, including the address or URL of the website which the browser is displaying to the user. It was claimed that this information enabled Google to obtain or deduce other sensitive information about individuals, such as their interests and habits, race or ethnicity, class, political or religious views, health, age, sexuality and financial position. Google was also alleged to have aggregated this information to create lists of different types of people, such as “football lovers”, and offered these lists to subscribing advertisers.

Regulatory action was taken against Google in the USA, with Google agreeing to pay a US$25.5 million civil penalty to settle charges brought by the US Federal Trade Commission, and a further US$17 million to settle state consumer-based actions. No such regulatory action was taken by the Information Commissioner, even though the breach clearly affected UK iPhone users.

The representative claim

The action against Google was brought by Mr Lloyd, the only named claimant. However, he brought the action as a representative of a much larger class of people. This is a novel type of litigation that allows a representative to sue in a representative capacity on behalf of a class of people who have “the same interest” in the claim. It was not entirely clear how big the class was, but estimates ranged between 4.4 and 5.4 million people. Google, not surprisingly, was keen that permission be denied, bearing in mind that it estimated its potential liability (were the case successful) at between £1 and £3 billion.

Mr Lloyd argued that he and each member of the group/class he represented had a right to be compensated “for the infringement of their data protection rights”. Specifically, it was alleged that Google had carried out the secret tracking and collation of personal data without the data subject’s consent or knowledge; that this was a breach of Google’s duty under s 4(4) of the DPA 1998 and that the data subjects were entitled to compensation under s 13 DPA 1998.

In other words, the fact of the contravention gave them a right to be compensated. Neither Mr Lloyd nor any member of the group alleged or gave evidence of any financial loss, distress or anxiety. There were no individual allegations of harm. In fact, Mr Lloyd asserted that the claim was generic and claimed an equal, standard “tariff” award of £750 for each member of the class. This turned out to be fatal to the claim.

Litigation against a US based company

Any litigant, or group of litigants, considering an action against Apple, Google or any other company based outside the UK must first obtain the permission of the High Court in order to serve a claim on a defendant outside the jurisdiction of the domestic courts. Before the court will grant permission, the claimant must prove three things: first, that the case falls within one of the listed “jurisdictional gateways”; second, that the case has a reasonable prospect of success; and finally, that England is the appropriate place to deal with the case. The High Court had no difficulty deciding that England would be the natural jurisdiction for the case, since the claimants were all in the UK and the alleged damage had been incurred in the UK. However, the High Court Judge found that Mr Lloyd’s case failed on the remaining two issues and denied permission for the case to proceed.

The Court identified that the relevant gateway in this case was that the claimant had to prove they had a good arguable claim in tort and the damage was sustained in England & Wales.  The Judge was clear that a claim for damages under the DPA 1998 is a claim in tort. He was also satisfied that each member of the class was (for at least some of the relevant period) within the jurisdiction when they connected to the internet using the Safari browser.

However, the real and substantial issue in this case was whether the Safari Workaround had caused “damage” within the meaning of the DPA 1998.  The Court engaged in a lengthy analysis of the case law on DPA damages and concluded that the claimants had not sustained damages in this case. On this basis the court decided that Mr Lloyd did not have a good arguable case or a reasonable prospect of success.

Damages under the DPA 1998

Section 13 of the DPA 1998 provided that an individual who suffers damage by reason of any contravention by a data controller of any of the requirements of the DPA 1998 is entitled to compensation from the data controller for that damage.

The High Court decided that giving the words their natural meaning, this statutory right to compensation arises where

(a) there has been a breach of the DPA; and

(b) as a result, the claimant suffers damage.

These are two separate events connected by a causal link.  In short, the breach must cause the damage. Based on this logic, it necessarily follows that some breaches will not give rise to damages.  The High Court judge suggested some examples where a data controller processes personal data in breach of the DPA, but where the breach may not warrant an award of compensation, such as:

  • Recording inaccurate data, but not using or disclosing it
  • Holding, but not disclosing, using or consulting personal data that are irrelevant
  • Holding data for too long
  • Failing, without consequences, to take adequate security measures.

Of course, this is not to say that these types of breaches could never give rise to a successful claim for damages, as much will depend on the context and facts of the case. However, the Court did suggest that data subjects had alternative remedies such as rectification, erasure and objection.

One of the key arguments presented by Lloyd was that the claimants had incurred damage because they lost control of their data. According to the Court, there will be circumstances where the loss of control may have significantly harmful consequences, such as in Vidal-Hall (Google Inc v Vidal-Hall and others & The Information Commissioner [2015] EWCA Civ 311). The focus in that case was on the significant distress caused to the claimants by the delivery to their screens of unwanted advertising material. However, the decision was very fact specific; the type of information that had been secretly tracked and used to send targeted advertising was of a particularly private and sensitive nature, such that it would have caused harm to the claimants had anyone else seen their computer screens.

The High Court in Lloyd v Google also accepted that the delivery of unwanted commercial advertising can be upsetting in other ways, for example where repeated or bulk unwanted communications:

  • Are so distressing that they constitute harassment, even if the content is inherently innocuous
  • Infringe a person’s right to respect for their autonomy
  • Represent a material interference with their freedom of choice over how they lead their life.

However, on the facts of the case the Court concluded that the claimants had not provided any particulars of any damage suffered. Rather, they seemed to be relying on the contention that they were entitled to be compensated because of the breach alone. The judge rejected this as a possibility.

A Court cannot award compensation just because the data protection rules have been breached. The Court also rejected the idea that the claimants should be compensated in order to “censure” the defendant’s behaviour, and rejected any argument that damages under the DPA should be awarded on a sort of “restitutionary” basis, that is, ‘calculated by reference to the market value of the data which has been refused’.

Representative action cases – what lessons can be learnt?

This was novel litigation: it involved one named claimant bringing an action on behalf of a large group. The action faced difficulties right from the start, not least in trying to identify the group. The Judge identified three real difficulties with this type of action:

  1. The representative (Lloyd) and the members of the class do not all have the “same interest” which is an essential requirement for any representative action. Some people may have suffered no damage and others different types of damage. Consequently, they did not all have the same interest in the action.
  2. Even if it was possible to define the class of people represented, it would be practically impossible to identify all members of the class.
  3. The court would not exercise its discretion to allow the case to go forward, particularly given the costs of the litigation, the fact that the damages payable to each individual (were the case to succeed) would be modest, and the fact that none of the class had shown any interest in, or appeared to care about, the claim.

Anyone contemplating pursuing this type of claim in future would be well advised to carefully consider and take on board the judge’s criticisms, and to seek to address them before pursuing an action.


Susan Wolf will be delivering the forthcoming GDPR workshop in Birmingham on the 19th November. Book your place now! 


Act Now launches GDPR Policy Pack


The first fine was issued recently under the General Data Protection Regulation (GDPR) by the Austrian data protection regulator. Whilst relatively modest at 4,800 Euros, it shows that regulators are ready and willing to exercise their GDPR enforcement powers.

Article 24 of GDPR emphasises the need for Data Controllers to demonstrate compliance through measures to “be reviewed and updated where necessary”. This includes the implementation of “appropriate data protection policies by the controller.” This can be daunting especially for those beginning their GDPR compliance journey.

Act Now has applied its information governance knowledge and experience to create a GDPR policy pack containing essential documentation templates to help you meet the requirements of GDPR as well as the Data Protection Act 2018. The pack includes, amongst other things, template privacy notices as well as procedures for data security and data breach reporting. Security is a very hot topic after the recent £500,000 fine levied on Equifax by the Information Commissioner under the Data Protection Act 1998.

We have also included template letters to deal with Data Subjects’ rights requests, including subject access. The detailed contents are set out below:

  • User guide
  • Policies
    • Data Protection Policy
    • Special Category Data Processing (DPA 2018)
    • CCTV
    • Information Security
  • Procedures
    • Data breach reporting
    • Data Protection Impact Assessment template
    • Data Subject rights request templates
  • Privacy Notices
    • Business clients and contacts
    • Customers
    • Employees and volunteers
    • Public authority services users
    • Website users
    • Members
  • Records and Tracking logs
    • Information Asset Register
    • Record of Processing Activity (Article 30)
    • Record of Special Category Data processing
    • Data Subject Rights request tracker
    • Information security incident log
    • Personal data breach log
    • Data protection advice log

The documents are designed to be as simple as possible while meeting the statutory requirements placed on Data Controllers. They are available as an instant download (in Word format). Sequential file names make locating each document easy.

Click here to read sample documents.

The policy pack gives a useful starting point for organisations of all sizes both in the public and private sector. For only £149 plus VAT (special introductory price) it will save you hours of drafting time. Click here to buy now or visit or our website to find out more.

Act Now provides a full GDPR course programme including one day workshops, e-learning, health checks and our GDPR Practitioner Certificate.

Equifax Ltd fined £500,000 for significant breaches of the DPA 1998


On 20th September the Information Commissioner issued Equifax Ltd with a £500,000 monetary penalty, the biggest fine it has issued to date and the maximum allowed under the Data Protection Act 1998. Although half a million pounds might sound a significant amount of money, it represents a relatively modest amount compared to the fine the company might have received had the breach occurred 12 months later, under the GDPR regime.

In this blog we consider the incident, the actions of the parties and we speculate on what type of sanctions the company could have faced under the GDPR.

The background

Equifax Ltd is a major credit reference agency based in the UK. Since 2011 it has offered a product called the Equifax Identity Verifier (EIV) which enables clients to verify the identity of their customers online, over the telephone or in person. To verify an individual’s identity, the client enters that individual’s personal information on the Equifax system, which is then checked against other sources held by Equifax Ltd. Initially the EIV data was processed by its US parent, Equifax Inc. Equifax Ltd in the UK was the data controller and Equifax Inc in the USA was the data processor. In 2016, Equifax Ltd transferred the data processing for the EIV product to the UK. This required the migration of the personal data to the UK. However, the US company did not then delete all the UK personal data from its systems, which it should have done as it had no lawful reason for continuing to store this data.

The cyber-attack incidents

Equifax Inc was subject to a number of cyber-attacks between 13 May and 30 July 2017. During this period the attackers exploited a vulnerability in the US company’s online consumer-facing disputes portal. This enabled the attackers to access the personal data of about 146 million individuals in the USA. Additionally, they were able to access the name and date of birth of up to 15 million UK individuals contained in the EIV dataset. In addition, for some 637,430 UK data subjects, their telephone numbers and driving licence numbers were also compromised.

An additional data set (the GCS dataset) was also attacked and this allowed the hackers to access the email addresses of over 12,000 UK individuals. More significantly, for another 14,961 UK residents the compromised data was account information for Equifax’s credit services and included data subjects’ name, address, date of birth, user name, password (in plain text), secret question and answer (also in plain text), credit card number (obscured) and some payment amounts. This personal data was held in a plain text file, as opposed to the actual database. The storage of password data in plain text was contrary to the company’s Cryptography Standard, which specifically required that passwords be stored in encrypted, hashed, masked, tokenised or other form. The file was held in a file share which was accessible to multiple users.

In March 2017 Equifax Inc received warning of a vulnerability in its Apache Struts 2 web application framework (which it used in its consumer-facing online disputes portal). The warning came from the US Department of Homeland Security Computer Emergency Readiness Team, which identified a critical level of vulnerability. The US company disseminated this warning to key personnel, but the consumer-facing portal was neither identified nor patched.

Equifax Inc became aware of the cyber-attack on 29 July 2017, and became further aware that the data of UK individuals had been compromised by late August 2017. However, Equifax Inc failed to warn Equifax Ltd until 7th September 2017, at least a week after it became aware that the UK personal data had been compromised.

Equifax Ltd notified the ICO on 8th September. In this respect, its behaviour would have met the strict breach notification requirements of the GDPR, which require a data controller to notify the Commissioner within 72 hours of becoming aware of a breach. Initially it reported that about 1.49 million individuals’ data had been lost. This was later revised upwards to 15 million data subjects. It also indicated, incorrectly, that the data accessed did not include residential addresses or financial information.

The Information Commissioner’s Findings

On the facts, the Information Commissioner decided that although the information systems in the USA were compromised, Equifax Ltd was the data controller responsible for the personal data of its UK customers. The Commissioner found that Equifax had failed to take appropriate steps to ensure that its US parent, and data processor, was protecting the information. The Monetary Penalty Notice lists the various contraventions of the DPA 1998:

  • Principles 5, 2 and 1
    • Following the migration of the EIV dataset from the US to the UK, it was no longer necessary for the US company to keep any of the data. The data set had not been deleted in full and was kept longer than necessary.
    • In relation to the GCS dataset stored on the US system, Equifax Ltd was not sufficiently aware of the purpose for which it was being processed until after the breach. In the absence of any lawful purpose the retention was unnecessary.
    • The UK company failed to follow up or check that the data had been removed from the US systems, or to have an adequate process in place to check this was done.
  • Principle 7
    • Equifax had not undertaken adequate risk assessment(s) of the security arrangements put in place by its data processor, either before transferring the data to it or following the transfer.
    • The Data Processing Agreement between Equifax Ltd and Equifax Inc was inadequate and failed to provide appropriate security safeguards or incorporate the standard contractual clauses.
    • Equifax Ltd had failed to ensure adequate security measures were in place. The Commissioner identified numerous examples of the inadequacy of the safeguards that were in place, including the lack of encryption; the storage of plaintext data, allowing multiple users to have access to plaintext files; failing to address IT vulnerabilities; having out-of-date software; and failing to undertake sufficient and regular system scans.
    • Poor communications between the UK and US companies, particularly in relation to the US company’s delay in making the data controller aware of the breach.
  • Principle 8
    • The Data Processing Agreement between Equifax UK and Equifax Inc was inadequate in that it failed to incorporate the standard contractual clauses as a separate agreement and/or to provide appropriate safeguards for data transfers outside the EEA.
    • There was therefore a lack of a legal basis for the international transfer of this data.

Overall, the Information Commissioner found multiple failures at Equifax Ltd, which led to personal information being kept longer than necessary and left vulnerable to unauthorised access. Given the nature of the breaches, individuals were exposed to the risk of financial and identity fraud. The Commissioner concluded that the maximum financial penalty she could levy was proportionate in all the circumstances.

What difference would it make if this happened under the GDPR?

If the same breaches had occurred after 25th May 2018, both Equifax Ltd and Equifax Inc. might find themselves in a substantially different situation.

The level of fine: The most obvious difference would be in relation to the level of fine the ICO could impose. Under Article 83 GDPR the ICO can impose a fine of up to £17 million (€20 million) or 4% of global annual turnover, whichever is higher. Equifax Ltd is part of a global group that operates or has investments in over 24 countries. According to its 2016 Annual Report, the Equifax Group’s global annual revenue for 2016 was $3,144.9 million. 4% of this is about $125 million. In 2016 the UK company, Equifax Ltd, recorded revenue of £114.6 million. This alone could lead to a fine of over £4.5 million.
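The arithmetic above can be sketched in a few lines. This is purely illustrative, using the revenue figures quoted in this article; the `max_gdpr_fine` helper is our own shorthand for the Article 83 rule (the higher of the statutory cap or 4% of total worldwide annual turnover), and assumes the cap and turnover are expressed in the same currency units.

```python
def max_gdpr_fine(annual_turnover_m: float, statutory_cap_m: float = 20.0) -> float:
    """Article 83 maximum fine: the higher of the statutory cap
    (EUR 20m for the most serious infringements) or 4% of total
    worldwide annual turnover. Both figures in millions, same currency."""
    return max(statutory_cap_m, 0.04 * annual_turnover_m)

# Equifax group global revenue for 2016: ~$3,144.9m,
# so the 4% limb dominates the EUR 20m cap.
print(round(max_gdpr_fine(3144.9), 1))  # roughly 125.8 ($m)

# Equifax Ltd (UK) 2016 revenue: £114.6m; 4% of that alone is ~£4.58m.
print(round(0.04 * 114.6, 2))
```

For a small company, the €20 million cap is the binding figure; the 4% limb only takes over once worldwide turnover exceeds €500 million.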

Data Subjects’ rights to sue for damages: Although this is not a new right under the GDPR, the GDPR now expressly permits individuals to sue for both material (financial) and non-material damage, such as distress. In many respects this represents a bigger risk for companies such as Equifax, which process data whose loss could cause significant harm to data subjects. Given the heightened public awareness of the GDPR, it is not difficult to anticipate that these types of high-volume breaches could result in class actions for compensation.

Breach Notification: Article 33 requires data processors to notify data controllers ‘without undue delay’ after becoming aware of a data breach. The delay on the part of the US company in informing the UK company would constitute a breach of Article 33.

Notifying Data Subjects: Under Article 34 GDPR the Data Controller has a duty to notify data subjects that their personal data has been breached, where the breach is likely to result in a high risk to their rights and freedoms. Equifax Ltd issued a press release on 7th October 2017 saying that it would “now begin writing to all impacted customers with immediate effect”. This again does not meet the requirement of notification ‘without undue delay’.

We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. New Dates added for London!

Need to train frontline staff quickly? Try our extremely popular GDPR e-learning course.

Don’t forget about our GDPR Helpline; it’s a great tool to use for some advice when you really need it.

European Parliament approves text of forthcoming EU Regulation on the Free Flow of Non-Personal Data within the European Union


On 4th October 2018 the European Parliament (by 520 to 81 votes) agreed the text of the proposed EU Regulation on the Free Flow of Non-Personal Data in the European Union. The draft Regulation was proposed by the European Commission in 2017, as part of its Digital Single Market Strategy. The European Parliament, Council of Ministers and the European Commission reached a political consensus on it in June 2018. This adoption by the Parliament brings the regulation one step closer to becoming law. All that remains now is for the Council of Ministers to agree it on 6th November. It will then enter into force by the end of the year, although Member States will have 6 months to apply the new rules. This means that it will enter into force before the UK exits the European Union in March 2019.

Background to the proposal

The European Commission proposed this regulation as part of its Digital Single Market Strategy.

According to the EU Commission the value of the EU data market in 2016 was estimated to be almost 60 billion Euros, with one study suggesting it could increase to more than 106 billion Euros by 2020.  The new regulation is designed to unlock this potential by improving the mobility of non-personal data across borders. According to the EU Commission, the free flow of non-personal data is hampered by:

  • National rules and administrative practices that restrict where data can be processed and stored. The regulation refers to such rules as data localisation requirements;
  • Uncertainty for organisations and the public sector about the legitimacy of national restrictions on data storage and processing;
  • Private restrictions (legal, contractual and technical) that hinder or prevent users of data storage or other processing services from porting their data from one service provider to another, or back to their own IT systems (so-called vendor lock-in).

The aims and outline of the regulation

The regulation only applies to the processing of non-personal electronic data. However, like the GDPR, its territorial scope is wide and includes the processing of electronic data which is:

  • provided as a service to users residing or having an establishment in the EU, regardless of whether the service provider is established in the EU; or
  • carried out by a natural or legal person (an individual, business, organisation or a public authority) residing or having an establishment in the EU for its own needs.

Processing is also defined in very similar terms to the GDPR – as meaning any operation or set of operations which is performed on data or on sets of data in electronic format, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction. Unlike the GDPR, it only relates to data in electronic format. Its application is wide and encompasses outsourced data storage, processing of data on platforms, or in applications.

The regulation does not apply to personal data (see below).

National rules on data storage (data localisation requirements)

The regulation aims to ensure the free movement of non-personal data within the European Union by laying down a set of rules relating to national data processing localisation rules.   These are essentially any rules, laws or administrative practices that restrict, prohibit, limit or impose conditions on where data can be processed. The regulation states that such data localisation requirements are prohibited. Member States have 24 months to repeal any such laws.

However, Member States can retain or introduce data localisation rules provided they are justified on the grounds of public security and that the rules are proportionate. In the original proposal Member States would have only had 12 months, but this was extended to 24 months by the European Parliament. Although the main body of the regulation doesn’t define public security, the recitals refer to the fact that the term has been interpreted widely to include both internal and external public security, as well as issues of public safety.

Data Availability for Competent Authorities

The regulation does not affect the powers of ‘competent’ authorities to request or obtain access to data for the performance of their official duties. The definition of competent authority is wide and includes any authority of a Member State, or any other entity authorised by national law to perform a public function or to exercise official authority, that has the power to obtain access to data processed by a natural or legal person for the performance of its official duties, as provided for by Union or national law. It therefore includes central and local government but can also include other organisations that fulfil statutory functions.

This is important, particularly if data is going to be processed in another Member State. The aim is to ensure that the powers of competent authorities to request and receive data, to enable them to fulfil their functions and regulatory powers, remain unaffected by the free movement of data. Consequently, the regulation includes a procedure for cooperation between national authorities and the possibility of Member States imposing penalties for failure to comply with an obligation to provide data.

The regulation also establishes a single point of contact for each Member State, to liaise with the contacts in other Member States, and the Commission. The aim is to ensure the effective application of the new rules.

Data Portability

The Regulation also seeks to encourage and facilitate data portability via the use of self-regulatory codes of conduct and certification schemes. The European Commission’s role is to encourage, for example, cloud service providers to develop self-regulatory codes of conduct for easier switching of service provider and porting data back to in-house servers. These must be implemented within 18 months of the regulation coming into force (mid 2020).

Reference is also made to certification schemes that facilitate comparison of data processing products and services for professional users. Such certification schemes may relate to quality management, information security management or environmental management.


The European Commission is tasked with monitoring development and implementation of these codes of conduct.

The new regulation does not apply to personal data

The regulation concerns non-personal data and does not cover personal data. Data Protection practitioners will no doubt be relieved to know that this means it will have no impact on the GDPR. According to the European Commission, the two regulations will operate together to enable the free flow of any data, both personal and non-personal, “creating a single European space for data”.

In the case of a data set composed of both personal and non-personal data, this new Regulation applies to the non-personal data part of the data set. Where personal and non-personal data in a data set are inextricably linked, this Regulation shall not prejudice the application of Regulation (EU) 2016/679.

The difficulty that this raises will inevitably be a practical one: applying two different regulations to a single data set that contains both personal and non-personal data. The regulation rests on the assumption of a clear personal/non-personal data dichotomy, which in practice may be difficult to draw.

The impact of Brexit

If the new Regulation enters into force at the end of the year it will apply directly in the UK as per any other Member State. It will remain in force after the date of exit because of the provisions of the EU Withdrawal Act 2018.

After the date of exit, the UK will no longer be a Member State. The regulation effectively allows any non-personal data to be stored and processed anywhere in the EU. It does not extend this ‘right’ to storage and processing in third countries. There is of course concern that data localisation rules could be applied against data processors outside the EU, which in turn could have significant adverse business implications for UK data processors.


New RIPA Codes of Practice for Surveillance and CHIS


In August 2018 the revised Codes of Practice for Covert Surveillance and Property Interference and Covert Human Intelligence Sources (CHIS) were published. These contain substantial changes and additions which public authorities conducting surveillance under Part 2 of the Regulation of Investigatory Powers Act 2000 (RIPA) need to understand.

The codes provide guidance on when an application should be made for a RIPA authorisation, the procedures that must be followed before surveillance activity takes place and how to handle any information obtained through such activity. They are admissible as evidence in criminal and civil proceedings. Any court or tribunal considering such proceedings, including the Investigatory Powers Tribunal, as well as the Investigatory Powers Commissioner’s Office, responsible for overseeing the relevant powers and functions, may take the provisions of the codes into account.

Many of the changes in the revised codes reflect best practice guidance published in the OSC Procedures and Guidance Document, observations and commentary in OSC annual reports, and advice and guidance provided during inspections. The changes include amendments to the role of the Senior Responsible Officer and a new error reporting procedure. The codes also reflect developments in surveillance and monitoring – such as use of the internet and social media, drones, tracking devices etc. Here is a summary:

  • Private information – further information and guidance relating to internet material and investigations.
  • Tracking devices – clarification and further information.
  • Social media and internet research – substantial new sections providing clarity and detail (read our blog post for more on this topic).
  • Drones – a section providing guidance on the use of Aerial Surveillance Devices.
  • Intrusive surveillance – a further developed explanation.
  • General Observation Duties – expanded section to include such activity on the internet.
  • Surveillance not core function – a section relating to covert surveillance for ‘non-RIPA purposes’ (more on this topic in our blog post).
  • CCTV and ANPR – additional information relating to the deployment of these technologies and the relevant codes and oversight (more on this topic here).
  • Necessity and proportionality – expanded section.
  • Authorisation – a new section explaining the requirement to present the circumstances in a fair and balanced manner.
  • Collateral intrusion – further explanation is provided.
  • Handling of material obtained – a section relating to safeguards, retention and destruction of material.
  • Third parties – more clarity relating to working with third parties, including those that are not public authorities.
  • Reviews – further detail relating to the review process requirements.
  • Senior Responsible Officer – the section relating to the role of the SRO has been altered substantially and includes amendments to the role and responsibilities.
  • Covert Surveillance of a CHIS – a new section dealing with this tactic.
  • Renewals – a section that provides more information about the detail required.
  • Record Keeping – this section has been expanded to provide more detail of requirements.
  • Error Reporting – a new requirement introduced by the Investigatory Powers Act 2016. This section describes the types of errors, the reporting requirements, and the processes expected to be in place to identify errors.
  • Privileged Information – a new section with more detail relating to safeguard requirements for such information.
  • Other Legislation – a new section referring to the Criminal Procedure and Investigations Act 1996 and evidence.
  • Data Protection – a new section relating to the handling and management of material and referring to the Data Controller. (Read our blog post on GDPR and employee surveillance.)
  • Dissemination of Material – a new section relating to this aspect.
  • Copying of Material – a new section relating to this aspect.
  • Storage of Material – a new section relating to the secure storage of material obtained.
  • Destruction of Material – another new section relating to this aspect.
  • Confidential or Privileged Material – this section has been expanded to provide more detailed information about requirements.
  • Oversight – section amended to reflect the oversight role of the Investigatory Powers Commissioner’s Office, and its access to systems and material in order to fulfil that role. (More on this subject here. If you have a RIPA inspection coming up, read our guide.)
  • Complaints – this section is completely altered and provides additional information.

On 30th April 2018 the Investigatory Powers Tribunal awarded £46,694 to an individual who had complained about surveillance by British Transport Police (BTP). The determination was that the surveillance was unlawful as it had been conducted without a RIPA authorisation. BTP was criticised for, amongst other things, a lack of training and awareness amongst those involved in surveillance.

Our RIPA courses have been completely revised by our RIPA expert, Steve Morris, to include an explanation of the new codes of practice and recent developments. If you would like in-house refresher training for your staff, please get in touch.

Revised S.45 Code of Practice under FOI

Filing records

GDPR has taken the limelight from other information governance legislation, especially Freedom of Information. In July 2018, the Cabinet Office published a new code of practice under section 45 of the Freedom of Information Act 2000 (FOI), replacing the previous version.

In July 2015 the Independent Commission on Freedom of Information was established by the Cabinet Office to examine the Act’s operation. The Commission concluded that the Act was working well. It did, though, make twenty-one recommendations to enhance the Act and further the aims of transparency and openness. The government agreed to update the S.45 Code of Practice following a consultation exercise in November 2017.

The revised code provides new, updated or expanded guidance on a variety of issues, including:

  • Transparency about public authorities’ FOI performance and senior pay and benefits, implementing the FOI Commission’s recommendations for greater openness in both areas.
  • The handling of vexatious and repeated requests. The FOI Commission specifically recommended the inclusion of guidance on vexatious requests.
  • Fundamental principles of FOI not previously included in the code, e.g. general principles about how to define “information” and that which is “held” for the purposes of the Act.

In the latter section the code makes a number of interesting points:

  • A request for information disclosed as part of “routine business” is not an FOI request. Section 8 of the Act sets out the definition of a valid FOI request. Judge for yourself if this advice is accurate.
  • Information that has been deleted but remains on back-ups is not held. This goes against a Tribunal Decision as well as ICO guidance.
  • Requests for information made in a foreign language are not valid FOI requests. Again refer to section 8 above. It does not say a request has to be in English!

The code is not law, but the Information Commissioner can issue Practice Recommendations where she considers that public authorities have not complied with it. The Commissioner can also refer to non-compliance with the code in Decision and Enforcement Notices.

As well as giving more guidance on advice and assistance, costs, vexatious requests and consultation, the code places new “burdens”:

  • Public authorities should produce a guide to their Publication Scheme including a schedule of fees.
  • Those authorities with over 100 Full Time Equivalent (FTE) employees should publish details of their performance on handling FOI requests on a quarterly basis.
  • Pay, expenses and benefits of the senior staff at director level and equivalents should be published quarterly. Of course local authorities are already required to publish some of this information by the Local Government Transparency Code.
  • The public interest test extension to the time limit for responding to an FOI request (see S.10(3)) should normally be no more than 20 working days.
  • Internal reviews should normally be completed within 20 working days.

Furthermore, the other S.45 Code covering datasets has been merged with the main section 45 Code so that statutory guidance under section 45 can be found in one place. There is also an annex explaining the link between the FOI dataset provisions and the Re-use of Public Sector Information Regulations 2015.

Public authorities need to consider the new code carefully and change their FOI compliance procedures accordingly.

We will be discussing this and other recent FOI developments in our forthcoming FOI Update webinar.

Free Information Governance Briefings for the Health Sector


Act Now Training is pleased to announce a series of free Information Governance briefings for the health sector.

The IG landscape has changed dramatically in a relatively short space of time. Healthcare professionals are facing new challenges in the form of the General Data Protection Regulation (GDPR), the Data Protection Act 2018 and the Data Security and Protection Toolkit.

In each free briefing, we will explain what these changes mean in practical terms and dispel some of the myths associated with the new legislation. Time has been allocated for questions, discussion and networking. Participants will leave with an action plan for compliance.

These briefings are ideal for Information Governance Leads in General Practices, pharmacies, Clinical Commissioning Groups, dentists, care homes and other healthcare providers.

The speakers are Ibrahim Hasan, a solicitor and director at Act Now Training, and Craig Walker, Data Protection Officer at St Helens and Knowsley Hospitals NHS Trust. Both are well-known experts in this field with many years of experience in training and advising the health sector. Other members of the Act Now team will also be on hand to answer participants’ questions over a complimentary lunch.


9.45am – Registration

10am – Start

  • The General Data Protection Regulation (GDPR) and the health sector
  • Data Protection Act 2018 – What does it mean for me?
  • Data Security & Protection Toolkit – Overview and summary of key changes
  • National Data Guardian (10 Data Security Standards) – What are they and why are they so important?
  • Data Protection Impact Assessments – When and Why?
  • Subject Access Requests – Looking at separating the facts from fiction – to charge or not to charge
  • Data Breach Prevention – What can we do to minimise the likelihood of breaches occurring?
  • Cyber Security Basics – What to be on the lookout for
  • The role of the Data Protection Officer – Do I need one and what is their role?

12.00pm – Open Forum and Lunch

There are limited places available on each briefing so please book early to avoid disappointment.

These briefings are part of a series of courses specially designed for the health sector. This includes our GDPR workshops and the Certificate in Information Governance.