Act Now Training would like to wish everyone season's greetings and a happy and prosperous 2019.
The office will be shut from Friday 21st December until the 2nd January 2019.
We look forward to seeing you all in the new year!
Act Now is pleased to announce the launch of its brand new FOI Practitioner Certificate.
This course is one of the first of its kind, delivered in the way that only Act Now delivers – practical, on-the-ground skills to help you fulfil your role as an FOI Officer.
This new certificate course is ideal for those wishing to acquire detailed knowledge of FOI and related information access legislation (including EIR) in a practical context. It has been designed by leading FOI experts including Ibrahim Hasan and Susan Wolf – formerly a senior lecturer on the University of Northumbria’s LLM in Information Rights Law and Practice.
The course uses the same format as our very successful GDPR Practitioner Certificate. It takes place over four days (one day per week) and involves lectures, discussion and practical drafting exercises. This format has been extremely well received by over 1000 delegates who have completed the course. Time will also be spent at the end of each day discussing what issues delegates may face when implementing/advising on the FOI topics of the day.
The four teaching days are followed by an online assessment and a practical project to be completed within 30 days.
Why is this course different?
Who should attend?
This course is suitable for anyone working within the public sector who needs to learn about FOI and related legislation in a practical context, as well as those with the requisite knowledge wishing to have it recognised through a formal qualification. It is most suitable for:
“FOI and EIR are almost 14 years old. Since the Act and Regulations came into force there have been many legal developments and court decisions that have given practitioners a much greater understanding of the legal provisions and how they should be applied in practice. With this in mind, we have written this course to ensure that it equips public sector officers with all the necessary knowledge and skills they need to respond to freedom of information requests accurately and efficiently. This course, with its emphasis on the law in practice, will enable trainees to become more accomplished and confident FOI practitioners”
Susan will share her vast experience gained through years of helping organisations comply with their information rights legislation obligations. This, together with a comprehensive set of course materials and guidance notes, will mean that delegates will not only be in a position to pass the course assessment but to learn valuable skills which they will be able to apply in their workplaces for years to come.
This new course builds on Act Now’s reputation for delivering practical training at an affordable price:
This new course widens the choice of qualifications for IG practitioners and advisers. Ibrahim Hasan (Director of Act Now Training) commented:
“We are pleased be able to launch this new qualification. Because of its emphasis on practical skills, we are confident that it will become the qualification of choice for current and future FOI Officers and advisers.”
To learn more please visit our website.
All our courses can be delivered at your premises at a substantially reduced cost.
Contact us for more information.
On 24th October the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine that she could under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal the fine might seem small beer for an organisation that is estimated to be worth over 5 billion US Dollars. Without doubt, had the same facts played out after 25th May 2018 then the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.
In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company, Global Science Research (GSR), to operate the app in conjunction with FB from November 2013 to May 2015. The app was designed to and was able to obtain a significant amount of personal information from any FB user who used it, including:
The App was also designed to and was able to obtain extensive personal data from the FB friends of the App’s users and from anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, and nor did they give their consent.
The App was able to use the information that it collected about users, their friends and people who had messaged them to generate personality profiles. The information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controls the now infamous Cambridge Analytica).
In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and the friends of users that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.
Breach of the DPA
The Commissioner’s findings about the breach make sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply or ensure compliance with their own FB Platform Policy, and were not aware of this fact until exposed by the Guardian newspaper in December 2015.
The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:
Breach of FB Platform Policy
Although the FB companies operated a FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they did not check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact, they failed to implement a system to carry out such a review. It was also found that the use of the App breached the policy in a number of respects, specifically:
The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. Perhaps one of the worst indictments is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access rights to the Facebook Login. And the rest, as they say, is history.
Joint Data Controllers
The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times, joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.
The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within the scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.
The Use of Data Analytics for Political Purposes
The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies is likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned, and the Commissioner was unable, on the basis of the information before her, to determine whether FB was correct. However, she nevertheless concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, or who to share it with.
As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.
Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.
The background: the Safari Workaround and the DoubleClick Ad cookie
The case concerned the use, by Google, of a cookie known as the “DoubleClick Ad cookie” between 2011 and 2012. Google allegedly used the cookie to secretly track the internet activity of iPhone users in the US and the UK. Ordinarily the Safari browser (developed by Apple) had a default setting that blocked the use of third-party cookies, such as the DoubleClick Ad cookie. However, Google was able to exploit certain exceptions to this default blockage and implement the so-called “Safari Workaround”, which enabled Google to set the cookie on an iPhone when the user used the Safari browser. This gave Google access to a huge amount of browser-generated personal information, including the address or URL of the website which the browser was displaying to the user. It was claimed that this information enabled Google to obtain or deduce other sensitive information about individuals, such as their interests and habits, race, ethnicity, class, political or religious views, health, age, sexuality and financial position. Google was also alleged to have aggregated this information to create lists of different types of people, such as “football lovers”, and offered these lists to subscribing advertisers.
Regulatory action was taken against Google in the USA, with Google agreeing to pay a US$25.5 million civil penalty to settle charges brought by the US Federal Trade Commission, and a further US$17 million to settle state consumer-based actions. No such regulatory action was taken by the Information Commissioner even though the breach clearly affected UK iPhone users.
The representative claim
The action against Google was brought by Mr Lloyd, who was the only named claimant. However, he brought the action as a representative of a much larger class of people. This is a novel type of litigation that allows a representative to sue in a representative capacity on behalf of a class of people who have “the same interest” in the claim. It was not entirely clear how big the class was, but estimates ranged between 4.4 and 5.4 million people. Google, not surprisingly, was keen that permission be denied, bearing in mind that it estimated its potential liability (if the case was successful) at between £1 and £3 billion.
Mr Lloyd argued that he and each member of the group/class he represented had a right to be compensated “for the infringement of their data protection rights”. Specifically, it was alleged that Google had carried out the secret tracking and collation of personal data without the data subject’s consent or knowledge; that this was a breach of Google’s duty under s 4(4) of the DPA 1998 and that the data subjects were entitled to compensation under s 13 DPA 1998.
In other words, the fact of the contravention gave them a right to be compensated. Neither Mr Lloyd nor any member of the group alleged or gave evidence about any financial loss, distress or anxiety. There were no individual allegations of harm. In fact, Mr Lloyd asserted that the claim was generic and claimed an equal, standard “tariff” award for each member of the class (the claim was for £750 per person). This turned out to be fatal to the claim.
Litigation against a US based company
Any litigant, or group of litigants, considering an action against Apple or Google or any other company based outside the UK first needs the permission of the High Court in order to serve a claim against a defendant outside the jurisdiction of the domestic courts. Before the court will grant permission, the claimant must prove three things: first, that the case falls within one of the listed “jurisdictional gateways”; second, that the case has a reasonable prospect of success; and finally, that England is the appropriate place to deal with the case. The High Court had no difficulty deciding that England would be the natural jurisdiction for the case, since the claimants were all in the UK and the alleged damage had been incurred in the UK. However, the High Court Judge found that Mr Lloyd’s case failed on the remaining two issues and denied permission for the case to proceed.
The Court identified that the relevant gateway in this case was that the claimant had to prove they had a good arguable claim in tort and the damage was sustained in England & Wales. The Judge was clear that a claim for damages under the DPA 1998 is a claim in tort. He was also satisfied that each member of the class was (for at least some of the relevant period) within the jurisdiction when they connected to the internet using the Safari browser.
However, the real and substantial issue in this case was whether the Safari Workaround had caused “damage” within the meaning of the DPA 1998. The Court engaged in a lengthy analysis of the case law on DPA damages and concluded that the claimants had not sustained damages in this case. On this basis the court decided that Mr Lloyd did not have a good arguable case or a reasonable prospect of success.
Damages under the DPA 1998
Section 13 of the DPA 1998 provided that an individual who suffers damage by reason of any contravention by a data controller of any of the requirements of the DPA 1998 is entitled to compensation from the data controller for that damage.
The High Court decided that giving the words their natural meaning, this statutory right to compensation arises where
(a) there has been a breach of the DPA; and
(b) as a result, the claimant suffers damage.
These are two separate events connected by a causal link. In short, the breach must cause the damage. Based on this logic, it necessarily follows that some breaches will not give rise to damages. The High Court judge suggested some examples where a data controller processes personal data in breach of the DPA, but where the breach may not warrant an award of compensation, such as:
Of course, this is not to say that these types of breaches could never give rise to a successful claim for damages, as much will depend on the context and facts of the case. However, the Court did suggest that data subjects had alternative remedies such as rectification, erasure and objection.
One of the key arguments presented by Lloyd was that the claimants had incurred damage because they lost control of their data. According to the Court, there will be circumstances where the loss of control may have significantly harmful consequences, such as in Vidal-Hall (Google Inc v Vidal-Hall and others & The Information Commissioner [2015] EWCA Civ 311). The focus in that case was on the significant distress caused to the claimants by the delivery to their screens of unwanted advertising material. However, the decision was very fact-specific; the type of information that had secretly been tracked and used to send targeted advertising was of a particularly private and sensitive nature, such that it would have caused harm to the claimants had anyone else seen their computer screens.
The High Court in Lloyd v Google also accepted that delivery of unwanted commercial advertising can be upsetting in other ways, for example where repeated or bulk unwanted communications:
However, on the facts of the case, the Court concluded that the claimants had not provided any particulars of any damage suffered. Rather, they appeared to be relying on the contention that they were entitled to be compensated because of the breach alone. The judge rejected this as a possibility.
A court cannot award compensation just because the data protection rules have been breached. The Court also rejected the idea that the claimants should be compensated in order to “censure” the defendant’s behaviour, as well as any argument that damages under the DPA should be awarded on a sort of “restitutionary” basis, that is, ‘calculated by reference to the market value of the data which has been misused’.
Representative action cases: what lessons can be learnt?
This was novel litigation: it involved one named claimant bringing an action on behalf of a large group. The action faced difficulties right from the start, not least in trying to identify the group. The Judge identified three real difficulties with this type of action:
Anyone contemplating this type of claim in future would be well advised to consider carefully and take on board the judge’s criticisms, and to seek to address them before pursuing an action.
Susan Wolf will be delivering the forthcoming GDPR workshop in Birmingham on the 19th November. Book your place now!
The first fine was issued recently under the General Data Protection Regulation (GDPR) by the Austrian data protection regulator. Whilst relatively modest at 4,800 Euros, it shows that regulators are ready and willing to exercise their GDPR enforcement powers.
Article 24 of GDPR emphasises the need for Data Controllers to demonstrate compliance through measures to “be reviewed and updated where necessary”. This includes the implementation of “appropriate data protection policies by the controller.” This can be daunting especially for those beginning their GDPR compliance journey.
Act Now has applied its information governance knowledge and experience to create a GDPR policy pack containing essential documentation templates to help you meet the requirements of GDPR as well as the Data Protection Act 2018. The pack includes, amongst other things, template privacy notices as well as procedures for data security and data breach reporting. Security is a very hot topic after the recent £500,000 fine levied on Equifax by the Information Commissioner under the Data Protection Act 1998.
We have also included template letters to deal with Data Subjects’ rights requests, including subject access. The detailed contents are set out below:
The documents are designed to be as simple as possible while meeting the statutory requirements placed on Data Controllers. They are available as an instant download (in Word format). Sequential file names make locating each document very easy.
Click here to read sample documents.
The policy pack gives a useful starting point for organisations of all sizes in both the public and private sector. For only £149 plus VAT (special introductory price) it will save you hours of drafting time. Click here to buy now or visit our website to find out more.
On 20th September the Information Commissioner issued Equifax Ltd with a £500,000 monetary penalty, the biggest fine it has issued to date and the maximum allowed under the Data Protection Act 1998. Although half a million pounds might sound a significant amount of money, it represents a relatively modest amount compared to the fine the company might have received had the breach occurred 12 months later, under the GDPR regime.
In this blog we consider the incident, the actions of the parties and we speculate on what type of sanctions the company could have faced under the GDPR.
Equifax Ltd is a major credit reference agency based in the UK. Since 2011 it has offered a product called the Equifax Identity Verifier (EIV) which enables clients to verify the identity of their customers online, over the telephone or in person. To verify an individual’s identity, the client enters that individual’s personal information on the Equifax system, which is then checked against other sources held by Equifax Ltd. Initially the EIV data was processed by its US parent, Equifax Inc: Equifax Ltd in the UK was the data controller and Equifax Inc in the USA was the data processor. In 2016, Equifax Ltd transferred the data processing for the EIV product to the UK. This required the migration of the personal data to the UK. However, the US company did not then delete all the UK personal data from its systems, which it should have done as it had no lawful reason for continuing to store this data.
The cyber-attack incidents
Equifax Inc was subject to a number of cyber-attacks between 13 May and 30 July 2017. During this period the attackers exploited a vulnerability in the US company’s online consumer-facing disputes portal. This enabled the attackers to access the personal data of about 146 million individuals in the USA. Additionally, they were able to access the name and date of birth of up to 15 million UK individuals contained in the EIV dataset. In addition, for some 637,430 UK data subjects, telephone numbers and driving licence numbers were also compromised.
An additional dataset (the GCS dataset) was also attacked, and this allowed the hackers to access the email addresses of over 12,000 UK individuals. More significantly, for another 14,961 UK residents the compromised data was account information for Equifax’s credit services and included data subjects’ name, address, date of birth, user name, password (in plain text), secret question and answer (also in plain text), credit card number (obscured) and some payment amounts. This personal data was held in a plain text file, as opposed to the actual database. The storage of password data in plain text was contrary to the company’s Cryptography Standard, which specifically required that passwords be stored in encrypted, hashed, masked, tokenised or other form. The file was held in a file share which was accessible to multiple users.
In March 2017 Equifax Inc received warning of a vulnerability in its Apache Struts 2 web application framework (which it used in its consumer-facing online disputes portal). The warning came from the US Department of Homeland Security Computer Emergency Readiness Team, which identified a critical level of vulnerability. The US company disseminated this warning to key personnel, but the consumer-facing portal was neither identified nor patched.
Equifax Inc became aware of the cyber-attack on 29 July 2017, and became further aware that the data of UK individuals had been compromised by late August 2017. However, Equifax Inc failed to warn Equifax Ltd until 7th September 2017, at least a week after it became aware that the UK personal data had been compromised.
Equifax Ltd notified the ICO on 8th September. In this respect, its behaviour would have met the strict breach notification requirements of the GDPR, which require a data controller to notify the Commissioner within 72 hours of becoming aware of the breach. Initially it reported that about 1.49 million individuals’ data had been lost. This was later revised upwards to 15 million data subjects. It also indicated, incorrectly, that the data accessed did not include residential addresses or financial information.
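The 72-hour clock can be illustrated with a short sketch. This is a hedged example only: the dates come from the timeline above, but the times of day are assumptions for illustration.

```python
from datetime import datetime, timedelta

# Article 33 GDPR: the controller must notify the supervisory authority
# within 72 hours of becoming aware of a personal data breach.
# Times of day below are assumed; only the dates appear in the account above.
aware = datetime(2017, 9, 7, 9, 0)      # Equifax Ltd made aware (assumed 9am)
notified = datetime(2017, 9, 8, 9, 0)   # ICO notified on 8th September (assumed 9am)

deadline = aware + timedelta(hours=72)
print(notified <= deadline)  # True: notification came well inside the window
```

On these assumed timings the notification lands roughly 48 hours before the deadline, which is why the Commissioner's account treats this step as compliant even under the stricter GDPR standard.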
The Information Commissioner’s Findings
On the facts, the Information Commissioner decided that although the information systems in the USA were compromised, Equifax Ltd was the data controller responsible for the personal data of its UK customers. The Commissioner found that Equifax had failed to take appropriate steps to ensure that its US parent, and data processor, was protecting the information. The Monetary Penalty Notice lists the various contraventions of the DPA 1998:
Overall the Information Commissioner found multiple failures at Equifax Ltd, which led to personal information being kept longer than necessary and vulnerable to unauthorised access. Given the nature of the breaches, individuals were exposed to the risk of financial and identity fraud. The Commissioner concluded that the maximum financial penalty it could levy was proportionate in all the circumstances.
What difference would it make if this happened under the GDPR?
If the same breaches had occurred after 25th May 2018, both Equifax Ltd and Equifax Inc might have found themselves in a substantially different situation.
The level of fine: The most obvious difference would be in relation to the level of fine that the ICO could impose. Under Article 83 GDPR the ICO can impose a fine of up to £17 million (20m Euros) or 4% of global turnover. Equifax Ltd is part of a global group that operates or has investments in over 24 countries. According to its 2016 Annual Report, the Equifax Group’s global annual revenue for 2016 was $3,144.9 million. 4% of this is about $125 million. In 2016 the UK company, Equifax Ltd, recorded revenue of £114.6 million. This alone could lead to a fine of over £4.5 million.
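The Article 83 arithmetic can be sketched in a few lines. This is an illustration only: it treats the dollar and sterling figures as given, ignores currency conversion, and `max_gdpr_fine` is a hypothetical helper name, not anything from the regulation itself.

```python
# Article 83(5) GDPR: the higher-tier maximum fine is the greater of
# EUR 20 million or 4% of total worldwide annual turnover.
def max_gdpr_fine(global_turnover: float) -> float:
    # Returns the cap in the same currency units as the turnover figure.
    return max(20_000_000.0, 0.04 * global_turnover)

group_revenue = 3_144_900_000   # Equifax group 2016 revenue, $3,144.9m
uk_revenue = 114_600_000        # Equifax Ltd (UK) 2016 revenue, £114.6m

print(max_gdpr_fine(group_revenue))  # about 125.8 million (4% dominates)
print(0.04 * uk_revenue)             # about 4.58 million
```

Note that for a smaller controller whose 4% figure falls below EUR 20 million, the fixed EUR 20 million ceiling is the one that applies.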
Data subjects’ rights to sue for damages: Although this is not a new right under the GDPR, the GDPR now expressly permits individuals to sue for both material (financial) and non-material damage, such as distress. In many respects this represents a bigger risk for companies such as Equifax, which process data whose loss could cause significant harm to data subjects. Given the heightened public awareness of the GDPR, it is not difficult to anticipate that these types of high-volume breach could result in class actions for compensation.
Breach notification: Article 33 requires data processors to notify data controllers ‘without undue delay’ if they become aware of a data breach. The delay on the part of the US company in informing the UK company would constitute a breach of Article 33.
Notifying data subjects: Under Article 34 GDPR the data controller has a duty to notify data subjects that their personal data has been breached, where the breach is likely to result in a high risk to their rights and freedoms. Equifax Ltd issued a press release on 7th October 2017 saying that it would “now begin writing to all impacted customers with immediate effect”. This again does not meet the requirement of notification ‘without undue delay’.
Need to train frontline staff quickly? Try our extremely popular GDPR e-learning course.
On 4th October 2018 the European Parliament (by 520 to 81 votes) agreed the text of the proposed EU Regulation on the Free Flow of Non-Personal Data in the European Union. The draft Regulation was proposed by the European Commission in 2017, as part of its Digital Single Market Strategy. The European Parliament, Council of Ministers and the European Commission reached a political consensus on it in June 2018. This adoption by the Parliament brings the regulation one step closer to becoming law. All that remains now is for the Council of Ministers to agree it on 6th November. It will then enter into force by the end of the year, although Member States will have 6 months to apply the new rules. This means that it will enter into force before the UK exits the European Union in March 2019.
Background to the proposal
The European Commission proposed this regulation as part of its Digital Single Market Strategy.
According to the EU Commission the value of the EU data market in 2016 was estimated to be almost 60 billion Euros, with one study suggesting it could increase to more than 106 billion Euros by 2020. The new regulation is designed to unlock this potential by improving the mobility of non-personal data across borders. According to the EU Commission, the free flow of non-personal data is hampered by:
The aims and outline of the regulation
The regulation only applies to the processing of non-personal electronic data. However, like the GDPR, its territorial scope is wide and includes the processing of electronic data which is:
Processing is also defined in very similar terms to the GDPR – as meaning any operation or set of operations which is performed on data or on sets of data in electronic format, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction. Unlike the GDPR, it only relates to data in electronic format. Its application is wide and encompasses outsourced data storage, processing of data on platforms, or in applications.
The regulation does not apply to personal data (see below).
National rules on data storage (data localisation requirements)
The regulation aims to ensure the free movement of non-personal data within the European Union by laying down a set of rules relating to national data processing localisation rules. These are essentially any rules, laws or administrative practices that restrict, prohibit, limit or impose conditions on where data can be processed. The regulation states that such data localisation requirements are prohibited. Member States have 24 months to repeal any such laws.
However, Member States can retain or introduce data localisation rules provided they are justified on the grounds of public security and that the rules are proportionate. In the original proposal Member States would have only had 12 months, but this was extended to 24 months by the European Parliament. Although the main body of the regulation doesn’t define public security, the recitals refer to the fact that the term has been interpreted widely to include both internal and external public security, as well as issues of public safety.
Data Availability for Competent Authorities
The regulation does not affect the powers of ‘competent’ authorities to request or obtain access to data for the performance of their official duties. The definition of competent authority is wide and includes any authority of a Member State, or any other entity authorised by national law to perform a public function or to exercise official authority, that has the power to obtain access to data processed by a natural or legal person for the performance of its official duties, as provided for by Union or national law. It therefore includes central and local government but can also include other organisations that fulfil statutory functions.
This is important, particularly if data is going to be processed in another Member State. The aim is to ensure that the powers of competent authorities to request and receive data, to enable them to fulfil their functions and regulatory powers, remain unaffected by the free movement of data. Consequently, the regulation includes a procedure for cooperation between national authorities and the possibility of Member States imposing penalties for failure to comply with an obligation to provide data.
The regulation also establishes a single point of contact for each Member State, to liaise with the contacts in other Member States, and the Commission. The aim is to ensure the effective application of the new rules.
The Regulation also seeks to encourage and facilitate data portability via the use of self-regulatory codes of conduct and certification schemes. The European Commission’s role is to encourage, for example, cloud service providers to develop self-regulatory codes of conduct for easier switching of service provider and porting data back to in-house servers. These codes must be implemented within 18 months of the regulation coming into force (mid 2020), and the Commission is tasked with monitoring their development and implementation.
Reference is also made to certification schemes that facilitate comparison of data processing products and services for professional users. Such certification schemes may relate to quality management, information security management or environmental management.
The new regulation does not apply to personal data
The regulation concerns non-personal data and does not cover personal data. Data Protection practitioners will no doubt be relieved to know that this means it will have no impact on the GDPR. According to the European Commission, the two regulations will operate together to enable the free flow of any data, both personal and non-personal, “creating a single European space for data”.
In the case of a data set composed of both personal and non-personal data, this new Regulation applies to the non-personal data part of the data set. Where personal and non-personal data in a data set are inextricably linked, this Regulation shall not prejudice the application of Regulation (EU) 2016/679.
The difficulty that this raises will inevitably be a practical one: applying two different regulations to a single data set that contains both personal and non-personal data. The regulation rests on the assumption of a clear dichotomy between personal and non-personal data, which in practice may be difficult to draw.
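To make the practical problem concrete, here is a minimal, purely illustrative Python sketch of partitioning a mixed data set. The field names and the classification sets are invented assumptions for illustration only; neither regulation prescribes any such scheme, and the point of the sketch is precisely that some fields resist clean classification.

```python
# Hypothetical field classification - an organisation would have to maintain
# something like this itself; nothing here is mandated by either regulation.
PERSONAL_FIELDS = {"name", "email"}                 # GDPR applies
NON_PERSONAL_FIELDS = {"sensor_id", "temperature"}  # Free flow regulation applies

def split_record(record):
    """Partition a record's fields; anything unclassified is flagged for review."""
    personal, non_personal, unclear = {}, {}, {}
    for key, value in record.items():
        if key in PERSONAL_FIELDS:
            personal[key] = value
        elif key in NON_PERSONAL_FIELDS:
            non_personal[key] = value
        else:
            # e.g. free-text notes may contain personal data; where the two are
            # "inextricably linked", the GDPR governs the whole data set
            unclear[key] = value
    return personal, non_personal, unclear

record = {"sensor_id": "S1", "temperature": 21.5,
          "email": "a@b.com", "notes": "inspection attended by J. Smith"}
personal, non_personal, unclear = split_record(record)
```

The `notes` field illustrates the difficulty: it cannot be mechanically assigned to either regime, so a human judgment about whether the personal element is inextricably linked is still required.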
The impact of Brexit
If the new Regulation enters into force at the end of the year it will apply directly in the UK as per any other Member State. It will remain in force after the date of exit because of the provisions of the EU Withdrawal Act 2018.
After the date of exit, the UK will no longer be a Member State. The regulation effectively allows any non-personal data to be stored and processed anywhere in the EU. It does not extend this ‘right’ to storage and processing in third countries. There is of course concern that data localisation rules could be applied against data processors outside the EU, which in turn could have significant adverse business implications for UK data processors.
Need to train frontline staff quickly? Try our extremely popular GDPR e-learning course.
In August 2018 the revised Codes of Practice for Covert Surveillance and Property Interference and Covert Human Intelligence Sources (CHIS) were published. These contain substantial changes and additions which public authorities conducting surveillance under Part 2 of the Regulation of Investigatory Powers Act 2000 (RIPA) need to understand.
The codes provide guidance on when an application should be made for a RIPA authorisation, the procedures that must be followed before surveillance activity takes place and how to handle any information obtained through such activity. They are admissible as evidence in criminal and civil proceedings. Any court or tribunal considering such proceedings, including the Investigatory Powers Tribunal, as well as the Investigatory Powers Commissioner’s Office, responsible for overseeing the relevant powers and functions, may take the provisions of the codes into account.
Many of the changes in the revised codes reflect best practice guidance published in the OSC Procedures and Guidance Document, observations and commentary in OSC annual reports, and advice and guidance provided during inspections. The changes include amendments to the role of the Senior Responsible Officer and a new error reporting procedure. The codes also reflect developments in surveillance and monitoring – such as use of the internet and social media, drones, tracking devices etc. Here is a summary:
On 30th April 2018 the Investigatory Powers Tribunal awarded £46,694 to an individual who had complained about surveillance by British Transport Police (BTP). The determination was that the surveillance was unlawful as it had been conducted without a RIPA authorisation. BTP was criticised for, amongst other things, a lack of training and awareness of those involved in surveillance.
Our RIPA courses have been completely revised by our RIPA expert, Steve Morris, to include an explanation of the new codes of practice and recent developments. If you would like in-house refresher training for your staff, please get in touch.
GDPR has taken the limelight from other information governance legislation, especially Freedom of Information. In July 2018, the Cabinet Office published a new code of practice under section 45 of the Freedom of Information Act 2000 (FOI) replacing the previous version.
In July 2015 the Independent Commission on Freedom of Information was established by the Cabinet Office to examine the Act’s operation. The Commission concluded that the Act was working well. It did though make twenty-one recommendations to enhance the Act and further the aims of transparency and openness. The government agreed to update the S.45 Code of Practice following a consultation exercise in November 2017.
The revised code provides new, updated or expanded guidance on a variety of issues, including:
In the latter section the code makes a number of interesting points:
The code is not law but the Information Commissioner can issue Practice Recommendations where she considers that public authorities have not complied with it. The Commissioner can also refer to non-compliance with the code in Decision and Enforcement Notices.
As well as giving more guidance on advice and assistance, costs, vexatious requests and consultation, the code places new “burdens”:
Furthermore, the other S.45 Code covering datasets has been merged with the main section 45 Code so that statutory guidance under section 45 can be found in one place. There is also an annex explaining the link between the FOI dataset provisions and the Re-use of Public Sector Information Regulations 2015.
Public authorities need to consider the new code carefully and change their FOI compliance procedures accordingly.
We will be discussing this and other recent FOI developments in our forthcoming FOI Update webinar.
Act Now Training is pleased to announce a series of free Information Governance briefings for the health sector.
The IG landscape has changed dramatically in a relatively short space of time. Healthcare professionals are facing new challenges in the form of the General Data Protection Regulation (GDPR), the Data Protection Act 2018 and the Data Security and Protection Toolkit.
In each free briefing, we will explain what these changes mean in practical terms and dispel some of the myths associated with the new legislation. Time has been allocated for questions, discussion and networking. Participants will leave with an action plan for compliance.
These briefings are ideal for Information Governance Leads in General Practices, pharmacies, Clinical Commissioning Groups, dentists, care homes and other healthcare providers.
The speakers are Ibrahim Hasan, a solicitor and director at Act Now Training, and Craig Walker, Data Protection Officer at St Helens and Knowsley Hospitals NHS Trust. Both are well-known experts in this field with many years of experience in training and advising the health sector. Other members of the Act Now team will also be on hand to answer participants’ questions over a complimentary lunch.
9.45am – Registration
10am – Start
12.00pm – Open Forum and Lunch
There are limited places available on each briefing so please book early to avoid disappointment.