Apprentice Case Study – Meet Natasha

In 2022, Act Now Training teamed up with Damar to support their delivery of the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme with the first cohort about to undertake the end point assessment.

Data Protection and Information Governance Apprentice, Natasha Lock, is an integral part of the Governance and Compliance team at the University of Lincoln. With the Data Protection and Digital Information (No.2) Bill set to make changes to the UK data protection regime, Natasha talks to us about why this is a great area to work in and how the apprenticeship route has been particularly beneficial for her.

How did you get onto the apprenticeship?

“I was already working at the university as an Information Compliance Officer when the opportunity for a staff apprenticeship came up.

“The process was swift and straightforward, and I was enrolled on the Data Protection and Information Governance Apprenticeship within three months of enquiring.”

How has the apprenticeship helped you?

“I started with a good understanding of the UK Data Protection legislation but my knowledge has grown significantly, and now that I’m coming to the end of my level 4 apprenticeship, I’ve gained so much more insight and my confidence has grown.

“As a university, we hold vast amounts of data. My apprenticeship is allowing me to solve the challenge of data retention and implement better measures to retain, destroy and archive information. I have developed a greater understanding of the legislative requirements we must adhere to as a public sector institute and how to reduce and assess data protection risks.

“I love the fact that I can study whilst still doing my job. The flexibility works for me because I can go through course materials at my own pace. I really feel like I have a brilliant work/life/study balance.

“The University of Lincoln and Damar Training have been fantastic in supporting me. I get along with my coach, Tracey, so well. She is very friendly and personable and has enabled my creativity to flow.

“The course is very interactive, and I’ve found the forums with other apprentices to be a very useful way of sharing knowledge, ideas and stories.

“I’m enjoying it so much and people have noticed that my confidence has grown. I wouldn’t have had that without doing this apprenticeship. I’ve now got my sights on doing a law degree or law apprenticeship in the future.”

Abi Slater, Information Compliance Manager at the University of Lincoln, said: “It has been great to see how much Natasha has developed over the course of the apprenticeship. I believe the apprenticeship has provided Natasha with the knowledge and skills required to advance in her data protection career and the support from her coach at Damar Training has been excellent.

“I would encourage anyone with an interest in data protection and information governance to consider this apprenticeship.”

Tracey Coetzee, Coach at Damar Training, said: “The Data Protection and Information Governance Apprenticeship was only approved by the Institute for Apprenticeships and Technical Education in 2022, and it’s delightful to see apprentices flourishing on the programme.

“From cyber security to managing data protection risks, this programme is upskilling participants and adding value to both private and public sector organisations and we’re thrilled to see the first cohort, including Natasha, approach the completion of their training.”

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

Another Day; Another Police Data Breach  

The largest police force in the UK, the London Metropolitan Police (also known as the London Met), has fallen victim to a substantial data breach. Approximately 47,000 members of the police staff have been informed about the potential compromise of their personal data. This includes details such as photos, names, and ranks. The breach occurred when criminals targeted the IT systems of a contractor responsible for producing staff identification cards.

The breach has raised serious concerns about the security of sensitive information, as details such as identification numbers and clearance levels may also have been exposed. However, it has been confirmed that the breached data did not include the home addresses of the affected Met police personnel. There are fears that organised crime groups or even terrorist entities could be responsible for this breach of security and personal data.

Furthermore, the breach has amplified security apprehensions for London Met police officers from Black, Asian, and Minority Ethnic backgrounds. Former London Met Police Chief Superintendent Dal Babu explained that individuals with less common names might face a heightened risk. Criminal networks could potentially locate and target them more easily online, compared to those with common names. This concern is particularly relevant for officers in specialised roles like counter-terrorism or undercover operations.

Reacting to this situation, former Met commander John O’Connor expressed outrage, highlighting concerns about the adequacy of the cyber security measures put in place by the contracted IT security company, given the highly sensitive nature of the information at stake.

This incident presents a significant challenge to the UK Home Office, and it is likely that the government will be compelled to swiftly review and bolster security protocols. This step is necessary to ensure that the personal data of security service personnel is safeguarded with the utmost levels of privacy and data security. Both the Information Commissioner’s Office (ICO) and the National Crime Agency have initiated investigations.

This follows the data breach of the Police Service of Northern Ireland (PSNI) where, in response to a Freedom of Information request, the PSNI mistakenly divulged information on every police officer and member of police staff. Over in England, Norfolk and Suffolk Police also recently announced it had mistakenly released information about more than 1,200 people, including victims and witnesses of crime, also following an FOI request. Last week, South Yorkshire Police referred itself to the information commissioner after “a significant and unexplained reduction” in data such as bodycam footage stored on its systems, a loss which it said could affect some 69 cases.

These incidents underscore the urgency of maintaining robust data protection measures and raising awareness about potential risks, especially within law enforcement agencies. They are also a reminder that Data Controllers must have processes in place to comply with the requirements of GDPR (Article 28) when appointing Data Processors.

We have two workshops coming up in September (Introduction to Cyber Security and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in data security.

GDPR and AI: The Rise of the Machines

2023 is going to be the year of AI. In January, Microsoft announced a multibillion dollar investment in OpenAI, the company behind image generation tool Dall-E and the chatbot ChatGPT.

The term “artificial intelligence” or “AI” refers to the use of machines and computer systems that can perform tasks normally requiring human intelligence. The past few years have seen rapid progress in this field. Advancements in deep learning algorithms, cloud computing, and data storage have made it possible for machines to process and analyse large amounts of data quickly and accurately. AI’s ability to interpret human language means that virtual assistants, such as Siri and Alexa, can now understand and respond to complex spoken commands at lightning speed.

The public sector is increasingly leveraging the power of AI to undertake administrative tasks and create more tailored services that meet users’ needs. Local government is using AI to simplify staff scheduling, predict demand for services and even estimate the risk of an individual committing fraud. Healthcare providers are now able to provide automated diagnoses based on medical imaging data from patients, thereby reducing wait times.

The Risks

With any major technological advance there are potential risks and downsides.  On Monday, ElevenLabs, an AI speech software company, said it had found an “increasing number of voice cloning misuse cases”. According to reports, hackers used the ElevenLabs software to create deepfake voices of famous people (including Emma Watson and Joe Rogan) making racist, transphobic and violent comments.

There are concerns about the impact of AI on employment and the future of work. In April 2021, the Court of Amsterdam ordered that Uber reinstate taxi drivers in the UK and Portugal who had been dismissed by “robo-firing”; the use of an algorithm to make a decision about dismissal with no human involvement. The Court concluded that Uber had made the decisions “based solely on automated processing” within the meaning of Article 22(1) of the GDPR. It was ordered to reinstate the drivers’ accounts and pay them compensation.

As well as ethical questions about the use of AI in decision-making processes that affect people’s lives, AI-driven algorithms may lead to unintended biases or inaccurate decisions if not properly monitored and regulated. In 2021 the privacy pressure group, NOYB, filed a GDPR complaint against Amazon, claiming that Amazon’s algorithm discriminates against some customers by denying them the opportunity to pay for items by monthly invoice.

There is also a risk that AI is deployed without consideration of the privacy implications. In May 2022, the UK Information Commissioner’s Office fined Clearview AI Inc more than £7.5 million for breaches of GDPR. Clearview’s online database contains 20 billion images of people’s faces and data scraped from publicly available information on the internet and social media platforms all over the world. The company, which describes itself as the “World’s Largest Facial Network”, allows customers, including the police, to upload an image of a person to its app, which then uses AI to check it against all the images in the Clearview database. The app then provides a list of matching images, each with a link to the website it came from.

Practical Steps

Recently, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants are subjected to any harms or financial detriment as a result of the use of algorithms. It did though emphasise a number of practical steps that local authorities and central government can take when using algorithms or AI:

1. Take a data protection by design and default approach

Data processed using algorithms, data analytics or similar systems should be reactively and proactively reviewed to ensure it is accurate and up to date. If a local authority decides to engage a third party to process personal data using algorithms, data analytics or AI, the authority is responsible for assessing that the third party is competent to process personal data in line with the UK GDPR.

2. Be transparent with people about how you are using their data

Local authorities should regularly review their privacy policies, to ensure they comply with Articles 13 and 14, and identify areas for improvement. They should also bring any new uses of individuals’ personal data to their attention.

3. Identify the potential risks to people’s privacy

Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. 

In April 2021, the European Commission presented its proposal for a Regulation to harmonise rules for AI, also known as the EU “AI Act”. Whilst there is still a long way to go before this proposal becomes legislation, it could create an impetus for the UK to further regulate AI.

The use of AI has enormous benefits. It does, though, have the potential to adversely impact people’s lives and deny them their fundamental rights. As such, it is critical for data protection and information governance officers to understand the implications of AI technology and how to use it in a fair and lawful manner.

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents, focusing on GDPR as well as other information governance and records management issues.

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

New DP and IG Practitioner Apprenticeship

Act Now Training has teamed up with Damar Training on materials and expertise underpinning its new Data Protection and Information Governance Practitioner Level 4 Apprenticeship.

The apprenticeship, which received final approval in March, will help develop the skills of those working in the increasingly important fields of data protection and information governance. 

With the rapid advancement of technology, there is a huge amount of personal data being processed by organisations, which is the subject of important decisions affecting every aspect of people’s lives. This poses significant legal and ethical challenges, as well as the risk of incurring considerable fines from regulators for non-compliance.

This apprenticeship aims to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address these challenges.

Ibrahim Hasan, Director of Act Now, said:

“We are excited to be working with Damar Training to help deliver this much-needed apprenticeship. We are committed to developing the IG sector and encouraging a diverse range of entrants to the IG profession. We have looked at every aspect of the IG Apprenticeship standard to ensure the training materials equip budding IG officers with the knowledge and skills they need to implement the full range of IG legislation in a practical way.”

Damar’s managing director, Jonathan Bourne, added:

“We want apprenticeships to create real, long-term value for apprentices and organisations. It is vital therefore that we work with partners who really understand not only the technical detail but also the needs of employers.

“Act Now Training are acknowledged as leaders in the field, having recently won the Information and Records Management Society (IRMS) Supplier of the Year award for the second consecutive year. I am delighted therefore that we are able to bring together their 20 years of deep sector expertise with Damar’s 40+ year record of delivering apprenticeships in business and professional services.”

This apprenticeship has already sparked significant interest, particularly among large public and private sector organisations and professional services firms. Damar has also assembled an employer reference group that is feeding into the design process in real time to ensure that the programme works for employers.

The employer reference group met for the first time on May 25. It included industry professionals across a variety of sectors including private and public health care, financial services, local and national government, education, IT and data consultancy, some of whom were part of the apprenticeship trailblazer group.

If your organisation is interested in the apprenticeship please get in touch with us to discuss further.

Law Firm Fined For GDPR Breach: What Went Wrong? 

On 10th March the Information Commissioner’s Office (ICO) announced that it had fined Tuckers Solicitors LLP £98,000 for a breach of GDPR.

The fine follows a ransomware attack on the firm’s IT systems in August 2020. The attacker had encrypted 972,191 files, of which 24,712 related to court bundles. 60 of those were exfiltrated by the attacker and released on the dark web. Some of the files included Special Category Data. Clearly this was a personal data breach, not just because data was released on the dark web, but also because of the unavailability of personal data (through encryption by the attacker), which is also covered by the definition in Article 4 GDPR. Tuckers reported the breach to the ICO as well as to affected individuals through various means including social media.

The ICO found that between 25th May 2018 (the date the GDPR came into force) and 25th August 2020 (the date on which Tuckers reported the personal data breach), Tuckers had contravened Article 5(1)(f) of the GDPR (the sixth Data Protection Principle, Security), as it failed to process personal data in a manner that ensured appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures. The ICO took as its starting point for calculating the fine 3.25 per cent of Tuckers’ turnover as at 30 June 2020. It could have been worse; the maximum for a breach of the Data Protection Principles is 4% of gross annual turnover.

In reaching its conclusions, the Commissioner gave consideration to Article 32 GDPR, which requires a Data Controller, when implementing appropriate security measures, to consider:

 “…the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons”.

What does “state of the art” mean? In this case the ICO considered, in the context of “state of the art”, relevant industry standards of good practice including the ISO 27000 series, the National Institute of Standards and Technology (“NIST”) framework, the various guidance from the ICO itself, the National Cyber Security Centre (“NCSC”), the Solicitors Regulation Authority, Lexcel and NCSC Cyber Essentials.

The ICO concluded that there were a number of areas in which Tuckers had failed to comply with, and to demonstrate compliance with, the Security Principle. Its technical and organisational measures were, over the relevant period, inadequate in the following respects:

Lack of Multi-Factor Authentication (“MFA”)

MFA is an authentication method that requires the user to provide two or more verification factors to gain access to an online resource. Rather than just asking for a username and password, MFA requires one or more additional verification factors, such as a code from a fob or text message, which decreases the likelihood of a successful cyber-attack. Tuckers had not used MFA on its remote access solution, despite its own GDPR policy requiring it to be used where available.
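
To show what the “additional factor” looks like in practice, below is a minimal sketch of a time-based one-time password (TOTP) check of the kind generated by authenticator apps and key fobs. It follows the standard RFC 6238 approach and is purely illustrative; it is not a description of Tuckers’ (or any other firm’s) systems.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238) from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # number of 30-second steps since the epoch
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_second_factor(secret_b32: str, submitted_code: str) -> bool:
    """Check the code the user typed in addition to their username and password."""
    return hmac.compare_digest(totp(secret_b32), submitted_code)
```

A real deployment would also allow for clock drift and rate-limit failed attempts, but even this basic second factor means a stolen password alone is not enough to log in remotely.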

Patch Management 

Tuckers told the ICO that part of the reason for the attack was the late application of a software patch to fix a vulnerability. In January 2020 this patch was rated as “critical” by the NCSC and others. However, Tuckers only installed it four months later.

Failure to Encrypt Personal data

The personal data stored on the archive server that was subject to this attack had not been encrypted. The ICO accepted that encryption may not have prevented the ransomware attack. However, it would have mitigated some of the risks the attack posed to the affected data subjects, especially given the sensitive nature of the data.
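
By way of illustration, encrypting data at rest can be as simple as the sketch below, which uses the widely used Python “cryptography” library (Fernet). The file name is hypothetical, and in practice the real challenge is key management: the key must be stored separately from the data it protects.

```python
from cryptography.fernet import Fernet

# Generate a key once and store it separately from the data (e.g. in a key vault),
# not on the same archive server as the files it protects.
key = Fernet.generate_key()
f = Fernet(key)

# Hypothetical archived file containing personal data.
with open("archive/court_bundle_001.pdf", "rb") as fh:
    ciphertext = f.encrypt(fh.read())

with open("archive/court_bundle_001.pdf.enc", "wb") as fh:
    fh.write(ciphertext)

# Only a holder of the key can recover the plaintext; an attacker who copies
# the encrypted file alone learns nothing about the data subjects.
plaintext = Fernet(key).decrypt(ciphertext)
```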

Action Points 

Ransomware is on the rise. Organisations need to strengthen their defences and have plans in place, not just to prevent a cyber-attack but also setting out what to do when one does take place:

  1. Conduct a cyber security risk assessment and consider an external accreditation through Cyber Essentials. The ICO noted that in October 2019, Tuckers was assessed against the Cyber Essentials criteria and found to have failed to meet crucial aspects. The fact that some 10 months later it had still not resolved this issue was, in the Commissioner’s view, sufficient to constitute a negligent approach to data security obligations.
  2. Make sure everyone in your organisation knows the risks of malware/ransomware and follows good security practice. Our GDPR Essentials e-learning solution contains a module on keeping data safe.
  3. Have plans in place for a cyber security breach. See our Managing Personal Data Breaches workshop.

There is more useful advice in the ICO’s guidance note on ransomware and DP compliance.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in April.


The Facebook Data Breach Fine Explained

On 24th October 2018 the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine she could under the now repealed DPA 1998; her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal, the fine might seem small beer for an organisation of Facebook’s size and resources. Had the same facts played out after 25th May 2018, the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.

The Facts

In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company (Global Science Research (GSR)) to operate the app in conjunction with FB from November 2013 to May 2015. The app was designed to and was able to obtain a significant amount of personal information from any FB user who used the app, including:

  • Their public FB profile, date of birth and current city
  • Photographs they were tagged in
  • Pages they liked
  • Posts on their timeline and their news feed posts
  • Friends list
  • Facebook messages (there was evidence to suggest the app also accessed the content of the messages)

The app was also designed to and was able to obtain extensive personal data from the FB friends of the App’s users and anyone who had messaged the App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, nor did they give their consent.

The App was able to use the information that it collected about users, their friends and people who had messaged them to generate personality profiles. The information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controls the now infamous Cambridge Analytica).

In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and their friends that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.

Breach of the DPA

The Commissioner’s findings about the breach make sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply, or to ensure compliance, with their own FB Platform Policy, and were not aware of this until the Guardian newspaper exposed it in December 2015.

The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:

  1. Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of the App users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for processing.
  2. Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised (it was inconsistent with the basis on which FB allowed Dr Kogan to obtain access to personal data for which they were the data controller; it breached the Platform Policy and the Undertaking). The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised and unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles and that they failed to take reasonable steps to prevent such a contravention.

Breach of FB Platform Policy

Although the FB companies operated a FB Platform Policy in relation to Apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they didn’t check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or presumably whether they were lawful). In fact, they failed to implement a system to carry out such a review. It was also found that the use of the App breached the policy in a number of respects, specifically:

  • Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead Dr Kogan and GSR were able to use it for their own purposes.
  • Personal data collected by the App should not have been sold or transferred to third parties. Dr Kogan and GSR had transferred the data to three companies.
  • The App requested permission from users to obtain personal data that it did not need, in breach of the policy.

The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. However, perhaps one of the worst indictments is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access to the Facebook Login. And the rest, as they say, is history.

Joint Data Controllers

The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times, joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.

The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within the scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.

The Use of Data Analytics for Political Purposes

The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies is likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned and the Commissioner was unable to determine, on the basis of the information before her, whether FB was correct. However, she nevertheless concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, or with whom to share it.

As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.

Susan Wolf will be delivering these upcoming workshops and the forthcoming FOI: Contracts and Commercial Confidentiality workshop which is taking place on the 10th December in London. 

Our 2019 calendar is now live. We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. 

Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.

ICO Refuses to Disclose GDPR Policy Document for Special Categories Data

In the months leading up to 25th May 2018, data controllers will have been working like Trojans to become GDPR compliant. Data Protection Officers may have been pulling their hair out at the length of their ‘to do’ lists: not least, working out what their lawful basis for processing is, drafting Privacy Notices in clear and plain English, reviewing their subject access and breach notification procedures and training staff.

Add to all of that the additional requirements imposed by the Data Protection Act 2018 to have an ‘appropriate policy’ in place in relation to the processing of certain special category personal data and personal data relating to criminal convictions. Specifically, s.10 DPA requires that processing of special category data meets the conditions in Parts 1-3 of Schedule 1. This in turn requires that, in certain circumstances, the data controller must have an ‘appropriate policy document’ in place. [1] Schedule 1, Part 4 provides some limited guidance on what must be in the policy document. The document must explain the controller’s procedures for securing compliance with the principles in Article 5 of the GDPR in connection with the processing of the personal data. It must also explain the controller’s policies in relation to the retention and erasure of personal data processed in reliance on the condition.

This new requirement may not have been the foremost concern for every data controller and it is possible or even likely that policies may still be in draft as DPOs work out what to include in their documents. The ICO has not, as yet, issued any guidance on these policy documents and so this will no doubt present challenges for many DPOs. Perhaps the requirement is also presenting challenges for the ICO because, at the time of writing, the ICO is unwilling to publish its own Policy Document.

The request and the refusal

On 19th July the ICO received a request for a copy of its ‘Policy designed to show compliance with Schedule 1, Part 4 of the DPA 2018.’  Although the applicant did not explain why they wanted it (and as FOIA practitioners know, the regime is purpose blind), there can be little doubt that many data controllers would find the ICO’s own Policy Document a very useful guide to the scope and content of such a policy.  Additionally it is important that the public, and indeed ICO employees, are made aware of how the ICO itself will process special category and criminal conviction data.

On August 17th 2018 the ICO refused the request, citing the s 22 FOIA exemption (information held with a view to future publication).  S 22 provides that information is exempt information if:

  • the information is held by the public authority with a view to its publication, by the authority or any other person, at some future date (whether determined or not),
  • the information was already held with a view to such publication at the time the request for information was made, and
  • it is reasonable in all the circumstances that the information should be withheld from disclosure until the date referred to in paragraph (a).

S 22 is a qualified exemption and requires a determination of the public interest.

Sadly, the ICO’s Refusal Notice falls short of the ‘best practice’ that one should reasonably expect from the FOIA regulator.

  • The refusal notice offers no explanation of why the ICO believes it is reasonable in all the circumstances to withhold disclosure until some future date. The ICO has failed to follow its own guidance on the s 22 exemption in not even addressing this point. In fact it is arguable that by not considering this, the exemption is not engaged.
  • It fails to provide any indication of a future intended date for publication. Although there is no requirement under the FOIA to do this, given the level of interest surrounding the new Data Protection Act it is difficult to see why the ICO did not seek to offer some indication of the intended future publication date. It also neglects the ICO’s own advice on the s 22 exemption, that it is good practice to provide the requestor with an anticipated date of publication.
  • It fails to adequately explain the public interest factors that have been taken into account.

Weak and generic public interest assessment

The public interest test requires an assessment of whether:

In all the circumstances of the case, the public interest in maintaining the exemption outweighs the public interest in disclosing the information.

This requires particular attention to the ‘circumstances of the case’. In one of its earliest judgments the Information Tribunal emphasised that a public authority must ask ‘is the balance of public interest in favour of maintaining the exemption in relation to this information and in the circumstances of this case?’. [2] The ICO refusal notice is, however, generic and lacks any explicit reference to the information requested or the particular circumstances surrounding this document.

In favour of disclosure the ICO simply states that there is a public interest in transparency being demonstrated by disclosure and a legitimate interest in the compliance of the ICO with the legislation it regulates. It could have added more weight to this side of the equation. For instance, it could have supplemented these rather generic assertions by making explicit reference to the first Principle in Article 5 (1) GDPR, that data should be processed in a transparent manner. It might also have used different language recognising a ‘strong’ (rather than legitimate) public interest in ensuring that the ICO complies with the legislation it regulates, particularly given the gravity of non-compliance.

In favour of withholding the information the ICO cites three points, again without elaboration or reference to the specifics of the case.

First it states that ‘transparency is achieved through the pro-active publication of information on the web site’. Simply stating this falls well short of explaining how it is not in the public interest to disclose earlier than planned. Given that the information is going to be published at some future date, the public interest test should really consider why it is not in the public interest to publish earlier than planned. This is not addressed by the ICO.

Second, the ICO cites ‘the impact on ICO resources if we were to respond individually to requests for information that is due to be published’. This again appears to be something of a blanket refusal and fails to take into account the specific information that is being requested.

Finally, the ICO states that there is no pressing public interest in disclosing the information early. The refusal notice does not offer any reason why it would not be in the public interest to disclose the document now, and there is no explanation of why the ICO has reached this conclusion. However, perhaps more compelling is the fact that the Act has been in force for almost three months now. The ICO should have had a Policy Document in place since 23rd May 2018, in which case it is difficult to see how disclosing it now would be ‘early’. That is unless the document is still in draft form and the ICO is not in a position to say when it might be published. Perhaps the ICO, like other data controllers, is finding it a challenge to draft its Policy Document.

At the time of writing the requestor has submitted a request for an internal review.

I leave you with the ICO’s strapline; ICO, the UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.

 

Susan Wolf has over ten years’ experience teaching information rights practitioners on the LLM Information Rights Law & Practice at Northumbria University. She will be delivering a range of online webinars on various subjects around GDPR.

We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. New Dates added for London!

Need to train frontline staff quickly? Try our extremely popular GDPR e-learning course.

 

[1]  In addition, under Part 3 of the DPA 2018 which implements the Law Enforcement Directive, sections 35 and 42 and Schedule 8 also require that data controllers have an appropriate policy document in place.

[2] Hogan and Oxford City Council v The Information Commissioner EA/2005/0026 & EA/2005/0030

GDPR is coming but don’t panic!

The General Data Protection Regulation (GDPR) will come into force in three weeks’ time. 25th May though is not a cliff edge; nor is it doomsday, when the Information Commissioner will start wielding her 20 million Euro (fine) stick!

In December, the Commissioner addressed some of the myths being peddled about GDPR:

“I‘ve even heard comparisons between the GDPR and the preparations for the Y2K Millennium Bug…

In the run up to 25 May 2018 there have been anxieties too, albeit on a less apocalyptic level. Things like we’ll be making early examples of organisations for minor breaches or reaching for large fines straight-away and that the new legislation is an unnecessary burden on organisations.

I want to reassure those that have GDPR preparations in train that there’s no need for a Y2K level of fear…”

There are a number of steps you should be taking to prepare for GDPR. Remember, failure to have completed these tasks by 25th May will not lead to a 20 million Euro fine. However, to quote the Commissioner at the ICO Conference this year, “It’s important that we all understand there is no deadline. 25th May is not the end. It is the beginning.”

  1. Raising awareness about GDPR at all levels. Our GDPR e-learning course is ideal for frontline staff.
  2. Carrying out a data audit and reviewing how you address records management and information risk in your organisation.
  3. Reviewing information security policies and procedures in the light of the GDPR’s more stringent security obligations, particularly breach notification.
  4. Revising privacy policies in the light of the GDPR’s more prescriptive transparency requirements. See our policy.
  5. Writing policies and procedures to deal with new and revised Data Subject rights such as Data Portability and Subject Access.
  6. Considering whether you need a Data Protection Officer and, if so, who is going to do the job. Our GDPR certificate course is ideal for new DPOs.

Done everything? Have a go at the ICO’s GDPR Self Assessment Toolkit. Read the Commissioner’s full speech here.

Please get in touch if Act Now can help with your GDPR preparations. We provide audits, health checks and can offer a gap analysis, all followed by a step by step action plan!

 

GDPR: The New ICO Fees Regime

25th May 2018, when the General Data Protection Regulation (GDPR) comes into force, will see the end of the current Notification regime under the Data Protection Act 1998.

Until recently, Data Controllers looked set to save a little money and the Information Commissioner’s Office (ICO) a lot of money. The ICO is currently funded partly from the annual Notification fees. In 2016 it collected more than 17 million pounds.

As predicted on this blog last year, the Government has now announced a new charging structure for Data Controllers to ensure the continued funding of the ICO. The Data Protection (Charges and Information) Regulations 2018 were laid before Parliament on 20th February 2018 and will come into effect on 25 May 2018, to coincide with the GDPR. The new regulations are made under a power contained in the Digital Economy Act 2017 (which is itself a controversial piece of legislation due to its wide-ranging provisions about data sharing). Data Processors do not have to pay any fee to the ICO, but then many will be Data Controllers in their own right.

In summary there are three different tiers of fee and Data Controllers are expected to pay between £40 and £2,900 depending on the number of staff they employ and their annual turnover:

Tier 1 – Micro Organisations will pay £40

Applies to Data Controllers who have a maximum turnover of £632,000 for their financial year or no more than 10 members of staff.

Tier 2 – Small and Medium Organisations will pay £60

Applies to Data Controllers who have a maximum turnover of £36 million for their financial year or no more than 250 members of staff.

Tier 3 – Large organisations will pay £2900

Applies to Data Controllers who do not meet the criteria for tier 1 or tier 2 above.

Data Controllers who currently have a registration (or notification) under the 1998 Act will not need to pay the new data protection fee until their registration expires. The ICO will write to them before this happens to explain what they need to do next. With regard to Data Controllers who are already registered, the ICO will decide which tier they come under based on the information it holds, but Controllers will always be able to challenge this. The good news is that Data Controllers choosing to pay the fee by direct debit will receive an automatic discount of £5 at the point of payment. Every little helps!

The 2018 regulations make it clear that public authorities (e.g. councils) should categorise themselves according to staff numbers only. They do not need to take turnover into account. Furthermore, charities that are not otherwise subject to an exemption will only be liable to pay the tier 1 fee, regardless of size or turnover.
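
To make the tier rules concrete, here is a minimal sketch of how they could be expressed in code. It is illustrative only: the Regulations and the ICO’s Guide are authoritative, and controllers covered by the exemptions listed below pay no fee at all.

```python
def ico_fee_tier(turnover_gbp: float, staff: int,
                 public_authority: bool = False, charity: bool = False) -> str:
    """Rough sketch of the 2018 fee tiers described above (exemptions not modelled)."""
    if charity:
        return "Tier 1 - £40"                      # charities pay tier 1 regardless of size
    if public_authority:
        # Public authorities classify themselves by staff numbers only.
        if staff <= 10:
            return "Tier 1 - £40"
        if staff <= 250:
            return "Tier 2 - £60"
        return "Tier 3 - £2,900"
    if turnover_gbp <= 632_000 or staff <= 10:
        return "Tier 1 - £40"
    if turnover_gbp <= 36_000_000 or staff <= 250:
        return "Tier 2 - £60"
    return "Tier 3 - £2,900"

print(ico_fee_tier(turnover_gbp=25_000_000, staff=180))   # Tier 2 - £60
```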

A Data Controller processing personal data only for one or more of the following purposes is not required to pay a fee:

  • Staff administration
  • Advertising, marketing and public relations
  • Accounts and records
  • Not for profit purposes
  • Personal, family or household affairs
  • Maintaining a public register
  • Judicial functions
  • Processing personal information without an automated system such as a computer

To help Data Controllers understand the new fee regime, the ICO has produced a Guide to the Data Protection Fee.

STOP PRESS (25th May 2018)

The Data Protection (Charges and Information) Regulations 2018 came into force today, giving effect to the above.

Act Now can help you prepare for GDPR. Our 2018 course programme contains many more GDPR workshops and live webinars.

 Our GDPR Practitioner Certificate is proving very popular with those who need to get up to speed with GDPR as well as budding Data Protection Officers.  If you require these courses delivered at your premises, tailored to your needs, please get in touch.

Finally for frontline staff our one hour GDPR E Learning Course is ideal.

GDPR: The Rise of Information Risk?

By Scott Sammons

Risk Management is one of the things that many people claim to know about. Often, though, their lack of knowledge is exposed when they end up either focusing on the wrong risks or creating some complicated process that educates no one and leads everyone on a merry dance. And, truth be told, it can be quite difficult to understand, which may explain why people switch off from it or create complex processes to support the basic principles of managing risk.

However, the future is here and managing risks to information is about to go from a relatively unknown practice to a full-blown framework and a way to help manage your GDPR compliance. (And, selfishly, as someone who has done Information Risk Management for a few years now, I can finally say, “Yippeeee!”)

The General Data Protection Regulation (GDPR) is going to be implemented in May 2018. Throughout the GDPR there are references to the capturing and management of data protection risks. Combine that with the need under GDPR to demonstrate compliance, and therefore to demonstrate the management of risks to that compliance, and we are likely to see a quick rise in Information Risk as a discipline, practice and skill.

‘Information risk’ has, up until now, been a varied discipline. If you were to Google the term, or speak to any recruitment agency, they would say that Information Risk is the domain of ‘Cyber Security’. Currently, outside of the NHS toolkit, the only other country-wide frameworks that make reference to information risk management are ISO 27000 and 27001. But not everyone goes for these, or indeed has a need to, so what we are left with is an information risk management practice that varies greatly in approach and usefulness.

The GDPR doesn’t give you chapter and verse on how to implement it. However, it does, in several areas, reference the need to do it and, indeed, as it starts to become embedded we will start to see further standards on what it should look like.

Firstly, and in the most obvious place, is Article 25, ‘Data protection by design and by default’. This article outlines the requirements for embedding Data Protection principles into the very core of new designs and ideas for products and services. Article 25(1) outlines that Data Controllers should implement appropriate technical and organisational measures to mitigate the risks posed to the rights and freedoms of natural persons by the proposed processing. Now, in order to determine what is ‘appropriate’ as a control, you need first to have determined the likelihood and impact of that particular threat materialising.

Voila! A risk management process is born.
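
As an illustration only (the GDPR does not prescribe any particular scoring scale), a simple likelihood-by-impact assessment of the kind described above might look like this:

```python
# Illustrative 5x5 scales and thresholds; organisations define their own.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
IMPACT = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

def risk_score(likelihood: str, impact: str) -> int:
    """Classic likelihood x impact scoring."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def risk_rating(score: int) -> str:
    if score >= 15:
        return "high - appropriate controls required before processing starts"
    if score >= 8:
        return "medium - mitigate and monitor"
    return "low - accept and review periodically"

# e.g. an unpatched remote-access service holding special category data
print(risk_rating(risk_score("likely", "severe")))   # high - appropriate controls required ...
```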

Similarly, Article 35, ‘Data protection impact assessment’ (DPIA), describes a very similar process with regard to data protection risks. In a DPIA, a Data Controller assesses the risks to the rights and freedoms of natural persons posed by the processing in scope and determines, with the DPO where appropriate, what controls should be put in place that are appropriate to the level of risk. This assessment shall contain at least:

  1. a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
  2. an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
  3. an assessment of the risks to the rights and freedoms of data subjects referred to in paragraph 1; and
  4. the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.

Or, in other words, everything that you would expect to see in a risk assessment under current risk assessment practices (especially if you already engage in information risk as a discipline).
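
For illustration, those four minimum elements could be captured in a simple record structure like the hypothetical one below; the field names and example contents are invented for this sketch, not taken from the GDPR or any ICO template.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DPIARecord:
    """Hypothetical structure mirroring the four minimum elements listed above."""
    processing_description: str            # systematic description of the processing and its purposes
    necessity_and_proportionality: str     # why the processing is necessary and proportionate
    risks_to_data_subjects: List[str]      # risks to the rights and freedoms of data subjects
    mitigating_measures: List[str] = field(default_factory=list)  # safeguards, security measures, compliance mechanisms

dpia = DPIARecord(
    processing_description="Automated triage of housing benefit applications",
    necessity_and_proportionality="Manual triage cannot meet statutory decision deadlines",
    risks_to_data_subjects=["inaccurate automated scoring", "unfair bias against some applicants"],
    mitigating_measures=["human review of all adverse decisions", "quarterly bias audit"],
)
```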

Article 32, ‘Security of processing’, goes a little further and states:

  1. Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:
     (a) the pseudonymisation and encryption of personal data;
     (b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
     (c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
     (d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.
  2. In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.

Here we see the familiar areas of Information Security Risk Management, with some small tweaks to make them relevant to the GDPR. But again, the principle is the same: know what your threats and vulnerabilities are so that you can assess them and then ensure your technical and organisational measures are appropriate to the level of risk. You can’t effectively do one without the other.

Another key area where risk and risk assessments come into play is Breach Notification under Article 33 (notifying the supervisory authority) and Article 34 (informing the data subject). The requirement to notify applies unless the breach is ‘unlikely to result in a risk to the rights and freedoms of natural persons’.

Please note, however, that Article 34 swaps this around: the duty to inform the data subject arises only where the breach is likely to result in a high risk to the rights and freedoms of natural persons.
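
A minimal sketch of those two thresholds, assuming the risk level has already been judged through a breach risk assessment (it is the assessment itself, not the code, that does the hard work):

```python
def notification_duties(risk_to_individuals: str) -> dict:
    """risk_to_individuals: 'unlikely', 'risk' or 'high risk' - a judgement from the breach assessment."""
    return {
        # Article 33: notify the supervisory authority (without undue delay and, where
        # feasible, within 72 hours) unless the breach is unlikely to result in a risk.
        "notify_supervisory_authority": risk_to_individuals != "unlikely",
        # Article 34: communicate the breach to the data subjects where it is likely
        # to result in a high risk to their rights and freedoms.
        "notify_data_subjects": risk_to_individuals == "high risk",
    }

print(notification_duties("high risk"))
# {'notify_supervisory_authority': True, 'notify_data_subjects': True}
```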

Other areas that either reference the need to manage risk, or which (as above) only become relevant where a risk management process determines it, are:

  • Prior Consultation (article 36)
  • Tasks of the Data Protection Officer (article 39)
  • Derogations for specific situations (international data transfers) (article 49)
  • Tasks of the Supervisory Authority (Article 37)
  • Tasks of the Data Protection Board (Article 70)

As we all know, the GDPR is long and has the potential to become infinitely complicated depending on what processing you are doing; you cannot possibly hope to comply with 100% of it 100% of the time. Find me someone who can and I’ll show you a magician. You therefore need to ensure that you have a robust and easy-to-understand risk management process in place to manage your GDPR risks and determine which areas need more focus and which are ‘low risk’.

If you’ve not started your GDPR implementation programme yet, one thing that has worked well for me when determining where on earth to begin is to complete a data inventory, which includes why information is being processed, and to do a risk assessment on that inventory. Which areas show up as massive gaps in current compliance, let alone GDPR compliance, and which show up as minor tweaks? Once you have a reasonable level of overview you can then start to prioritise and logically see how things fit into place in the run-up to 2018. You can also see which areas of risk you can carry forward past May 2018, as currently there is no expectation from any supervisory authority that you will be 100% compliant by day one.

Scott Sammons CIPP/E, AMIRMS is an experienced Data Protection & Information Risk practitioner and blogs under the name @privacyminion. He is on the Exam Board for the GDPR Practitioner Certificate.

Read more about the General Data Protection Regulation and attend our full day workshop.