Apprentice Case Study – Meet Natasha

In 2022, Act Now Training teamed up with Damar to deliver the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme, with the first cohort about to undertake the end-point assessment.

Data Protection and Information Governance Apprentice, Natasha Lock, is an integral part of the Governance and Compliance team at the University of Lincoln. With the Data Protection and Digital Information (No.2) Bill set to make changes to the UK data protection regime, Natasha talks to us about why this is a great area to work in and how the apprenticeship route has been particularly beneficial for her.

How did you get onto the apprenticeship?

“I was already working at the university as an Information Compliance Officer when the opportunity for a staff apprenticeship came up.

“The process was swift and straightforward, and I was enrolled on the Data Protection and Information Governance Apprenticeship within three months of enquiring.”

How has the apprenticeship helped you?

“I started with a good understanding of UK data protection legislation, but my knowledge has grown significantly. Now that I’m coming to the end of my level 4 apprenticeship, I’ve gained so much more insight and my confidence has grown.

“As a university, we hold vast amounts of data. My apprenticeship is allowing me to solve the challenge of data retention and implement better measures to retain, destroy and archive information. I have developed a greater understanding of the legislative requirements we must adhere to as a public sector institution and how to assess and reduce data protection risks.

“I love the fact that I can study whilst still doing my job. The flexibility works for me because I can go through course materials at my own pace. I really feel like I have a brilliant work/life/study balance.

“The University of Lincoln and Damar Training have been fantastic in supporting me. I get along with my coach, Tracey, so well. She is very friendly and personable and has enabled my creativity to flow.

“The course is very interactive, and I’ve found the forums with other apprentices to be a very useful way of sharing knowledge, ideas and stories.

“I’m enjoying it so much and people have noticed that my confidence has grown. I wouldn’t have had that without doing this apprenticeship. I’ve now got my sights on doing a law degree or law apprenticeship in the future.”

Abi Slater, Information Compliance Manager at the University of Lincoln, said: “It has been great to see how much Natasha has developed over the course of the apprenticeship. I believe the apprenticeship has provided Natasha with the knowledge and skills required to advance in her data protection career and the support from her coach at Damar Training has been excellent.

“I would encourage anyone with an interest in data protection and information governance to consider this apprenticeship.”

Tracey Coetzee, Coach at Damar Training, said: “The Data Protection and Information Governance Apprenticeship was only approved by the Institute for Apprenticeships and Technical Education in 2022, and it’s delightful to see apprentices flourishing on the programme.

“From cyber security to managing data protection risks, this programme is upskilling participants and adding value to both private and public sector organisations and we’re thrilled to see the first cohort, including Natasha, approach the completion of their training.”

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

GDPR and AI: The Rise of the Machines

2023 is going to be the year of AI. In January, Microsoft announced a multibillion-dollar investment in OpenAI, the company behind the image generation tool Dall-E and the chatbot ChatGPT.

The term “artificial intelligence” or “AI” refers to the use of machines and computer systems that can perform tasks normally requiring human intelligence. The past few years have seen rapid progress in this field. Advancements in deep learning algorithms, cloud computing, and data storage have made it possible for machines to process and analyse large amounts of data quickly and accurately. AI’s ability to interpret human language means that virtual assistants, such as Siri and Alexa, can now understand and respond to complex spoken commands at lightning speed.

The public sector is increasingly leveraging the power of AI to undertake administrative tasks and create more tailored services that meet users’ needs. Local government is using AI to simplify staff scheduling, predict demand for services and even estimate the risk of an individual committing fraud. Healthcare providers are now able to provide automated diagnoses based on medical imaging data from patients, thereby reducing wait times.

The Risks

With any major technological advance there are potential risks and downsides. Recently, ElevenLabs, an AI speech software company, said it had found an “increasing number of voice cloning misuse cases”. According to reports, hackers used the ElevenLabs software to create deepfake voices of famous people (including Emma Watson and Joe Rogan) making racist, transphobic and violent comments.

There are concerns about the impact of AI on employment and the future of work. In April 2021, the Court of Amsterdam ordered Uber to reinstate taxi drivers in the UK and Portugal who had been dismissed by “robo-firing”: the use of an algorithm to make a dismissal decision with no human involvement. The Court concluded that Uber had made the decisions “based solely on automated processing” within the meaning of Article 22(1) of the GDPR. It was ordered to reinstate the drivers’ accounts and pay them compensation.

As well as raising ethical questions about the use of AI in decision-making processes that affect people’s lives, AI-driven algorithms may lead to unintended biases or inaccurate decisions if not properly monitored and regulated. In 2021 the privacy pressure group NOYB filed a GDPR complaint against Amazon, claiming that Amazon’s algorithm discriminates against some customers by denying them the opportunity to pay for items by monthly invoice.

There is also a risk that AI is deployed without consideration of the privacy implications. In May 2022, the UK Information Commissioner’s Office fined Clearview AI Inc more than £7.5 million for breaches of the GDPR. Clearview’s online database contains 20 billion images of people’s faces, together with data scraped from publicly available information on the internet and social media platforms all over the world. The company, which describes itself as the “World’s Largest Facial Network”, allows customers, including the police, to upload an image of a person to its app, which then uses AI to check it against all the images in the Clearview database. The app then provides a list of matching images with links to the websites they came from.

Practical Steps

Recently, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants are subjected to any harms or financial detriment as a result of the use of algorithms. It did, though, emphasise a number of practical steps that local authorities and central government can take when using algorithms or AI:

1. Take a data protection by design and default approach

Data processed using algorithms, data analytics or similar systems should be proactively and reactively reviewed to ensure it is accurate and up to date. If a local authority decides to engage a third party to process personal data using algorithms, data analytics or AI, it remains responsible for assessing that the third party is competent to process personal data in line with the UK GDPR.

2. Be transparent with people about how you are using their data

Local authorities should regularly review their privacy policies, to ensure they comply with Articles 13 and 14, and identify areas for improvement. They should also bring any new uses of personal data to individuals’ attention.

3. Identify the potential risks to people’s privacy

Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. 

In April 2021, the European Commission presented its proposal for a Regulation to harmonise rules for AI, also known as the “AI Act”. Whilst there is still a long way to go before this proposal becomes legislation, it could create an impetus for the UK to further regulate AI.

Use of AI has enormous benefits. It does, though, have the potential to adversely impact people’s lives and deny them their fundamental rights. As such, it is critical for data protection and information governance officers to understand the implications of AI technology and how to use it in a fair and lawful manner.

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents, focusing on the GDPR as well as other information governance and records management issues.

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

Data Protection Impact Assessments under GDPR

The General Data Protection Regulation (GDPR) will come into force in about 10 months. There is plenty to learn and do before then including:

  1. Raising awareness about GDPR at all levels.
  2. Reviewing how you address records management and information risk in your organisation.
  3. Reviewing compliance with the existing law as well as the six new DP Principles.
  4. Revising privacy policies in the light of the GDPR’s more prescriptive transparency requirements.
  5. Reviewing information security policies and procedures in the light of the GDPR’s more stringent security obligations, particularly breach notification.
  6. Writing policies and procedures to deal with new and revised Data Subject rights such as Data Portability and Subject Access.
  7. Considering whether you need a Data Protection Officer and, if so, who is going to do the job.
  8. Considering when you will need to do a Data Protection Impact Assessment (DPIA).

Article 35 of GDPR introduces this concept. DPIAs (also known as Privacy Impact Assessments) are a tool which can help Data Controllers identify the most effective way to comply with their GDPR obligations and reduce the risks of harm to individuals through the misuse of their personal information. A well-managed DPIA will allow Data Controllers to identify and fix problems at an early stage, reducing the associated costs and damage to reputation, which might otherwise occur.

DPIAs are important tools for accountability, as they help Data Controllers not only to comply with the requirements of the GDPR, but also to demonstrate that appropriate measures have been taken to ensure compliance (see Article 24).

When is a DPIA needed?

Carrying out a DPIA is not mandatory for every processing operation. A DPIA is only required when the processing is “likely to result in a high risk to the rights and freedoms of natural persons” (Article 35(1)).

Such processing, according to Article 35(3), includes (but is not limited to):

  • systematic and extensive evaluation of individuals based on automated processing, including profiling, on which decisions are based that produce legal effects (or similarly significant effects) on individuals.
  • large scale processing of special categories of data or personal data relating to criminal convictions or offences.
  • large scale, systematic monitoring of public areas (CCTV).

So what other cases will involve “high risk” processing that may require a DPIA? In May, the Article 29 Working Party published its draft data protection impact assessment guidelines for comment. We are still waiting for the final version, but I don’t think it is going to change much. It sets out the criteria for assessing whether processing is high risk (a simple screening sketch follows the list below). This includes processing involving:

  1. Evaluation or scoring, including profiling and predicting especially from aspects concerning the Data Subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements
  2. Automated decision-making with legal or similar significant effects
  3. Systematic monitoring of individuals
  4. Sensitive data
  5. Personal Data on a large scale
  6. Datasets that have been matched or combined
  7. Data concerning vulnerable Data Subjects
  8. Innovative use or application of technological or organisational solutions
  9. Data transfers across borders outside the European Union
  10. Processing that prevents Data Subjects from exercising a right or using a service or a contract
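
The draft guidelines suggest that, in most cases, processing meeting two or more of these criteria will require a DPIA. As a purely illustrative sketch (the criterion names below are our own shorthand, not official labels), a Data Controller could screen a proposed processing operation like this:

```python
# Illustrative DPIA screening helper based on the WP29 high-risk criteria.
# The criterion names are informal shorthand for the ten items listed above.
WP29_CRITERIA = [
    "evaluation_or_scoring",
    "automated_decision_legal_effect",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale_processing",
    "matched_or_combined_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "cross_border_transfer",
    "blocks_right_service_or_contract",
]

def dpia_recommended(processing: dict) -> bool:
    """Return True if the operation meets two or more of the criteria."""
    met = [c for c in WP29_CRITERIA if processing.get(c, False)]
    return len(met) >= 2

# Example: large-scale profiling built on matched datasets meets three
# criteria, so a DPIA should be carried out before processing begins.
proposal = {
    "evaluation_or_scoring": True,
    "large_scale_processing": True,
    "matched_or_combined_datasets": True,
}
print(dpia_recommended(proposal))  # True
```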

What information should the DPIA contain?

The GDPR sets out the minimum features of a DPIA (Article 35(7), and Recitals 84 and 90):

  • A description of the processing operations and the purposes, including, where applicable, the legitimate interests pursued by the Data Controller.
  • An assessment of the necessity and proportionality of the processing in relation to the purpose.
  • An assessment of the risks to individuals.
  • The measures in place to address risk, including security, and to demonstrate that the Data Controller is complying with GDPR.

A DPIA can address more than one project.
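
As a rough illustration only (the field names are our own and nothing in the GDPR prescribes a format), the minimum content maps naturally onto a simple record structure:

```python
# A minimal, assumed shape for a DPIA record covering the four Article 35(7)
# features; real DPIAs are documents, this is just an organising sketch.
from dataclasses import dataclass

@dataclass
class DPIARecord:
    processing_description: str         # operations, purposes and any legitimate interests
    necessity_and_proportionality: str  # why this processing is needed for the purpose
    risks_to_individuals: list          # risks to rights and freedoms, not just compliance
    measures_addressing_risk: list      # security and other measures, mapped to each risk

record = DPIARecord(
    processing_description="CCTV of the staff car park for site security",
    necessity_and_proportionality="No less intrusive way of deterring vehicle crime identified",
    risks_to_individuals=["Incidental monitoring of staff movements"],
    measures_addressing_risk=["Signage", "Restricted access to footage", "30-day retention"],
)
```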

The ICO’s Code of Practice on Privacy Impact Assessments will assist, as will the Irish Data Protection Commissioner’s guidance.

When should a DPIA be conducted?

DPIAs should be conducted before the processing operation commences. DPIAs are an integral part of taking a Privacy by Design approach, which is emphasised in Article 25. The DPIA should be treated as a continual process, not a one-time exercise. Data Controllers should start it early and update it throughout the lifecycle of the project.

The GDPR comes into force on 25th May 2018, and DPIAs are legally mandatory only for processing operations that are initiated after this date. Nevertheless, the Article 29 Working Party strongly recommends carrying out DPIAs for all high-risk operations prior to this date.

Who should conduct the DPIA?

A DPIA may be conducted by the Data Controller’s own staff or an external consultant. Of course, the Data Controller remains liable for ensuring it is done correctly. The advice of the Data Protection Officer, if one has been designated, must also be sought, as must (where appropriate) the views of Data Subjects or their representatives.

If the DPIA suggests that any identified risks cannot be managed and the residual risk remains high, the Data Controller must consult the Information Commissioner before moving forward with the project. Regardless of whether or not consultation with the ICO is required, the Data Controller remains obliged to retain a record of the DPIA and to update it in due course.

Even if ICO consultation is not required, the DPIA may be reviewed by the ICO at a later date in the event of an audit or investigation arising from the Data Controller’s use of personal data.

What are the risks of non-compliance?

Failure to carry out a DPIA when the processing requires one (Article 35(1) and (3)), carrying out a DPIA in an incorrect way (Article 35(2) and (7) to (9)), or failing to consult the ICO where required (Article 36(3)(e)), can each result in an administrative fine of up to 10 million Euros or, in the case of an undertaking, up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher.
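
To make the “whichever is higher” rule concrete, here is a small worked sketch (the turnover figures are invented for illustration):

```python
# The fine ceiling for an undertaking is the higher of EUR 10 million and
# 2% of total worldwide annual turnover for the preceding financial year.
def max_fine_eur(worldwide_annual_turnover_eur: float) -> float:
    return max(10_000_000, 0.02 * worldwide_annual_turnover_eur)

print(max_fine_eur(600_000_000))  # 12,000,000 -> 2% of turnover exceeds EUR 10m
print(max_fine_eur(200_000_000))  # 10,000,000 -> the flat EUR 10m ceiling applies
```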

More about Data Protection Impact Assessments in our forthcoming webinar.

Let Act Now help with your GDPR preparations. Our full day workshops and GDPR Practitioner Certificate (GDPR.Cert) courses are filling up fast. We also offer a GDPR health check service in which we carry out an audit and help you identify and address any weaknesses.

GDPR: The Rise of Information Risk?

By Scott Sammons

Risk Management is one of those things that many people claim to know about. Often, though, their lack of knowledge is exposed when they end up either focusing on the wrong risks or creating some complicated process that educates no one and leads everyone on a merry dance. And, truth be told, it can be quite difficult to understand, which may explain why people switch off from it or create complex processes to support the basic principles of managing risk.

However, the future is here: managing risks to information is about to go from a relatively unknown practice to a fully fledged framework for managing your GDPR compliance. (And, selfishly, as someone who has done Information Risk Management for a few years now, I can finally say, “Yippee!”)

The General Data Protection Regulation (GDPR) will apply from May 2018. Throughout the GDPR there are references to capturing and managing data protection risks. Combine that with the need under the GDPR to demonstrate compliance, and therefore to demonstrate the management of risks to that compliance, and we are likely to see a rapid rise in Information Risk as a discipline, practice and skill.

‘Information risk’ has, up until now, been a varied discipline. If you were to Google the term, or speak to any recruitment agency, they would say that Information Risk was the domain of ‘Cyber Security’. Currently, outside of the NHS toolkit, the only other widely used frameworks that make reference to information risk management are ISO 27000 and ISO 27001. But not everyone adopts these, or indeed has a need to, so what we are left with is an information risk management practice that varies greatly in approach and usefulness.

The GDPR doesn’t give you chapter and verse on how to implement risk management. However, it does, in several areas, reference the need for it, and as the Regulation becomes embedded we will start to see further standards on what it should look like.

The first, and most obvious, place is Article 25, ‘Data Protection by Design and Default’. This article outlines the requirements for embedding Data Protection principles into the very core of new designs and ideas for products and services. Article 25(1) states that Data Controllers should implement appropriate technical and organisational measures to mitigate the risks that the proposed processing poses to the rights and freedoms of natural persons. Now, in order to determine what is ‘appropriate’ as a control, you first need to have determined the likelihood and impact of a particular threat materialising.

Voila! A risk management process is born.
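
For anyone who has never run that step, a minimal sketch of a likelihood-by-impact assessment follows (the 1–5 scales and the risk appetite threshold are illustrative conventions of our own; the GDPR prescribes none of this):

```python
# Illustrative likelihood x impact scoring, as used in many risk frameworks.
def risk_score(likelihood: int, impact: int) -> int:
    """Score a threat on 1-5 scales; a higher score means a more urgent risk."""
    assert 1 <= likelihood <= 5 and 1 <= impact <= 5
    return likelihood * impact

def control_is_appropriate(residual_likelihood: int, residual_impact: int,
                           risk_appetite: int = 6) -> bool:
    """Treat a control as 'appropriate' if residual risk falls within appetite."""
    return risk_score(residual_likelihood, residual_impact) <= risk_appetite

# Raw risk of 4 x 4 = 16; after encrypting the data at rest the residual
# risk is 2 x 3 = 6, which sits within the (assumed) appetite of 6.
print(risk_score(4, 4))              # 16
print(control_is_appropriate(2, 3))  # True
```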

Article 35, ‘Data protection impact assessment’ (DPIA), sets out a similar process with regard to data protection risks. In a DPIA, a Data Controller assesses the risks to the rights and freedoms of natural persons posed by the processing in scope and determines, with the DPO where appropriate, what controls should be put in place that are appropriate to the level of risk. This assessment shall contain at least:

  1. a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
  2. an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
  3. an assessment of the risks to the rights and freedoms of data subjects referred to in paragraph 1; and
  4. the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.

Or, in other words, everything that you would expect to see in a risk assessment under current risk assessment practices (especially if you already engage in information risk as a discipline).

Article 32, ‘Security of Processing’, goes a little further and states:

  1. Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:
     (a) the pseudonymisation and encryption of personal data;
     (b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
     (c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
     (d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.
  2. In assessing the appropriate level of security, account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.

Here we see the familiar territory of Information Security Risk Management, with some small tweaks to make it relevant for the GDPR. But again, the principle is the same: know what your threats and vulnerabilities are, so that you can assess them and then ensure your technical and organisational measures are appropriate to the level of risk. You can’t effectively do one without the other.

Another key area where risk and risk assessments come into play is Breach Notification under Article 33 (notifying the authority) and Article 34 (notifying the data subject). Under Article 33, notification is required unless the breach is ‘unlikely to result in a risk to the rights and freedoms of natural persons’.

Note, however, that Article 34 swaps this around: the duty to inform the data subject arises only where there is a high risk to the rights and freedoms of natural persons.

Other areas that either reference the need to manage risk, or that (as above) only become relevant where a risk management process determines it, include:

  • Prior consultation (Article 36)
  • Tasks of the Data Protection Officer (Article 39)
  • Derogations for specific situations (international data transfers) (Article 49)
  • Tasks of the Supervisory Authority (Article 57)
  • Tasks of the Data Protection Board (Article 70)

As we all know, the GDPR is long and has the potential to become infinitely complicated depending on what processing you are doing; you cannot possibly hope to comply with 100% of it 100% of the time. Find me someone who can and I’ll show you a magician. You therefore need a robust and easy-to-understand risk management process in place to manage your GDPR risks and determine which areas need more focus and which areas are ‘low risk’.

If you’ve not started your GDPR implementation programme yet, one thing that has worked well for me when deciding where on earth to begin is to complete a data inventory, which includes why information is being processed, and then to run a risk assessment on that inventory (a bare-bones sketch follows below). Which areas show up as massive gaps even against current compliance requirements, let alone the GDPR, and which need only minor tweaks? Once you have a reasonable level of overview you can start to prioritise and logically see how things fit into place in the run-up to 2018. You can also see which areas of risk you can carry forward past May 2018, as currently there is no expectation from any of the supervisory authorities that you will be 100% compliant on day one.
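
As a bare-bones sketch of what such an inventory entry might capture (the field names and the three-level gap rating are our own illustrative conventions, not a prescribed format):

```python
# A minimal data inventory entry with a compliance gap rating per activity.
from dataclasses import dataclass

@dataclass
class InventoryEntry:
    activity: str      # what the processing is
    purpose: str       # why the information is being processed
    lawful_basis: str  # the current justification, if any is recorded
    gap_rating: str    # "major gap", "minor tweak" or "compliant"

register = [
    InventoryEntry("Marketing mailing list", "Direct marketing", "consent", "minor tweak"),
    InventoryEntry("Legacy HR archive", "unclear", "none recorded", "major gap"),
]

# Tackle the major gaps first in the run-up to May 2018.
for entry in sorted(register, key=lambda e: e.gap_rating != "major gap"):
    print(f"{entry.activity}: {entry.gap_rating}")
```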

Scott Sammons CIPP/E, AMIRMS is an experienced Data Protection & Information Risk practitioner and blogs under the name @privacyminion. He is on the Exam Board for the GDPR Practitioner Certificate.

Read more about the General Data Protection Regulation and attend our full day workshop.