GDPR and AI: The Rise of the Machines

2023 is going to be the year of AI. In January, Microsoft announced a multibillion-dollar investment in OpenAI, the company behind the image-generation tool DALL-E and the chatbot ChatGPT.

The term “artificial intelligence” or “AI” refers to the use of machines and computer systems that can perform tasks normally requiring human intelligence. The past few years have seen rapid progress in this field. Advancements in deep learning algorithms, cloud computing, and data storage have made it possible for machines to process and analyse large amounts of data quickly and accurately. AI’s ability to interpret human language means that virtual assistants, such as Siri and Alexa, can now understand and respond to complex spoken commands at lightning speed.

The public sector is increasingly leveraging the power of AI to undertake administrative tasks and create more tailored services that meet users’ needs. Local government is using AI to simplify staff scheduling, predict demand for services and even estimate the risk of an individual committing fraud. Healthcare providers are now able to provide automated diagnoses based on medical imaging data from patients, thereby reducing wait times.

The Risks

With any major technological advance there are potential risks and downsides. On Monday, ElevenLabs, an AI speech software company, said it had found an “increasing number of voice cloning misuse cases”. According to reports, hackers used the ElevenLabs software to create deepfake voices of famous people (including Emma Watson and Joe Rogan) making racist, transphobic and violent comments.

There are concerns about the impact of AI on employment and the future of work. In April 2021, the Court of Amsterdam ordered Uber to reinstate taxi drivers in the UK and Portugal who had been dismissed by “robo-firing”: the use of an algorithm to make a dismissal decision with no human involvement. The Court concluded that Uber had made the decisions “based solely on automated processing” within the meaning of Article 22(1) of the GDPR. Uber was ordered to reinstate the drivers’ accounts and pay them compensation.
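In practice, Article 22 issues of this kind are often addressed by building a human review step into the decision pipeline, so that no adverse outcome rests solely on the algorithm. A minimal sketch of the idea (entirely hypothetical, and not a description of Uber’s actual system) might look like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DismissalDecision:
    driver_id: str
    algorithm_recommendation: str       # e.g. "deactivate account"
    human_reviewed: bool = False
    final_outcome: Optional[str] = None

def finalise(decision: DismissalDecision, reviewer_outcome: str) -> DismissalDecision:
    # A named human must actively confirm or overturn the recommendation.
    # Note: a reviewer who rubber-stamps every recommendation would not
    # count as "meaningful" human involvement for Article 22 purposes.
    decision.human_reviewed = True
    decision.final_outcome = reviewer_outcome
    return decision

def is_solely_automated(decision: DismissalDecision) -> bool:
    # True if an outcome was reached with no human in the loop.
    return decision.final_outcome is not None and not decision.human_reviewed
```

The design point is that the human step must be able to change the outcome; logging a review that never overturns anything is exactly the kind of token involvement regulators look past.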

As well as raising ethical questions about the use of AI in decision-making processes that affect people’s lives, AI-driven algorithms may produce unintended biases or inaccurate decisions if not properly monitored and regulated. In 2021 the privacy pressure group NOYB filed a GDPR complaint against Amazon, claiming that Amazon’s algorithm discriminates against some customers by denying them the opportunity to pay for items by monthly invoice.

There is also a risk that AI is deployed without consideration of the privacy implications. In May 2022, the UK Information Commissioner’s Office fined Clearview AI Inc more than £7.5 million for breaches of GDPR. Clearview’s online database contains 20 billion images of people’s faces and data scraped from publicly available information on the internet and social media platforms all over the world. The company, which describes itself as the “World’s Largest Facial Network”, allows customers, including the police, to upload an image of a person to its app, which then uses AI to check it against all the images in the Clearview database. The app then provides a list of matching images, with links to the websites they came from.

Practical Steps

Recently, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants are subjected to any harms or financial detriment as a result of the use of algorithms. It did though emphasise a number of practical steps that local authorities and central government can take when using algorithms or AI:

1. Take a data protection by design and default approach

Data processed using algorithms, data analytics or similar systems should be reactively and proactively reviewed to ensure it is accurate and up to date. If a local authority engages a third party to process personal data using algorithms, data analytics or AI, it is responsible for satisfying itself that the third party is competent to process personal data in line with the UK GDPR.

2. Be transparent with people about how you are using their data

Local authorities should regularly review their privacy policies, to ensure they comply with Articles 13 and 14, and identify areas for improvement. They should also bring any new uses of individuals’ personal data to their attention.

3. Identify the potential risks to people’s privacy

Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. 

In April 2021, the European Commission presented its proposal for a Regulation to harmonise rules for AI, also known as the “AI Act”. Whilst there is still a long way to go before this proposal becomes legislation, it could create an impetus for the UK to further regulate AI.

Use of AI has enormous benefits. It does though have the potential to adversely impact people’s lives and deny their fundamental rights. As such, it is critical that data protection and information governance officers understand the implications of AI technology and how to use it in a fair and lawful manner.

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents focussing on GDPR as well as other information governance and records management issues. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

AI and Data Protection: Is ‘Cortana’ such a problem?

‘AI’, or ‘Machine Learning’ as it is also known, is becoming more prevalent in the working environment. From ‘rogue algorithms’ upsetting GCSE gradings through to Microsoft 365 judging you for only working on one document all day, we cannot escape the fact that there are more ‘automated’ services than ever before.

For DPOs, records managers and IG officers, this poses some interesting challenges to the future of records, information and personal data.  

I was asked to talk about the challenges of AI and machine learning at a recent IRMS Public Sector Group webinar. In the session, titled ‘IRM challenges of AI & something called the “metaverse”’, we looked at a range of issues, some of which I’d like to touch on below. While I remain unconvinced the ‘metaverse’ is going to arrive any time soon, AI and algorithms very much are here and are growing fast.

From a personal data and privacy point of view, we know that algorithms guide our online lives, from the adverts we see to the posts that appear in our feeds on Twitter, Facebook and elsewhere. Is it not creepy that an algorithm may know more about me than I do? And how long before that has mass implications for everyday citizens? What happens if the ‘social algorithm’ works out your sexuality before you or your family has? I work with families that, to this day, will abuse and cast out those who are LGBTQ+, so imagine the damage a ‘we thought you’d like this’ post could do.

Interesting questions have been posed on Twitter about whether ‘deepfake’ videos are personal data. The answers are mixed and have interesting implications for the future. Can you imagine the impact if someone used AI to generate a video of you doing something you are not meant to? That will take some undoing, by which time the damage is done. If you want to see this in action, I’d recommend watching Season 2 of ‘The Capture’ on BBC iPlayer.

In an organisational context, if organisations are to use algorithms to help with workload and deliver efficient services, it is simple logic that the algorithm must be up to scratch. As the Borg Queen (a cybernetic alien) in Star Trek: First Contact once said to Data (a self-aware android), “you are an imperfect being, created by an imperfect being. Finding your weakness is only a matter of time”. If anyone can find me a perfectly designed system that doesn’t have process issues, bugs and reliability issues, do let me know.

Many data scientists and other leading data commentators, such as Cathy O’Neil, frequently remind us that “algorithms are opinions embedded in code”. And opinions bring with them biases, shortcomings and room for error.
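O’Neil’s point is easy to see in a toy example. In the hypothetical scoring rule below, every number is a choice made by a developer, not a fact of nature: the weights, the normalising constants and the approval threshold all encode someone’s opinion about who deserves approval.

```python
# Hypothetical toy example: an "objective" scoring algorithm is really
# a set of human opinions written down as numbers.

def loan_score(income: float, years_at_address: float) -> float:
    # The weights and divisors below are developer choices, not facts.
    # Doubling the weight on years_at_address would penalise people who
    # move frequently -- an opinion embedded in code.
    return 0.7 * (income / 50_000) + 0.3 * (years_at_address / 10)

# The cut-off is another human judgement call.
APPROVAL_THRESHOLD = 0.8

def decide(income: float, years_at_address: float) -> str:
    score = loan_score(income, years_at_address)
    return "approve" if score >= APPROVAL_THRESHOLD else "refuse"
```

Tweak any of those numbers and a different group of people gets refused, which is precisely why such systems need scrutiny rather than deference.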

Now, that is not to say that these things do not have their advantages – they very much do. However, in order to get something good out of them you need to ensure good stuff goes into them and good stuff helps create them. Blindly building and feeding a machine just because it’s funky and new, as we have all seen time and again, always leads to trouble.  

Olu and I attended (me virtually and Olu in person) the launch of the AI Standards Hub, a fascinating initiative by the Alan Turing Institute and others, including the UK Government.

Now why am I talking about this at the moment? Put simply, as I mentioned above, this technology is here and is not going anywhere. Take a look at this company offering live editing of your voice, and you may even find this conversation between a Google engineer and an AI quite thought-provoking and sometimes scary. If anything, AI is evolving at an ever-growing pace. Therefore information professionals from all over the spectrum need to be aware of how it works, how it can be used in their organisations, and how they can upskill to challenge and support it.

In recent times the ICO has published a range of guidance on this, including a relatively detailed guide on how to use AI while considering the data protection implications. While it’s not a user manual, it does give some key points to consider and steps to work through.

Right Alexa, end my blog! oh I mean, Hey Siri, end my blog… Darn… OK Google…

If you are interested in learning more about the IRM and DP challenges of ‘AI’, and in upskilling as a DPO, Records or Information Governance Manager, then check out Scott’s workshop, How to implement Good Information Governance into Artificial Intelligence & Machine Learning Projects. Book your place for 17th November now.

Three New GDPR Workshops from Act Now Training

Act Now Training is pleased to announce three new additions to our GDPR workshop series

Data ethics is increasingly relevant to the role of information professionals. Just because the processing of personal data is lawful does not make it fair or ‘ethical’. And indeed, where something is fair it does not always mean it is lawful. Whilst the UK GDPR gives us some structure for working out what is a fair and proportionate use of personal data (and thus ethical), there can be a wide range of issues outside of the law to consider.  

Our Data Ethics workshop will explore what the term ‘Data Ethics’ actually means, the role it plays in the use of personal data (and indeed other data) and what practical steps information professionals can take to embed and promote data ethics within their organisations. From how to consider data ethics in DPIAs and sharing requests, through to embedding a practical data ethics framework in your organisation, we will pose questions, share experiences and best practice, and point you to further guidance and support.

A subject which has many ethical considerations is the use of Artificial Intelligence (also known as AI) and Machine Learning. AI is not coming; it is here. Whether ordering a taxi or submitting your tax return, AI is operating in the background. AI and Machine Learning have the capacity to improve our lives but, like all technologies, they have the potential to ruin lives too.  

Our new workshop, How to implement Good Information Governance into Artificial Intelligence & Machine Learning Projects, will explore exactly what ‘AI’ and ‘Machine Learning’ are and how they are starting to appear in the working environment. We will also explore the common challenges that these present focussing on GDPR as well as other information governance and records management issues.  Delegates will leave the workshop with practical ideas for how to approach Machine Learning and AI as well as awareness of key resources, current best practice and how they can keep up to date about a fast-developing area of technology. Think that AI is something for future generations to deal with? This workshop will make you think again!

The concepts of controller, joint controller and processor play a crucial role in the application of GDPR. They determine who is responsible for compliance with different data protection rules and how data subjects can exercise their rights in practice. The precise meaning of these concepts, and the criteria for their correct interpretation, is the subject of much confusion. Incorrect interpretation can lead to the wrong allocation of data protection responsibilities, leading to disputes when things go wrong.

Our new workshop, Data Controller, Processor or Joint Controller: What am I?, will help both controllers and processors to understand their responsibilities and liabilities under GDPR and how to structure their relationships. This interactive workshop will explain the key differences between data controllers, joint controllers and data processors and what the roles and responsibilities are for each. By the end of this workshop, delegates will gain the confidence to decide on what an organisation’s role is under GDPR and how to manage the different relationships.

At Act Now we are always keen to hear from information governance professionals. If you have ideas for new workshops, or are interested in running one, please get in touch.

GDPR News Roundup

So much has happened in the world of data protection recently. Where to start?

International Transfers

In April, the European Data Protection Board (EDPB) adopted its opinions (under the GDPR and the Law Enforcement Directive (LED)) on the draft EU adequacy decisions for the UK. The EDPB acknowledged that there is alignment between EU and UK law but also expressed some concerns. It has nevertheless issued a non-binding opinion recommending that the decisions be accepted. If accepted, the two adequacy decisions will run for an initial period of four years. More here.

Last month saw the ICO’s annual data protection conference go online due to the pandemic. Whilst not the same as a face-to-face conference, it was still a good event with lots of nuggets for data protection professionals, including the news that the ICO is working on bespoke UK standard contractual clauses (SCCs) for international data transfers. Deputy Commissioner Steve Wood said:

“I think we recognise that standard contractual clauses are one of the most heavily used transfer tools in the UK GDPR. We’ve always sought to help organisations use them effectively with our guidance. The ICO is working on bespoke UK standard clauses for international transfers, and we intend to go out for consultation on those in the summer. We’re also considering the value to the UK for us to recognise transfer tools from other countries, so standard data transfer agreements, so that would include the EU’s standard contractual clauses as well.”

Lloyd v Google 

The much-anticipated Supreme Court hearing in the case of Lloyd v Google LLC took place at the end of April. The case concerns the legality of Google’s collection and use of browser-generated data from more than 4 million iPhone users during 2011-12 without their consent. Following the two-day hearing, the Supreme Court will now decide, amongst other things, whether, under the DPA 1998, damages are recoverable for ‘loss of control’ of data without needing to identify any specific financial loss, and whether a claimant can bring a representative action on behalf of a group on the basis that the group have the ‘same interest’ in the claim and are identifiable. The decision is likely to have wide-ranging implications for representative actions, what damages can be awarded for and the level of damages in data protection cases. Watch this space!

Ticketmaster Appeal

In November 2020, the ICO fined Ticketmaster £1.25m for a breach of Articles 5(1)(f) and 32 GDPR (security). Ticketmaster appealed the penalty notice on the basis that there had been no breach of the GDPR; alternatively, that it was inappropriate to impose a penalty, and that in any event the sum was excessive. The appeal has now been stayed by the First-Tier Tribunal until 28 days after the pending judgment in a damages claim brought against Ticketmaster by 795 customers: Collins & Others v Ticketmaster UK Ltd (BL-2019-LIV-000007).

Age Appropriate Design Code

This code came into force on 2 September 2020, with a 12 month transition period. The Code sets out 15 standards organisations must meet to ensure that children’s data is protected online. It applies to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have the best possible access to online services whilst minimising data collection and use.

With less than four months to go (2 September 2021) the ICO is urging organisations and businesses to make the necessary changes to their online services and products. We are planning a webinar on the code. Get in touch if interested.

AI and Automated Decision Making

Article 22 of GDPR provides protection for individuals against purely automated decisions with a legal or significant impact. In February, the Court of Amsterdam ordered Uber, the ride-hailing app, to reinstate six drivers who it was claimed were unfairly dismissed “by algorithmic means.” The court also ordered Uber to pay compensation to the sacked drivers.

In April, the EU Commission published a proposal for a harmonised framework on AI. The framework seeks to impose obligations on both providers and users of AI. Like the GDPR, the proposal includes substantial fines and extra-territorial effect. (Readers may be interested in our new webinar on AI and Machine Learning.)

Publicly Available Information

Just because information is publicly available does not mean companies have a free pass to use it without consequences. Data protection laws still have to be complied with. In November 2020, the ICO ordered the credit reference agency Experian Limited to make fundamental changes to how it handles personal data within its direct marketing services. The ICO found that significant ‘invisible’ processing took place, likely affecting millions of adults in the UK. It is ‘invisible’ because the individual is not aware that the organisation is collecting and using their personal data. Experian has lodged an appeal against the Enforcement Notice.

Interestingly, the Spanish regulator recently fined another credit reference agency, Equifax, €1m for several failures under the GDPR. Individuals complained about Equifax’s use of their personal data, which was publicly available. Equifax had also failed to provide the individuals with a privacy notice.

Data Protection by Design

The Irish data protection regulator recently issued its largest domestic fine. The Irish Credit Bureau (ICB) was fined €90,000 after a change to its computer code in 2018 resulted in 15,000 accounts having incorrect details recorded about their loans before the mistake was noticed. Amongst other things, the decision found that the ICB infringed Article 25(1) of the GDPR by failing to implement appropriate technical and organisational measures designed to implement the principle of accuracy in an effective manner, and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects (i.e. data protection by design and by default).

Data Sharing 

The ICO’s Data Sharing Code of Practice provides organisations with a practical guide on how to share personal data in line with data protection law. Building on the code, the ICO recently outlined its plans to update its guidance on anonymisation and pseudonymisation, and to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing.

UK GDPR Handbook

The UK GDPR Handbook is proving very popular among data protection professionals.

It sets out the full text of the UK GDPR laid out in a clear and easy to read format. It cross references the EU GDPR recitals, which also now form part of the UK GDPR, allowing for a more logical reading. The handbook uses a unique colour coding system that allows users to easily identify amendments, insertions and deletions from the EU GDPR. Relevant provisions of the amended DPA 2018 have been included where they supplement the UK GDPR. To assist users in interpreting the legislation, guidance from the Information Commissioner’s Office, Article 29 Working Party and the European Data Protection Board is also signposted. Read what others have said:

“A very useful, timely, and professional handbook. Highly recommended.”

“What I’m liking so far is that this is “just” the text (beautifully collated together and cross-referenced Articles / Recital etc.), rather than a pundits interpretation of it (useful as those interpretations are on many occasions in other books).”

“Great resource, love the tabs. Logical and easy to follow.”

Order your copy here.

These and other GDPR developments will also be discussed in detail in our online GDPR update workshop next week.
