GDPR and AI: The Rise of the Machines

2023 is going to be the year of AI. In January, Microsoft announced a multibillion-dollar investment in OpenAI, the company behind the image generation tool DALL-E and the chatbot ChatGPT.

The term “artificial intelligence” or “AI” refers to the use of machines and computer systems that can perform tasks normally requiring human intelligence. The past few years have seen rapid progress in this field. Advancements in deep learning algorithms, cloud computing, and data storage have made it possible for machines to process and analyse large amounts of data quickly and accurately. AI’s ability to interpret human language means that virtual assistants, such as Siri and Alexa, can now understand and respond to complex spoken commands at lightning speed.

The public sector is increasingly leveraging the power of AI to undertake administrative tasks and create more tailored services that meet users’ needs. Local government is using AI to simplify staff scheduling, predict demand for services and even estimate the risk of an individual committing fraud. Healthcare providers are now able to provide automated diagnoses based on medical imaging data from patients, thereby reducing wait times.

The Risks

With any major technological advance there are potential risks and downsides. On Monday, ElevenLabs, an AI speech software company, said it had found an “increasing number of voice cloning misuse cases”. According to reports, hackers used the ElevenLabs software to create deepfake voices of famous people (including Emma Watson and Joe Rogan) making racist, transphobic and violent comments.

There are concerns about the impact of AI on employment and the future of work. In April 2021, the Court of Amsterdam ordered that Uber reinstate taxi drivers in the UK and Portugal who had been dismissed by “robo-firing”: the use of an algorithm to make a decision about dismissal with no human involvement. The Court concluded that Uber had made the decisions “based solely on automated processing” within the meaning of Article 22(1) of the GDPR. It was ordered to reinstate the drivers’ accounts and pay them compensation.

As well as ethical questions about the use of AI in decision-making processes that affect people’s lives, AI-driven algorithms may lead to unintended biases or inaccurate decisions if not properly monitored and regulated. In 2021 the privacy pressure group, NOYB, filed a GDPR complaint against Amazon, claiming that Amazon’s algorithm discriminates against some customers by denying them the opportunity to pay for items by monthly invoice.

There is also a risk that AI is deployed without consideration of the privacy implications. In May 2022, the UK Information Commissioner’s Office fined Clearview AI Inc more than £7.5 million for breaches of GDPR. Clearview’s online database contains 20 billion images of people’s faces and data scraped from publicly available information on the internet and social media platforms all over the world. The company, which describes itself as the “World’s Largest Facial Network”, allows customers, including the police, to upload an image of a person to its app, which then uses AI to check it against all the images in the Clearview database. The app then provides a list of matching images with links to the websites they came from.

Practical Steps

Recently, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants are subjected to any harms or financial detriment as a result of the use of algorithms. It did, however, emphasise a number of practical steps that local authorities and central government can take when using algorithms or AI:

1. Take a data protection by design and default approach

Data processed using algorithms, data analytics or similar systems should be proactively and reactively reviewed to ensure it is accurate and up to date. If a local authority decides to engage a third party to process personal data using algorithms, data analytics or AI, it is responsible for assessing that the third party is competent to process personal data in line with the UK GDPR.

2. Be transparent with people about how you are using their data

Local authorities should regularly review their privacy policies, to ensure they comply with Articles 13 and 14, and identify areas for improvement. They should also bring any new uses of individuals’ personal data to their attention.

3. Identify the potential risks to people’s privacy

Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. 

In April 2021, the European Commission presented its proposal for a Regulation to harmonise rules for AI, also known as the “AI Act of the European Union”. Whilst there is still a long way to go before this proposal becomes legislation, it could create an impetus for the UK to further regulate AI.

Use of AI has enormous benefits. It does, though, have the potential to adversely impact people’s lives and deny them their fundamental rights. As such, understanding the implications of AI technology, and how to use it in a fair and lawful manner, is critical for data protection and information governance officers.

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents, focussing on GDPR as well as other information governance and records management issues.

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

UK GDPR Reform: Will There Be A New Consultation?

What is happening with the Government’s proposal for UK GDPR reform? Just like Donald Trump’s predicted “Red Wave” in the US Mid Term Elections, it’s turning out to be a bit of a ripple!

In July, the Boris Johnson Government published the Data Protection and Digital Information Bill. This was supposed to be the next step in its much publicised plans to reform the UK Data Protection regime following Brexit. The government projected it would yield savings for businesses of £1 billion over ten years. (Key provisions of the bill are summarised in our blog post here.)

On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”.

The Bill’s passage through Parliament was suspended. It seemed that drafters would have to go back to the drawing board to showcase even more “Brexit benefits”. There was even talk of another round of consultation. Remember, the Bill is the result of an extensive consultation launched in September 2021 (“Data: A New Direction”).

Last week, Ibrahim Hasan attended the IAPP Conference in Brussels. Owen Rowland, Deputy Director at the DCMS, told the conference that the latest “consultation” on the stalled Bill will begin shortly. However, he confirmed it will not be a full-blown public consultation:

“It’s important to clarify (the type of consultation). However, we are genuinely interested in continuing to engage with the whole range of stakeholders. Different business sectors as well as privacy and consumer groups,” Rowland said. “We’ll be providing details in the next couple of weeks in terms of the opportunities that we are going to particularly set up.”

The Bill may not receive a deep overhaul, but Rowland said he welcomes comments that potentially raise “amendments to (the existing proposal’s) text that we should make.” He added the consultation is being launched to avoid “a real risk” of missing important points and to provide “opportunities we’re not fully utilising” to gain stakeholder insights.

Rowland went on to suggest that the DCMS would conduct some roundtables. If any of our readers are invited to the aforementioned tables (round or otherwise) do keep us posted. Will it make a difference to the content of the bill? We are sceptical but time will tell. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

Ronaldo’s Data and GDPR: Who said data protection is boring?

There is an interesting story this morning on the BBC website about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.  

Now 850 players (Ed – I don’t know if Ronaldo is one of them but I could not miss the chance to mention my favourite footballer!), led by former Cardiff City manager Russell Slade, want compensation for the trading of their performance data over the past six years by various companies. They also want an annual fee from the companies for any future use. The data ranges from average goals-per-game for an outfield player to height, weight and passes during a game. 

BBC News says that an initial 17 major betting, entertainment and data collection firms have been targeted, but Slade’s Global Sports Data and Technology Group has highlighted more than 150 targets it believes have “misused” data. His legal team claim that the fact players receive no payment for the unlicensed use of their data contravenes the General Data Protection Regulation (GDPR). However, the precise legal basis of their claim is unclear. 

In an interview with the BBC, Slade said:

“There are companies that are taking that data and processing that data without the individual consent of that player.”

This suggests a claim for breach of the First Data Protection Principle (Lawfulness and Transparency). However, if the players’ personal data is provided by their clubs e.g., height, weight, performance at training sessions etc. then it may be that players have already consented (and been recompensed for this) as part of their player contract. In any event, Data Protection professionals will know that consent is only one way in which a Data Controller can justify the processing of personal data under Article 6 of GDPR. Article 6(1)(f) allows processing where it:

“is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data… .”

Of course, this requires a balancing exercise considering the interests pursued by the clubs and data companies and the impact on individual players’ privacy. Some would argue that as far as public domain information is concerned, the impact on players’ privacy is minimal. However, “the interests or fundamental rights and freedoms of the data subject” also include reputational damage, loss of control and financial loss, all of which it could be argued result from the alleged unauthorised use of data.

The BBC article quotes former Wales international Dave Edwards, one of the players behind the move:

“The more I’ve looked into it and you see how our data is used, the amount of channels its passed through, all the different organisations which use it, I feel as a player we should have a say on who is allowed to use it.”

The above seems to suggest that the players’ argument is also about control of their personal data. The GDPR does give players rights over their data, which allow them to exercise some element of control, including the right to see what data is held about them, to object to its processing and to ask for it to be deleted. It may be that players are exercising, or attempting to exercise, these rights in order to exert pressure on the companies to compensate them.

Without seeing the paperwork, including the letters before action which have been served on the companies, we can only speculate about the basis of the claim at this stage. Nonetheless, this is an interesting case and one to watch. If the claim succeeds, the effects could be far-reaching, extending well beyond football. Whatever happens, it will get data protection talked about on the terraces!

Ibrahim Hasan, solicitor and director of Act Now Training, has given an interview to BBC Radio 4’s PM programme about this story. You can listen again here (from 39 minutes onwards).

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Government Consultation: Are you ready for UK GDPR 2.0?

On 10 September 2021, the UK Government launched a consultation entitled “Data: A new direction” intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” Cynics will say that it is an attempt to water down the UK GDPR just a few months after the UK received adequacy status from the European Union. 

Back in May, the Prime Ministerial Taskforce on Innovation, Growth, and Regulatory Reform (TIGRR) published a 130-page report setting out a “new regulatory framework” for the UK. Saying that the current data protection regime contained too many onerous compliance requirements, it suggested that the government:

“Replace the UK GDPR with a new, more proportionate, UK Framework of Citizen Data Rights to give people greater control of their data while allowing data to flow more freely and drive growth across healthcare, public services and the digital economy.” 

Many of the recommendations made in the TIGRR Report can be found in the latest consultation document:

Research and Re-Use of Data

  • Consolidating and bringing together research-specific provisions in the UK GDPR, “bringing greater clarity to the range of relevant provisions and how they relate to each other.” 
  • Incorporating a clearer definition of “scientific research” into the legislation. 
  • Clarifying in legislation how university research projects can rely on tasks in the public interest (Article 6(1)(e) of the UK GDPR) as a lawful ground for personal data processing. 
  • Creating a new, separate lawful ground for research, subject to suitable safeguards. 
  • Clarifying in legislation that data subjects should be allowed to give their consent to broader areas of scientific research when it is not possible to fully identify the purpose of personal data processing at the time of data collection.
  • Stating explicitly that the further use of data for research purposes is both always compatible with the original purpose and lawful under Article 6(1) of the UK GDPR. 
  • Replicating the Article 14(5)(b) exemption (disproportionate effort) in Article 13 (privacy notice), limited only to controllers processing personal data for research purposes.
  • Amending the law to facilitate innovative re-use of data for different purposes and by different data controllers.
  • Creating a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test “in order to give them more confidence to process personal data without unnecessary recourse to consent.” 

AI, Machine Learning and Automated Decision Making

  • Stipulating that processing personal data for the purposes of ensuring bias monitoring, detection and correction in relation to AI systems constitutes a legitimate interest in the terms of Article 6(1)(f) for which the balancing test is not required. 
  • Enabling organisations to use personal data and sensitive personal data for the purpose of managing the risk of bias in their AI systems by amending/clarifying the legitimate interests ground under Art 6 and clarifying/amending schedule 1 of the DPA 2018 (Special Category Data Processing).
  • Removing Article 22 of UK GDPR (the right not to be subject to a decision resulting from solely automated processing if that decision has significant effects on the individual) and permitting solely automated decision making subject to compliance with the rest of the data protection legislation. 

Accountability

  • Allowing data controllers to implement a more flexible and risk-based accountability framework, based on privacy management programmes, that reflects the volume and sensitivity of the personal information they handle and the type(s) of data processing they carry out. 
  • To support the implementation of the new accountability framework, the government intends to remove:
    • The requirement to consult the ICO in relation to high-risk personal data processing where the risk cannot be mitigated (Article 36)
    • The record-keeping requirements under Article 30
    • The requirement to report a data breach where the risk to individuals is “not material”
  • Introducing a new voluntary undertakings process. 

International Transfers

  • Adding more countries to the adequate list by “progressing an ambitious programme of adequacy assessments.”
  • Introducing easier and additional international transfer mechanisms.
  • Allowing repetitive use of Article 49 derogations.

PECR and Marketing 

  • Permitting organisations to use analytics cookies and similar technologies without the users’ consent. 
  • Permitting organisations to store information on, or collect information from, a user’s device without their consent for other limited purposes.
  • Extending “the soft opt-in” to electronic communications from organisations other than businesses where they have previously formed a relationship with the person, perhaps as a result of membership or subscription. 
  • Making it easier for political parties to use data for “political engagement”.
  • Increasing the fines that can be imposed under PECR to GDPR levels.

Other Proposals

  • Including “a clear test for determining when data will be regarded as anonymous” within the UK GDPR.
  • Introducing a fee regime (similar to that in the Freedom of Information Act 2000) for access to personal data held by all data controllers. 
  • Requiring the ICO to consider not just data protection but also “growth and innovation” as well as competition.

Businesses may welcome many of these proposals, which they might see as limiting the administrative burden of the current data protection regime, particularly around reporting data breaches and conducting DPIAs. The Government also seems intent on liberalising access to data, to generate a broader market for it; this will suit the commercial interests of big business, but at what privacy cost? The consultation runs until 19 November 2021.

What are your thoughts? Let us know in the comment field.

Our GDPR Practitioner Certificate is our most popular certificate course, available both online and in the classroom. We have added more dates.

Act Now Associate Appointed to Judicial Position

Act Now Training would like to congratulate Susan Wolf our senior associate, who has been appointed as a Fee Paid Member of the Upper Tribunal assigned to the Administrative Appeals Chamber (Information Rights Jurisdiction) and First Tier Tribunal General Regulatory Chamber (Information Rights Jurisdiction). 

We are delighted that Susan will continue in her current position at Act Now Training, delivering our full range of online and classroom-based workshops. Susan also writes for our information law blog and has developed our very popular FOI Practitioner Certificate.

Prior to joining us, Susan taught information rights practitioners on the LLM in Information Rights Law at Northumbria University. She has also taught and presented workshops on FOI, EIR and access to EU information in Germany, the Czech Republic and throughout the UK. 

Commenting on Susan’s appointment, Ibrahim Hasan, Director of Act Now Training, said: 

“I am delighted that Susan’s expertise as an information rights lawyer has been recognised through this judicial appointment. I am sure that she will use her fantastic skills and experience to the benefit of her new role.”
