Oral Disclosure of Personal Data: To GDPR or not to GDPR? 

Here’s a pub quiz question for you: “Can a Data Controller circumvent the requirements of data protection law by disclosing personal data verbally rather than in writing?” The answer was “yes” under the old Data Protection Act 1998.
In Scott v LGBT Foundation Ltd [2020] 4 WLR 62, the High Court rejected a claim that the LGBT Foundation had breached, amongst other things, the claimant’s data protection rights by disclosing information about him to a GP. The court held that the 1998 Act did not apply to purely verbal communications.

Nowadays though, the answer to the above question is “no”; the oral disclosure of personal data amounts to “processing” as defined by Article 4(2) of the GDPR.
So said the Court of Justice of the European Union (CJEU) on 7th March 2024, in a preliminary ruling in Endemol Shine Finland (Case C-740/22).

The subject of the ruling is a television company which makes a number of reality TV shows in Finland. It had been organising a competition and asked the District Court of South Savo for information about possible criminal proceedings involving one of the competition participants. It requested that the District Court disclose the information orally rather than in writing. The District Court refused the request on the basis that there was no legitimate reason for processing the criminal offence data under the Finnish law implementing Article 10 of the GDPR.
On appeal, Endemol Shine Finland argued that the GDPR did not apply, as the oral disclosure of the information would not constitute processing of personal data.

Article 4(2) GDPR defines “processing” as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means”. On the face of it, this covers oral processing. However, Article 2 states that the GDPR applies to the processing of personal data “wholly or partly by automated means”, and to processing by non-automated means which “forms or is intended to form part of a filing system”. Article 4(6) GDPR defines “filing system” broadly, covering “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis”. 

The Finnish Court of Appeal requested a preliminary ruling from the CJEU on the meaning of Article 4(2) and whether the particular processing in this case came within the material scope of the GDPR under Article 2. The CJEU held that the concept of processing in Article 4(2) of the GDPR necessarily covered the oral disclosure of personal data. It said the wording of the Article made it apparent that the EU legislature intended to give the concept of processing a broad scope. The court pointed out that the GDPR’s objective was “to ensure a high level of protection of the fundamental rights and freedoms of natural persons” and that “circumventing the application of that regulation by disclosing personal data orally rather than in writing would be manifestly incompatible with that objective”. 

The CJEU went on to consider whether the oral processing of the data would fall within the material scope of the GDPR under Article 2. It held that it was clear from the request for a preliminary ruling that the personal data sought from the District Court of South Savo was contained in “a court’s register of persons”, which appeared to be a filing system within the meaning of Article 4(6), and therefore fell within the scope of the GDPR. 

UK Data Controllers should note that the wording of Articles 2 and 4 of the UK GDPR is the same as in the EU GDPR. So whilst this ruling from the CJEU is not binding on UK courts, it would be wise to assume that picking up the phone and making an oral disclosure of personal data will not allow the UK GDPR to be circumvented.

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 


Transparency in Health and Social Care: New ICO Guidance 

Within the health and social care sector, new technologies that use large amounts of personal data are being used to support both direct care and secondary purposes, such as planning and research. An example is the use of AI to provide automated diagnoses based on medical imaging data from patients. 

Transparency is a key principle of UK data protection legislation. Compliance with the first data protection principle and Articles 13 and 14 of the UK GDPR ensures that data subjects are aware of how their personal data is used, allowing them to make informed choices about who they disclose their data to and how to exercise their data rights. 

On Monday the Information Commissioner’s Office (ICO) published new guidance to assist health and social care organisations in complying with their transparency obligations under the UK GDPR. It supplements existing ICO guidance on the principle of transparency and the right to be informed.

The guidance is aimed at all organisations, including those from the private and third sectors, that deliver health and social care services or process health and social care information. This includes local authorities, suppliers to the health and social care sector, universities using health information for research purposes and others (e.g. fire service, police and education) that use health information for their own purposes. The guidance will help them to understand the definition of transparency and assess appropriate levels of transparency, as well as providing practical steps for developing effective transparency information. 

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.   

GDPR and Privacy: Balancing Consent and Medical Need 

The recent High Court ruling in Wessex Fertility Limited v University Hospital Southampton NHS Foundation Trust, Human Fertilisation and Embryology Authority and Donor Conception Network (2024) answers the question of when an express refusal of consent can be overridden because of a medical need. 

The claimant in this case, Wessex Fertility Limited (“the Clinic”), sought declarations that it would be lawful for it to request that an egg donor provide a DNA sample for the purposes of genetic analysis, and that the processing of the donor’s personal data involved in making this request would be lawful under the GDPR. The first part required the court to analyse the Human Fertilisation and Embryology Act 1990 (as amended). Data protection professionals will be interested in the second part, which requires consideration of Article 8 of the ECHR (the right to privacy) and the GDPR.  

Background 

The Clinic has been licensed to offer fertility treatment and related services since 1992. It treated a couple, Mr and Mrs H, using eggs donated by Donor A and Mr H’s sperm, resulting in the birth of a baby girl, AH. Sadly, AH was born with a number of health problems including polydactyly, a cardiac abnormality and gross motor delay.
In order to best treat AH, the Clinic needed to understand the genetic cause of her health problems. It wished to contact Donor A to request that she provide a DNA sample, which would be used to carry out genetic analysis with the aim of establishing a genetic diagnosis of AH’s condition. 

Consent Wording 

When Donor A donated her eggs, she was asked by the Clinic to complete a number of consent forms, one of which had the following statement: 

“I/we wish to be notified if Wessex Fertility learns (e.g., through the birth of an affected child) that I have a previously unsuspected genetic disease, or that I am a carrier of a harmful inherited condition.”  

Beside this part of the form there is a box to tick ‘yes’ or ‘no’. Donor A, who completed this form in advance of her first egg collection for use by Mr and Mrs H, ticked the box to state that she did not wish to be notified in the event of such a discovery.
Consistent with this, Donor A also completed a form in the same way at the time of her first egg collection for use with a different recipient couple, and another when she returned for her second egg collection the following year. 

The judge considered whether Donor A, having specifically refused consent to be informed of any genetic conditions she might have, could be asked to provide DNA for genetic analysis. 

The Law 

The judge first applied Article 8 of the ECHR (the right to privacy), closely analysing the wording of the consent questions which had been posed to Donor A.
He considered that the question to which Donor A answered “no”, in respect of being informed of genetic conditions, was not drafted in a way which envisaged the scenario that had now arisen. Accordingly, it was possible to say that Donor A had not refused consent for a matter such as this. 

In concluding that any interference with Donor A’s Article 8 rights would be justified and proportionate if the court made the declaration requested, the judge took account of, amongst other things, the obvious benefit to AH, as the genetic analysis may provide clarity about her diagnosis and/or treatment options in the widest sense. There may also be a wider benefit to others who may have been conceived, or may be conceived in the future, using Donor A’s eggs. 

The judge went on to consider whether the processing of Donor A’s personal data would be lawful under the GDPR, concluding that there was a lawful basis under Article 6: 

“It is clear the ‘legitimate interest’ under article 6(1)(f) is to enable trio testing to increase the chances of a diagnosis for AH and/or the provision of the correct treatment. Whilst it is recognised Donor A may not consent to providing DNA there is still a need to request it and all other steps short of making the request have been undertaken.” 

The Clinic would also be processing Donor A’s health data, which is Special Category Data and thus requires an additional condition under Article 9. The judge said: 

“The processing under article 9(2)(h) is lawful as it is necessary for the purposes of AH’s diagnosis and/or provision of treatment. Professor Lucassen’s evidence is clear about the benefits of trio testing, such a request will be made by a health professional under obligation of secrecy. The request will not be incompatible with the purposes for which the details were collected, namely, to enable the Clinic to contact Donor A as and when required, which would include in accordance with any declarations made by this Court.” 

This case shows the importance of ensuring that consent statements are carefully drafted. When it comes to what the data subject did or did not consent to, the precise wording of the consent statement will be carefully scrutinised by the courts.  

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

DP Bill: Updated Keeling Schedules Published 

The Data Protection and Digital Information Bill is currently at the Committee stage in the House of Lords. If passed, it will make changes to UK data protection legislation, including the UK GDPR.

The Government recently published updated versions of Keeling Schedules showing potential changes to the UK GDPR, the Data Protection Act 2018 and Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

Whilst there will no doubt be further amendments, the schedules are worth studying for a clear picture of the impact of the Bill. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. 

Apprentice Case Study – Meet Natasha

In 2022, Act Now Training teamed up with Damar to deliver the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme, with the first cohort about to undertake the end-point assessment.

Data Protection and Information Governance Apprentice, Natasha Lock, is an integral part of the Governance and Compliance team at the University of Lincoln. With the Data Protection and Digital Information (No.2) Bill set to make changes to the UK data protection regime, Natasha talks to us about why this is a great area to work in and how the apprenticeship route has been particularly beneficial for her.

How did you get onto the apprenticeship?

“I was already working at the university as an Information Compliance Officer when the opportunity for a staff apprenticeship came up.

“The process was swift and straightforward, and I was enrolled on the Data Protection and Information Governance Apprenticeship within three months of enquiring.”

How has the apprenticeship helped you?

“I started with a good understanding of the UK Data Protection legislation but my knowledge has grown significantly, and now I’m coming to the end of my level 4 apprenticeship, I’ve gained so much more insight and my confidence has grown.

“As a university, we hold vast amounts of data. My apprenticeship is allowing me to solve the challenge of data retention and implement better measures to retain, destroy and archive information. I have developed a greater understanding of the legislative requirements we must adhere to as a public sector institute and how to reduce and assess data protection risks.

“I love the fact that I can study whilst still doing my job. The flexibility works for me because I can go through course materials at my own pace. I really feel like I have a brilliant work/life/study balance.

“The University of Lincoln and Damar Training have been fantastic in supporting me. I get along with my coach, Tracey, so well. She is very friendly and personable and has enabled my creativity to flow.

“The course is very interactive, and I’ve found the forums with other apprentices to be a very useful way of sharing knowledge, ideas and stories.

“I’m enjoying it so much and people have noticed that my confidence has grown. I wouldn’t have had that without doing this apprenticeship. I’ve now got my sights on doing a law degree or law apprenticeship in the future.”

Abi Slater, Information Compliance Manager at the University of Lincoln, said: “It has been great to see how much Natasha has developed over the course of the apprenticeship. I believe the apprenticeship has provided Natasha with the knowledge and skills required to advance in her data protection career and the support from her coach at Damar Training has been excellent.

“I would encourage anyone with an interest in data protection and information governance to consider this apprenticeship.”

Tracey Coetzee, Coach at Damar Training, said: “The Data Protection and Information Governance Apprenticeship was only approved by the Institute for Apprenticeships in 2022, and it’s delightful to see apprentices flourishing on the programme.

“From cyber security to managing data protection risks, this programme is upskilling participants and adding value to both private and public sector organisations and we’re thrilled to see the first cohort, including Natasha, approach the completion of their training.”

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

EU AI Act Approved by European Parliament  

On Wednesday 13th March 2024, the European Parliament approved the text of the harmonised rules on artificial intelligence, the so-called “Artificial Intelligence Act” (AI Act). Agreed upon in negotiations with member states in December 2023, the Act was endorsed by MEPs with 523 votes in favour, 46 against and 49 abstentions. It aims to “protect fundamental rights, democracy, the rule of law and environmental sustainability from high-risk AI, while boosting innovation and establishing Europe as a leader in the field.” Despite Brexit, UK businesses and entities engaged in AI-related activities will still be affected by the Act if they intend to operate within the EU market. The Act will have extra-territorial reach, just like the EU GDPR. 

The main provisions of the Act can be read here. In summary, the Act sets out comprehensive rules for AI applications, including a risk-based system to address potential threats to health, safety and human rights. The Act will ban some AI applications which pose an “unacceptable risk”, such as “real-time” remote biometric identification systems like facial recognition, and impose strict obligations on others considered “high risk”, such as AI used in EU-regulated product safety categories including cars and medical devices. These obligations include adherence to data governance standards, transparency rules, and the incorporation of human oversight mechanisms.  

Next steps 

The Act is still subject to a final lawyer-linguist check and is expected to be finally adopted before the end of the legislature (through the so-called corrigendum procedure). It also needs to be formally endorsed by the Council of the European Union. 

The Act will enter into force twenty days after its publication in the Official Journal and be fully applicable 24 months after its entry into force, except for: 

  • bans on prohibited practices, which will apply six months after entry into force; 
  • codes of practice (nine months after entry into force); 
  • general-purpose AI rules, including governance (12 months after entry into force); and 
  • obligations for high-risk systems (36 months after entry into force). 

Influence on UK AI Regulation 

The EU’s regulatory approach will impact the UK Government’s decisions on AI governance. An AI White Paper, entitled “A pro-innovation approach to AI regulation”, was published in March last year. The paper sets out the UK’s preference not to place AI regulation on a statutory footing but to make use of “regulators’ domain-specific expertise to tailor the implementation of the principles to the specific context in which AI is used.” In January 2024, the ICO launched a consultation series on Generative AI, examining how aspects of data protection law should apply to the development and use of the technology. It is expected to issue more AI guidance later in 2024. 

By attending our new AI Act workshop, you will understand the new law in detail and its interaction with the UK’s objectives and strategy for AI regulation.  

Electronic Tagging of Migrants: Enforcement Notice Published by ICO

On 1st March 2024, the Information Commissioner’s Office (ICO) announced that it had issued an Enforcement Notice and warning to the Home Office for failing to sufficiently assess the privacy risks posed by the electronic monitoring of people arriving in the UK via unauthorised means. (Strangely, the actual text of the notice and warning was only recently published, three weeks after the ICO press release.)
The decision comes as a result of Privacy International’s complaint (filed in August 2022) against the Home Office policy. The civil liberties pressure group alleged widespread and significant breaches of privacy and data protection law.  

The ICO had been in discussion with the Home Office since August 2022 on its pilot to place ankle tags on, and track the GPS location of, up to 600 migrants who arrived in the UK and were on immigration bail. The purpose of the pilot was to test whether electronic monitoring is an effective way to maintain regular contact with asylum claimants, while reducing the risk of absconding, and to establish whether it is an effective alternative to detention. 

The ICO found the Home Office failed to conduct a Data Protection Impact Assessment (DPIA), in relation to the pilot, which satisfied the requirements of Article 35 of the UK GDPR. Amongst other things, the Home Office had failed to sufficiently assess the privacy intrusion of the continuous collection of people’s location information.
It was also found to have breached the Accountability Principle (Article 5(2)) by failing to demonstrate its compliance with Article 5(1), in particular:  

  • Article 5(1)(a) Lawfulness: the Home Office identified the lawful basis for the processing as Article 6(1)(e), and for Special Category Data as Article 9(2)(g) and Schedule 1, paragraph 6 of the DPA 2018. However, it did not demonstrate that the processing was necessary and proportionate for these purposes (neither in its DPIA nor its staff guidance), including why less privacy-intrusive methods could not meet its objectives.  
  • Article 5(1)(a) Fairness and Transparency: the Home Office’s privacy notice(s) did not demonstrate compliance with minimum transparency requirements, as set out in Articles 12 and 13. It failed to provide clear and easily accessible information to the people being tagged about what personal information is being collected, how it will be used, how long it will be kept for, and who it will be shared with. The privacy information was not set out clearly in one place, was inconsistent and there were information gaps. 
  • Article 5(1)(c) Data Minimisation: the Home Office’s draft DPIA and guidance for staff did not demonstrate that data minimisation would be considered and actioned when requesting access to the personal data produced by the electronic tags. 

John Edwards, the Information Commissioner, said: 

“It’s crucial that the UK has appropriate checks and balances in place to ensure people’s information rights are respected and safeguarded. This is even more important for those who might not even be aware that they have those rights. 

“This action is a warning to any organisation planning to monitor people electronically – you must be able to prove the necessity and proportionality of tracking people’s movements, taking into consideration people’s vulnerabilities and how such processing could put them at risk of further harm. This must be done from the outset, not as an afterthought.” 

The Enforcement Notice orders the Home Office to update its internal policies, access guidance and privacy information in relation to the data retained from the pilot scheme. The ICO has also issued a formal warning stating that any future processing by the Home Office on the same basis will be in breach of data protection law and will attract enforcement action.  

Surveillance is a hot topic for the ICO at present. Last month, the ICO issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance.  

The Enforcement Notice and warning are important reading for anyone who wishes to understand how to complete a compliant and meaningful DPIA. The Data Protection and Digital Information Bill is currently at the Committee stage in the House of Lords. Amongst other things, the DPIA provisions in the UK GDPR will be replaced by leaner and less prescriptive “Assessments of High-Risk Processing”.  

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Kate Middleton’s Medical Records: Can anyone go to jail?

Kate Middleton seems to be at the centre of much media (and social media) attention at present. In addition to speculation about her health and whereabouts, there has been much focus on, and analysis of, the now famous photoshopped Mother’s Day photo.

This week it was reported that employees at the private London Clinic in Marylebone, where Kate was a patient following abdominal surgery in January, attempted to view her medical records. Reportedly three employees have now been suspended.

The Health Minister, Maria Caulfield, told Sky News it was “pretty severe and serious stuff to be accessing notes that you don’t have permission to”. She also said police had been “asked to look at” whether staff at the clinic attempted to access the princess’s private medical records. 

If the reports are true and individuals are proven to have been “snooping”, what are the consequences? Firstly, this would normally be a matter for the Information Commissioner’s Office (ICO) to investigate rather than the police. Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly: 

(a) obtain or disclose personal data without the consent of the controller, 

(b) procure the disclosure of personal data to another person without the consent of the controller, or 

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained. 

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998 which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. In June 2023, the ICO disclosed that since 1st June 2018, 92 cases involving Section 170 offences were investigated by its Criminal Investigations Team.   

Consequences 

Section 170 is only punishable by way of a fine; perpetrators cannot be sent to prison. Although there is now no cap on the maximum fine, prosecutions have resulted in relatively low fines compared to the harm caused.  

Two recent prosecutions have involved employees accessing medical records.
In November 2023, Loretta Alborghetti was fined for illegally accessing the medical records of over 150 people. The offence took place whilst she worked as a medical secretary at Worcestershire Acute Hospitals NHS Trust. She was ordered to pay a total of £648. 

In August 2022, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust, pleaded guilty to accessing medical records of patients without a valid legal reason. An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off from going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000. 

Computer Misuse Act  

A Section 170 prosecution would have a much greater deterrent effect if the sanctions included a custodial sentence. Successive Information Commissioners have argued for this but to no avail.  

The relatively low fines have led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990, which carries tougher sentences including a maximum of two years’ imprisonment on indictment. In July 2022, a woman who worked for Cheshire Police pleaded guilty to using police data systems to check up on ex-partners, and in August the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of people’s personal information from vehicle repair garages to generate potential leads for personal injury claims. 

The ICO has now confirmed that it has received a personal data breach report from The London Clinic. If its investigation finds the Clinic did not comply with its security obligations under Article 5(1)(f) and Article 32 of the UK GDPR, it faces a possible maximum Monetary Penalty Notice of £17.5m or 4% of gross annual turnover (whichever is higher). This case highlights the importance of organisations ensuring adequate security measures around sensitive personal data, especially where the data relates to high-profile individuals.  
 
This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next GDPR Practitioner Certificate. 

Conservative Party Challenged Over “Data Harvesting” 

In the run up to the General Election this year, political parties in the UK will face the challenge of effectively communicating their message to voters whilst at the same time respecting voters’ privacy. In the past few years, all parties have been accused of riding roughshod over data protection laws in their attempts to convince voters that they ‘have a plan’ or that ‘the country needs change’.  

In May 2017, the Information Commissioner’s Office (ICO) announced that it was launching a formal investigation into the use of data analytics for political purposes after allegations were made about the ‘invisible processing’ of people’s personal data and the micro-targeting of political adverts during the EU Referendum.
This culminated in a report to Parliament and enforcement action against Facebook, Emma’s Diary and some of the companies involved in the Vote Leave Campaign.  

In July 2018 the ICO published a report, Democracy Disrupted, which highlighted significant concerns about transparency around how people’s data was being used in political campaigning. The report revealed a complex ecosystem of digital campaigning with many actors. In 2019, the ICO issued assessment notices to seven political parties. It concluded: 

“The audits found some considerable areas for improvement in both transparency and lawfulness and we recommended several specific actions to bring the parties’ processing in compliance with data protection laws. In addition, we recommended that the parties implemented several appropriate technical and organisational measures to meet the requirements of accountability. Overall there was a limited level of assurance that processes and procedures were in place and were delivering data protection compliance.” 

In June 2021, the Conservative Party was fined £10,000 for sending marketing emails to 51 people who did not want to receive them. The messages were sent in the name of Boris Johnson in the eight days after he became Prime Minister in breach of the Privacy and Electronic Communications Regulations 2003 (PECR).  

The Tax Calculator 

The Good Law Project (GLP), a not-for-profit campaign organisation, is now challenging one aspect of the Conservative Party’s data collection practices. The party’s website contains an online tool which allows an individual to calculate the effect on them of recent changes to National Insurance contributions. However, GLP claims this tool is “a simple data-harvesting exercise” which breaches UK data protection laws in a number of ways. It says that a visit to the website automatically leads to the placement of non-essential cookies (related to marketing, analysis and browser tracking) on the visitor’s machine without consent, in breach of Regulation 6 of PECR.
GLP also challenges the gathering and use of website visitors’ personal data on the site, claiming that (amongst other things) it is neither fair, lawful nor transparent and is thus a breach of the UK GDPR. 
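
For readers on the technical side, the pattern Regulation 6 of PECR requires is straightforward to illustrate: non-essential cookies should only be set once the visitor has opted in. Below is a minimal TypeScript sketch of such consent-gating, assuming a browser environment; the function, categories and cookie names are all hypothetical and are not drawn from the website in question.

type CookieCategory = "strictlyNecessary" | "analytics" | "marketing";

// Categories the visitor has consented to. Before any interaction with a
// consent banner, only the exempt "strictly necessary" category is present.
const consentedCategories = new Set<CookieCategory>(["strictlyNecessary"]);

function setCookie(name: string, value: string, category: CookieCategory): void {
  // Regulation 6(4) of PECR exempts storage that is strictly necessary to
  // provide the service the visitor has requested; everything else needs consent.
  if (category !== "strictlyNecessary" && !consentedCategories.has(category)) {
    return; // no consent recorded for this category, so the cookie is never set
  }
  document.cookie = `${name}=${encodeURIComponent(value)}; path=/; SameSite=Lax`;
}

// A session cookie can be set immediately; the analytics cookie is silently
// skipped until the visitor opts in via the consent banner.
setCookie("session_id", "abc123", "strictlyNecessary");
setCookie("_analytics_id", "xyz789", "analytics");

The key point, and the one GLP says the party’s site fails on, is the default: nothing beyond the strictly necessary category should be written before consent is given.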

Director of GLP, Jo Maugham, has taken the first formal step in legal proceedings against the Conservative Party. The full proposed claim is set out in the GLP’s Letter Before Action. The Conservative Party has issued a response arguing that they have acted lawfully and that: 

  • They did obtain consent for the placement of cookies. (GLP disagrees and has now made a 15-page complaint to the ICO.) 
  • They have agreed to change their privacy notice. (GLP is considering whether to ask the court to make a declaration of illegality, claiming that the Tories “have stated publicly that it was lawful while tacitly admitting in private that it is not.”) 
  • They have agreed to the request by GLP to stop processing Jo Maugham’s personal data where that processing reveals his political opinions.  

Following a subject access request, Mr Maugham received 1,384 pages of personal data held about him. GLP claims he is being profiled and believes that such profiling is unlawful. It has instructed barristers with a view to taking legal action.

This is one to watch. If the legal action goes ahead, the result will have implications for other political parties. In any event, in an election year, we are already seeing that all political parties’ data handling practices are going to be under the spotlight.

George Galloway

George Galloway’s landslide win in the Rochdale by-election last week has led to scrutiny of his party’s processing of Muslim voters’ data. In his blog post, Jon Baines discusses whether the Workers Party of Britain (led by Mr Galloway) has been processing Special Category Data in breach of the UK GDPR. In the run up to the by-election, the party had sent different letters to constituents based, it seems, on their religion (or perhaps inferring their religion from their names). If this is what it did then, even if the inference is wrong, the party has been processing Special Category Data, which requires a lawful basis under Article 9 of the UK GDPR.
In 2022, the ICO issued a fine of £1,350,000 to Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical conditions and then, without their consent, targeting them with health-related products. Following the lodging of an appeal by Easylife, the ICO later reduced the fine to £250,000, but the legal basis of the decision still stands. Will the ICO investigate George Galloway?

The DP Bill

The Data Protection and Digital Information (No.2) Bill is currently at the Committee stage in the House of Lords. It will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). Some of the changes will make it easier for political parties to use the personal data of voters and potential voters without the usual GDPR safeguards.
For example, political parties could, in the future, rely on “legitimate interests” (as an Article 6 lawful basis) to process personal data without the requirement to conduct a balancing test against the rights and freedoms of data subjects, where those legitimate interests are “recognised”. These include personal data being processed for the purpose of “democratic engagement”. The Bill will also amend PECR so that political parties will be able to rely on the “soft opt-in” for direct marketing purposes if they have obtained contact details from an individual expressing interest.

As the General Election approaches, and with trust in politics and politicians at a low, all parties need to ensure that they are open, transparent and accountable about how they use voters’ data.  

Our workshop, How to do Marketing in Compliance with GDPR and PECR, is suitable for those advising political parties and any organisation which uses personal data to reach out to potential customers and service users. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.

Act Now Nominated for IRMS Supplier of the Year Award 

Act Now Training is pleased to announce that it has been nominated once again for the 2024 Information and Records Management Society (IRMS) awards. 

Each year the IRMS recognises excellence in the field of information management with their prestigious Industry Awards. These highly sought-after awards are presented at a glittering ceremony at the annual Conference following the Gala Dinner.  

Act Now has been nominated for the Supplier of the Year award which it won in 2021 and 2022. 

Voting is open to IRMS members until Friday 15th March 2024. 

You can vote for Act Now here: https://irms.org.uk/news/666165/Vote-now-for-the-IRMS-Awards-2024.htm  

Thank you for your support!