Navigating Turbulence: Qantas App Privacy Breach Sparks Concerns 

Today a number of news outlets are reporting that Australian airline Qantas is investigating a privacy breach on its app. Customers discovered that they had access to the personal details of other travellers, including boarding passes and frequent flyer information. This discovery has raised significant concerns about data security and privacy among Qantas app users. 

Qantas responded to the situation, acknowledging the issue and assuring customers that it was under investigation. Within three hours of the breach being detected, the airline claimed to have resolved the problem and issued a public apology for any inconvenience caused. 

Despite initial fears of a cyberattack, Qantas stated that the breach was likely due to a technology glitch, possibly linked to recent system updates. However, the extent of the breach was troubling, with some users reporting the ability to view multiple passengers’ details with just a few clicks. 

Customers shared their experiences on social media platforms, recounting instances where they were confronted with strangers’ personal information upon opening the app. Concerns were further amplified when reports emerged of individuals being able to manipulate flight bookings, raising questions about the app’s security measures. 

In response to the breach, Qantas advised affected users to log out and log back into the app to mitigate the issue. The airline reassured customers that there were no indications of travellers using incorrect boarding passes as a result of the breach. 

Social media channels buzzed with criticism of Qantas, with users sharing screenshots of the glitch and raising awareness of potential phishing attempts. Allegations surfaced of fake Qantas customer care accounts soliciting personal information from users under the guise of assistance. 

Does the UK GDPR apply here? 

In October 2020, the UK Information Commissioner’s Office fined British Airways £20 million under the GDPR for a cyber security breach which saw the personal and financial details of more than 400,000 customers accessed by attackers.

Whilst Qantas has said that this incident was not due to a cyber-attack, it will certainly face questions about its handling of customer data under Australian data protection laws. It is also possible that Qantas, an Australian company, could become the subject of a probe by the UK Information Commissioner’s Office under the UK GDPR if, as seems likely, UK data subjects are affected by the incident.

Article 3(2) of the UK GDPR gives it an extra-territorial effect. It states:

“This Regulation applies to the relevant processing of personal data of data subjects who are in the United Kingdom by a controller or processor not established in the United Kingdom where the processing activities are related to: 

(a) the offering of goods or services, irrespective of whether a payment by the data subject is required, to such data subjects in the United Kingdom; or

(b) the monitoring of their behaviour as far as their behaviour takes place within the United Kingdom.” 

Applying this principle, on 4th April 2023 the ICO issued a £12.7 million fine to TikTok, a US company whose parent company is the Beijing-based ByteDance, for a number of breaches of the UK GDPR, including failing to use children’s personal data lawfully.

As Qantas works to address the fallout from this breach and restore trust among its customer base, the incident serves as a stark reminder of the importance of robust data security measures in the digital age. It highlights the vulnerability of personal data in online platforms and underscores the need for companies to prioritise the protection of customer data. 

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.  

YMCA Fined for HIV Email Data Breach 

Another day and another ICO fine for a data breach involving email! The Central Young Men’s Christian Association (the Central YMCA) of London has been issued with a Monetary Penalty Notice of £7,500 for a data breach in which emails intended for those on an HIV support programme were sent to 264 email addresses using CC instead of BCC, revealing the email addresses to all recipients. This resulted in 166 people being identifiable or potentially identifiable. A formal reprimand has also been issued.

Failure to use blind carbon copy (BCC) correctly in emails is one of the top data breaches reported to the ICO every year. In December 2023, the ICO fined the Ministry of Defence (MoD) £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. Again, the failure to use blind carbon copy in email was a central cause of the data breach.

Last year the Patient and Client Council (PCC) and the Executive Office were the subject of ICO reprimands for disclosing personal data in this way. In October 2021, HIV Scotland was issued with a £10,000 GDPR fine when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.  

Organisations must have appropriate policies and training in place to minimise the risks of personal data being inappropriately disclosed via email. To avoid similar incidents, the ICO recommends that organisations should: 

  1. Consider using other secure means to send communications that involve large amounts of data or sensitive information. This could include using bulk email services, mail merge, or secure data transfer services, so information is not shared with people by mistake.
  2. Consider having appropriate policies in place and training for staff in relation to email communications.
  3. For non-sensitive communications, organisations that choose to use BCC should do so carefully to ensure personal email addresses are not shared inappropriately with other customers, clients, or other organisations.
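The first recommendation, sending individual communications mail-merge style rather than one message with every recipient copied in, can be illustrated with a minimal Python sketch. Rather than relying on staff remembering to use BCC, a separate message is built per recipient, so no recipient’s address can ever appear in anyone else’s copy. The addresses and function name below are hypothetical, for illustration only; actual SMTP sending is left out.

```python
from email.message import EmailMessage


def build_individual_messages(sender, recipients, subject, body):
    """Build one message per recipient so no recipient ever sees
    another's address (a mail-merge style approach). Sending via an
    SMTP client is a separate step, not shown in this sketch."""
    messages = []
    for address in recipients:
        msg = EmailMessage()
        msg["From"] = sender
        msg["To"] = address  # only this recipient's own address
        msg["Subject"] = subject
        msg.set_content(body)
        messages.append(msg)
    return messages


# Hypothetical addresses, for illustration only.
msgs = build_individual_messages(
    "support@example.org",
    ["a@example.com", "b@example.com"],
    "Programme update",
    "Details of the next session follow...",
)
```

Because no message carries a CC or BCC header at all, a mistake of the kind fined here cannot leak the recipient list; the trade-off is one SMTP transaction per recipient, which bulk email services handle at scale.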

More on email best practice can be found in the ICO’s email and security guidance.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Apprentice Case Study – Meet Evie

In 2022, Act Now Training teamed up with Damar Training to support their delivery of the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme, with the first group due to undertake the end-point assessment. We caught up with Manchester Airport Group apprentice Evie Scott and her manager to get their thoughts on the programme so far.

Evie left college in summer 2022 after A Levels and a BTEC. She wanted to continue learning but in a more hands-on environment:

“In my final year of college, my tutor helped me create a LinkedIn account and I found the Data Protection and Information Governance Practitioner apprenticeship opportunity at Manchester Airport Group. Having previously visited the airport on a school trip I found the range of jobs there fascinating, so I started looking into their apprenticeship opportunities and how they could benefit my career.”

Evie applied successfully for the role of apprentice Data Protection and Information Governance Practitioner at Manchester Airport Group (MAG). Over a year into her job, she is finding the programme engaging and is developing new skills and perspectives that she can apply at work:

“I really enjoy the fact that the apprenticeship programme is challenging yet engaging. I enjoy the further reading aspect as it allows me to gain a greater insight into topics and offers different viewpoints and perspectives which I try adopting into my work.”

Charlotte Lewendon-Jones, Head of Data Protection and Privacy at MAG, has over 30 years’ experience in Information Governance. She was part of the trailblazer group of employers that helped develop the Data Protection and Information Governance Practitioner apprenticeship.

Charlotte manages the Data Protection and Compliance Team at MAG.
When MAG advertised their data protection apprenticeship opportunities in summer 2022, they were overwhelmed by the level of interest. This was testament, Charlotte believes, to the quality of the apprenticeship itself and to MAG’s commitment to its wider apprenticeship programme. On the impact of apprentices so far, she comments:

“The apprentices are confident and bring a fresh viewpoint to the team which brings huge improvements. When the apprentices go on training sessions, I challenge them on some of our processes to see what they have learnt, find ways in which we can do better and support their learning journey.”

About Evie, Charlotte adds:

“Considering Evie didn’t have any experience in data protection and information governance, I feel she’s done really well. Her training started in September 2022 and I’ve seen her confidence grow. Her approach and attitude to work are excellent, she’s gaining great experience, asking fewer questions and making more informed decisions based on her experience and what she’s learnt.”

Finally, we asked Evie how she feels the apprenticeship will impact her moving forward:

“When I apply what I have learnt so far to my workload or tasks I have an appreciation for why things are done in a certain way. I feel the further I get into my apprenticeship, the more it will continue to influence my everyday tasks, benefit the organisation and help me in my job role.”

“At Damar, we believe in the power of apprenticeships to benefit business and transform lives. We see it every day across the thousands of supportive employers, apprentices and workplace supervisors that we are proud to partner with.”

You can read about the experience of another apprentice (Natasha) here.

STOP PRESS (28/6/24): Evie has now successfully completed her apprenticeship. Many congratulations Evie!

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

Oral Disclosure of Personal Data: To GDPR or not to GDPR? 

Here’s a pub quiz question for you: “Can a Data Controller circumvent the requirements of data protection law by disclosing personal data verbally rather than in writing?” The answer was “Yes” under the old Data Protection Act 1998.
In Scott v LGBT Foundation Ltd [2020] WLR 62, the High Court rejected a claim that the LGBT Foundation had breached, amongst other things, the claimant’s data protection rights by disclosing information about him to a GP. The court held that the 1998 Act did not apply to purely verbal communications.

Nowadays though, the answer to the above question is no; the oral disclosure of personal data amounts to “processing” as defined by Article 4(2) of the GDPR.
So said the Court of Justice of the European Union (CJEU), on 7th March 2024, in a preliminary ruling in the Endemol Shine Finland case.

The subject of the ruling is a television company which makes a number of reality TV shows in Finland. It had been organising a competition and sought information from the District Court of South Savo about possible criminal proceedings involving one of the competition participants. It requested that the District Court disclose the information orally rather than in writing. The District Court refused the request on the basis that there was no legitimate reason for processing the criminal offence data under Finnish law, implementing Article 10 of the GDPR.
On appeal Endemol Shine Finland argued that the GDPR did not apply as the oral disclosure of the information would not constitute processing of personal data under the GDPR. 

Article 4(2) GDPR defines “processing” as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means”. On the face of it, this covers oral processing. However, Article 2 states that GDPR applies to processing of personal data “wholly or partly by automated means”, and processing by non-automated means which “forms or is intended to form part of a filing system.” Article 4(6) GDPR defines “filing system” broadly, covering “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis”. 

The Finnish Court of Appeal requested a preliminary ruling from the CJEU on the meaning of Article 4(2) and whether the particular processing in this case came within the material scope of the GDPR under Article 2. The CJEU held that the concept of processing in Article 4(2) of the GDPR necessarily covered the oral disclosure of personal data. It said the wording of the Article made it apparent that the EU legislature intended to give the concept of processing a broad scope. The court pointed out that the GDPR’s objective was “to ensure a high level of protection of the fundamental rights and freedoms of natural persons” and that “circumventing the application of that regulation by disclosing personal data orally rather than in writing would be manifestly incompatible with that objective”.

The CJEU went on to consider whether the oral processing of the data would fall within the material scope of the GDPR under Article 2. It held that it was clear from the request for a preliminary ruling that the personal data sought from the District Court of South Savo is contained in “a court’s register of persons” which appeared to be a filing system within the meaning of Article 4(6), and therefore fell within the scope of the GDPR. 

UK Data Controllers should note the wording of Article 4 and Article 2 of the UK GDPR is the same as in the EU GDPR. So whilst this ruling from the CJEU is not binding on UK courts, it would be wise to assume that picking up the phone and making an oral disclosure of personal data will not allow the UK GDPR to be circumvented.   

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 


Transparency in Health and Social Care: New ICO Guidance 

Within the health and social care sector, new technologies that use large amounts of personal data are being used to support both direct care and secondary purposes, such as planning and research. An example is the use of AI to provide automated diagnoses based on medical imaging data from patients.

Transparency is a key principle of UK Data Protection legislation. Compliance with the first data protection principle and Articles 13 and 14 of the UK GDPR ensures that data subjects are aware of how their personal data is used, allowing them to make informed choices about who they disclose their data to and how to exercise their data rights.

On Monday the Information Commissioner’s Office (ICO) published new guidance to assist health and social care organisations to comply with their transparency obligations under the UK GDPR. It supplements existing ICO guidance on the principle of transparency and the right to be informed.

The guidance is aimed at all organisations, including those in the private and third sectors, which deliver health and social care services or process health and social care information. This includes local authorities, suppliers to the health and social care sector, universities using health information for research purposes and others (e.g. fire service, police and education) that use health information for their own purposes. The guidance will help them to understand the definition of transparency and assess appropriate levels of transparency, as well as providing practical steps to developing effective transparency information.

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.   

GDPR and Privacy: Balancing Consent and Medical Need 

The recent High Court ruling in Wessex Fertility Limited v University Southampton Hospital NHS Foundation Trust, Human Fertilisation and Embryology Authority and Donor Conception Network (2024) answers the question of when an express refusal of consent can be overridden because of a medical need.

The claimant in this case, Wessex Fertility Limited (“the Clinic”), sought declarations that it is lawful for it to request that an egg donor provide a DNA sample for the purposes of genetic analysis, and that the processing of the donor’s personal data involved in making this request is lawful under the GDPR. The first part of this question required the court to analyse the Human Fertilisation and Embryology Act 1990 (as amended). Data protection professionals will be interested in the second part, which requires consideration of Article 8 of the ECHR (right to privacy) and the GDPR.

Background 

The Clinic has been licensed to offer fertility treatment and related services since 1992. It treated a couple, Mr and Mrs H, using eggs donated by Donor A and Mr H’s sperm resulting in the birth of a baby girl, AH. Sadly, AH was born with a number of health problems including polydactyly, a cardiac abnormality and gross motor delay.
In order to best treat AH, the Clinic needed to understand the genetic cause of her health problems. It wished to contact Donor A to request that she provide a DNA sample which would be used to carry out genetic analysis with the aim of establishing a genetic diagnosis of AH’s condition.

Consent Wording 

When Donor A donated her eggs, she was asked by the Clinic to complete a number of consent forms, one of which had the following statement: 

“I/we wish to be notified if Wessex Fertility learns (e.g., through the birth of an affected child) that I have a previously unsuspected genetic disease, or that I am a carrier of a harmful inherited condition.”  

Beside this part of the form there is a box to tick ‘yes’ or ‘no’. Donor A, who completed this form in advance of her first egg collection for use by Mr and Mrs H, ticked the box to state that she did not wish to be notified in the event of such a discovery.
Consistent with this, Donor A also completed the form in the same way at the time of her first egg collection for use with a different recipient couple, and again when she returned for her second egg collection the following year.

The judge considered whether Donor A, having specifically refused consent to be informed of any genetic conditions she might have, could be asked to provide DNA for genetic analysis. 

The Law 

The judge first applied Article 8 of the ECHR (the right to privacy). The judge closely analysed the wording of the consent questions which had been posed to Donor A.
He considered that the question to which Donor A answered “no”, in respect of being informed of genetic conditions, was not drafted in a way which imagined the scenario which had now arisen. Accordingly, it was possible to say that Donor A had not refused consent for a matter such as this. 

In concluding that any interference with Donor A’s Article 8 rights would be justified and proportionate if the court made the declaration requested, the judge took account of, amongst other things, the obvious benefit to AH, as it may provide clarity about her diagnosis and/or treatment options in the widest sense. There may also be a wider benefit to others who may have been conceived, or may be conceived in the future, using Donor A’s eggs.

The judge went on to consider whether the processing of Donor A’s personal data is lawful under the GDPR, concluding that there was a lawful basis under Article 6:

“It is clear the ‘legitimate interest’ under article 6(1)(f) is to enable trio testing to increase the chances of a diagnosis for AH and/or the provision of the correct treatment. Whilst it is recognised Donor A may not consent to providing DNA there is still a need to request it and all other steps short of making the request have been undertaken.” 

The Clinic was also processing Donor A’s health data which is Special Category Data and thus required an additional Article 9 lawful basis. The judge said: 

“The processing under article 9(2)(h) is lawful as it is necessary for the purposes of AH’s diagnosis and/or provision of treatment. Professor Lucassen’s evidence is clear about the benefits of trio testing, such a request will be made by a health professional under obligation of secrecy. The request will not be incompatible with the purposes for which the details were collected, namely, to enable the Clinic to contact Donor A as and when required, which would include in accordance with any declarations made by this Court” 

This case shows the importance of ensuring that consent statements are carefully drafted. When it comes to what the data subject did or did not consent to, the precise wording of the consent statement will be carefully scrutinised by the courts.  

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.

DP Bill: Updated Keeling Schedules Published 

The Data Protection and Digital Information Bill is currently in the Committee stage of the House of Lords. If passed, it will make changes to UK data protection legislation including the UK GDPR.

The Government recently published updated versions of Keeling Schedules showing potential changes to the UK GDPR, the Data Protection Act 2018 and Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

Whilst no doubt there will be further amendments, the schedules are worth studying for a clear picture as to the impact of the Bill. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. 

Apprentice Case Study – Meet Natasha

In 2022, Act Now Training teamed up with Damar to support their delivery of the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme with the first cohort about to undertake the end point assessment.

Data Protection and Information Governance Apprentice, Natasha Lock, is an integral part of the Governance and Compliance team at the University of Lincoln. With the Data Protection and Digital Information (No.2) Bill set to make changes to the UK data protection regime, Natasha talks to us about why this is a great area to work in and how the apprenticeship route has been particularly beneficial for her.

How did you get onto the apprenticeship?

“I was already working at the university as an Information Compliance Officer when the opportunity for a staff apprenticeship came up.

“The process was swift and straightforward, and I was enrolled on the Data Protection and Information Governance Apprenticeship within three months of enquiring.”

How has the apprenticeship helped you?

“I started with a good understanding of the UK Data Protection legislation but my knowledge has grown significantly, and now I’m coming to the end of my level 4 apprenticeship, I’ve gained so much more insight and my confidence has grown.

“As a university, we hold vast amounts of data. My apprenticeship is allowing me to solve the challenge of data retention and implement better measures to retain, destroy and archive information. I have developed a greater understanding of the legislative requirements we must adhere to as a public sector institute and how to reduce and assess data protection risks.

“I love the fact that I can study whilst still doing my job. The flexibility works for me because I can go through course materials at my own pace. I really feel like I have a brilliant work/life/study balance.

“The University of Lincoln and Damar Training have been fantastic in supporting me. I get along with my coach, Tracey, so well. She is very friendly and personable and has enabled my creativity to flow.

“The course is very interactive, and I’ve found the forums with other apprentices to be a very useful way of sharing knowledge, ideas and stories.

“I’m enjoying it so much and people have noticed that my confidence has grown. I wouldn’t have had that without doing this apprenticeship. I’ve now got my sights on doing a law degree or law apprenticeship in the future.”

Abi Slater, Information Compliance Manager at Lincoln University, said: “It has been great to see how much Natasha has developed over the course of the apprenticeship. I believe the apprenticeship has provided Natasha with the knowledge and skills required to advance in her data protection career and the support from her coach at Damar Training has been excellent.

“I would encourage anyone with an interest in data protection and information governance to consider this apprenticeship.”

Tracey Coetzee, Coach at Damar Training, said: “The Data Protection and Information Governance Apprenticeship was only approved by the Institute for Apprenticeships in 2022, and it’s delightful to see apprentices flourishing on the programme.

“From cyber security to managing data protection risks, this programme is upskilling participants and adding value to both private and public sector organisations and we’re thrilled to see the first cohort, including Natasha, approach the completion of their training.”

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

EU AI Act Approved by European Parliament  

On Wednesday 13th March 2024, the European Parliament approved the text of the harmonised rules on artificial intelligence, the so-called “Artificial Intelligence Act” (AI Act). Agreed upon in negotiations with member states in December 2023, the Act was endorsed by MEPs with 523 votes in favour, 46 against and 49 abstentions. It aims to “protect fundamental rights, democracy, the rule of law and environmental sustainability from high-risk AI, while boosting innovation and establishing Europe as a leader in the field.” Despite Brexit, UK businesses and entities engaged in AI-related activities will still be affected by the Act if they intend to operate within the EU market. The Act will have an extra-territorial reach, just like the EU GDPR.

The main provisions of the Act can be read here. In summary, the Act sets out comprehensive rules for AI applications, including a risk-based system to address potential threats to health and safety, and human rights. The Act will ban some AI applications which pose an “unacceptable risk”, such as real-time and remote biometric identification systems like facial recognition, and impose strict obligations on others considered “high risk”, including AI used in EU-regulated product safety categories such as cars and medical devices. These obligations include adherence to data governance standards, transparency rules, and the incorporation of human oversight mechanisms.

Next steps 

The Act is still subject to a final lawyer-linguist check and is expected to be finally adopted before the end of the legislature (through the so-called corrigendum procedure). It also needs to be formally endorsed by the Council of the European Union.

The Act will enter into force twenty days after its publication in the Official Journal, and be fully applicable 24 months after its entry into force, except for: bans on prohibited practices, which will apply six months after the entry into force date; codes of practice (nine months after entry into force); general-purpose AI rules including governance (12 months after entry into force); and obligations for high-risk systems (36 months after entry into force).

Influence on UK AI Regulation 

The EU’s regulatory approach will impact the UK Government’s decisions on AI governance. An AI White Paper, entitled “A pro-innovation approach to AI regulation”, was published in March last year. The paper sets out the UK’s preference not to place AI regulation on a statutory footing but to make use of “regulators’ domain-specific expertise to tailor the implementation of the principles to the specific context in which AI is used.” In January 2024, the ICO launched a consultation series on Generative AI, examining how aspects of data protection law should apply to the development and use of the technology. It is expected to issue more AI guidance later in 2024.

Our AI Act workshop will help you understand the new law in detail and its interaction with the UK’s objectives and strategy for AI regulation.  

Electronic Tagging of Migrants: Enforcement Notice Published by ICO

On 1st March 2024, the Information Commissioner’s Office (ICO) announced that it has issued an Enforcement Notice and warning to the Home Office for failing to sufficiently assess the privacy risks posed by the electronic monitoring of people arriving in the UK via unauthorised means. (Strangely, the actual text of the notice and warning was only recently published, three weeks after the ICO press release.)
The decision comes as a result of Privacy International’s complaint (filed in August 2022) against the Home Office policy. The civil liberties pressure group alleged widespread and significant breaches of privacy and data protection law.  

The ICO had been in discussion with the Home Office since August 2022 on its pilot to place ankle tags on, and track the GPS location of, up to 600 migrants who arrived in the UK and were on immigration bail. The purpose of the pilot was to test whether electronic monitoring is an effective way to maintain regular contact with asylum claimants, while reducing the risk of absconding, and to establish whether it is an effective alternative to detention. 

The ICO found the Home Office failed to conduct a Data Protection Impact Assessment (DPIA), in relation to the pilot, which satisfies the requirements of Article 35 of the UK GDPR. Amongst other things, the Home Office had failed to sufficiently assess the privacy intrusion of the continuous collection of people’s location information.
It was also found to have breached the Accountability Principle (Article 5(2)) by failing to demonstrate its compliance with Article 5(1), in particular:  

  • Article 5(1)(a) Lawfulness: the Home Office identified the lawful basis for the processing as Article 6(1)(e), and for Special Category Data as Article 9(2)(g) and schedule 1 paragraph 6 DPA 2018. However, it did not demonstrate that the processing was necessary and proportionate for these purposes (neither in its DPIA nor its staff guidance), including why less privacy-intrusive methods could not meet its objectives.
  • Article 5(1)(a) Fairness and Transparency: the Home Office’s privacy notice(s) did not demonstrate compliance with minimum transparency requirements, as set out at Articles 12 and 13. It failed to provide clear and easily accessible information to the people being tagged about what personal information is being collected, how it will be used, how long it will be kept for, and who it will be shared with. The privacy information was not set out clearly in one place, was inconsistent and there were information gaps. 
  • Article 5(1)(c) Data Minimisation: the Home Office’s draft DPIA and guidance for staff did not demonstrate that data minimisation will be considered and actioned when requesting access to the personal data produced by the electronic tags. 

John Edwards, the Information Commissioner, said:

“It’s crucial that the UK has appropriate checks and balances in place to ensure people’s information rights are respected and safeguarded. This is even more important for those who might not even be aware that they have those rights. 

“This action is a warning to any organisation planning to monitor people electronically – you must be able to prove the necessity and proportionality of tracking people’s movements, taking into consideration people’s vulnerabilities and how such processing could put them at risk of further harm. This must be done from the outset, not as an afterthought.” 

The Enforcement Notice orders the Home Office to update its internal policies, access guidance and privacy information in relation to the data retained from the pilot scheme. The ICO has also issued a formal warning stating that any future processing by the Home Office on the same basis will be in breach of data protection law and will attract enforcement action.  

Surveillance is a hot topic for the ICO at present. Last month, the ICO issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance.  

The Enforcement Notice and warning are important reading for anyone who wishes to understand how to complete a compliant and meaningful DPIA. The Data Protection and Digital Information Bill is currently in the Committee stage of the House of Lords. Amongst other things, the DPIA provisions in the UK GDPR will be replaced by leaner and less prescriptive “Assessments of High-Risk Processing”.

This and other GDPR developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.