Navigating Turbulence: Qantas App Privacy Breach Sparks Concerns 

Today a number of news outlets are reporting that Australian airline Qantas is investigating a privacy breach on its app. Customers discovered that they had access to the personal details of other travellers, including boarding passes and frequent flyer information. This discovery has raised significant concerns about data security and privacy among Qantas app users. 

Qantas responded to the situation, acknowledging the issue and assuring customers that it was under investigation. Within three hours of the breach being detected, the airline claimed to have resolved the problem and issued a public apology for any inconvenience caused. 

Despite initial fears of a cyberattack, Qantas stated that the breach was likely due to a technology glitch, possibly linked to recent system updates. However, the extent of the breach was troubling, with some users reporting the ability to view multiple passengers’ details with just a few clicks. 

Customers shared their experiences on social media platforms, recounting instances where they were confronted with strangers’ personal information upon opening the app. Concerns were further amplified when reports emerged of individuals being able to manipulate flight bookings, raising questions about the app’s security measures. 

In response to the breach, Qantas advised affected users to log out and log back into the app to mitigate the issue. The airline reassured customers that there were no indications of travellers using incorrect boarding passes as a result of the breach. 

Social media channels buzzed with criticism of Qantas, with users sharing screenshots of the glitch and raising awareness of potential phishing attempts. Allegations surfaced of fake Qantas customer care accounts soliciting personal information from users under the guise of assistance. 

Does the UK GDPR apply here? 

In October 2020, the UK Information Commissioner’s Office fined British Airways £20 million under the GDPR for a cyber security breach which saw the personal and financial details of more than 400,000 customers accessed by attackers. 

Whilst Qantas has said that this incident was not due to a cyber-attack, it will certainly face questions about its handling of customer data under Australian data protection laws. It is also possible that Qantas, an Australian company, could become the subject of a probe by the UK Information Commissioner’s Office under the UK GDPR if, as is likely, UK data subjects are affected by the incident. 

Article 3(2) of the UK GDPR gives it extraterritorial effect. It states: 

“This Regulation applies to the relevant processing of personal data of data subjects who are in the United Kingdom by a controller or processor not established in the United Kingdom where the processing activities are related to: 

(a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the United Kingdom; or 

(b) the monitoring of their behaviour as far as their behaviour takes place within the United Kingdom.” 

Applying this principle, on 4th April 2023 the ICO issued a £12.7 million fine to TikTok, a US company whose parent company is Beijing-based ByteDance, for a number of breaches of the UK GDPR, including failing to use children’s personal data lawfully. 

As Qantas works to address the fallout from this breach and restore trust among its customer base, the incident serves as a stark reminder of the importance of robust data security measures in the digital age. It highlights the vulnerability of personal data in online platforms and underscores the need for companies to prioritise the protection of customer data. 

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.  

YMCA Fined for HIV Email Data Breach 

Another day and another ICO fine for a data breach involving email! The Central Young Men’s Christian Association (the Central YMCA) of London has been issued with a Monetary Penalty Notice of £7,500 for a data breach in which emails intended for those on an HIV support programme were sent to 264 email addresses using CC instead of BCC, revealing the email addresses to all recipients. This resulted in 166 people being identifiable or potentially identifiable. A formal reprimand has also been issued.

Failure to use blind carbon copy (BCC) correctly in emails is one of the top data breaches reported to the ICO every year. In December 2023, the ICO fined the Ministry of Defence (MoD) £350,000 for disclosing personal information of people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. Again, the failure to use blind copy when sending email was a central cause of the data breach. 

Last year the Patient and Client Council (PCC) and the Executive Office were the subject of ICO reprimands for disclosing personal data in this way. In October 2021, HIV Scotland was issued with a £10,000 GDPR fine when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk.  

Organisations must have appropriate policies and training in place to minimise the risks of personal data being inappropriately disclosed via email. To avoid similar incidents, the ICO recommends that organisations should: 

  1. Consider using other secure means to send communications that involve large amounts of data or sensitive information. This could include using bulk email services, mail merge, or secure data transfer services, so information is not shared with people by mistake (see the illustrative sketch after this list). 
  2. Consider having appropriate policies in place and training for staff in relation to email communications. 
  3. For non-sensitive communications, organisations that choose to use BCC should do so carefully to ensure personal email addresses are not shared inappropriately with other customers, clients, or other organisations. 
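
For organisations that send routine updates to a list of service users, the CC/BCC failure mode can be avoided altogether by sending each recipient their own individual message, which is essentially what a mail merge does. The minimal Python sketch below is purely illustrative: the SMTP server, sender address, credentials and recipient list are all hypothetical, and a dedicated bulk email or secure data transfer service remains the preferred option for sensitive communications.

```python
# Illustrative sketch only: each recipient gets their own message, so no
# other recipient's address is ever disclosed (no CC or BCC involved).
# The SMTP host, credentials and addresses below are hypothetical.
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.org"     # hypothetical mail server
SENDER = "support@example.org"     # hypothetical sender address
RECIPIENTS = ["alice@example.com", "bob@example.com"]  # hypothetical list

BODY = "Dear participant,\n\nPlease find this month's programme update below."

with smtplib.SMTP(SMTP_HOST, 587) as smtp:
    smtp.starttls()                        # encrypt the connection
    smtp.login(SENDER, "app-password")     # hypothetical credential
    for address in RECIPIENTS:
        msg = EmailMessage()
        msg["From"] = SENDER
        msg["To"] = address                # one recipient per message
        msg["Subject"] = "Programme update"
        msg.set_content(BODY)
        smtp.send_message(msg)
```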

More on email best practice can be found in the ICO’s email and security guidance.

We have two workshops coming up (How to Increase Cyber Security and Cyber Security for DPOs) which are ideal for organisations who wish to upskill their employees about data security. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

The Data Protection and Digital Information Bill: Where are we now? 

The Data Protection and Digital Information Bill is currently at the Committee stage in the House of Lords. It will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). It is expected to be passed in May and will probably come into force after a short transitional period. 

The current Bill is not substantially different to the previous version, whose passage through Parliament was paused in September 2022 so ministers could engage in “a co-design process with business leaders and data experts” and move away from the “one-size-fits-all” approach of the European Union’s GDPR. 

The Same 

Many of the proposals in the new Bill are the same as contained in the previous Bill. These include: 

  • Amended Definition of Personal Data: This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world.

  • Vexatious Data Subject Requests: The terms “manifestly unfounded” or “excessive” requests, in Article 12 of the UK GDPR, will be replaced with “vexatious” or “excessive” requests. Explanation and examples of such requests will also be included. 

  • Data Subject Complaints: Data Controllers will be required to acknowledge receipt of Data Subject complaints within 30 days and respond substantively “without undue delay”. The ICO will be entitled not to accept a complaint if a Data Subject has not made a complaint to the controller first. 

  • Data Protection Officer: The obligation for some controllers and processors to appoint a Data Protection Officer (DPO) will be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals will be required to designate a senior manager as a “Senior Responsible Individual”.  

  • Data Protection Impact Assessments: These will be replaced by leaner and less prescriptive “Assessments of High-Risk Processing.”  

  • International Transfers: There will be a new approach to the test for adequacy applied by the UK Government to countries (and international organisations) and when Data Controllers are carrying out a Transfer Impact Assessment or TIA. The threshold for this new “data protection test” will be whether a jurisdiction offers protection that is “not materially lower” than under the UK GDPR. (For more detail see also our forthcoming International Transfers webinar). 
  • The Information Commission: The Information Commissioner’s Office will transform into the Information Commission, a corporate body with a chief executive. 

  • PECR: Cookies will be allowed to be used without consent for the purposes of web analytics and to install automatic software updates. Furthermore, non-commercial organisations (e.g. charities and political parties) will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest. Finally, there will be an increase in fines from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher). 

The Changes 

The main changes are summarised below: 

  • Scientific Research: The definition of scientific research is amended so that it now includes research for the purposes of commercial activity. This expands the circumstances in which processing for research purposes may be undertaken, providing a broader consent mechanism and exemption to the fair processing requirement. 
  • Legitimate Interests: The Previous Bill proposed that businesses could rely on legitimate interests (Article 6 lawful basis) without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These “recognised” legitimate interests cover purposes for processing such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement.  The new Bill, whilst keeping the above changes, introduces a non-exhaustive list of cases where organisations may rely on the “legitimate interests” legal basis, including for the purposes of direct marketing, transferring data within the organisation for administrative purposes and for the purposes of ensuring the security of network and information systems; although a balancing exercise still needs to be conducted in these cases.  
  • Automated Decision Making: The Previous Bill clarified that its proposed restrictions on automated decision-making under Article 22 UK GDPR should only apply to decisions that are a result of automated processing without “meaningful human involvement”. The new Bill states that profiling will be a relevant factor in the assessment as to whether there has been meaningful human involvement in a decision.  
  • Records of Processing Activities (ROPA): The Previous Bill streamlined the required content of ROPAs. The new Bill exempts all controllers and processors from the duty to maintain a ROPA unless they are carrying out high risk processing activities.  
  • Subject Access: Clause 12 of the Bill introduced at the House of Commons Report Stage amends Article 12 of the UK GDPR (and the DPA 2018) so that Data Controllers are only obliged to undertake a reasonable and proportionate search for information requested under the right of access. 

Adequacy 

Although the Government states that the new Bill is “a new system of data protection”, it still retains the UK GDPR’s structure and fundamental obligations. Organisations that are already compliant with the UK GDPR will not be required to make any major changes to their systems and processes.  

The EU conducts a review of adequacy with the UK every four years; the next adequacy decision is due on 27th June 2025. Some commentators have suggested that the changes may jeopardise the UK’s adequate status and so impact the free flow of data between the UK and EU. Defend Digital Me, a civil liberties organisation, has claimed that the Bill would, among other things, weaken data subjects’ rights, water down accountability requirements, and reduce the independence of the ICO.  

Other Parts of the Bill 

The Bill would also: 

  • establish a framework for the provision of digital verification services to enable digital identities to be used with the same confidence as paper documents. 

  • increase fines for nuisance calls and texts under PECR. 

  • update the PECR to cut down on ‘user consent’ pop-ups and banners. 

  • allow for the sharing of customer data, through smart data schemes, to provide services such as personalised market comparisons and account management. 

  • reform the way births and deaths are registered in England and Wales, enabling the move from a paper-based system to registration in an electronic register. 

  • facilitate the flow and use of personal data for law enforcement and national security purposes. 

  • create a clearer legal basis for political parties and elected representatives to process personal data for the purposes of democratic engagement. 

Reading the Parliamentary debates on the Bill, it seems that the Labour party have no great desire to table substantial amendments to the Bill. Consequently, it is expected that the Bill will be passed in a form similar to the one now published. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. Dive into the issues discussed in this blog and secure your spot now. 

Data Protection Bill Faces Scrutiny:
Commissioner Calls for Tighter Safeguards 

In a recent development, the Information Commissioner has weighed in on the debate surrounding the Data Protection and Digital Information Bill (DPDI Bill), legislation aimed at modernising data protection in the UK. While acknowledging the government’s efforts to strengthen the independence of the Information Commissioner’s Office (ICO) and update data protection practices, the Commissioner’s response highlights significant concerns, particularly around the use of personal data in social security contexts. We wrote a detailed breakdown on our blog here. 

The response, detailed and thorough, applauds the government’s amendments to the bill, recognising their potential to enhance the ICO’s autonomy and bring data protection practices up to date with the digital age. However, the Commissioner expresses reservations about the adequacy of safeguards in the current draft of the bill, especially in terms of personal data handling for social security purposes. 

The Commissioner’s concern primarily revolves around the need for more precise language in the bill. This is to ensure that its provisions are fully aligned with established data protection principles, thereby safeguarding individual rights. The response suggests that the current wording might be too broad or vague, potentially leading to misuse or overreach in the handling of personal data. 

Importantly, the Commissioner has provided detailed technical feedback for further improvements to the bill. This feedback indicates a need for scrutiny and adjustments to ensure that the bill not only meets its intended purpose but also robustly protects the rights of individuals. 

While the Commissioner supports the bill’s overarching aim to enhance the UK’s data protection regime, the emphasis is clearly on the necessity of refining the bill. This is to ensure it strikes the right balance between enabling data use for public and economic benefits and protecting individual privacy rights. 

The response from the Information Commissioner is a significant moment in the ongoing development of the DPDI Bill. It underscores the complexity and importance of legislating in the digital age, where data plays a crucial role in both the economy and personal privacy. 

As the bill progresses, the government and legislators should consider the Commissioner’s input. The balance they strike in the final version of the bill will be a key indicator of the UK’s approach to data protection in a rapidly evolving digital landscape. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. Dive into the issues discussed in this blog and secure your spot now.

The Hidden Reach of the Prevent Strategy:
Beyond Counter-Terrorism Units

The UK government’s anti-radicalisation program, Prevent, is reportedly sharing the personal details of thousands of individuals more extensively than previously known. This sharing includes not just counter-terrorism units, but also airports, ports, immigration services, and officials at the Home Office and the Foreign, Commonwealth and Development Office (FCDO). Critics argue that such widespread data sharing could be illegal, as it involves moving sensitive personal data between databases without the consent of the individuals. 

A Metropolitan police document titled “Prevent case management guidance” indicates that Prevent details are also shared with the ports authority watchlist. This raises concerns that individuals may face increased scrutiny at airports or be subjected to counter-terrorism powers without reasonable suspicion. The document also mentions that foreign nationals may have their backgrounds checked by the FCDO and immigration services for any overseas convictions or intelligence. 

Furthermore, the Acro Criminal Records Office, which manages UK criminal records, is notified about individuals referred to Prevent, despite the program dealing with individuals who haven’t necessarily engaged in criminal behaviour. Counter-terror police emphasise their careful approach to data sharing, which aims to protect vulnerable individuals. 

Prevent’s goal is to divert people from terrorism before they offend, and most people are unaware of their referral to the program. 95% of referrals result in no further action. A secret database, the National Police Prevent Case Management database, was previously disclosed in 2019, revealing the storage of details of those referred to Prevent. 

Newly disclosed information, obtained through a freedom of information request by the Open Rights Group (ORG), reveals that Prevent data is shared across various police databases, including the Police National Computer, specialised counter-terrorism and local intelligence systems, and the National Crime Agency. 

The sharing of this data was accidentally revealed due to a redaction error in a heavily edited Met document. Despite its sensitive nature, the ORG decided to make the document public. Sophia Akram of the ORG expressed concerns over the extent of the data sharing and potential harms, suggesting that it could be unfair and possibly unlawful. 

The guidance also indicates that data is retained and used even in cases where no further action is taken. There are concerns about the impact on young people’s educational opportunities, as Prevent requires public bodies like schools and the police to identify individuals at risk of extremism. 

Recent figures show thousands of referrals to Prevent, predominantly from educational institutions. From April 2022 to March 2023, a total of 6,817 individuals were directed to the Prevent program. Within this group, educational institutions were responsible for 2,684 referrals. Breaking down the referrals by age, there were 2,203 adolescents between the ages of 15 and 20, and 2,119 referrals involved children aged 14 or younger.

There are worries about the long-term consequences for children and young people referred to the program. Several cases have highlighted the intrusive nature of this data sharing and its potential impact on individuals’ lives, including cases in which students have missed out on a place at a sixth form college and others involving children as young as four years old. 

Prevent Watch, an organisation monitoring the program, has raised alarms about the data sharing, particularly its effect on young children. The FoI disclosures challenge the notion that Prevent is non-criminalising, as data on individuals, even those marked as ‘no further action’, can be stored on criminal databases and flagged on watchlists. 

Counter-terrorism policing spokespeople defend the program, emphasising its multi-agency nature and focus on protecting people from harm. They assert that data sharing is carefully managed and legally compliant, aiming to safeguard vulnerable individuals from joining terror groups or entering conflict zones. 

Learn more about data sharing with our UK GDPR Practitioner Certificate. Dive into the issues discussed in this blog and secure your spot now.

Act Now Partners with Middlesex
University Dubai for UAE’s first
Executive Certificate in DP Law

Act Now Training, in collaboration with Middlesex University Dubai, is excited to announce the launch of the UAE’s first Data Protection Executive training programme. This qualification is ideal as a foundation for businesses and organisations aiming to comply with the UAE Federal Data Protection Law.

This practical course focusses on developing a data protection framework and ensuring compliance with the UAE Data Protection Law’s strict requirements. This is particularly relevant given the recent advancements in Data Protection law in the Middle East, including the UAE’s first comprehensive national data protection law, Federal Decree Law No. 45/2021. 

This law regulates personal data processing, emphasising transparency, accountability, and data subject rights. It applies to all organisations processing personal data within the UAE and abroad for UAE residents.

The importance of understanding this law is paramount for every business and organisation, as it necessitates a thorough reassessment of personal data handling practices. Non-compliance can lead to severe penalties and reputational damage.

The Executive Certificate in UAE DP Law is a practical qualification delivered over five weeks, in two half-day sessions per week, and offers numerous benefits:

  1. Expertise in Cutting-Edge Legislation: Gain in-depth knowledge of the UAE’s data protection law, essential for professionals at the forefront of data protection practices.

  2. Professional Development: This knowledge enhances your resume, especially for roles in compliance, legal, and IT sectors, showing a commitment to legal reforms.

  3. Practical Application: The course’s structured format allows gradual learning and practical application of complex legal concepts, ensuring a deep understanding of the law.

  4. Risk Mitigation: Understanding the law aids in helping organisations avoid penalties and reputational harm due to non-compliance.

  5. Networking Opportunities: The course provides valuable connections in the field of data protection and law.

  6. Empowerment of Data Subjects: Delegates gain insights into their rights as data subjects, empowering them to protect their personal data effectively.

Delegates will receive extensive support, including expert instruction, comprehensive materials, interactive sessions, practical exercises, group collaboration, ongoing assessment, and additional resources for further learning. Personal tutor support is also provided throughout the course.

This program is highly recommended for officers in organisations both inside and outside the UAE that conduct business in the region or have customers, agents, and employees there. 

Act Now has designed the curriculum and will deliver the programme. Act Now Training is the UK’s premier provider of information governance training and consultancy, serving government organisations, multinational corporations, financial institutions, and corporate law firms. 

With a history of delivering practical, high-quality training since 2002, Act Now’s skills-based training approach has led to numerous awards, including, most recently, the Supplier of the Year Award 2022-23 from the Information and Records Management Society in the UK. 

Our associates have decades of hands-on global Information Governance experience and are therefore able to break down this complex area with real-world examples, making it easy to understand, apply and even fun!

Middlesex University Dubai is a 5-star KHDA-rated university and one of the University’s three global campuses, alongside London and Mauritius. It is the largest UK university in the UAE, with over 5,000 student enrolments from over 120 nationalities.

For more information and to register your interest, visit Middlesex University Dubai’s website. Alternatively, you can click here.

UK Hospital Trust Reprimanded for GDPR Infringements 

The University Hospitals of Derby and Burton NHS Foundation Trust (UHDB) was recently issued with a reprimand (30/10/23) by the Information Commissioner for multiple infringements of the UK General Data Protection Regulation (UK GDPR). This decision highlights significant concerns regarding the management and security of patient data. 

Background of the Case 

UHDB, formed by the merger of the Derby Teaching Hospitals NHS Foundation Trust and Burton Hospitals NHS Foundation Trust in July 2018, operates five hospitals across various locations. The infringement was initially detected at The Florence Nightingale Community Hospital in Derby. 

The issue revolved around UHDB’s handling of patient referrals for outpatient appointments. These referrals, containing sensitive health data, were processed via an electronic referral system (e-RS). The system, however, was plagued with a critical flaw where referrals would disappear from the worklist after a certain period, resulting in significant delays and data loss. 

Key Findings of the Investigation 

The investigation into UHDB’s practices uncovered several alarming facts: 

Data Subjects Affected: Approximately 4,768 individuals were directly impacted, with over 4,199 experiencing delayed medical referrals. The delayed response potentially caused distress and inconvenience to patients, some of whom waited over two years for treatment. 

Organisational Failings: UHDB was found lacking in implementing adequate organisational measures to prevent accidental data loss, especially concerning special category data. 

Inadequate Processes: The reliance on manual processes and email communications for managing referral drop-offs was deemed ineffective and insecure. 

Lack of Formal Oversight: There was no formal oversight ensuring the effective management and reinstatement of referrals onto the worklist. 

Absence of Risk Assessments: No risk assessment was conducted in relation to handling referral drop-offs, a measure that could have identified and minimised data protection risks. 

Remedial Actions and Recommendations 

In response to the reprimand, UHDB has taken several remedial steps, including conducting full internal and external reviews, contacting affected patients, creating a new Standard Operating Procedure (SOP), and introducing robotic process automation to reduce human error. 

The Commissioner recommended further actions for UHDB, emphasising the need for continuous support to affected data subjects, assessment and monitoring of new processes, and sharing lessons learned across the organisation to prevent future incidents. 

Implications and Conclusions 

This case serves as a stark reminder of the critical importance of data protection in the healthcare sector. It underscores the need for robust systems and processes to safeguard sensitive patient information and the potential consequences of failing to comply with GDPR regulations. 

UHDB’s commitment to rectifying these issues is commendable, yet the incident raises broader questions about data management practices in the NHS and the healthcare sector at large.

The British Library Hack: A Chapter in Ransomware Resilience

In a stark reminder of the persistent threat of cybercrime, the British Library has confirmed a data breach incident that has led to the exposure of sensitive personal data, with materials purportedly up for auction online. An October intrusion by a notorious cybercrime group targeted the library, which is home to an extensive collection, including over 14 million books.

Recently, the ransomware group Rhysida claimed responsibility, publicly displaying snippets of sensitive data, and announcing the sale of this information for a significant sum of around £600k to be paid in cryptocurrency.

While the group boasts about the data’s exclusivity and sets a firm bidding deadline (today 27th November 2023), the library has only acknowledged a leak of what seems to be internal human resources documents. It has not verified the identity of the attackers nor the authenticity of the sale items. The cyber attack has significantly disrupted the library’s operations, leading to service interruptions expected to span several months.

In response, the library has strengthened its digital defences, sought expert cybersecurity assistance, and urged its patrons to update their login credentials as a protective measure. The library is working closely with the National Cyber Security Centre and law enforcement to investigate, but details remain confidential due to the ongoing inquiry.

The consequences of the attack have necessitated a temporary shutdown of the library’s online presence. Physical locations, however, remain accessible. Updates can be found on the British Library’s X (née Twitter) feed. The risk posed by Rhysida has drawn attention from international agencies, with recent advisories from the FBI and US cybersecurity authorities. The group has been active globally, with attacks on various sectors and institutions.

The British Library’s leadership has expressed appreciation for the support and patience from its community as it navigates the aftermath of the cyber attack.

What is a Ransomware Attack?

A ransomware attack is a type of malicious cyber operation where hackers infiltrate a computer system to encrypt data, effectively locking out the rightful users. The attackers then demand payment, often in cryptocurrency, for the decryption key. These attacks can paralyse organisations, leading to significant data loss and disruption of operations.

Who is Rhysida?

The Rhysida ransomware group first came to the fore in May 2023, following the emergence of its victim support chat portal, accessible via the Tor browser. The group identifies as a “cybersecurity team” that highlights security flaws by targeting victims’ systems and spotlighting the supposed potential ramifications of the security issues involved.

How to prevent a Ransomware Attack?

Hackers are becoming more and more sophisticated in the ways they target our personal data; we have seen this recently with banking scams. However, there are some measures we can implement personally and within our organisations to prevent a ransomware attack.

  1. Avoid Unverified Links: Refrain from clicking on links in spam emails or unfamiliar websites. Hackers frequently disseminate ransomware via such links, which, when clicked, can initiate the download of malware. This malware can then encrypt your data and hold it for ransom​​.

  2. Safeguard Personal Information: It’s crucial to never disclose personal information such as addresses, NI numbers, login details, or banking information online, especially in response to unsolicited communications​​.

  3. Educate Employees: Increasing awareness among employees can be a strong defence. Training should focus on identifying and handling suspicious emails, attachments, and links. Additionally, having a contingency plan in the event of a ransomware infection is important​​.

  4. Implement a Firewall: A robust firewall can act as a first line of defence, monitoring incoming and outgoing traffic for threats and signs of malicious activity. This should be complemented with proactive measures such as threat hunting and active tagging of workloads​​.

  5. Regular Backups: Maintain up-to-date backups of all critical data. In the event of a ransomware attack, having these backups means you can restore your systems to a previous, unencrypted state without having to consider ransom demands (a minimal backup sketch follows this list).

  6. Create Inventories of Assets and Data: Having inventories of the data and assets you hold allows you to have an immediate knowledge of what has been compromised in the event of an attack whilst also allowing you to update security protocols for sensitive data over time.

  7. Multi-Factor Authentication: Identifying legitimate users in more than one way ensures that you are only granting access to those intended. 
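
By way of illustration of the “Regular Backups” point above, the short Python sketch below creates a dated, compressed copy of a directory and checks that the archive can be read back. The paths are hypothetical, and a real backup regime would also keep off-site or offline copies and test restores regularly.

```python
# Minimal illustrative backup sketch (hypothetical paths): archives a folder
# into a dated zip file and verifies that the archive is readable.
import shutil
import zipfile
from datetime import date
from pathlib import Path

SOURCE = Path("/srv/case-files")     # hypothetical data directory
BACKUP_DIR = Path("/mnt/backups")    # hypothetical backup destination

def backup_and_verify() -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    archive_base = BACKUP_DIR / f"case-files-{date.today():%Y-%m-%d}"
    # shutil.make_archive returns the full path of the .zip it created
    archive_path = Path(shutil.make_archive(str(archive_base), "zip", str(SOURCE)))
    # Basic integrity check: testzip() returns the first corrupt member, or None
    with zipfile.ZipFile(archive_path) as zf:
        if zf.testzip() is not None:
            raise RuntimeError("backup archive failed verification")
    return archive_path

if __name__ == "__main__":
    print(f"Backup written to {backup_and_verify()}")
```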

These are some strategies organisations can use as part of a more comprehensive cybersecurity protocol which will significantly reduce the risk of falling victim to a ransomware attack. 

Join us on our workshops “How to Increase Cyber Security in your Organisation” and Cyber Security for DPOs, where we discuss all of the above and more, helping you create the right foundations for cyber resilience within your organisation. 

UK Biobank’s Data Sharing Raises Alarm Bells

An investigation by The Observer has uncovered that the UK Biobank, a repository of health data from half a million UK citizens, has been sharing information with insurance companies. This development contravenes the Biobank’s initial pledge to keep this sensitive data out of the hands of insurers, a promise that was instrumental in garnering public trust at the outset. UK Biobank has since responded to the article, calling it “disingenuous” and “extremely misleading”. 

A Promise Made, Then Modified 

The UK Biobank was set up in 2006 as a goldmine for scientific discovery, offering researchers access to a treasure trove of biological samples and associated health data. With costs for access set between £3,000 and £9,000, the research derived from this data has been nothing short of revolutionary. However, the foundations of this scientific jewel are now being questioned. 

When the project was first announced, clear assurances were given that data would not be made available to insurance companies, mitigating fears that genetic predispositions could be used discriminatorily in insurance assessments. These assurances appeared in the Biobank’s FAQs and were echoed in parliamentary discussions. 

Changing Terms Amidst Grey Areas 

The Biobank contends that while it does strictly regulate data access, allowing only verified researchers to delve into its database, this includes commercial entities such as insurance firms if the research is deemed to be in the public interest. The boundaries of what constitutes “health-related” and “public interest” are now under scrutiny.   

However, according to the Observer investigation, evidence suggests that this nuance (commercial entities conducting health-related research) was not clearly communicated to participants, especially given the categorical assurances made previously. The UK Biobank categorically denies this and has shared its consent form and information leaflet. 

Data Sharing: The Ethical Quandary 

This breach of the original promise has raised the ire of experts in genetics and data privacy, with Prof Yves Moreau highlighting the severity of the breach of trust. The concern is not just about the sharing of data but about the integrity of consent given by participants. The Biobank’s response indicates that the commitments made were outdated and that the current policy, which includes sharing anonymised data for health-related research, was made clear to participants upon enrolment. 

The Ripple Effect of Biobank’s Data Policies 

Further complicating matters is the nature of the companies granted access. Among them are ReMark International, a global insurance consultancy, Lydia.ai, a Canadian “insurtech” firm that wants to give people “personalised and predictive health scores”, and Club Vita, a longevity data analytics company. These companies have utilised Biobank data for projects ranging from disease prediction algorithms to assessing longevity risk factors. This raises the question of how one can ensure that such research is in fact in the public interest: do we simply take a commercial entity’s word for it? UK Biobank says all research conducted is “consistent with being health-related and in the public interest” and that its expert data access committee decides on any complex issues, but who checks the ethics of the ethics committee? The issues with this self-regulation are axiomatic. 

The Fallout and the Future 

This situation has led to a broader conversation about the ethical use of volunteered health data and the responsibility of custodians like the UK Biobank to uphold public trust. As technology evolves and the appetite for data grows across industries, the mechanisms of consent and transparency may need to be revisited.  The Information Commissioner’s Office is now considering the case, spotlighting the crucial need for clarity and accuracy in how organisations manage and utilise sensitive personal information. 

As the UK Biobank navigates these turbulent waters, the focus shifts to how institutions like it can maintain the delicate balance between facilitating scientific progress and safeguarding the privacy rights of individuals who contribute their personal data for the greater good. For the UK Biobank, regaining the trust of its participants and the public is now an urgent task, one that will require more than just a careful review of policies but a reaffirmation of its commitment to ethical stewardship of the data entrusted to it. 

Take a look at our highly popular Data Ethics Course. Places fill up fast, so if you would like to learn more about this fascinating area, book your place now. 

CJEU’s FT v. DW Ruling: Navigating Data Subject Access Requests 

In the landmark case FT v DW (Case C-307/22), the Court of Justice of the European Union (CJEU) delivered a ruling that sheds light on the intricacies of data subject access requests under the EU General Data Protection Regulation (GDPR). The dispute began when DW, a patient, sought an initial complimentary copy of their dental medical records from FT, a dentist, citing concerns about possible malpractice. FT, however, declined the request based on German law, which requires patients to pay for copies of their medical records. The ensuing legal tussle ascended through the German courts, eventually reaching the CJEU, which had to ponder three pivotal questions. These are detailed below. 

Question 1: The Right to a Free Copy of Personal Data 

The first deliberation was whether the GDPR mandates healthcare providers to provide patients with a cost-free copy of their personal data, irrespective of the request’s motive, which DW’s case seemed to imply was for potential litigation. The CJEU, examining Articles 12(5) and 15(3) of the GDPR and indeed Recital 63, concluded that the regulation does indeed stipulate that the first copy of personal data should be free and that individuals need not disclose their reasons for such requests, highlighting the GDPR’s overarching principle of transparency. 

Question 2: Economic Considerations Versus Rights under the GDPR 

The second matter concerned the intersection of the GDPR with pre-existing national laws that might impinge upon the economic interests of data controllers, such as healthcare providers. The CJEU assessed whether Article 23(1)(i) of the GDPR could uphold a national rule that imposes a fee for the first copy of personal data. The court found that while Article 23(1)(i) could apply to laws pre-dating the GDPR, it does not justify charges for the first copy of personal data, thus prioritising the rights of individuals over the economic interests of data controllers. 

Question 3: Extent of Access to Medical Records 

The final issue addressed the extent of access to personal data, particularly whether it encompasses the entire medical record or merely a summary. The CJEU clarified that according to Article 15(3) of the GDPR, a “copy” entails a complete and accurate representation of the personal data, not merely a physical document or an abridged version. This means that a patient is entitled to access the full spectrum of their personal data within their medical records, ensuring they can fully verify and understand their information. 

Conclusion 

The CJEU’s decision in FT v DW reaffirms the GDPR’s dedication to data subject rights and offers a helpful interpretation of the GDPR. It confirms the right of individuals to a free first copy of their personal data for any purpose, refutes the imposition of fees by national law for such access, and establishes the right to a comprehensive reproduction of personal data contained within medical records. The judgment goes on to say that, even where the term ‘copy’ is used, the data provided must be complete, as well as contextual and intelligible, as required by Article 12(1) of the GDPR. 

We will be examining the impact of this on our upcoming Handling SARs course as well as looking at the ruling in our GDPR Update course. Places are limited so book early to avoid disappointment.