Transparency in Health and Social Care: New ICO Guidance 

Within the health and social care sector, new technologies that rely on large amounts of personal data are being used to support both direct care and secondary purposes, such as planning and research. An example is the use of AI to provide automated diagnoses based on medical imaging data from patients. 

Transparency is a key principle of UK data protection legislation. Compliance with the first data protection principle and Articles 13 and 14 of the UK GDPR ensures that data subjects are aware of how their personal data is used, allowing them to make informed choices about who they disclose their data to and how to exercise their data rights. 

On Monday the Information Commissioner’s Office (ICO) published new guidance to assist health and social care organisations in complying with their transparency obligations under the UK GDPR. It supplements existing ICO guidance on the principle of transparency and the right to be informed.

The guidance is aimed at all organisations, including those in the private and third sectors, that deliver health and social care services or process health and social care information. This includes local authorities, suppliers to the health and social care sector, universities using health information for research purposes and others (e.g. fire services, the police and education providers) that use health information for their own purposes. The guidance will help them to understand the definition of transparency and assess appropriate levels of transparency, as well as providing practical steps for developing effective transparency information. 

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.   

GDPR and Privacy: Balancing Consent and Medical Need 

The recent High Court ruling in Wessex Fertility Limited v University Southampton Hospital NHS Foundation Trust Human Fertilisation and Embryology Authority v Donor Conception Network (2024) answers the question of when an express refusal of consent can be overridden because of a medical need. 

The claimant in this case, Wessex Fertility Limited (“the Clinic”), sought declarations that it is lawful for it to request that an egg donor provide a DNA sample for the purposes of genetic analysis, and that the processing of the donor’s personal data involved in making this request is lawful under the GDPR. The first part of this question required the court to analyse the Human Fertilisation and Embryology Act 1990 (as amended). Data protection professionals will be interested in the second part, which required consideration of Article 8 of the ECHR (right to privacy) and the GDPR.  

Background 

The Clinic has been licensed to offer fertility treatment and related services since 1992. It treated a couple, Mr and Mrs H, using eggs donated by Donor A and Mr H’s sperm resulting in the birth of a baby girl, AH. Sadly, AH was born with a number of health problems including polydactyly, a cardiac abnormality and gross motor delay.
In order to best treat AH, the Clinic needed to understand the genetic cause of her health problems. It wished to contact Donor A to request that she provide a DNA sample, which would be used to carry out genetic analysis with the aim of establishing a genetic diagnosis of AH’s condition. 

Consent Wording 

When Donor A donated her eggs, she was asked by the Clinic to complete a number of consent forms, one of which had the following statement: 

“I/we wish to be notified if Wessex Fertility learns (e.g., through the birth of an affected child) that I have a previously unsuspected genetic disease, or that I am a carrier of a harmful inherited condition.”  

Beside this part of the form there was a box to tick ‘yes’ or ‘no’. Donor A, who completed this form in advance of her first egg collection for use by Mr and Mrs H, ticked the box to state that she did not wish to be notified in the event of such a discovery. Consistent with this, Donor A completed the form in the same way at the time of her first egg collection for use by a different recipient couple, and again when she returned for her second egg collection the following year. 

The judge considered whether Donor A, having specifically refused consent to be informed of any genetic conditions she might have, could be asked to provide DNA for genetic analysis. 

The Law 

The judge first applied Article 8 of the ECHR (the right to privacy). The judge closely analysed the wording of the consent questions which had been posed to Donor A.
He considered that the question to which Donor A answered “no”, in respect of being informed of genetic conditions, was not drafted in a way which imagined the scenario which had now arisen. Accordingly, it was possible to say that Donor A had not refused consent for a matter such as this. 

In concluding that any interference with Donor A’s Article 8 rights would be justified and proportionate if the court made the declaration requested, the judge took account of, amongst other things, the obvious benefit to AH, as it may provide clarity about her diagnosis and/or treatment options in the widest sense. There may also be a wider benefit to others who may have been conceived, or may be conceived in the future, using Donor A’s eggs. 

The judge went on to consider whether the processing of Donor A’s personal data would be lawful under the GDPR, concluding that there was a lawful basis under Article 6: 

“It is clear the ‘legitimate interest’ under article 6(1)(f) is to enable trio testing to increase the chances of a diagnosis for AH and/or the provision of the correct treatment. Whilst it is recognised Donor A may not consent to providing DNA there is still a need to request it and all other steps short of making the request have been undertaken.” 

The Clinic was also processing Donor A’s health data, which is Special Category Data and thus required an additional Article 9 lawful basis. The judge said: 

“The processing under article 9(2)(h) is lawful as it is necessary for the purposes of AH’s diagnosis and/or provision of treatment. Professor Lucassen’s evidence is clear about the benefits of trio testing, such a request will be made by a health professional under obligation of secrecy. The request will not be incompatible with the purposes for which the details were collected, namely, to enable the Clinic to contact Donor A as and when required, which would include in accordance with any declarations made by this Court” 

This case shows the importance of ensuring that consent statements are carefully drafted. When it comes to what the data subject did or did not consent to, the precise wording of the consent statement will be carefully scrutinised by the courts.  

This and other data protection developments will be discussed by Robert Bateman in our forthcoming GDPR Update workshop. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

DP Bill: Updated Keeling Schedules Published 

The Data Protection and Digital Information Bill is currently in the Committee stage of the House of Lords. If passed, it will make changes to UK data protection legislation including the UK GDPR.

The Government recently published updated versions of the Keeling Schedules showing potential changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

Whilst no doubt there will be further amendments, the schedules are worth studying for a clear picture of the impact of the Bill. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. 

Apprentice Case Study – Meet Natasha

In 2022, Act Now Training teamed up with Damar to deliver the new Data Protection and Information Governance Practitioner Apprenticeship. The aim is to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address future IG challenges. Two years on, over 130 apprentices are currently on the programme with the first cohort about to undertake the end point assessment.

Data Protection and Information Governance Apprentice, Natasha Lock, is an integral part of the Governance and Compliance team at the University of Lincoln. With the Data Protection and Digital Information (No.2) Bill set to make changes to the UK data protection regime, Natasha talks to us about why this is a great area to work in and how the apprenticeship route has been particularly beneficial for her.

How did you get onto the apprenticeship?

“I was already working at the university as an Information Compliance Officer when the opportunity for a staff apprenticeship came up.

“The process was swift and straightforward, and I was enrolled on the Data Protection and Information Governance Apprenticeship within three months of enquiring.”

How has the apprenticeship helped you?

“I started with a good understanding of the UK Data Protection legislation but my knowledge has grown significantly, and now I’m coming to the end of my level 4 apprenticeship, I’ve gained so much more insight and my confidence has grown.

“As a university, we hold vast amounts of data. My apprenticeship is allowing me to solve the challenge of data retention and implement better measures to retain, destroy and archive information. I have developed a greater understanding of the legislative requirements we must adhere to as a public sector institute and how to reduce and assess data protection risks.

“I love the fact that I can study whilst still doing my job. The flexibility works for me because I can go through course materials at my own pace. I really feel like I have a brilliant work/life/study balance.

“The University of Lincoln and Damar Training have been fantastic in supporting me. I get along with my coach, Tracey, so well. She is very friendly and personable and has enabled my creativity to flow.

“The course is very interactive, and I’ve found the forums with other apprentices to be a very useful way of sharing knowledge, ideas and stories.

“I’m enjoying it so much and people have noticed that my confidence has grown. I wouldn’t have had that without doing this apprenticeship. I’ve now got my sights on doing a law degree or law apprenticeship in the future.”

Abi Slater, Information Compliance Manager at the University of Lincoln, said: “It has been great to see how much Natasha has developed over the course of the apprenticeship. I believe the apprenticeship has provided Natasha with the knowledge and skills required to advance in her data protection career and the support from her coach at Damar Training has been excellent.

“I would encourage anyone with an interest in data protection and information governance to consider this apprenticeship.”

Tracey Coetzee, Coach at Damar Training, said: “The Data Protection and Information Governance Apprenticeship was only approved by the Institute of Apprenticeships in 2022, and it’s delightful to see apprentices flourishing on the programme.

“From cyber security to managing data protection risks, this programme is upskilling participants and adding value to both private and public sector organisations and we’re thrilled to see the first cohort, including Natasha, approach the completion of their training.”

If you are interested in the DP and IG Apprenticeship, please see our website for more details and get in touch to discuss further.

Kate Middleton’s Medical Records: Can anyone go to jail?

Kate Middleton seems to be at the centre of much media (and social media) attention at present. In addition to speculation about her health and whereabouts, there has been much focus on, and analysis of, the now famous photoshopped Mother’s Day photo.

This week it was reported that employees at the private London Clinic in Marylebone, where Kate was a patient following abdominal surgery in January, attempted to view her medical records. Reportedly three employees have now been suspended.

The Health Minister, Maria Caulfield, told Sky News it was “pretty severe and serious stuff to be accessing notes that you don’t have permission to”. She also said police had been “asked to look at” whether staff at the clinic attempted to access the princess’s private medical records. 

If the reports are true and individuals are proven to have been “snooping”, what are the consequences? Firstly, this would normally be a matter for the Information Commissioner’s Office (ICO) to investigate rather than the police. Section 170 of the Data Protection Act 2018 makes it a criminal offence for a person to knowingly or recklessly: 

(a) obtain or disclose personal data without the consent of the controller, 

(b) procure the disclosure of personal data to another person without the consent of the controller, or 

(c) after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained. 

Section 170 is similar to the offence under section 55 of the old Data Protection Act 1998 which was often used to prosecute employees who had accessed healthcare and financial records without a legitimate reason. In June 2023, the ICO disclosed that since 1st June 2018, 92 cases involving Section 170 offences were investigated by its Criminal Investigations Team.   

Consequences 

Section 170 is only punishable by way of a fine; perpetrators cannot be sent to prison. Although there is now no cap on the maximum fine, prosecutions have resulted in relatively low fines compared to the harm caused.  

Two recent prosecutions have involved employees accessing medical records. In November 2023, Loretta Alborghetti was fined for illegally accessing the medical records of over 150 people. The offence took place whilst she worked as a medical secretary at Worcestershire Acute Hospitals NHS Trust. She was ordered to pay a total of £648. 

In August 2022, Christopher O’Brien, a former health adviser at the South Warwickshire NHS Foundation Trust, pleaded guilty to accessing medical records of patients without a valid legal reason. An ICO investigation found that he unlawfully accessed the records of 14 patients, who were known personally to him, between June and December 2019. One of the victims said the breach left them worried and anxious about O’Brien having access to their health records, with another victim saying it put them off from going to their doctor. O’Brien was ordered to pay £250 compensation to 12 patients, totalling £3,000. 

Computer Misuse Act  

A Section 170 prosecution would have a much greater deterrent effect if the sanctions included a custodial sentence. Successive Information Commissioners have argued for this but to no avail.  

The relatively low fines have led to some cases being prosecuted under section 1 of the Computer Misuse Act 1990, which carries tougher sentences including a maximum of two years’ imprisonment on indictment. In July 2022, a woman who worked for Cheshire Police pleaded guilty to using police data systems to check up on ex-partners, and in August the ICO commenced criminal proceedings against eight individuals over the alleged unlawful accessing and obtaining of people’s personal information from vehicle repair garages to generate potential leads for personal injury claims. 

The ICO has now confirmed that it has received a personal data breach report from The London Clinic. If its investigation finds the Clinic did not comply with its security obligations under Article 5(1)(f) and Article 32 of the UK GDPR, it faces a possible maximum Monetary Penalty Notice of £17.5 million or 4% of gross annual turnover (whichever is higher). This case highlights the importance of organisations ensuring adequate security measures around sensitive personal data, especially where the data relates to high profile individuals.  
 
This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next GDPR Practitioner Certificate. 

Conservative Party Challenged Over “Data Harvesting” 

In the run up to the General Election this year, political parties in the UK will face the challenge of effectively communicating their message to voters whilst at the same time respecting voters’ privacy. In the past few years, all parties have been accused of riding roughshod over data protection laws in their attempts to convince voters that they ‘have a plan’ or that ‘the country needs change’.  

In May 2017, the Information Commissioner’s Office (ICO) announced that it was launching a formal investigation into the use of data analytics for political purposes after allegations were made about the ‘invisible processing’ of people’s personal data and the micro-targeting of political adverts during the EU Referendum.
This culminated in a report to Parliament and enforcement action against Facebook, Emma’s Diary and some of the companies involved in the Vote Leave Campaign.  

In July 2018 the ICO published a report, Democracy Disrupted, which highlighted significant concerns about transparency around how people’s data was being used in political campaigning. The report revealed a complex ecosystem of digital campaigning with many actors. In 2019, the ICO issued assessment notices to seven political parties. It concluded: 

“The audits found some considerable areas for improvement in both transparency and lawfulness and we recommended several specific actions to bring the parties’ processing in compliance with data protection laws. In addition, we recommended that the parties implemented several appropriate technical and organisational measures to meet the requirements of accountability. Overall there was a limited level of assurance that processes and procedures were in place and were delivering data protection compliance.” 

In June 2021, the Conservative Party was fined £10,000 for sending marketing emails to 51 people who did not want to receive them. The messages were sent in the name of Boris Johnson in the eight days after he became Prime Minister in breach of the Privacy and Electronic Communications Regulations 2003 (PECR).  

The Tax Calculator 

The Good Law Project (GLP), a not for profit campaign organisation, is now challenging one aspect of the Conservative Party’s data collection practices. The party’s website contains an online tool which allows an individual to calculate the effect on them of recent changes to National Insurance contributions. However, GLP claims this tool is “a simple data-harvesting exercise” which breaches UK data protection laws in a number of ways. It says that a visit to the website automatically leads to the placement of non-essential cookies (related to marketing, analysis and browser tracking) on the visitor’s machine without consent. This is a breach of Regulation 6 of PECR. GLP also challenges the gathering and use of website visitors’ personal data on the site, claiming that (amongst other things) it is neither fair, lawful nor transparent and is thus a breach of the UK GDPR. 

Director of GLP, Jo Maugham, has taken the first formal step in legal proceedings against the Conservative Party. The full proposed claim is set out in the GLP’s Letter Before Action. The Conservative Party has issued a response arguing that they have acted lawfully and that: 

  • They did obtain consent for the placement of cookies. (GLP disagrees and has now made a 15-page complaint to the ICO.) 
  • They have agreed to change their privacy notice. (GLP is considering whether to ask the court to make a declaration of illegality, claiming that the Tories “have stated publicly that it was lawful while tacitly admitting in private that it is not.”) 
  • They have agreed to the request by GLP to stop processing Jo Maugham’s personal data where that processing reveals his political opinions.  

Following a subject access request, Mr Maugham received 1,384 pages of personal data held about him. GLP claim he is being profiled and believe that such profiling is unlawful. They have instructed barristers with a view to taking legal action.

This is one to watch. If the legal action goes ahead, the result will have implications for other political parties. In any event, in an election year, we are already seeing that all political parties’ data handling practices are going to be under the spotlight.

George Galloway

George Galloway’s landslide win in the Rochdale by-election last week has led to scrutiny of his party’s processing of Muslim voters’ data. In his blog post, Jon Baines discusses whether the Workers Party of Britain (led by Mr Galloway) has been processing Special Category Data in breach of the UK GDPR. In the run up to the by-election, the party had sent different letters to constituents based, it seems, on their religion (or perhaps inferring their religion from their name). If this is what it did then, even if the inference is wrong, the party has been processing Special Category Data, which requires a lawful basis under Article 9 of the UK GDPR.

In 2022, the ICO issued a fine in the sum of £1,350,000 to Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical condition and then, without their consent, targeting them with health-related products. Following the lodging of an appeal by Easylife, the ICO later reduced the fine to £250,000 but the legal basis of the decision still stands. Will the ICO investigate George Galloway?

The DP Bill

The Data Protection and Digital Information (No.2) Bill is currently in the Committee stage of the House of Lords. It will make changes to the UK GDPR, the Data Protection Act 2018 and Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). Some of the changes will make it easier for political parties to use the personal data of voters and potential voters without the usual GDPR safeguards.
For example, political parties could, in the future, rely on “legitimate interests” (as an Article 6 lawful basis) to process personal data without the requirement to conduct a balancing test against the rights and freedoms of data subjects, where those legitimate interests are “recognised”. These include personal data being processed for the purpose of “democratic engagement”. The Bill will also amend PECR so that political parties will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual who has expressed an interest.

As the General Election approaches, and with trust in politics and politicians at a low, all parties need to ensure that they are open, transparent and accountable about how they use voters’ data.  

Our workshop, How to do Marketing in Compliance with GDPR and PECR, is suitable for those advising political parties and any organisation which uses personal data to reach out to potential customers and service users. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits.

Police Misuse of Body Worn Camera Footage 

Last week the BBC reported that police officers made offensive comments about an assault victim while watching body camera footage of her exposed body.  

The woman had been arrested by Thames Valley Police and placed in leg restraints before being recorded on body-worn cameras. While being transported to Newbury police station, she suffered a seizure which resulted in her chest and groin being exposed. A day later she was released without charge. 

A female officer later reviewed the body camera footage, which the force told Metro.co.uk was for ‘evidential purposes’ and ‘standard practice’. The BBC reports that three male colleagues joined her and made offensive comments about the victim.
The comments were brought to the attention of senior police officers by a student officer, who reported his colleagues for covering up the incident. The student officer was later dismissed, though the police said this was unrelated to the report. 

The policing regulator says Thames Valley Police should have reported the case for independent scrutiny. The force has now done so, following the BBC investigation. 

This is not the first time the BBC has highlighted such an issue. In September 2023 it revealed the findings of a two-year investigation. It obtained reports of misuse from Freedom of Information requests, police sources, misconduct hearings and regulator reports. It found more than 150 camera misuse reports with cases to answer over misconduct, recommendations for learning or where complaints were upheld. (You can watch Bodycam cops uncovered on BBC iPlayer) 

The most serious allegations include: 

  • Cases in seven forces where officers shared camera footage with colleagues or
    friends – either in person, via WhatsApp or on social media 

  • Images of a naked person being shared between officers on email and cameras used to covertly record conversations 

  • Footage being lost, deleted or not marked as evidence, including video, filmed by Bedfordshire Police, of a vulnerable woman alleging she had been raped by an inspector – the force later blamed an “administrative error” 

  • Switching off cameras during incidents, for which some officers faced no sanctions – one force said an officer may have been “confused”

Body worn cameras are widely used these days, not just by the police but also by council officers, train guards, security staff and parking attendants (to name a few). 

There is no all-encompassing law regulating body worn cameras. Of course, they are used to collect and process personal data and so will be subject to the UK GDPR. Where used covertly, they may also be subject to the Regulation of Investigatory Powers Act 2000.  

The Information Commissioner’s Office (ICO) provides comprehensive guidelines on the use of CCTV, which are largely considered to extend to body worn cameras (BWCs) for security officers. There is a useful checklist on its website which recommends:  

  • Providing privacy information to individuals when using BWCs, such as clear signage, verbal announcements or lights/indicators on the device itself, and having readily available privacy policies. 
  • Training staff using BWCs to inform individuals that recording may take place if it is not obvious to them in the circumstances. 
  • Having appropriate retention and disposal policies in place for any footage that is collected. 
  • Having efficient governance procedures in place to be able to retrieve stored footage and process it for subject access requests or onward disclosures where required. 
  • Using technology which has the ability to efficiently and effectively blur or mask footage, if redaction is required to protect the rights and freedoms of any third parties. 

Our one-day CCTV workshop will teach you how to plan and implement a CCTV/BWC project including key skills such as completing a DPIA and assessing camera evidence.
Our expert trainer will answer all your questions including when you can use CCTV/BWC, when it can be covert and how to deal with a request for images.  
 
This workshop is suitable for anyone involved in the operation of CCTV, BWCs and drones including DPOs, investigators, CCTV operators, enforcement officers, estate managers and security personnel. 

The Hidden Reach of the Prevent Strategy:
Beyond Counter-Terrorism Units

The UK government’s anti-radicalisation program, Prevent, is reportedly sharing the personal details of thousands of individuals more extensively than previously known. This sharing includes not just counter-terrorism units, but also airports, ports, immigration services, and officials at the Home Office and the Foreign, Commonwealth and Development Office (FCDO). Critics argue that such widespread data sharing could be illegal, as it involves moving sensitive personal data between databases without the consent of the individuals. 

A Metropolitan police document titled “Prevent case management guidance” indicates that Prevent details are also shared with the ports authority watchlist. This raises concerns that individuals may face increased scrutiny at airports or be subjected to counter-terrorism powers without reasonable suspicion. The document also mentions that foreign nationals may have their backgrounds checked by the FCDO and immigration services for any overseas convictions or intelligence. 

Furthermore, the Acro Criminal Records Office, which manages UK criminal records, is notified about individuals referred to Prevent, despite the program dealing with individuals who haven’t necessarily engaged in criminal behaviour.
Counter-terror police emphasise their careful approach to data sharing, which aims to protect vulnerable individuals. 

Prevent’s goal is to divert people from terrorism before they offend, and most people are unaware of their referral to the program. 95% of referrals result in no further action. A secret database, the National Police Prevent Case Management database, was previously disclosed in 2019, revealing the storage of details of those referred to Prevent. 

Newly disclosed information, obtained through a freedom of information request by the Open Rights Group (ORG), reveals that Prevent data is shared across various police databases, including the Police National Computer, specialised counter-terrorism and local intelligence systems, and the National Crime Agency. 

The sharing of this data was accidentally revealed due to a redaction error in a heavily edited Met document. Despite its sensitive nature, the ORG decided to make the document public. Sophia Akram of the ORG expressed concerns over the extent of the data sharing and potential harms, suggesting that it could be unfair and possibly unlawful. 

The guidance also indicates that data is retained and used even in cases where no further action is taken. There are concerns about the impact on young people’s educational opportunities, as Prevent requires public bodies like schools and the police to identify individuals at risk of extremism. 

Recent figures show thousands of referrals to Prevent, predominantly from educational institutions. From April 2022 to March 2023, a total of 6,817 individuals were referred to the Prevent program. Within this group, educational institutions were responsible for 2,684 referrals. Breaking the referrals down by age, 2,203 involved adolescents aged between 15 and 20, and 2,119 involved children aged 14 or younger.

There are worries about the long-term consequences for children and young people referred to the program. Several cases have highlighted the intrusive nature of this data sharing and its potential impact on individuals’ lives, including cases in which students have missed out on a place at a sixth form college and others involving children as young as four years old.  

Prevent Watch, an organisation monitoring the program, has raised alarms about the data sharing, particularly its effect on young children. The FoI disclosures challenge the notion that Prevent is non-criminalising, as data on individuals, even those marked as ‘no further action’, can be stored on criminal databases and flagged on watchlists. 

Counter-terrorism policing spokespeople defend the program, emphasising its
multi-agency nature and focus on protecting people from harm. They assert that data sharing is carefully managed and legally compliant, aiming to safeguard vulnerable individuals from joining terror groups or entering conflict zones. 

Learn more about data sharing with our UK GDPR Practitioner Certificate. Dive into the issues discussed in this blog and secure your spot now.

Act Now Partners with Middlesex
University Dubai for UAE’s first
Executive Certificate in DP Law

Act Now Training, in collaboration with Middlesex University Dubai, is excited to announce the launch of the UAE’s first Data Protection Executive training programme. This qualification is ideal as a foundation for businesses and organisations aiming to comply with the UAE Federal Data Protection Law.

This practical course focusses on developing a data protection framework and ensuring compliance with the UAE Data Protection Law’s strict requirements. This is particularly relevant given the recent advancements in Data Protection law in the Middle East, including the UAE’s first comprehensive national data protection law, Federal Decree Law No. 45/2021. 

This law regulates personal data processing, emphasising transparency, accountability, and data subject rights. It applies to all organisations processing personal data within the UAE and abroad for UAE residents.

The importance of understanding this law is paramount for every business and organisation, as it necessitates a thorough reassessment of personal data handling practices. Non-compliance can lead to severe penalties and reputational damage.

The Executive Certificate in UAE DP Law is a practical qualification delivered over five weeks in two half-day sessions per week and offers numerous benefits:

  1. Expertise in Cutting-Edge Legislation: Gain in-depth knowledge of the UAE’s data protection law, essential for professionals at the forefront of data protection practices.

  2. Professional Development: This knowledge enhances your resume, especially for roles in compliance, legal, and IT sectors, showing a commitment to legal reforms.

  3. Practical Application: The course’s structured format allows gradual learning and practical application of complex legal concepts, ensuring a deep understanding of the law.

  4. Risk Mitigation: Understanding the law aids in helping organisations avoid penalties and reputational harm due to non-compliance.

  5. Networking Opportunities: The course provides valuable connections in the field of data protection and law.

  6. Empowerment of Data Subjects: Delegates gain insights into their rights as data subjects, empowering them to protect their personal data effectively.

Delegates will receive extensive support, including expert instruction, comprehensive materials, interactive sessions, practical exercises, group collaboration, ongoing assessment, and additional resources for further learning. Personal tutor support is also provided throughout the course.

This program is highly recommended for officers in organisations both inside and outside the UAE that conduct business in the region or have customers, agents, and employees there. 

Act Now has designed the curriculum and will be delivering the programme. Act Now Training is the UK’s premier provider of information governance training and consultancy, serving government organisations, multinational corporations, financial institutions, and corporate law firms.   

With a history of delivering practical, high-quality training since 2002, Act Now’s skills-based training approach has led to numerous awards, most recently the Supplier of the Year Award 2022-23 from the Information and Records Management Society in the UK. 

Our associates have decades of hands-on global Information Governance experience and are thus able to break down this complex area with real world examples, making it easy to understand and apply, and even fun!

Middlesex University Dubai is a 5-star KHDA-rated university and one of three global campuses, alongside London and Mauritius. It is the largest UK university in the UAE, with over 5,000 student enrolments from over 120 nationalities.

For more information and to register your interest, visit Middlesex University Dubai’s website. Alternatively, you can click here.

The British Library Hack: A Chapter in Ransomware Resilience

In a stark reminder of the persistent threat of cybercrime, the British Library has confirmed a data breach incident that has led to the exposure of sensitive personal data, with materials purportedly up for auction online. An October intrusion by a notorious cybercrime group targeted the library, which is home to an extensive collection, including over 14 million books.

Recently, the ransomware group Rhysida claimed responsibility, publicly displaying snippets of sensitive data, and announcing the sale of this information for a significant sum of around £600k to be paid in cryptocurrency.

While the group boasts about the data’s exclusivity and sets a firm bidding deadline (today 27th November 2023), the library has only acknowledged a leak of what seems to be internal human resources documents. It has not verified the identity of the attackers nor the authenticity of the sale items. The cyber attack has significantly disrupted the library’s operations, leading to service interruptions expected to span several months.

In response, the library has strengthened its digital defences, sought expert cybersecurity assistance, and urged its patrons to update their login credentials as a protective measure. The library is working closely with the National Cyber Security Centre and law enforcement to investigate, but details remain confidential due to the ongoing inquiry.

The consequences of the attack have necessitated a temporary shutdown of the library’s online presence. Physical locations, however, remain accessible. Updates can be found on the British Library’s X (formerly Twitter) feed. The risk posed by Rhysida has drawn attention from international agencies, with recent advisories from the FBI and US cybersecurity authorities. The group has been active globally, with attacks on various sectors and institutions.

The British Library’s leadership has expressed appreciation for the support and patience from its community as it navigates the aftermath of the cyber attack.

What is a Ransomware Attack?

A ransomware attack is a type of malicious cyber operation where hackers infiltrate a computer system to encrypt data, effectively locking out the rightful users. The attackers then demand payment, often in cryptocurrency, for the decryption key. These attacks can paralyse organisations, leading to significant data loss and disruption of operations.

Who is Rhysida?

The Rhysida ransomware group first came to the fore in May 2023, following the emergence of its victim support chat portal, accessible via the Tor browser. The group identifies as a “cybersecurity team” which highlights security flaws by targeting victims’ systems and spotlighting the supposed ramifications of the security issues involved.

How to prevent a Ransomware Attack?

Hackers are becoming more and more sophisticated in the ways they target our personal data. We have seen this recently with banking scams. However, there are some measures we can implement personally and within our organisations to help prevent a ransomware attack.

  1. Avoid Unverified Links: Refrain from clicking on links in spam emails or unfamiliar websites. Hackers frequently disseminate ransomware via such links, which, when clicked, can initiate the download of malware. This malware can then encrypt your data and hold it for ransom​​.

  2. Safeguard Personal Information: It’s crucial to never disclose personal information such as addresses, NI numbers, login details, or banking information online, especially in response to unsolicited communications​​.

  3. Educate Employees: Increasing awareness among employees can be a strong defence. Training should focus on identifying and handling suspicious emails, attachments, and links. Additionally, having a contingency plan in the event of a ransomware infection is important​​.

  4. Implement a Firewall: A robust firewall can act as a first line of defence, monitoring incoming and outgoing traffic for threats and signs of malicious activity. This should be complemented with proactive measures such as threat hunting and active tagging of workloads​​.

  5. Regular Backups: Maintain up-to-date backups of all critical data. In the event of a ransomware attack, having these backups means you can restore your systems to a previous, unencrypted state without having to consider ransom demands (a minimal backup-verification sketch follows this list).

  6. Create Inventories of Assets and Data: Having inventories of the data and assets you hold allows you to have an immediate knowledge of what has been compromised in the event of an attack whilst also allowing you to update security protocols for sensitive data over time.

  7. Multi-Factor Authentication: Identifying legitimate users in more than one way ensures that you are only granting access to those intended. 
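To illustrate point 5 in practice, below is a minimal Python sketch (standard library only) showing one way an IT team might record checksums of backed-up files and later verify them, so that a backup that has been silently altered or encrypted is spotted before it is needed. The backup path and manifest file name are purely illustrative assumptions, not a recommendation of any particular tool or product.

```python
# Minimal, illustrative sketch: record SHA-256 hashes of backed-up files,
# then re-check them later. Paths and file names are hypothetical examples.
import hashlib
import json
from pathlib import Path

BACKUP_DIR = Path("/mnt/backups/latest")   # hypothetical backup location
MANIFEST = Path("backup_manifest.json")    # hypothetical manifest file

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def create_manifest() -> None:
    """Hash every file in the backup and store the results as JSON."""
    manifest = {str(p): sha256(p) for p in BACKUP_DIR.rglob("*") if p.is_file()}
    MANIFEST.write_text(json.dumps(manifest, indent=2))

def verify_manifest() -> bool:
    """Re-hash the backup and flag any file that is missing or has changed."""
    manifest = json.loads(MANIFEST.read_text())
    ok = True
    for name, expected in manifest.items():
        p = Path(name)
        if not p.exists() or sha256(p) != expected:
            print(f"WARNING: {name} is missing or has changed")
            ok = False
    return ok

if __name__ == "__main__":
    create_manifest()      # run after each backup
    # verify_manifest()    # run periodically, or before relying on the backup
```

Run the manifest step after each backup and the verification step on a schedule; any warning suggests the backup should not be relied upon for recovery and should be investigated.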

These are some strategies organisations can use as part of a more comprehensive cybersecurity protocol which will significantly reduce the risk of falling victim to a ransomware attack. 

Join us on our workshops, “How to increase Cyber Security in your Organisation” and “Cyber Security for DPOs”, where we discuss all of the above and more, helping you create the right foundations for cyber resilience within your organisation.