The Data Protection and Digital Information Bill: A new UK GDPR?

In July the Government published the Data Protection and Digital Information Bill, the next step in its much-publicised plans to reform the UK data protection regime following Brexit.

In the Government’s response to the September 2021 consultation (“Data: A New Direction”) it said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” To achieve this, the new Bill proposes substantial amendments to existing UK data protection legislation, namely the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). There is no shiny new Data Protection Act 2022 or even a new colour for the UK GDPR! Perhaps a missed opportunity to showcase the benefits of Brexit!

In addition to reforming core data protection law, the Bill deals with certification of digital identity providers, electronic registers of births and deaths and information standards for data-sharing in the health and adult social care system. The notable DP provisions are set out below.

Amended Definition of Personal Data

Clause 1 of the Bill limits the scope of personal data to:

  • where the information is identifiable by the controller or processor by reasonable means at the time of the processing; or
  • where the controller or processor ought to know that another person will likely obtain the information as a result of the processing and the individual will likely be identifiable by that person by reasonable means at the time of the processing.

This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. It could make it easier for organisations to achieve data anonymisation as they would no longer need to concern themselves with potential future identifiability, with the focus instead being on identifiability “at the time of the processing”. On the other hand, the change does not address the risk of indirect identification.

Vexatious Data Subject Requests

Article 12 of the UK GDPR allows controllers to refuse to comply with data subject rights requests (or charge a fee) when the requests are “manifestly unfounded” or “excessive”. Clause 7 of the Bill proposes to replace this with “vexatious” or “excessive”. Examples of vexatious requests given in the Bill are those intended to cause distress, not made in good faith, or that are an abuse of process. All of these could easily fit into “manifestly unfounded” and so it is difficult to understand the need for change here.

Data Subject Complaints

Currently, the UK GDPR allows a data subject to complain to the Information Commissioner, but nothing expressly deals with whether or how they can complain to a controller. Clause 39 of the Bill would make provision for this and require the controller to acknowledge receipt of such a complaint within 30 days and respond substantively “without undue delay”. However, under clause 40, if a data subject has not made a complaint to the controller, the ICO is entitled not to accept the complaint.

Much was made about “privacy management programmes” in the Government’s June announcement. These are not expressly mentioned in the Bill but most of the proposals that were to have fallen under that banner are still there (see below).

Senior Responsible Individuals

As announced in June, the obligation for some controllers and processors to appoint a Data Protection Officer (DPO) is proposed to be removed. However, public bodies, and those who carry out processing likely to result in a “high risk” to individuals, are required (by clause 14) to designate a senior manager as a “Senior Responsible Individual” (SRI). Just like the DPO, the SRI must be adequately resourced and cannot be dismissed for performing their tasks under the role. The requirement for them to be a senior manager (rather than just reporting to senior management, as current DPOs must) will cause problems for those organisations currently using outsourced DPO services.

ROPAs and DPIAs

The requirement for Records of Processing Activities (ROPAs) will also go. Clause 15 of the Bill proposes to replace it with a leaner “Record of Processing of Personal Data”.  Clause 17 will replace Data Protection Impact Assessments (DPIAs) with leaner and less prescriptive Assessments of High Risk Processing. Clause 18 ensures that controllers are no longer required, under Article 36 of the UK GDPR, to consult the ICO on certain high risk DPIAs.

Automated Decision Making

Article 22 of UK GDPR currently confers a “right” on data subjects not to be subject to automated decision making which produces legal effects or otherwise significantly affects them. Clause 11 of the Bill reframes Article 22 in terms of a positive right to human intervention. However, it would only apply to “significant” decisions, rather than decisions that produce legal effects or similarly significant effects. It is unclear whether this will make any practical difference. 

International Transfers 

The judgment of the European Court of Justice (ECJ) in “Schrems II” not only stated that organisations transferring personal data to the US can no longer rely on the Privacy Shield Framework as a legal transfer tool. It also said that in any international data transfer situation, whether to the USA or other countries, the data exporter needs to make a complex assessment of the recipient country’s data protection legislation to ensure that it adequately protects the data, especially from access by foreign security agencies (a Transfer Impact Assessment or TIA).

The Bill amends Chapter 5 of the UK GDPR (international transfers) with the introduction of the “data protection test” for the above-mentioned assessment. This would involve determining if the standard of protection provided for data subjects in the recipient country is “not materially lower” than the standard of protection in the UK. The new test would apply both to the Secretary of State, when making “adequacy” determinations, and to controllers, when deciding whether to transfer data. The explanatory notes to the Bill state that the test would not require a “point-by-point comparison” between the other country’s regime and the UK’s. Instead, an assessment will be “based on outcomes i.e. the overall standard of protection for a data subject”.

An outcome-based approach will be welcomed by organisations that regularly transfer personal data internationally, especially where it is of no practical interest to foreign security agencies. However, this proposed approach will attract the attention of the EU (see later, and also our forthcoming International Transfers webinar).

The Information Commission

Under clause 100 of the Bill, the Information Commissioner’s Office will transform into the Information Commission, a corporate body with a chief executive (presumably John Edwards, the current Commissioner).

The Commission would have a principal function of overseeing data protection alongside additional duties, such as having regard to the desirability of promoting innovation and competition, the importance of the prevention, investigation, detection and prosecution of criminal offences, and the need to safeguard public security and national security. New powers for the Commission include an audit/assessment power (clause 35) to require a controller to appoint a person to prepare and provide a report, and a power to compel individuals to attend for interviews (clause 36) in civil and criminal investigations.

The Bill also proposes to abolish the Surveillance Camera Commissioner and the Biometrics Commissioner.

Privacy and Electronic Communications (EC Directive) Regulations 2003 

Currently, under PECR, cookies (and similar technologies) can only be used to store or access information on end user terminal equipment without express consent where it is “strictly necessary”, e.g. for website security or the proper functioning of the site. The Bill proposes allowing cookies to be used without consent for the purposes of web analytics and to install automatic software updates (see the GDPR enforcement cases involving Google Analytics).

Another notable proposed change to PECR involves extending “the soft opt-in” to electronic communications from organisations other than businesses. This would permit political parties, charities and other non-profits to send unsolicited email and SMS direct marketing to individuals without consent, where they have an existing supporter relationship with the recipient.

Finally on PECR, the Bill proposes to increase the fines for infringement from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher).
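
In practical terms the new cap is simple arithmetic. Here is a minimal sketch (in Python; the function name and the example turnover figure are ours, for illustration only):

    def max_pecr_fine(global_annual_turnover_gbp: float) -> float:
        # Proposed PECR maximum, mirroring the UK GDPR cap:
        # £17.5m or 4% of global annual turnover, whichever is higher.
        FIXED_CAP_GBP = 17_500_000
        return max(FIXED_CAP_GBP, 0.04 * global_annual_turnover_gbp)

    # A business with £1bn global turnover: the cap rises from the
    # current £500,000 to £40m.
    print(f"£{max_pecr_fine(1_000_000_000):,.0f}")  # £40,000,000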

Business Data

The Bill would give the Secretary of State and the Treasury the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. “Customers” would not just be data subjects, but anyone who purchased (or received for free) goods, services or digital content from a trader in a consumer (rather than business) context. “Business data” would include information about goods, services and digital content supplied or provided by a trader. It would also include information about where those goods etc. are supplied, the terms on which they are supplied or provided, prices or performance and information relating to feedback from customers. Customers would potentially have a right to access their data, which might include information on the customer’s usage patterns and the price paid to aid personalised price comparisons. Similarly, businesses could potentially be required to publish, or otherwise make available, business data.

These provisions go much further than the existing data portability provisions in the UK GDPR. The latter does not guarantee provision of data in “real time”, does not cover wider contextual data, and does not apply where the customer is not an individual.

Adequacy?

The Bill is currently making its way through Parliament. The impact assessment reiterates that “the government’s view is that reform of UK legislation on personal data is compatible with the EU maintaining free flow of personal data from Europe.”  However, with the multiple amendments proposed in the Bill, the UK GDPR is starting to look quite different to the EU version. And the more the two regimes diverge, the more there is a risk that the EU might put a “spanner in the works” when the UK adequacy assessment is reviewed in 2024. Much depends on the balance struck in the final text of the Bill. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September. 

New DP and IG Practitioner Apprenticeship

Act Now Training has teamed up with Damar Training to provide the materials and expertise underpinning its new Data Protection and Information Governance Practitioner Level 4 Apprenticeship.

The apprenticeship, which received final approval in March, will help develop the skills of those working in the increasingly important fields of data protection and information governance. 

With the rapid advancement of technology, a huge amount of personal data is being processed by organisations and used to make important decisions affecting every aspect of people’s lives. This poses significant legal and ethical challenges, as well as the risk of incurring considerable fines from regulators for non-compliance.

This apprenticeship aims to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address these challenges.

Ibrahim Hasan, Director of Act Now, said:

“We are excited to be working with Damar Training to help deliver this much needed apprenticeship. We are committed to developing the IG sector and encouraging a diverse range of entrants to the IG profession. We have looked at every aspect of the IG Apprenticeship standard to ensure the training materials equip budding IG officers with the knowledge and skills they need to implement the full range of IG legislation in a practical way.”

Damar’s managing director, Jonathan Bourne, added:

“We want apprenticeships to create real, long-term value for apprentices and organisations. It is vital therefore that we work with partners who really understand not only the technical detail but also the needs of employers.

“Act Now Training are acknowledged as leaders in the field, having recently won the Information and Records Management Society (IRMS) Supplier of the Year award for the second consecutive year. I am delighted therefore that we are able to bring together their 20 years of deep sector expertise with Damar’s 40+ year record of delivering apprenticeships in business and professional services.”

This apprenticeship has already sparked significant interest, particularly among large public and private sector organisations and professional services firms. Damar has also assembled an employer reference group that is feeding into the design process in real time to ensure that the programme works for employers.

The employer reference group met for the first time on May 25. It included industry professionals across a variety of sectors including private and public health care, financial services, local and national government, education, IT and data consultancy, some of whom were part of the apprenticeship trailblazer group.

If your organisation is interested in the apprenticeship please get in touch with us to discuss further.

Act Now Training Wins IRMS Supplier of the Year Award 2022-23

Act Now Training is proud to announce that it has won the Information and Records Management Society (IRMS) Supplier of the Year award for 2022-23. This is the second consecutive year we have won this award.

The awards ceremony took place on Monday night at the IRMS Conference in Glasgow. Act Now was also nominated for two other awards, including Innovation of the Year for our Advanced Certificate in GDPR Practice.

Ibrahim Hasan said:

“I would like to thank the IRMS for a great event and the members for voting for us. It feels really special to be recognised by fellow IG practitioners. We are proud to deliver great courses that meet the needs of IRMS members. This award also recognises the hard work of our colleagues who are focussed on fantastic customer service as well as our experienced associates who deliver great practical content and go the extra mile for our delegates. Congratulations to all the other IRMS awards winners.”

It has been another fantastic year for Act Now. We have launched some great new courses and products. We have exciting new courses planned for 2023. Watch this space!

BTW – Act Now also won the best elevator pitch prize at the conference vendor showcase. Click here to watch Ibrahim’s pitch.

The New Saudi Arabian Federal Data Protection Law 

The Middle East is fast catching up with Europe when it comes to data protection law. The Kingdom of Saudi Arabia (KSA) has enacted its first comprehensive national data protection law to regulate the processing of personal data. This is an important development alongside the passing of the new UAE Federal DP law. It also opens up opportunities for UK and EU data protection professionals, especially as these new laws are closely aligned with the EU General Data Protection Regulation (GDPR) and the UK GDPR.

The KSA Personal Data Protection Law (PDPL) was passed by Royal Decree M/19 of 9/2/1443H on 16 September 2021, approving Resolution No. 98 dated 7/2/1443H (14 September 2021). The detailed Executive Regulations are expected to be published soon and will give more details about the new law. It will be effective from 23rd March 2022, following which there will be a one-year implementation period.

Enforcement 

PDPL will initially be enforced by the Saudi Arabian Authority for Data and Artificial Intelligence (SDAIA). The Executive Regulations will set out the administrative penalties that can be imposed on organisations for breaches. Expect large fines for non-compliance alongside other sanctions. PDPL could mirror the GDPR, which allows the regulator to impose a fine of up to 20 million Euros or 4% of gross annual turnover, whichever is higher. PDPL also contains criminal offences which carry a term of imprisonment of up to 2 years and/or a fine of up to 3 million Saudi Riyals (approximately £566,000). Affected parties may also be able to claim compensation.

Territorial Scope

PDPL applies to all organisations that are processing personal data in the KSA, irrespective of whether the data relates to Data Subjects living in the KSA. It also has an “extra-territorial” reach by applying to organisations based abroad who are processing personal data of Data Subjects resident in the KSA. Interestingly, unlike the UAE Federal DP law, PDPL does not exempt government authorities from its application, although there are various exemptions from certain obligations where the data processing relates to national security, crime detection, statutory purposes etc.

Notable Provisions

PDPL mirrors GDPR’s underlying principles of transparency and accountability and empowers Data Subjects by giving them rights in relation to their personal data. We set out below the notable provisions including links to previous GDPR blog posts for readers wanting more detail, although more information about the finer points of the new law will be included in the forthcoming Executive Regulations. 

  • Personal Data – PDPL applies to the processing of personal data which is defined very broadly to include any data which identifies a living individual. However, unlike GDPR, Article 2 of PDPL includes within its scope the data of a deceased person if it identifies them or a family member.
  • Registration – Article 23 requires Data Controllers (organisations that collect personal data and determine the purpose for which it is used and the method of processing) to register on an electronic portal that will form a national record of controllers.
  • Lawful Bases – Like the UAE Federal DP law, PDPL makes consent the primary legal basis for processing personal data. There are exceptions including, amongst others, if the processing achieves a “definite interest” of the Data Subject and it is impossible or difficult to contact the Data Subject.
  • Rights – Data Subjects are granted various rights in Articles 4, 5 and 7 of the PDPL which will be familiar to GDPR practitioners. These include the right to information (similar to Art 13 of GDPR), rectification, erasure and Subject Access. All these rights are subject to exemptions similar to those found in Article 23 of GDPR.
  • Impact Assessments – Article 22 requires (what GDPR Practitioners call) “DPIAs” to be undertaken in relation to any new high risk data processing operations. This will involve assessing the impact of the processing on the risks to the rights of Data Subjects, especially their privacy and confidentiality.
  • Breach Notification – Article 20 requires organisations to notify the regulator, as well as Data Subjects, if they suffer a personal data breach which compromises Data Subjects’ confidentiality, security or privacy. The timeframe for notifying will be set by the Executive Regulations.
  • Records Management – Organisations will have to demonstrate compliance with PDPL by keeping records. There is a specific requirement in Article 3 to keep records similar to a Record of Processing Activities (ROPA) under GDPR.
  • International Transfers – Like other data protection regimes, PDPL imposes limitations on the international transfer of personal data outside the KSA. There are exceptions; further details will be set out in the Executive Regulations.
  • Data Protection Officers – Organisations (both controllers and processors) will need to appoint at least one officer to be responsible for compliance with PDPL. The DPO can be an employee or an independent service provider and does not need to be located in the KSA. 
  • Training – After 23 March 2022, Data Controllers will be required to hold seminars for their employees to familiarise them with the new law.

Practical Steps

Organisations operating in the KSA, as well as those who are processing the personal data of KSA residents, need to assess the impact of PDPL on their data processing activities. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so will lead not just to enforcement action but also to reputational damage. The following should be part of an action plan for compliance:

  1. Training the organisation’s management team to understand the importance of PDPL, its main provisions and the changes required to systems and processes.
  2. Training staff at all levels to understand PDPL and how it will impact their role.
  3. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed.
  4. Reviewing how records management and information risk are addressed within the organisation.
  5. Drafting Privacy Notices to ensure they set out the minimum information that should be included.
  6. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification.
  7. Drafting policies and procedures to deal with Data Subjects’ rights, particularly requests for subject access, rectification and erasure.
  8. Appointing and training a Data Protection Officer.

Act Now Training can help your organisation prepare for PDPL. We are running a webinar on this topic soon and can also deliver more detailed in-house training. Please get in touch to discuss your training needs. We are in Dubai and Abu Dhabi from 16th to 21st January 2022 and would be happy to arrange a meeting.

Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics including automated fingerprint identification systems for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this way back in 2014. Now a company called CRB Cunninghams has introduced facial recognition technology to allow schools to offer children the ability to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said:

“It’s normalising biometric identity check for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools for the first time. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case schools must provide a reasonable alternative means of accessing the service i.e. paying for school meals in the present case. 
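
The POFA consent rule reduces to simple boolean logic, sketched below (a minimal illustration in Python; the function and parameter names are ours, not taken from the Act):

    def may_process_child_biometrics(parent_consents: bool,
                                     child_objects: bool) -> bool:
        # POFA: written consent of at least one parent is required, and
        # the child's own objection or refusal overrides that consent.
        return parent_consents and not child_objects

    # If this returns False, the school must offer a reasonable
    # alternative, e.g. a non-biometric way of paying for meals.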

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data and there is a legal prohibition on anyone processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Express consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing.

In 2019 the Swedish Data Protection Authority fined an education authority SEK 200,000 (approximately 20,000 Euros) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that there had been a breach of Article 5, by processing students’ personal data in a manner that was more intrusive as regards personal integrity and encompassed more personal data than was necessary for the specified purpose (monitoring of attendance); of Article 9; and of Articles 35 and 36, by failing to fulfil the requirements for an impact assessment and failing to carry out prior consultation with the Swedish DPA.

The French regulator (CNIL) has also raised concerns about a facial recognition trial commissioned by the Provence-Alpes-Côte d’Azur Regional Council, and which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” CNIL also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology, it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Ring Doorbells, Domestic CCTV and GDPR

The Daily Mail reports today that, “A female doctor is set to be paid more than £100,000 after a judge ruled that her neighbour’s Ring smart doorbell cameras breached her privacy in a landmark legal battle which could pave the way for thousands of lawsuits over the Amazon-owned device.”

Dr Mary Fairhurst, the Claimant, alleged that she was forced to move out of her home because the internet-connected cameras are so “intrusive”. She also said that the Defendant, Mr Woodard, had harassed her by becoming “aggressive” when she complained to him.

A judge at Oxford County Court ruled yesterday that Jon Woodard’s use of his Ring cameras amounted to harassment, nuisance and a breach of data protection laws. The Daily Mail goes on to say:

“Yesterday’s ruling is thought to be the first of its kind in the UK and could set precedent for more than 100,000 owners of the Ring doorbell nationally.”

Before Ring doorbell owners rush out to dismantle their devices, let’s pause and reflect on this story. This was not about one person using a camera to watch their house or protect their motorbike. The Defendant had set up a network of cameras around his property which could also be used to watch his neighbour’s comings and goings. 

Careful reading of the judgement leads one to conclude that the legal action brought by the Claimant was really about the use of domestic cameras in such a way as to make a neighbour feel harassed and distressed. She was primarily arguing for protection and relief under the Protection from Harassment Act 1997 and the civil tort of nuisance. Despite the Daily Mail’s sensational headline, the judgement does not put domestic CCTV camera or Ring doorbell owners at risk of paying out thousands of pounds in compensation (as long as they don’t use the cameras to harass their neighbours!). However, it does require owners to think about the legal implications of their systems. Let’s examine the data protection angle.

Firstly, the UK GDPR can apply to domestic CCTV and door camera systems. After all, the owners of such systems are processing personal data (images and even voice recordings) about visitors to their property as well as passers-by and others caught in the systems’ peripheral vision.  However, on the face of it, a domestic system should be covered by Article 2(2)(a) of the UK GDPR which says the law does not apply to “processing of personal data by an individual in the course of purely personal or household activity.” Recital 18 explains further:

“This Regulation does not apply to the processing of personal data by a natural person in the course of a purely personal or household activity and thus with no connection to a professional or commercial activity. Personal or household activities could include correspondence and the holding of addresses, or social networking and online activity undertaken within the context of such activities.”

The judge in this case concluded that the camera system, set up by the Defendant, had collected data outside the boundaries of his property and, in the case of one specific camera, “it had a very wide field of view and captured the Claimant’s personal data as she drove in and out of the car park.” This would take the system outside of the personal and household exemption quoted above, as confirmed by the Information Commissioner’s CCTV guidance:

“If you set up your system so it captures only images within the boundary of your private domestic property (including your garden), then the data protection laws will not apply to you.

But what if your system captures images of people outside the boundary of your private domestic property – for example, in neighbours’ homes or gardens, shared spaces, or on a public footpath or a street?

Then the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA18) will apply to you, and you will need to ensure your use of CCTV complies with these laws.”

Once a residential camera system comes under the provisions of the UK GDPR then of course the owner has to comply with all the Data Protection Principles including the obligation to be transparent (through privacy notices) and to ensure that the data processing is adequate, relevant and not excessive. Data Subjects also have rights in relation to their data including to see a copy of it and ask for it to be deleted (subject to some exemptions).

Judge Clarke said the Defendant had “sought to actively mislead the Claimant about how and whether the cameras operated and what they captured.” This suggests a breach of the First Principle (lawfulness and transparency). There were also concerns about the amount of data some of the cameras captured (Fourth Principle).

Let’s now turn to the level of compensation which could be awarded to the Claimant. Article 82 of the UK GDPR does contain a free standing right for a Data Subject to sue for compensation where they have suffered material or non-material damage, including distress, as a result of a breach of the legislation. However, the figure mentioned in the Daily Mail headline of £100,000 seems far-fetched even for a breach of harassment and nuisance laws, let alone GDPR on its own. The court will have to consider evidence of the duration of the breach and the level of damage and distress caused to the Claimant.

This judgement does not mean that Ring door camera owners should rush out to dismantle them before passing dog walkers make compensation claims. It does though require owners to think carefully about the siting of cameras, the adequacy of notices and the impact of their system on their neighbour’s privacy.

The Daily Mail story follows yesterday’s BBC website feature about footballers attempting to use GDPR to control use of their performance data (see yesterday’s blog and Ibrahim Hasan’s BBC interview). Early Christmas gifts for data protection professionals to help them highlight the importance and topicality of what they do!

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Ronaldo’s Data and GDPR: Who said data protection is boring?

There is an interesting story this morning on the BBC website about a group of footballers threatening legal action and seeking compensation for the trade in their personal data. The use of data is widespread in every sport. It is not just used by clubs to manage player performance but by others such as betting companies to help them set match odds. Some of the information may be sold by clubs whilst other information may be collected by companies using public sources including the media.  

Now 850 players (Ed – I don’t know if Ronaldo is one of them but I could not miss the chance to mention my favourite footballer!), led by former Cardiff City manager Russell Slade, want compensation for the trading of their performance data over the past six years by various companies. They also want an annual fee from the companies for any future use. The data ranges from average goals-per-game for an outfield player to height, weight and passes during a game. 

BBC News says that an initial 17 major betting, entertainment and data collection firms have been targeted, but Slade’s Global Sports Data and Technology Group has highlighted more than 150 targets it believes have “misused” data. His legal team claim that the fact players receive no payment for the unlicensed use of their data contravenes the General Data Protection Regulation (GDPR). However, the precise legal basis of their claim is unclear. 

In an interview with the BBC, Slade said:

“There are companies that are taking that data and processing that data without the individual consent of that player.”

This suggests a claim for breach of the First Data Protection Principle (Lawfulness and Transparency). However, if the players’ personal data is provided by their clubs e.g., height, weight, performance at training sessions etc. then it may be that players have already consented (and been recompensed for this) as part of their player contract. In any event, Data Protection professionals will know that consent is only one way in which a Data Controller can justify the processing of personal data under Article 6 of GDPR. Article 6(1)(f) allows processing where it:

“is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data… .”

Of course, this requires a balancing exercise considering the interests pursued by the clubs and data companies and the impact on individual players’ privacy. Some would argue that as far as public domain information is concerned, the impact on players’ privacy is minimal. However, “the interests or fundamental rights and freedoms of the data subject” also encompass protection against reputational damage, loss of control and financial loss, all of which, it could be argued, result from the alleged unauthorised use of data.

The BBC article quotes former Wales international Dave Edwards, one of the players behind the move:

“The more I’ve looked into it and you see how our data is used, the amount of channels it’s passed through, all the different organisations which use it, I feel as a player we should have a say on who is allowed to use it.”

The above seems to suggest that the players’ argument is also about control of their personal data. The GDPR does give players rights over their data which allow them to exercise some element of control including the right to see what data is held about them, to object to its processing and to ask for it to be deleted. It may be that players are exercising or attempting to exercise these rights in order to exert pressure on the companies to compensate them.

Without seeing the paperwork, including the letters before action which have been served on the companies, we can only speculate about the basis of the claim at this stage. Nonetheless, this is an interesting case and one to watch. If the claim is successful, the implications could have far-reaching effects beyond football. Whatever happens, it will get data protection talked about on the terraces!

Ibrahim Hasan, solicitor and director of Act Now Training, has given an interview to BBC Radio 4’s PM programme about this story. You can listen again here (from 39 minutes onwards).

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Online GDPR Practitioner Certificate: Going from Strength to Strength


Act Now Training would like to congratulate all our delegates who have recently completed our online GDPR Practitioner Certificate.

At the start of the pandemic, we decided to offer our flagship classroom-based GDPR Practitioner Certificate as an online option. We redesigned the course for the online world with even more emphasis on practical exercises and case studies to try and recreate the classroom learning environment. Delegates receive all the fantastic features of our classroom course but in a live online learning environment accessible from anywhere in the world.

The course is aimed at those undertaking the role of Data Protection Officer under GDPR whether in the public or the private sector. It teaches delegates all the essential GDPR skills and knowledge. The course takes place over four days (one day per week) and involves lectures, assessments and exercises. This is followed by a written assessment. Candidates are then required to complete a practical project (in their own time) to achieve the certificate.

In less than 18 months, 178 delegates have completed the course representing a diverse range of organisations including private companies, councils, universities and government departments. We have even had delegates from the Houses of Parliament, Gibraltar and the Isle of Man. 

Read some of the delegate feedback below:

“The GDPR Practitioner Certificate course was really excellent.  The content was thorough and tied in with real life situations to really embed the learning.  It has really helped me in my role, even after the first week I had new practical skills I could use in my daily work life.” KL, Kent County Council

“Very useful, insightful course that provided hands on practical tips on GDPR implementation within our business.” AD, Danske Bank

“The course was delivered at a pace that suited the learners and with ample opportunities to revisit tricky topics or ask for clarification. The supplied learning materials were comprehensive and genuinely added value to the learning experience.” Nigel Leech, CEFAS

“The course was the right balance between overall guidance and detailed information presented in an easy and understandable format.” RR, Chugai Pharma Europe Ltd

“The course delivered through Act Now was not only educational but informative. The teaching skills were excellent, I felt at ease in the class and came away with a tremendous amount of knowledge that I can use in everyday situations. Thank you.” HB, Foseco International Limited

Our next online GDPR Practitioner Certificate courses start in October. We also have a classroom course starting in November in Manchester. 

For those who have successfully completed our GDPR Practitioner Certificate, we have one place left on our Advanced Certificate in GDPR Practice course starting in October. 

Covid-19, GDPR and Temperature Checks


Emma Garland writes…

Many countries have now been in some form of lockdown for a considerable length of time. As some of the lockdown measures are slowly being eased, one of the possible solutions to prevent a “second wave” is the implementation of temperature checks in shops and workplaces. This involves placing a thermometer on an individual’s forehead. Of course, if the temperature is recorded or there is another way the individual can be identified, it will involve processing health data. Care must be taken to consider the GDPR and privacy implications.

Apple reopened stores across Germany on 11th May with extra safety procedures, including temperature checks and social distancing. It is now facing a probe by a regional German data protection regulator into whether its plan to take the temperature of its store customers violates GDPR.

The benefits of temperature checks are self-evident. By detecting members of the public or staff who have a high temperature, and not permitting them to enter the store or workplace, staff have less risk of close contact with people who may have COVID 19. However, temperature checks are just one small part of stopping the spread of COVID 19 and can be intrusive. What is the lawful basis for processing such data? Art 6(1)(d) of GDPR allows processing where it:

“…is necessary in order to protect the vital interests of the data subject or of another natural person”

Of course “data concerning health” is also Special Category Data and requires an Article 9 condition to ensure it is lawful. Is a temperature check necessary to comply with employment obligations, for medical diagnosis or for reasons of public health?

All conditions under Articles 6 and 9 must satisfy the test of necessity. There are many causes of a high temperature, not just COVID 19. There have also been doubts over the accuracy of temperature readings. Thermometers take skin temperature, which can vary from core temperature, and do not account for the incubation phase of the disease where people may be asymptomatic.

ICO Guidance

The Information Commissioner’s Office (ICO) has produced guidance on workplace testing which states:

“Data protection law does not prevent you from taking the necessary steps to keep your staff and the public safe and supported during the present public health emergency.
But it does require you to be responsible with people’s personal data and ensure it is handled with care.”

The ICO suggests that “legitimate interests” or “public task” could be used to justify the processing of personal data as part of a workplace testing regime. The former will require a Legitimate Interests Assessment, where the benefit of the data to the organisation is balanced against the risks to the individual. In terms of Article 9, the ICO suggests the employment condition, supplemented by Schedule 1 of the Data Protection Act 2018. The logic here is that employment responsibilities extend to compliance with a wide range of legislation, including health and safety.

More generally, the ICO says that technology which could be considered privacy intrusive should have a high justification for usage. It should be part of a well thought out plan which ensures that it is an appropriate means to achieve a justifiable end. Alternatives should also have been fully evaluated. The ICO also states:

“If your organisation is going to undertake testing and process health information, then you should conduct a DPIA focussing on the new areas of risk.”

A Data Protection Impact Assessment should map the flow of the data, including collection, usage, retention and deletion, as well as the associated risks to individuals. Some companies are even using thermal cameras as part of COVID 19 testing. The Surveillance Camera Commissioner (SCC) and the ICO have worked together to update the SCC DPIA template, which is specific to surveillance systems.

As shops begin to open and the world establishes post COVID 19 practices, many employers and retailers will be trying to find their “new normal”. People will also have to decide what they are comfortable with. Temperature checks should be part of a considered approach evaluating all the regulatory and privacy risks.

Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.

The NHS COVID 19 Contact Tracing App: Part 3 The Human Rights Angle


Everyone will agree that the government needs to do everything it can to prevent the further spread of the Coronavirus and to “save lives” (except if your name is Dominic Cummings – Ed). However, there is much less consensus about what it should do, and this can be seen in the current debate about the proposal to roll out a contact tracing system and the NHS COVID App. This is the third in a series of blog posts where we examine the COVID App from different perspectives.

On 7 May 2020, the Parliamentary Joint Committee on Human Rights (PJCHR) published its report on the proposed contact tracing system and made a series of important recommendations to address its concerns about the compatibility of the scheme with data protection laws and the Human Rights Act 1998. After waiting for two weeks, the Secretary of State for Health, Matt Hancock, replied to the Committee rejecting its proposals as “unnecessary”! Let us examine those proposals in detail.

The Human Rights Considerations

Section 6 of the Human Rights Act 1998 makes it unlawful for any public authority (that includes the UK government and NHSX) to act in a way that is incompatible with a Convention right. Article 8(1) of the ECHR states that “Everyone has the right to respect for his private and family life, his home and his correspondence.” This is not an absolute right. Article 8(2) provides that an interference with the right to privacy may be justified if it:

“is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”

However, the government also has an obligation to protect the “right to life” enshrined in Article 2 of the ECHR. This means that if the NHS COVID App really can prevent the spread of the virus and save lives, then this is going to be a major consideration in deciding whether the interference with Article 8 is necessary and proportionate.

The PJCHR’s report provides a very detailed assessment of some of the human rights implications of the “centralised” approach that the NHS has proposed. The overall conclusion of the report is that if the app is effective it could help pave the way out of current lockdown restrictions and help to prevent the spread of Coronavirus. However, it also concludes that the app, in its current form, raises “significant concerns regarding surveillance and the impact on other human rights which must be addressed first.”

How will the COVID App interfere with the right to privacy?

At first glance it would appear that the COVID App does not involve the transfer of any personal data. As explained in the first blog in this series, each app user will be given a unique ID made up of a set of random numbers and the first half of the person’s postcode. The NHS website suggests that this ‘anonymises’ the information. However, as the Parliamentary Report notes, there are parts of England where fewer than 10,000 people live in a postcode area and as little as 3 or 4 “bits” of other information could be enough to identify individuals. The report also notes that relying upon self-reporting alone (without requiring confirmation that a person has tested positive for COVID 19) may carry the risk of false alerts, thereby impacting on other people’s rights if they have to self-isolate unnecessarily.
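
The arithmetic behind this concern is easy to sketch. Here is a minimal illustration (in Python; the population figure and the example attributes are our assumptions for illustration, not taken from the report):

    import math

    # To single someone out of a postcode district you need roughly
    # log2(population) bits of auxiliary information in total.
    population = 10_000
    print(math.ceil(math.log2(population)))  # 14 bits

    # Each known attribute shrinks the anonymity set. For example:
    # approximate age (~100 values), gender (2), household size (~8).
    candidates = population
    for attribute_values in (100, 2, 8):
        candidates = max(1, candidates // attribute_values)

    print(candidates)  # ~6 people left after just three attributes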

Necessary interference?

An interference with a person’s right to privacy under ECHR Article 8 may be justified under Article 8(2) if it is “in accordance with the law” and is “necessary” for the protection of “health” (see above).

To be in accordance with the law, the app must meet the requirements of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA). However, as noted below, the PJCHR believes that the “current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system”. The Committee’s recommendations in relation to this are considered below.

The remaining human rights consideration is whether the interference with peoples’ private lives is “necessary”. The answer depends on whether the use of the app will contribute to reducing the spread of COVID 19 and whether it will save lives. This in turn depends on whether the app works and on the uptake of the app.

Although it was reported that uptake of the app on the Isle of Wight exceeded 50% of the population, this falls short of the 60% that the government had previously suggested was necessary for the app to be effective. It is also debatable whether the uptake will be the same on the mainland. If the App is not capable of achieving its objective of preventing the spread of the virus, then the interference with peoples’ privacy rights will not be proportionate and will not fulfil the requirement of necessity in Article 8(2).

Although many people will probably download the app without thinking about privacy issues (how often do any of us download apps without checking Privacy Notices?), many others may have some real privacy concerns, particularly after the recent media debates. This has not been helped by reports that Serco (the company contracted to train call centre staff for the contact tracing scheme) has accidentally shared the email addresses of 300 contact tracers. Or by the fact that in other parts of the world there is growing concern about the privacy issues related to the use of contact tracing apps. Uptake of the app may be adversely affected if people lack confidence in the way in which data is being processed and why, and in the light of above they may have concerns about data security.

Consequently, the PJCHR’s report includes a series of recommendations aimed at ensuring that “robust privacy protections” are put in place, as these are key to ensuring the effectiveness of the app.

Central to their recommendations was a proposal that the government introduce legislation to provide legal certainty about how personal data will be processed by the COVID App. Although individuals’ data protection rights are protected by the GDPR and DPA 2018, the Committee believes that it is “nearly impossible” for the public to understand what will happen to their data, and also that it is necessary to turn government assurances about privacy into statutory obligations. The PJCHR sent a copy of their draft Bill to the Secretary of State, Matt Hancock. However, on 21 May Matt Hancock rejected that proposal on the basis that the existing law provides “the necessary powers, duties and protections” and that participation in contact tracing and use of the app is voluntary. In contrast, the Australian government has passed additional new privacy protection legislation specifically aimed at the collection, use and disclosure of its COVIDSafe app data.

The Committee’s other recommendations are:

  1. The appointment of a Digital Contact Tracing Human Rights Commissioner to oversee the use, effectiveness and privacy protections of the app and any data associated with digital contact tracing. It calls for the Commissioner to have the same powers as the Information Commissioner. It would appear that Matt Hancock has also rejected this proposal on the basis that there is already sufficient governance in place.
  2. Particular safeguards for children under 18 to monitor children’s use, ensure against misuse and allow for interviews with parents where appropriate. It is noticeable that the Committee has set the age at 18.
  3. The app’s contribution to reducing the severity of the lockdown and to helping to prevent the spread of COVID 19 must be demonstrated and improved at regular intervals for the collection of the data to be reasonable. Therefore the Secretary of State for Health must review the operation of the app on a three-weekly basis and must report to Parliament every three weeks.
  4. Transparency. In the second of this series of blog posts, we noted some of the issues relating to the publication of the Data Protection Impact Assessment. The PJCHR calls for this to be made public as it is updated.
  5. Time limited. The data associated with the contact tracing app must be permanently deleted when it is no longer required and may not be kept beyond the duration of the health emergency. However these terms may be open to some interpretation.

Matt Hancock has written that he will respond to these other issues “in due course”. It is unclear what this means, but it does not suggest any immediate response.

The Draft Bill

The PJCHR’s draft bill (rejected by Matt Hancock) proposed a number of important provisions, some of which are set out below.

The Bill specifically limited the purpose of the COVID App to:

  1. Protecting the health of individuals who are or may become infected with Coronavirus; and
  2. Preventing or controlling the spread of Coronavirus.

Additionally, it contained provisions that prohibited the use of centrally held data without specific statutory authorisation, and limited the amount of time that data could be held on a smartphone to 28 days, followed by automatic deletion unless a person has notified that they have COVID 19 or suspected COVID 19. It also prohibited “data reconstruction” in relation to any centrally held data. The fact that the Bill includes this seems to suggest an implicit recognition that the Unique IDs are not truly anonymous.

The ‘status’ of the NHS COVID App keeps changing and it remains to be seen when (and if) it will be rolled out. But the Northern Ireland Assembly has already announced it will be working with the Irish government to produce a coordinated response based on a decentralised model. It is reported to be doing this because of the difficulties and uncertainties surrounding the app, and the human rights issues arising from a centralised app.

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. We have 1 place left on the course starting on 11th June.

