UK GDPR Reform: Will There Be A New Consultation?

What is happening with the Government’s proposal for UK GDPR reform? Just like Donald Trump’s predicted “Red Wave” in the US midterm elections, it’s turning out to be a bit of a ripple!

In July the Boris Johnson Government published the Data Protection and Digital Information Bill. This was supposed to be the next step in its much publicised plans to reform the UK Data Protection regime following Brexit. The government projected it would yield savings for businesses of £1 billion over ten years. (Key provisions of the Bill are summarised in our blog post here.)

On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”.

The Bill’s passage through Parliament was suspended. It seemed that the drafters would have to go back to the drawing board to showcase even more “Brexit benefits”. There was even talk of another round of consultation. Remember, the Bill is the result of an extensive consultation launched in September 2021 (“Data: A New Direction”).

Last week, Ibrahim Hasan attended the IAPP Conference in Brussels. Owen Rowland, Deputy Director at the DCMS, told the conference that the latest “consultation” on the stalled Bill will begin shortly. However, he confirmed it will not be a full-blown public consultation:

“It’s important to clarify (the type of consultation). However, we are genuinely interested in continuing to engage with the whole range of stakeholders. Different business sectors as well as privacy and consumer groups,” Rowland said. “We’ll be providing details in the next couple of weeks in terms of the opportunities that we are going to particularly set up.”

The Bill may not receive a deep overhaul, but Rowland said he welcomes comments that potentially raise “amendments to (the existing proposal’s) text that we should make.” He added that the consultation is being launched to avoid “a real risk” of missing important points and to provide “opportunities we’re not fully utilising” to gain stakeholder insights.

Rowland went on to suggest that the DCMS would conduct some roundtables. If any of our readers are invited to the aforementioned tables (round or otherwise) do keep us posted. Will it make a difference to the content of the bill? We are sceptical but time will tell. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

AI and Data Protection: Is ‘Cortana’ such a problem?

‘AI’, or ‘Machine Learning’ as it’s also known, is becoming more prevalent in the working environment. From ‘rogue algorithms’ upsetting GCSE grades to Microsoft 365 judging you for working on only one document all day, we cannot escape the fact that there are more ‘automated’ services than ever before.

For DPOs, records managers and IG officers, this poses some interesting challenges to the future of records, information and personal data.  

I was asked to talk about the challenges of AI and machine learning at a recent IRMS Public Sector Group webinar. In the session, titled ‘IRM challenges of AI & something called the metaverse’, we looked at a range of issues, some of which I’d like to touch on below. While I remain unconvinced the ‘metaverse’ is going to arrive any time soon, AI and algorithms very much are here and are growing fast.

From a personal data and privacy point of view, we know that algorithms guide our online lives, from what adverts we see to what posts come up on our feeds on Twitter, Facebook etc. How is it not creepy that an algorithm knows more about me than I do? And how far does that go before it has mass implications for everyday citizens? What happens if the ‘social algorithm’ works out your sexuality before you or your family have? I work with families that, still to this day, will abuse and cast out those who are LGBTQ+, so imagine the damage a ‘we thought you’d like this’ post could do.

Interesting questions have been posed on Twitter about ‘deep fake’ videos and whether they are personal data. The answers are mixed and pose some interesting implications for the future. Can you imagine the impact if someone can use AI to generate a video of you doing something you are not meant to? That is going to take some doing to undo, by which time the damage is done. If you want to see this in action, I’d recommend watching Season 2 of ‘The Capture’ on BBC iPlayer.

In an organisational context, if organisations are to use algorithms to help with workload and efficient services, it is simple logic that the algorithm must be up to scratch. As the Borg Queen (a cybernetic alien) in Star Trek: First Contact once said to Data (a self-aware android), “you are an imperfect being, created by an imperfect being. Finding your weakness is only a matter of time”. If anyone can find me a perfectly designed system that doesn’t have process issues, bugs and reliability issues, do let me know.

Many data scientists and other leading data commentators like Cathy O’Neil frequently state that “algorithms are basically opinions embedded in code”. And opinions bring with them biases, shortcomings and room for error.
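To make that point concrete, here is a deliberately crude sketch of how a designer’s opinions get baked into a scoring algorithm. Every weight, threshold and input below is hypothetical and chosen purely for illustration:

```python
# A deliberately simple, hypothetical 'suitability' score. Every number below
# is an opinion chosen by the developer, not an objective fact: the weights,
# the cut-off, and even the choice of inputs embed someone's judgement.

def candidate_score(years_experience: int, gaps_in_cv: int) -> float:
    EXPERIENCE_WEIGHT = 1.0   # opinion: experience matters exactly this much
    GAP_PENALTY = 2.0         # opinion: CV gaps are bad (penalising carers, the ill, etc.)
    return EXPERIENCE_WEIGHT * years_experience - GAP_PENALTY * gaps_in_cv

def shortlist(years_experience: int, gaps_in_cv: int) -> bool:
    CUT_OFF = 5.0             # opinion: where the bar sits
    return candidate_score(years_experience, gaps_in_cv) >= CUT_OFF

print(shortlist(8, 0))  # True
print(shortlist(8, 2))  # False - the gap penalty, an embedded opinion, decides
```

Nothing in that code is ‘wrong’ in a technical sense; it runs perfectly. The bias lives entirely in the numbers someone chose.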

Now, that is not to say that these things do not have their advantages – they very much do. However, in order to get something good out of them you need to ensure good stuff goes into them and good stuff helps create them. Blindly building and feeding a machine just because it’s funky and new, as we have all seen time and again, always leads to trouble.  

Olu and I attended (me virtually and Olu in person) the launch of the AI Standards Hub, a fascinating initiative by the Alan Turing Institute and others, including the UK Government.

Now why am I talking about this at the moment? Put simply, as I mentioned above, this technology is here and is not going anywhere. Take a look at this company offering live editing of your voice, and you may even find this conversation between a Google engineer and an AI quite thought provoking and sometimes scary. If anything, AI is evolving at an ever-growing pace. Therefore information professionals from all over the spectrum need to be aware of how it works, how it can be used in their organisations, and how they can upskill to challenge and support it.

In recent times the ICO has been publishing a range of guidance on this, including a relatively detailed guide on how to use AI while considering the data protection implications. While it’s not a user manual, it does give some key points to consider and steps to work through.

Right Alexa, end my blog! Oh, I mean, Hey Siri, end my blog… Darn… OK Google…

If you are interested in learning more about the IRM & DP challenges of ‘AI’ and upskilling as a DPO, Records or Information Governance Manager, then check out Scott’s workshop, Artificial Intelligence and Machine Learning: How to Implement Good Information Governance. Book your place for 17th November now.

ICO Reprimand for Misuse of Children’s Data: A Proportionate Response or a Let Off?

Last week, the Department for Education received a formal reprimand from the Information Commissioner’s Office (ICO) over a “serious breach” of the GDPR involving the unauthorised sharing of up to 28 million children’s personal data. But the Department has avoided a fine, despite a finding of “woeful” data protection practices.

The reprimand followed the ICO’s investigation into the sharing of personal data stored on the Learning Records Service (LRS) database, for which the DfE is the Data Controller. LRS provides a record of pupils’ qualifications that education providers can access. It contains both personal and Special Category Data and at the time of the incident there were 28 million records stored on it. Some of those records would have pertained to children aged 14 and over. 

The ICO started its investigation after receiving a breach report from the DfE about the unauthorised access to the LRS database. The DfE had only become aware of the breach after an exposé in a national Sunday newspaper.

The ICO found that the DfE’s poor due diligence meant that it continued to grant access to the database to Trustopia, after the company told the DfE that it was the new trading name for Edududes Ltd, which had been a training provider. Trustopia was in fact a screening company and used the database to provide age verification services to help gambling companies confirm customers were over 18. The ICO ruled that the DfE failed to:

  • protect against the unauthorised processing by third parties of data held on the LRS database for reasons other than the provision of educational services. Data Subjects were unaware of the processing and could not object or otherwise withdraw from this processing. Therefore the DfE failed to process the data fairly and lawfully in accordance with Article 5 (1)(a). 
  • have appropriate oversight to protect against unauthorised processing of personal data held on the LRS database and had also failed to ensure its confidentiality in accordance with Article 5 (1)(f). 

The ICO conducted a simultaneous investigation into Trustopia, during which the company confirmed it no longer had access to the database and the cache of data held in temporary files had been deleted. Trustopia was dissolved before the ICO investigation concluded and therefore regulatory action was not possible.

The DfE has been ordered to implement the following measures to improve its compliance:

  1. Improve transparency around the processing of the LRS database so Data Subjects are aware and are able to exercise their Data Subject rights, in order to satisfy the requirements of Article 5 (1)(a) of the UK GDPR. 
  2. Review all internal security procedures on a regular basis to identify any additional preventative measures that can be implemented. This would reduce the risk of a recurrence of this type of incident and assist compliance with Article 5 (1)(f) of the UK GDPR. 
  3. Ensure all relevant staff are made aware of any changes to processes as a result of this incident, through effective communication and by providing clear guidance. 
  4. Complete a thorough and detailed Data Protection Impact Assessment, which adequately assesses the risk posed by the processing. This will enable the DfE to identify and mitigate the data protection risks to individuals. 

This investigation could, and many would say should, have resulted in a fine. However, in June 2022 John Edwards, the Information Commissioner, announced a new approach towards the public sector with the aim of reducing the impact of fines on the sector. Had this new trial approach not been in place, the DfE would have been issued with a fine of over £10 million. In a statement, John Edwards said:

“No-one needs persuading that a database of pupils’ learning records being used to help gambling companies is unacceptable. Our investigation found that the processes put in place by the Department for Education were woeful. Data was being misused, and the Department was unaware there was even a problem until a national newspaper informed them.

“We all have an absolute right to expect that our central government departments treat the data they hold on us with the utmost respect and security. Even more so when it comes to the information of 28 million children.

“This was a serious breach of the law, and one that would have warranted a £10 million fine in this specific case. I have taken the decision not to issue that fine, as any money paid in fines is returned to government, and so the impact would have been minimal. But that should not detract from how serious the errors we have highlighted were, nor how urgently they needed addressing by the Department for Education.”

The ICO also followed its new public sector enforcement approach when issuing a reprimand to NHS Blood and Transplant Service. In August 2019, the service inadvertently released untested development code into a live system for matching transplant list patients with donated organs. This error led to five adult patients on the non-urgent transplant list not being offered transplant livers at the earliest possible opportunity. The ICO said that, if the revised enforcement approach had not been in place, the service would have received a fine of £749,856. 

Some would say that the DfE has got off very lightly here and, given its past record, perhaps more stringent sanctions should have been imposed. Two years ago, the ICO criticised the DfE for secretly sharing children’s personal data with the Home Office, triggering fears it could be used for immigration enforcement as part of the government’s hostile environment policy. 

Many will question why the public sector merits this special treatment. It is not as if it has been the subject of a disproportionate number of fines. The first fine to a public authority was only issued in December 2021 (more than three and a half years after GDPR came into force) when the Cabinet Office was fined £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients online. This was recently reduced to £50,000 following a negotiated settlement of a pending appeal.

Compare the DfE reprimand with last month’s Monetary Penalty Notice in the sum of £1,350,000 issued to a private company, Easylife Ltd. The catalogue retailer was found to have been using 145,400 customers’ personal data to predict their medical conditions and then, without their consent, targeting them with health-related products. With austerity coming back with a vengeance, no doubt the private sector will question the favourable terms for the public sector. 

Perhaps the Government will come to the private sector’s rescue. Following the new DCMS Secretary of State’s speech last month, announcing a plan to replace the UK GDPR with a new “British data protection system” which cuts the “burdens” for British businesses, DCMS officials have said further delays to the Data Protection and Digital Information Bill are on the way. A new public consultation will be launched soon.

So far the EU is not impressed. A key European Union lawmaker has described meetings with the U.K. government over the country’s data protection reform plans as “appalling.” Italian MEP Fulvio Martusciello from the center-right European People’s Party said his impression from the visit was that Britain is “giving in on privacy in exchange for business gain.”

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 21st November. 

Back To The Future For UK GDPR?

On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”. Just as we are all getting to grips with the (relatively new) UK GDPR, do we want more change and uncertainty? How did we get here? Let’s recap.

In July the Government, led by Boris Johnson (remember him?), published the Data Protection and Digital Information Bill. This was supposed to be the next step in its much publicised plans to reform the UK Data Protection regime following Brexit (remember that?). 

In the Government’s response to the September 2021 data protection reform consultation (“Data: A New Direction”) it said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” To achieve this, the new Bill proposed amendments to existing UK data protection legislation, in particular the UK GDPR. On further analysis, the Bill was more a “tinkering” with GDPR than a wholesale change, although the government projected it would yield savings for businesses of £1 billion over ten years. (Key provisions of the Bill are summarised in our blog post here.)

Following the Bill’s publication, we had a new Prime Minister. Nadine Dorries at the DCMS was replaced by Michelle Donelan. The Bill’s passage through Parliament was suspended with a promise to re-introduce it. Now it seems that we could have a new piece of legislation altogether. 

The headline of Donelan’s speech was that the Truss Government would replace GDPR with “our own business- and consumer-friendly British data protection system”. She said it would be “co-design[ed] with business …” But the devil is in the detail, or lack thereof.

Donelan’s speech also contained the usual obligatory myths about GDPR and tired data protection law clichés (the old ones are the best!). She regurgitated complaints highlighted by Oliver Dowden when he was at the DCMS:

“We’ve even had churches write to the department, pleading for us to do something, so that they can send newsletters out to their communities without worrying about breaching data rules.”

And plumbers and electricians are also finding GDPR a problem:

“No longer will our businesses be shackled by unnecessary red tape. At the moment, even though we have shortages of electricians and plumbers, GDPR ties them in knots with clunky bureaucracy.” 

So that’s why my electrician doesn’t turn up. He is busy drafting a GDPR compliant privacy notice!

Donelan even claimed that “researchers at Oxford University estimated that it has directly caused businesses to lose over 8% of their profits.” This is selective quoting at best. (Andy Crow has done a great post on this.)

Will this “growth” focused reform of UK data protection rules risk the UK’s adequacy status with the EU? It depends on the final text of the law “co-designed” by business. ;)

7th Nov 2022 Update: Today we hear that the EU is not impressed. A key European Union lawmaker has described meetings with the U.K. government over the country’s data protection reform plans as “appalling.” Italian MEP Fulvio Martusciello from the center-right European People’s Party said his impression from the visit was that Britain is “giving in on privacy in exchange for business gain.”

Data protection practitioners should not burn their UK GDPR Handbook just yet! If you have been following the events of the last few days, you might suspect that we may have a new Prime Minister soon and/or a General Election. This will mean that the pause button on data protection reform could be pressed again. To repeat the phrase of the past year, “We live in uncertain times!”

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 21st November. 

The Data Protection and Digital Information Bill: A new UK GDPR?

In July the Government published the Data Protection and Digital Information Bill, the next step in its much publicised plans to reform the UK Data Protection regime following Brexit. 

In the Government’s response to the September 2021 consultation (“Data: A New Direction”) it said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” To achieve this, the new Bill proposes substantial amendments to existing UK data protection legislation; namely the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). There is no shiny new Data Protection Act 2022 or even a new colour for the UK GDPR! Perhaps a missed opportunity to showcase the benefits of Brexit! 

In addition to reforming core data protection law, the Bill deals with certification of digital identity providers, electronic registers of births and deaths and information standards for data-sharing in the health and adult social care system. The notable DP provisions are set out below.

Amended Definition of Personal Data

Clause 1 of the Bill limits the scope of personal data to:

  • where the information is identifiable by the controller or processor by reasonable means at the time of the processing; or
  • where the controller or processor ought to know that another person will likely obtain the information as a result of the processing and the individual will likely be identifiable by that person by reasonable means at the time of the processing.

This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. It could make it easier for organisations to achieve data anonymisation as they would no longer need to concern themselves with potential future identifiability, with the focus instead being on identifiability “at the time of the processing”. On the other hand, the change does not address the risk of indirect identification.
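For illustration only, here is how that proposed two-limb test might be expressed as decision logic. The function and its boolean inputs are our own simplification of clause 1, not anything that appears in the Bill itself:

```python
# An illustrative sketch of the Bill's proposed two-limb identifiability test
# (clause 1), as we read it. The function and its inputs are our own
# simplification, not the statutory text.

def is_personal_data(identifiable_by_controller: bool,
                     another_person_likely_obtains: bool,
                     identifiable_by_that_person: bool) -> bool:
    # Limb 1: the individual is identifiable by the controller or processor
    # by reasonable means at the time of the processing.
    if identifiable_by_controller:
        return True
    # Limb 2: the controller/processor ought to know that another person will
    # likely obtain the information, and that the individual will likely be
    # identifiable by that person by reasonable means at the time of processing.
    return another_person_likely_obtains and identifiable_by_that_person

# Under this reading, theoretical identifiability by "anyone in the world"
# no longer makes data personal by itself.
print(is_personal_data(False, False, True))  # False
```

The structural point is the shift from an open-ended "anyone" test to an OR of two specific, time-bound limbs.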

Vexatious Data Subject Requests

Article 12 of the UK GDPR allows controllers to refuse to comply with data subject rights requests (or charge a fee) when the requests are “manifestly unfounded” or “excessive”.  Clause 7 of the Bill proposes to replace this with “vexatious” or “excessive”. Examples of vexatious requests given in the Bill are those requests intended to cause distress, not made in good faith, or that are an abuse of process. All these could easily fit into “manifestly unfounded” and so it is difficult to understand the need for change here. 

Data Subject Complaints

Currently, the UK GDPR allows a data subject to complain to the Information Commissioner, but nothing expressly deals with whether or how they can complain to a controller. Clause 39 of the Bill would make provision for this and require the controller to acknowledge receipt of such a complaint within 30 days and respond substantively “without undue delay”. However, under clause 40, if a data subject has not made a complaint to the controller, the ICO is entitled not to accept the complaint.

Much was made about “privacy management programmes” in the Government’s June announcement. These are not expressly mentioned in the Bill but most of the proposals that were to have fallen under that banner are still there (see below).

Senior Responsible Individuals

As announced in June, the obligation for some controllers and processors to appoint a Data Protection Officer (DPO) is proposed to be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals, are required (by clause 14) to designate a senior manager as a “Senior Responsible Individual”. Just like the DPO, the SRI must be adequately resourced and cannot be dismissed for performing their tasks under the role. The requirement for them to be a senior manager (rather than just reporting to senior management, as current DPOs must) will cause problems for those organisations currently using outsourced DPO services.

ROPAs and DPIAs

The requirement for Records of Processing Activities (ROPAs) will also go. Clause 15 of the Bill proposes to replace it with a leaner “Record of Processing of Personal Data”.  Clause 17 will replace Data Protection Impact Assessments (DPIAs) with leaner and less prescriptive Assessments of High Risk Processing. Clause 18 ensures that controllers are no longer required, under Article 36 of the UK GDPR, to consult the ICO on certain high risk DPIAs.

Automated Decision Making

Article 22 of UK GDPR currently confers a “right” on data subjects not to be subject to automated decision making which produces legal effects or otherwise significantly affects them. Clause 11 of the Bill reframes Article 22 in terms of a positive right to human intervention. However, it would only apply to “significant” decisions, rather than decisions that produce legal effects or similarly significant effects. It is unclear whether this will make any practical difference. 

International Transfers 

The judgment of the European Court of Justice (ECJ) in “Schrems II” not only stated that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal transfer tool. It also said that in any international data transfer situation, whether to the USA or other countries, the data exporter needs to make a complex assessment about the recipient country’s data protection legislation to ensure that it adequately protects the data, especially from access by foreign security agencies (a Transfer Impact Assessment or TIA).

The Bill amends Chapter 5 of the UK GDPR (international transfers) with the introduction of the “data protection test” for the above mentioned assessment. This would involve determining if the standard of protection provided for data subjects in the recipient country is “not materially lower” than the standard of protection in the UK. The new test would apply both to the Secretary of State, when making “adequacy” determinations, and to controllers, when deciding whether to transfer data. The explanatory notes to the Bill state that the test would not require a “point-by-point comparison” between the other country’s regime and the UK’s. Instead, an assessment will be “based on outcomes i.e. the overall standard of protection for a data subject”. 

An outcomes-based approach will be welcomed by organisations who regularly transfer personal data internationally, especially where it is of no practical interest to foreign security agencies. However, this proposed approach will attract the attention of the EU (see below). (See also our forthcoming International Transfers webinar.)

The Information Commission

Under clause 100 of the Bill, the Information Commissioner’s Office will transform into the Information Commission: a corporate body with a chief executive (presumably John Edwards, the current Commissioner). 

The Commission would have a principal function of overseeing data protection, alongside additional duties such as having regard to the desirability of promoting innovation; the desirability of promoting competition; the importance of the prevention, investigation, detection and prosecution of criminal offences; and the need to safeguard public security and national security. New powers for the Commission include a power (clause 35) to require a controller to appoint a person to prepare and provide an audit/assessment report, and a power (clause 36) to compel individuals to attend interviews in civil and criminal investigations.

The Bill also proposes to abolish the Surveillance Camera Commissioner and the Biometrics Commissioner.

Privacy and Electronic Communications (EC Directive) Regulations 2003 

Currently, under PECR, cookies (and similar technologies) can only be used to store or access information on end user terminal equipment without express consent where it is “strictly necessary” e.g. website security or proper functioning of the site. The Bill proposes allowing cookies to be used without consent for the purposes of web analytics and to install automatic software updates (see the GDPR enforcement cases involving Google Analytics). 
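As a rough sketch of what this change would mean in practice for a website, consider the gating logic below. The cookie names and categories are illustrative assumptions, not anything in the Bill or in PECR:

```python
# A minimal sketch (not legal advice) of how a site might gate cookies under
# PECR today versus under the Bill's proposed web analytics exemption.
# Cookie names and categories here are illustrative assumptions.

STRICTLY_NECESSARY = {"session_id", "csrf_token"}   # always allowed under PECR
ANALYTICS = {"_ga", "_gid"}                         # consent needed under current PECR

def may_set_cookie(name: str, has_consent: bool, bill_in_force: bool = False) -> bool:
    """Return True if the cookie can be set without breaching PECR."""
    if name in STRICTLY_NECESSARY:
        return True          # the "strictly necessary" exemption applies
    if name in ANALYTICS and bill_in_force:
        return True          # the Bill proposes exempting web analytics from consent
    return has_consent       # everything else still needs express consent

print(may_set_cookie("_ga", has_consent=False))                      # False today
print(may_set_cookie("_ga", has_consent=False, bill_in_force=True))  # True under the Bill
```

The only behavioural change the Bill proposes here is the middle branch: analytics moves from the consent bucket into the exempt bucket.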

Another notable proposed change to PECR involves extending “the soft opt-in” to electronic communications from organisations other than businesses. This would permit political parties, charities and other non-profits to send unsolicited email and SMS direct marketing to individuals without consent, where they have an existing supporter relationship with the recipient. 

Finally on PECR, the Bill proposes to increase the fines for infringement from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher). 
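For illustration, here is the “whichever is higher” calculation with made-up turnover figures:

```python
# A quick worked example (illustrative figures only) of the proposed PECR
# maximum, which would match UK GDPR levels: the higher of £17.5m or 4% of
# global annual turnover.

def max_pecr_fine(global_annual_turnover_gbp: float) -> float:
    """Return the proposed statutory maximum fine for a PECR infringement."""
    return max(17_500_000, 0.04 * global_annual_turnover_gbp)

print(f"£{max_pecr_fine(100_000_000):,.0f}")    # turnover £100m -> £17,500,000 (fixed cap bites)
print(f"£{max_pecr_fine(1_000_000_000):,.0f}")  # turnover £1bn  -> £40,000,000 (4% bites)
```

In other words, the 4% limb only starts to matter for organisations with global turnover above £437.5m; below that, the £17.5m cap governs.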

Business Data

The Bill would give the Secretary of State and the Treasury the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. “Customers” would not just be data subjects, but anyone who purchased (or received for free) goods, services or digital content from a trader in a consumer (rather than business) context. “Business data” would include information about goods, services and digital content supplied or provided by a trader. It would also include information about where those goods etc. are supplied, the terms on which they are supplied or provided, prices or performance and information relating to feedback from customers. Customers would potentially have a right to access their data, which might include information on the customer’s usage patterns and the price paid to aid personalised price comparisons. Similarly, businesses could potentially be required to publish, or otherwise make available, business data.

These provisions go much further than the existing data portability provisions in the UK GDPR, which do not guarantee provision of data in “real time”, do not cover wider contextual data, and do not apply where the customer is not an individual.

Adequacy?

The Bill is currently making its way through Parliament. The impact assessment reiterates that “the government’s view is that reform of UK legislation on personal data is compatible with the EU maintaining free flow of personal data from Europe.”  However, with the multiple amendments proposed in the Bill, the UK GDPR is starting to look quite different to the EU version. And the more the two regimes diverge, the more there is a risk that the EU might put a “spanner in the works” when the UK adequacy assessment is reviewed in 2024. Much depends on the balance struck in the final text of the Bill. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September. 

New DP and IG Practitioner Apprenticeship

Act Now Training has teamed up with Damar Training, providing the materials and expertise underpinning its new Data Protection and Information Governance Practitioner Level 4 Apprenticeship.

The apprenticeship, which received final approval in March, will help develop the skills of those working in the increasingly important fields of data protection and information governance. 

With the rapid advancement of technology, there is a huge amount of personal data being processed by organisations, and it is the subject of important decisions affecting every aspect of people’s lives. This poses significant legal and ethical challenges, as well as the risk of incurring considerable fines from regulators for non-compliance. 

This apprenticeship aims to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address these challenges.

Ibrahim Hasan, Director of Act Now, said:

“We are excited to be working with Damar Training to help deliver this much needed apprenticeship. We are committed to developing the IG sector and encouraging a diverse range of entrants to the IG profession. We have looked at every aspect of the IG Apprenticeship standard to ensure the training materials equip budding IG officers with the knowledge and skills they need to implement the full range of IG legislation in a practical way.”

Damar’s managing director, Jonathan Bourne, added:

“We want apprenticeships to create real, long-term value for apprentices and organisations. It is vital therefore that we work with partners who really understand not only the technical detail but also the needs of employers.

“Act Now Training are acknowledged as leaders in the field, having recently won the Information and Records Management Society (IRMS) Supplier of the Year award for the second consecutive year. I am delighted therefore that we are able to bring together their 20 years of deep sector expertise with Damar’s 40+ year record of delivering apprenticeships in business and professional services.”

This apprenticeship has already sparked significant interest, particularly among large public and private sector organisations and professional services firms. Damar has also assembled an employer reference group that is feeding into the design process in real time to ensure that the programme works for employers.

The employer reference group met for the first time on 25th May. It included industry professionals from a variety of sectors including private and public health care, financial services, local and national government, education, IT and data consultancy, some of whom were part of the apprenticeship trailblazer group.

If your organisation is interested in the apprenticeship please get in touch with us to discuss further.

Act Now Training Wins IRMS Supplier of the Year Award 2022-23

Act Now Training is proud to announce that it has won the Information and Records Management Society (IRMS) Supplier of the Year Award for 2022-23. This is the second consecutive year we have won this award.

The awards ceremony took place on Monday night at the IRMS Conference in Glasgow. Act Now was also nominated for two other awards, including Innovation of the Year for our Advanced Certificate in GDPR Practice.

Ibrahim Hasan said:

“I would like to thank the IRMS for a great event and the members for voting for us. It feels really special to be recognised by fellow IG practitioners. We are proud to deliver great courses that meet the needs of IRMS members. This award also recognises the hard work of our colleagues who are focussed on fantastic customer service as well as our experienced associates who deliver great practical content and go the extra mile for our delegates. Congratulations to all the other IRMS awards winners.”

It has been another fantastic year for Act Now. We have launched some great new courses and products. We have exciting new courses planned for 2023. Watch this space!

BTW – Act Now also won the best elevator pitch prize at the conference vendor showcase. Click here to watch Ibrahim’s pitch.

The New Saudi Arabian Federal Data Protection Law 

The Middle East is fast catching up with Europe when it comes to data protection law. The Kingdom of Saudi Arabia (KSA) has enacted its first comprehensive national data protection law to regulate the processing of personal data. This is an important development alongside the passing of the new UAE Federal DP law. It also opens up opportunities for UK and EU Data Protection professionals, especially as these new laws are closely aligned with the EU General Data Protection Regulation (GDPR) and the UK GDPR.

The KSA Personal Data Protection Law (PDPL) was passed by Royal Decree M/19 of 9/2/1443H on 16th September 2021, approving Resolution No. 98 dated 7/2/1443H (14th September 2021). The detailed Executive Regulations are expected to be published soon and will give more details about the new law. The PDPL will be effective from 23rd March 2022, following which there will be a one-year implementation period.

Enforcement 

PDPL will initially be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA). The Executive Regulations will set out the administrative penalties that can be imposed on organisations for breaches. Expect large fines for non-compliance alongside other sanctions. PDPL could mirror the GDPR, which allows the regulator to impose a fine of up to 20 million Euros or 4% of gross annual turnover, whichever is higher. PDPL also contains criminal offences which carry a term of imprisonment of up to 2 years and/or a fine of up to 3 million Saudi Riyals (approximately £566,000). Affected parties may also be able to claim compensation.

Territorial Scope

PDPL applies to all organisations that are processing personal data in the KSA, irrespective of whether the data relates to Data Subjects living in the KSA. It also has an “extra-territorial” reach, applying to organisations based abroad that process the personal data of Data Subjects resident in the KSA. Interestingly, unlike the UAE Federal DP law, PDPL does not exempt government authorities from its application, although there are various exemptions from certain obligations where the data processing relates to national security, crime detection, statutory purposes etc.

Notable Provisions

PDPL mirrors GDPR’s underlying principles of transparency and accountability and empowers Data Subjects by giving them rights in relation to their personal data. We set out below the notable provisions including links to previous GDPR blog posts for readers wanting more detail, although more information about the finer points of the new law will be included in the forthcoming Executive Regulations. 

  • Personal Data – PDPL applies to the processing of personal data, which is defined very broadly to include any data which identifies a living individual. However, unlike GDPR, Article 2 of PDPL includes within its scope the data of a deceased person if it identifies them or a family member.
  • Registration – Article 23 requires Data Controllers (organisations that collect personal data and determine the purpose for which it is used and the method of processing) to register on an electronic portal that will form a national record of controllers. 
  • Lawful Bases – Like the UAE Federal DP law, PDPL makes consent the primary legal basis for processing personal data. There are exceptions including, amongst others, if the processing achieves a “definite interest” of the Data Subject and it is impossible or difficult to contact the Data Subject.
  • Rights – Data Subjects are granted various rights in Articles 4, 5 and 7 of the PDPL which will be familiar to GDPR practitioners. These include the right to information (similar to Art 13 of GDPR), rectification, erasure and Subject Access. All these rights are subject to exemptions similar to those found in Article 23 of GDPR.
  • Impact Assessments – Article 22 requires (what GDPR practitioners call) “DPIAs” to be undertaken in relation to any new high risk data processing operations. This will involve assessing the risks the processing poses to the rights of Data Subjects, especially their privacy and confidentiality.
  • Breach Notification – Article 20 requires organisations to notify the regulator, as well as Data Subjects, if they suffer a personal data breach which compromises Data Subjects’ confidentiality, security or privacy. The timeframe for notifying will be set by the Executive Regulations.
  • Records Management – Organisations will have to demonstrate compliance with PDPL by keeping records. There is a specific requirement in Article 3 to keep records similar to a Record of Processing Activities (ROPA) under GDPR.
  • International Transfers – Like other data protection regimes, PDPL imposes limitations on the international transfer of personal data outside of the KSA. There are exceptions; further details will be set out in the Executive Regulations.
  • Data Protection Officers – Organisations (both controllers and processors) will need to appoint at least one officer to be responsible for compliance with PDPL. The DPO can be an employee or an independent service provider and does not need to be located in the KSA. 
  • Training – After 23rd March 2022, Data Controllers will be required to hold seminars for their employees to familiarise them with the new law.

Practical Steps

Organisations operating in the KSA, as well as those who are processing the personal data of KSA residents, need to assess the impact of PDPL on their data processing activities. Work needs to start now to implement systems and processes to ensure compliance. Failure to do so will not just lead to enforcement action but also reputational damage. The following should be part of an action plan for compliance:

  1. Training the organisation’s management team to understand the importance of PDPL, the main provisions and the changes required to systems and processes. 
  2. Training staff at all levels to understand PDPL and how it will impact their roles.
  3. Carrying out a data audit to understand what personal data is held, where it sits and how it is processed.
  4. Reviewing how records management and information risk are addressed within the organisation.
  5. Drafting Privacy Notices to ensure they set out the minimum information that should be included.
  6. Reviewing information security policies and procedures in the light of the new, more stringent security obligations, particularly breach notification.
  7. Drafting policies and procedures to deal with Data Subjects’ rights, particularly requests for subject access, rectification and erasure.
  8. Appointing and training a Data Protection Officer.

Act Now Training can help your organisation prepare for PDPL. We are running a webinar on this topic soon and can also deliver more detailed in-house training. Please get in touch to discuss your training needs. We are in Dubai and Abu Dhabi from 16th to 21st January 2022 and would be happy to arrange a meeting.

Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics including automated fingerprint identification systems for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this way back in 2014. Now a company, called CRB Cunninghams, has introduced facial recognition technology to allow schools to offer children the ability to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said: 

“It’s normalising biometric identity check for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools for the first time. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case schools must provide a reasonable alternative means of accessing the service i.e. paying for school meals in the present case. 
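Reduced to decision logic, the POFA rule described above looks something like the sketch below. It is a simplified illustration with assumed field names, not legal advice:

```python
# A minimal sketch (illustrative only) of the POFA 2012 rule: biometric
# processing in schools needs written consent from at least one parent, AND
# the child must not object; otherwise a reasonable alternative (e.g. card
# payment) must be offered. Field names are our own assumptions.

from dataclasses import dataclass

@dataclass
class Pupil:
    parental_written_consent: bool  # at least one parent has consented in writing
    child_objects: bool             # the child refuses or objects

def may_process_biometrics(pupil: Pupil) -> bool:
    """Both POFA conditions must hold before processing biometric data."""
    return pupil.parental_written_consent and not pupil.child_objects

def lunch_payment_method(pupil: Pupil) -> str:
    if may_process_biometrics(pupil):
        return "facial recognition"
    return "card payment (reasonable alternative)"

print(lunch_payment_method(Pupil(parental_written_consent=True, child_objects=False)))
print(lunch_payment_method(Pupil(parental_written_consent=True, child_objects=True)))
```

The key feature is that the child’s objection is an absolute veto: parental consent alone is never enough.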

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data and there is a legal prohibition on anyone processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Express consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing. 

In 2019 the Swedish Data Protection Authority fined an education authority SEK 200,000 (approximately 20,000 Euros) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that there was a breach of Article 5, by processing students’ personal data in a manner that is more intrusive as regards personal integrity, and encompasses more personal data, than is necessary for the specified purpose (monitoring of attendance); of Article 9; and of Articles 35 and 36, by failing to fulfil the requirements for an impact assessment and failing to carry out prior consultation with the Swedish DPA. 

The French regulator (CNIL) has also raised concerns about a facial recognition trial commissioned by the Provence-Alpes-Côte d’Azur Regional Council, and which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” CNIL also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Ring Doorbells, Domestic CCTV and GDPR

The Daily Mail reports today that, “A female doctor is set to be paid more than £100,000 after a judge ruled that her neighbour’s Ring smart doorbell cameras breached her privacy in a landmark legal battle which could pave the way for thousands of lawsuits over the Amazon-owned device.”

Dr Mary Fairhurst, the Claimant, alleged that she was forced to move out of her home because the internet-connected cameras are so “intrusive”. She also said that the Defendant, Mr Woodard, had harassed her by becoming “aggressive” when she complained to him.

A judge at Oxford County Court ruled yesterday that Jon Woodard’s use of his Ring cameras amounted to harassment, nuisance and a breach of data protection laws. The Daily Mail goes on to say:

“Yesterday’s ruling is thought to be the first of its kind in the UK and could set precedent for more than 100,000 owners of the Ring doorbell nationally.”

Before Ring doorbell owners rush out to dismantle their devices, let’s pause and reflect on this story. This was not about one person using a camera to watch their house or protect their motorbike. The Defendant had set up a network of cameras around his property which could also be used to watch his neighbour’s comings and goings. 

Careful reading of the judgement leads one to conclude that the legal action brought by the Claimant was really about the use of domestic cameras in such a way as to make a neighbour feel harassed and distressed. She was primarily arguing for protection and relief under the Protection from Harassment Act 1997 and the civil tort of nuisance. Despite the Daily Mail’s sensational headline, the judgement does not put domestic CCTV camera or Ring doorbell owners at risk of paying out thousands of pounds in compensation (as long as they don’t use the cameras to harass their neighbours!). However, it does require owners to think about the legal implications of their systems. Let’s examine the data protection angle.

Firstly, the UK GDPR can apply to domestic CCTV and door camera systems. After all, the owners of such systems are processing personal data (images and even voice recordings) about visitors to their property as well as passers-by and others caught in the systems’ peripheral vision. However, on the face of it, a domestic system should be covered by Article 2(2)(a) of the UK GDPR which says the law does not apply to “processing of personal data by an individual in the course of purely personal or household activity.” Recital 18 explains further:

“This Regulation does not apply to the processing of personal data by a natural person in the course of a purely personal or household activity and thus with no connection to a professional or commercial activity. Personal or household activities could include correspondence and the holding of addresses, or social networking and online activity undertaken within the context of such activities.”

The judge in this case concluded that the camera system, set up by the Defendant, had collected data outside the boundaries of his property and, in the case of one specific camera, “it had a very wide field of view and captured the Claimant’s personal data as she drove in and out of the car park.” This would take the system outside of the personal and household exemption quoted above, as confirmed by the Information Commissioner’s CCTV guidance:

“If you set up your system so it captures only images within the boundary of your private domestic property (including your garden), then the data protection laws will not apply to you.

But what if your system captures images of people outside the boundary of your private domestic property – for example, in neighbours’ homes or gardens, shared spaces, or on a public footpath or a street?

Then the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA18) will apply to you, and you will need to ensure your use of CCTV complies with these laws.”

Once a residential camera system comes under the provisions of the UK GDPR then of course the owner has to comply with all the Data Protection Principles including the obligation to be transparent (through privacy notices) and to ensure that the data processing is adequate, relevant and not excessive. Data Subjects also have rights in relation to their data including to see a copy of it and ask for it to be deleted (subject to some exemptions).

Judge Clarke said the Defendant had “sought to actively mislead the Claimant about how and whether the cameras operated and what they captured.” This suggests a breach of the First Principle (lawfulness and transparency). There were also concerns about the amount of data some of the cameras captured (Fourth Principle).

Let’s now turn to the level of compensation which could be awarded to the Claimant. Article 82 of the UK GDPR does contain a free standing right for a Data Subject to sue for compensation where they have suffered material or non-material damage, including distress, as a result of a breach of the legislation. However, the figure mentioned by the Daily Mail headline of £100,000 seems far-fetched even for a breach of harassment and nuisance laws, let alone GDPR on its own. The court will have to consider evidence of the duration of the breach and the level of damage and distress caused to the Claimant. 

This judgement does not mean that Ring door camera owners should rush out to dismantle them before passing dog walkers make compensation claims. It does though require owners to think carefully about the siting of cameras, the adequacy of notices and the impact of their system on their neighbours’ privacy. 

The Daily Mail story follows yesterday’s BBC website feature about footballers attempting to use GDPR to control use of their performance data (see yesterday’s blog and Ibrahim Hasan’s BBC interview). Early Christmas gifts for data protection professionals to help them highlight the importance and topicality of what they do!

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.
