Act Now Launches Updated GDPR Practitioner Certificate  

Act Now Training is pleased to announce the launch of its updated GDPR Practitioner Certificate. This course has been running successfully for the past five years with excellent delegate reviews: 

“The course was very useful as an IG Officer. The trainer was knowledgeable and explained some complex aspects of the legislation using interesting examples and real life scenarios. The course materials and handbook are invaluable and I know I will reuse them in conjunction with my usual resources.” NC, Lincolnshire County Council  

“I would highly recommend this online course which was well structured and interactive. The course tutor was engaging and made a complex subject accessible. There was a good balance between understanding the legal framework and practical application. I learnt a great deal which will help me in my DPO role.” RS, London Councils  

Key features of the new course include an updated course curriculum, new exercises and more emphasis on helping delegates develop key DPO skills.  

Our Motivation  

This revised course is part of our ongoing commitment to encourage and assist new talent in the IG profession. Through our involvement in NADPO and the IRMS over the past 20 years, Act Now has been actively encouraging new entrants to the IG profession and providing quality training to assist in their learning and development. When the DP and IG Apprenticeship was launched last year, we became one of the first training companies to partner with a leading apprenticeship provider to deliver specialist IG training and materials to apprentices. This partnership has led to our partner, Damar, recruiting over 100 apprentices and helping them lay the foundations for a successful career in IG.  

Course Content 

The course curriculum has been updated in the light of Act Now’s Skills and Competency Framework for DPOs. For the past three years we have been working on this framework, alongside industry experts and education professionals, by thoroughly analysing all the core skills and competencies required for the DPO role and how they map against our wider GDPR course curriculum.  

Completing the course will enable delegates to gain a thorough understanding of the UK GDPR and develop the skills required to do their job with greater ease and confidence. In addition to the main course topics such as principles, rights and enforcement, we have introduced new topics such as the ICO Accountability Framework. We also take time to consider the latest ICO enforcement action and the changes to the UK data protection regime proposed by the recently announced Data Protection and Digital Information Bill.

The course will also help delegates interpret the data protection principles in a practical context, draft privacy notices, undertake DPIAs and report data breaches. 

The course teaching style is based on four practical and engaging workshops covering theory alongside hands-on application using case studies that equip delegates with knowledge and skills that can be used immediately. Delegates will also have personal tutor support throughout the course and access to a comprehensive revised online resource lab. 

The DPO Learning Pathway 

The updated UK GDPR Practitioner Certificate is part of our learning pathway for Data Protection Officers. Once delegates have completed it, they can move on to the Intermediate Certificate in GDPR Practice, where the emphasis is on skills as well as advanced knowledge, covering more challenging topics to gain a deeper awareness of the fundamental data protection principles.  

Our premier certification is the Advanced Certificate in GDPR Practice, tailored for seasoned Data Protection Officers seeking to refine and expand their expertise. The course comprises a rigorous set of masterclasses that engage delegates in dissecting and interpreting intricate GDPR scenarios through compelling case studies. This immersive experience empowers participants with the skills and confidence needed to tackle even the most challenging Data Protection and Privacy scenarios they may encounter.

If you would like to discuss your suitability for any of our certificate courses, please get in touch.  

Spring Offer: Get 10% off all day courses and special discounts on GDPR certificates. Click the link to find out more and take advantage of this limited-time offer!

The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security issues. A number of governments have now taken a view that the video sharing platform represents an unacceptable risk that enables Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures. 

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR states the general rule that when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child, and it is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”
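The Article 8(1) age threshold amounts to a simple decision rule, sketched below for illustration only. The function name and return values are our own, not anything from the legislation:

```python
def consent_route(age: int) -> str:
    """Illustrative decision rule for Article 8(1) UK GDPR: who can give
    consent when an information society service is offered directly to a
    child and consent is the lawful basis for processing."""
    if age < 0:
        raise ValueError("age must be non-negative")
    if age >= 13:
        # A child aged 13 or over may provide their own consent.
        return "child's own consent"
    # Under 13: consent must be given or authorised by the holder of
    # parental responsibility, and Article 8(2) requires the controller
    # to make reasonable efforts to verify this.
    return "parental consent required"
```

The enforcement question in the TikTok case was not the rule itself but the verification duty: the ICO's findings turned on whether reasonable efforts were made to identify under-13s at all.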

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from its platform. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8, the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used, and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is noticeable that this fine is less than half the amount (£27 million) set out in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently, this potential infringement was not included in the final amount of the fine.

We have been here before! In 2019 British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine in July 2020 was £20 million. Marriott International Inc was fined £18.4 million in 2020, much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced (from the notice to the final amount) suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to think about whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently an appeal against the ICO’s fine of £1.35 million issued to Easylife Ltd was withdrawn, after the parties reached an agreement whereby the amount of the fine was reduced to £250,000.

The Children’s Code

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed by Ireland’s Data Protection Commission on Meta) it is likely that more ICO regulatory action will follow. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  


The New DP Reform Bill: What’s Changed?

On 8th March 2023, the UK Department for Science, Innovation and Technology (DSIT) published the Data Protection and Digital Information (No.2) Bill (“the new Bill”). If enacted, it will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).

According to the DSIT press release, the Bill will result in a “new common-sense-led UK version of the EU’s GDPR [and will] reduce costs and burdens for British businesses and charities, remove barriers to international trade and cut the number of repetitive data collection pop-ups online.” It also claims that the reforms are “expected to unlock £4.7 billion in savings for the UK economy over the next 10 years.” How this figure has been calculated is not explained but we have been here before! Remember the red bus?

How did we get here?

This is the second version of a bill designed to reform the UK data protection regime. In July 2022, the Government published the Data Protection and Digital Information Bill (“the previous Bill”). This was paused in September 2022 so ministers could engage in “a co-design process with business leaders and data experts” and move away from the “one-size-fits-all” approach of the European Union’s GDPR. On 3rd October 2022, during the Conservative Party Conference, Michelle Donelan, then the new Secretary of State for Digital, Culture, Media and Sport (DCMS), made a speech announcing a plan to replace the UK GDPR with a new “British data protection system”. Another full consultation round was expected but never materialised.

The previous Bill has now been withdrawn. We will provide analysis and updates on the new Bill, as it progresses through Parliament, over the coming months. An initial summary of the key proposals, both old and new, is set out below:

What remains the same from the original bill?

Many of the proposals in the new Bill are the same as contained in the previous Bill. For a detailed analysis please read our previous blog post. Here is a summary:

  • Amended Definition of Personal Data: This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. 

  • Vexatious Data Subject Requests: The terms “manifestly unfounded” or “excessive” requests, in Article 12 of the UK GDPR, will be replaced with “vexatious” or “excessive” requests. Explanation and examples of such requests will also be included.

  • Data Subject Complaints: Data Controllers will be required to acknowledge receipt of Data Subject complaints within 30 days and respond substantively “without undue delay”. The ICO will be entitled not to accept a complaint if a Data Subject has not made a complaint to the controller first.

  • Data Protection Officer: The obligation for some controllers and processors to appoint a Data Protection Officer (DPO) will be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals will be required to designate a senior manager as a “Senior Responsible Individual”. 

  • Data Protection Impact Assessments: These will be replaced by leaner and less prescriptive “Assessments of High Risk Processing”. 

  • International Transfers: There will be a new approach to the test for adequacy applied by the UK Government to countries (and international organisations) and when Data Controllers are carrying out a Transfer Impact Assessment or TIA. The threshold for this new “data protection test” will be whether a jurisdiction offers protection that is “not materially lower” than under the UK GDPR. (For more detail see also our forthcoming International Transfers webinar).
  • The Information Commission: The Information Commissioner’s Office will transform into the Information Commission, a corporate body with a chief executive.

  • Business Data: The Secretary of State and the Treasury will be given the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. 

  • PECR: Cookies will be allowed to be used without consent for the purposes of web analytics and to install automatic software updates. Furthermore, non-commercial organisations (e.g. charities and political parties) will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest. Finally, there will be an increase to the fines from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher). 
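The proposed new PECR maximum mirrors the UK GDPR formula: the cap is whichever is higher of the fixed amount and the turnover percentage. A minimal sketch of the calculation (the function name and example turnover figure are ours):

```python
def max_pecr_fine(global_annual_turnover: float) -> float:
    """Higher of £17.5m or 4% of global annual turnover, mirroring the
    UK GDPR maximum-fine formula the Bill would apply to PECR."""
    return max(17_500_000, 0.04 * global_annual_turnover)

# For a business turning over £1bn, 4% of turnover (£40m) exceeds the
# £17.5m floor, so the turnover-based figure applies:
print(max_pecr_fine(1_000_000_000))  # 40000000.0
```

In practice the fixed £17.5m figure only bites for organisations with global turnover below £437.5m; above that, the 4% limb sets the cap.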

What has changed?

The new Bill does not make any radical changes to the previous Bill; rather it clarifies some points and provides a bit more flexibility in other areas. The main changes are summarised below:

  • Scientific Research: The definition of scientific research is amended so that it now includes research for the purposes of commercial activity.
    This expands the circumstances in which processing for research purposes may be undertaken, providing a broader consent mechanism and exemption to the fair processing requirement.
  • Legitimate Interests: The previous Bill proposed that businesses could rely on legitimate interests (Article 6 lawful basis) without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These “recognised” legitimate interests cover purposes for processing such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement.  The new Bill, whilst keeping the above changes, introduces a non-exhaustive list of cases where organisations may rely on the “legitimate interests” legal basis, including for the purposes of direct marketing, transferring data within the organisation for administrative purposes and for the purposes of ensuring the security of network and information systems; although a balancing exercise still needs to be conducted in these cases. 

  • Automated Decision Making: The previous Bill clarified that its proposed restrictions on automated decision-making under Article 22 UK GDPR should only apply to decisions that are a result of automated processing without “meaningful human involvement”. The new Bill states that profiling will be a relevant factor in the assessment as to whether there has been meaningful human involvement in a decision. 
  • Records of Processing Activities (ROPA): The previous Bill streamlined the required content of ROPAs. The new Bill exempts all controllers and processors from the duty to maintain a ROPA unless they are carrying out high risk processing activities. 

The Impact

The EU conducts a review of adequacy with the UK every four years; the next adequacy decision is due on 27th June 2025. Some commentators have suggested that the changes may jeopardise the UK’s adequate status and so impact the free flow of data between the UK and EU. We disagree. Although the Government states that the new Bill is “a new system of data protection”, it still retains the UK GDPR’s structure and fundamental obligations. Some tinkering around the edges is not really going to have much of an impact (see the helpful redline version of the new Bill produced by the good people at Hogan Lovells). Organisations that are already compliant with the UK GDPR will not be required to make any major changes to their systems and processes. 

The new Bill has been introduced at the first reading stage. The second reading, due to be scheduled within the next few weeks, will be the first time the Government’s data protection reforms are debated in Parliament. We expect the Bill to be passed in a form similar to the one now published and to come into force later this year.

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. There are only 3 places left on our next Advanced Certificate in GDPR Practice.

GDPR and AI: The Rise of the Machines

2023 is going to be the year of AI. In January, Microsoft announced a multibillion dollar investment in OpenAI, the company behind image generation tool Dall-E and the chatbot ChatGPT.

The term “artificial intelligence” or “AI” refers to the use of machines and computer systems that can perform tasks normally requiring human intelligence. The past few years have seen rapid progress in this field. Advancements in deep learning algorithms, cloud computing, and data storage have made it possible for machines to process and analyse large amounts of data quickly and accurately. AI’s ability to interpret human language means that virtual assistants, such as Siri and Alexa, can now understand and respond to complex spoken commands at lightning speed.

The public sector is increasingly leveraging the power of AI to undertake administrative tasks and create more tailored services that meet users’ needs. Local government is using AI to simplify staff scheduling, predict demand for services and even estimate the risk of an individual committing fraud. Healthcare providers are now able to provide automated diagnoses based on medical imaging data from patients, thereby reducing wait times.

The Risks

With any major technological advance there are potential risks and downsides. On Monday, ElevenLabs, an AI speech software company, said it had found an “increasing number of voice cloning misuse cases”. According to reports, hackers used the ElevenLabs software to create deepfake voices of famous people (including Emma Watson and Joe Rogan) making racist, transphobic and violent comments.

There are concerns about the impact of AI on employment and the future of work. In April 2021, the Court of Amsterdam ordered that Uber reinstate taxi drivers in the UK and Portugal who had been dismissed by “robo-firing”: the use of an algorithm to make a decision about dismissal with no human involvement. The Court concluded that Uber had made the decisions “based solely on automated processing” within the meaning of Article 22(1) of the GDPR. It was ordered to reinstate the drivers’ accounts and pay them compensation.

As well as ethical questions about the use of AI in decision-making processes that affect people’s lives, AI-driven algorithms may lead to unintended biases or inaccurate decisions if not properly monitored and regulated. In 2021 the privacy pressure group, NOYB, filed a GDPR complaint against Amazon, claiming that Amazon’s algorithm discriminates against some customers by denying them the opportunity to pay for items by monthly invoice.

There is also a risk that AI is deployed without consideration of the privacy implications. In May 2022, the UK Information Commissioner’s Office fined Clearview AI Inc more than £7.5 million for breaches of GDPR. Clearview’s online database contains 20 billion images of people’s faces and data scraped from publicly available information on the internet and social media platforms all over the world. The company, which describes itself as the “World’s Largest Facial Network”, allows customers, including the police, to upload an image of a person to its app, which then uses AI to check it against all the images in the Clearview database. The app then provides a list of matching images with links to the websites they came from. 

Practical Steps

Recently, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants are subjected to any harms or financial detriment as a result of the use of algorithms. It did though emphasise a number of practical steps that local authorities and central government can take when using algorithms or AI:

1. Take a data protection by design and default approach

Data processed using algorithms, data analytics or similar systems should be reactively and proactively reviewed to ensure it is accurate and up to date. If a local authority decides to engage a third party to process personal data using algorithms, data analytics or AI, the authority is responsible for assessing that the third party is competent to process personal data in line with the UK GDPR.

2. Be transparent with people about how you are using their data

Local authorities should regularly review their privacy policies, to ensure they comply with Articles 13 and 14, and identify areas for improvement. They should also bring any new uses of individuals’ personal data to their attention.

3. Identify the potential risks to people’s privacy

Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. 

In April 2021, the European Commission presented its proposal for a Regulation to harmonise rules for AI, also known as the “AI Act of the European Union”. Whilst there is still a long way to go before this proposal becomes legislation, it could create an impetus for the UK to further regulate AI. 

Use of AI has enormous benefits. It does, though, have the potential to adversely impact people’s lives and deny their fundamental rights. As such, it is critical that data protection and information governance officers understand the implications of AI technology and how to use it in a fair and lawful manner. 

Want to know more about this rapidly developing area? Our forthcoming AI and Machine Learning workshop will explore the common challenges that this subject presents, focusing on GDPR as well as other information governance and records management issues. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? See our Advanced Certificate in GDPR Practice.

US Data Transfers and Privacy Shield 2.0 

On 14th December 2022, the European Commission published a draft ‘adequacy decision’, under Article 47 of the GDPR, endorsing a new legal framework for transferring personal data from the EU to the USA. Subject to approval by other EU institutions, the decision paves the way for “Privacy Shield 2.0” to be in effect by Spring 2023.

The Background

In July 2020, the European Court of Justice (ECJ) in “Schrems II”, ruled that organisations that transfer personal data to the USA can no longer rely on the Privacy Shield Framework as a legal transfer tool as it failed to protect the rights of EU data subjects when their data was accessed by U.S. public authorities. In particular, the ECJ found that US surveillance programs are not limited to what is strictly necessary and proportionate as required by EU law and hence do not meet the requirements of Article 52 of the EU Charter on Fundamental Rights. Secondly, with regard to U.S. surveillance, EU data subjects lack actionable judicial redress and, therefore, do not have a right to an effective remedy in the USA, as required by Article 47 of the EU Charter.

The ECJ stated that organisations transferring personal data to the USA can still use the Article 49 GDPR derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporter to make a complex assessment of the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs. 

Despite the Schrems II judgment, many organisations have continued to transfer personal data to the USA hoping that regulators will wait for a new Transatlantic data deal before enforcing the judgment. Whilst the UK Information Commissioner’s Office (ICO) seems to have adopted a “wait and see” approach, other regulators have now started to take action. In February 2022, the French Data Protection Regulator, CNIL, ruled that the use of Google Analytics was a breach of GDPR due to the data being transferred to the USA without appropriate safeguards. This followed a similar decision by the Austrian Data Protection Authority in January. 

The Road to Adequacy

Since the Schrems ruling, replacing the Privacy Shield has been a priority for EU and US officials. In March 2022, it was announced that a new Trans-Atlantic Data Privacy Framework had been agreed in principle. In October, the US President signed an executive order giving effect to the US commitments in the framework. These include commitments to limit US authorities’ access to data exported from the EU to what is necessary and proportionate under surveillance legislation, to provide data subjects with rights of redress relating to how their data is handled under the framework regardless of their nationality, and to establish a Data Protection Review Court for determining the outcome of complaints.

Schrems III?

The privacy campaign group, noyb, of which Max Schrems is Honorary Chairman, is not impressed by the draft adequacy decision. It said in a statement:

“…the changes in US law seem rather minimal. Certain amendments, such as the introduction of the proportionality principle or the establishment of a Court, sound promising – but on closer examination, it becomes obvious that the Executive Order oversells and underperforms when it comes to the protection of non-US persons. It seems obvious that any EU “adequacy decision” that is based on Executive Order 14086 will likely not satisfy the CJEU. This would mean that the third deal between the US Government and the European Commission may fail.”

Max Schrems said: 

“…As the draft decision is based on the known Executive Order, I can’t see how this would survive a challenge before the Court of Justice. It seems that the European Commission just issues similar decisions over and over again – in flagrant breach of our fundamental rights.”

The draft adequacy decision will now be reviewed by the European Data Protection Board (EDPB) and the European Member States. From the above statements it seems that if Privacy Shield 2.0 is finalised, a legal challenge against it is inevitable.

UK to US Data Transfers 

Personal data transfers are also a live issue for most UK Data Controllers including public authorities. Whether using an online meeting app, cloud storage solution or a simple text messaging service, all often involve a transfer of personal data to the US. At present, use of such services usually involves a complicated Transfer Risk Assessment (TRA) and execution of standard contractual clauses. A new UK international data transfer agreement (IDTA) came into force on 21st March 2022 but it still requires a TRA as well as supplementary measures where privacy risks are identified. 

Good news may be round the corner for UK data exporters. The UK Government is also in the process of making an adequacy decision for the US. We suspect it will strike a similar deal once the EU/US one is finalised.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Our next online GDPR Practitioner Certificate course, starting on 10th January, is fully booked. We have places on the course starting on 19th January. 

The Data Protection and Digital Information Bill: A new UK GDPR?

In July the Government published the Data Protection and Digital Information Bill, the next step in its much publicised plans to reform the UK Data Protection regime following Brexit. 

In the Government’s response to the September 2021 consultation (“Data: A New Direction”) it said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” To achieve this, the new Bill proposes substantial amendments to existing UK data protection legislation; namely the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). There is no shiny new Data Protection Act 2022 or even a new colour for the UK GDPR! Perhaps a missed opportunity to showcase the benefits of Brexit! 

In addition to reforming core data protection law, the Bill deals with certification of digital identity providers, electronic registers of births and deaths and information standards for data-sharing in the health and adult social care system. The notable DP provisions are set out below.

Amended Definition of Personal Data

Clause 1 of the Bill limits the scope of personal data to:

  • where the information is identifiable by the controller or processor by reasonable means at the time of the processing; or
  • where the controller or processor ought to know that another person will likely obtain the information as a result of the processing and the individual will likely be identifiable by that person by reasonable means at the time of the processing.

This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. It could make it easier for organisations to achieve data anonymisation as they would no longer need to concern themselves with potential future identifiability, with the focus instead being on identifiability “at the time of the processing”. On the other hand, the change does not address the risk of indirect identification.
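The two limbs of the proposed test can be sketched as a simple decision rule. This is a hypothetical illustration only (the function and parameter names are our own, and real identifiability assessments are fact-specific rather than boolean):

```python
def is_personal_data(
    identifiable_by_controller_or_processor: bool,
    another_person_likely_obtains_data: bool,
    individual_likely_identifiable_by_them: bool,
) -> bool:
    """Sketch of Clause 1's proposed two-limb test, each limb assessed
    'by reasonable means at the time of the processing'."""
    # First limb: the controller or processor itself can identify the individual.
    first_limb = identifiable_by_controller_or_processor
    # Second limb: another person will likely obtain the information AND
    # will likely be able to identify the individual from it.
    second_limb = (another_person_likely_obtains_data
                   and individual_likely_identifiable_by_them)
    return first_limb or second_limb

# Data that no one involved can reasonably identify falls outside the definition.
print(is_personal_data(False, False, False))  # False
# Data the controller itself can identify is caught by the first limb.
print(is_personal_data(True, False, False))   # True
```

Note how the second limb requires both conditions together: disclosure to a third party alone is not enough unless that recipient can also identify the individual.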

Vexatious Data Subject Requests

Article 12 of the UK GDPR allows controllers to refuse to comply with data subject rights requests (or charge a fee) when the requests are “manifestly unfounded” or “excessive”. Clause 7 of the Bill proposes to replace this with “vexatious” or “excessive”. Examples of vexatious requests given in the Bill are those intended to cause distress, not made in good faith, or that are an abuse of process. All of these could easily fit within “manifestly unfounded”, so it is difficult to see the need for change here.

Data Subject Complaints

Currently, the UK GDPR allows a data subject to complain to the Information Commissioner, but nothing expressly deals with whether or how they can complain to a controller. Clause 39 of the Bill would make provision for this and require the controller to acknowledge receipt of such a complaint within 30 days and respond substantively “without undue delay”. However, under clause 40, if a data subject has not made a complaint to the controller, the ICO is entitled not to accept the complaint.

Much was made about “privacy management programmes” in the Government’s June announcement. These are not expressly mentioned in the Bill but most of the proposals that were to have fallen under that banner are still there (see below).

Senior Responsible Individuals

As announced in June, the obligation for some controllers and processors to appoint a Data Protection Officer (DPO) is proposed to be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals will be required (by clause 14) to designate a senior manager as a “Senior Responsible Individual” (SRI). Just like the DPO, the SRI must be adequately resourced and cannot be dismissed for performing their tasks. The requirement for the SRI to be a senior manager (rather than merely reporting to senior management, as current DPOs must) will cause problems for organisations currently using outsourced DPO services.

ROPAs and DPIAs

The requirement for Records of Processing Activities (ROPAs) will also go. Clause 15 of the Bill proposes to replace it with a leaner “Record of Processing of Personal Data”.  Clause 17 will replace Data Protection Impact Assessments (DPIAs) with leaner and less prescriptive Assessments of High Risk Processing. Clause 18 ensures that controllers are no longer required, under Article 36 of the UK GDPR, to consult the ICO on certain high risk DPIAs.

Automated Decision Making

Article 22 of the UK GDPR currently confers a “right” on data subjects not to be subject to automated decision making which produces legal effects or otherwise significantly affects them. Clause 11 of the Bill reframes Article 22 in terms of a positive right to human intervention. However, it would only apply to “significant” decisions, rather than decisions that produce legal effects or similarly significant effects. It is unclear whether this will make any practical difference.

International Transfers 

The judgment of the European Court of Justice (ECJ) in “Schrems II” not only stated that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal transfer tool. It also said that in any international data transfer situation, whether to the USA or other countries, the data exporter needs to make a complex assessment of the recipient country’s data protection legislation to ensure that it adequately protects the data, especially from access by foreign security agencies (a Transfer Impact Assessment or TIA).

The Bill amends Chapter 5 of the UK GDPR (international transfers) with the introduction of the “data protection test” for the above-mentioned assessment. This would involve determining if the standard of protection provided for data subjects in the recipient country is “not materially lower” than the standard of protection in the UK. The new test would apply both to the Secretary of State, when making “adequacy” determinations, and to controllers, when deciding whether to transfer data. The explanatory notes to the Bill state that the test would not require a “point-by-point comparison” between the other country’s regime and the UK’s. Instead, an assessment will be “based on outcomes i.e. the overall standard of protection for a data subject”.

An outcome-based approach will be welcomed by organisations that regularly transfer personal data internationally, especially where it is of no practical interest to foreign security agencies. However, this proposed approach will attract the attention of the EU (see later). See also our forthcoming International Transfers webinar.

The Information Commission

Under clause 100 of the Bill, the Information Commissioner’s Office will transform into the Information Commission, a corporate body with a chief executive (presumably John Edwards, the current Commissioner).

The Commission would have a principal function of overseeing data protection, alongside additional duties such as having regard to the desirability of promoting innovation; the desirability of promoting competition; the importance of the prevention, investigation, detection and prosecution of criminal offences; and the need to safeguard public security and national security. New powers for the Commission include an audit/assessment power (clause 35) to require a controller to appoint a person to prepare and provide a report, and a power (clause 36) to compel individuals to attend interviews in civil and criminal investigations.

The Bill also proposes to abolish the Surveillance Camera Commissioner and the Biometrics Commissioner.

Privacy and Electronic Communications (EC Directive) Regulations 2003 

Currently, under PECR, cookies (and similar technologies) can only be used to store or access information on end user terminal equipment without express consent where it is “strictly necessary” e.g. website security or proper functioning of the site. The Bill proposes allowing cookies to be used without consent for the purposes of web analytics and to install automatic software updates (see the GDPR enforcement cases involving Google Analytics). 

Another notable proposed change to PECR involves extending the “soft opt-in” to electronic communications from organisations other than businesses. This would permit political parties, charities and other non-profits to send unsolicited email and SMS direct marketing to individuals without consent, where they have an existing supporter relationship with the recipient.

Finally on PECR, the Bill proposes to increase the fines for infringement from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher).
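As a rough arithmetic sketch of that “whichever is higher” cap (the figures are from the Bill as described above; the function name is our own):

```python
# Proposed PECR maximum fine: the higher of a fixed £17.5m cap
# or 4% of global annual turnover.
FIXED_CAP_GBP = 17_500_000
TURNOVER_RATE = 0.04

def max_pecr_fine(global_annual_turnover_gbp: float) -> float:
    """Return the proposed maximum fine in GBP: whichever is higher."""
    return max(FIXED_CAP_GBP, TURNOVER_RATE * global_annual_turnover_gbp)

# For a company turning over £1bn, 4% (£40m) exceeds the fixed cap.
print(max_pecr_fine(1_000_000_000))  # 40000000.0
# For a company turning over £100m, the £17.5m fixed cap applies.
print(max_pecr_fine(100_000_000))    # 17500000
```

In other words, the 4% limb only bites for organisations with global annual turnover above £437.5m; below that, the fixed cap is the operative maximum.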

Business Data

The Bill would give the Secretary of State and the Treasury the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. “Customers” would not just be data subjects, but anyone who purchased (or received for free) goods, services or digital content from a trader in a consumer (rather than business) context. “Business data” would include information about goods, services and digital content supplied or provided by a trader. It would also include information about where those goods etc. are supplied, the terms on which they are supplied or provided, prices or performance and information relating to feedback from customers. Customers would potentially have a right to access their data, which might include information on the customer’s usage patterns and the price paid to aid personalised price comparisons. Similarly, businesses could potentially be required to publish, or otherwise make available, business data.

These provisions go much further than the existing data portability provisions in the UK GDPR. The latter do not guarantee provision of data in “real time”, nor cover wider contextual data, nor apply where the customer is not an individual.

Adequacy?

The Bill is currently making its way through Parliament. The impact assessment reiterates that “the government’s view is that reform of UK legislation on personal data is compatible with the EU maintaining free flow of personal data from Europe.”  However, with the multiple amendments proposed in the Bill, the UK GDPR is starting to look quite different to the EU version. And the more the two regimes diverge, the more there is a risk that the EU might put a “spanner in the works” when the UK adequacy assessment is reviewed in 2024. Much depends on the balance struck in the final text of the Bill. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September. 

Have you Considered an Apprentice?

Act Now Training has teamed up with Damar Training on materials and expertise underpinning its new Data Protection and Information Governance Practitioner Level 4 Apprenticeship.


The apprenticeship will help develop the skills of those working in the increasingly important fields of data protection and information governance.

With the rapid advancement of technology, there is a huge amount of personal data being processed by organisations, which is the subject of important decisions affecting every aspect of people’s lives. This poses significant legal and ethical challenges, as well as the risk of incurring considerable fines from regulators for non-compliance.

This apprenticeship aims to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address these challenges.

If you know someone who you think would benefit from doing an apprenticeship in DP and IG, then this may be the perfect solution for them. Places are limited for each cohort; cohorts start in September, January and May.

Further details can be found at https://www.actnow.org.uk/apprenticeship

Act Now Launches New Website

Act Now Training is pleased to announce the launch of its new website.

With a fresh new design and new navigation buttons and menus, we are confident that the new website will help you find all the information you need about our courses and services much more efficiently. We have also added our blog to the front page giving you fast access to all the latest news and developments in the world of information law. 

This is only the first phase of our website. We want to improve and streamline the delegates’ learning experience even further. To assist delegates on their learning journey, Phase 2 will include improved backend support and additional learner support with customisable content. All of this, combined with our focus on providing courses that are underpinned by a solid skills and competency framework, will allow us to continue in our aim of being the premier provider for your information governance needs.

We really do hope you enjoy the new website and we would love to receive your feedback about how we can improve the site further to meet your needs. Please get in touch.

The New EU Data Governance Act

On 17th May 2022, the Council of the European Union adopted the Data Governance Act (DGA), or Regulation on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act) (2020/0340 (COD)) to give its full title. The Act aims to boost data sharing in the EU, allowing companies to have access to more data to develop new products and services.

The DGA will achieve its aims through measures designed to increase trust in relation to data sharing, creating new rules on the neutrality of data marketplaces and facilitating the reuse of public sector data. The European Commission says in its Questions and Answers document:

“The economic and societal potential of data use is enormous: it can enable new products and services based on novel technologies, make production more efficient, and provide tools for combatting societal challenges.”

Application

The DGA will increase the amount of data available for re-use within the EU by allowing public sector data to be used for purposes different from those for which it was originally collected. The Act will also create sector-specific data spaces to enable the sharing of data within a specific sector e.g. transport, health, energy or agriculture.

Data is defined as “any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audiovisual recording” that is held by public sector bodies and which is not subject to the Open Data Directive but is subject to the rights of others. Examples include data generated by GPS and healthcare data, which, if put to productive use, could contribute to improving the quality of services. The Commission estimates that the Act could increase the economic value of data by up to €11 billion by 2028.

Each EU Member State will be required to establish a supervisory authority to act as a single information point providing assistance to governments. They will also be required to establish a register of available public sector data. The European Data Innovation Board (see later) will have oversight responsibilities and maintain a central register of available DGA Data. 

On first reading, the DGA seems similar to the Re-use of Public Sector Information Regulations 2015, which implemented Directive 2013/37/EU. The aim of the latter was to remove obstacles that stood in the way of re-using public sector information. However, the DGA goes much further.

Data Intermediary Services 

The European Commission believes that, in order to encourage individuals to allow their data to be shared, they should trust the process by which such data is handled. To this end, the DGA creates data sharing service providers known as “data intermediaries”, which will handle the sharing of data by individuals, public bodies and private companies. The idea is to provide an alternative to the existing major tech platforms.

To uphold trust in data intermediaries, the DGA puts in place several protective measures. Firstly, intermediaries will have to notify public authorities of their intention to provide data-sharing services. Secondly, they will have to commit to the protection of sensitive and confidential data. Finally, the DGA imposes strict requirements to ensure the intermediaries’ neutrality. These providers will have to distinguish their data sharing services from other commercial operations and are prohibited from using the shared data for any other purposes. 

Data Altruism

The DGA encourages data altruism. This is where data subjects (or holders of non-personal data) consent to their data being used for the benefit of society e.g. scientific research purposes or improving public services. Organisations who participate in these activities will be entered into a register held by the relevant Member State’s supervisory authority. In order to share data for these purposes, a data altruism consent form will be used to obtain data subjects’ consent.

The DGA will also create a European Data Innovation Board. Its mission would be to oversee the data sharing service providers (the data intermediaries) and to provide advice on best practices for data sharing.

The UK

Brexit means that the DGA will not apply in the UK, although it clearly may affect UK businesses doing business in the EU. It remains to be seen whether the UK will take a similar approach, although it is notable that UK proposals for amending the GDPR include “amending the law to facilitate innovative re-use of data for different purposes and by different data controllers.”

The DGA will shortly be published in the Official Journal of the European Union and enter into force 20 days after publication. The new rules will apply 15 months thereafter. To further encourage data sharing, on 23 February 2022 the European Commission proposed a Data Act that is currently being worked on.


New DP and IG Practitioner Apprenticeship

Act Now Training has teamed up with Damar Training on materials and expertise underpinning its new Data Protection and Information Governance Practitioner Level 4 Apprenticeship.

The apprenticeship, which received final approval in March, will help develop the skills of those working in the increasingly important fields of data protection and information governance. 


Ibrahim Hasan, Director of Act Now, said:

“We are excited to be working with Damar Training to help deliver this much needed apprenticeship. We are committed to developing the IG sector and encouraging a diverse range of entrants to the IG profession. We have looked at every aspect of the IG Apprenticeship standard to ensure the training materials equip budding IG officers with the knowledge and skills they need to implement the full range of IG legislation in a practical way.”

Damar’s managing director, Jonathan Bourne, added:

“We want apprenticeships to create real, long-term value for apprentices and organisations. It is vital therefore that we work with partners who really understand not only the technical detail but also the needs of employers.

“Act Now Training are acknowledged as leaders in the field, having recently won the Information and Records Management Society (IRMS) Supplier of the Year award for the second consecutive year. I am delighted therefore that we are able to bring together their 20 years of deep sector expertise with Damar’s 40+ year record of delivering apprenticeships in business and professional services.”

This apprenticeship has already sparked significant interest, particularly among large public and private sector organisations and professional services firms. Damar has also assembled an employer reference group that is feeding into the design process in real time to ensure that the programme works for employers.

The employer reference group met for the first time on May 25. It included industry professionals across a variety of sectors including private and public health care, financial services, local and national government, education, IT and data consultancy, some of whom were part of the apprenticeship trailblazer group.

If your organisation is interested in the apprenticeship please get in touch with us to discuss further.
