£1.35 Million GDPR Fine for Catalogue Retailer

On 5th October, the Information Commissioner’s Office (ICO) issued a GDPR Monetary Penalty Notice in the sum of £1,350,000 to Easylife Ltd. The catalogue retailer was found to have been using the personal data of 145,400 customers to predict their medical conditions and then, without their consent, targeting them with health-related products.

This latest ICO fine is interesting but not because of the amount involved. There have been much higher fines. In October 2020, British Airways was fined £20 million for a cyber security breach which saw the personal and financial details of more than 400,000 customers being accessed by hackers. This, like most of the other ICO fines, involved a breach of the security provisions of GDPR. In the Easylife fine, the ICO focussed on the more interesting GDPR provisions (from a practitioner’s perspective) relating to legal basis, profiling and transparency. 

The background to the fine is that a telemarketing company was being investigated by the ICO for promoting funeral plans during the pandemic. This led to the investigation into Easylife because the company was conducting marketing calls for Easylife. The investigation initially concerned potential contraventions of the Privacy and Electronic Communications Regulations (PECR), and that investigation raised concerns of potential contraventions of GDPR, which the Commissioner then investigated separately.

The ICO investigation found that when a customer purchased a product from Easylife’s Health Club catalogue, the company would make assumptions about their medical condition and then market health-related products to them without their consent. For example, if a person bought a jar opener or a dinner tray, Easylife would use that purchase data to assume that person had arthritis and then call them to market glucosamine joint patches.

Special Category Data and Profiling

Article 4(4) of the GDPR defines profiling:
“‘profiling’ means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements;”

Out of 122 products in Easylife’s Health Club catalogue, 80 were considered to be ‘trigger products’. Once these products were purchased by customers, Easylife would target them with a health-related item. The ICO found that significant profiling of customers was taking place.
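The mechanics of this kind of profiling are simple to sketch. The Python fragment below is purely illustrative (the product names and inferred conditions are hypothetical, not taken from the ICO’s notice); it shows how a lookup table of ‘trigger products’ can turn ordinary transactional data into inferred health data, which is precisely why the ICO treated the output as Special Category Data.

```python
# Illustrative only: hypothetical trigger products and inferred conditions.
# Mapping a purchase to a presumed medical condition creates health data,
# which is Special Category Data under Article 9 of the UK GDPR.
TRIGGER_PRODUCTS = {
    "jar opener": "arthritis",
    "dinner tray": "arthritis",
    "bath step": "mobility impairment",
}

def infer_conditions(purchases):
    """Return the set of health conditions inferred from a purchase history."""
    return {TRIGGER_PRODUCTS[p] for p in purchases if p in TRIGGER_PRODUCTS}

conditions = infer_conditions(["jar opener", "kettle"])
# 'conditions' is now inferred health data about the customer;
# processing it lawfully would need an Article 9 condition such as
# explicit consent.
```

The point is that the inference, not the original purchase record, is what engages Article 9: a jar opener is not health data, but “this customer probably has arthritis” is.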

The health information Easylife inferred from customers’ transactional data, namely that a customer probably had a particular health condition, was Special Category Data. Articles 6 and 9 of the GDPR provide that such data may not be processed unless a lawful basis and a relevant condition can be found. The only relevant condition in the context of Easylife’s health campaign was explicit consent. Easylife did not collect consent to process Special Category Data, instead relying on legitimate interests (based on its privacy notice) under Article 6. As a result, it had no lawful basis to process the data, in contravention of Articles 6 and 9 of the GDPR.

Invisible Processing

Furthermore, the ICO concluded that ‘invisible’ processing of health data took place. It was ‘invisible’ because Easylife’s customers were unaware that the company was collecting and using their personal data for profiling/marketing purposes. In order to process this data lawfully, Easylife would have had to collect explicit consent from the customers and to update its privacy policy to indicate that Special Category Data was to be processed by consent. Easylife’s failure to do so was a breach of Article 13(1)(c) of the GDPR.

John Edwards, UK Information Commissioner, said:

“Easylife was making assumptions about people’s medical condition based on their purchase history without their knowledge, and then peddled them a health product – that is not allowed.

The invisible use of people’s data meant that people could not understand how their data was being used and, ultimately, were not able to exercise their privacy and data protection rights. The lack of transparency, combined with the intrusive nature of the profiling, has resulted in a serious breach of people’s information rights.”

One other ICO monetary penalty notice has examined these issues in detail. In May 2022, Clearview AI was fined £7,552,800 following an investigation into its online database, which contains 20 billion images of people’s faces scraped from the internet.

As Jon Baines pointed out (thanks Jon!), on the Jiscmail bulletin board, a large chunk of the online programmatic advertising market also profiles people and infers Special Category Data in the same way as Easylife. This was highlighted in the ICO’s 2019 report. The ICO said in January last year that it was resuming its Adtech investigation, but there has been very little news since then.

GDPR was not the only cause of Easylife’s woes. It was also fined £130,000 under PECR for making 1,345,732 direct marketing calls to people registered with the Telephone Preference Service (TPS).

This case also shows the importance of organisations only using telephone marketing companies that understand and comply with GDPR and PECR. If they do not, the ICO enforcement spotlight will also fall on the clients of such companies.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. 

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 25th October. 

TikTok Faces a £27 Million GDPR Fine

On 26 September 2022, TikTok was issued with a Notice of Intent under the GDPR by the Information Commissioner’s Office (ICO). The video-sharing platform faces a £27 million fine after an ICO investigation found that the company may have breached UK data protection law.  

The notice sets out the ICO’s provisional view that TikTok breached UK data protection law between May 2018 and July 2020. It found the company may have:

  • processed the data of children under the age of 13 without appropriate parental consent,
  • failed to provide proper information to its users in a concise, transparent and easily understood way, and
  • processed special category data, without legal grounds to do so.

The Information Commissioner, John Edwards said:

“We all want children to be able to learn and experience the digital world, but with proper data privacy protections. Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.

“I’ve been clear that our work to better protect children online involves working with organisations but will also involve enforcement action where necessary. In addition to this, we are currently looking into how over 50 different online services are conforming with the Children’s code and have six ongoing investigations looking into companies providing digital services who haven’t, in our initial view, taken their responsibilities around child safety seriously enough.”

Rolled out in September last year, the Children’s Code puts in place new data protection standards for online services likely to be accessed by children.

It will be interesting to see if and when this notice becomes an actual fine. If it does, it will be the largest fine issued by the ICO. It is also the first potential fine to look at transparency and consent, and will provide valuable guidance to Data Controllers, especially if it is appealed to the Tribunal.

It is important to note that this is not a fine but ‘notice of intent’ – a legal document that precedes a potential fine. The notice sets out the ICO’s provisional view which may of course change after TikTok makes representations. 

Remember, we have been here before. In July 2019, British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine, issued in October 2020, was £20 million. In November 2020, Marriott International Inc was fined £18.4 million, much lower than the £99 million set out in the original notice.

This is not the first time TikTok has found itself in hot water over its data handling practices. In 2019, the company was given a record $5.7m fine by the Federal Trade Commission for mishandling children’s data. It has also been fined in South Korea for similar reasons.

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 25th October. 

Data Protection and IG Practitioner Launch Event

On 7th September, we celebrated the launch of the new Data Protection and Information Governance Practitioner Apprenticeship. The apprenticeship, which received final approval in March, will help develop the skills of those working in the increasingly important fields of data protection and information governance.  Act Now Training has teamed up with leading apprenticeship provider, Damar Training, to provide the materials and expertise underpinning the new apprenticeship. 

The launch event, held at The Bloc in Manchester, was attended by data protection experts and learning and development leads from a wide range of public and private sector organisations across England. Attendees included members of the Trailblazer Group that designed the apprenticeship and also members of Damar’s employer reference group. The latter has been a close part of the programme design process, ensuring that the programme meets the needs of employers in this rapidly growing sector.

Attendees enjoyed talks during the event from Jonathan Bourne (Managing Director, Damar Training), Ibrahim Hasan (Director, Act Now Training) and Phillipa Nazari (Assistant Director Information Governance and Data Protection Officer, Greater Manchester Combined Authority and Transport for Greater Manchester). Phillipa was also the chair of the Trailblazer Group.

Jonathan Bourne commented:

“The data protection apprenticeship is much needed. It helps address skills shortages in data protection but also enables organisations to improve compliance, reduce regulatory and legal risk and increase their efficiency. We are delighted to be working with so many committed employers who, like us, see the apprenticeship as one of the keys to improved capability in data protection and information governance.”

Ibrahim Hasan added:

“We are excited to be working with Damar Training on this much needed apprenticeship. It will develop the information governance profession by encouraging a diverse range of new information governance professionals with the knowledge and skills to enable them to tackle the data protection challenges ahead.”

Phillipa commented:

“It has been a great honour to give something back to the profession that will leave a lasting legacy for employers. Each individual and organisation that benefits from the apprenticeship will be ensuring that a fundamental human right is upheld and organisations can derive value and insight from their data and information in the appropriate, legal and ethical way. I hope that the apprenticeship will open up opportunities for a rich and varied career and be accessible for an even more diverse pool of talent. It’s been needed for a long time and now the future looks bright.” 

The apprenticeship is available to employers across England. The first cohort of apprentices on this exciting and innovative new programme will start in October. 

If your organisation is interested in the apprenticeship please get in touch with us to discuss further. 

The Data Protection and Digital Information Bill: A new UK GDPR?

In July the Government published the Data Protection and Digital Information Bill, the next step in its much publicised plans to reform the UK Data Protection regime following Brexit. 

In the Government’s response to the September 2021 consultation (“Data: A New Direction”) it said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” To achieve this, the new Bill proposes substantial amendments to existing UK data protection legislation; namely the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). There is no shiny new Data Protection Act 2022 or even a new colour for the UK GDPR! Perhaps a missed opportunity to showcase the benefits of Brexit! 

In addition to reforming core data protection law, the Bill deals with certification of digital identity providers, electronic registers of births and deaths and information standards for data-sharing in the health and adult social care system. The notable DP provisions are set out below.

Amended Definition of Personal Data

Clause 1 of the Bill limits the scope of personal data to:

  • where the information is identifiable by the controller or processor by reasonable means at the time of the processing; or
  • where the controller or processor ought to know that another person will likely obtain the information as a result of the processing and the individual will likely be identifiable by that person by reasonable means at the time of the processing.

This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world. It could make it easier for organisations to achieve data anonymisation as they would no longer need to concern themselves with potential future identifiability, with the focus instead being on identifiability “at the time of the processing”. On the other hand, the change does not address the risk of indirect identification.

Vexatious Data Subject Requests

Article 12 of the UK GDPR allows controllers to refuse to comply with data subject rights requests (or charge a fee) when the requests are “manifestly unfounded” or “excessive”.  Clause 7 of the Bill proposes to replace this with “vexatious” or “excessive”. Examples of vexatious requests given in the Bill are those requests intended to cause distress, not made in good faith, or that are an abuse of process. All these could easily fit into “manifestly unfounded” and so it is difficult to understand the need for change here. 

Data Subject Complaints

Currently, the UK GDPR allows a data subject to complain to the Information Commissioner, but nothing expressly deals with whether or how they can complain to a controller. Clause 39 of the Bill would make provision for this and require the controller to acknowledge receipt of such a complaint within 30 days and respond substantively “without undue delay”. However, under clause 40, if a data subject has not made a complaint to the controller, the ICO is entitled not to accept the complaint.

Much was made about “privacy management programmes” in the Government’s June announcement. These are not expressly mentioned in the Bill but most of the proposals that were to have fallen under that banner are still there (see below).

Senior Responsible Individuals

As announced in June, the obligation for some controllers and processors to appoint a Data Protection Officer (DPO) is proposed to be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals, are required (by clause 14) to designate a senior manager as a “Senior Responsible Individual”. Just like the DPO, the SRI must be adequately resourced and cannot be dismissed for performing their tasks under the role. The requirement for them to be a senior manager (rather than just reporting to senior management, as current DPOs must) will cause problems for those organisations currently using outsourced DPO services.

ROPAs and DPIAs

The requirement for Records of Processing Activities (ROPAs) will also go. Clause 15 of the Bill proposes to replace it with a leaner “Record of Processing of Personal Data”.  Clause 17 will replace Data Protection Impact Assessments (DPIAs) with leaner and less prescriptive Assessments of High Risk Processing. Clause 18 ensures that controllers are no longer required, under Article 36 of the UK GDPR, to consult the ICO on certain high risk DPIAs.

Automated Decision Making

Article 22 of UK GDPR currently confers a “right” on data subjects not to be subject to automated decision making which produces legal effects or otherwise significantly affects them. Clause 11 of the Bill reframes Article 22 in terms of a positive right to human intervention. However, it would only apply to “significant” decisions, rather than decisions that produce legal effects or similarly significant effects. It is unclear whether this will make any practical difference. 

International Transfers 

The judgment of the European Court of Justice (ECJ) in “Schrems II” not only stated that organisations that transfer personal data to the US can no longer rely on the Privacy Shield Framework as a legal transfer tool. It also said that in any international data transfer situation, whether to the USA or other countries, the data exporter needs to make a complex assessment of the recipient country’s data protection legislation to ensure that it adequately protects the data, especially from access by foreign security agencies (a Transfer Impact Assessment or TIA).

The Bill amends Chapter 5 of the UK GDPR (international transfers) with the introduction of the “data protection test” for the above-mentioned assessment. This would involve determining if the standard of protection provided for data subjects in the recipient country is “not materially lower” than the standard of protection in the UK. The new test would apply both to the Secretary of State, when making “adequacy” determinations, and to controllers, when deciding whether to transfer data. The explanatory notes to the Bill state that the test would not require a “point-by-point comparison” between the other country’s regime and the UK’s. Instead, an assessment will be “based on outcomes i.e. the overall standard of protection for a data subject”.

An outcome-based approach will be welcomed by organisations that regularly transfer personal data internationally, especially where the data is of no practical interest to foreign security agencies. However, this proposed approach is likely to attract the attention of the EU (see later, and also our forthcoming International Transfers webinar).

The Information Commission

Under clause 100 of the Bill, the Information Commissioner’s Office will transform into the Information Commission, a corporate body with a chief executive (presumably John Edwards, the current Commissioner).

The Commission would have a principal function of overseeing data protection alongside additional duties such as to have regard to the desirability of promoting innovation; the desirability of promoting competition; the importance of the prevention, investigation, detection and prosecution of criminal offences; and the need to safeguard public security and national security. New powers for the Commission include an audit/assessment power (clause 35) to require a controller to appoint a person to prepare and provide a report and to compel individuals to attend for interviews (clause 36) in civil and criminal investigations.

The Bill also proposes to abolish the Surveillance Camera Commissioner and the Biometrics Commissioner.

Privacy and Electronic Communications (EC Directive) Regulations 2003 

Currently, under PECR, cookies (and similar technologies) can only be used to store or access information on end user terminal equipment without express consent where it is “strictly necessary” e.g. website security or proper functioning of the site. The Bill proposes allowing cookies to be used without consent for the purposes of web analytics and to install automatic software updates (see the GDPR enforcement cases involving Google Analytics). 

Another notable proposed change to PECR, involves extending “the soft opt-in” to electronic communications from organisations other than businesses. This would permit political parties, charities and other non-profits to send unsolicited email and SMS direct marketing to individuals without consent, where they have an existing supporter relationship with the recipient. 

Finally on PECR, the Bill proposes to increase the maximum fine for infringement from the current £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher).
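This is the same “higher of” formula already used for UK GDPR penalties, and a quick worked example shows how it operates in practice (the turnover figures below are illustrative):

```python
# Illustrative calculation of the proposed PECR maximum penalty,
# using the UK GDPR "higher of" formula.
FIXED_CAP = 17_500_000  # £17.5m

def max_penalty(global_annual_turnover):
    """Higher of £17.5m or 4% of global annual turnover."""
    return max(FIXED_CAP, 0.04 * global_annual_turnover)

max_penalty(100_000_000)    # turnover £100m: 4% is £4m, so the £17.5m cap applies
max_penalty(1_000_000_000)  # turnover £1bn: 4% is £40m, which exceeds £17.5m
```

For smaller organisations the £17.5m figure is therefore the binding maximum; only for very large businesses does the turnover-based limb bite.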

Business Data

The Bill would give the Secretary of State and the Treasury the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data. “Customers” would not just be data subjects, but anyone who purchased (or received for free) goods, services or digital content from a trader in a consumer (rather than business) context. “Business data” would include information about goods, services and digital content supplied or provided by a trader. It would also include information about where those goods etc. are supplied, the terms on which they are supplied or provided, prices or performance and information relating to feedback from customers. Customers would potentially have a right to access their data, which might include information on the customer’s usage patterns and the price paid to aid personalised price comparisons. Similarly, businesses could potentially be required to publish, or otherwise make available, business data.

These provisions go much further than the existing data portability provisions in the UK GDPR, which do not guarantee provision of data in “real time”, do not cover wider contextual data, and do not apply where the customer is not an individual.

Adequacy?

The Bill is currently making its way through Parliament. The impact assessment reiterates that “the government’s view is that reform of UK legislation on personal data is compatible with the EU maintaining free flow of personal data from Europe.”  However, with the multiple amendments proposed in the Bill, the UK GDPR is starting to look quite different to the EU version. And the more the two regimes diverge, the more there is a risk that the EU might put a “spanner in the works” when the UK adequacy assessment is reviewed in 2024. Much depends on the balance struck in the final text of the Bill. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September. 

Have you Considered an Apprentice?

Act Now Training has teamed up with Damar Training on materials and expertise underpinning its new Data Protection and Information Governance Practitioner Level 4 Apprenticeship.


The apprenticeship will help develop the skills of those working in the increasingly important fields of data protection and information governance.

With the rapid advancement of technology, there is a huge amount of personal data being processed by organisations, which is the subject of important decisions affecting every aspect of people’s lives. This poses significant legal and ethical challenges, as well as the risk of incurring considerable fines from regulators for non-compliance.

This apprenticeship aims to develop individuals into accomplished data protection and information governance practitioners with the knowledge, skills and competencies to address these challenges.

If you know someone who you think would benefit from doing an apprenticeship in DP and IG, then this may be the perfect solution for them.
Places are limited for each cohort. Cohorts start in September, January and May.

Further details can be found at https://www.actnow.org.uk/apprenticeship

A New GDPR Fine and a New ICO Enforcement Approach

Since May 25th 2018, the Information Commissioner’s Office (ICO) has issued ten GDPR fines. The latest was issued on 30th June 2022 to Tavistock and Portman NHS Foundation Trust for £78,400. The Trust had accidentally revealed 1,781 adult gender identity patients’ email addresses when sending out an email.

This is the second ICO fine issued to a Data Controller in these circumstances. In 2021, HIV Scotland was fined £10,000 when it sent an email to 105 people which included patient advocates representing people living with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name. From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk. 

The latest fine was issued to Tavistock and Portman NHS Foundation Trust following an email sent in early September 2019. The Trust intended to run a competition inviting patients of the adult Gender Identity Clinic to provide artwork to decorate a refurbished clinic building. It sent two identical emails promoting the competition (one to 912 recipients, and the second to 869 recipients) before realising it had not Bcc’d the addresses.

It was clear from the content of the email that all the recipients were patients of the clinic, and there was a risk further personal details could be found by researching the email addresses. The Trust immediately realised the error and tried, unsuccessfully, to recall the emails. It wrote to all the recipients to apologise and informed the ICO later that day.

The ICO investigation found:

  • Two similar, smaller incidents had affected a different department of the same Trust in 2017. While that department had strengthened their processes as a result, the learning and changes were not implemented across the whole Trust.
  • The Trust was overly reliant on people following policy to prevent bulk emails being sent using the ‘To’ field in Outlook. There were no technical or organisational safeguards in place to prevent or mitigate this very predictable human error. The Trust has since procured specialist bulk email software and set a “maximum ‘To’ recipient” rule on the email server.
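A technical safeguard of the kind the ICO describes can be sketched in a few lines. The following Python example is illustrative only (the function names and the recipient limit are assumptions, not the Trust’s actual configuration): it builds one message per recipient rather than a single message with every address in ‘To’, so no recipient ever sees another’s email address.

```python
from email.message import EmailMessage

MAX_TO_RECIPIENTS = 1  # illustrative "maximum 'To' recipient" rule

def build_individual_messages(sender, recipients, subject, body):
    """Build one message per recipient so addresses are never shared.

    Refuses to create any message whose 'To' header would exceed the
    configured limit - the kind of technical control the ICO said was
    missing.
    """
    messages = []
    for recipient in recipients:
        msg = EmailMessage()
        msg["From"] = sender
        msg["To"] = recipient  # exactly one visible address per message
        msg["Subject"] = subject
        msg.set_content(body)
        if len(msg["To"].split(",")) > MAX_TO_RECIPIENTS:
            raise ValueError("Too many 'To' recipients")
        messages.append(msg)
    return messages
```

Each message can then be handed to an SMTP client individually; because the recipient list is expanded server-side into separate messages, a single mis-click can no longer expose every patient’s address at once.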

The ICO reduced the fine issued to the Trust from £784,800 to £78,400 to reflect the ICO’s new approach to working more effectively with public authorities. This approach, which will be trialled over the next two years, was outlined in an open letter from the UK Information Commissioner John Edwards to public authorities. It will see more use of the Commissioner’s discretion to reduce the impact of fines on the public sector, coupled with better engagement including publicising lessons learned and sharing good practice. 

In practice, the new approach will mean an increased use of the ICO’s wider powers, including warnings, reprimands and enforcement notices, with fines only issued in the most serious cases. When a fine is considered, the decision notice will give an indication on the amount of the fine the case would have attracted. This will provide information to the wider economy about the levels of penalty others can expect from similar conduct. Additionally, the ICO will be working more closely with the public sector to encourage compliance with data protection law and prevent harms before they happen.

The ICO followed its new approach recently when issuing a reprimand to NHS Blood and Transplant Service. In August 2019, the service inadvertently released untested development code into a live system for matching transplant list patients with donated organs. This error led to five adult patients on the non-urgent transplant list not being offered transplant livers at the earliest possible opportunity. The service remedied the error within a week, and none of the patients involved experienced any harm as a result. The ICO says that, if the revised enforcement approach had not been in place, the service would have received a fine of £749,856.

The new approach will be welcome news to the public sector at a time of pressure on budgets. However some have questioned why the public sector merits this special treatment. It is not as if it has been the subject of a disproportionate number of fines. The first fine to a public authority was only issued in December 2021 (more than three and a half years after GDPR came into force) when the Cabinet Office was fined £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients online. Perhaps the ICO is already thinking about the reform of its role following the DCMS’s response to last year’s GDPR consultation. It will be interesting to see if others, particularly the charity sector, lobby for similar treatment. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in September.

Act Now Launches New Website

Act Now Training is pleased to announce the launch of its new website.

With a fresh new design and new navigation buttons and menus, we are confident that the new website will help you find all the information you need about our courses and services much more efficiently. We have also added our blog to the front page giving you fast access to all the latest news and developments in the world of information law. 

This is only the first phase of our website. We want to improve and streamline the delegates’ learning experience even further. To assist delegates on their learning journey, Phase 2 will include improved backend support and additional learner support with customisable content. All of this, combined with our focus on providing courses that are underpinned by a solid skills and competency framework, will allow us to continue in our aim of being the premier provider for your information governance needs.

We really do hope you enjoy the new website and we would love to receive your feedback about how we can improve the site further to meet your needs. Please get in touch.

The Future of the UK Data Protection Regime

Last week, the Government signalled its plans to reform the UK Data Protection regime by publishing its response to the consultation launched in September last year. In “Data: A New Direction” the Government said it intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” Time will tell whether the proposed changes set out in the response will achieve this aim.

The Government has avoided the temptation to change the title of the UK GDPR to something more post Brexit which says “see, we told you Brexit would bring benefits”. No DPA 2022, however the UK GDPR will be amended as will the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). 

Privacy Management Programmes

The main proposed change will be to the UK GDPR’s accountability framework. This proposal would require an organisation to develop and implement a risk-based privacy management programme that reflects the volume and sensitivity of the personal information it handles, and the type(s) of data processing it carries out. A privacy management programme would include the appropriate personal information policies and processes for the protection of personal information.

To support the implementation of the new accountability framework, the Government intends to remove the requirement to:

  • Designate a Data Protection Officer under Article 37.  This will be replaced by the need to appoint a suitable individual to oversee the organisation’s DP compliance. A DPO by another name?
  • Undertake a Data Protection Impact Assessment under Article 35. Under the new privacy management programme, organisations will still be required to identify and manage risks, but they will be granted greater flexibility as to how to meet these requirements.
  • Maintain a Record of Processing Activity (ROPA) under Article 30. Organisations will still need to have personal data inventories as part of their privacy management programme which describe what and where personal data is held, why it has been collected and how sensitive it is, but they will not be required to do so in the way prescribed by the requirements set out in Article 30.
  • Consult the ICO, under Article 36, in relation to high-risk personal data processing where the risk cannot be mitigated.

Some commentators have likened these proposals to “the Emperor’s new clothes.” There is a lot of tinkering and changing of names but the bottom line (no pun intended) remains the same. Those who take data protection seriously will continue to do what they have always done (e.g. DPIAs and having a DPO) whilst those who see data protection as a burden will treat the proposals as an excuse to do the absolute minimum. 

Subject Access Costs

The Government, in its response to the consultation, recognises the burden subject access requests can place on some organisations. However, despite a proposal in the consultation, it does not plan to reintroduce a fee for a subject access request; nor will there be a cost ceiling for responding to a request, as there is under the Freedom of Information Act. In the future, though, “vexatious or excessive” requests will be able to be refused under Article 12. Query the difference between this and the current wording of “manifestly unfounded or excessive”. 

PECR and Marketing 

The government also consulted on possible changes to PECR which regulates, amongst other things, cookie rules and unsolicited direct marketing communications. The main changes to expect include:

  • Permitting organisations to use analytics cookies and similar technologies without a user’s consent. 
  • Permitting organisations to store information on, or collect information from, a user’s device without their consent for other limited purposes.
  • Extending “the soft opt-in” to electronic communications from organisations other than businesses, including political parties and other non-commercial entities, where they have previously formed a relationship with the person, perhaps as a result of membership or subscription.
  • Making it easier for political groups to use data for “political engagement”.
  • Increasing the PECR fines to GDPR levels.

There are many more proposals, including to change the structure and governance of the ICO, helpfully summarised in Annex A of the Government’s response. The big question now is how the proposed changes will be viewed by the European Commission. Will it be prompted to review the UK’s current “adequacy status” allowing free transfer of personal data between the UK and the EU? Let us know your thoughts in the comment field below.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September. 

Calling Cyber Security Trainers:
We are Hiring

Are you a cyber security expert with a reputation for delivering engaging training? We are recruiting trainers to join our team of expert associates who deliver in-house and external training courses throughout the UK and worldwide.

We are one of Europe’s leading information law training companies with a 20-year track record of delivering practical and engaging training which makes the complex simple. We recently won the Information and Records Management Society (IRMS) Supplier of the Year award for 2022-23. This is the second consecutive year we have won this award.

Despite recently expanding our team, we are seeing an increase in global demand for our courses and consulting services from both the public and private sectors. We need more talented trainers who enjoy the challenge of explaining difficult concepts in a practical, jargon-free manner.

We have opportunities for full time trainers and those who wish to add an extra “string to their bow” without leaving their day job. What is important is that you are enthusiastic about Cyber Security and passionate about teaching it.

If you think you have what it takes to become an Act Now trainer, please get in touch with your CV explaining your knowledge and experience of delivering training and consultancy services in Cyber Security. 

The New EU Data Governance Act

On 17th May 2022, the Council of the European Union adopted the Data Governance Act (DGA) or, to give its full title, the Regulation on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act) (2020/0340 (COD)). The Act aims to boost data sharing in the EU by allowing companies access to more data to develop new products and services. 

The DGA will achieve its aims through measures designed to increase trust in relation to data sharing, creating new rules on the neutrality of data marketplaces and facilitating the reuse of public sector data. The European Commission says in its Questions and Answers document:

“The economic and societal potential of data use is enormous: it can enable new products and services based on novel technologies, make production more efficient, and provide tools for combatting societal challenges.”

Application

The DGA will increase the amount of data available for re-use within the EU by allowing public sector data to be used for purposes different from those for which it was originally collected. The Act will also create sector-specific data spaces to enable the sharing of data within a specific sector e.g. transport, health, energy or agriculture.

Data is defined as “any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audiovisual recording” that is held by public sector bodies and which is not subject to the Open Data Directive but is subject to the rights of others. Examples include data generated by GPS and healthcare data, which, if put to productive use, could contribute to improving the quality of services. The Commission estimates that the Act could increase the economic value of data by up to €11 billion by 2028.

Each EU Member State will be required to establish a supervisory authority to act as a single information point providing assistance to governments. They will also be required to establish a register of available public sector data. The European Data Innovation Board (see later) will have oversight responsibilities and maintain a central register of available DGA Data. 

On first reading, the DGA seems similar to The Re-use of Public Sector Information Regulations 2015, which implemented Directive 2013/37/EU. The aim of the latter was to remove obstacles that stood in the way of re-using public sector information. However, the DGA goes much further. 

Data Intermediary Services 

The European Commission believes that, in order to encourage individuals to allow their data to be shared, they should trust the process by which such data is handled. To this end, the DGA creates data sharing service providers known as “data intermediaries”, which will handle the sharing of data by individuals, public bodies and private companies. The idea is to provide an alternative to the existing major tech platforms.

To uphold trust in data intermediaries, the DGA puts in place several protective measures. Firstly, intermediaries will have to notify public authorities of their intention to provide data-sharing services. Secondly, they will have to commit to the protection of sensitive and confidential data. Finally, the DGA imposes strict requirements to ensure the intermediaries’ neutrality. These providers will have to distinguish their data sharing services from other commercial operations and are prohibited from using the shared data for any other purposes. 

Data Altruism

The DGA encourages data altruism. This is where data subjects (or holders of non-personal data) consent to their data being used for the benefit of society e.g. scientific research purposes or improving public services. Organisations which participate in these activities will be entered into a register held by the relevant Member State’s supervisory authority. In order to share data for these purposes, a data altruism consent form will be used to obtain data subjects’ consent.

The DGA will also create a European Data Innovation Board. Its mission will be to oversee the data sharing service providers (the data intermediaries) and provide advice on best practices for data sharing.

The UK

Brexit means that the DGA will not apply in the UK, although it may clearly affect UK businesses doing business in the EU. It remains to be seen whether the UK will take a similar approach, although it is notable that the UK’s proposals for amending the GDPR include “amending the law to facilitate innovative re-use of data for different purposes and by different data controllers.”

The DGA will shortly be published in the Official Journal of the European Union and enter into force 20 days after publication. The new rules will apply 15 months thereafter. To further encourage data sharing, on 23 February 2022 the European Commission proposed a Data Act that is currently being worked on.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September.
