The Hidden Reach of the Prevent Strategy: Beyond Counter-Terrorism Units

The UK government’s anti-radicalisation program, Prevent, is reportedly sharing the personal details of thousands of individuals more extensively than previously known. Recipients include not just counter-terrorism units but also airports, ports, immigration services, and officials at the Home Office and the Foreign, Commonwealth and Development Office (FCDO). Critics argue that such widespread data sharing could be unlawful, as it involves moving sensitive personal data between databases without the consent of the individuals concerned.

A Metropolitan police document titled “Prevent case management guidance” indicates that Prevent details are also shared with the ports authority watchlist. This raises concerns that individuals may face increased scrutiny at airports or be subjected to counter-terrorism powers without reasonable suspicion. The document also mentions that foreign nationals may have their backgrounds checked by the FCDO and immigration services for any overseas convictions or intelligence. 

Furthermore, the Acro Criminal Records Office, which manages UK criminal records, is notified about individuals referred to Prevent, despite the program dealing with individuals who haven’t necessarily engaged in criminal behaviour.

Counter-terror police emphasise their careful approach to data sharing, which aims to protect vulnerable individuals.

Prevent’s goal is to divert people from terrorism before they offend; most people are unaware that they have been referred to the program, and 95% of referrals result in no further action. The existence of a secret database, the National Police Prevent Case Management database, was first disclosed in 2019, revealing that the details of those referred to Prevent are stored.

Newly disclosed information, obtained through a freedom of information request by the Open Rights Group (ORG), reveals that Prevent data is shared across various police databases, including the Police National Computer and specialised counter-terrorism and local intelligence systems, as well as with the National Crime Agency.

The sharing of this data was accidentally revealed due to a redaction error in a heavily edited Met document. Despite its sensitive nature, the ORG decided to make the document public. Sophia Akram of the ORG expressed concerns over the extent of the data sharing and potential harms, suggesting that it could be unfair and possibly unlawful. 

The guidance also indicates that data is retained and used even in cases where no further action is taken. There are concerns about the impact on young people’s educational opportunities, as Prevent requires public bodies like schools and the police to identify individuals at risk of extremism. 

Recent figures show thousands of referrals to Prevent, with the education sector the largest single source. From April 2022 to March 2023, a total of 6,817 individuals were referred to the Prevent program, of whom 2,684 were referred by educational institutions. Broken down by age, 2,203 referrals concerned adolescents between the ages of 15 and 20, and 2,119 involved children aged 14 or younger.

There are worries about the long-term consequences for children and young people referred to the program. Several cases have highlighted the intrusive nature of this data sharing and its potential impact on individuals’ lives, including cases in which students have missed out on a place at a sixth form college, and others involving children as young as four years old.

Prevent Watch, an organisation monitoring the program, has raised alarms about the data sharing, particularly its effect on young children. The FoI disclosures challenge the notion that Prevent is non-criminalising, as data on individuals, even those marked as ‘no further action’, can be stored on criminal databases and flagged on watchlists. 

Counter-terrorism policing spokespeople defend the program, emphasising its multi-agency nature and focus on protecting people from harm. They assert that data sharing is carefully managed and legally compliant, aiming to safeguard vulnerable individuals from joining terror groups or entering conflict zones.

Learn more about data sharing with our UK GDPR Practitioner Certificate. Dive into the issues discussed in this blog and secure your spot now.

The NHS-Palantir Deal: A Pandora’s Box for Patient Privacy? 

The National Health Service (NHS) of England’s recent move to sign a £330 million deal with Palantir Technologies Inc. has set off alarm bells in the realm of patient privacy and data protection. Palantir, a data analytics company with roots in the US intelligence and military sectors, is now at the helm of creating a mammoth NHS data platform. This raises a critical question: is patient privacy the price of progress?

The Controversial Contractor 

Palantir’s pedigree of working closely with entities like the CIA, and its work for the UK Ministry of Defence, has drawn criticism of the NHS’s decision. This association, coupled with its founder’s contentious remarks about the NHS, casts a long shadow over the appointment. Critics highlight Palantir’s controversial history, notably its support for the US immigration enforcement’s stringent policies under the Trump administration. The ethical ramifications of such affiliations are profound, given the sensitive nature of health data. NHS England said on Tuesday that Accenture, PwC, NECS and Carnall Farrar will all support Palantir in delivering the platform.

Data Security vs. Data Exploitation 

NHS England assures that the new “federated data platform” (FDP) will be a secure, privacy-enhancing technology that will revolutionise care delivery. The promise is a streamlined, efficient service with live data at clinicians’ fingertips. However, concern about the potential for data exploitation looms large. Can a firm with a not-so-distant history of aiding surveillance be trusted with the most intimate details of our lives: our health records?

The Right to Opt-Out: A Right Denied? 

The debate intensifies around the right, or the apparent lack thereof, for patients to opt out of this data sharing. The NHS states that all data will be anonymised and used solely for “direct patient care”, and argues that an opt-out is therefore unnecessary. Yet this has not quelled the concerns of privacy advocates and civil liberty groups, who foresee a slippery slope towards panoptic oversight of personal health information.

Scepticism is further fuelled by the NHS’s troubled history with data projects, where previous attempts to centralise patient data have collapsed under public opposition. The fear that history might repeat itself is palpable, and the NHS’s ability to sway public opinion in favour of the platform remains a significant hurdle.

Conclusion 

As we venture further into an age where data is king, the NHS-Palantir partnership is a litmus test for the delicate balance between innovation and privacy. The NHS’s venture is indeed ambitious, but it must not be deaf to the cacophony of concerns surrounding patient privacy. Transparency, robust data governance, and the right to opt out must not be side-lined in the pursuit of technological advancement. After all, when it comes to our personal health data, should we not have the final say in who holds the keys to our digital lives? 

Take a look at our highly popular Data Ethics Course. Places fill up fast, so if you would like to learn more about this fascinating area, book your place now.

UK Biobank’s Data Sharing Raises Alarm Bells

An investigation by The Observer has uncovered that the UK Biobank, a repository of health data from half a million UK citizens, has been sharing information with insurance companies. This development contravenes the Biobank’s initial pledge to keep this sensitive data out of the hands of insurers, a promise that was instrumental in garnering public trust at the outset. UK Biobank has since responded to the article, calling it “disingenuous” and “extremely misleading”.

A Promise Made, Then Modified 

The UK Biobank was set up in 2006 as a goldmine for scientific discovery, offering researchers access to a treasure trove of biological samples and associated health data. With costs for access set between £3,000 and £9,000, the research derived from this data has been nothing short of revolutionary. However, the foundations of this scientific jewel are now being questioned. 

When the project was first announced, clear assurances were given that data would not be made available to insurance companies, mitigating fears that genetic predispositions could be used discriminatorily in insurance assessments. These assurances appeared in the Biobank’s FAQs and were echoed in parliamentary discussions. 

Changing Terms Amidst Grey Areas 

The Biobank contends that it does strictly regulate data access, allowing only verified researchers to delve into its database, but that this can include commercial entities such as insurance firms where the research is deemed to be in the public interest. The boundaries of what constitutes “health-related” research and the “public interest” are now under scrutiny.

However, according to the Observer investigation, evidence suggests that this nuance, that commercial entities may conduct health-related research, was not clearly communicated to participants, especially given the categorical assurances made previously. UK Biobank categorically denies this, and has shared its consent form and participant information leaflet in response.

Data Sharing: The Ethical Quandary 

This breach of the original promise has raised the ire of experts in genetics and data privacy, with Prof Yves Moreau highlighting the severity of the breach of trust. The concern is not just about the sharing of data but about the integrity of consent given by participants. The Biobank’s response indicates that the commitments made were outdated and that the current policy, which includes sharing anonymised data for health-related research, was made clear to participants upon enrolment. 

The Ripple Effect of Biobank’s Data Policies 

Further complicating matters is the nature of the companies granted access. Among them are ReMark International, a global insurance consultancy; Lydia.ai, a Canadian “insurtech” firm that wants to give people “personalised and predictive health scores”; and Club Vita, a longevity data analytics company. These companies have used Biobank data for projects ranging from disease prediction algorithms to assessing longevity risk factors. This raises the question of how one can ensure that such research is in fact in the public interest: do we simply take a commercial entity’s word for it? UK Biobank says all research conducted is “consistent with being health-related and in the public interest” and that an expert data access committee decides on any complex issues, but who checks the ethics of the ethics committee? The problems with this kind of self-regulation are self-evident.

The Fallout and the Future 

This situation has led to a broader conversation about the ethical use of volunteered health data and the responsibility of custodians like the UK Biobank to uphold public trust. As technology evolves and the appetite for data grows across industries, the mechanisms of consent and transparency may need to be revisited.  The Information Commissioner’s Office is now considering the case, spotlighting the crucial need for clarity and accuracy in how organisations manage and utilise sensitive personal information. 

As the UK Biobank navigates these turbulent waters, the focus shifts to how institutions like it can maintain the delicate balance between facilitating scientific progress and safeguarding the privacy rights of individuals who contribute their personal data for the greater good. For the UK Biobank, regaining the trust of its participants and the public is now an urgent task, one that will require not just a careful review of policies but a reaffirmation of its commitment to ethical stewardship of the data entrusted to it.

Take a look at our highly popular Data Ethics Course. Places fill up fast, so if you would like to learn more about this fascinating area, book your place now.

All Go for UK to US Data Transfers 

On 10th July 2023, the European Commission adopted its adequacy decision under Article 45 of GDPR for the EU-U.S. Data Privacy Framework (DPF).
It concluded that the United States ensures an adequate level of protection, comparable to that of the European Union, for personal data transferred from the EU to US companies under the new framework. It means that personal data can flow safely from the EU to US companies participating in the Framework, without having to put in place additional data protection safeguards under the GDPR. 

The question then is, “What about transfers from the UK to the US which were not covered by the above?” The Data Protection (Adequacy) (United States of America) Regulations 2023 (SI 2023/1028) will come into force on 12th October 2023. The effect of the Regulations will be that, as of 12th October 2023, a transfer of personal data from the UK to an entity in the USA which has self-certified to the Trans-Atlantic EU-US Data Privacy Framework and its UK extension and which will abide by the EU-US Data Privacy Framework Principles, will be deemed to offer an adequate level of protection for personal data and shall be lawful in accordance with Article 45(1) UK GDPR.  

Until then, data transfers from the UK to the US under the UK GDPR must either be based on a safeguard, such as standard contractual clauses or binding corporate rules, or fall within the scope of a derogation under Article 49 UK GDPR.

UK Data Controllers need to update privacy policies and document their own processing activities as necessary to reflect any changes in how they transfer personal data to the US. 

The new US-EU Data Privacy Framework will be discussed in detail on our forthcoming International Transfers workshop.

New GDPR Adequacy Decision for the EU-US Data Privacy Framework 

On 10th July 2023, the European Commission adopted its adequacy decision under Article 45 of GDPR for the EU-U.S. Data Privacy Framework (DPF). This ends years of uncertainty and legal risk for European organisations wishing to transfer personal data to the US. In May, Meta Ireland (the owner of Facebook) was the subject of the largest ever GDPR fine of €1.2bn (£1bn), when Ireland’s Data Protection Commission ruled that its US data transfers were not GDPR compliant. The new adequacy decision concludes that the United States ensures an adequate level of protection, comparable to that of the European Union, for personal data transferred from the EU to US companies under the new framework. Personal data can now flow safely from the EU to US companies participating in the Framework, without having to put in place additional data protection safeguards under the GDPR.

The Journey to Adequacy 

In July 2020, the European Court of Justice (ECJ) ruled in “Schrems II” that organisations transferring personal data to the USA can no longer rely on the Privacy Shield Framework as a legal transfer tool, as it failed to protect the rights of EU data subjects when their data was accessed by U.S. public authorities. In particular, the ECJ found that US surveillance programs are not limited to what is strictly necessary and proportionate, as required by EU law, and hence do not meet the requirements of Article 52 of the EU Charter on Fundamental Rights. Secondly, with regard to U.S. surveillance, EU data subjects lack actionable judicial redress and therefore do not have a right to an effective remedy in the USA, as required by Article 47 of the EU Charter.

The ECJ stated that organisations transferring personal data to the USA can still use the Article 49 GDPR derogations or standard contractual clauses (SCCs). If using the latter, whether for transfers to the USA or other countries, the ECJ placed the onus on the data exporter to make a complex assessment of the recipient country’s data protection legislation (a Transfer Impact Assessment or TIA), and to put in place “additional measures” to those included in the SCCs.

Since the Schrems II ruling, replacing the Privacy Shield has been a priority for EU and US officials. In March 2022, it was announced that a new Trans-Atlantic Data Privacy Framework had been agreed in principle. The US President signed an executive order in October 2022, giving effect to the US commitments in the framework and paving the way for the European Commission to publish a draft adequacy decision on 14th December 2022.


The Changes

The EU-U.S. Data Privacy Framework (DPF) introduces new binding safeguards to address the concerns raised by the European Court of Justice in Schrems II. These include limiting access to EU data by US intelligence services to what is necessary and proportionate, and establishing a Data Protection Review Court (DPRC) to which EU individuals will have access. The new framework introduces significant improvements compared to the mechanism that existed under the Privacy Shield. For example, if the DPRC finds that data was collected in violation of the new safeguards, it will be able to order the deletion of the data. The new safeguards in the area of government access to data will complement the obligations that US companies importing data from the EU will have to subscribe to. EU individuals will also benefit from several redress avenues should their data be wrongly handled by US companies, including free independent dispute resolution mechanisms and an arbitration panel.


The Mechanics 

Just like the old Privacy Shield, US companies can self-certify their participation in the DPF by committing to comply with a detailed set of privacy obligations. These include privacy principles such as purpose limitation, data minimisation and data retention, as well as specific obligations concerning data security and the sharing of data with third parties. The DPF will be administered by the US Department of Commerce, which will process applications for certification and monitor whether participating companies continue to meet the certification requirements. Compliance will be enforced by the US Federal Trade Commission. Many US companies remain self-certified to Privacy Shield standards, so transitioning to the DPF should not be a difficult task for them. As far as EU organisations go, all they need to do now, before making a transfer of personal data to the US, is check that the organisation receiving their personal data is certified under the DPF. More information, including details of the self-certification process, is expected to be posted on the U.S. Department of Commerce’s new Data Privacy Framework website.
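
For those who want to operationalise that last step, here is a minimal sketch in Python of what a repeatable due-diligence check might look like, assuming the exporter keeps a local CSV export of the participant list. The file name, column names, status values and the recipient name are illustrative assumptions, not an official format or API:

```python
import csv

def is_dpf_certified(org_name: str, register_path: str = "dpf_participants.csv") -> bool:
    """Check a recipient against a locally kept export of the DPF participant
    list. The file name, column names and status values are assumptions."""
    with open(register_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["organisation"].strip().casefold() == org_name.strip().casefold():
                # For UK transfers under SI 2023/1028, the recipient must also
                # have signed up to the UK extension of the DPF.
                return row["status"] == "Active" and row["uk_extension"] == "Yes"
    return False  # not on the list: fall back to SCCs, BCRs or Article 49

recipient = "Example Corp Inc."  # hypothetical recipient
if is_dpf_certified(recipient):
    print(f"{recipient}: transfer may rely on the DPF / UK extension")
else:
    print(f"{recipient}: use SCCs, BCRs or an Article 49 derogation instead")
```

The authoritative source remains the Department of Commerce’s participant search; a script like this simply makes the check repeatable and auditable across a long supplier list.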

Impact on Other Data Transfer Tools  

The safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) apply to all data transfers under the GDPR to companies in the US, regardless of the transfer mechanism used. These safeguards therefore also facilitate the use of other transfer tools, such as standard contractual clauses and binding corporate rules. This means that, when conducting a transfer impact assessment, a data controller can refer to the DPF adequacy decision as a conclusive finding by the European Commission that the two key protections introduced in the USA by the related Executive Order, suitable restrictions on government surveillance and suitable redress for EEA data subjects, apply to transfers made under SCCs. This makes any transfer impact assessment for the USA very straightforward.

It is important to note that this adequacy decision only covers transfers of personal data from the EU to the US. The UK Government is also working on an adequacy finding for the US, and this decision should expedite the process.

The new US-EU Data Privacy Framework will be discussed in detail on our forthcoming International Transfers workshop.

The New EU Data Governance Act

On 17th May 2022, the Council of the European Union adopted the Data Governance Act (DGA), or the Regulation on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act) (2020/0340 (COD)) to give its full title. The Act aims to boost data sharing in the EU, allowing companies to have access to more data to develop new products and services.

The DGA will achieve its aims through measures designed to increase trust in relation to data sharing, creating new rules on the neutrality of data marketplaces and facilitating the reuse of public sector data. The European Commission says in its Questions and Answers document:

“The economic and societal potential of data use is enormous: it can enable new products and services based on novel technologies, make production more efficient, and provide tools for combatting societal challenges”.

Application

The DGA will increase the amount of data available for re-use within the EU by allowing public sector data to be used for purposes different from those for which it was originally collected. The Act will also create sector-specific data spaces to enable the sharing of data within a specific sector, e.g. transport, health, energy or agriculture.

Data is defined as “any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audiovisual recording” that is held by public sector bodies and which is not subject to the Open Data Directive but is subject to the rights of others. Examples include data generated by GPS and healthcare data which, if put to productive use, could contribute to improving the quality of services. The Commission estimates that the Act could increase the economic value of data by up to €11 billion by 2028.

Each EU Member State will be required to establish a supervisory authority to act as a single information point providing assistance to governments. They will also be required to establish a register of available public sector data. The European Data Innovation Board (see later) will have oversight responsibilities and maintain a central register of available DGA Data. 

On first reading, the DGA seems similar to the Re-use of Public Sector Information Regulations 2015, which implemented Directive 2013/37/EU. The aim of the latter was to remove obstacles that stood in the way of re-using public sector information. However, the DGA goes much further.

Data Intermediary Services 

The European Commission believes that, in order to encourage individuals to allow their data to be shared, they should trust the process by which such data is handled. To this end, the DGA creates data sharing service providers known as “data intermediaries”, which will handle the sharing of data by individuals, public bodies and private companies. The idea is to provide an alternative to the existing major tech platforms.

To uphold trust in data intermediaries, the DGA puts in place several protective measures. Firstly, intermediaries will have to notify public authorities of their intention to provide data-sharing services. Secondly, they will have to commit to the protection of sensitive and confidential data. Finally, the DGA imposes strict requirements to ensure the intermediaries’ neutrality. These providers will have to distinguish their data sharing services from other commercial operations and are prohibited from using the shared data for any other purposes. 

Data Altruism

The DGA encourages data altruism. This is where data subjects (or holders of non-personal data) consent to their data being used for the benefit of society, e.g. for scientific research purposes or improving public services. Organisations who participate in these activities will be entered into a register held by the relevant Member State’s supervisory authority. In order to share data for these purposes, a data altruism consent form will be used to obtain data subjects’ consent.

The DGA will also create a European Data Innovation Board. Its mission will be to oversee the data sharing service providers (the data intermediaries) and provide advice on best practices for data sharing.

The UK

Brexit means that the DGA will not apply in the UK, although it clearly may affect UK businesses doing business in the EU. It remains to be seen whether the UK will take a similar approach, although it is notable that UK proposals for amending the GDPR include “amending the law to facilitate innovative re-use of data for different purposes and by different data controllers.”

The DGA will shortly be published in the Official Journal of the European Union and enter into force 20 days after publication. The new rules will apply 15 months thereafter. To further encourage data sharing, on 23 February 2022 the European Commission proposed a Data Act that is currently being worked on.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We also have a few places left on our Advanced Certificate in GDPR Practice course starting in September.

To Share or Not to Share: That is the Question!


On 5th October 2021 the Data Sharing Code of Practice from the Information Commissioner’s Office came into effect for UK-based Data Controllers.

The code is not law, nor does it ‘enforce’ data sharing, but it does provide some useful steps to consider when sharing personal data, either as a one-off or as part of an ongoing arrangement. Data Protection professionals, and the staff in the organisations they serve, will still need to navigate a way through various pressures, frameworks and expectations on the sharing of personal data; case by case, framework by framework. A more detailed post on the contents of the code can be read here.

Act Now Training is pleased to announce a new full day ‘hands on’ workshop for Data Protection professionals on Data Sharing. Our expert trainer, Scott Sammons, will look at the practical steps to take, sharing frameworks and protocols, and the risks to consider. Scott will also explore how, as part of your wider IG framework, you can establish a proactive support framework, making it easier for staff to understand their data sharing obligations and expectations, and driving down the temptation to use a ‘Data Protection duck out’ to explain why something was shared or not shared inappropriately.

Delegates will also be encouraged to bring a data sharing scenario to discuss with fellow delegates and the tutor. This workshop can also be customised and delivered to your organisation at your premises or virtually. Get in touch to learn more.


GDPR News Roundup

So much has happened in the world of data protection recently. Where to start?

International Transfers

In April, the European Data Protection Board (EDPB) adopted its opinions (covering the GDPR and the Law Enforcement Directive (LED)) on UK adequacy. The EDPB looked at the draft EU adequacy decisions and acknowledged that there is alignment between EU and UK law, but also expressed some concerns. It has nevertheless issued a non-binding opinion recommending their acceptance. If adopted, the two adequacy decisions will run for an initial period of four years. More here.

Last month saw the ICO’s annual data protection conference go online due to the pandemic. Whilst not the same as a face-to-face conference, it was still a good event with lots of nuggets for data protection professionals, including the news that the ICO is working on bespoke UK standard contractual clauses (SCCs) for international data transfers. Deputy Commissioner Steve Wood said:

“I think we recognise that standard contractual clauses are one of the most heavily used transfer tools in the UK GDPR. We’ve always sought to help organisations use them effectively with our guidance. The ICO is working on bespoke UK standard clauses for international transfers, and we intend to go out for consultation on those in the summer. We’re also considering the value to the UK for us to recognise transfer tools from other countries, so standard data transfer agreements, so that would include the EU’s standard contractual clauses as well.”

Lloyd v Google 

The much-anticipated Supreme Court hearing in the case of Lloyd v Google LLC took place at the end of April. The case concerns the legality of Google’s collection and use of browser generated data from more than 4 million iPhone users during 2011-12 without their consent. Following the two-day hearing, the Supreme Court will now decide, amongst other things, whether, under the DPA 1998, damages are recoverable for ‘loss of control’ of data without the need to identify any specific financial loss, and whether a claimant can bring a representative action on behalf of a group on the basis that the group members have the ‘same interest’ in the claim and are identifiable. The decision is likely to have wide-ranging implications for representative actions, what damages can be awarded for, and the level of damages in data protection cases. Watch this space!

Ticketmaster Appeal

In November 2020, the ICO fined Ticketmaster £1.25m for a breach of Articles 5(1)(f) and 32 GDPR (security). Ticketmaster appealed the penalty notice on the basis that there had been no breach of the GDPR; alternatively, that it was inappropriate to impose a penalty, and that in any event the sum was excessive. The appeal has now been stayed by the First-tier Tribunal until 28 days after the pending judgment in a damages claim brought against Ticketmaster by 795 customers: Collins & Others v Ticketmaster UK Ltd (BL-2019-LIV-000007).

Age Appropriate Design Code

This code came into force on 2 September 2020, with a 12 month transition period. The Code sets out 15 standards organisations must meet to ensure that children’s data is protected online. It applies to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have the best possible access to online services whilst minimising data collection and use.
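
To make “high privacy by default” concrete, here is a minimal sketch in Python of how a service might model such settings. The field names and the locking behaviour are illustrative assumptions, not taken from the code itself:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # High privacy by default: nothing is collected, shared or made
    # visible unless the user actively turns it on.
    geolocation: bool = False
    profiling: bool = False
    third_party_sharing: bool = False
    profile_visibility: str = "private"
    locked: bool = False  # if True, the settings above cannot be relaxed

def defaults_for_new_user(is_child: bool) -> PrivacySettings:
    settings = PrivacySettings()
    # For users known or likely to be children, lock the high-privacy
    # defaults so they cannot be quietly relaxed later.
    settings.locked = is_child
    return settings

print(defaults_for_new_user(is_child=True))
```

The design point is that the permissive options exist but must be switched on deliberately, and for child users cannot be switched on at all.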

With less than four months to go (2 September 2021) the ICO is urging organisations and businesses to make the necessary changes to their online services and products. We are planning a webinar on the code. Get in touch if interested.

AI and Automated Decision Making

Article 22 of GDPR provides protection for individuals against purely automated decisions with a legal or similarly significant impact. In February, the Court of Amsterdam ordered Uber, the ride-hailing app, to reinstate six drivers who it was claimed were unfairly dismissed “by algorithmic means”. The court also ordered Uber to pay compensation to the sacked drivers.

In April, the EU Commission published a proposal for a harmonised framework on AI. The framework seeks to impose obligations on both providers and users of AI. Like the GDPR, the proposal includes fine levels and extra-territorial effect. (Readers may be interested in our new webinar on AI and Machine Learning.)

Publicly Available Information

Just because information is publicly available does not mean companies have a free pass to use it without consequences; data protection laws still have to be complied with. In November 2020, the ICO ordered the credit reference agency Experian Limited to make fundamental changes to how it handles personal data within its direct marketing services. The ICO found that significant ‘invisible’ processing took place, likely affecting millions of adults in the UK. It is ‘invisible’ because the individual is not aware that the organisation is collecting and using their personal data. Experian has lodged an appeal against the Enforcement Notice.

Interestingly, the Spanish regulator recently fined another credit reference agency, Equifax, €1m for several failures under the GDPR. Individuals had complained about Equifax’s use of their personal data, which was publicly available. Equifax had also failed to provide the individuals with a privacy notice.

Data Protection by Design

The Irish data protection regulator recently issued its largest domestic fine. The Irish Credit Bureau (ICB) was fined €90,000 after a change to the ICB’s computer code in 2018 resulted in 15,000 accounts having incorrect details recorded about their loans before the mistake was noticed. Amongst other things, the decision found that the ICB infringed Article 25(1) of the GDPR by failing to implement appropriate technical and organisational measures designed to implement the principle of accuracy in an effective manner, and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects (aka DP by design and by default).
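
What might such a technical measure look like in practice? Below is a minimal sketch of a validation gate that refuses to store obviously inaccurate records, the kind of safeguard Article 25(1) contemplates. The field names and rules are illustrative assumptions, not the ICB’s actual schema:

```python
from datetime import date

def validate_loan_record(record: dict) -> list[str]:
    """Return a list of accuracy problems; an empty list means the record
    passes. Field names and rules here are illustrative only."""
    problems = []
    if record.get("balance", 0) < 0:
        problems.append("negative balance")
    if record.get("opened") and record["opened"] > date.today():
        problems.append("account opened in the future")
    if record.get("status") not in {"open", "settled", "defaulted"}:
        problems.append(f"unknown status: {record.get('status')!r}")
    return problems

def write_record(record: dict, store: list) -> None:
    problems = validate_loan_record(record)
    if problems:
        # Refuse to store inaccurate data rather than noticing it later:
        # a technical measure supporting the accuracy principle.
        raise ValueError("record rejected: " + ", ".join(problems))
    store.append(record)

store: list = []
write_record({"balance": 1200, "opened": date(2018, 3, 1), "status": "open"}, store)
print(store)
```

The point of the design is that accuracy checks run automatically on every write, rather than relying on someone noticing a bad deployment after the fact.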

Data Sharing 

The ICO’s Data Sharing Code of Practice provides organisations with a practical guide on how to share personal data in line with data protection law. Building on the code, the ICO recently outlined its plans to update its guidance on anonymisation and pseudonymisation, and to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing.
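
As a simple illustration of what pseudonymisation involves, here is a minimal sketch of one common technique, keyed hashing, in Python. The key and identifier are placeholders, and note that pseudonymised data remains personal data under the UK GDPR, because anyone holding the key can rebuild the mapping and re-identify individuals:

```python
import hashlib
import hmac

# The key is held separately from the shared dataset. Keeping the key apart
# from the data is what makes this pseudonymisation rather than anonymisation:
# whoever holds the key can rebuild the mapping and re-identify individuals.
SECRET_KEY = b"replace-with-a-key-from-a-secrets-manager"  # placeholder

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (an NHS number, say, or an email address)
    with a stable keyed hash, so records can still be linked across datasets
    without exposing the identifier itself."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

print(pseudonymise("patient-123"))  # the same input always maps to the same token
```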

UK GDPR Handbook

The UK GDPR Handbook is proving very popular among data protection professionals.

It sets out the full text of the UK GDPR laid out in a clear and easy to read format. It cross references the EU GDPR recitals, which also now form part of the UK GDPR, allowing for a more logical reading. The handbook uses a unique colour coding system that allows users to easily identify amendments, insertions and deletions from the EU GDPR. Relevant provisions of the amended DPA 2018 have been included where they supplement the UK GDPR. To assist users in interpreting the legislation, guidance from the Information Commissioner’s Office, Article 29 Working Party and the European Data Protection Board is also signposted. Read what others have said:

“A very useful, timely, and professional handbook. Highly recommended.”

“What I’m liking so far is that this is “just” the text (beautifully collated together and cross-referenced Articles / Recital etc.), rather than a pundits interpretation of it (useful as those interpretations are on many occasions in other books).”

“Great resource, love the tabs. Logical and easy to follow.”

Order your copy here.

These and other GDPR developments will also be discussed in detail in our online GDPR update workshop next week.

Viva Las Vegas


Act Now is pleased to announce that Ibrahim Hasan has accepted an invitation to address the 21st Annual NAPCP Commercial Card and Payment Conference in Las Vegas, April 6-9, 2020.


The NAPCP is a membership-based professional association committed to advancing Commercial Card and Payment professionals and industry practices globally, with timely research and resources, peer networking and events serving a community of almost 20,000 individuals worldwide. The NAPCP is a respected voice in the industry and an impartial resource for members at all experience levels in the public and private sectors.

In a session entitled “Complying with the GDPR and United States Privacy Legislation”, Ibrahim will examine the impact of the GDPR and the California Consumer Privacy Act (CCPA) on the Payment Card industry. He will also be presenting pre- and post-conference webinars on these subjects to the NAPCP community.

The NAPCP Annual Conference is the can’t-miss event for the industry, bringing together 600 professionals from around the world to share perspectives on all Commercial Card and Payment vehicles, including Purchasing Card, Travel Card, Fleet Card, Ghost Card, Declining Balance Card, ePayables and other electronic payment options. Experts and practitioners share case studies, successes and thought-provoking ideas in almost 80 breakout sessions, all with an eye for trends and innovation across sectors.

Diane McGuire, CPCP, MBA, Managing Director of the NAPCP, said:

“I am really pleased that Ibrahim has accepted our invitation to join us in Las Vegas. As legislators and governments globally are starting to wake up to the implications of the digital revolution on individuals’ rights, our conference delegates will benefit from his GDPR and privacy expertise in what is sure to be a thought-provoking session.”

This is one of a number of international projects that Act Now has worked on in recent years. In June 2018 we delivered a GDPR workshop in Dubai for Middle East businesses and their advisers. In 2015 Ibrahim went to Brunei to conduct data protection audit training for government staff.

Ibrahim Hasan said:

“I am really pleased to address the NAPCP conference in Las Vegas. Our GDPR expertise is now being recognised abroad. The United States is the latest addition to our increasing international portfolio. We hope to use the conference as a platform to showcase our expertise to US Data Controllers.”

Regular registration is now open for the event. Head over to this link to confirm registration.


Act Now’s forthcoming live and interactive CCPA webinar will cover the main obligations and rights in CCPA and practical steps to compliance. This webinar is ideal for data protection officers and advisers in UK and US businesses.

A New (GDPR) Data Sharing Code


The law on data sharing is a minefield clouded with myths and misunderstandings.
The Information Commissioner’s Office (ICO) recently launched a consultation on an updated draft code of practice on this subject. Before drafting the new code, the ICO launched a call for views in August 2018, seeking input from various organisations such as trade associations and those representing the interests of individuals. (Read a summary of the responses here). The revised code will eventually replace the version made under the Data Protection Act 1998, first published in 2011.

The new code does not impose any additional barriers to data sharing, but aims to help organisations comply with their legal obligations under the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA 2018).

Launching the consultation, which will close on 9th September 2019, the ICO said the code will:

“… address many aspects of the new legislation including transparency, lawful bases for processing, the new accountability principle and the requirement to record processing activities”.

Once finalised, the code will be a statutory code of practice under section 121 of the DPA 2018. Under section 127, the ICO must take account of it when considering whether a Data Controller has complied with its data protection obligations in relation to data sharing. The code can also be used in evidence in court proceedings and the courts must take its provisions into account wherever relevant.

Following the code, along with other ICO guidance, will help Data Controllers to manage risks; meet high standards; clarify any misconceptions about data sharing; and give confidence to share data appropriately and correctly. In addition to the statutory guidance, the code contains some optional good practice recommendations, which aim to help Data Controllers adopt an effective approach to data protection compliance.
It also covers some special cases, such as databases and lists, sharing information about children, data sharing in an emergency, and the ethics of data sharing. Reference is also made to the provisions of the Digital Economy Act 2017, which seeks to promote data sharing across the public sector.

There is also a section on sharing data for the purposes of law enforcement processing under Part 3 of the DPA 2018. This is an important area which organisations have not really understood, as demonstrated by the recent High Court ruling that Sussex Police unlawfully shared personal data about a vulnerable teenager, putting her “at greater risk.”

Steve Wood, the Deputy Information Commissioner for Policy, said:

“Data sharing brings many benefits to organisations and individuals, but it needs to be done in compliance with data protection law.”

“Our draft data sharing code gives practical advice and guidance on how to share data safely and fairly, and we are encouraging organisations to send us their comments before we launch the final code in the Autumn.”

You can respond to the consultation via the ICO’s online survey, or email datasharingcode@ico.org.uk until Monday 9 September 2019.

More on these and other developments in our GDPR update workshop presented by Ibrahim Hasan. Looking for a GDPR qualification? Our practitioner certificate is the best option.