Facial Recognition to Monitor Attendance: ICO Takes Action

Employers have always had a keen interest in monitoring and tracking employees.
In 2017, we addressed this topic in a blog post focusing on the GDPR implications of employee monitoring using GPS trackers and similar devices. Recent advances in surveillance technology, particularly facial recognition, have not only streamlined employee monitoring but also rendered it more cost-effective and, concurrently, more intrusive. A good example is this video of a coffee shop using facial recognition technology (FRT) and AI to monitor employee productivity. 

In 2022, the TUC warned that employee surveillance technology and AI risk “spiralling out of control” without stronger regulation to protect employees. It warned that, left unchecked, these technologies could lead to widespread discrimination, work intensification and unfair treatment. Earlier this year the French data protection regulator, the CNIL, fined Amazon €32m (£27m) under the GDPR for “excessive” surveillance of its workers. The CNIL said Amazon France Logistique, which manages warehouses, recorded data captured by workers’ handheld scanners. It found Amazon tracked activity so precisely that workers potentially had to justify every break. 

Employee surveillance is now primarily regulated in the UK by the UK GDPR.
As with all activities involving the processing of personal data, the surveillance must be fair, lawful and transparent. The Human Rights Act and the Regulation of Investigatory Powers Act may also apply (see our earlier blog post for more detail on these laws). 

On 23rd February 2024, the Information Commissioner’s Office (ICO) issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using FRT and fingerprint scanning to monitor employee attendance. The ICO’s investigation found that Serco Leisure and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities for the purpose of attendance checks and subsequent payment for their time. 

Serco Leisure will not be appealing against the notices; a wise decision! As readers will know, it needed a lawful basis for processing employees’ data under Article 6 of the UK GDPR, as well as a condition under Article 9, as it was processing Special Category Data (biometric data). Consent was not an option due to the imbalance of power between employer and employee. In the words of the Commissioner: 

“Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy. There is no clear way for staff to opt out of the system, increasing the power imbalance in the workplace and putting people in a position where they feel like they have to hand over their biometric data to work there.” 

Serco tried to rely on Article 6(1)(b) and Article 6(1)(f) as lawful bases for processing the employees’ personal data. In relation to Article 6(1)(b) (contractual necessity), it argued that the processing of attendance data was necessary to ensure employees are paid correctly for the time they have worked. The ICO ruled that, although recording attendance times may be necessary for Serco to fulfil its obligations under employment contracts, it does not follow that the processing of biometric data is necessary to achieve this purpose, especially when less intrusive means could be used to verify attendance. These included radio-frequency identification cards or fobs, or manual sign-in and sign-out sheets. Serco had failed to demonstrate why these less intrusive methods were not appropriate. It asserted that these methods are open to abuse, but did not provide evidence of widespread abuse, nor explain why other measures, such as disciplinary action against employees found to be abusing the system, had not been considered appropriate. 

Regarding Serco’s reliance on Article 6(1)(f) (legitimate interests), the ICO said that it will not apply if a controller can reasonably achieve the same result in another less intrusive way. As discussed above, Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system is a necessity, rather than simply a further benefit to Serco. The ICO also said: 

“In applying the balancing test required to rely on legitimate interests, Serco has failed to give appropriate weight to the intrusive nature of biometric processing or the risks to data subjects.” 

In relation to Article 9, the ICO said that Serco had again failed to demonstrate that the processing of biometric data was “necessary” for the purpose of employment attendance checks, or to comply with the relevant laws identified by Serco in its submissions. 

The Enforcement Notices not only instruct Serco Leisure and the trusts to stop all processing of biometric data for monitoring employees’ attendance at work, but also require them to destroy all biometric data that they are not legally obliged to retain. This must be done within three months of the notices being issued. 

This enforcement action coincided with the ICO publishing new guidance for all organisations that are considering using people’s biometric data. The guidance outlines how organisations can comply with data protection law when using biometric data to identify people. Last year, the ICO also published guidance on monitoring employees and called on organisations to consider both their legal obligations and their employees’ rights to privacy before they implement any monitoring. 

This is the first time the ICO has taken enforcement action against an employer to stop it processing the biometric data of staff. It will serve as a warning to organisations that deploy biometric tech simply because it is cheap and easy to use, without considering the legal implications. 

Our CCTV Workshop will also examine the use of facial recognition technology. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Free Webinar: Understanding FOI Requests from Journalists with Martin Rosenbaum  

Journalists’ FOI requests can be challenging for public authorities. Sometimes they are viewed with suspicion. A public authority may even feel that a journalist is on a “fishing expedition”.  

This free webinar is a unique opportunity for FOI practitioners to understand FOI from a journalist’s perspective and improve their FOI practice. Martin will share his experience of breaking some of the top news stories using FOI, key developments in UK FOI law, his top tips for FOI practitioners and his hopes for FOI in the future. 

Martin Rosenbaum is a former BBC Programmes Editor, Producer and FOI Specialist. He is the author of “Freedom of Information: A practical guidebook” and has been involved in some of the major stories broken by the BBC using FOI. Martin is the writer-in-residence and honorary research fellow with the Centre for British Political Life, Birkbeck College, University of London. 

This is sure to be one of our most popular FOI webinars. Join us on the 29th February at 12pm for this insightful session. To reserve your free place, simply email events@actnow.org.uk.

AI Regulation and the EU AI Act  

2024 is going to be the year of AI regulation. As the impact of AI on our daily lives increases, governments and regulatory bodies globally are grappling with the need to establish clear guidelines and standards for its responsible use. 

ChatGPT 

Ask people about AI and many will talk about AI-powered chatbots like ChatGPT and Gemini, Google’s replacement for Bard. The former currently has around 180.5 million users, who generated 1.6 billion visits in December 2023. However, with great popularity comes increased scrutiny, as well as privacy and regulatory challenges. 

In March 2023, Italy became the first Western country to block ChatGPT when its data protection regulator (the Garante per la Protezione dei Dati Personali) cited privacy concerns. The Garante’s communication to OpenAI, owner of ChatGPT, highlighted the lack of a suitable legal basis for the collection and processing of personal data for the purpose of training the algorithms underlying ChatGPT, the potential to produce inaccurate information about individuals, and child safety concerns. In total, the Garante said that it suspected ChatGPT of breaching Articles 5, 6, 8, 13 and 25 of the EU GDPR. 

ChatGPT was made accessible in Italy four weeks after the above decision, but the Garante launched a “fact-finding activity” at the time. This culminated in a statement on 31st January 2024, in which it said it “concluded that the available evidence pointed to the existence of breaches of the provisions contained in the EU GDPR [General Data Protection Regulation]”. The cited breaches are essentially the same as the provisional findings discussed above, focusing on the mass collection of users’ data for training purposes and the risk that younger users may be exposed to inappropriate content. OpenAI has 30 days to respond with a defence. 

EU AI Act 

Of course, there is more to AI than ChatGPT, and some would say far more beneficial use cases. Examples include the ability to match drugs to patients, numerous major cancer research breakthroughs, and robots performing major surgery. But there are downsides too, including bias, lack of transparency, and failure to take account of the ethical implications. 

On 2nd February 2024, EU member states unanimously reached an agreement on the text of the harmonised rules on artificial intelligence, the so-called “Artificial Intelligence Act” (AI Act). The final draft of the Act will be adopted by the European Parliament in a plenary vote in April and will come into force in 2025 with a two-year transition period. 

The main provisions of the Act can be read here. They do not differ much from the previous draft, which is discussed on our previous blog here. In summary, the AI Act sets out comprehensive rules for AI applications, including a risk-based system to address potential threats to health and safety, and human rights. The Act will ban some AI applications which pose an “unacceptable risk” (e.g. real-time and remote biometric identification systems, like facial recognition) and impose strict obligations on others considered “high risk” (e.g. AI in EU-regulated product safety categories such as cars and medical devices). These obligations include adherence to data governance standards, transparency rules, and the incorporation of human oversight mechanisms. 

Despite Brexit, UK businesses and entities engaged in AI-related activities will still be affected by the Act if they intend to operate within the EU market. The Act will have extra-territorial reach, just like the EU GDPR. 

UK Response 

The UK Government’s own decisions on how to regulate AI will be influenced by the EU’s approach. An AI White Paper was published in March last year entitled “A pro-innovation approach to AI regulation”. The paper sets out the UK’s preference not to place AI regulation on a statutory footing but to make use of “regulators’ domain-specific expertise to tailor the implementation of the principles to the specific context in which AI is used.” 

The government’s long-awaited follow-up to the AI White Paper was published last week. 

Key takeaways are: 

  • The government’s proposals for regulating AI still revolve around empowering existing regulators to create tailored, context-specific rules that suit the ways the technology is being used in the sectors they scrutinise, i.e. no legislation yet (regulators have been given until 30th April 2024 to publish their AI plans). 

  • The government generally reaffirmed its commitment to the white paper’s proposals, claiming this approach to regulation will ensure the UK remains more agile than “competitor nations” while also putting it on course to be a leader in safe, responsible AI innovation. 

  • It will, though, consider creating “targeted binding requirements” for select companies developing highly capable AI systems. 

  • It also committed to conducting regular reviews of potential regulatory gaps on an ongoing basis: “We remain committed to the iterative approach set out in the whitepaper, anticipating that our framework will need to evolve as new risks or regulatory gaps emerge.” 

According to Michelle Donelan, Secretary of State for Science, Innovation and Technology, the UK’s approach to AI regulation has already made the country a world leader in both AI safety and AI development. 

“AI is moving fast, but we have shown that humans can move just as fast,” she said. “By taking an agile, sector-specific approach, we have begun to grip the risks immediately, which in turn is paving the way for the UK to become one of the first countries in the world to reap the benefits of AI safely.” 

Practical Steps 

Last year, the ICO conducted an inquiry after concerns were raised about the use of algorithms in decision-making in the welfare system by local authorities and the DWP. In this instance, the ICO did not find any evidence to suggest that benefit claimants were subjected to any harms or financial detriment as a result of the use of algorithms. It did, though, emphasise a number of practical steps that local authorities and central government can take when using AI: 

  • Take a data protection by design and default approach 
  • Be transparent with people about how you are using their data by regularly reviewing privacy policies
  • Identify the potential risks to people’s privacy by conducting a Data Protection Impact Assessment

In January 2024, the ICO launched a consultation series on Generative AI, examining how aspects of data protection law should apply to the development and use of the technology. It is expected to issue more AI guidance later in 2024. 

Join our Artificial Intelligence and Machine Learning: How to Implement Good Information Governance workshop for hands-on insights, key resource awareness, and best practices, ensuring you’re ready to navigate AI complexities fairly and lawfully.

Police Misuse of Body Worn Camera Footage 

Last week the BBC reported that police officers made offensive comments about an assault victim while watching body camera footage of her exposed body.  

The woman had been arrested by Thames Valley Police and placed in leg restraints before being recorded on body-worn cameras. While being transported to Newbury police station, she suffered a seizure which resulted in her chest and groin being exposed. A day later she was released without charge. 

A female officer later reviewed the body camera footage, which the force told Metro.co.uk was for ‘evidential purposes’ and ‘standard practice’. The BBC reports that three male colleagues joined her and made offensive comments about the victim. The comments were brought to the attention of senior police officers by a student officer, who reported his colleagues for covering up the incident. The student officer was later dismissed, though the police said this was unrelated to the report. 

The policing regulator says Thames Valley Police should have reported the case for independent scrutiny. The force has now done so, following the BBC investigation. 

This is not the first time the BBC has highlighted such an issue. In September 2023, it revealed the findings of a two-year investigation. It obtained reports of misuse from Freedom of Information requests, police sources, misconduct hearings and regulator reports. It found more than 150 reports of camera misuse where there was a case to answer for misconduct, a recommendation for learning, or an upheld complaint. (You can watch Bodycam cops uncovered on BBC iPlayer.) 

The most serious allegations include: 

  • Cases in seven forces where officers shared camera footage with colleagues or friends – either in person, via WhatsApp or on social media 

  • Images of a naked person being shared between officers on email and cameras used to covertly record conversations 

  • Footage being lost, deleted or not marked as evidence, including video, filmed by Bedfordshire Police, of a vulnerable woman alleging she had been raped by an inspector – the force later blamed an “administrative error” 

  • Switching off cameras during incidents, for which some officers faced no sanctions – one force said an officer may have been “confused”

Body worn cameras are widely used these days, not just by police but also by council officers, train guards, security staff, and parking attendants (to name a few). 

There is no all-encompassing law regulating body worn cameras. Of course, they are used to collect and process personal data and will therefore be subject to the UK GDPR. Where used covertly, they may also be subject to the Regulation of Investigatory Powers Act 2000. 

The Information Commissioner’s Office (ICO) provides comprehensive guidelines on the use of CCTV, which are largely considered to extend to body worn cameras (BWCs) for security officers. There is a useful checklist on its website which recommends: 

  • Providing privacy information to individuals when using BWCs, such as clear signage, verbal announcements or lights/indicators on the device itself, and having readily available privacy policies. 
  • Training staff using BWCs to inform individuals that recording may take place, if it is not obvious to individuals in the circumstances. 
  • Having appropriate retention and disposal policies in place for any footage that is collected. 
  • Having efficient governance procedures in place to be able to retrieve stored footage and process it for subject access requests or onward disclosures where required. 
  • Using technology which can efficiently and effectively blur or mask footage, if redaction is required to protect the rights and freedoms of any third parties (see the illustrative sketch below). 
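
To make the last point concrete, here is a minimal sketch of automated face redaction, assuming OpenCV and its bundled Haar cascade face detector. The file names and detector settings are hypothetical, and a real deployment would need a more robust detector plus manual review of the output before disclosure.

```python
# Minimal sketch: blur detected faces in BWC footage before disclosure.
# Assumes OpenCV (pip install opencv-python); illustrative only.
import cv2

def redact_faces(in_path: str, out_path: str) -> None:
    # Bundled Haar cascade face detector (fast, but misses many faces;
    # real redaction workflows need stronger detection and human checks)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
            # Replace each detected face region with a heavy Gaussian blur
            frame[y:y+h, x:x+w] = cv2.GaussianBlur(
                frame[y:y+h, x:x+w], (51, 51), 0)
        out.write(frame)
    cap.release()
    out.release()

redact_faces("bwc_clip.mp4", "bwc_clip_redacted.mp4")  # hypothetical file names
```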

Our one-day CCTV workshop will teach you how to plan and implement a CCTV/BWC project, including key skills such as completing a DPIA and assessing camera evidence. Our expert trainer will answer all your questions, including when you can use CCTV/BWC, when it can be covert and how to deal with a request for images. 
 
This workshop is suitable for anyone involved in the operation of CCTV, BWCs and drones including DPOs, investigators, CCTV operators, enforcement officers, estate managers and security personnel. 

The Data Protection and Digital Information (No.2) Bill: Where are we now? 

The Data Protection and Digital Information (No.2) Bill is currently at Committee stage in the House of Lords. It will make changes to the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). It is expected to be passed in May and will probably come into force after a short transitional period. 

The current Bill is not substantially different to the previous version, whose passage through Parliament was paused in September 2022 so ministers could engage in “a co-design process with business leaders and data experts” and move away from the “one-size-fits-all” approach of the European Union’s GDPR. 

The Same 

Many of the proposals in the new Bill are the same as contained in the previous Bill. These include: 

  • Amended Definition of Personal Data: This proposed change would limit the assessment of identifiability of data to the controller or processor, and persons who are likely to receive the information, rather than anyone in the world.

  • Vexatious Data Subject Requests: The terms “manifestly unfounded” or “excessive” requests, in Article 12 of the UK GDPR, will be replaced with “vexatious” or “excessive” requests. Explanation and examples of such requests will also be included. 

  • Data Subject Complaints: Data Controllers will be required to acknowledge receipt of Data Subject complaints within 30 days and respond substantively “without undue delay”. The ICO will be entitled not to accept a complaint if a Data Subject has not made a complaint to the controller first. 

  • Data Protection Officer: The obligation for some controllers and processors to appoint a Data Protection Officer (DPO) will be removed. However, public bodies and those who carry out processing likely to result in a “high risk” to individuals will be required to designate a senior manager as a “Senior Responsible Individual”.  

  • Data Protection Impact Assessments: These will be replaced by leaner and less prescriptive “Assessments of High-Risk Processing.”  

  • International Transfers: There will be a new approach to the test for adequacy applied by the UK Government to countries (and international organisations) and when Data Controllers are carrying out a Transfer Impact Assessment or TIA. The threshold for this new “data protection test” will be whether a jurisdiction offers protection that is “not materially lower” than under the UK GDPR. (For more detail see also our forthcoming International Transfers webinar). 

  • The Information Commission: The Information Commissioner’s Office will transform into the Information Commission; a corporate body with a chief executive. 

  • PECR: Cookies will be allowed to be used without consent for the purposes of web analytics and to install automatic software updates. Furthermore, non-commercial organisations (e.g. charities and political parties) will be able to rely on the “soft opt-in” for direct marketing purposes, if they have obtained contact details from an individual expressing interest. Finally, there will be an increase in fines from the current maximum of £500,000 to UK GDPR levels, i.e. up to £17.5m or 4% of global annual turnover (whichever is higher). A worked example of this “whichever is higher” formula is sketched after this list. 
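
Since the new maximum is a two-limb formula, a short worked example may help. This is a minimal sketch of the arithmetic only; the turnover figures passed in are hypothetical.

```python
# Minimal sketch of the proposed PECR maximum penalty, which would mirror the
# UK GDPR formula: £17.5m or 4% of global annual turnover, whichever is higher.
def max_pecr_fine(global_annual_turnover_gbp: float) -> float:
    return max(17_500_000, 0.04 * global_annual_turnover_gbp)

print(max_pecr_fine(100_000_000))    # £17.5m floor applies -> 17500000.0
print(max_pecr_fine(1_000_000_000))  # 4% of £1bn -> 40000000.0
```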

The Changes 

The main changes are summarised below: 

  • Scientific Research: The definition of scientific research is amended so that it now includes research for the purposes of commercial activity. This expands the circumstances in which processing for research purposes may be undertaken, providing a broader consent mechanism and an exemption from the fair processing requirement. 
  • Legitimate Interests: The Previous Bill proposed that businesses could rely on legitimate interests (Article 6 lawful basis) without the requirement to conduct a balancing test against the rights and freedoms of data subjects where those legitimate interests are “recognised”. These “recognised” legitimate interests cover purposes for processing such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement. The new Bill, whilst keeping the above changes, introduces a non-exhaustive list of cases where organisations may rely on the “legitimate interests” legal basis, including for the purposes of direct marketing, transferring data within the organisation for administrative purposes and for the purposes of ensuring the security of network and information systems; although a balancing exercise still needs to be conducted in these cases. 
  • Automated Decision Making: The Previous Bill clarified that its proposed restrictions on automated decision-making under Article 22 UK GDPR should only apply to decisions that are a result of automated processing without “meaningful human involvement”. The new Bill states that profiling will be a relevant factor in the assessment as to whether there has been meaningful human involvement in a decision.  
  • Records of Processing Activities (ROPA): The Previous Bill streamlined the required content of ROPAs. The new Bill exempts all controllers and processors from the duty to maintain a ROPA unless they are carrying out high risk processing activities.  
  • Subject Access: Clause 12 of the Bill, introduced at the House of Commons Report Stage, amends Article 12 of the UK GDPR (and the DPA 2018) so that Data Controllers are only obliged to undertake a reasonable and proportionate search for information requested under the right of access. 

Adequacy 

Although the Government states that the new Bill is “a new system of data protection”, it still retains the UK GDPR’s structure and fundamental obligations. Organisations that are already compliant with the UK GDPR will not be required to make any major changes to their systems and processes.  

The EU conducts a review of adequacy with the UK every four years; the next adequacy decision is due on 27th June 2025. Some commentators have suggested that the changes may jeopardise the UK’s adequate status and so impact the free flow of data between the UK and EU. Defend Digital Me, a civil liberties organisation, has claimed that the Bill would, among other things, weaken data subjects’ rights, water down accountability requirements, and reduce the independence of the ICO.  

Other Parts of the Bill 

The Bill would also: 

  • establish a framework for the provision of digital verification services to enable digital identities to be used with the same confidence as paper documents. 

  • increase fines for nuisance calls and texts under PECR. 

  • update the PECR to cut down on ‘user consent’ pop-ups and banners. 

  • allow for the sharing of customer data, through smart data schemes, to provide services such as personalised market comparisons and account management. 

  • reform the way births and deaths are registered in England and Wales, enabling the move from a paper-based system to registration in an electronic register. 

  • facilitate the flow and use of personal data for law enforcement and national security purposes. 

  • create a clearer legal basis for political parties and elected representatives to process personal data for the purposes of democratic engagement. 

Reading the Parliamentary debates on the Bill, it seems that the Labour party has no great desire to table substantial amendments to the Bill. Consequently, it is expected that the Bill will be passed in a form similar to the one now published. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. Dive into the issues discussed in this blog and secure your spot now. 

HelloFresh fined by the ICO

The Information Commissioner’s Office (ICO) has fined food delivery company HelloFresh £140,000 for a campaign of 79 million spam emails and 1 million spam texts over a seven-month period.

HelloFresh, under its official name Grocery Delivery E-Services UK Limited, was found to have contravened regulation 22 of the Privacy and Electronic Communications Regulations 2003. 

Key points from this case include: 

  1. Inadequate Consent Mechanism: The opt-in statement used by HelloFresh did not specifically mention the use of text messages for marketing. While there was a mention of email marketing, it was ambiguously tied to an age confirmation statement, which could mislead customers into consenting. 
  2. Lack of Transparency: Customers were not properly informed that their data would continue to be used for marketing purposes for up to 24 months after they cancelled their subscriptions with HelloFresh. 
  3. Continued Contact Post Opt-Out: The ICO’s investigation revealed that HelloFresh continued to contact some individuals even after they had explicitly requested for the communications to stop. 
  4. Volume of Complaints: The investigation was triggered by numerous complaints, both to the ICO and through the 7726 spam message reporting service. 
  5. Substantial Fine: As a result of these findings, HelloFresh was fined £140,000. 

Andy Curry, Head of Investigations at the ICO, emphasised the severity of the breach, noting that HelloFresh failed to provide clear opt-in and opt-out information, leading to a bombardment of unwanted marketing communications. The ICO’s decision to impose a fine reflects its commitment to enforce the law and protect customer data rights. 

This case serves as a reminder of the importance of complying with data protection and electronic communications regulations, especially in terms of obtaining clear and informed consent for marketing communications.

Dive deeper into the realm of data protection with our UK GDPR Practitioner Certificate, offering crucial insights into compliance essentials highlighted in this blog. Limited spaces are available for our January cohort – book now to enhance your understanding and navigate data regulations with confidence. 

Season’s Greetings

As we end another year, the Act Now team would like to wish everyone season’s greetings and best wishes for the new year. Thank you to all our delegates and colleagues for their continued support and dedication.

Our office will be closed for the holiday season from Thursday, 21st December, and we will return on Thursday, 4th January 2024.

Data Protection Bill Faces Scrutiny: Commissioner Calls for Tighter Safeguards 

In a recent development, the Information Commissioner has weighed in on the debate surrounding the Data Protection and Digital Information Bill (DPDI Bill), legislation aimed at modernising data protection in the UK. While acknowledging the government’s efforts to strengthen the independence of the Information Commissioner’s Office (ICO) and update data protection practices, the Commissioner’s response highlights significant concerns, particularly around the use of personal data in social security contexts. We wrote a detailed breakdown on our blog here.

The response, detailed and thorough, applauds the government’s amendments to the bill, recognising their potential to enhance the ICO’s autonomy and bring data protection practices up to date with the digital age. However, the Commissioner expresses reservations about the adequacy of safeguards in the current draft of the bill, especially in terms of personal data handling for social security purposes. 

The Commissioner’s concern primarily revolves around the need for more precise language in the bill. This is to ensure that its provisions are fully aligned with established data protection principles, thereby safeguarding individual rights. The response suggests that the current wording might be too broad or vague, potentially leading to misuse or overreach in the handling of personal data. 

Importantly, the Commissioner has provided detailed technical feedback for further improvements to the bill. This feedback indicates a need for scrutiny and adjustments to ensure that the bill not only meets its intended purpose but also robustly protects the rights of individuals. 

While the Commissioner supports the bill’s overarching aim to enhance the UK’s data protection regime, the emphasis is clearly on the necessity of refining the bill. This is to ensure it strikes the right balance between enabling data use for public and economic benefits and protecting individual privacy rights. 

The response from the Information Commissioner is a significant moment in the ongoing development of the DPDI Bill. It underscores the complexity and importance of legislating in the digital age, where data plays a crucial role in both the economy and personal privacy. 

As the bill progresses, the government and legislators should consider the Commissioner’s input. The balance they strike in the final version of the bill will be a key indicator of the UK’s approach to data protection in a rapidly evolving digital landscape. 

Learn more about the updated bill with our Data Protection and Digital Information Bill: Preparing for GDPR and PECR Reforms workshop. Dive into the issues discussed in this blog and secure your spot now.

The Hidden Reach of the Prevent Strategy: Beyond Counter-Terrorism Units

The UK government’s anti-radicalisation programme, Prevent, is reportedly sharing the personal details of thousands of individuals more extensively than previously known. This sharing includes not just counter-terrorism units, but also airports, ports, immigration services, and officials at the Home Office and the Foreign, Commonwealth and Development Office (FCDO). Critics argue that such widespread data sharing could be illegal, as it involves moving sensitive personal data between databases without the consent of the individuals. 

A Metropolitan police document titled “Prevent case management guidance” indicates that Prevent details are also shared with the ports authority watchlist. This raises concerns that individuals may face increased scrutiny at airports or be subjected to counter-terrorism powers without reasonable suspicion. The document also mentions that foreign nationals may have their backgrounds checked by the FCDO and immigration services for any overseas convictions or intelligence. 

Furthermore, the Acro Criminal Records Office, which manages UK criminal records, is notified about individuals referred to Prevent, despite the programme dealing with individuals who haven’t necessarily engaged in criminal behaviour. Counter-terror police emphasise their careful approach to data sharing, which aims to protect vulnerable individuals. 

Prevent’s goal is to divert people from terrorism before they offend, and most people are unaware of their referral to the programme. 95% of referrals result in no further action. A secret database, the National Police Prevent Case Management database, was previously disclosed in 2019, revealing the storage of details of those referred to Prevent. 

Newly disclosed information, obtained through a freedom of information request by the Open Rights Group (ORG), reveals that Prevent data is shared across various police databases, including the Police National Computer, specialised counter-terrorism and local intelligence systems, and the National Crime Agency. 

The sharing of this data was accidentally revealed due to a redaction error in a heavily edited Met document. Despite its sensitive nature, the ORG decided to make the document public. Sophia Akram of the ORG expressed concerns over the extent of the data sharing and potential harms, suggesting that it could be unfair and possibly unlawful. 

The guidance also indicates that data is retained and used even in cases where no further action is taken. There are concerns about the impact on young people’s educational opportunities, as Prevent requires public bodies like schools and the police to identify individuals at risk of extremism. 

Recent figures show thousands of referrals to Prevent, predominantly from educational institutions. From April 2022 to March 2023, a total of 6,817 individuals were directed to the Prevent programme. Within this group, educational institutions were responsible for 2,684 referrals. Breaking down the referrals by age, there were 2,203 adolescents between the ages of 15 and 20, and 2,119 referrals involved children aged 14 or younger.

There are worries about the long-term consequences for children and young people referred to the programme. Several cases have highlighted the intrusive nature of this data sharing and its potential impact on individuals’ lives, including cases in which students have missed out on a place at a sixth form college and others involving children as young as four years old. 

Prevent Watch, an organisation monitoring the programme, has raised alarms about the data sharing, particularly its effect on young children. The FOI disclosures challenge the notion that Prevent is non-criminalising, as data on individuals, even those marked as ‘no further action’, can be stored on criminal databases and flagged on watchlists. 

Counter-terrorism policing spokespeople defend the programme, emphasising its multi-agency nature and focus on protecting people from harm. They assert that data sharing is carefully managed and legally compliant, aiming to safeguard vulnerable individuals from joining terror groups or entering conflict zones. 

Learn more about data sharing with our UK GDPR Practitioner Certificate. Dive into the issues discussed in this blog and secure your spot now.

Act Now Partners with Middlesex University Dubai for UAE’s first Executive Certificate in DP Law

Act Now Training, in collaboration with Middlesex University Dubai, is excited to announce the launch of the UAE’s first Data Protection Executive training programme. This qualification is ideal as a foundation for businesses and organisations aiming to comply with the UAE Federal Data Protection Law.

This practical course focuses on developing a data protection framework and ensuring compliance with the UAE Data Protection Law’s strict requirements. This is particularly relevant given the recent advancements in data protection law in the Middle East, including the UAE’s first comprehensive national data protection law, Federal Decree Law No. 45/2021. 

This law regulates personal data processing, emphasising transparency, accountability, and data subject rights. It applies to all organisations processing personal data within the UAE and abroad for UAE residents.

The importance of understanding this law is paramount for every business and organisation, as it necessitates a thorough reassessment of personal data handling practices. Non-compliance can lead to severe penalties and reputational damage.

The Executive Certificate in UAE DP Law is a practical qualification delivered over five weeks, in two half-day sessions per week, and offers numerous benefits:

  1. Expertise in Cutting-Edge Legislation: Gain in-depth knowledge of the UAE’s data protection law, essential for professionals at the forefront of data protection practices.

  2. Professional Development: This knowledge enhances your resume, especially for roles in compliance, legal, and IT sectors, showing a commitment to legal reforms.

  3. Practical Application: The course’s structured format allows gradual learning and practical application of complex legal concepts, ensuring a deep understanding of the law.

  4. Risk Mitigation: Understanding the law helps organisations avoid penalties and reputational harm due to non-compliance.

  5. Networking Opportunities: The course provides valuable connections in the field of data protection and law.

  6. Empowerment of Data Subjects: Delegates gain insights into their rights as data subjects, empowering them to protect their personal data effectively.

Delegates will receive extensive support, including expert instruction, comprehensive materials, interactive sessions, practical exercises, group collaboration, ongoing assessment, and additional resources for further learning. Personal tutor support is also provided throughout the course.

This program is highly recommended for officers in organisations both inside and outside the UAE that conduct business in the region or have customers, agents, and employees there. 

Act Now designed the curriculum and will deliver the programme. Act Now Training is the UK’s premier provider of information governance training and consultancy, serving government organisations, multinational corporations, financial institutions, and corporate law firms. 

With a history of delivering practical, high-quality training since 2002, Act Now’s skills-based training approach has led to numerous awards, most recently the Supplier of the Year Award 2022-23 from the Information and Records Management Society in the UK. 

Our associates have decades of hands-on global information governance experience and are thus able to break down this complex area with real-world examples, making it easy to understand and apply, and even fun!

Middlesex University Dubai is a 5-star rated KHDA university and one of Middlesex University’s three global campuses, alongside London and Mauritius. It is the largest UK university in the UAE, with over 5,000 students enrolled from over 120 nationalities.

For more information and to register your interest, visit Middlesex University Dubai’s website. Alternatively, you can click here.
