Filming People in Public for Social Media: Is it time for a new law?

In the content creator world, filming people without their consent has become everyday behaviour. From TikTok nightlife clips to YouTube street pranks, millions of people capture others in public places and post the footage online. Whether it is for likes, shares or monetisation, this behaviour is not without consequences for creators as well as their subjects. Over the weekend the BBC ran a story about two women whose interactions with ‘friendly strangers’ were uploaded to social media, causing them much alarm and distress.

Dilara was secretly filmed by a man wearing smart glasses in the London store where she works. The footage was then posted to TikTok, where it received 1.3 million views. Dilara then faced a wave of unwanted messages and calls. It later turned out that the man who filmed her had posted dozens of similar videos, giving men tips on how to approach women. Another woman, Kim, was filmed last summer on a beach in West Sussex by a different man wearing smart sunglasses. Kim, who was unaware she was being filmed, chatted with him about her employer and family. Later, the man posted two videos online, under the guise of dating advice, which received 6.9 million views on TikTok and more than 100,000 likes on Instagram.

The Law 

UK law does not expressly prohibit filming or photographing people in public places, unlike other jurisdictions such as the UAE, Greece and South Korea.
However, a number of legal issues can arise once the footage is uploaded, particularly where it is intrusive, monetised or causes harm.

Although being in public generally reduces people’s privacy expectations, the UK courts have recognised that privacy rights can still arise in public places. Filming may become unlawful where it captures people in sensitive or intimate situations, such as medical emergencies, emotional distress or vulnerability.
The manner of filming, the focus on the individual, and the purpose of publication are all relevant factors in deciding whether the subject’s privacy has been violated.

Back in 2003, in a landmark decision, the European Court of Human Rights ruled that a British man’s right to respect for his private life (Article 8 of the European Convention on Human Rights) was violated when CCTV footage of him attempting suicide was released to the media. The case was brought by Geoffrey Peck, who, on the evening of 20th August 1995 and while suffering from depression, walked down Brentwood High Street in Essex with a kitchen knife and attempted suicide by cutting his wrists. He was unaware that he had been filmed by a CCTV camera installed by Brentwood Borough Council. The court awarded Mr Peck damages of £7,800. In recent years, media coverage has highlighted situations where women were filmed on nights out and the footage uploaded online. While the filming occurred in public, the intrusive nature of the footage and the harm caused can give rise to privacy claims.

Victims of secret filming have a direct cause of action in the tort of misuse of private information, developed by the courts in Campbell v MGN Ltd [2004] UKHL 22. This case was about the supermodel Naomi Campbell, who successfully sued the Daily Mirror for publishing photos of her attending a Narcotics Anonymous meeting on The King’s Road in London. The court said that in such cases the test is whether the individual had a reasonable expectation of privacy in the circumstances and, if so, whether that expectation is outweighed by the publisher’s right to freedom of expression under Article 10 of the ECHR.

Data Protection 

When a person is identifiable in a video, that footage constitutes personal data within the meaning of the UK General Data Protection Regulation (UK GDPR). Publishing such footage online involves ‘processing’ personal data and brings the UK GDPR’s obligations into play. The ‘controller’ has a wide range of obligations, including having a lawful basis for processing, complying with the principles of fairness and transparency, and respecting data subjects’ (the victims’) rights, which include the rights to object and to erasure.

Content creators and influencers sometimes assume they come under the ‘domestic purposes exemption’ in Article 2(2)(c) UK GDPR. However, this exemption is narrow and does not usually apply where content is shared publicly, monetised, or used to build an online following.  

Failure to comply with the UK GDPR could (at least in theory) lead to enforcement action by the Information Commissioner, which could include a hefty fine. Article 82 of the UK GDPR gives a data subject a right to compensation for material or non-material damage for any breach of the UK GDPR. Section 168 of the Data Protection Act 2018 confirms that ‘non-material damage’ includes distress.

Harassment  

Even where filming in public is lawful in isolation, repeated or targeted filming can amount to harassment or stalking. Section 1 of the Protection from Harassment Act 1997 prohibits a course of conduct that amounts to harassment and which the defendant knows or ought to know causes alarm or distress. Filming someone repeatedly, following them, or persistently targeting them for online content may satisfy this test. In 2024 a man was arrested by Greater Manchester Police on suspicion of stalking and harassment after filming women on nights out and uploading the videos online. The arrest was based not on public filming alone, but on the cumulative effect of the conduct and the harm caused. 

Individuals who discover that a video of them has been published online without consent can make a direct request to the creator to remove the footage, particularly where it causes distress or raises privacy concerns. If this is unsuccessful, most social media platforms offer reporting mechanisms for privacy violations, harassment, or non-consensual content. Videos are often removed by the platforms following complaints. Other civil remedies may also be available including defamation where footage creates a false and damaging impression.  

A New Law?

Despite the growing prevalence of filming strangers in public for social media content, there remains no single, specific piece of legislation in the UK to govern this area. Instead, there is a patchwork of laws including privacy law, the UK GDPR and harassment legislation, to name a few. While these laws can sometimes provide protection, they were not designed with the modern social media ecosystem in mind and often struggle to respond effectively to the scale, speed and commercial incentives of online content creation.

Furthermore, civil actions are expensive and it is difficult to get Legal Aid for such claims. Victims are left to navigate complex legal doctrines, such as ‘reasonable expectation of privacy’ or ‘lawful basis for processing’, for themselves. While police involvement may be appropriate in extreme cases, many videos fall short of criminal thresholds yet still cause significant distress and reputational damage.

Is it time for a new, specific statutory framework addressing non-consensual filming (and publication) in public spaces? Such a law could provide clearer boundaries, simpler remedies and more accessible enforcement mechanisms, while balancing legitimate freedoms of expression and journalism. Let us know your thoughts in the comments section.

The data protection landscape continues to evolve. With the passing of the Data (Use and Access) Act 2025, data protection practitioners need to ensure their materials reflect the latest changes to the UK GDPR, the Data Protection Act 2018 and PECR. If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half-day workshop, which is running online and in Birmingham on 5th February 2026.

Facial Recognition in Schools: ICO Reprimand

For a number of years schools have used biometrics, particularly fingerprint scanning, to streamline various processes such as class registration, library book borrowing and cashless catering. Big Brother Watch (BBW) raised privacy concerns about this way back in 2014. Recently some schools have started to implement facial recognition technology (FRT).

FRT is even more problematic. In May, BBW launched a fundraiser to support two members of the public to bring legal challenges after FRT wrongly flagged them as criminals. And in January 2023, the ICO issued a letter to North Ayrshire Council (NAC) following their use of FRT in school canteens. The Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

Last week the ICO issued a reprimand to Chelmer Valley High School, in Chelmsford, after it started using FRT to take cashless canteen payments from students. The ICO said that the school failed to complete a Data Protection Impact Assessment (DPIA), in compliance with Article 35(1) of the UK GDPR, prior to introducing the system.

As readers will know, when processing any form of biometric data, a Data Controller requires a lawful basis under Article 6 of the UK GDPR as well as a condition under Article 9, due to the processing of Special Category Data. In most cases, the only lawful basis for FRT usage is express consent (see the GDPR Enforcement Notices issued to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts requiring them to stop using FRT and fingerprint scanning to monitor employee attendance).

In March 2023, Chelmer Valley High School sent a letter to parents with a slip for them to return if they did not want their child to participate in the FRT. Positive opt-in consent (express consent) was not sought, meaning that until November 2023 the school was wrongly relying on assumed (opt-out) consent. The ICO noted that most students were old enough to provide their own consent and that parental opt-out therefore deprived students of the ability to exercise their rights and freedoms.

The ICO also noted that the school had failed to consult its Data Protection Officer or the parents and students before implementing the technology. The reprimand included a set of recommendations:

  1. Prior to new processing operations, or upon changes to the nature, scope, context or purposes of processing for activities that pose a high risk to the rights and freedoms of data subjects, complete a DPIA and integrate the outcomes back into project plans (see our DPIA workshop).
  2. Amend the DPIA to give thorough consideration to the necessity and proportionality of cashless catering, and to mitigating specific, additional risks such as bias and discrimination.
  3. Review and follow all ICO guidance for schools considering whether to use facial recognition for cashless catering.
  4. Amend the privacy information given to students so that it provides for their information rights under the UK GDPR in an appropriate way (see our Children’s Data workshop).
  5. Engage more closely and in a timely fashion with their DPO when considering new projects or operations processing personal data, and document their advice and any changes to the processing that are made as a result.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in September.

Facial Recognition Technology and the Risk of Misidentification

In 2023 the Information Commissioner’s Office (ICO) launched an investigation into Facewatch, a company which provides live facial recognition technology (FRT) to the retail sector. Facewatch’s system scans people’s faces in real time as they enter a store and sends an alert if a “subject of interest” has entered. It is used in numerous stores in the UK, including Budgens, Sports Direct and Costcutter, to identify shoplifters.

The ICO concluded its investigation by giving Facewatch the go-ahead, even though in its letter (closing the investigation) it highlighted a number of breaches. Stephen Bonner, Deputy Commissioner for Regulatory Supervision, wrote in a blog post:

“Based on the information provided by Facewatch about improvements already made and the ongoing improvements it is making, we are satisfied the company has a legitimate purpose for using people’s information for the detection and prevention of crime. We’ve therefore concluded that no further regulatory action is required.”

But FRT may have an accuracy issue. This weekend the BBC reported on a number of cases where FRT had misidentified people. “Sara” was wrongly accused of being a shoplifter after being flagged by the Facewatch system. She says that after her bag was searched she was led out of the shop and told she was banned from all stores using the technology.

The police are also increasingly using FRT at live events as well as on the streets. Again, not without problems. Civil liberty groups, such as Big Brother Watch, are worried that the accuracy of FRT is yet to be fully established. In February Shaun Thompson was approached at London Bridge by police using FRT and told he was a wanted man. He was held for 20 minutes and his fingerprints were taken. He says he was released only after handing over a copy of his passport. It was a case of mistaken identity. Big Brother Watch has launched a campaign, including taking legal action, to stop the use of FRT.

The ICO has also expressed concerns about the use of FRT in the employment context as well as in schools. On 23rd February 2024, it issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance. The ICO’s investigation found that Serco Leisure and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities for the purpose of attendance checks and subsequent payment for their time.

The ICO issued a letter, in January 2023, to North Ayrshire Council (NAC) following their use of FRT to manage ‘cashless catering’ in school canteens. The Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

In 2019 the ICO published an Opinion on law enforcement use of LFR (Live Facial Recognition). This was followed in 2021 by an Opinion on the use of LFR in public places, setting out key requirements for those considering using this technology.

Our forthcoming CCTV workshop is ideal for those who want to explore the GDPR and privacy issues around all types of CCTV cameras including those using FRT. 

Facial Recognition to Monitor Attendance: ICO Takes Action

Employers have always had a keen interest in monitoring and tracking employees.
In 2017, we addressed this topic in a blog post focusing on the GDPR implications of employee monitoring using GPS trackers and similar devices. Recent advances in surveillance technology, particularly facial recognition, have not only streamlined employee monitoring but have also rendered it more cost-effective and, concurrently, more intrusive. A good example is this video of a coffee shop using facial recognition technology (FRT) and AI to monitor employee productivity.

In 2022, the TUC warned that employee surveillance technology and AI risk “spiralling out of control” without stronger regulation to protect employees. It warned that, left unchecked, these technologies could lead to widespread discrimination, work intensification and unfair treatment. Earlier this year the French data protection regulator, the CNIL, fined Amazon €32m (£27m) under the GDPR for “excessive” surveillance of its workers. The CNIL said Amazon France Logistique, which manages warehouses, recorded data captured by workers’ handheld scanners. It found Amazon tracked activity so precisely that workers potentially had to justify every break.

Employee surveillance is now primarily regulated in the UK by the UK GDPR.
As with all activities involving the processing of personal data, the surveillance must be fair, lawful and transparent. The Human Rights Act and the Regulation of Investigatory Powers Act may also apply (see our earlier blog post for more detail on these laws).

On 23rd February 2024, the Information Commissioner’s Office (ICO) issued Enforcement Notices to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts under the UK GDPR. The notices required the organisations to stop using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance. The ICO’s investigation found that Serco Leisure and the trusts had been unlawfully processing the biometric data of more than 2,000 employees at 38 leisure facilities for the purpose of attendance checks and subsequent payment for their time.  

Serco Leisure will not be appealing against the notices; a wise decision! As readers will know, they needed a lawful basis for processing employees’ data under Article 6 of the UK GDPR, as well as a condition under Article 9 as they were processing Special Category Data (Biometric Data). Consent was not an option due to the imbalance of power between employer and employee. In the words of the Commissioner:

“Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy. There is no clear way for staff to opt out of the system, increasing the power imbalance in the workplace and putting people in a position where they feel like they have to hand over their biometric data to work there.” 

Serco tried to rely on Article 6(1)(b) and Article 6(1)(f) as lawful bases for processing the employees’ personal data. In relation to Article 6(1)(b) (contractual necessity), it argued that the processing of attendance data was necessary to ensure employees are paid correctly for the time they have worked. The ICO ruled that although recording attendance times may be necessary for Serco to fulfil its obligations under employment contracts, it does not follow that the processing of biometric data is necessary to achieve this purpose, especially when less intrusive means could be used to verify attendance. These included radio-frequency identification cards or fobs, or manual sign-in and sign-out sheets. Serco had failed to demonstrate why these less intrusive methods were not appropriate. It did assert that these methods are open to abuse, but did not provide evidence of widespread abuse, nor explain why other methods, such as disciplinary action against employees found to be abusing the system, had not been considered appropriate.

Regarding Serco’s reliance on Article 6(1)(f) (legitimate interests), the ICO said that it will not apply if a controller can reasonably achieve the same result in another less intrusive way. As discussed above, Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system is a necessity, rather than simply a further benefit to Serco. The ICO also said: 

“In applying the balancing test required to rely on legitimate interests, Serco has failed to give appropriate weight to the intrusive nature of biometric processing or the risks to data subjects.”

In relation to Article 9, the ICO said that Serco had again failed to demonstrate that the processing of biometric data was “necessary” for the purpose of employment attendance checks or to comply with the relevant laws identified by Serco in its submissions.

The Enforcement Notices not only instruct Serco Leisure and the trusts to stop all processing of biometric data for monitoring employees’ attendance at work, but also require them to destroy all biometric data that they are not legally obliged to retain. This must be done within three months of the notices being issued. 

This enforcement action coincided with the ICO publishing new guidance for all organisations that are considering using people’s biometric data. The guidance outlines how organisations can comply with data protection law when using biometric data to identify people. Last year, the ICO also published guidance on monitoring employees and called on organisations to consider both their legal obligations and their employees’ rights to privacy before they implement any monitoring.

This is the first time the ICO has taken enforcement action against an employer to stop it processing the biometric data of staff. It will serve as a warning to organisations that adopt biometric technology simply because it is cheap and easy to use, without considering the legal implications.

Our CCTV Workshop will also examine the use of facial recognition technology. We have also just launched our new workshop, Understanding GDPR Accountability and Conducting Data Protection Audits. 

Police Misuse of Body Worn Camera Footage 

Last week the BBC reported that police officers made offensive comments about an assault victim while watching body camera footage of her exposed body.  

The woman had been arrested by Thames Valley Police and placed in leg restraints before being recorded on body-worn cameras. While being transported to Newbury police station, she suffered a seizure which resulted in her chest and groin being exposed. A day later she was released without charge. 

A female officer later reviewed the body camera footage, which the force told Metro.co.uk was for ‘evidential purposes’ and ‘standard practice’. The BBC reports that three male colleagues joined her and made offensive comments about the victim.
The comments were brought to the attention of senior police officers by a student officer, who reported his colleagues for covering up the incident. The student officer was later dismissed, though the police said this was unrelated to the report.

The policing regulator says Thames Valley Police should have reported the case for independent scrutiny. The force has now done so, following the BBC investigation. 

This is not the first time the BBC has highlighted such an issue. In September 2023 it revealed the findings of a two-year investigation. It obtained reports of misuse from Freedom of Information requests, police sources, misconduct hearings and regulator reports. It found more than 150 reports of camera misuse where there was a case to answer over misconduct, a recommendation for learning, or an upheld complaint. (You can watch Bodycam cops uncovered on BBC iPlayer)

The most serious allegations include: 

  • Cases in seven forces where officers shared camera footage with colleagues or friends – either in person, via WhatsApp or on social media

  • Images of a naked person being shared between officers on email and cameras used to covertly record conversations 

  • Footage being lost, deleted or not marked as evidence, including video, filmed by Bedfordshire Police, of a vulnerable woman alleging she had been raped by an inspector – the force later blamed an “administrative error” 

  • Switching off cameras during incidents, for which some officers faced no sanctions – one force said an officer may have been “confused”

Body worn cameras are used widely these days, not just by the police but also by council officers, train guards, security staff and parking attendants (to name a few).

There is no all-encompassing law regulating body worn cameras. Of course, they are used to collect and process personal data and will therefore be subject to the UK GDPR. Where used covertly, they may also be subject to the Regulation of Investigatory Powers Act 2000.

The Information Commissioner’s Office (ICO) provides comprehensive guidelines on the use of CCTV, which are largely considered to extend to body worn cameras (BWCs) for security officers. There is a useful checklist on its website which recommends:

  • Providing privacy information to individuals when using BWCs, such as clear signage, verbal announcements or lights/indicators on the device itself, and having readily available privacy policies. 
  • Training staff using BWCs to inform individuals that recording may take place if this is not obvious in the circumstances. 
  • Having appropriate retention and disposal policies in place for any footage that is collected. 
  • Having efficient governance procedures in place to be able to retrieve stored footage and process it for subject access requests or onward disclosures where required. 
  • Using technology which has the ability to efficiently and effectively blur or mask footage, if redaction is required to protect the rights and freedoms of any third parties (see the sketch after this list).
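
To illustrate that last point, here is a minimal sketch of automated face blurring of the kind a redaction tool might perform, using Python and OpenCV’s bundled Haar cascade face detector. This is not the ICO’s recommended tooling or any particular vendor’s product; the file names and detection parameters are illustrative assumptions only.

```python
# Illustrative sketch only: blur detected faces in a video file using OpenCV.
# File names and detection parameters below are hypothetical.
import cv2

def blur_faces(input_path: str, output_path: str) -> None:
    """Read a video, blur every detected face, and write the redacted result."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    capture = cv2.VideoCapture(input_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if FPS is missing
    width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(
        output_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
    )
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Detect faces frame by frame; these parameters would need tuning.
        faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            # Apply a heavy Gaussian blur over each face region.
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(
                frame[y:y + h, x:x + w], (51, 51), 0
            )
        writer.write(frame)
    capture.release()
    writer.release()

blur_faces("bwc_clip.mp4", "bwc_clip_redacted.mp4")
```

In practice, face detectors miss faces in profile or in motion-blurred frames, so automated blurring of this kind is only a first pass; footage would still need human review before any disclosure.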

Our one-day CCTV workshop will teach you how to plan and implement a CCTV/BWC project including key skills such as completing a DPIA and assessing camera evidence.
Our expert trainer will answer all your questions including when you can use CCTV/BWC, when it can be covert and how to deal with a request for images.  
 
This workshop is suitable for anyone involved in the operation of CCTV, BWCs and drones including DPOs, investigators, CCTV operators, enforcement officers, estate managers and security personnel. 

Facial Recognition CCTV Cameras in Every Store?

The Observer recently reported that Home Office officials have developed covert plans to lobby the Information Commissioner’s Office (ICO) in an effort to hasten the adoption of contentious facial recognition technology in high street stores and supermarkets. Critics argue that such technology raises concerns about bias and data privacy.

Despite these objections, the Home Office appears to be pushing for the adoption of facial recognition in stores. The minutes of the recent meeting, obtained under the Freedom of Information Act, appear to show Home Office officials agreeing to write to the ICO praising the merits of facial recognition technology in combating “retail crime”. This ignores critics who claim the technology violates human rights and is biased, particularly against darker-skinned people.

Police minister Chris Philp, senior Home Office officials, and the commercial company Facewatch came to an agreement on the covert strategy on 8th March 2023 during a meeting held behind closed doors. Facewatch provides facial recognition cameras to help retailers combat shoplifting. It has courted controversy and was investigated by the ICO earlier this year following a complaint by Big Brother Watch.

Despite finding multiple UK GDPR violations on 28th March, the ICO told Facewatch it would take no further action. The ICO said it “welcomed” remedial steps that Facewatch had taken, or would take, to address these violations. Those remedial steps have been redacted from public information about the case.

Facial recognition technology has faced extensive criticism and scrutiny, leading the European Union to consider a ban on its use in public spaces through the upcoming Artificial Intelligence Act. However, the UK’s Data Protection and Digital Information (No.2) Bill proposes to eliminate the government-appointed Surveillance Camera Commissioner role and the requirement for a surveillance camera code of practice.

Our forthcoming CCTV workshop is ideal for those who want to explore the GDPR and privacy issues around all types of CCTV cameras including drones and body worn cameras. Our Advanced Certificate in GDPR Practice is a practical, scenario-based course designed to help delegates gain the confidence to tackle complex GDPR issues in a methodical way.

Leading Surveillance Law Expert Joins the Act Now Team

Act Now Training welcomes solicitor and surveillance law expert, Naomi Mathews, to its team of associates. Naomi is a Senior Solicitor and a co-ordinating officer for RIPA at a large local authority in the Midlands. She is also the authority’s Data Protection Officer and Senior Responsible Officer for CCTV.

Naomi has extensive experience in all areas of information compliance and has helped prepare for RIPA inspections, both for the Office of Surveillance Commissioners and the Investigatory Powers Commissioner’s Office (IPCO). She has worked as a defence solicitor in private practice and as a prosecutor for the local authority in a range of regulatory matters including Trading Standards, Health and Safety and Environmental prosecutions. Naomi has higher rights of audience to present cases in the Crown Court.

Naomi has many years of practical knowledge of RIPA and how to prepare for a successful prosecution/inspection. Her training has been commended by RIPA inspectors and she has also trained nationally. Naomi’s advice has helped Authorising Officers, Senior Responsible Officers and applicants understand the law and practicalities of covert surveillance. 

Like our other associates, Susan Wolf and Kate Grimley Evans, Naomi is a fee-paid member of the Upper Tribunal, assigned to the Administrative Appeals Chamber (Information Rights Jurisdiction), and of the First-tier Tribunal General Regulatory Chamber (Information Rights Jurisdiction).

Ibrahim Hasan, director of Act Now Training, said:

“I am pleased that Naomi has joined our team. We are impressed with her experience of RIPA and her practical approach to training which focuses on real life scenarios as opposed to just the law and guidance.”

Naomi will be delivering our full range of RIPA workshops as well as developing new ones. She is also presenting a series of one-hour webinars on RIPA and Social Media. If you would like Naomi to deliver customised in-house training for your organisation, please get in touch for a quote.

Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics, including automated fingerprint identification systems, for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this way back in 2014. Now a company called CRB Cunninghams has introduced facial recognition technology to allow schools to offer children the ability to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said:

“It’s normalising biometric identity check for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools for the first time. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case schools must provide a reasonable alternative means of accessing the service i.e. paying for school meals in the present case. 

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data and there is a legal prohibition on anyone processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Express consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing.

In 2019 the Swedish Data Protection Authority fined an education authority (SEK 200,000, approximately 20,000 Euros) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that there were breaches of Article 5 (processing students’ personal data in a manner that is more intrusive as regards personal integrity, and encompasses more personal data, than is necessary for the specified purpose of monitoring attendance), Article 9, and Articles 35 and 36 (failing to fulfil the requirements for an impact assessment and failing to carry out prior consultation with the Swedish DPA).

The French regulator (CNIL) has also raised concerns about a facial recognition trial, commissioned by the Provence-Alpes-Côte d’Azur Regional Council, which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” The CNIL also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology, it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Coronavirus and Police Use of Drones

The police have an important role to play in the current coronavirus lockdown. However, their actions must at all times be proportionate, transparent and (above all) lawful. Only yesterday, British Transport Police admitted they had wrongly charged a woman who was fined £660 under coronavirus legislation. Marie Dinou was arrested at Newcastle Central Station on Saturday after she refused to tell police why she needed to travel. A police and Crown Prosecution Service review said she was charged under the wrong part of the Coronavirus Act. The court will be asked to set the conviction aside.

This is not the only recent incident of the police overstepping the mark. By now most of us will have seen the story about a couple walking their dog in the Peak District. The video was filmed by a drone operated by the Derbyshire Police Drone Unit, and broadcast to the nation on BBC news. According to Derbyshire Police’s Twitter feed (which broadcast the same 90 second footage) the police force wanted to reinforce the government message of ‘stay at home’ and to point out this was not getting through, by effectively ‘shaming’ the couple who were captured on camera.

The video has sparked huge controversy in various circles, including the civil liberties campaign group Big Brother Watch and a leading member of the judiciary. According to the BBC, Big Brother Watch has described the move as ‘sinister and counter-productive’. Ex-Supreme Court judge Lord Sumption has also been very critical.
On BBC Radio 4’s World at One, Lord Sumption made it clear that the police have no legal power to enforce Government Ministers’ ‘wishes’ and guidance about non-essential travel. Although the government has enacted the Coronavirus Act 2020, this does not give the police any powers to stop individuals from non-essential travel or walking in isolated places. Lord Sumption’s criticism is most tellingly summed up in the following quotation:

“This is what a police state is like, it is a state in which the government can issue orders or express preferences with no legal authority and the police will enforce ministers’ wishes.”

At Act Now we are not able to comment on whether the police have the powers to do this, but we respectfully accept Lord Sumption’s view that they did not. Our concern is whether the filming and broadcasting of these individuals was GDPR compliant.
Our conclusion is that it was not.

The use of drones poses a privacy risk. The police force took the decision to process this personal data for its own purposes (“to get the message across”). It is therefore a Data Controller and must comply with the General Data Protection Regulation (GDPR) in relation to this processing. Images of individuals constitute personal data where it is possible to identify them from those images (GDPR Article 4(1)). It is entirely possible that the individuals captured in that Derbyshire police video could be identified by their clothing, hair colour and the presence of their dog.

Drones can be used to film people in many locations, often without the knowledge of those being filmed. In these circumstances, the processing of personal data must be lawful (GDPR Article 5(1)). It is questionable which Article 6 basis the police could rely on here. Arguably the processing is necessary for a ‘task carried out in the public interest’. However, one would have to ask why it was necessary to film and broadcast these individuals. The police could not rely on ‘legitimate interests’ because this basis does not apply to processing carried out by public authorities in the performance of their tasks (GDPR Article 6(1)(f)).

Even if the police could identify a lawful basis, the next question is whether this processing is fair. The ICO guidance states that Data Controllers should only process data in ways that people would reasonably expect and not use it in ways that have unjustified adverse effects on them. I would argue that it is highly unlikely that anybody walking their dog in an isolated part of the Peak District would have any reasonable expectation that they would be secretly filmed by a drone and that their images would be broadcast to the nation in an attempt to shame them. So it seems highly unlikely that this processing is fair.

GDPR also requires transparency when processing personal data. This means data subjects should be made aware that their personal data is being processed and why.
The ‘normal’ transparency requirements (GDPR Articles 12-14) are less onerous for the police when they are processing personal data for law enforcement purposes under Part 3 of the Data Protection Act 2018. However, the police admitted themselves that the filming was for the purposes of ‘getting a message out’ and this does not fit easily within the definition of law enforcement purposes under s.31 DPA 2018. At best the police could try and argue that the processing was for the purposes of preventing threats to public security, but it is really difficult to see how this would succeed when it was just a couple walking their dog on an isolated stretch of path.

The police did not comply with the Information Commissioner’s tips on responsible drone use, in particular the advice about thinking carefully about sharing images on social media. The ICO cautions that drone users should avoid sharing images that could have unfair or harmful consequences. There is also little evidence that the Police had due regard to at least the first three guiding principles laid down in the Surveillance Camera Code of Practice or whether they conducted a Data Protection Impact Assessment.

On balance, the Derbyshire Police’s decision to film individuals taking a walk in an isolated area, in order to get a message across about not travelling unnecessarily, was at best misguided and at worst unlawful. The coronavirus is changing almost all aspects of our daily lives, and social distancing and self-isolating are the new norms. However, when the police take action it is still vital that they comply with their legal obligations in relation to the processing of personal data.

More on this and other developments in our FREE GDPR update webinar. Looking for a GDPR qualification from the comfort of your home office? Our GDPR Practitioner Certificate is now available as an online option.

Act Now launches GDPR Policy Pack

The first fine was issued recently under the General Data Protection Regulation (GDPR) by the Austrian data protection regulator. Whilst relatively modest at 4,800 Euros, it shows that regulators are ready and willing to exercise their GDPR enforcement powers.

Article 24 of GDPR emphasises the need for Data Controllers to demonstrate compliance through measures to “be reviewed and updated where necessary”. This includes the implementation of “appropriate data protection policies by the controller.” This can be daunting especially for those beginning their GDPR compliance journey.

Act Now has applied its information governance knowledge and experience to create a GDPR policy pack containing essential documentation templates to help you meet the requirements of GDPR as well as the Data Protection Act 2018. The pack includes, amongst other things, template privacy notices as well as procedures for data security and data breach reporting. Security is a very hot topic after the recent £500,000 fine levied on Equifax by the Information Commissioner under the Data Protection Act 1998.

We have also included template letters to deal with Data Subjects’ rights requests, including subject access. The detailed contents are set out below:

  • User guide
  • Policies
    • Data Protection Policy
    • Special Category Data Processing (DPA 2018)
    • CCTV
    • Information Security
  • Procedures
    • Data breach reporting
    • Data Protection Impact Assessment template
    • Data Subject rights request templates
  • Privacy Notices
    • Business clients and contacts
    • Customers
    • Employees and volunteers
    • Public authority services users
    • Website users
    • Members
  • Records and Tracking logs
    • Information Asset Register
    • Record of Processing Activity (Article 30)
    • Record of Special Category Data processing
    • Data Subject Rights request tracker
    • Information security incident log
    • Personal data breach log
    • Data protection advice log

The documents are designed to be as simple as possible while meeting the statutory requirements placed on Data Controllers. They are available as an instant download (in Word format). Sequential file names make locating each document easy.

Click here to read sample documents.

The policy pack gives a useful starting point for organisations of all sizes, both in the public and private sector. For only £149 plus VAT (special introductory price) it will save you hours of drafting time. Click here to buy now or visit our website to find out more.

Act Now provides a full GDPR course programme including one-day workshops, e-learning, healthchecks and our GDPR Practitioner Certificate.