Filming People in Public for Social Media: Is it time for a new law?

In the content creator world, filming people without their consent has become everyday behaviour. From TikTok nightlife clips to YouTube street pranks, millions of people capture others in public places and post the footage online. Whether it is for likes, shares or monetisation, this behaviour is not without consequences for the creators as well as the subjects. Over the weekend, the BBC ran a story about two women whose interactions with ‘friendly strangers’ were uploaded to social media, causing the women much alarm and distress.

Dilara was secretly filmed in a London store where she works by a man wearing smart glasses. The footage was then posted to TikTok, where it received 1.3 million views. Dilara then faced a wave of unwanted messages and calls. It later turned out that the man who filmed her had posted dozens of similar videos, giving men tips on how to approach women. Another woman, Kim, was filmed last summer on a beach in West Sussex by a different man wearing smart sunglasses. Kim, who was unaware she was being filmed, chatted with him about her employer and family. Later, the man posted two videos online, under the guise of dating advice, which received 6.9 million views on TikTok and more than 100,000 likes on Instagram.

The Law 

UK law does not expressly prohibit filming or photographing people in public places, unlike other jurisdictions such as the UAE, Greece and South Korea (see the recent case of the jailed American YouTuber).
However, a number of legal issues can arise once the footage is uploaded, particularly where it is intrusive, monetised or causes harm.

Although being in public generally reduces people’s privacy expectations, the UK courts have recognised that privacy rights can still arise in public places. Filming may become unlawful where it captures people in sensitive or intimate situations, such as medical emergencies, emotional distress or vulnerability.
The manner of filming, the focus on the individual, and the purpose of publication are all relevant factors in deciding whether the subject’s privacy has been violated.

Back in 2003, in a landmark decision, the European Court of Human Rights ruled that a British man’s right to respect for his private life (Article 8 of the European Convention on Human Rights) was violated when CCTV footage of him attempting suicide was released to the media. The case was brought by Geoffrey Peck who, on the evening of 20th August 1995 and while suffering from depression, walked down Brentwood High Street in Essex with a kitchen knife and attempted suicide by cutting his wrists. He was unaware that he had been filmed by a CCTV camera installed by Brentwood Borough Council. The court awarded Mr Peck damages of £7,800. In recent years, media coverage has highlighted situations where women were filmed on nights out and the footage uploaded online. While the filming occurred in public, the intrusive nature of the footage and the harm caused can give rise to privacy claims.

Victims of secret filming have a direct cause of action in the tort of misuse of private information, developed by the courts in Campbell v MGN Ltd [2004] UKHL 22. This case concerned the supermodel Naomi Campbell, who successfully sued the Daily Mirror for publishing photos of her attending a Narcotics Anonymous meeting on The King’s Road in London. The court held that in such cases the test is whether the individual had a reasonable expectation of privacy in the circumstances and, if so, whether that expectation is outweighed by the publisher’s right to freedom of expression under Article 10 of the ECHR.

Data Protection 

When a person is identifiable in a video, that footage constitutes personal data within the meaning of the UK General Data Protection Regulation (UK GDPR). Publishing such footage online involves ‘processing’ personal data and brings the UK GDPR’s obligations into play. The ‘controller’ has a wide range of obligations, including having a lawful basis for processing, complying with the principles of fairness and transparency, and respecting data subjects’ (the victims’) rights, which include the rights to object and to erasure.

Content creators and influencers sometimes assume they come under the ‘domestic purposes exemption’ in Article 2(2)(c) UK GDPR. However, this exemption is narrow and does not usually apply where content is shared publicly, monetised, or used to build an online following.  

Failure to comply with the UK GDPR could (at least in theory) lead to enforcement action by the Information Commissioner which could include a hefty fine. Article 82 of the UK GDPR gives a data subject a right to compensation for material or non-material damage for any breach of the UK GDPR. Section 168 of the Data Protection Act 2018 confirms that ‘non-material damage’ includes distress. 

Harassment  

Even where filming in public is lawful in isolation, repeated or targeted filming can amount to harassment or stalking. Section 1 of the Protection from Harassment Act 1997 prohibits a course of conduct that amounts to harassment and which the defendant knows or ought to know causes alarm or distress. Filming someone repeatedly, following them, or persistently targeting them for online content may satisfy this test. In 2024 a man was arrested by Greater Manchester Police on suspicion of stalking and harassment after filming women on nights out and uploading the videos online. The arrest was based not on public filming alone, but on the cumulative effect of the conduct and the harm caused. 

Individuals who discover that a video of them has been published online without consent can make a direct request to the creator to remove the footage, particularly where it causes distress or raises privacy concerns. If this is unsuccessful, most social media platforms offer reporting mechanisms for privacy violations, harassment, or non-consensual content. Videos are often removed by the platforms following complaints. Other civil remedies may also be available including defamation where footage creates a false and damaging impression.  

A New Law?

Despite the growing prevalence of filming strangers in public for social media content, there remains no single, specific piece of legislation in the UK to govern this area. Instead, there is a patchwork of laws including privacy law, the UK GDPR and harassment legislation, to name a few. While these laws can sometimes provide protection, they were not designed with the modern social media ecosystem in mind and often struggle to respond effectively to the scale, speed, and commercial incentives of online content creation.

Furthermore, civil actions are expensive and it is difficult to get Legal Aid for such claims. Victims are left to navigate for themselves complex legal doctrines such as ‘reasonable expectation of privacy’ or ‘lawful basis for processing’. While police involvement may be appropriate in extreme cases, many videos fall short of criminal thresholds yet still cause significant distress and reputational damage.

Is it time for a new, specific statutory framework addressing non-consensual filming (and publication) in public spaces? Such a law could provide clearer boundaries, simpler remedies and more accessible enforcement mechanisms, while balancing legitimate freedoms of expression and journalism. Let us know your thoughts in the comments section.

This topic was discussed in detail in Episode 6 of the Guardians of Data Podcast (see below).

Stolen NHS Patient Data Published on Dark Web

NHS England has now confirmed that patient data managed by blood test provider Synnovis was stolen in a ransomware attack on 3rd June. According to the BBC, some of that data has been published on the dark web by the hackers.

On 4th June 2024, the Independent reported that two major London hospital trusts had to cancel all non-emergency operations and blood tests due to a significant cyber attack. Both King’s College Hospital NHS Foundation Trust and Guy’s and St Thomas’ NHS Foundation Trust have seen their pathology systems compromised by malware.

Synnovis, the service provider responsible for blood tests, swabs, bowel tests, and other critical services for these hospitals, was targeted in this attack. The impact was widespread, affecting NHS patients across six London boroughs. 

It now transpires that Qilin, a Russian cyber-criminal group, shared almost 400GB of private information on their darknet site on Thursday night. A sample of the stolen data seen by the BBC includes patient names, dates of birth, NHS numbers and descriptions of blood tests. NHS England said in a statement that there is “no evidence” that test results have been published, but that “investigations are ongoing”.

The Information Commissioner’s Office said in a statement:

“While we are continuing to make enquiries into this matter, we recognise the sensitivity of some of the information in question and the worry this may have caused.

“We would urge anyone concerned about how their data has been handled to check our website for advice and support, as well as visiting NHS England’s website.”

We have two workshops coming up in September (Introduction to Cyber Security and Cyber Security for DPOs) which are ideal for organisations that wish to upskill their employees in data security. See also our Managing Personal Data Breaches Workshop.

Covid-19, GDPR and Temperature Checks


Emma Garland writes…

Many countries have now been in some form of lockdown for a considerable length of time. As some of the lockdown measures are slowly eased, one possible way to prevent a “second wave” is the implementation of temperature checks in shops and workplaces. This involves placing a thermometer on an individual’s forehead. If the temperature is recorded, or there is another way the individual can be identified, this will involve processing health data. Care must be taken to consider the GDPR and privacy implications.

Apple reopened stores across Germany on 11th May with extra safety procedures, including temperature checks and social distancing. It is now facing a probe by a regional German data protection regulator into whether its plan to take the temperature of its store customers violates GDPR.

The benefits of temperature checks are self-evident. By detecting members of the public or staff who have a high temperature, and not permitting them to enter the store or workplace, staff are at less risk of close contact with people who may have COVID 19. Temperature checks are just one small part of stopping the spread of COVID 19 and can be intrusive. What is the lawful basis for processing such data? Article 6(1)(d) of GDPR allows processing where it:

“…is necessary in order to protect the vital interests of the data subject or of another natural person”

Of course “data concerning health” is also Special Category Data and requires an Article 9 condition to ensure it is lawful. Is a temperature check necessary to comply with employment obligations, for medical diagnosis or for reasons of public health?

All conditions under Articles 6 and 9 must satisfy the test of necessity. There are many causes of a high temperature, not just COVID 19. There have also been doubts over the accuracy of temperature readings: infrared thermometers take skin temperature, which can vary from core temperature, and checks do not account for the incubation phase of the disease, during which people may be asymptomatic.

ICO Guidance

The Information Commissioner’s Office (ICO) has produced guidance on workplace testing which states:

“Data protection law does not prevent you from taking the necessary steps to keep your staff and the public safe and supported during the present public health emergency.
But it does require you to be responsible with people’s personal data and ensure it is handled with care.”

The ICO suggests that “legitimate interests” or “public task” could be used to justify the processing of personal data as part of a workplace testing regime. The former will require a Legitimate Interests Assessment, in which the benefit of the data to the organisation is balanced against the risks to the individual. In terms of Article 9, the ICO suggests the employment condition, supplemented by Schedule 1 of the Data Protection Act 2018. The logic here is that employment responsibilities extend to compliance with a wide range of legislation, including health and safety.

More generally, the ICO says that technology which could be considered privacy intrusive requires a high level of justification. It should be part of a well thought out plan which ensures that it is an appropriate means to achieve a justifiable end. Alternatives should also have been fully evaluated. The ICO also states:

“If your organisation is going to undertake testing and process health information, then you should conduct a DPIA focussing on the new areas of risk.”

A Data Protection Impact Assessment should map the flow of the data, including collection, usage, retention and deletion, as well as the associated risks to individuals. Some companies are even using thermal cameras as part of COVID 19 testing. The Surveillance Camera Commissioner (SCC) and the ICO have worked together to update the SCC DPIA template, which is specific to surveillance systems.

As shops begin to open and the world establishes post COVID 19 practices, many employers and retailers will be trying to find their “new normal”. People will also have to decide what they are comfortable with. Temperature checks should be part of a considered approach that evaluates all the regulatory and privacy risks.

Emma Garland is a Data Governance Officer at North Yorkshire County Council and a blogger on information rights. This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. A few places are left on the course starting on 2nd July.

The NHS COVID 19 Contact Tracing App: Part 3 The Human Rights Angle


Everyone will agree that the government needs to do everything it can to prevent the further spread of the Coronavirus and to “save lives” (except if your name is Dominic Cummings -Ed). However, there is much less consensus about what it should do, and this can be seen in the current debate about the proposal to roll out a contact tracing system and the NHS COVID App. This is the third in a series of blog posts where we examine the COVID App from different perspectives.

On 7 May 2020, the Parliamentary Joint Committee on Human Rights (PJCHR) published its report on the proposed contact tracing system and made a series of important recommendations to address its concerns about the compatibility of the scheme with data protection laws and the Human Rights Act 1998. Two weeks later, the Secretary of State for Health, Matt Hancock, replied to the Committee rejecting its proposals as “unnecessary”. Let us examine those proposals in detail.

The Human Rights Considerations

Section 6 of the Human Rights Act 1998 makes it unlawful for any public authority (including the UK government and NHSX) to act in a way that is incompatible with a Convention right. Article 8(1) of the ECHR states that “Everyone has the right to respect for his private and family life, his home and his correspondence.” This is not an absolute right. Article 8(2) provides that an interference with the right to privacy may be justified if it:

“is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”

However, the government also has an obligation to protect the “right to life” enshrined in Article 2 of the ECHR. This means that if the NHS COVID App really can prevent the spread of the virus and save lives, then this is going to be a major consideration in deciding whether the interference with Article 8 is necessary and proportionate.

The PJCHR’s report provides a very detailed assessment of some of the human rights implications of the “centralised” approach that the NHS has proposed. Its overall conclusion is that if the app is effective it could help pave the way out of current lockdown restrictions and help to prevent the spread of Coronavirus. However, it also concludes that the app, in its current form, raises “significant concerns regarding surveillance and the impact on other human rights which must be addressed first.”

How will the COVID App interfere with the right to privacy?

At first glance it would appear that the COVID App does not involve the transfer of any personal data. As explained in the first blog in this series, each app user will be given a unique ID made up of a set of random numbers and the first half of the person’s postcode. The NHS website suggests that this ‘anonymises’ the information. However, as the Parliamentary report notes, there are parts of England where fewer than 10,000 people live in a postcode area, and as few as 3 or 4 “bits” of other information could be enough to identify individuals. The report also notes that relying on self-reporting alone (without requiring confirmation that a person has tested positive for COVID 19) may carry the risk of false alerts, thereby impacting other people’s rights if they have to self-isolate unnecessarily.
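The report’s “bits” language comes from information theory, and its arithmetic can be illustrated with a back-of-the-envelope sketch (our own illustration, not taken from the report itself): singling out one person from a pool of N requires roughly log2(N) bits of information in total, and each attribute an observer learns (sex is about 1 bit, an age decade about 3 bits) shrinks the pool of matching people.

```python
import math

def bits_to_identify(population: int) -> float:
    """Bits of information needed, on average, to single out
    one individual from a pool of the given size."""
    return math.log2(population)

def anonymity_set(population: int, bits_known: float) -> float:
    """Expected number of people still matching after learning
    the given number of bits about the target."""
    return population / (2 ** bits_known)

# A postcode area with 10,000 residents (the report's figure) needs
# only ~13.3 bits in total to single one person out.
print(round(bits_to_identify(10_000), 1))

# Learning just 4 extra bits (e.g. sex plus an age decade) shrinks
# the expected pool from 10,000 to ~625; a few more attributes
# quickly narrow it towards a single individual.
print(round(anonymity_set(10_000, 4)))
```

This is why the half-postcode ID is weaker than it looks: the postcode half already fixes the pool, so each additional scrap of information in the video or profile chips away at the remaining anonymity.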

Necessary interference?

An interference with a person’s right to privacy under ECHR Article 8 may be justified under Article 8(2) if it is “in accordance with the law” and is “necessary” for the protection of “health” (see above).

To be in accordance with the law, the app must meet the requirements of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA). However, as noted below, the PJCHR believes that the “current data protection framework is contained in a number of different documents and it is nearly impossible for the public to understand what it means for their data which may be collected by the digital contact tracing system”. The Committee’s recommendations in relation to this are considered below.

The remaining human rights consideration is whether the interference with people’s private lives is “necessary”. The answer depends on whether the use of the app will contribute to reducing the spread of COVID 19 and whether it will save lives. This in turn depends on whether the app works and on its uptake.

Although it was reported that uptake of the app on the Isle of Wight has exceeded 50% of the population, this falls short of the 60% that the government had previously suggested was necessary for the app to be effective. It is also debatable whether uptake will be the same on the mainland. If the app is not capable of achieving its objective of preventing the spread of the virus, then the interference with people’s privacy rights will not be proportionate and will not fulfil the requirement of necessity in Article 8(2).

Although many people will probably download the app without thinking about privacy issues (how often do any of us check Privacy Notices before downloading apps?), many others may have real privacy concerns, particularly after the recent media debates. This has not been helped by reports that Serco (the company contracted to train call centre staff for the contact tracing scheme) accidentally shared the email addresses of 300 contact tracers, or by the fact that in other parts of the world there is growing concern about the privacy issues related to the use of contact tracing apps. Uptake of the app may be adversely affected if people lack confidence in how and why their data is being processed and, in light of the above, they may have concerns about data security.

Consequently, the PJCHR’s report includes a series of recommendations aimed at ensuring that “robust privacy protections” are put in place, as these are key to ensuring the effectiveness of the app.

Central to their recommendations was a proposal that the government introduce legislation to provide legal certainty about how personal data will be processed by the COVID App. Although individuals’ data protection rights are protected by the GDPR and the DPA 2018, the Committee believes that it is “nearly impossible” for the public to understand what will happen to their data, and also that it is necessary to turn government assurances about privacy into statutory obligations. The PJCHR sent a copy of its draft Bill to the Secretary of State, Matt Hancock. However, on 21 May Matt Hancock rejected that proposal on the basis that the existing law provides “the necessary powers, duties and protections” and that participation in contact tracing and use of the app is voluntary.
In contrast, the Australian government has passed additional privacy protection legislation specifically aimed at the collection, use and disclosure of its COVIDSafe app data.

The Committee’s other recommendations are:

  1. The appointment of a Digital Contact Tracing Human Rights Commissioner to oversee the use, effectiveness and privacy protections of the app and any data associated with digital contact tracing. It calls for the Commissioner to have the same powers as the Information Commissioner. It would appear that Matt Hancock has also rejected this proposal on the basis that there is already sufficient governance in place.
  2. Particular safeguards for children under 18 to monitor children’s use, ensure against misuse and allow for interviews with parents where appropriate. It is noticeable that the Committee has set the age at 18.
  3. The app’s contribution to reducing the severity of the lockdown and to helping to prevent the spread of COVID 19 must be demonstrated and improved at regular intervals for the collection of the data to be reasonable. The Secretary of State for Health must therefore review the operation of the app and report to Parliament every three weeks.
  4. Transparency. In the second of this series of blog posts, we noted some of the issues relating to the publication of the Data Protection Impact Assessment. The PJCHR calls for this to be made public as it is updated.
  5. Time limited. The data associated with the contact tracing app must be permanently deleted when it is no longer required and may not be kept beyond the duration of the health emergency. However these terms may be open to some interpretation.

Matt Hancock has written that he will respond to these other issues “in due course”. It is unclear what this means, but it does not suggest any immediate response.

The Draft Bill

The PJCHR’s draft bill (rejected by Matt Hancock) proposed a number of important provisions, some of which are set out below.

The Bill specifically limited the purpose of the COVID App to:

  1. Protecting the health of individuals who are or may become infected with Coronavirus; and
  2. Preventing or controlling the spread of Coronavirus.

Additionally, it contained provisions that prohibited the use of centrally held data without specific statutory authorisation, and limited the time that data could be held on a smartphone to 28 days, followed by automatic deletion unless a person has notified that they have COVID 19 or suspected COVID 19. It also prohibited “data reconstruction” in relation to any centrally held data. The fact that the Bill includes this seems to suggest an implicit recognition that the unique IDs are not truly anonymous.

The ‘status’ of the NHS COVID App keeps changing and it remains to be seen when (and if) it will be rolled out. The Northern Ireland Assembly, however, has already announced that it will be working with the Irish government to produce a coordinated response based on a decentralised model. It is reported to be doing this because of the difficulties and uncertainties surrounding the app, and the human rights issues arising from a centralised app.

This and other GDPR developments will be covered in our new online GDPR update workshop. Our next online GDPR Practitioner Certificate course is fully booked. We have one place left on the course starting on 11th June.
