Filming People in Public for Social Media: Is it time for a new law?

In the content creator world, filming people without their consent has become everyday behaviour. From TikTok nightlife clips to YouTube street pranks, millions of people capture others in public places and post the footage online. Whether it is for likes, shares or monetisation, this behaviour is not without consequences for creators and subjects alike. Over the weekend the BBC ran a story about two women whose interactions with ‘friendly strangers’ were uploaded to social media, causing them considerable alarm and distress.

Dilara was secretly filmed by a man wearing smart glasses in the London store where she works. The footage was then posted to TikTok, where it received 1.3 million views. Dilara then faced a wave of unwanted messages and calls. It later turned out that the man who filmed her had posted dozens of similar videos, giving men tips on how to approach women. Another woman, Kim, was filmed last summer on a beach in West Sussex by a different man wearing smart sunglasses. Kim, who was unaware she was being filmed, chatted with him about her employer and family. Later, the man posted two videos online, under the guise of dating advice, which received 6.9 million views on TikTok and more than 100,000 likes on Instagram.

The Law 

UK law does not expressly prohibit filming or photographing people in public places, unlike other jurisdictions such as the UAE, Greece and South Korea.
However, a number of legal issues can arise once the footage is uploaded, particularly where it is intrusive, monetised or causes harm.

Although being in public generally reduces people’s privacy expectations, the UK courts have recognised that privacy rights can still arise in public places. Filming may become unlawful where it captures people in sensitive or intimate situations, such as medical emergencies, emotional distress or vulnerability.
The manner of filming, the focus on the individual, and the purpose of publication are all relevant factors in deciding whether the subject’s privacy has been violated.

Back in 2003, in a landmark decision, the European Court of Human Rights ruled that a British man’s right to respect for his private life (Article 8 of the European Convention on Human Rights) was violated when CCTV footage of him attempting suicide was released to the media. The case was brought by Geoffrey Peck, who, on the evening of 20th August 1995 and while suffering from depression, walked down Brentwood High Street in Essex with a kitchen knife and attempted suicide by cutting his wrists. He was unaware that he had been filmed by a CCTV camera installed by Brentwood Borough Council. The court awarded Mr Peck damages of £7,800. In recent years, media coverage has highlighted situations where women were filmed on nights out and the footage uploaded online. While the filming occurred in public, the intrusive nature of the footage and the harm caused can give rise to privacy claims.

Victims of secret filming have a direct cause of action in the tort of misuse of private information, developed by the courts in Campbell v MGN Ltd [2004] UKHL 22. The case concerned the supermodel Naomi Campbell, who successfully sued the Daily Mirror for publishing photographs of her attending a Narcotics Anonymous meeting on the King’s Road in London. The court held that in such cases the test is whether the individual had a reasonable expectation of privacy in the circumstances and, if so, whether that expectation is outweighed by the publisher’s right to freedom of expression under Article 10 of the ECHR.

Data Protection 

When a person is identifiable in a video, that footage constitutes personal data within the meaning of the UK General Data Protection Regulation (UK GDPR). Publishing such footage online involves ‘processing’ personal data and brings the UK GDPR’s obligations into play. The ‘controller’ has a wide range of obligations, including having a lawful basis for processing, complying with the principles of fairness and transparency, and respecting data subjects’ (the victims’) rights, which include the rights to object and to erasure.

Content creators and influencers sometimes assume they come under the ‘domestic purposes exemption’ in Article 2(2)(c) UK GDPR. However, this exemption is narrow and does not usually apply where content is shared publicly, monetised, or used to build an online following.  

Failure to comply with the UK GDPR could (at least in theory) lead to enforcement action by the Information Commissioner, which could include a hefty fine. Article 82 of the UK GDPR gives a data subject a right to compensation for material or non-material damage suffered as a result of a breach of the UK GDPR. Section 168 of the Data Protection Act 2018 confirms that ‘non-material damage’ includes distress.

Harassment  

Even where filming in public is lawful in isolation, repeated or targeted filming can amount to harassment or stalking. Section 1 of the Protection from Harassment Act 1997 prohibits a course of conduct which amounts to harassment and which the defendant knows, or ought to know, amounts to harassment; harassment includes causing a person alarm or distress. Filming someone repeatedly, following them, or persistently targeting them for online content may satisfy this test. In 2024 a man was arrested by Greater Manchester Police on suspicion of stalking and harassment after filming women on nights out and uploading the videos online. The arrest was based not on public filming alone, but on the cumulative effect of the conduct and the harm caused.

Individuals who discover that a video of them has been published online without consent can make a direct request to the creator to remove the footage, particularly where it causes distress or raises privacy concerns. If this is unsuccessful, most social media platforms offer reporting mechanisms for privacy violations, harassment, or non-consensual content. Videos are often removed by the platforms following complaints. Other civil remedies may also be available including defamation where footage creates a false and damaging impression.  

A New Law?

Despite the growing prevalence of filming strangers in public for social media content, there remains no single, specific piece of legislation in the UK governing this area. Instead, there is a patchwork of laws including privacy law, the UK GDPR and harassment legislation, to name but a few. While these laws can sometimes provide protection, they were not designed with the modern social media ecosystem in mind and often struggle to respond effectively to the scale, speed and commercial incentives of online content creation.

Furthermore, civil actions are expensive and it is difficult to obtain Legal Aid for such claims. Victims are left to navigate complex legal doctrines, such as ‘reasonable expectation of privacy’ or ‘lawful basis for processing’, for themselves. While police involvement may be appropriate in extreme cases, many videos fall short of criminal thresholds yet still cause significant distress and reputational damage.

Is it time for a new, specific statutory framework addressing non-consensual filming (and publication) in public spaces? Such a law could provide clearer boundaries, simpler remedies and more accessible enforcement mechanisms, while balancing legitimate freedoms of expression and journalism. Let us know your thoughts in the comments section.

The data protection landscape continues to evolve. With the passing of the Data (Use and Access) Act 2025, data protection practitioners need to ensure their materials reflect the latest changes to the UK GDPR, Data Protection Act 2018 and PECR. If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half-day workshop, which is running online and in Birmingham on 5th February 2026.

The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security practices. A number of governments have taken the view that the video-sharing platform represents an unacceptable risk that enables Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures.

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR states the general rule that, when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child and is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from its platform. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern about children under 13 using the platform, and not being removed, had been raised internally with some senior employees. In the ICO’s view, TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8, the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is noticeable that this fine is less than half the amount (£27 million) set out in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently, this potential infringement was not included in the final amount of the fine.

We have been here before! In July 2019 British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine, issued in October 2020, was £20 million. Marriott International Inc was fined £18.4 million in October 2020, much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced (from the notice to the final amount) suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to consider whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing the postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine, arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently, an appeal against the ICO’s £1.35 million fine issued to Easylife Ltd was withdrawn after the parties reached an agreement reducing the fine to £250,000.

The Children’s Code

Since the period covered by the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms, and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed on Meta by Ireland’s Data Protection Commission), it is likely that more ICO regulatory action will follow.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off all day courses and special discounts on GDPR certificates

TikTok Faces a £27 Million GDPR Fine

On 26 September 2022, TikTok was issued with a Notice of Intent under the GDPR by the Information Commissioner’s Office (ICO). The video-sharing platform faces a £27 million fine after an ICO investigation found that the company may have breached UK data protection law.  

The notice sets out the ICO’s provisional view that TikTok breached UK data protection law between May 2018 and July 2020. It found the company may have:

  • processed the data of children under the age of 13 without appropriate parental consent,
  • failed to provide proper information to its users in a concise, transparent and easily understood way, and
  • processed special category data without legal grounds to do so.

The Information Commissioner, John Edwards, said:

“We all want children to be able to learn and experience the digital world, but with proper data privacy protections. Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.

“I’ve been clear that our work to better protect children online involves working with organisations but will also involve enforcement action where necessary. In addition to this, we are currently looking into how over 50 different online services are conforming with the Children’s code and have six ongoing investigations looking into companies providing digital services who haven’t, in our initial view, taken their responsibilities around child safety seriously enough.”

Rolled out in September 2021, the Children’s Code puts in place new data protection standards for online services likely to be accessed by children.

It will be interesting to see if and when this notice becomes an actual fine. If it does, it will be the largest fine issued by the ICO. It is also the first potential fine to focus on transparency and consent and will provide valuable guidance to Data Controllers, especially if it is appealed to the Tribunal.

It is important to note that this is not a fine but a ‘notice of intent’ – a legal document that precedes a potential fine. The notice sets out the ICO’s provisional view, which may of course change after TikTok makes representations.

Remember, we have been here before. In July 2019 British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine, issued in October 2020, was £20 million. In October 2020 Marriott International Inc was fined £18.4 million, much lower than the £99 million set out in the original notice.

This is not the first time TikTok has found itself in hot water over its data handling practices. In 2019, the company was given a record $5.7m fine by the Federal Trade Commission for mishandling children’s data. It has also been fined in South Korea for similar reasons.

Are you an experienced GDPR Practitioner wanting to take your skills to the next level? Our Advanced Certificate in GDPR Practice starts on 25th October.