Facial Recognition CCTV Cameras in Every Store?

The Observer recently reported that Home Office officials have developed covert plans to lobby the Information Commissioner’s Office (ICO) in an effort to hasten the adoption of contentious facial recognition technology in high street stores and supermarkets. Critics argue that such technology raises concerns about bias and data privacy.

Despite these objections, the Home Office appears to be pushing for the adoption of facial recognition in stores. Minutes of a recent meeting, obtained under the Freedom of Information Act, appear to show Home Office officials agreeing to write to the ICO praising the merits of facial recognition technology in combating “retail crime”. This ignores critics who claim the technology violates human rights and is biased, particularly against darker-skinned people.

Police minister Chris Philp, senior Home Office officials, and the commercial company Facewatch came to an agreement on the covert strategy on 8th March 2023 during a meeting held behind closed doors. Facewatch provides facial recognition cameras to help retailers combat shoplifting. It has courted controversy and was investigated by the ICO earlier this year following a complaint by Big Brother Watch.

Despite finding multiple UK GDPR violations on 28th March, the ICO told Facewatch it would take no further action. The ICO said it “welcomed” remedial steps that Facewatch had taken, or would take, to address the above violations. Those remedial steps have been redacted from public information about the case.

Facial recognition technology has faced extensive criticism and scrutiny, leading the European Union to consider a ban on its use in public spaces through the upcoming Artificial Intelligence Act. However, the UK’s Data Protection and Digital Information (No.2) Bill proposes to eliminate the government-appointed Surveillance Camera Commissioner role and the requirement for a surveillance camera code of practice.

Our forthcoming CCTV workshop is ideal for those who want to explore the GDPR and privacy issues around all types of CCTV cameras, including drones and body worn cameras. Our Advanced Certificate in GDPR Practice is a practical, scenario-based course designed to help delegates gain the confidence to tackle complex GDPR issues in a methodical way.

Leading Surveillance Law Expert Joins the Act Now Team

Act Now Training welcomes solicitor and surveillance law expert, Naomi Mathews, to its team of associates. Naomi is a Senior Solicitor and a co-ordinating officer for RIPA at a large local authority in the Midlands. She is also the authority’s Data Protection Officer and Senior Responsible Officer for CCTV.

Naomi has extensive experience in all areas of information compliance and has helped prepare for RIPA inspections, both for the Office of Surveillance Commissioners and the Investigatory Powers Commissioner’s Office (IPCO). She has worked as a defence solicitor in private practice and as a prosecutor for the local authority in a range of regulatory matters including Trading Standards, Health and Safety and Environmental prosecutions. Naomi has higher rights of audience to present cases in the Crown Court.

Naomi has many years of practical knowledge of RIPA and how to prepare for a successful prosecution/inspection. Her training has been commended by RIPA inspectors and she has also trained nationally. Naomi’s advice has helped Authorising Officers, Senior Responsible Officers and applicants understand the law and practicalities of covert surveillance. 

Like our other associates, Susan Wolf and Kate Grimley Evans, Naomi is a fee-paid member of the Upper Tribunal, assigned to the Administrative Appeals Chamber (Information Rights jurisdiction), and of the First-tier Tribunal, General Regulatory Chamber (Information Rights jurisdiction).

Ibrahim Hasan, director of Act Now Training, said:

“I am pleased that Naomi has joined our team. We are impressed with her experience of RIPA and her practical approach to training which focuses on real life scenarios as opposed to just the law and guidance.”

Naomi will be delivering our full range of RIPA workshops as well as developing new ones. She is also presenting a series of one-hour webinars on RIPA and Social Media. If you would like Naomi to deliver customised in-house training for your organisation, please get in touch for a quote.

Facial Recognition in Schools: Please, sir, I want some more.

Yesterday the Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”

For a few years now, schools have used biometrics, including automated fingerprint identification systems, for registration, library book borrowing and cashless catering. Big Brother Watch reported privacy concerns about this back in 2014. Now a company called CRB Cunninghams has introduced facial recognition technology to allow schools to offer children the ability to collect and pay for lunches without the need for physical contact. In addition to the nine schools in Scotland, four English schools are reported to be introducing the technology. Silkie Carlo, the head of Big Brother Watch, said:

“It’s normalising biometric identity check for something that is mundane. You don’t need to resort to airport-style [technology] for children getting their lunch.”

The law on the use of such technology is clear. Back in 2012, the Protection of Freedoms Act (POFA) created an explicit legal framework for the use of all biometric technologies (including facial recognition) in schools for the first time. It states that schools (and colleges) must seek the written consent of at least one parent of a child (anyone under the age of 18) before that child’s biometric data can be processed. Even if a parent consents, the child can still object or refuse to participate in the processing of their biometric data. In such a case, schools must provide a reasonable alternative means of accessing the service, i.e. paying for school meals in the present case.
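To make the rule concrete, here is a minimal sketch (in Python, using hypothetical field names not drawn from any statutory wording) of the check a school’s cashless catering system might apply before processing a pupil’s biometric data under POFA: at least one parent must have consented in writing, the child must not have objected, and a non-biometric alternative must still be available.

```python
from dataclasses import dataclass

@dataclass
class BiometricConsentRecord:
    """Hypothetical record a school might keep for each pupil (anyone under 18)."""
    parent_written_consents: int   # number of parents who have consented in writing
    child_objects: bool            # the child objects to / refuses biometric processing
    alternative_available: bool    # a reasonable alternative (e.g. card or PIN payment) is offered

def may_process_biometrics(record: BiometricConsentRecord) -> bool:
    # POFA position (England and Wales): written consent of at least one parent
    # is required, and the child's own objection overrides parental consent.
    return record.parent_written_consents >= 1 and not record.child_objects

# Example: a parent has consented but the child refuses, so biometric processing
# must not go ahead and the school must offer another way to pay for meals.
record = BiometricConsentRecord(parent_written_consents=1, child_objects=True, alternative_available=True)
if not may_process_biometrics(record):
    assert record.alternative_available, "A reasonable non-biometric alternative must be provided"
```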

POFA only applies to schools and colleges in England and Wales. However, all organisations processing personal data must comply with the UK GDPR. Facial recognition data, being biometric, is classed as Special Category Data and there is a legal prohibition on processing it unless one of the conditions in paragraph 2 of Article 9 is satisfied. Express consent of the Data Subjects (i.e. the children, subject to their capacity) seems to be the only way to justify such processing.

In 2019 the Swedish Data Protection Authority fined an education authority SEK 200,000 (approximately €20,000) after the latter instructed schools to use facial recognition to track pupil attendance. The schools had sought to base the processing on consent. However, the Swedish DPA considered that consent was not a valid legal basis given the imbalance between the Data Subject and the Data Controller. It ruled that there had been breaches of Article 5 (processing students’ personal data in a manner more intrusive of personal integrity, and involving more personal data, than necessary for the specified purpose of monitoring attendance), Article 9, and Articles 35 and 36 (failing to fulfil the requirements for an impact assessment and failing to carry out prior consultation with the Swedish DPA).

The French regulator (CNIL) has also raised concerns about a facial recognition trial commissioned by the Provence-Alpes-Côte d’Azur Regional Council, which took place in two schools to control access by pupils and visitors. The CNIL concluded that “free and informed consent of students had not been obtained and the controller had failed to demonstrate that its objectives could not have been achieved by other, less intrusive means.” CNIL also said that facial recognition devices are particularly intrusive and present major risks of harming the privacy and individual freedoms of the persons concerned. They are also likely to create a sense of enhanced surveillance. These risks are increased when facial recognition devices are applied to minors, who are subject to special protection in national and European laws.

Facial recognition has also caused controversy in other parts of the world recently. In India the government has been criticised for its decision to install it in some government-funded schools in Delhi. As more UK schools opt for this technology it will be interesting to see how many objections they receive, not just from parents but also from children. This and other recent privacy related stories highlight the importance of a Data Protection Officer’s role.

BONUS QUESTION: The title of this post contains a nod to which classic novel? Answers in the comments section below.

All the recent GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop. We have a few places left on our Advanced Certificate in GDPR Practice course starting in November.

Coronavirus and Police Use of Drones


The police have an important role to play in the current coronavirus lockdown. However, their actions must at all times be proportionate, transparent and (above all) lawful. Only yesterday, British Transport Police admitted they had wrongly charged a woman who was fined £660 under coronavirus legislation. Marie Dinou was arrested at Newcastle Central Station on Saturday after she refused to tell police why she needed to travel. A police and Crown Prosecution Service review said she was charged under the wrong part of the Coronavirus Act. The court will be asked to set the conviction aside.

This is not the only recent incident of the police overstepping the mark. By now most of us will have seen the story about a couple walking their dog in the Peak District. The video was filmed by a drone operated by the Derbyshire Police Drone Unit, and broadcast to the nation on BBC News. According to Derbyshire Police’s Twitter feed (which broadcast the same 90-second footage), the police force wanted to reinforce the government message of ‘stay at home’ and to point out this was not getting through, by effectively ‘shaming’ the couple who were captured on camera.

The video has sparked huge controversy in various circles, including the civil liberties campaign group Big Brother Watch and a leading member of the judiciary. According to the BBC, Big Brother Watch has described the move as ‘sinister and counter-productive’. Former Supreme Court judge Lord Sumption has also been very critical. On BBC Radio 4’s World at One, Lord Sumption made it clear that the police have no legal power to enforce Government Ministers’ ‘wishes’ and guidance about non-essential travel. Although the government has enacted the Coronavirus Act 2020, this does not give the police any powers to stop individuals from non-essential travel or walking in isolated places. Lord Sumption’s criticism is most tellingly summed up in the following quotation:

“This is what a police state is like, it is a state in which the government can issue orders or express preferences with no legal authority and the police will enforce ministers’ wishes.”

At Act Now we are not able to comment on whether the police have the powers to do this, but we respectfully accept Lord Sumption’s view that they did not. Our concern is whether the filming and broadcasting of these individuals was GDPR compliant. Our conclusion is that it was not.

The use of drones poses a privacy risk. The Police Force took the decision to process this personal data for their own purposes (“to get the message across”). They are therefore Data Controllers and must comply with the General Data Protection Regulation (GDPR) in relation to this processing. Images of individuals constitute personal data where it is possible to identify them from those images (GDPR Article 4(1)). It is entirely possible that the individuals captured in that Derbyshire police video could be identified by their clothing, hair colour and the presence of their dog.

Drones can be used to film people in many locations, often without the knowledge of those being filmed. In these circumstances, the processing of personal data must be lawful (GDPR Article 5(1)). It is questionable which Article 6 basis the police could rely on here. Arguably the processing is necessary for a ‘task carried out in the public interest’. However, one would have to ask why it was necessary to film and broadcast these individuals. The police could not rely on ‘legitimate interests’ because this does not apply to processing carried out by public authorities in performance of their tasks (GDPR Article 6(1)(f)).

Even if the police could identify a lawful basis, the next question is whether this processing is fair. The ICO guidance states that Data Controllers should only process data in ways that people would reasonably expect and not use it in ways that have unjustified adverse effects on them. I would argue that it is highly unlikely that anybody walking their dog in an isolated part of the Peak District would have any reasonable expectation that they would be secretly filmed by a drone and that their images would be broadcast to the nation in an attempt to shame them. So it seems highly unlikely that this processing is fair.

GDPR also requires transparency when processing personal data. This means data subjects should be made aware that their personal data is being processed and why.
The ‘normal’ transparency requirements (GDPR Articles 12-14) are less onerous for the police when they are processing personal data for law enforcement purposes under Part 3 of the Data Protection Act 2018. However, the police admitted themselves that the filming was for the purposes of ‘getting a message out’ and this does not fit easily within the definition of law enforcement purposes under S.31 DPA 2018. At best the police could try and argue that the processing was for the purposes of preventing threats to public security, but it is really difficult to see how this would succeed when it was just a couple walking their dog on an isolated stretch of path.

The police did not comply with the Information Commissioner’s tips on responsible drone use, in particular the advice about thinking carefully about sharing images on social media. The ICO cautions that drone users should avoid sharing images that could have unfair or harmful consequences. There is also little evidence that the Police had due regard to at least the first three guiding principles laid down in the Surveillance Camera Code of Practice or whether they conducted a Data Protection Impact Assessment.

On balance, the Derbyshire Police’s decision to film individuals taking a walk in an isolated area, in order to get a message across about not travelling unnecessarily, was at best misguided and at worst unlawful. The coronavirus is changing almost all aspects of our daily lives, and social distancing and self-isolating are the new norms. However, when the police take action it is still vital that they comply with their legal obligations in relation to the processing of personal data.

More on this and other developments in our FREE GDPR update webinar. Looking for a GDPR qualification from the comfort of your home office? Our GDPR Practitioner Certificate is now available as an online option.


Act Now launches GDPR Policy Pack


The first fine was issued recently under the General Data Protection Regulation (GDPR) by the Austrian data protection regulator. Whilst relatively modest at 4,800 Euros, it shows that regulators are ready and willing to exercise their GDPR enforcement powers.

Article 24 of GDPR emphasises the need for Data Controllers to demonstrate compliance through measures to “be reviewed and updated where necessary”. This includes the implementation of “appropriate data protection policies by the controller.” This can be daunting especially for those beginning their GDPR compliance journey.

Act Now has applied its information governance knowledge and experience to create a GDPR policy pack containing essential documentation templates to help you meet the requirements of GDPR as well as the Data Protection Act 2018. The pack includes, amongst other things, template privacy notices as well as procedures for data security and data breach reporting. Security is a very hot topic after the recent £500,000 fine levied on Equifax by the Information Commissioner under the Data Protection Act 1998.

We have also included template letters to deal with Data Subjects’ rights requests, including subject access. The detailed contents are set out below:

  • User guide
  • Policies
    • Data Protection Policy
    • Special Category Data Processing (DPA 2018)
    • CCTV
    • Information Security
  • Procedures
    • Data breach reporting
    • Data Protection Impact Assessment template
    • Data Subject rights request templates
  • Privacy Notices
    • Business clients and contacts
    • Customers
    • Employees and volunteers
    • Public authority services users
    • Website users
    • Members
  • Records and Tracking logs
    • Information Asset Register
    • Record of Processing Activity (Article 30)
    • Record of Special Category Data processing
    • Data Subject Rights request tracker
    • Information security incident log
    • Personal data breach log
    • Data protection advice log

The documents are designed to be as simple as possible while meeting the statutory requirements placed on Data Controllers. They are available as an instant download (in Word format). Sequential file names make locating each document easy.

Click here to read sample documents.

The policy pack gives a useful starting point for organisations of all sizes in both the public and private sector. For only £149 plus VAT (special introductory price) it will save you hours of drafting time. Click here to buy now or visit our website to find out more.

Act Now provides a full GDPR course programme including one day workshops, e-learning, healthchecks and our GDPR Practitioner Certificate.

CCTV and the Law

By Steve Morris

The updated version of the Information Commissioner’s CCTV Code of Practice addresses the rise of new surveillance technologies and methods. No longer are surveillance cameras passive image collectors, providing a resource for immediate use or historical evidence.

CCTV, ANPR, Body Worn Cameras, Aerial Drones, together with the associated analytical tools and software, are all technologies being used within many public and private sector organisations.

These technologies are invaluable for efficient and effective public protection as well as revenue collection and enforcement activities. One example is lone workers performing a caring function who, for their own safety, wear audio and video recording equipment when they leave the safety of their own home. These workers then enter the private dwelling of a vulnerable person in need of assistance. In some instances the video and audio will be running throughout the whole of the attendance, often with a live feed to a control room. The benefits for the safety of the carer are clear, and the immediate response and advice by control room personnel is undoubtedly beneficial for the person requiring assistance. But this equipment is capturing images and conversations of an individual, and perhaps family and friends, within that person’s private home. The images and conversations, witnessed by others many miles away, are likely to be very intimate and private.

Does this vulnerable person or those responsible for them realise this is actually taking place?

Do they consent to it as a part of the provision of the service?

Before a public authority undertakes such activity it must conduct a privacy impact assessment, and perhaps obtain consent for the collection and processing of such information. Without such consideration, and a record of the assessment, it might easily be argued that the organisation has not shown respect for private life in accordance with Article 8 of the European Convention on Human Rights, and the activity might be deemed unlawful and indeed in breach of the Data Protection Act 1998. The Care Quality Commission has issued guidance on the use of cameras in care homes.

The Surveillance Camera Commissioner, Tony Porter, pursuing compliance with the Code of Practice issued in accordance with the Protection of Freedoms Act, has identified several aspects of non-compliance when it comes to CCTV cameras:

  • Inadequate or non-existent privacy impact assessments
  • Equipment deployed with no respect or consideration for privacy, or for the balance between benefit and intrusion (proportionality)
  • Equipment in use not fit for purpose
  • Excessive use of surveillance
  • Removal of surveillance such as CCTV to reduce costs with little regard for the void left in relation to public safety and security

In a speech to the CCTV User Group, Mr Porter said budget cuts had led councils to decide to spend less on public space CCTV, meaning there was less money for staff training, poorer understanding of legal issues and a reduced service. He said councils could face greater scrutiny of their use of CCTV, including potential inspections and enforcement. Organisations should carry out annual reviews of their CCTV capacity but many failed to do so. He cited a West Midlands local authority which, upon review, reduced the number of ineffective cameras and saved £250,000 in the process.

Mr Porter, who has been in his post since March 2014, has written to council chief executives to remind them of the law and code of practice.

My latest series of one-day CCTV law workshops examines the ‘surveillance landscape’ and the regulatory regimes of the Information Commissioner, the Office of the Surveillance Commissioner, and the Surveillance Camera Commissioner. Attendees will be able to identify which regime(s) and codes of practice apply to their surveillance activity, and how to manage efficient, effective and lawful surveillance systems.

Steve Morris is an ex-police officer and one of our expert surveillance law trainers. His CCTV law workshops take place in Manchester and London in October.

CCTV Surveillance: Getting It Right

Steve Morris writes…

“I keep six honest serving men, they taught me all I know, their names are what, why, when, how, where and who…”

“I know a person small, she keeps ten million serving-men who get no rest at all! – One million how’s, two million where’s, and seven million whys!”

Rudyard Kipling 1902

Well, it’s 2015 and we have an estimated 6 million (give or take a million or so!) surveillance cameras within the UK regulated sector, and that does not include those installed by private individuals. Cameras are no longer stuck on the end of poles recording people’s movements. They are worn by officials, installed on public transport and can even predict people’s behaviour.

Image technology has advanced tremendously in recent years. Data captured by CCTV systems is often automatically interacting with other databases with the capability of providing very intrusive information about the private lives and activities of innocent individuals as well as offenders and those that pose a risk to society.

We are also going through economically difficult times. CCTV and other surveillance technology can be seen as a cost-effective answer to the resource problem. However, without careful planning and regular review, it can be a costly option that might in fact provide little or no benefit and/or land an organisation in trouble with the various regulators in this sector. The Information Commissioner’s Office (ICO) has taken enforcement action involving both number plate recognition systems and cameras recording customers’ conversations in taxis.

The ICO is not the only regulator in this area. The Surveillance Camera Commissioner is tasked with raising awareness of the Surveillance Camera Code. Made pursuant to the Protection of Freedoms Act 2012 it governs the use of surveillance camera systems including CCTV and Automatic Number Plate Recognition (ANPR) operated by the police and councils in England and Wales.

The Office of the Surveillance Commissioner has oversight in relation to covert surveillance under Part 2 of the Regulation of Investigatory Powers Act 2000 (RIPA). This often involves the deployment of covert CCTV cameras. Recently Ibrahim Hasan alerted you to the revisions of the two RIPA codes of practice.

So why quote Rudyard Kipling’s poem from 1902?

The overall question revolves around whether a ‘scatter gun approach’ (obtaining lots of private data from lots of cameras) is actually a practical, cost-effective use of resources. Furthermore, is this approach a lawful, necessary and proportionate way of addressing a ‘pressing social need’ or problem? Or would a smaller number of cameras providing images and data of the quality required, when it is required, be a better use of resources?

Compliance with the various codes and laws which govern CCTV is easy if key questions are addressed at the outset (a simple checklist based on these questions is sketched below the list):

  1. What is the pressing social need or lawful grounds for the CCTV surveillance activity? What type(s) of devices and system is appropriate? What personal data is going to be collected? What policies and processes should we have?
  2. Why do we need this surveillance in this place? Why is surveillance the option we have chosen?
  3. When should the system be capturing and recording information? When is it right to share this information?
  4. How will the system be managed? How much private information are we obtaining about individuals? How will we ensure it is kept secure?
  5. Where will the cameras be positioned? Where will we store the data?
  6. Who will we be watching? Who will have access to the collected information?

Looking for an opportunity to discuss these questions and many others, and to examine the regulatory requirements in relation to the decision making process? Attend one of my CCTV workshops and be brought right up to date with the latest laws, codes of practice and guidance.

Steve Morris is an ex-police officer and one of our expert surveillance law trainers.

Yet Another CCTV Code

CCTV is a hot topic. Following complaints by Big Brother Watch, the ICO has taken enforcement action involving both number plate recognition cameras and cameras recording people’s conversations in taxis.

On 20th May 2014, the Information Commissioner’s Office (ICO) launched a consultation on a revised Code of Practice on CCTV. The previous version was published in 2008. On 15th October the ICO published the 44-page code of practice on surveillance cameras and personal information. Jonathan Bamford, Head of Strategic Liaison at the ICO, states in his blog post launching the code:

“Today’s updated CCTV code is one that is truly fit for the times that we live in. The days of CCTV being limited to a video camera on a pole are long gone. Our new code reflects the latest advances in surveillance technologies and their implementation, while explaining the key data protection issues that those operating the equipment need to understand.”

There are no major changes in the code when compared with the previous version. The ICO once again emphasises fundamental Data Protection Act (DPA) principles e.g. informing people about the information being collected about them, keeping data collected secure and having effective retention and disposal schedules.

The new and emerging technologies section of the code covers the key surveillance technologies that the ICO believes will become increasingly popular in the years ahead.

The code emphasises the importance of conducting a privacy impact assessment before undertaking surveillance using CCTV, especially when fitted to drones, e.g. broadcasters seeking to gather footage for production purposes, police forces conducting surveillance on suspects, or construction companies monitoring job progress. Concerns have been expressed about the legal use of drones. The BBC reports, “Drones which could seriously injure or kill are being flown over cities and towns across England, despite laws designed to protect the public.” The code refers to drones as ‘unmanned aerial vehicles’ (UAVs) and the overarching systems in which UAVs are used as ‘unmanned aerial systems’ (UAS). Key points include:

  • Organisations should ensure there is an on/off button for recording in a UAS and have “strong justification” for continuously recording via the system.
  • Continuous recording must be both “necessary and proportionate” for the purpose the business is pursuing.
  • The Fair Processing Code under Principle 1 of the DPA must be complied with. Website notices, social media, highly visible clothing and signage telling the public about the use of drones for filming in the area can help to do this.

Many councils now use body worn cameras to, amongst other things, help combat anti-social behaviour or gather evidence for parking enforcement. These small inconspicuous devices can record both sound and images. This can mean that they are capable of being much more intrusive than traditional town centre CCTV. The code states that the use of such cameras needs to be justified. Safeguards must be put in place to ensure they are only used when needed. Strong security is essential in case the devices fall into the wrong hands. The code identifies other practical steps to help users of these devices stay on the right side of the law.

The new ICO code is said to complement the Surveillance Camera Code (PoFA code) which came into force last year. Made pursuant to the Protection of Freedoms Act 2012 (PoFA) the latter governs the use of surveillance camera systems including CCTV and Automatic Number Plate Recognition (ANPR).

The ICO code applies to all data controllers (public and private sector) throughout the UK, but the PoFA code currently only applies, in the main, to local authorities and policing authorities in England and Wales. The Scottish Government has produced its CCTV Strategy for Scotland. The strategy provides a common set of principles that operators of public space CCTV systems in Scotland must follow. The principles aim to ensure that these systems are operated fairly and lawfully and use technologies compatible with the DPA.

As regards the legal effects of the PoFA Code:

“A failure on the part of any person to act in accordance with any provision of this code does not of itself make that person liable to criminal or civil proceedings. This code is, however, admissible in evidence in criminal or civil proceedings, and a court or tribunal may take into account a failure by a relevant authority to have regard to the code in determining a question in any such proceedings” (paragraph 1.16).

The Surveillance Camera Commissioner (SCC) has been appointed by the Home Secretary but has no enforcement or inspection powers unlike the ICO. He “should consider how best to ensure that relevant authorities are aware of their duty to have regard for the Code and how best to encourage its voluntary adoption by other operators of surveillance camera systems” (paragraph 5.3). The ICO says of its revised CCTV code:

“This code is consistent with the [Home Office] code and therefore following the guidance contained in this document will also help you comply with many of the principles in that code”.

It is essential that all CCTV operators, both in the public and private sector, read the new ICO code and revise their policies and procedures accordingly. Whilst the code is not legally binding, it will be taken into account by the Commissioner and the courts in deciding whether the DPA has been complied with.

Steve Morris will explain the new code and the wider law on CCTV surveillance in our full day workshop. Want a new practical qualification for the modern Data Protection Officer? Click here

 

NEW CCTV Code Consultation


On 20th May 2014, the Information Commissioner’s Office (ICO) launched a consultation on a revised Code of Practice on CCTV. This is intended to replace the current version, which was published in 2008, and aims to:

  • reflect the developments in existing technologies that have taken place in the last six years,
  • discuss the emergence of new surveillance technologies and the issues they present,
  • reflect further policy development in areas such as privacy impact assessments,
  • explain the impact that new case law has had on the area of surveillance systems
  • reflect the wider regulatory environment that exists when using surveillance systems.

Jonathan Bamford, Head of Strategic Liaison at the ICO, states in his blog post that the revision covers “everything from automatic recognition of car number plates to flying drones” but emphasises that the underlying principles remain the same.

Since last summer we have had two codes of practice on CCTV. The Surveillance Camera Code (PoFA code) came into force last year. Made pursuant to the Protection of Freedoms Act 2012 (PoFA), it governs the use of surveillance camera systems including CCTV and Automatic Number Plate Recognition (ANPR). The ICO code applies to all data controllers (public and private sector) but the PoFA code currently only applies, in the main, to local authorities and policing authorities. As regards its legal effects:

“A failure on the part of any person to act in accordance with any provision of this code does not of itself make that person liable to criminal or civil proceedings. This code is, however, admissible in evidence in criminal or civil proceedings, and a court or tribunal may take into account a failure by a relevant authority to have regard to the code in determining a question in any such proceedings” (paragraph 1.16).

The Surveillance Camera Commissioner (SCC) has been appointed by the Home Secretary but has no enforcement or inspection powers unlike the ICO. He “should consider how best to ensure that relevant authorities are aware of their duty to have regard for the Code and how best to encourage its voluntary adoption by other operators of surveillance camera systems” (paragraph 5.3). The ICO says of its revised CCTV code:

“This code is consistent with the [Home Office] code and therefore following the guidance contained in this document will also help you comply with many of the principles in that code”.

So why have two codes then? (Answers on a postcard or in the comment field below.) 

CCTV is a hot topic. Following complaints by Big Brother Watch, the ICO has taken enforcement action involving both number plate recognition cameras and cameras recording people’s conversations in taxis. Big Brother Watch has welcomed the latest ICO consultation but has expressed concerns:

“We also remain concerned that, given that the responsibility for legally enforcing the Data Protection Act with regard to CCTV (apart from private cameras, which remain exempt) will remain with the ICO rather than the SCC, public confidence will not be helped if the process of making a complaint and action being taken is not straightforward. Equally, the situation of private cameras not being subject to regulation, with the only power available to the police to prosecute for harassment, is unsustainable as the number of people using them increases.”

The consultation runs until 1st July 2014.

Our full day CCTV workshop will explain the revised code and the wider law on CCTV surveillance in detail. Want a new practical qualification for the modern Data Protection Officer? Click here

Sat Nav Bad Day

In March 1998, High Court judge Lord Justice Brown threw out a claim by the police against a motorist who was caught using a radar detector. The police claimed that, under the Wireless Telegraphy Act 1949, the motorist was illegally using the device. The judge ruled that the radar detector did not actually receive any intelligible police information: it was only picking up the presence of radar, not any information within it. This case set a precedent and made the use of radar detectors legal in the UK. To overrule this judgement, the Road Safety Act 2006 specifically bans the use of radar and laser detectors. Most drivers are happy with this situation. They know where cameras are because a fair processing notice is in place (to comply with principle 1 of the DPA) and this is usually a picture of an old fashioned camera recognised by millions. Even the mobile cameras that travel to different locations have their way of delivering an FPN, although it is usually found on the web rather than in situ.

So we’re relatively happy. We know where all the speed cameras are; we see them in map books and on the net; we hear about mobile cameras on local radio and TV and we’re cool about it. We buy our TomToms and justify using them saying “it’s just an electronic version of a publicly available database”. Then we go to France on holiday.

Since decree n°2012-3 was introduced on 3 January 2012 it has been illegal to be warned about the position of fixed or mobile speed cameras while you are driving in France. If your sat nav has this function and you continue to use the service, you risk a fine of up to €1,500. Even if the device is switched off and not operational, the possession of such witchcraft is the work of the devil. Ken Russell would have loved to have made a film about it. Good old data subjects from Blighty being thwarted by sneaky foreigners who don’t even bother to use Schedule 2(6), just ignoring the rights of individuals and, worse, disapplying the Subject Information Provisions.

Initially this sounds quite tough. There have been discussions on the web, advice from motoring lobbies and horror stories of motorists having their boot searched by a bold gendarme emerging triumphantly from black plastic sacks of dirty washing with an old device and demanding instant payment of a fine. There is also the other view that the law is unenforceable: that gendarmes cannot search for satnavs, cannot operate them if they see one as it is technically a computer and their common law powers don’t extend to interrogating them, and they cannot check your smart phone for that app you downloaded for free…

The truth naturally lies in the middle. There have been discussions between French satnav manufacturers and the government (one French firm feared 2,000 job losses) and they have come up with the concept of danger zones. Instead of listing cameras they list danger zones where there may be a hazard (such as a level crossing or a school or where people might speed) and the satnav can issue a warning of the danger.

The French authorities meanwhile are pushing ahead with a programme of taking down existing signs warning of cameras; they are setting up new cameras and not telling drivers where they are and generally acting very French. Pah! I spit on your Schedule 2 requirement.

Other solutions suggested in hyperspace include modifying your satnav camera POIs and labelling them lay-bys (or transport caffs); registering your car in Lithuania; buying your next satnav in France and specifying UK maps… (although we did hear that French spoken instructions interpret M25 as Monsieur Vingt Cinq); or exploring Germany, which has excellent weissbier and many ancient castles.

Glossary.

A speed camera is un radar (pronounced rad – ah).
A satnav is a GPS (pronounced shay pay ess)
Zones of danger – zits noirs
Breathalyser is un alcooltest (did we forget to tell you that by law you must carry two of these in your car as well as a dayglo yellow vest for each passenger)

Useful phrases

  • Bordelle de merde, espece de radar
  • Fer cryin’ out loud a bloody speed camera
  • Est-ce qu’il y a une brasserie independante dans ce trou a rat, j’ai envie d’une biere?
  • Please direct me to a real ale pub if you have one in this dump of a town.
  • Va te faire cuire un oeuf, sale gendarme.
  • I don’t agree with you officer.

Bonnes Vacances!
