Facebook, Social Networks and the Need for RIPA Authorisations

By Ibrahim Hasan

Increasingly local authorities are turning to the online world, especially social media, when conducting investigations. There is some confusion as to whether the viewing of suspects’ Facebook accounts and other social networks requires an authorisation under Part 2 of the Regulation of Investigatory Powers Act 2000 (RIPA). In his latest annual report the Chief Surveillance Commissioner states (paragraph 5.42):

“Perhaps more than ever, public authorities now make use of the wide availability of details about individuals, groups or locations that are provided on social networking sites and a myriad of other means of open communication between people using the Internet and their mobile communication devices. I repeat my view that just because this material is out in the open, does not render it fair game. The Surveillance Commissioners have provided guidance that certain activities will require authorisation under RIPA or RIP(S)A and this includes repetitive viewing of what are deemed to be “open source” sites for the purpose of intelligence gathering and data collation.”

Careful analysis of the legislation suggests that whilst such activity may be surveillance within the meaning of RIPA (see s.48(2)), not all of it will require a RIPA authorisation. Of course, RIPA geeks will know that RIPA is permissive legislation anyway, so a failure to obtain authorisation does not render surveillance automatically unlawful (see s.80).

Two types of surveillance may be involved when examining a suspect’s Facebook or other social network pages: Directed Surveillance and the deployment of a Covert Human Intelligence Source (CHIS). Section 26 of the Act states that surveillance has to be covert for it to be directed:

“surveillance is covert if, and only if, it is carried out in a manner that is calculated to ensure that persons who are subject to the surveillance are unaware that it is or may be taking place” (my emphasis)

If an investigator decides to browse a suspect’s public blog, website or “open” Facebook page (i.e. where access is not restricted to “friends”, subscribers or followers), how can that be said to be covert? It does not matter how often the site is accessed, as long as the investigator is not taking steps to hide his/her activity from the suspect. The fact that the suspect is not told about the “surveillance” does not make it covert. Note the words in the definition of covert: “unaware that it is or may be taking place.” If a suspect chooses to publish information online, they can expect the whole world to read it, including law enforcement and council investigators. If they want or expect privacy, it is open to them to use the available privacy settings on their blog or social network.

The Commissioner stated in last year’s annual report:

“5.31 In cash-strapped public authorities, it might be tempting to conduct on line investigations from a desktop, as this saves time and money, and often provides far more detail about someone’s personal lifestyle, employment, associates, etc. But just because one can, does not mean one should. The same considerations of privacy, and especially collateral intrusion against innocent parties, must be applied regardless of the technological advances.” (my emphasis)

I agree with the last part of this statement. The gathering and use of online personal information by public authorities will still engage human rights, particularly the right to privacy under Article 8 of the European Convention on Human Rights. To ensure such rights are respected, the Data Protection Act 1998 must be complied with. A case in point is last year’s monitoring of Sara Ryan’s blog by Southern Health NHS Trust. Our data protection expert Tim Turner wrote recently about the data protection implications of this kind of monitoring.

Where online surveillance involves employees, the Information Commissioner’s Office’s (ICO) Employment Practices Code (part 3) will apply. This requires an impact assessment to be done before the surveillance is undertaken, to consider, amongst other things, necessity, proportionality and collateral intrusion. Whilst the code is not law, it will be taken into account by the ICO and the courts when deciding whether the DPA has been complied with. In December 2014, Caerphilly County Borough Council signed an undertaking after an ICO investigation found that the Council’s surveillance of an employee, suspected of fraudulently claiming to be sick, had breached the DPA.

Facebook Friends – A Friend Indeed

Of course the situation will be different if an investigator needs to become a “friend” of a person on Facebook in order to communicate with them and get access to their profile and activity pages. For example, local authority trading standards officers often use fake profiles when investigating the sale of counterfeit goods on social networks. In order to see what is on sale, they have to have permission from the suspect. This, in my view, does engage RIPA, as it involves the deployment of a CHIS, defined in section 26(8):

“For the purposes of this Part a person is a covert human intelligence source if—

(a) he establishes or maintains a personal or other relationship with a person for the covert purpose of facilitating the doing of anything falling within paragraph (b) or (c);

(b) he covertly uses such a relationship to obtain information or to provide access to any information to another person; or

(c) he covertly discloses information obtained by the use of such a relationship, or as a consequence of the existence of such a relationship”  (my emphasis)

Here we have a situation where a relationship (albeit not personal) is formed using a fake online profile to covertly obtain information for a covert purpose. In the case of a local authority, this CHIS will not only have to be internally authorised but also, since 1st November 2012, approved by a Magistrate.

This is a complex area and staff who do not work with RIPA on a daily basis can be forgiven for failing to see the RIPA implications of their investigations. From the Chief Surveillance Commissioner’s comments (below) in his annual report, it seems advisable for all public authorities to have in place a corporate policy and training programme on the use of social media in investigations:

“5.44 Many local authorities have not kept pace with these developments. My inspections have continued to find instances where social networking sites have been accessed, albeit with the right intentions for an investigative approach, without any corporate direction, oversight or regulation. This is a matter that every Senior Responsible Officer should ensure is addressed, lest activity is being undertaken that ought to be authorised, to ensure that the right to privacy and matters of collateral intrusion have been adequately considered and staff are not placed at risk by their actions and to ensure that ensuing prosecutions are based upon admissible evidence.”

We have a workshop on investigating E-Crime and Social Networking Sites, which considers all the RIPA implications of such activities. It can also be delivered in house.

In conclusion, my view is that RIPA does not apply to the mere viewing of “open” websites and social network profiles. However in all cases the privacy implications have to be considered carefully and compliance with the Data Protection Act is essential.

Ibrahim will be looking at this issue in depth in our forthcoming webinars.

Looking to update/refresh your colleagues’ RIPA knowledge? Try our RIPA E-Learning Course. Module 1 is free.

We also have a full programme of RIPA Courses, and our RIPA Policy and Procedures Toolkit contains standard policies as well as forms (with detailed notes to assist completion).

Controlling, Lying and Blocking: Ways for the individual to win the privacy arms race?

This is a version of Marion Oswald’s speech at the launch of the Centre for Law & Information Policy at the Institute of Advanced Legal Studies on 24 February 2015.

My talk is about controlling, lying and blocking. Could these activities enable an individual to win the privacy arms race against the data collection, surveillance, behavioural tracking and profiling abilities of search engines, marketers, social networking sites and others?

When we think about an arms race, we might imagine two sides evenly matched, both equally able to equip themselves with weapons and defences. But when it comes to individuals versus data collectors, the position is considerably unbalanced, the equivalent of a cavalry charge against a tank division.

It’s not however as if the individual is without protections. Let’s take consent, a key principle, as we know, of European data protection law. Consent based on privacy policies is rather discredited as an effective means of enforcing privacy rights over data held by commercial third parties. If I might quote Lillian Edwards, ‘consent is no guarantee of protection on Facebook and its like, because the consent that is given by users is non-negotiable, non-informed, pressurised and illusory.’[i] So what about regulatory enforcement? In the UK, it could be described as mostly polite, in the rest of Europe, sometimes a little more robust. The FTC in the US has had some notable successes with its enforcement action based on unfair practices, with Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, advocating privacy as being part of the ‘bottom line.’[ii] It remains to be seen whether market pressures will drive good faith changes in privacy practices – alternative subscription, advertising-free business models have failed to make much headway in terms of market share. The so-called ‘right-to-be-forgotten’ has been much debated and I would question how much the Google Spain decision[iii] adds to the individual’s armoury, the original publication remaining unaffected. And as for personal data anonymisation, this could be subject of a whole afternoon’s debate in itself!

What can individuals do if they want to take matters into their own hands, and become a ‘privacy vigilante’?[iv] Here are three possibilities: first, personal data stores (or ‘personal information management services’) are said by their promoters to enable individuals to take back control over their personal data and manage their relationship with suppliers. Pentland from MIT describes a PDS as ‘a combination of a computer network that keeps track of user permissions for each piece of personal data, and a legal contract that specifies both what can and can’t be done with the data, and what happens if there is a violation of the permissions.’[v]
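
The mechanism Pentland describes, checking a stated purpose against per-item permissions before any data is released, can be sketched in a few lines of code. The following Python fragment is a minimal illustration only; the class and method names are invented for this post and are not taken from any real PDS product:

```python
# A minimal, illustrative sketch of the permission-tracking idea behind a
# personal data store (PDS). All names here are hypothetical; a real PDS
# would pair this kind of check with a legal contract, as Pentland notes.

class PersonalDataStore:
    def __init__(self):
        self._data = {}         # data item -> value
        self._permissions = {}  # data item -> set of permitted uses

    def put(self, key, value, permitted_uses):
        """Store a data item together with the uses the owner allows."""
        self._data[key] = value
        self._permissions[key] = set(permitted_uses)

    def request(self, key, purpose):
        """Release the item only if the stated purpose is permitted."""
        if purpose in self._permissions.get(key, set()):
            return self._data[key]
        raise PermissionError(f"'{purpose}' not permitted for '{key}'")


store = PersonalDataStore()
store.put("email", "alice@example.com", {"billing"})
store.request("email", "billing")      # released: purpose is permitted
# store.request("email", "marketing")  # raises PermissionError
```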

Secondly, blocking. Systems could prevent tagging of individuals by third parties and set privacy defaults at the most protective. Lifelogging technologies could prevent the display of any recognisable image unless that individual has given permission.[vi] Individuals could deploy a recently invented Google Glass detector, which impersonates the Wi-Fi network, sends a ‘deauthorisation’ command and cuts the headset’s internet connection.[vii]

Finally, obfuscation, by which technology is used to produce false or misleading data in an attempt, as Murray-Rust et al. put it, to ‘cloud’ the lens of the observer.[viii] It’s the technological equivalent of what most of us will have already done online: missing off the first line of our address when we enter our details into an online form; subtly changing our birthday; accidentally/on-purpose giving an incorrect email address in exchange for a money-off voucher. A personal data store could, for instance, be used to add ‘chaff’ (multiple fake data points amongst the real ones) or to simulate real behaviour such as going on holiday. Brunton & Nissenbaum describe obfuscation as a ‘viable and reasonable method of last-ditch privacy protection.’[ix] On the face of it, obfuscation may seem to be an attractive alternative approach, providing individuals with a degree of control over how much ‘real’ information is released and some confidence that profiling activities will be hampered.
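
To make the ‘chaff’ idea concrete, here is a toy Python sketch that hides a real location among randomly generated decoys. Everything in it (the function name, parameters and jitter heuristic) is invented for illustration; real obfuscation tools model plausible behaviour far more carefully:

```python
# A toy illustration of 'chaff': hiding a real data point among fabricated
# ones so an observer cannot tell which is genuine.

import random

def add_chaff(real_location, n_fake=5, jitter_km=50):
    """Return the real (lat, lon) point mixed in with n_fake decoys."""
    lat, lon = real_location
    decoys = [
        (lat + random.uniform(-jitter_km, jitter_km) / 111.0,  # ~1 deg lat = 111 km
         lon + random.uniform(-jitter_km, jitter_km) / 111.0)
        for _ in range(n_fake)
    ]
    points = decoys + [real_location]
    random.shuffle(points)            # the observer sees no ordering clue
    return points

print(add_chaff((51.06, -1.31)))      # Winchester, plus five decoys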

Are these methods ways for the individual to win the privacy arms race? As things stand, I have my doubts, although that is not to say that a legal and regulatory regime could not be created to support these methods. PDSs raise numerous questions about contract formation, incorporation, offers and counter-offers. Service providers would need to be prepared to change their business models fundamentally if PIMS are to fulfil their potential. In the short term, there appears to be little commercial incentive for them to do so.

In terms of blocking, systems could adopt protective measures but they don’t, because they don’t have to. Google Glass blockers may well fall foul of computer misuse legislation if used by members of the public rather than the network owner. In the UK, there would be a risk of a section 3 offence under the Computer Misuse Act 1990 – an unauthorised act with intent to impair the operation of any computer. Haddadi et al. suggest the ‘continuous broadcast of a Do-Not-Track beacon from smart devices carried by individuals who prefer not to be subjected to image recognition by wearable cameras’ although the success of this would depend on regulatory enforcement and whether device providers received and conformed to such requests.[x] It would be rather ironic, however, if one had to positively broadcast one’s presence to avoid image recognition.
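
For what such a beacon might look like in practice, here is a speculative Python sketch of a device broadcasting a small opt-out message on its local network. The port number and message format are invented; no such standard exists, and, as noted above, the beacon would only work if device makers chose to listen for and honour it:

```python
# Hypothetical sketch of the 'Do-Not-Track beacon' idea from Haddadi et al.:
# a device periodically broadcasts a small opt-out message on the local
# network. Port and message are illustrative inventions, not a standard.

import socket
import time

BEACON_PORT = 50505          # arbitrary, illustrative port
MESSAGE = b"DNT:no-image-recognition"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

for _ in range(3):           # a real device would loop indefinitely
    sock.sendto(MESSAGE, ("255.255.255.255", BEACON_PORT))
    time.sleep(1)            # wearable cameras would listen and honour it
```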

As for obfuscation or lying on the internet, Murray-Rust et al. distinguish between official data, where obfuscation may be a criminal offence, and other data that can be obfuscated ‘without legal consequence.’[xi] The distinction is unlikely to be so clear cut, on either the civil or the criminal side (fraud and computer misuse spring to mind), and this is something I’ll be writing about in the future.

I would like to finish with this question about privacy vigilantism: by continuing to shift responsibility onto the individual, is this letting society off the hook for finding better solutions to privacy concerns?[xii] I think it probably is. Finding better solutions will require even closer interaction between computer scientists, lawyers and policy-makers.

Marion Oswald is a Senior Fellow and Head of the Centre for Information Rights at the University of Winchester (marion.oswald@winchester.ac.uk @_UoWCIR). This article was first published by the Society for Computers & Law and is reproduced with the author’s kind permission.

The 2nd Winchester Conference on Trust, Risk, Information & the Law on 21 April 2015 will be exploring the theme of the privacy arms race. To book, please click here.


[i] Lillian Edwards, Privacy, law, code and social networking sites, in Research Handbook on Governance of the Internet, (2013) Edward Elgar (Cheltenham) Ian Brown (Ed), 309-352, 324-328

[ii] Jessica Rich, Director, Bureau of Consumer Protection, Federal Trade Commission Beyond Cookies: Privacy Lessons for Online Advertising, AdExchanger Industry Preview 2015, January 21, 2015, 4 http://www.ftc.gov/system/files/documents/public_statements/620061/150121beyondcookies.pdf

[iii] Google Spain v AEPD and Mario Costeja Gonzalez (C-131/12), 13 May 2014

[iv] Marion Oswald, Seek, and Ye Shall Not Necessarily Find: The Google Spain Decision, the Surveillant on the Street and Privacy Vigilantism, Digital Enlightenment Yearbook 2014 (K. O’Hara et al. (Eds)), 99-115

[v] A. Pentland, Social Physics: How Good Ideas Spread – The Lessons from a New Science, The Penguin Press, New York, 2014

[vi] C. Gurrin, R. Albatal, H. Joho, K. Ishii, ‘A Privacy by Design Approach to Lifelogging’, Digital Enlightenment Yearbook 2014 (K. O’Hara et al. (Eds)), 49-73, 68

[vii] A. Greenberg, Cut Off Glassholes’ Wi-Fi With This Google Glass Detector, Wired, June 3, 2014, http://www.wired.com/2014/06/find-and-ban-glassholes-with-this-artists-google-glass-detector/

[viii] D. Murray-Rust, M. Van Kleek, L. Dragan, N. Shadbolt, Social Palimpsests – Clouding the Lens of the Personal Panopticon, Digital Enlightenment Yearbook 2014 (K. O’Hara et al. (Eds)), 75-96, 76

[ix] Finn Brunton, Helen Nissenbaum, ‘Vernacular resistance to data collection and analysis: A political theory of obfuscation’ First Monday, Volume 16, Number 5, 2 May 2011 http://firstmonday.org/article/view/3493/2955

[x] H. Haddadi, A. Alomainy, I. Brown, Quantified Self and the Privacy Challenge in Wearables, Society for Computers & Law, 5 August 2014 http://www.scl.org/site.aspx?i=ed38111

[xi] See n viii above, 90

[xii] See n ix above
