The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security practices. A number of governments have now taken the view that the video sharing platform represents an unacceptable risk that enables Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures.

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR sets out the general rule that, when a Data Controller is offering an “information society service” (e.g. social media apps and gaming sites) directly to a child and is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”
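Purely as an illustration of the kind of age-gate logic Article 8 envisages (and not a description of TikTok’s actual systems), a sign-up flow relying on consent might check the declared age and fall back to verified parental consent for under-13s. The helper names and flags below are hypothetical; this is a minimal sketch only.

```python
from datetime import date
from typing import Optional

AGE_OF_DIGITAL_CONSENT_UK = 13  # Article 8(1) UK GDPR


def age_in_years(date_of_birth: date, today: Optional[date] = None) -> int:
    """Calculate a person's age in whole years from their date of birth."""
    today = today or date.today()
    had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
    return today.year - date_of_birth.year - (0 if had_birthday else 1)


def consent_is_valid_basis(date_of_birth: date,
                           parental_consent_verified: bool,
                           today: Optional[date] = None) -> bool:
    """Return True if consent can be relied on as the lawful basis for processing.

    A child of 13 or over can give their own consent; below 13, consent must be
    given or authorised by the holder of parental responsibility, and Article 8(2)
    requires reasonable efforts to verify this (represented here by the
    hypothetical parental_consent_verified flag).
    """
    if age_in_years(date_of_birth, today) >= AGE_OF_DIGITAL_CONSENT_UK:
        return True
    return parental_consent_verified


# Example: on 4 April 2023 a child born in June 2011 is 11, so consent alone is
# not a valid basis unless parental consent has been verified.
print(consent_is_valid_basis(date(2011, 6, 1), False, today=date(2023, 4, 4)))  # False
print(consent_is_valid_basis(date(2011, 6, 1), True, today=date(2023, 4, 4)))   # True
```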

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from its platform. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8, the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used and shared, in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is noticeable that this fine is less than half the amount (£27 million) in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently this potential infringement was not included in the final amount of the fine.

We have been here before! In 2019 British Airways was issued with a Notice of Intent in the sum of £183 million but the actual fine, issued in October 2020, was £20 million. Marriott International Inc was fined £18.4 million in 2020; much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced (from the notice to the final amount) suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to think about whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently an appeal against the ICO’s fine of £1.35 million issued to Easylife Ltd was withdrawn, after the parties reached an agreement whereby the amount of the fine was reduced to £250,000.

The Children’s Code

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed by Ireland’s Data Protection Commission on Meta), it is likely that more ICO regulatory action will follow.

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates

The Facebook Data Breach Fine Explained


 

On 24th October 2018 the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine that she could under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal the fine might seem small beer for an organisation whose value is measured in hundreds of billions of US dollars. Had the same facts played out after 25th May 2018, the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.

The Facts

In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company, Global Science Research (GSR), to operate the App in conjunction with FB from November 2013 to May 2015. The App was designed to, and was able to, obtain a significant amount of personal information from any FB user who used it, including:

  • Their public FB profile, date of birth and current city
  • Photographs they were tagged in
  • Pages they liked
  • Posts on their timeline and their news feed posts
  • Friends list
  • Facebook messages (there was evidence to suggest the app also accessed the content of the messages)

The App was also designed to, and was able to, obtain extensive personal data from the FB friends of the App’s users and anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, nor did they give their consent.

The App was able to use the information that it collected about users, their friends and people who had messaged them to generate personality profiles. The information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controlled the now infamous Cambridge Analytica).


In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and the friends of those users that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.

Breach of the DPA

The Commissioner’s findings about the breach make sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply or ensure compliance with their own FB Platform Policy, and were not aware of this fact until exposed by the Guardian newspaper in December 2015.

The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:

  1. Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of the App users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for the processing.
  2. Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised (it was inconsistent with the basis on which FB allowed Dr Kogan to obtain access to personal data for which they were the data controller; it breached the Platform Policy and the Undertaking). The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised or unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles and that they failed to take reasonable steps to prevent such a contravention.

Breach of FB Platform Policy

Although the FB companies operated a FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they did not check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact they failed to implement a system to carry out such a review. It was also found that the use of the App breached the policy in a number of respects, specifically:

  • Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead, Dr Kogan and GSR were able to use it for their own purposes.
  • Personal data collected by the App should not have been sold to third parties. Dr Kogan and GSR had transferred the data to three companies.
  • The App requested permission from users to obtain personal data that it did not need, in breach of the policy.

The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. Perhaps one of the worst indictments, however, is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access rights to the Facebook Login. And the rest, as they say, is history.

Joint Data Controllers

The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times, joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.

The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within the scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.

The Use of Data Analytics for Political Purposes

The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies is likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned and the Commissioner was unable, on the basis of the information before her, to determine whether FB was correct. However, she nevertheless concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, or with whom to share it.

As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.

Susan Wolf will be delivering these upcoming workshops and the forthcoming FOI: Contracts and Commercial Confidentiality workshop which is taking place on the 10th December in London. 

Our 2019 calendar is now live. We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. 

Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.


 

Decision: Facebook Fan Page Administrators are Data Controllers


By Susan Wolf

On 5th June 2018 the Court of Justice of the European Union (CJEU) delivered its long awaited Facebook fan page decision. The case concerned the definition of data controller under the now repealed Data Protection Directive 95/46/EC [1] and in particular whether the administrator user of a Facebook fan page was a data controller.

The fact that the Data Protection Directive has been replaced by the GDPR 2016 should not diminish the importance of this ruling, particularly for organisations that use Facebook or other social media platforms to promote their business or organisation.

We explain some of the issues raised in the case and consider the implications of the ruling for administrators of Facebook fan pages under the GDPR.

The case

The case involved Wirtschaftsakademie Schleswig-Holstein GmbH, a private training academy in Germany. The company provided business training for commerce and industry (including GDPR training).  It operated a Facebook fan page to make people aware of its range of services and activities.

Fan pages are user accounts that can be set up on Facebook by individuals or businesses. According to Facebook, a fan page is a place where businesses can create a space on Facebook to connect with people and tell them about their business. Fan pages are not the same as Facebook profiles, which are limited purely to individuals’ personal use. Unlike a personal Facebook profile, a fan page is accessible to anyone using the Internet.

Authors of fan pages must register with Facebook in order to use the online platform to post any kind of communication. At that time, fan page administrators could obtain, from Facebook, anonymous statistical information on visitors to the fan page, via a function called ‘Facebook Insights’. That information was collected by means of ‘cookies’, each containing a unique user code, which remained active for two years and were stored by Facebook on the hard disk of the computer or on other media of visitors to fan pages. The user code, which could be matched with the connection data of users registered on Facebook, was collected and processed when the fan pages were opened.
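By way of illustration only (a generic sketch, not Facebook’s code), a persistent identifier cookie of the kind described above, a unique user code with a two-year lifetime, can be constructed along these lines:

```python
import uuid
from http.cookies import SimpleCookie

TWO_YEARS = 2 * 365 * 24 * 60 * 60  # cookie lifetime in seconds


def build_visitor_cookie() -> str:
    """Build a Set-Cookie header value carrying a unique visitor code.

    Generic illustration of a persistent identifier cookie: a random unique
    code, stored on the visitor's machine for two years, which the server can
    later match against other data it holds about registered users.
    """
    cookie = SimpleCookie()
    cookie["visitor_id"] = uuid.uuid4().hex   # the unique user code
    cookie["visitor_id"]["max-age"] = TWO_YEARS
    cookie["visitor_id"]["path"] = "/"
    return cookie["visitor_id"].OutputString()


# e.g. 'visitor_id=9f1c0c6e...; Max-Age=63072000; Path=/'
print("Set-Cookie:", build_visitor_cookie())
```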

The service, which was provided free of charge under non-negotiable terms, was no doubt very useful to the German Training Academy.  Unfortunately, neither Wirtschaftsakademie, nor Facebook Ireland notified anybody ‘visiting’ the fan page about the use of the cookies or the subsequent processing of the personal data.  The German Data Protection Supervisory Authority for the Schleswig-Holstein Land (Region) took the view that by setting up its fan page, the Wirtschaftsakademie had made an active and deliberate contribution to the collection by Facebook of personal data relating to visitors to the fan page, from which it profited by means of the statistics provided to it by Facebook.  The regulator concluded (in November 2011) that the Wirtschaftsakademie was a data controller and consequently ordered it to deactivate its fan page and threatened a penalty payment if the page was not removed.

The Wirtschaftsakademie challenged that decision before the German Administrative Court. Its main argument was that it was not responsible under data protection law for the processing of the data by Facebook or the cookies that Facebook installed, and that it had not commissioned Facebook to process personal data on its behalf. This argument was successful before the administrative court. However, the regulator appealed and what followed was lengthy, protracted litigation in the German courts. By 2016 the case had reached the Federal Administrative Court. The Federal Court also agreed that the Wirtschaftsakademie was not a data controller as defined by Article 2(d) of the Data Protection Directive:

  • (d) ‘controller’ shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data.

Article 4 of the GDPR defines ‘controller’ in identical terms.

However, the Federal Court also decided that it was necessary to refer the question to the CJEU under the preliminary ruling procedure, particularly since the CJEU had previously ruled [2] that the concept of data controller should be given a broad interpretation in the interests of the effective protection of the right to privacy.

The CJEU Ruling

The CJEU had no difficulty in concluding that Facebook Inc. and Facebook Ireland were data controllers because they determined the purposes and means of processing the personal data of Facebook users and anyone visiting fan pages hosted on Facebook. However, the Court recalled that the definition includes entities that ‘alone or jointly with others’ determine the purposes and means of data processing. In other words, the purposes may be determined by more than one controller and may be determined by ‘several actors taking part in the processing’, with each being subject to the provisions of the Directive.

On the facts, the Court considered that the administrator of a Facebook fan page:

  • Enters into a contract with Facebook Ireland and subscribes to the conditions of use, including the use of cookies.
  • Is able to define the parameters of the fan page, which has an influence on the processing of personal data for the purposes of producing statistics based on visits to the fan page.
  • Could, with the help of filters made available by Facebook, define the criteria for statistical analysis of data.
  • Could designate the categories of persons whose personal data is to be made use of by Facebook.
  • Can ask Facebook for demographic data relating to its target audience, including age, sex, relationship and occupation, lifestyle and purchasing habits.

These factors pointed to the fact that the administrator of a fan page hosted on Facebook takes part in the determination of the purposes and means of processing the personal data of visitors to the fan page. Consequently the administrator of the fan page is to be regarded as a data controller, jointly with Facebook Ireland.

The Court rejected arguments that the Wirtschaftsakademie only received the statistical data in anonymised form because the fact remained that the statistics were based on the collection, by cookies, of the personal data of visitors to the fan page.

The fact that the fan page administrator uses the platform provided by Facebook does not exempt it from compliance with the Directive. The Court also added that non-Facebook users may visit a fan page, and therefore the administrator’s responsibility for the processing of their personal data appears to be even greater, as the mere consultation of the home page automatically starts the processing of personal data.

[1] Case C-210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH

[2] Case C-212/13 František Ryneš v Úřad pro ochranu osobních údajů

 

We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. New Dates added for London!

Need to train frontline staff quickly? Try our extremely popular GDPR e-learning course.

Don’t forget about our GDPR Helpline; it’s a great tool to use for some advice when you really need it.

GDPR: Notification and the future of ICO Charges


By Jon Baines

Data Protection law has, since 1984 in the UK (with the first Data Protection Act), and since 1995 across Europe (with the Data Protection Directive), contained a general obligation on those who process personal data to notify the fact to the relevant supervisory authority (the Information Commissioner’s Office, or “ICO”, in the UK) and pay a fee for doing so. For many organisations it has in effect meant the payment of an annual fee in order to deal with people’s personal data.

Currently, in the UK, under the Data Protection Act 1998 (DPA), data controllers (those organisations who determine the purposes for which and the manner in which personal data are processed) pay either £35 or £500, according to their size (data controllers whose annual turnover is £25.9m or more and who have more than 249 staff must, in general, pay the larger amount). There are various exemptions to the general obligation, for instance for some controllers who are not-for-profit and for those who process personal data only for staff administration (including payroll), or advertising, marketing and public relations (in connection with their own business activity), or for accounts and records.
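As a minimal sketch of the two-tier rule just described (the thresholds are those stated above; the fee regulations contain further exemptions and special cases not modelled here):

```python
def dpa_notification_fee(annual_turnover_gbp: float, staff_count: int) -> int:
    """Return the DPA 1998 notification fee in pounds for a data controller.

    Sketch of the general rule described above: controllers with an annual
    turnover of £25.9m or more and more than 249 staff pay £500; everyone
    else (unless exempt from notification altogether) pays £35.
    """
    if annual_turnover_gbp >= 25_900_000 and staff_count > 249:
        return 500
    return 35


print(dpa_notification_fee(30_000_000, 300))  # 500
print(dpa_notification_fee(2_000_000, 15))    # 35
```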

Failure by a controller to make a notification, unless it has an exemption, is a criminal offence under sections 17 and 21 of the DPA, punishable by a fine. However, only one successful prosecution appears to have been brought by the ICO in the last calendar year – a surprisingly low figure, given that, anecdotally, the author is aware of large numbers of controllers failing to make a notification when they should do so.

The General Data Protection Regulation (GDPR) does away with what has often been seen as a fragmented and burdensome notification requirement, substituting for it, at least in part, an accountability principle, under which relevant organisations (“data controllers”) will have to keep internal records of processing activities. As far back as 1997 the Article 29 Working Party, representing data protection authorities across the EU, recognised that excessively bureaucratic requirements in relation to notification not only represent a burden for business but undermine the whole rationale of notification by becoming an excessive burden for the data protection authorities.

And in its impact assessment in 2012, when the GDPR was first proposed, the European Commission explained some of the reasoning behind the removal of the requirement:

“[Notification] imposes costs and cumbersome procedures on business, without delivering any clear corresponding benefit in terms of data protection. All economic stakeholders have confirmed…that the current notification regime is unnecessarily bureaucratic and costly. [Data protection authorities] themselves agree on the need to revise and simplify the current system.”

However, in the UK at least the removal under the GDPR of notification fees would have had a catastrophic effect on the ICO’s existence, because, at the moment, all of the funding for its data protection work comes from fees income – almost £24m last year.

To address this impending shortfall, the government has aimed to take powers (in the form of two pieces of legislation – first the Digital Economy Act and now the recent Data Protection Bill (DP Bill); presumably the former will fall away given the introduction of the latter) to make regulations to create a domestic scheme for data protection fees. The explanatory notes to the Data Protection Bill state that:

“[Clause 132] provides the Secretary of State with a power to make regulations requiring data controllers to pay a charge to the Commissioner. Those regulations may provide for different charges in different cases and for a discounted charge. In setting the charge the Secretary of State will take into account the desirability of offsetting the amount needed to fund the Commissioner’s data protection and privacy and electronic communications regulatory functions. It also provides that the Secretary of State may make regulations requiring a controller to provide information to the Commissioner to help the Commissioner identify the correct charge.”

A clue as to how the charges might be set has now been provided by means of a questionnaire, sent on behalf of the Department for Digital, Culture, Media and Sport (DCMS) to 300 lucky data controllers, seeking their views on what the fee structure might be. There is nothing on the DCMS, or ICO, website about this, so it’s not clear if it takes the form of a consultation, or, more likely, a scoping exercise. But what it appears to be putting forward for consideration is a three-tier scheme, under which data controllers would pay £55, £80 or £1000, based on the size of the data controller and the number of “customer records” it handles.

As drafted, the questionnaire doesn’t propose any exemptions. One assumes that these would follow, but even so, the proposal to levy a fee for data protection on business, at a time when the European legislature has removed it, must raise questions about how business-friendly this particular piece of law-making will be.

Additionally, it is not clear what the sanction for non-compliance, and what the enforcement regime, would be. As indicated above, the current criminal sanction does not appear to have prevented large numbers of data controllers from avoiding their legal obligations, with apparent impunity. One presumes, though, that enforcement would be left as a function of the ICO, and, given that Commissioner Elizabeth Denham has said on various occasions that her office needs to grow to cope with the demands of GDPR, it is to be supposed that she will aim to be strict on this matter.

There are estimated to be approximately 5.5 million businesses in the UK. If each of those paid only the bottom tier under the suggested fees structure, this could equate to a potential cost to business of about £300 million per annum. Even if only a proportion of businesses actually end up paying (bearing in mind the likely exemptions, and the likely avoidance/ignorance of some – just like now), £55 is a 57% increase on the current lower fee and, added to the administrative cost of actually making a notification, marks a considerable overall burden on UK business and – indeed – other data controllers.
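A rough back-of-the-envelope check of the figures above, using the tier values floated in the DCMS questionnaire:

```python
businesses = 5_500_000      # approximate number of UK businesses
bottom_tier_fee = 55        # proposed bottom-tier charge in GBP
current_lower_fee = 35      # current lower notification fee in GBP

# Aggregate cost if every business paid only the bottom tier
print(f"£{businesses * bottom_tier_fee:,}")   # £302,500,000 - roughly £300m a year

# Increase of the proposed £55 fee over the current £35 lower fee
print(f"{(bottom_tier_fee - current_lower_fee) / current_lower_fee:.0%}")  # 57%
```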

There is no easy answer to the question of how the ICO’s regulatory functions can effectively be funded, and on one view it makes sense to retain a similar arrangement to the existing one, despite the European legislature having determined it is both ineffective and burdensome. However, it would not be a great surprise to see business interests in the UK lobbying against a domestic measure which is in fact more costly for them than the measures of the European Union the UK is planning to leave.

Jon Baines, is chair of NADPO (www.nadpo.co.uk) and blogs in a personal capacity.

Many of our GDPR workshops are fully booked. We have added a new course on the Data Protection Bill to our programme. 

Councillors, council tax arrears and FOI


Some council chiefs, as well as some councillors, do not like the Freedom of Information Act 2000 (FOI), claiming, amongst other things, that it costs too much and is used to request trivial information. Against this backdrop, how do council FOI officers deal with requests (often from journalists) for the names of councillors who are in arrears or have defaulted on their council tax bills?

Some councils have refused such requests citing the section 40(2) exemption for third party personal data. For this exemption to be engaged a public authority must show that disclosure of the name(s) would breach one of the Data Protection Principles. Most cases in this area focus on the First Principle and so public authorities have to ask: would disclosure be fair and lawful? They also have to justify the disclosure by reference to one of the conditions in Schedule 2 of the DPA (as well as Schedule 3 in the case of sensitive personal data). In the absence of consent, most authorities end up considering whether disclosure is necessary for the applicant to pursue a legitimate interest and, even if it is, whether the disclosure is unwarranted due to the harm caused to the data subject(s) (condition 6 of Schedule 2). Of course when the new General Data Protection Regulation (GDPR) comes into force on 25th May 2018, the disclosure of such data will have to be justified by reference to Article 6 of GDPR.

A 2016 Upper Tribunal decision sheds light on this difficult issue. Haslam v Information Commissioner and Bolton Council [2016] UKUT 0139 (AAC) (10 March 2016) concerned a request by a journalist (Mr Haslam) for disclosure of information about councillors who had received reminders for non-payment of council tax since May 2011. The Council told the Appellant that there were six such councillors and informed him which political party they were members of, how much had been owed, how much was outstanding, and that two had been summoned to court. The Appellant asked for the names of the individual councillors. The Council refused, stating that the names were exempt from disclosure under section 40(2) FOI. The Appellant appealed to the First-tier Tribunal against the decision of the Information Commissioner to uphold the Refusal Notice, in relation to the two councillors who had been summoned to court. The First-tier Tribunal dismissed the appeal. Subsequently one councillor voluntarily identified himself, so only one councillor was at issue before the Upper Tribunal.

The Upper Tribunal allowed the appeal, concluding that releasing the name would not contravene the data protection principles, because the processing was necessary for the purposes of legitimate interests pursued by the Appellant and was not unwarranted by reason of prejudice to the councillor’s rights or legitimate interests. This was a public matter in which the councillor could not have a reasonable expectation of privacy. Judge Markus, in her judgment, said:

“40. But, in the case of a councillor, it is not only a private matter. A councillor is a public official with public responsibilities to which non-payment of council tax is directly and significantly relevant.  A number of specific features of this were advanced in submissions to the First-tier Tribunal.  In particular, section 106 of the Local Government Finance Act 1992 bars a councillor from voting on the Council’s budget if he or she has an outstanding council tax debt of over two months.  If a councillor is present at any meeting at which relevant matters are discussed, he or she must disclose that section 106 applies and may not vote.  Failure to comply is a criminal offence. Thus council tax default strikes at the heart of the performance of a councillor’s functions. It is evident that setting the council’s budget is one of the most important roles undertaken by councillors.  The loss of one vote could make a fundamental difference to the outcome. This adds a significant public dimension to the non-payment of council tax.  The very fact that Parliament has legislated in this way reflects the connection between non-payment and the councillor’s public functions.  Moreover, as the Commissioner observed in his decision notice, recent failure to pay council tax is likely to impact on public perceptions and confidence in a councillor as a public figure.

41. These factors are of critical relevance to expectation. As the Commissioner had observed, those who have taken public office should expect to be subject to a higher degree of scrutiny and that information which impinges on their public office might be disclosed. More specifically, unless the local electorate know the identity of a councillor to whom section 106 applies, they cannot discover that that councillor is failing to fulfil his functions. Nor can they know that the process of declarations under section 106 is being adhered to. In addition the electorate may wish to know whether they can trust a councillor properly to discharge his functions if he stands for office again.”

So there we have it. Councillors can normally expect to have their names disclosed if they default on council tax. However this is not an absolute rule. In the words of Judge Markus (at paragraph 56):

“There may be exceptional cases in which the personal circumstances of a councillor are so compelling that a councillor should be protected from such exposure.”

The Bolton News, where the Appellant works, finally named the councillor who is the subject of this case (Click here if interested). By the way, I may share a name with him but I can assure you that I am up to date with my council tax bill payments!

We will be discussing this and other recent FOI decisions in our forthcoming FOI workshops and webinars.

How would you do on the BCS Certificate in Freedom of Information exam? Have a go at our test.

iPhone -> abcPhone


By Paul Simpkins

First the joke

I had a friend who played in a band. When he got his new smart phone he put all his gigs for the next 12 months into his calendar with an alert set for the day before so he knew when he was needed and he could plan the rest of his life.

A few days later his fellow band members rang him from a venue saying “Where are you, we’re on stage in 3 hours…”.

He looked at his phone and found that almost all the dates he’d typed in hadn’t gone into his calendar. Only 8 were listed. The rest had disappeared. He dashed down to the phone shop and asked why, to which the teenage assistant replied “Sorry mate, you’ve only got an 8 gig phone” [groan…]

But do you really need a phone with massive capacity and hundreds of apps? Do you need two-level security or thumbprint login or the many fancy apps that make your life so complicated (sorry, efficient)?

Is there a market for a simpler smartphone (maybe a dumbphone) that just has 8 key apps built in and no possibility of adding any more? We could call it the 8 app phone to remind us of the old joke. There would only be one home screen so we could call it… the screen.

Many old people don’t use 99% of the functionality of a smartphone. Yes, youngsters are in constant contact with every social media platform that exists and are forever uploading and viewing videos of their friends eating junk food in branded outlets while streaming Spotify tunes, but do we need all this connectivity?

This revolutionary concept crossed my mind this morning. I’d installed an update on my i-phone and instead of getting on with being my faithful companion my phone reverted to Hello Hola mode. All I had to do was set it up again and all my data would mysteriously flow back through the air to fill it up again. The problem was that I couldn’t remember my i-Cloud code as I’d bravely migrated to (see I can speak the lingo) two level authentication a few days ago. The phone wasn’t playing until it had the code. (I know I should have written it down on a yellow post it note but most of my reminders are in Notes on my phone). I also know that apple groupies will now be screaming “you stupid old git” at their screens and I acknowledge that I don’t know the front end of a universal serial bus from the back end but I’m happy in my own way. I just don’t see the point of unasked for updates that add on features I don’t think I’ll ever use. I’m often quite a few updates late and I still don’t know why I accepted this one so readily.

I went on the web and signed in with my Apple ID and it said no problem we’ll send a 6 letter code to your trusted device and you can type it in and you’ll be fine. Unfortunately my trusted device was the phone that the update had turned into a small door stop so the code I needed to unlock it was stopping at the door and not going in.

I rang Apple support and pointed out the problem and they ummed and ahhhed for 30 minutes before deciding that I had to take the SIM out of the phone, put it into another phone, set it up as a clone of my small doorstop, look in the text inbox, retrieve the code I’d been sent, take the SIM out, return it to my small doorstop and type in the code which would make my door stop suddenly metamorphose into a beautiful smartphone and fly off into the sunset.

The local phone shop refused to do it as it might lock the donor phone so I went home to find an old i phone. Soon I had no i-cloud code and 2 locked phones.

Fortunately I also had a macbook and an IT literate partner and for 3 hours we trawled the web, switched off this, switched on that, reset the donor phone and through trying every possible route through the Hello Hola roadblock finally made it work. Then we saw 9 texts each containing a 6 letter unlock code.

With feverish glee we put the SIM back where it belonged and tried to replicate the process. We did at one stage receive an email message saying that someone in Middlesbrough had tried to sign into my account but ignored it as it was so obviously a ruse de guerre. (Heckmondwike yes but Middlesbrough no way…). An hour later we’d made it. It involved changing an Apple ID password and several cups of coffee and a few cookies but we made it. By now darkness had fallen and we were both too tired to actually use the phone.

Back to the brilliant idea. The next development for Apple after the i-phone should (obviously) be the j-phone. The J stands for ‘just a few things on the’ phone. Essentials are phone, text, web, calendar, maps, settings, camera, contacts and nothing else. (There will be a focus group later to decide which 8 are essential.) {We’ll make them big icons while we’re at it.} But let’s make it even simpler and, to save a law suit, just call it the abc-phone. As there’s no video or music or social media it can be produced cheaply and only sold to anyone who can produce a bus pass or a senior rail card (with photo ID – we’re not letting any spotty youngsters in on the secret). There’ll be no real security on the phone – if someone pinches it there will be no value to sell on and the user can just buy another.

Over to you Apple…

Regards

A grumpy old man.

 

Make 2017 the year you achieve a GDPR qualification! See our full day workshops and new GDPR Practitioner Certificate.

 

 

 

image credits: http://www.techradar.com/reviews/phones/mobile-phones/iphone-6-1264565/review

The Right to Data Portability under GDPR


The new General Data Protection Regulation (GDPR) will come into force on 25th May 2018. Whilst it will replace the UK’s Data Protection Act 1998 (DPA), it still includes the right of the Data Subject to receive a copy of his/her data, to rectify any inaccuracies and to object to direct marketing. It also introduces new rights, one of which is the right to Data Portability.

Article 20 of GDPR allows Data Subjects to receive the personal data which they have provided to a Data Controller, in a structured, commonly used and machine-readable format, and to transmit it to another Data Controller. The aim of this right is to support user choice, user control and consumer empowerment. It will have a big impact on all Data Controllers but particularly data-driven organisations such as banks, cloud storage providers, insurance companies and social networking websites. These organisations may find that customers are encouraged to move suppliers, as they will be armed with much more information than they previously had access to. This in turn may lead to an increase in competition, driving down prices and improving services (so the theory goes; we live in hope!).

When the Right Can Be Exercised

Unlike the subject access right, the Data Portability right does not apply to all personal data held by the Data Controller concerning the Data Subject. Firstly, it has to be automated data; paper files are not included. Secondly, the personal data has to be knowingly and actively provided by the Data Subject. This includes, for example, account data (e.g. mailing address, user name, age) submitted via online forms, but also data generated by and collected from the activities of users by virtue of their use of a service or device.

By contrast personal data that are derived or inferred from the data provided by the Data Subject, such as a user profile created by analysis of raw smart metering data or a website search history, are excluded from the scope of the right to Data Portability, since they are not provided by the Data Subject, but created by the Data Controller.

Thirdly the personal data has to be processed by the Data Controller with the Data Subject’s consent or pursuant to a contract with him/her. Therefore personal data processed by local authorities as part of their public functions (e.g. council tax and housing benefit data) will be excluded from the right to Data Portability.

It is important to note that this right does not require Data Controllers to keep personal data for longer than specified in their retention schedules or privacy policies. Nor is there a requirement to start storing data just to be able to comply with a Data Portability request if one is received.

Main elements of Data Portability

Article 20(1) gives a Data Subject two rights:

  1. To receive personal data processed by a Data Controller, and to store it for further personal use on a private device, without transmitting it to another Data Controller.

This is similar to the subject access right. However, here the data has to be received “in a structured, commonly used, machine-readable format”, making it easier to analyse and share (see the sketch at the end of this section). It could be used to receive a playlist from a music streaming service, information about online purchases or leisure pass data from a swimming pool.

  2. To transmit personal data from one Data Controller to another Data Controller “without hindrance”

This provides the ability for Data Subjects not just to obtain and reuse their data, but also to transmit it to another service provider e.g. social networking sites and cloud storage providers etc. It facilitates the ability of data subjects to move, copy or transmit personal data easily. In addition it provides consumer empowerment by preventing “lock-in”.

The right to Data Portability is expected to foster opportunities for innovation and sharing of personal data between Data Controllers in a safe and secure manner, under the control of the data subject.
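Article 20 does not prescribe a particular format, but as a purely illustrative sketch, a controller could satisfy the “structured, commonly used and machine-readable” requirement by exporting the data a user has provided as JSON or CSV, along these lines (the record fields are invented for the example):

```python
import csv
import io
import json
from typing import Dict, List


def export_user_data(records: List[Dict[str, str]], fmt: str = "json") -> str:
    """Export the personal data a Data Subject has provided in a structured,
    commonly used, machine-readable format (JSON or CSV).

    Illustrative only: the fields below are hypothetical examples of data
    actively provided by the user or generated by their use of a service.
    """
    if fmt == "json":
        return json.dumps(records, indent=2, ensure_ascii=False)
    if fmt == "csv":
        fieldnames = sorted({key for record in records for key in record})
        buffer = io.StringIO()
        writer = csv.DictWriter(buffer, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
        return buffer.getvalue()
    raise ValueError(f"Unsupported format: {fmt}")


# Example: a playlist from a music streaming service.
playlist = [
    {"track": "Example Song", "added_on": "2017-01-15"},
    {"track": "Another Tune", "added_on": "2017-02-02"},
]
print(export_user_data(playlist, fmt="json"))
```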

Time Limits

Data Controllers must respond to requests for Data Portability without undue delay, and within one month. This can be extended by two months where the request is complex or a number of requests are received. Data Controllers must inform the individual within one month of receipt of the request and explain why the extension is necessary.

Information is to be provided free of charge, save for some exceptions. Refusals must be explained, as must the Data Subject’s right to complain to the Information Commissioner’s Office (ICO).

Notification Requirements

Data Controllers must inform Data Subjects of the right to Data Portability within their Privacy Notice, as required by Articles 13 and 14 of GDPR. (More on Privacy Notices under GDPR here. See also the ICO’s revised Privacy Notices Code.)

In December 2016, the Article 29 Data Protection Working Party published guidance on Data Portability and a useful FAQ. (Technically these documents are still in draft as comments have been invited until the end of January 2017). It recommends that Data Controllers clearly explain the difference between the types of data that a Data Subject can receive using the portability right or the access right, as well as to provide specific information about the right to Data Portability before any account closure, to enable the Data Subject to retrieve and store his/her personal data.

Subject to technical capabilities, Data controllers should also offer different implementations of the right to Data Portability including a direct download opportunity and allowing Data Subjects to directly transmit the data to another Data Controller.

Impact on the Public Sector 

Local authorities and the wider public sector might be forgiven for thinking that the Data Portability right only applies to private sector organisations which process a lot of personal data based on consent or a contract, e.g. banks, marketing companies, leisure service providers, utilities etc. Major data processing operations in local authorities (e.g. for the purposes of housing benefit, council tax etc.) are based on carrying out public functions or statutory duties and so are excluded. However, a lot of other data operations will still be covered by this right, e.g. data held by personnel, accounts and payroll, leisure services and even social services. An important condition is that the Data Subject must have provided the data.

The Government has confirmed that GDPR is here to stay; well beyond the date when the UK finally leaves the European Union. All Data Controllers need to assess now what impact the right to Data Portability will have on their operations. Policies and Procedures need to be put into place now.

Make 2017 the year you get prepared for the General Data Protection Regulation (GDPR). See our full day workshops and new GDPR Practitioner Certificate.

New Webinar on GDPR and the Right to Data Portability. Register onto the live session or watch the recording.

New Data Sharing Powers in the Digital Economy Bill


Much has been written about the complexities of the current legal regime relating to public sector data sharing. Over the years this blog has covered many stops and starts by the government when attempting to make the law clearer.

The Digital Economy Bill is currently making its way through Parliament. It contains provisions which will give public authorities (including councils) more power to share personal data with each other as well as, in some cases, with the private sector.

The Bill has been a long time coming and is an attempt by the Government to restore some confidence in data sharing after the Care.Data fiasco. It follows a consultation which ended in April with the publication of the responses.

The Bill will give public authorities a legal power to share personal data for four purposes:

  1. To support the well-being of individuals and households. The specific objectives for which information can be disclosed under this power will be set out in Regulations (which can be added to from time to time). The objectives in the draft regulations so far include identifying and supporting troubled families, identifying vulnerable people who may need help re-tuning their televisions after changes to broadcasting bands, and providing direct discounts on energy bills for people living in fuel poverty.
  2. For the purpose of debt collection and fraud prevention. Public authorities will be able to set up regular data sharing arrangements for public sector debt collection and fraud prevention but only after such arrangements have been through a business case and government approval process.
  3. Enabling public authorities to access civil registration data (births, deaths and marriages) (e.g. to prevent the sending of letters to people who have died).
  4. Giving the Office for National Statistics access to detailed administrative government data to improve their statistics.

The new measures are supported by statutory Codes of Practice (currently in draft) which provide detail on auditing and enforcement processes and the limitations on how data may be used, as well as best practice in handling data received or used under the provisions relating to public service delivery, civil registration, debt, fraud, sharing for research purposes and statistics. Security and transparency are key themes in all the codes. Adherence to the 7th Data Protection Principle (under Data Protection Act 1998 (DPA)) and the ICO’s Privacy Notices Code (recently revised) will be essential.

A new criminal offence for unlawful disclosure of personal data is introduced by the Bill. Those found guilty of an offence will face imprisonment for a term up to two years, a fine or both. The prison element will be welcomed by the ICO which has for a while been calling for tougher sentences for people convicted of stealing personal data under the DPA.

The Information Commissioner was consulted over the codes so (hopefully!) there should be no conflict with the ICO Data Sharing Code. The Bill is not without its critics (including Big Brother Watch), many of whom argue that it is too vague and does not properly safeguard individuals’ privacy.

It is also an oversight on the part of the drafters that the Bill does not mention the new General Data Protection Regulation (GDPR), which will come into force on 25th May 2018. This is much more prescriptive in terms of Data Controllers’ obligations, especially on transparency and privacy notices.

These and other Information Sharing developments will be examined in our data protection workshops and forthcoming webinar.

Illustration provided by the Office of the Privacy Commissioner of Canada (www.priv.gc.ca)

DPO or not to DPO: The Data Protection Officer under GDPR


The General Data Protection Regulation (GDPR) is nearly upon us and one of the elements is the requirement for certain organisations to have a Data Protection Officer.

This throws up some interesting issues. A qualified, experienced data protection officer is a valuable commodity. They do exist but command salaries approaching £50,000 in large organisations (stop laughing at the back) and if you’re a small organisation they’re not going to work for you for peanuts. So where do you find a qualified, experienced DPO?

Secondly, will there be a requirement for you to have one? It looks like there will be three clear cases:

  1. processing is carried out by a public authority,
  2. the core activities of the controller or processor consist of processing which, by its nature, scope or purposes, requires regular and systematic monitoring of data subjects on a large scale,
  3. the core activities consist of processing on a large scale of special categories of data.

But to go back to the DPO, what does ‘qualified’ mean? Yes, there are qualifications out there. The accepted gold standard in the UK is the BCS certificate, which involves 40 hours of training plus a testing 3 hour exam. Other firms in the sector offer their own versions and most of them involve significant study (30 or 40 hours) plus an exam. Other qualifications exist, like our GDPR Practitioner Certificate and CIPP certification from the International Association of Privacy Professionals – some for US and some for UK professionals – but the question everyone wants answering is: which qualifications will satisfy the GDPR?

Do training providers have to apply for acceptance or endorsement from the EU or their national regulator? Will the content of these courses be examined, or will a standard be set to which training providers tailor their material, or will it be a free-for-all with no standard to work to? Do you want a DPO who knows how to conduct a Privacy Impact Assessment, or one who knows about International Data Transfers, or one with an understanding of the history of Data Protection? Or will there be a requirement to study a certain (large) number of hours to demonstrate competence? At the moment it looks like all the DPO will need is “sufficient expert knowledge”, which doesn’t in itself mean a qualification.

Other skills required by a good DPO are those of diplomat, trainer, advisor, confidante, interpreter, persuader, listener, friend to requestors, and policy and procedures writer. They need the ability to talk to the top level of the organisation yet explain complex law in plain English. Not your run-of-the-mill person.

It looks like the route map will require the DPO to be an employee, but one with a different type of outlook. Privacy is becoming a big vote winner; organisations who don’t respect customers’ privacy will feel the backlash of disgruntled consumers. It really needs someone who is part of the organisation, who is present at all times and understands the data processing systems of their employer, but who is detached enough to be able to criticise their own organisation.

There is a way out for small organisations who think they need a DPO to ensure they are fully compliant with the new regulation. Don’t give the job to an existing member of staff and expect them to learn it on the job; don’t appoint a knowledgeable, qualified, experienced but expensive DPO – bring in an external one you can use as and when you need them.

Externals have significant benefits. They don’t work full time, so the on-costs disappear; you can bring them in as required for short-term, task-and-finish assignments; you save the costs of training and continuing education for an internal data protection officer; and your staff will react better to an external who appears to have the status of a “consultant”.

Externals also won’t have any political or organisational baggage and can act in an unbiased manner without fear for their job. An external data protection officer also has no worries about favouring certain departments or individuals in the company. Many organisations appoint their Head of Legal as their DPO which brings with it the ethical/legal/best course of action conflict. An external won’t need to bother with this.

You can concentrate on your core business and the external can take care of your data protection.

Once you have appointed an external DPO they will compile a detailed audit of your data protection compliance. They will then identify possible data protection issues and legal risks and explain what is required to remedy them. Then you can start making the necessary changes. Your business will soon be in full compliance with current data protection laws.

But it doesn’t stop there. The external DPO will be on call and can discuss day-to-day DP issues by phone or email for a small fee. If more detailed work is required further fees and timescales can be agreed.

Working with an external data protection officer is based on a consulting agreement. There may be a retainer fee plus an hourly or daily rate on top. If your data protection needs are low, you may not have to consult your external DPO (EDPO) too often.

Not surprisingly, EDPOs are starting to appear on the web. They’re quite common in Germany and it’s likely they will become a staple in the UK. Various UK law firms advertise such a service, but unsurprisingly the rates they charge are not on view. It might end up costing more than you think, especially if you opt for a ‘big’ name.

There’s also scope, however, for sharing a DPO. This has already happened in various parts of the country, as cash strapped rural councils pay for a percentage of a DPO and have them on site for part of the week.

At a recent educational conference a group of 30 schools in the same region kicked around the idea of each contributing towards a single DPO who would fulfil their information law obligations for all of them. It sounds quite a good idea until you realise there are only about 240 working days in a year, so each school would get just eight of those days to itself and the shared DPO would run up a significant petrol expenses tab. A few rural councils sharing a DPO would get a much better deal.
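For what it’s worth, the arithmetic behind that comparison is easy to sketch. The figures below are purely illustrative assumptions (240 working days, 30 schools, a handful of councils) rather than quotes or guidance:

```python
# Illustrative only: rough allocation of a shared DPO's time across member organisations.
WORKING_DAYS_PER_YEAR = 240  # assumed number of working days in a year

def days_per_member(members: int) -> float:
    """Days of DPO time each member organisation gets per year."""
    return WORKING_DAYS_PER_YEAR / members

print(days_per_member(30))  # 30 schools        -> 8.0 days each per year
print(days_per_member(4))   # 4 rural councils  -> 60.0 days each per year
```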

Sadly, GDPR is not well understood and there are those who think Brexit will derail it (it won’t), but a wise organisation should be thinking now about if and when it will need a DPO, what qualifications that person should have and how to find one.

An external who is called on infrequently might appear to be the cheapest option but may come with hidden costs, and a part share of a DPO might be a good short term solution, but would either be as good as the expert knowledge and day to day, hands on work of a full timer?

Good news for Data Protection Officers…

We are running a series of GDPR webinars and workshops, and our team of experts is available to come to your organisation to deliver customised data protection/GDPR workshops as well as to carry out health checks and audits. Our GDPR Practitioner Certificate (GDPR.Cert), with an emphasis on the practical skills required to implement GDPR, is an ideal qualification for those aspiring to such positions.

Privacy Notices under #GDPR: Have you noticed my notice?

Please also read our updated blog on privacy notices here.

As you all know by now, the General Data Protection Regulation (GDPR) is here and it is (as predicted) starting to get various people fired up ready for its 2018 implementation date. (Dear reader, it is still relevant despite the Brexit vote.) We’ve been exploring various aspects of the GDPR, and in this particular blog I want us to look at the concept of privacy notices and what they will need to start looking like under the Regulation.

Data Protection Act 1998:

Under the current Data Protection Act 1998, and indeed the Information Commissioner’s Office Privacy Notices Code of Practice, a privacy notice should appear at any point where personal data is being collected from a Data Subject, especially if it is being collected for a new purpose. In that notice Data Controllers should (at the very least) include the following:

  • The identity of the Organisation in control of the processing;
  • The purpose, or purposes, for which the information will be processed;
  • Any further information necessary, in the specific circumstances, to enable the processing in respect of the individual to be ‘fair’ (in accordance with the 1st Principle).

The requirements also outline that this information must be clear and in ‘plain English’, and your purposes cannot be too vague. The vaguer the purpose, the less likely it is to support a valid consent (or indeed a valid notification if you are not relying on consent).

While privacy notices vary, most of them aren’t much longer than your average paragraph (the paragraph I’ve just written, for example) and that, provided it’s clear, concise and meets your legal grounds for processing, is generally how privacy notices work under the Data Protection Act 1998. Further information on a Controller’s processing is then often set out in terms and conditions, either in the contract paperwork or online.

The New World:

The GDPR builds on the current expectations around privacy notices but expands the requirements, reflecting the widened first principle which now specifically requires controllers to be transparent in their processing.

Article 13 Paragraph 1 (a-f) of the GDPR outlines that the following information should be provided to the data subject at the point of data collection:

(a) the identity and the contact details of the controller and, where applicable, of the controller’s representative;

(b) the contact details of the data protection officer, where applicable;

(c) the purposes of the processing for which the personal data are intended as well as the legal basis for the processing;

(d) where the processing is based on point (f) of Article 6(1), the legitimate interests pursued by the controller or by a third party;

(e) the recipients or categories of recipients of the personal data, if any;

(f) where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available.

Depending on what processing is going on, Article 13 Paragraph 2 (a-f) states that controllers will also need to provide some of the following:

(a) the period for which the personal data will be stored, or if that is not possible, the criteria used to determine that period;

(b) the existence of the right to request from the controller access to and rectification or erasure of personal data or restriction of processing concerning the data subject or to object to processing as well as the right to data portability;

(c) where the processing is based on point (a) of Article 6(1) or point (a) of Article 9(2), the existence of the right to withdraw consent at any time, without affecting the lawfulness of processing based on consent before its withdrawal;

(d) the right to lodge a complaint with a supervisory authority;

(e) whether the provision of personal data is a statutory or contractual requirement, or a requirement necessary to enter into a contract, as well as whether the data subject is obliged to provide the personal data and of the possible consequences of failure to provide such data;

(f) the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.
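If it helps, here is one way of turning that list into a quick self-check while drafting a notice. This is a purely illustrative Python sketch of my own; the labels are shorthand I have invented rather than official terminology, and it is no substitute for reading the Articles themselves.

```python
# A minimal, illustrative checklist of the Article 13(1) and 13(2) information items.
# The keys are my own shorthand labels, not official terminology.
ARTICLE_13_ITEMS = {
    "controller_identity_and_contact": "13(1)(a)",
    "dpo_contact_details": "13(1)(b), where applicable",
    "purposes_and_legal_basis": "13(1)(c)",
    "legitimate_interests_pursued": "13(1)(d), if relying on Article 6(1)(f)",
    "recipients_or_categories_of_recipients": "13(1)(e), if any",
    "international_transfers_and_safeguards": "13(1)(f), where applicable",
    "retention_period_or_criteria": "13(2)(a)",
    "data_subject_rights": "13(2)(b)",
    "right_to_withdraw_consent": "13(2)(c), if relying on consent",
    "right_to_complain_to_supervisory_authority": "13(2)(d)",
    "statutory_or_contractual_requirement": "13(2)(e), where relevant",
    "automated_decision_making_and_profiling": "13(2)(f), where it exists",
}

def missing_items(covered: set) -> dict:
    """Return the checklist items a draft notice has not yet addressed."""
    return {item: provision for item, provision in ARTICLE_13_ITEMS.items()
            if item not in covered}

# Example: a draft notice that so far only names the controller and the purposes.
draft_covers = {"controller_identity_and_contact", "purposes_and_legal_basis"}
for item, provision in missing_items(draft_covers).items():
    print(f"Still to cover: {item} ({provision})")
```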

Now if you are engaging in some quite complicated processing, like in the insurance industry for example, your new notices under GDPR are going to need to strike a balance between giving ‘too much information’ and being so simple and high level that they don’t actually meet your transparency requirements or demonstrate effective notice or consent.

Article 13 Paragraph 3 also outlines that should a controller seek to process personal data for a purpose other than that for which it was collected, the controller shall provide the data subject (prior to that further processing commencing) with information on that other purpose and any other relevant information from paragraph 2.

I’ve attempted to ‘mock up’ what one of these new notices could look like. It is very much an imaginary one, but if we assume that a controller is processing Personal Data for fairly simple purposes their notice may look something like this:

Your Personal Data:

What we need

A Notice Ltd will be what’s known as the ‘Controller’ of the personal data you provide to us. We only collect basic personal data about you, which does not include any special types of information or location based information. It does, however, include your name, address, email address etc.

Why we need it

We need to know your basic personal data in order to provide you with notice writing and analysis services in line with this overall contract. We will not collect any personal data from you that we do not need in order to provide and oversee this service to you.

What we do with it

All the personal data we process is processed by our staff in the UK; however, for the purposes of IT hosting and maintenance this information is located on servers within the European Union. No third parties have access to your personal data unless the law allows them to do so.

We have a Data Protection regime in place to oversee the effective and secure processing of your personal data. More information on this framework can be found on our website.

How long we keep it

We are required under UK tax law to keep your basic personal data (name, address, contact details) for a minimum of 6 years, after which time it will be destroyed. The information we use for marketing purposes will be kept with us until you notify us that you no longer wish to receive it. More information on our retention schedule can be found online.

What we would also like to do with it

We would, however, like to use your name and email address to inform you of our future offers and similar products. This information is not shared with third parties and you can unsubscribe at any time via phone, email or our website. Please indicate below if this is something you would like to sign up to.

Please sign me up to receive details about future offers from A Notice Ltd.

What are your rights?

If at any point you believe the information we process on you is incorrect, you can request to see this information and even have it corrected or deleted. If you wish to raise a complaint about how we have handled your personal data, you can contact our Data Protection Officer, who will investigate the matter.

If you are not satisfied with our response, or believe we are not processing your personal data in accordance with the law, you can complain to the Information Commissioner’s Office (ICO).

Our Data Protection Officer is Notice McNoticeface and you can contact them at mypersonaldata@anotice.com.

This example works on the assumption of a simple data processing arrangement. The more complex your data processing, the more complex that notice and consent capture will need to be. But it must remain comprehensible to the average consumer and cannot be a work of legalese brilliance that makes no sense to those not trained in law.

I suspect that notices will be allowed to use ‘outlines of categories’ of types of processing and third parties; however, we shall see how big these categories can be. After all, the bigger the ‘bucket’, the less you are actually giving a robust ‘informed’ notice to a data subject.

In addition to all of this, Article 14 states that should you obtain Personal Data by means other than directly from the Data Subject themselves, you also need to provide a notification to them (with some exceptions):

(a) within a reasonable period after obtaining the personal data, but at the latest within one month, having regard to the specific circumstances in which the personal data are processed;

(b) if the personal data are to be used for communication with the data subject, at the latest at the time of the first communication to that data subject; or

(c) if a disclosure to another recipient is envisaged, at the latest when the personal data are first disclosed.

The requirement is to provide them with very similar information to that which you would provide if you had collected the data directly. How you do this will be a matter of some discussion to come, but excluding the exceptions outlined in Article 14(5)(a–d), if you aren’t collecting directly you will now need to take steps to advise and ‘notify’ the Data Subject of what you are up to.
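To see how those timing limbs interact, the sketch below (purely illustrative and of my own devising; the dates are made up and ‘one month’ is approximated as 30 days) treats the notification as due by the earliest trigger that applies, with the one-month limit acting as the backstop:

```python
# Illustrative sketch of the Article 14(3) timing rules: the notification is due by
# the earliest applicable trigger, with "one month" (approximated here as 30 days)
# after obtaining the data acting as the backstop. Dates are invented for the example.
from datetime import date, timedelta
from typing import Optional

def article_14_deadline(obtained: date,
                        first_communication: Optional[date] = None,
                        first_disclosure: Optional[date] = None) -> date:
    """Latest date by which the data subject should be informed."""
    candidates = [obtained + timedelta(days=30)]   # (a) one-month backstop
    if first_communication:                        # (b) data used to communicate with them
        candidates.append(first_communication)
    if first_disclosure:                           # (c) disclosure to another recipient
        candidates.append(first_disclosure)
    return min(candidates)

# Example: data obtained on 1 June, first marketing email planned for 10 June.
print(article_14_deadline(date(2018, 6, 1), first_communication=date(2018, 6, 10)))
# -> 2018-06-10 (the first communication falls before the one-month backstop)
```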

Now that is quite a long list of things to notify a data subject of, especially if you are delivering various services to the data subject (and collecting data on them) via various means. But the Regulation does say (in Article 13(4), with an equivalent exception in Article 14(5)(a)) that all of the above shall not apply where the data subject already has the information. So, for example, if a customer is simply renewing a service and nothing about the provision of that service (the processing) has changed, then there is no obvious requirement to re-issue the original notice at the point of renewal.

We will delve into the concept of consent at another time (very soon), but the requirement to be transparent, together with the requirement to ensure you have a clear and documented consent, means that privacy notices are going to have to become more than just a long legal document – though perhaps not that far away from what we are doing today (assuming we are doing them correctly, that is).

More on privacy notices here.

Scott Sammons CIPP/E, AMIRMS is an experienced Data Protection & Information Risk practitioner and a consultant with Act Now Training.

If you need to raise awareness about GDPR, our GDPR e-learning course is ideal for frontline staff. Advice and guidance on GDPR is available through our GDPR helpline.

GDPR Practitioner Certificate – A 4 day certificated course aimed at those undertaking the role of Data Protection Officer under GDPR whether in the public or the private sector.
