Act Now Partners with Middlesex University Dubai for UAE’s First Executive Certificate in DP Law

Act Now Training, in collaboration with Middlesex University Dubai, is excited to announce the launch of the UAE’s first Data Protection Executive training programme. This qualification is ideal as a foundation for businesses and organisations aiming to comply with the UAE Federal Data Protection Law.

This practical course focuses on developing a data protection framework and ensuring compliance with the UAE Data Protection Law’s strict requirements. It is particularly relevant given the recent advancements in data protection law in the Middle East, including the UAE’s first comprehensive national data protection law, Federal Decree-Law No. 45/2021.

This law regulates personal data processing, emphasising transparency, accountability, and data subject rights. It applies to organisations processing personal data within the UAE, as well as to organisations abroad that process the personal data of UAE residents.

The importance of understanding this law is paramount for every business and organisation, as it necessitates a thorough reassessment of personal data handling practices. Non-compliance can lead to severe penalties and reputational damage.

The Executive Certificate in UAE DP Law is a practical qualification delivered over five weeks, in two half-day sessions per week, and offers numerous benefits:

  1. Expertise in Cutting-Edge Legislation: Gain in-depth knowledge of the UAE’s data protection law, essential for professionals at the forefront of data protection practices.

  2. Professional Development: This knowledge enhances your resume, especially for roles in compliance, legal, and IT sectors, showing a commitment to legal reforms.

  3. Practical Application: The course’s structured format allows gradual learning and practical application of complex legal concepts, ensuring a deep understanding of the law.

  4. Risk Mitigation: Understanding the law aids in helping organisations avoid penalties and reputational harm due to non-compliance.

  5. Networking Opportunities: The course provides valuable connections in the field of data protection and law.

  6. Empowerment of Data Subjects: Delegates gain insights into their rights as data subjects, empowering them to protect their personal data effectively.

Delegates will receive extensive support, including expert instruction, comprehensive materials, interactive sessions, practical exercises, group collaboration, ongoing assessment, and additional resources for further learning. Personal tutor support is also provided throughout the course.

This programme is highly recommended for officers in organisations both inside and outside the UAE that conduct business in the region or have customers, agents, and employees there.

Act Now has designed the curriculum and will be delivering the course. Act Now Training is the UK’s premier provider of information governance training and consultancy, serving government organisations, multinational corporations, financial institutions, and corporate law firms.

Act Now has a history of delivering practical, high-quality training since 2002. Its skills-based training approach has led to numerous awards, most recently the Supplier of the Year Award 2022-23 from the Information and Records Management Society in the UK.

Our associates have decades of hands-on global information governance experience and are able to break down this complex area with real-world examples, making it easy to understand, apply and even fun!

Middlesex University Dubai is a 5-star KHDA-rated university and one of the institution’s three global campuses, alongside London and Mauritius. It is the largest UK university in the UAE, with over 5,000 students from more than 120 nationalities.

For more information and to register your interest, visit Middlesex University Dubai’s website.

CJEU’s FT v. DW Ruling: Navigating Data Subject Access Requests 

In the landmark case FT v. DW (Case C-307/22), the Court of Justice of the European Union (CJEU) delivered a ruling that sheds light on the intricacies of data subject access requests under the EU General Data Protection Regulation (GDPR). The dispute began when DW, a patient, sought an initial complimentary copy of their dental medical records from FT, a dentist, citing concerns about possible malpractice. FT declined the request, relying on German law, which requires patients to pay for copies of their medical records. The ensuing legal tussle ascended through the German courts, eventually reaching the CJEU, which had to consider three pivotal questions. These are detailed below.

Question 1: The Right to a Free Copy of Personal Data 

The first question was whether the GDPR requires healthcare providers to give patients a cost-free copy of their personal data, irrespective of the request’s motive, which in DW’s case appeared to be potential litigation. Examining Articles 12(5) and 15(3) of the GDPR, together with Recital 63, the CJEU concluded that the regulation does indeed stipulate that the first copy of personal data should be free and that individuals need not disclose their reasons for such requests, highlighting the GDPR’s overarching principle of transparency.

Question 2: Economic Considerations Versus Rights under the GDPR 

The second matter concerned the intersection of the GDPR with pre-existing national laws that might impinge upon the economic interests of data controllers, such as healthcare providers. The CJEU assessed whether Article 23(1)(i) of the GDPR could uphold a national rule that imposes a fee for the first copy of personal data. The court found that while Article 23(1)(i) can apply to laws pre-dating the GDPR, it does not justify charging for the first copy of personal data, thus prioritising the rights of individuals over the economic interests of data controllers.

Question 3: Extent of Access to Medical Records 

The final issue addressed the extent of access to personal data, particularly whether it encompasses the entire medical record or merely a summary. The CJEU clarified that according to Article 15(3) of the GDPR, a “copy” entails a complete and accurate representation of the personal data, not merely a physical document or an abridged version. This means that a patient is entitled to access the full spectrum of their personal data within their medical records, ensuring they can fully verify and understand their information. 

Conclusion 

The CJEU’s decision in FT v. DW reaffirms the GDPR’s dedication to data subject rights and offers a helpful interpretation of the regulation. It confirms the right of individuals to a free first copy of their personal data for any purpose, rejects the imposition of fees by national law for such access, and establishes the right to a comprehensive reproduction of personal data contained within medical records. The judgment also makes clear that the data provided must be complete, even where the term ‘copy’ is used, and must be contextual and intelligible, as required by Article 12(1) of the GDPR.

We will be examining the impact of this on our upcoming Handling SARs course as well as looking at the ruling in our GDPR Update course. Places are limited so book early to avoid disappointment.

Council Loses High Court Damages Claim for Misuse of Personal Data 

A recent High Court judgment highlights the importance of data controllers treating personal data in their possession with care and in accordance with their obligations under the General Data Protection Regulation (GDPR). Failure to do so will also expose them to a claim in the tort of misuse of private information.

The Facts

In Yae Bekoe v London Borough of Islington [2023] EWHC 1668 (KB) the claimant, Mr Bekoe, had an informal arrangement with his neighbour to manage and rent out flats on her behalf, with the income intended to support her care needs. In 2015, Islington Council initiated possession proceedings against Mr Bekoe. During the proceedings, the council submitted evidence to the court, including details of Mr Bekoe’s bank accounts, mortgage accounts, and balances, providing a snapshot of his financial affairs at that time. Some of this information, it appears, was held internally by the council and disclosed by one department to another for the purpose of “fraud”, whilst other information was obtained after a court application for disclosure against the bank and Mr Bekoe.

Subsequently, Mr Bekoe filed a claim against Islington Council, alleging the misuse of his private information and a breach of the GDPR. Amongst other things, he argued that the council obtained his private information without any legal basis. Mr Bekoe also claimed that the council failed to comply with its obligations under the GDPR in responding to his Subject Access Request (SAR); he made the request at the start of the legal proceedings, but the council’s response was delayed. Mr Bekoe further claimed that the council was responsible for additional GDPR infringements, including failing to disclose further data and destroying his personal data in the form of the legal file relating to ongoing proceedings.

The Judgement

The judge awarded Mr Bekoe damages of £6,000, taking into account the misuse of private information, the loss of control over that information, and the distress caused by the breaches of the GDPR. He ruled that the information accessed went beyond what was necessary to demonstrate property-related payments. Regarding the breaches of the GDPR, the judge concluded that: 

  • The council significantly breached the GDPR by delaying the effective response to the subject access request for almost four years. 
  • There was additional personal data belonging to Mr. Bekoe held by the council that had not been disclosed, constituting a breach of the GDPR. 
  • While the specifics of the lost or destroyed legal file were unclear, there was a clear failure to provide adequate security for Mr. Bekoe’s personal data, breaching the GDPR. 
  • Considering the inadequate response to the subject access request, the loss or destruction of the legal file, and the failure to ensure adequate security for further personal data, the council breached Mr. Bekoe’s GDPR rights under Articles 5 (data protection principles), 12 (transparency), and 15 (right of access). 

The Lessons

Whilst this High Court decision is highly fact-specific and not binding on other courts, it does demonstrate the importance of ensuring there is a sound legal basis for accessing personal data and of properly responding to subject access requests. Not only do individuals have the right to seek compensation for breaches of the UK GDPR, including failures to respond to subject access requests, but the Information Commissioner’s Office (ICO) can also take regulatory action, which may include issuing reprimands or fines. Indeed, last September the ICO announced it was taking action against seven organisations for delays in dealing with Subject Access Requests (SARs). These included government departments, local authorities, and a communications company. 

This and other GDPR developments will be discussed in our forthcoming GDPR Update workshop. 

The TikTok GDPR Fine

In recent months, TikTok has been accused of aggressive data harvesting and poor security issues. A number of governments have now taken a view that the video sharing platform represents an unacceptable risk that enables Chinese government surveillance. In March, UK government ministers were banned from using the TikTok app on their work phones. The United States, Canada, Belgium and India have all adopted similar measures. 

On 4th April 2023, the Information Commissioner’s Office (ICO) issued a £12.7 million fine to TikTok for a number of breaches of the UK General Data Protection Regulation (UK GDPR), including failing to use children’s personal data lawfully. This follows a Notice of Intent issued in September 2022.

Article 8(1) of the UK GDPR sets out the general rule that when a Data Controller offers an “information society service” (e.g. social media apps and gaming sites) directly to a child, and is relying on consent as its lawful basis for processing, only a child aged 13 or over is able to provide their own consent. For a child under 13, the Data Controller must seek consent from whoever holds parental responsibility. Article 8(2) further states:

“The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.”

In issuing the fine, the ICO said TikTok had failed to comply with Article 8 even though it ought to have been aware that under 13s were using its platform. It also failed to carry out adequate checks to identify and remove underage children from its platform. The ICO estimates up to 1.4 million UK children under 13 were allowed to use the platform in 2020, despite TikTok’s own rules not allowing children of that age to create an account.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately. John Edwards, the Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

In addition to Article 8 the ICO found that, between May 2018 and July 2020, TikTok breached the following provisions of the UK GDPR:

  • Articles 13 and 14 (Privacy Notices) – Failing to provide proper information to people using the platform about how their data is collected, used, and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
  • Article 5(1)(a) (The First DP Principle) – Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

Notice of Intent

It is noticeable that this fine is less than half the amount (£27 million) set out in the Notice of Intent. The ICO said that it had taken into consideration the representations from TikTok and decided not to pursue its provisional finding relating to the unlawful use of Special Category Data. Consequently, this potential infringement was not included in the final amount of the fine.

We have been here before! In 2019 British Airways was issued with a Notice of Intent in the sum of £183 million, but the actual fine in October 2020 was £20 million. Marriott International Inc was fined £18.4 million in 2020; much lower than the £99 million set out in the original notice. Some commentators have argued that the fact that fines are often substantially reduced (from the notice to the final amount) suggests the ICO’s methodology is flawed.

An Appeal?

In a statement, a TikTok spokesperson said: 

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

We suspect TikTok will appeal the fine and put pressure on the ICO to think about whether it has the appetite for a costly appeal process. The ICO’s record in such cases is not great. In 2021 it fined the Cabinet Office £500,000 for disclosing postal addresses of the 2020 New Year Honours recipients. The Cabinet Office appealed against the amount of the fine arguing it was “wholly disproportionate”. A year later, the ICO agreed to a reduction to £50,000. Recently an appeal against the ICO’s fine of £1.35 million issued to Easylife Ltd was withdrawn, after the parties reached an agreement whereby the amount of the fine was reduced to £250,000.

The Children’s Code

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code. This is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. In September, whilst marking the Code’s anniversary, the ICO said:

“Organisations providing online services and products likely to be accessed by children must abide by the code or face tough sanctions. The ICO are currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. We have also audited nine organisations and are currently assessing their outcomes.”

With increasing concern about security and data handling practices across the tech sector (see the recent fines imposed by Ireland’s Data Protection Commission on Meta), it is likely that more ICO regulatory action will follow. 

This and other GDPR developments will be discussed in detail on our forthcoming GDPR Update workshop.  

Spring Offer: Get 10% off on all day courses and special discounts on GDPR certificates

The Facebook Data Breach Fine Explained

On 24th October the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine that she could under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal the fine might seem small beer for an organisation estimated to be worth hundreds of billions of US dollars. Had the same facts played out after 25th May 2018, the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.

The Facts

In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company, Global Science Research (GSR), to operate the App in conjunction with FB from November 2013 to May 2015. The App was designed to, and did, obtain a significant amount of personal information from any FB user who used it, including:

  • Their public FB profile, date of birth and current city
  • Photographs they were tagged in
  • Pages they liked
  • Posts on their timeline and their news feed posts
  • Friends list
  • Facebook messages (there was evidence to suggest the app also accessed the content of the messages)

The App was also designed to, and was able to, obtain extensive personal data from the FB friends of the App’s users and from anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, nor did they give their consent.

The App was able to use the information that it collected about users, their friends and people who had messaged them to generate personality profiles. The information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controlled the now infamous Cambridge Analytica).

In May 2014 Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and the friends of users that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.

Breach of the DPA

The Commissioner’s findings about the breach make sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply or ensure compliance with their own FB Platform Policy, and were not aware of this fact until exposed by the Guardian newspaper in December 2015.

The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:

  1. Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of the App users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for processing.
  2. Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised: it was inconsistent with the basis on which FB allowed Dr Kogan to obtain access to personal data for which they were the data controller, and it breached the Platform Policy and the Undertaking. The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised and unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles, and they failed to take reasonable steps to prevent such a contravention.

Breach of FB Platform Policy

Although the FB companies operated a FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they did not check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact, they failed to implement a system to carry out such a review. The use of the App was also found to have breached the policy in a number of respects, specifically:

  • Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead, Dr Kogan and GSR were able to use it for their own purposes.
  • Personal data collected by the App should not have been sold or transferred to third parties. Dr Kogan and GSR had transferred the data to three companies.
  • The App required permission from users to obtain personal data that it did not need, in breach of the policy.

The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. Perhaps one of the worst indictments, however, is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access to the Facebook Login. And the rest, as they say, is history.

Joint Data Controllers

The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.

The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.

The Use of Data Analytics for Political Purposes

The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies is likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned, and the Commissioner was unable, on the basis of the information before her, to determine whether FB was correct. Nevertheless, she concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position to decide how to use the personal data of UK residents, and who to share it with.

As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.

Susan Wolf will be delivering these upcoming workshops, as well as the forthcoming FOI: Contracts and Commercial Confidentiality workshop taking place on 10th December in London. 

Our 2019 calendar is now live. We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. 

Need to prepare for a DPO/DP Lead role? Train with Act Now on our hugely popular GDPR Practitioner Certificate.


Decision: Facebook Fan Page Administrators are Data Controllers

By Susan Wolf

On 5th June 2018 the Court of Justice of the European Union (CJEU) delivered its long-awaited Facebook fan page decision. The case concerned the definition of data controller under the now repealed Data Protection Directive 95/46/EC [1] and in particular whether the administrator of a Facebook fan page was a data controller.

The fact that the Data Protection Directive has been replaced by the GDPR 2016 should not diminish the importance of this ruling, particularly for organisations that use Facebook or other social media platforms to promote their business or organisation.

We explain some of the issues raised in the case and consider the implications of the ruling for administrators of Facebook fan pages under the GDPR.

The case

The case involved Wirtschaftsakademie Schleswig-Holstein GmbH, a private training academy in Germany. The company provided business training for commerce and industry (including GDPR training).  It operated a Facebook fan page to make people aware of its range of services and activities.

Fan pages are user accounts that can be set up on Facebook by individuals or businesses. According to Facebook, a fan page is a place where businesses can create a space on Facebook, to connect with people to tell them about their business.  Fan pages are not the same as Facebook profiles, which are limited purely for individuals’ personal use. Unlike a personal Facebook profile, a Fan page is accessible to anyone using the Internet.

Authors of fan pages must register with Facebook in order to use the online platform to post any kind of communication. At that time, fan page administrators could obtain, from Facebook, anonymous statistical information on visitors to the fan page, via a function called ‘Facebook Insights’. That information was collected by means of ‘cookies’, each containing a unique user code, which remained active for two years and were stored by Facebook on the hard disk of the computer or on other media of visitors to fan pages. The user code, which could be matched with the connection data of users registered on Facebook, was collected and processed when the fan pages were opened.

The service, which was provided free of charge on non-negotiable terms, was no doubt very useful to the German training academy. Unfortunately, neither the Wirtschaftsakademie nor Facebook Ireland notified anybody ‘visiting’ the fan page about the use of the cookies or the subsequent processing of the personal data.

The German Data Protection Supervisory Authority for the Schleswig-Holstein Land (region) took the view that, by setting up its fan page, the Wirtschaftsakademie had made an active and deliberate contribution to the collection by Facebook of personal data relating to visitors to the fan page, from which it profited by means of the statistics provided to it by Facebook. The regulator concluded (in November 2011) that the Wirtschaftsakademie was a data controller and consequently ordered it to deactivate its fan page, threatening a penalty payment if the page was not removed.

The Wirtschaftsakademie challenged this decision before the German Administrative Court. Its main argument was that it was not responsible under data protection law for the processing of the data by Facebook or for the cookies that Facebook installed, and that it had not commissioned Facebook to process personal data on its behalf. This argument was successful before the administrative court. However, the regulator appealed, and what followed was lengthy, protracted litigation in the German courts. By 2016 the case had reached the Federal Administrative Court, which also agreed that the Wirtschaftsakademie was not responsible for the data processing as defined by Article 2(d) of the Data Protection Directive:

  • (d) ‘controller’ shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data.

Article 4 of the GDPR defines data controller in identical terms.

However, the Federal Court also decided that it was necessary to refer the question to the CJEU under the preliminary ruling procedure, particularly since the CJEU had previously ruled [2] that the concept of data controller should be given a broad interpretation in the interests of the effective protection of the right to privacy.

The CJEU Ruling

The CJEU had no difficulty in concluding that Facebook Inc. and Facebook Ireland were data controllers because they determined the purposes and means of processing the personal data of Facebook users and anyone visiting fan pages hosted on Facebook. However, the Court recalled that the definition includes entities that ‘alone or jointly with others’ determine the purposes and means of data processing. In other words, the purposes may be determined by more than one controller, and may be determined by ‘several actors taking part in the processing’, with each being subject to the provisions of the Directive.

On the facts, the Court considered that the administrator of a Facebook fan page:

  • Enters into a contract with Facebook Ireland and subscribes to the conditions of use, including the use of cookies.
  • Is able to define the parameters of the fan page, which has an influence on the processing of personal data for the purposes of producing statistics based on visits to the fan page.
  • Could, with the help of filters made available by Facebook, define the criteria for statistical analysis of data.
  • Could designate the categories of persons whose personal data is to be made use of by Facebook.
  • Can ask Facebook for demographic data relating to its target audience, including age, sex, relationship and occupation, lifestyle and purchasing habits.

These factors pointed to the fact that the administrator of a fan page hosted on Facebook takes part in the determination of the purposes and means of processing the personal data of visitors to the fan page. Consequently the administrator of the fan page is to be regarded as a data controller, jointly with Facebook Ireland.

The Court rejected arguments that the Wirtschaftsakademie only received the statistical data in anonymised form because the fact remained that the statistics were based on the collection, by cookies, of the personal data of visitors to the fan page.

The fact that the fan page administrator uses the platform provided by Facebook does not exempt it from compliance with the Directive. The Court also added that non-Facebook users may visit a fan page; the administrator’s responsibility for the processing of their personal data therefore appears even greater, as the mere consultation of the page automatically starts the processing of personal data.

[1] Case C-210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH

[2] Case C-212/13 František Ryneš v Úřad pro ochranu osobních údajů


We are running GDPR and DPA 2018 workshops throughout the UK. Head over to our website to book your place now. New Dates added for London!

Need to train frontline staff quickly? Try our extremely popular GDPR e-learning course.

Don’t forget about our GDPR Helpline; it’s a great tool for getting some advice when you really need it.

GDPR: Notification and the future of ICO Charges


By Jon Baines

Data Protection law has, since 1984 in the UK (with the first Data Protection Act), and since 1995 across Europe (with the Data Protection Directive), contained a general obligation on those who process personal data to notify the fact to the relevant supervisory authority (the Information Commissioner’s Office, or “ICO”, in the UK) and pay a fee for doing so. For many organisations it has in effect meant the payment of an annual fee in order to deal with people’s personal data.

Currently, in the UK, under the Data Protection Act 1998 (DPA), data controllers (those organisations who determine the purposes for which and the manner in which personal data are processed) pay either £35 or £500, according to their size (data controllers whose annual turnover is £25.9m or more and who have more than 249 staff must, in general, pay the larger amount). There are various exemptions to the general obligation, for instance for some controllers who are not-for-profit and for those who process personal data only for staff administration (including payroll), or advertising, marketing and public relations (in connection with their own business activity), or for accounts and records.
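The current two-tier structure can be expressed as a simple rule. The sketch below is purely illustrative of the fee logic described above; the various exemptions (e.g. for some not-for-profits) are ignored:

```python
def dpa_notification_fee(annual_turnover: float, staff: int) -> int:
    """Two-tier DPA 1998 notification fee, per the rule described above.

    Controllers with annual turnover of £25.9m or more AND more than
    249 staff must, in general, pay £500; everyone else pays £35.
    Exemptions from notification are not modelled here.
    """
    if annual_turnover >= 25_900_000 and staff > 249:
        return 500
    return 35

print(dpa_notification_fee(30_000_000, 300))  # 500
print(dpa_notification_fee(1_000_000, 12))    # 35
```

Note that both conditions must be met for the higher fee: a high-turnover controller with 100 staff still pays £35.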

Failure by a controller to make a notification, unless it has an exemption, is a criminal offence under sections 17 and 21 of the DPA, punishable by a fine. However, only one successful prosecution appears to have been brought by the ICO in the last calendar year – a surprisingly low figure, given that, anecdotally, the author is aware of large numbers of controllers failing to make a notification when they should do so.

The General Data Protection Regulation (GDPR) does away with what has often been seen as a fragmented and burdensome notification requirement, substituting for it, at least in part, an accountability principle, under which relevant organisations (“data controllers”) will have to keep internal records of processing activities. As far back as 1997 the Article 29 Working Party, representing data protection authorities across the EU, recognised that excessively bureaucratic requirements in relation to notification not only represent a burden for business but undermine the whole rationale of notification by becoming an excessive burden for the data protection authorities.

And in its impact assessment in 2012, when the GDPR was first proposed, the European Commission explained some of the reasoning behind the removal of the requirement:

“[Notification] imposes costs and cumbersome procedures on business, without delivering any clear corresponding benefit in terms of data protection. All economic stakeholders have confirmed…that the current notification regime is unnecessarily bureaucratic and costly. [Data protection authorities] themselves agree on the need to revise and simplify the current system.”

However, in the UK at least the removal under the GDPR of notification fees would have had a catastrophic effect on the ICO’s existence, because, at the moment, all of the funding for its data protection work comes from fees income – almost £24m last year.

To address this impending shortfall, the government has taken powers (in two pieces of legislation: first the Digital Economy Act and now the recent Data Protection Bill (DP Bill); presumably the former will fall away given the introduction of the latter) to make regulations creating a domestic scheme for data protection fees. The explanatory notes to the Data Protection Bill state that:

“[Clause 132] provides the Secretary of State with a power to make regulations requiring data controllers to pay a charge to the Commissioner. Those regulations may provide for different charges in different cases and for a discounted charge. In setting the charge the Secretary of State will take into account the desirability of offsetting the amount needed to fund the Commissioner’s data protection and privacy and electronic communications regulatory functions. It also provides that the Secretary of State may make regulations requiring a controller to provide information to the Commissioner to help the Commissioner identify the correct charge.”

A clue as to how the charges might be set has now been provided by means of a questionnaire, sent on behalf of the Department for Digital, Culture, Media and Sport (DCMS) to 300 lucky data controllers, seeking their views on what the fee structure might be. There is nothing on the DCMS, or ICO, website about this, so it’s not clear if it takes the form of a consultation, or, more likely, a scoping exercise. But what it appears to be putting forward for consideration is a three-tier scheme, under which data controllers would pay £55, £80 or £1000, based on the size of the data controller and the number of “customer records” it handles.
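Taken at face value, the suggested scheme might look something like the following sketch. The £55/£80/£1,000 amounts are those reported from the questionnaire, but the staff and customer-record thresholds here are invented placeholders, since the actual banding criteria have not been published:

```python
def suggested_fee(staff_count: int, customer_records: int) -> int:
    """Illustrative sketch of the mooted three-tier fee scheme.

    Only the tier amounts (£55, £80, £1,000) come from the DCMS
    questionnaire described above; the staff and customer-record
    thresholds below are hypothetical placeholders.
    """
    if staff_count >= 250 or customer_records >= 1_000_000:
        return 1000  # top tier: large controllers
    if staff_count >= 10 or customer_records >= 10_000:
        return 80    # middle tier
    return 55        # bottom tier: small controllers

print(suggested_fee(5, 2_000))  # 55
```

Whatever the real banding turns out to be, the structure, basing the charge on controller size and volume of customer records, echoes the current turnover-and-staff test.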

As drafted, the questionnaire doesn’t propose any exemptions. One assumes that these would follow, but even so, the proposal to levy a fee for data protection on business, at a time when the European legislature has removed it, must raise questions about how business-friendly this particular piece of law-making will be.

Additionally, it is not clear what the sanction for non-compliance, and what the enforcement regime, would be. As indicated above, the current criminal sanction does not appear to have prevented any number of data controllers from avoiding their legal obligations, with apparent impunity. One presumes, though, that enforcement would be left as a function of the ICO, and, given that Commissioner Elizabeth Denham has said on various occasions that her office needs to grow to cope with the demands of GDPR, it is to be supposed that she will aim to be strict on this matter.

There are estimated to be approximately 5.5 million businesses in the UK. If each of those paid only the bottom tier under the suggested fees structure, this would equate to a potential cost to business of about £300m per annum. Even if only a proportion of businesses actually end up paying (bearing in mind the likely exemptions, and the likely avoidance/ignorance of some – just like now), £55 is a 57% increase on the current lower fee and, added to the administrative costs of actually making a notification, marks a considerable overall burden on UK business and – indeed – other data controllers.

There is no easy answer to the question of how the ICO’s regulatory functions can effectively be funded, and on one view it makes sense to retain a similar arrangement to the existing one, despite the European legislature having determined it is both ineffective and burdensome. However, it would not be a great surprise to see business interests in the UK lobbying against a domestic measure which is in fact more costly for them than the measures of the European Union the UK is planning to leave.

Jon Baines, is chair of NADPO (www.nadpo.co.uk) and blogs in a personal capacity.

Many of our GDPR workshops are fully booked. We have added a new course on the Data Protection Bill to our programme. 

Councillors, council tax arrears and FOI


Some council chiefs, as well as some councillors, do not like the Freedom of Information Act 2000 (FOI), claiming, amongst other things, that it costs too much and is used to request trivial information. Against this backdrop, how do council FOI officers deal with requests (often from journalists) for the names of councillors who are in arrears or have defaulted on their council tax bills?

Some councils have refused such requests citing the section 40(2) exemption for third party personal data. For this exemption to be engaged, a public authority must show that disclosure of the name(s) would breach one of the Data Protection Principles. Most cases in this area focus on the First Principle, and so public authorities have to ask: would disclosure be fair and lawful? They also have to justify the disclosure by reference to one of the conditions in Schedule 2 of the DPA (as well as Schedule 3 in the case of sensitive personal data). In the absence of consent, most authorities end up considering whether disclosure is necessary for the applicant to pursue a legitimate interest and, even if it is, whether the disclosure is unwarranted due to the harm caused to the subject(s) (condition 6 of Schedule 2). Of course, when the new General Data Protection Regulation (GDPR) comes into force on 25th May 2018, the disclosure of the data will have to be justified by reference to Article 6 of GDPR.

A 2016 Upper Tribunal decision sheds light on this difficult issue. Haslam v Information Commissioner and Bolton Council [2016] UKUT 0139 (AAC) (10 March 2016) concerned a request by a journalist (Mr Haslam) for disclosure of information about councillors who had received reminders for non-payment of council tax since May 2011.  The Council told the appellant that there were six such councillors and informed him which political party they were members of, how much had been owed, how much was outstanding, and that two had been summoned to court.  The Appellant asked for the names of the individual councillors.  The Council refused stating that the names were exempt from disclosure under section 40(2) FOI.  The Appellant appealed to the First-tier Tribunal, against the decision of the Information Commissioner to uphold the Refusal Notice, in relation to the two councillors who had been summoned to court. The First-tier Tribunal dismissed the appeal.  Subsequently one councillor voluntarily identified himself, so that there was only an issue regarding one councillor before the Upper Tribunal.

The Upper Tribunal allowed the appeal, concluding that releasing the name would not contravene the data protection principles, because processing was necessary for the purposes of legitimate interests pursued by the Appellant, and was not unwarranted because of prejudice to the councillor’s rights/legitimate interests. This was a public matter in which the councillor could not have a reasonable expectation of privacy. Judge Markus said in her judgment:

“40. But, in the case of a councillor, it is not only a private matter. A councillor is a public official with public responsibilities to which non-payment of council tax is directly and significantly relevant.  A number of specific features of this were advanced in submissions to the First-tier Tribunal.  In particular, section 106 of the Local Government Finance Act 1992 bars a councillor from voting on the Council’s budget if he or she has an outstanding council tax debt of over two months.  If a councillor is present at any meeting at which relevant matters are discussed, he or she must disclose that section 106 applies and may not vote.  Failure to comply is a criminal offence. Thus council tax default strikes at the heart of the performance of a councillor’s functions. It is evident that setting the council’s budget is one of the most important roles undertaken by councillors.  The loss of one vote could make a fundamental difference to the outcome. This adds a significant public dimension to the non-payment of council tax.  The very fact that Parliament has legislated in this way reflects the connection between non-payment and the councillor’s public functions.  Moreover, as the Commissioner observed in his decision notice, recent failure to pay council tax is likely to impact on public perceptions and confidence in a councillor as a public figure.

41. These factors are of critical relevance to expectation. As the Commissioner had observed, those who have taken public office should expect to be subject to a higher degree of scrutiny and that information which impinges on their public office might be disclosed. More specifically, unless the local electorate know the identity of a councillor to whom section 106 applies, they cannot discover that that councillor is failing to fulfil his functions. Nor can they know that the process of declarations under section 106 is being adhered to. In addition the electorate may wish to know whether they can trust a councillor properly to discharge his functions if he stands for office again.”

So there we have it. Councillors can normally expect to have their names disclosed if they default on council tax. However this is not an absolute rule. In the words of Judge Markus (at paragraph 56):

“There may be exceptional cases in which the personal circumstances of a councillor are so compelling that a councillor should be protected from such exposure.”

The Bolton News, where the Appellant works, finally named the councillor who is the subject of this case (Click here if interested). By the way, I may share a name with him but I can assure you that I am up to date with my council tax bill payments!

We will be discussing this and other recent FOI decisions in our forthcoming FOI workshops and webinars.

How would you do on the BCS Certificate in Freedom of Information exam? Have a go at our test.

iPhone -> abcPhone


By Paul Simpkins

First the joke

I had a friend who played in a band. When he got his new smart phone he put all his gigs for the next 12 months into his calendar with an alert set for the day before so he knew when he was needed and he could plan the rest of his life.

A few days later his fellow band members rang him from a venue saying “Where are you, we’re on stage in 3 hours…”.

He looked at his phone and found that all but 8 of the dates he’d typed in hadn’t gone into his calendar. Only 8 were listed; the rest had disappeared. He dashed down to the phone shop and asked why, to which the teenage assistant replied: “Sorry mate, you’ve only got an 8 gig phone” [groan…]

But do you really need a phone with massive capacity and hundreds of apps? Do you need two-level security, or thumbprint login, or the many fancy apps that make your life so complicated (sorry, efficient)?

Is there a market for a simpler smartphone (maybe a dumbphone) that just has 8 key apps built in and no possibility of adding any more? We could call it the 8 app phone, to remind us of the old joke. There would only be one home screen, so we could call it… the screen.

Many old people don’t use 99% of the functionality of a smartphone. Yes youngsters are in constant contact with every social media platform that exists and are forever uploading and viewing videos of their friends eating junk food in branded outlets while streaming Spotify tunes but do we need all this connectivity?

This revolutionary concept crossed my mind this morning. I’d installed an update on my i-phone and instead of getting on with being my faithful companion my phone reverted to Hello Hola mode. All I had to do was set it up again and all my data would mysteriously flow back through the air to fill it up again. The problem was that I couldn’t remember my i-Cloud code as I’d bravely migrated to (see I can speak the lingo) two level authentication a few days ago. The phone wasn’t playing until it had the code. (I know I should have written it down on a yellow post it note but most of my reminders are in Notes on my phone). I also know that apple groupies will now be screaming “you stupid old git” at their screens and I acknowledge that I don’t know the front end of a universal serial bus from the back end but I’m happy in my own way. I just don’t see the point of unasked for updates that add on features I don’t think I’ll ever use. I’m often quite a few updates late and I still don’t know why I accepted this one so readily.

I went on the web and signed in with my Apple ID and it said no problem we’ll send a 6 letter code to your trusted device and you can type it in and you’ll be fine. Unfortunately my trusted device was the phone that the update had turned into a small door stop so the code I needed to unlock it was stopping at the door and not going in.

I rang Apple support and pointed out the problem and they ummed and ahhhed for 30 minutes before deciding that I had to take the SIM out of the phone, put it into another phone, set it up as a clone of my small doorstop, look in the text inbox, retrieve the code I’d been sent, take the SIM out, return it to my small doorstop and type in the code which would make my door stop suddenly metamorphose into a beautiful smartphone and fly off into the sunset.

The local phone shop refused to do it as it might lock the donor phone so I went home to find an old i phone. Soon I had no i-cloud code and 2 locked phones.

Fortunately I also had a macbook and an IT literate partner and for 3 hours we trawled the web, switched off this, switched on that, reset the donor phone and through trying every possible route through the Hello Hola roadblock finally made it work. Then we saw 9 texts each containing a 6 letter unlock code.

With feverish glee we put the SIM back where it belonged and tried to replicate the process. We did at one stage receive an email message saying that someone in Middlesbrough had tried to sign into my account, but we ignored it as it was so obviously a ruse de guerre. (Heckmondwike yes, but Middlesbrough no way…). An hour later we’d made it. It involved changing an Apple ID password and several cups of coffee and a few cookies, but we made it. By now darkness had fallen and we were both too tired to actually use the phone.

Back to the brilliant idea. The next development for Apple after the i-phone should (obviously) be the j-phone. The J stands for ‘just a few things on the’ phone. Essentials are phone, text, web, calendar, maps, settings, camera, contacts and nothing else. (There will be a focus group later to decide which 8 are essential.) {We’ll make them big icons while we’re at it.} But let’s make it even simpler and, to save a law suit, just call it the abc-phone. Being as there’s no video or music or social media, this can be produced cheaply and only sold to anyone who can produce a bus pass or a senior rail card (with photo ID – we’re not letting any spotty youngsters in on the secret). There’ll be no real security on the phone – if someone pinches it there will be no value to sell on and the user can just buy another.

Over to you Apple…

Regards

A grumpy old man.


Make 2017 the year you achieve a GDPR qualification. See our full day workshops and new GDPR Practitioner Certificate.


image credits: http://www.techradar.com/reviews/phones/mobile-phones/iphone-6-1264565/review

The Right to Data Portability under GDPR


The new General Data Protection Regulation (GDPR) will come into force on 25th May 2018. Whilst it will replace the UK’s Data Protection Act 1998 (DPA), it retains the right of the Data Subject to receive a copy of his/her data, to rectify any inaccuracies and to object to direct marketing. It also introduces new rights, one of which is the right to Data Portability.

Article 20 of GDPR allows Data Subjects to receive the personal data which they have provided to a Data Controller, in a structured, commonly used and machine-readable format, and to transmit it to another Data Controller. The aim of this right is to support user choice, user control and consumer empowerment. It will have a big impact on all Data Controllers, but particularly data-driven organisations such as banks, cloud storage providers, insurance companies and social networking websites. These organisations may find that customers are encouraged to move suppliers, as they will be armed with much more information than they previously had access to. This in turn may lead to an increase in competition, driving down prices and improving services (so the theory goes; we live in hope!).

When the Right Can Be Exercised

Unlike the subject access right, the Data Portability right does not apply to all personal data held by the Data Controller concerning the Data Subject. Firstly, it has to be automated data; paper files are not included. Secondly, the personal data has to be knowingly and actively provided by the Data Subject. This covers, for example, account data (e.g. mailing address, user name, age) submitted via online forms, but also data generated by and collected from the activities of users by virtue of their use of a service or device.

By contrast personal data that are derived or inferred from the data provided by the Data Subject, such as a user profile created by analysis of raw smart metering data or a website search history, are excluded from the scope of the right to Data Portability, since they are not provided by the Data Subject, but created by the Data Controller.

Thirdly the personal data has to be processed by the Data Controller with the Data Subject’s consent or pursuant to a contract with him/her. Therefore personal data processed by local authorities as part of their public functions (e.g. council tax and housing benefit data) will be excluded from the right to Data Portability.

It is important to note that this right does not require Data Controllers to keep personal data for longer than specified in their retention schedules or privacy policies. Nor is there a requirement to start storing data just to comply with a Data Portability request if one is received.

Main elements of Data Portability

Article 20(1) gives a Data Subject two rights:

  1. To receive personal data processed by a Data Controller, and to store it for further personal use on a private device, without transmitting it to another Data Controller.

This is similar to the subject access right. However here the data has to be received “in a structured, commonly used, machine readable format” thus making it easier to analyse and share. It could be used to receive a playlist from a music streaming service, information about online purchases or leisure pass data from a swimming pool.

  2. A right to transmit personal data from one Data Controller to another Data Controller “without hindrance”.

This provides the ability for Data Subjects not just to obtain and reuse their data, but also to transmit it to another service provider e.g. social networking sites and cloud storage providers etc. It facilitates the ability of data subjects to move, copy or transmit personal data easily. In addition it provides consumer empowerment by preventing “lock-in”.

The right to Data Portability is expected to foster opportunities for innovation and sharing of personal data between Data Controllers in a safe and secure manner, under the control of the data subject.
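In practice, “a structured, commonly used and machine-readable format” is generally understood to mean formats such as JSON, CSV or XML. As a purely illustrative sketch (all field names and values here are invented), a portability export might look like this:

```python
import json

# Hypothetical export of data the Data Subject actively provided:
# account details plus activity data generated through use of the
# service. Derived or inferred data (e.g. a profile built by the
# controller's analytics) falls outside the portability right, so
# none is included in this export.
export = {
    "account": {
        "username": "jsmith",
        "mailing_address": "1 High St",
        "age": 42,
    },
    "activity": {
        "playlist": ["Track A", "Track B", "Track C"],
    },
}

# JSON is structured, commonly used and machine-readable, so a
# receiving controller can parse the file without manual re-keying.
payload = json.dumps(export, indent=2)
restored = json.loads(payload)
assert restored == export
```

The round trip at the end is the point of the exercise: another controller's systems can reconstruct the data faithfully from the file alone.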

Time Limits

Data Controllers must respond to requests for Data Portability without undue delay, and in any event within one month. This can be extended by a further two months where the request is complex or a number of requests have been received; in that case the Data Controller must inform the individual within one month of receipt of the request and explain why the extension is necessary.
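These deadlines can be tracked with simple date arithmetic. The sketch below counts calendar months and clamps to month-end; how edge cases (such as a request received on the 31st) should be counted legally is a matter for official guidance, not this illustration:

```python
from calendar import monthrange
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of shorter months."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    day = min(d.day, monthrange(year, month)[1])
    return date(year, month, day)

received = date(2018, 5, 31)
deadline = add_months(received, 1)  # respond within one month
extended = add_months(received, 3)  # complex cases: two further months
print(deadline, extended)  # 2018-06-30 2018-08-31
```

A request received on 31 May thus falls due on 30 June (June having no 31st), or 31 August where the extension applies.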

Information is to be provided free of charge, save for some exceptions. Refusals must be explained, and the Data Subject informed of the right to complain to the Information Commissioner’s Office (ICO).

Notification Requirements

Data Controllers must inform Data Subjects of the right to Data Portability within their Privacy Notice, as required by Articles 13 and 14 of GDPR. (More on Privacy Notices under GDPR here. See also the ICO’s revised Privacy Notices Code.)

In December 2016, the Article 29 Data Protection Working Party published guidance on Data Portability and a useful FAQ. (Technically these documents are still in draft, as comments have been invited until the end of January 2017.) It recommends that Data Controllers clearly explain the difference between the types of data that a Data Subject can receive using the portability right and the access right, and that they provide specific information about the right to Data Portability before any account closure, to enable the Data Subject to retrieve and store his/her personal data.

Subject to technical capabilities, Data Controllers should also offer different implementations of the right to Data Portability, including a direct download opportunity and allowing Data Subjects to transmit the data directly to another Data Controller.

Impact on the Public Sector 

Local authorities and the wider public sector might be forgiven for thinking that the Data Portability right only applies to private sector organisations which process a lot of personal data based on consent or a contract, e.g. banks, marketing companies, leisure service providers, utilities etc. Major data processing operations in local authorities (e.g. for the purposes of housing benefit, council tax etc.) are based on carrying out public functions or statutory duties and so are excluded. However, a lot of other data operations will still be covered by this right, e.g. data held by personnel, accounts and payroll, leisure services and even social services. An important condition is that the Data Subject must have provided the data.

The Government has confirmed that GDPR is here to stay, well beyond the date when the UK finally leaves the European Union. All Data Controllers need to assess now what impact the right to Data Portability will have on their operations, and put policies and procedures in place.

Make 2017 the year you get prepared for the General Data Protection Regulation (GDPR). See our full day workshops and new GDPR Practitioner Certificate.

New Webinar on GDPR and the Right to Data Portability. Register onto the live session or watch the recording.