The code is not law, nor does it ‘enforce’ data sharing, but it does provide some useful steps to consider when sharing personal data, either as a one-off or as part of an ongoing arrangement. Data Protection professionals, and the staff in the organisations they serve, will still need to navigate a way through various pressures, frameworks, and expectations on the sharing of personal data; case by case, framework by framework. A more detailed post on the contents of the code can be read here.
Act Now Training is pleased to announce a new full day ‘hands on’ workshop for Data Protection professionals on Data Sharing. Our expert trainer, Scott Sammons, will look at the practical steps to take, sharing frameworks and protocols, risks to consider etc. Scott will also explore how, as part of your wider IG framework, you can establish a proactive support framework; making it easier for staff to understand their data sharing obligations/expectations and driving down the temptation to use a ‘Data Protection Duck out’ for why something was shared/not shared inappropriately.
Delegates will also be encouraged to bring a data sharing scenario to discuss with fellow delegates and the tutor. This workshop can also be customised and delivered to your organisation at your premises or virtually. Get in touch to learn more.
So much has happened in the world of data protection recently. Where to start?
In April, the European Data Protection Board’s (EDPB) opinions (GDPR and Law Enforcement Directive (LED)) on UK adequacy were adopted. The EDPB has looked at the draft EU adequacy decisions. It acknowledged that there is alignment between the EU and UK laws but also expressed some concerns. It has, though, issued a non-binding opinion recommending their acceptance. If accepted, the two adequacy decisions will run for an initial period of four years. More here.
Last month saw the ICO’s annual data protection conference go online due to the pandemic. Whilst not the same as a face-to-face conference, it was still a good event with lots of nuggets for data protection professionals, including the news that the ICO is working on bespoke UK standard contractual clauses (SCCs) for international data transfers. Deputy Commissioner Steve Wood said:
“I think we recognise that standard contractual clauses are one of the most heavily used transfer tools in the UK GDPR. We’ve always sought to help organisations use them effectively with our guidance. The ICO is working on bespoke UK standard clauses for international transfers, and we intend to go out for consultation on those in the summer. We’re also considering the value to the UK for us to recognise transfer tools from other countries, so standard data transfer agreements, so that would include the EU’s standard contractual clauses as well.”
Lloyd v Google
The much-anticipated Supreme Court hearing in the case of Lloyd v Google LLC took place at the end of April. The case concerns the legality of Google’s collection and use of browser generated data from more than 4 million iPhone users during 2011-12 without their consent. Following the two-day hearing, the Supreme Court will now decide, amongst other things, whether, under the DPA 1998, damages are recoverable for ‘loss of control’ of data without needing to identify any specific financial loss and whether a claimant can bring a representative action on behalf of a group on the basis that the group have the ‘same interest’ in the claim and are identifiable. The decision is likely to have wide ranging implications for representative actions, what damages can be awarded for and the level of damages in data protection cases. Watch this space!
In November 2020, the ICO fined Ticketmaster £1.25m for a breach of Articles 5(1)(f) and 32 GDPR (security). Ticketmaster appealed the penalty notice on the basis that there had been no breach of the GDPR; alternatively that it was inappropriate to impose a penalty, and that in any event the sum was excessive. The appeal has now been stayed by the First-Tier Tribunal until 28 days after the pending judgment in a damages claim brought against Ticketmaster by 795 customers: Collins & Others v Ticketmaster UK Ltd (BL-2019-LIV-000007).
Age Appropriate Design Code
This code came into force on 2 September 2020, with a 12 month transition period. The Code sets out 15 standards organisations must meet to ensure that children’s data is protected online. It applies to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have the best possible access to online services whilst minimising data collection and use.
With less than four months to go (2 September 2021) the ICO is urging organisations and businesses to make the necessary changes to their online services and products. We are planning a webinar on the code. Get in touch if interested.
AI and Automated Decision Making
Article 22 of GDPR provides protection for individuals against purely automated decisions with a legal or significant impact. In February, the Court of Amsterdam ordered Uber, the ride-hailing app, to reinstate six drivers who it was claimed were unfairly dismissed “by algorithmic means.” The court also ordered Uber to pay compensation to the sacked drivers.
In April, the EU Commission published a proposal for a harmonised framework on AI. The framework seeks to impose obligations on both providers and users of AI. Like the GDPR, the proposal includes fine levels and an extra-territorial effect. (Readers may be interested in our new webinar on AI and Machine Learning.)
Publicly Available Information
Just because information is publicly available does not mean companies have a free pass to use it without consequences. Data protection laws still have to be complied with. In November 2020, the ICO ordered the credit reference agency Experian Limited to make fundamental changes to how it handles personal data within its direct marketing services. The ICO found that significant ‘invisible’ processing took place, likely affecting millions of adults in the UK. It is ‘invisible’ because the individual is not aware that the organisation is collecting and using their personal data. Experian has lodged an appeal against the Enforcement Notice.
Interestingly, the Spanish regulator recently fined another credit reference agency, Equifax, €1m for several failures under the GDPR. Individuals complained about Equifax’s use of their personal data, which was publicly available. Equifax had also failed to provide the individuals with a privacy notice.
Data Protection by Design
The Irish data protection regulator issued its largest domestic fine recently. Irish Credit Bureau (ICB) was fined €90,000 after a change to the ICB’s computer code in 2018 resulted in 15,000 accounts having incorrect details recorded about their loans before the mistake was noticed. Amongst other things, the decision found that the ICB infringed Article 25(1) of the GDPR by failing to implement appropriate technical and organisational measures designed to implement the principle of accuracy in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects (aka DP by design and by default).
The ICO’s Data Sharing Code of Practice provides organisations with a practical guide on how to share personal data in line with data protection law. Building on the code, the ICO recently outlined its plans to update its guidance on anonymisation and pseudonymisation, and to explore the role that privacy enhancing technologies might play in enabling safe and lawful data sharing.
UK GDPR Handbook
The UK GDPR Handbook is proving very popular among data protection professionals.
It sets out the full text of the UK GDPR laid out in a clear and easy to read format. It cross references the EU GDPR recitals, which also now form part of the UK GDPR, allowing for a more logical reading. The handbook uses a unique colour coding system that allows users to easily identify amendments, insertions and deletions from the EU GDPR. Relevant provisions of the amended DPA 2018 have been included where they supplement the UK GDPR. To assist users in interpreting the legislation, guidance from the Information Commissioner’s Office, Article 29 Working Party and the European Data Protection Board is also signposted. Read what others have said:
“A very useful, timely, and professional handbook. Highly recommended.”
“What I’m liking so far is that this is “just” the text (beautifully collated together and cross-referenced Articles / Recital etc.), rather than a pundits interpretation of it (useful as those interpretations are on many occasions in other books).”
“Great resource, love the tabs. Logical and easy to follow.”
The law on data sharing is a minefield clouded with myths and misunderstandings.
The Information Commissioner’s Office (ICO) recently launched a consultation on an updated draft code of practice on this subject. Before drafting the new code, the ICO launched a call for views in August 2018, seeking input from various organisations such as trade associations and those representing the interests of individuals. (Read a summary of the responses here). The revised code will eventually replace the version made under the Data Protection Act 1998, first published in 2011.
“… address many aspects of the new legislation including transparency, lawful bases for processing, the new accountability principle and the requirement to record processing activities”.
Once finalised, the code will be a statutory code of practice under section 121 of the DPA 2018. Under section 127, the ICO must take account of it when considering whether a Data Controller has complied with its data protection obligations in relation to data sharing. The code can also be used in evidence in court proceedings and the courts must take its provisions into account wherever relevant.
Following the code, along with other ICO guidance, will help Data Controllers to manage risks; meet high standards; clarify any misconceptions about data sharing; and give confidence to share data appropriately and correctly. In addition to the statutory guidance, the code contains some optional good practice recommendations, which aim to help Data Controllers adopt an effective approach to data protection compliance.
It also covers some special cases, such as databases and lists, sharing information about children, data sharing in an emergency, and the ethics of data sharing. Reference is also made to the provisions of the Digital Economy Act 2017, which seeks to promote data sharing across the public sector.
There is also a section on sharing data for the purposes of law enforcement processing under Part 3 of the DPA 2018. This is an important area which organisations have not fully understood, as demonstrated by the recent High Court ruling that Sussex Police unlawfully shared personal data about a vulnerable teenager, putting her “at greater risk.”
Steve Wood, the Deputy Information Commissioner for Policy, said:
“Data sharing brings many benefits to organisations and individuals, but it needs to be done in compliance with data protection law.”
“Our draft data sharing code gives practical advice and guidance on how to share data safely and fairly, and we are encouraging organisations to send us their comments before we launch the final code in the Autumn.”
On 24th October the Information Commissioner imposed a fine (monetary penalty) of £500,000 on Facebook Ireland and Facebook Inc (which is based in California, USA) for breaches of the Data Protection Act 1998. In doing so the Commissioner levied the maximum fine that she could under the now repealed DPA 1998. Her verdict was that the fine was ‘appropriate’ given the circumstances of the case. For anyone following the so-called Facebook data scandal the fine might seem small beer for an organisation that is estimated to be worth over 5 billion US Dollars. Without doubt, had the same facts played out after 25th May 2018 then the fine would arguably have been much higher, reflecting the gravity and seriousness of the breach and the number of people affected.
In summary, the Facebook (FB) companies permitted Dr Aleksandr Kogan to operate a third-party application (“App”) that he had created, known as “thisisyourdigitallife”, on the FB platform. The FB companies allowed him and his company (Global Science Research (GSR)) to operate the App in conjunction with FB from November 2013 to May 2015. The App was designed to and was able to obtain a significant amount of personal information from any FB user who used the App, including:
Their public FB profile, date of birth and current city
Photographs they were tagged in
Pages they liked
Posts on their timeline and their news feed posts
Facebook messages (there was evidence to suggest the app also accessed the content of the messages)
The App was also designed to and was able to obtain extensive personal data from the FB friends of the App’s users and anyone who had messaged an App user. Neither the FB friends nor the people who had sent messages were informed that the App was able to access their data, nor did they give their consent.
The App was able to use the information that it collected about users, their friends and people who had messaged them in order to generate personality profiles. The information, and the data derived from it, was shared by Dr Kogan and his company with three other companies, including SCL Elections Ltd (which controls the now infamous Cambridge Analytica).
In May 2014, Dr Kogan sought permission to migrate the App to a new version of the FB platform. This new version reduced the ability of apps to access information about the FB friends of users. FB refused permission straight away. However, Dr Kogan and GSR continued to have access to, and therefore retained, the detailed information about users and the friends of users that they had previously collected via the App. FB did nothing to make Dr Kogan or his company delete the information. The App remained in operation until May 2015.
Breach of the DPA
The Commissioner’s findings about the breach make sorry reading for FB and FB users. Not only did the FB companies breach the Data Protection Act, they also failed to comply or ensure compliance with their own FB Platform Policy, and were not aware of this fact until exposed by the Guardian newspaper in December 2015.
The FB companies had breached s 4(4) DPA 1998 by failing to comply with the 1st and 7th data protection principles. They had:
Unfairly processed personal data in breach of the 1st data protection principle (DPP1). FB unfairly processed the personal data of the App users, their friends and those who exchanged messages with users of the App. FB failed to provide adequate information to FB users that their data could be collected by virtue of the fact that their friends used the App or that they exchanged messages with App users. FB tried, unsuccessfully and unfairly, to deflect responsibility onto the FB users, who could have set their privacy settings to prevent their data from being collected. The Commissioner rightly rejected this. The responsibility was on Facebook to inform users about the App, what information it would collect and why. FB users should have been given the opportunity to withhold or give their consent. If any consent was purportedly given by users of the App or their friends, it was invalid because it was not freely given, specific or informed. Consequently, consent did not provide a lawful basis for processing.
Failed to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, in breach of the 7th data protection principle (DPP7). The processing by Dr Kogan and GSR was unauthorised (it was inconsistent with the basis on which FB allowed Dr Kogan to obtain access to personal data for which they were the data controller; it breached the Platform Policy and the Undertaking). The processing by Dr Kogan and his company was also unlawful, because it was unfair processing. The FB companies failed to take steps (or adequate steps) to guard against such unauthorised and unlawful processing (see below). The Commissioner considered that the FB companies knew or ought to have known that there was a serious risk of contravention of the data protection principles and that they failed to take reasonable steps to prevent such a contravention.
Breach of FB Platform Policy
Although the FB companies operated a FB Platform Policy in relation to apps, they failed to ensure that the App operated in compliance with the policy, and this constituted their breach of the 7th data protection principle. For example, they did not check Dr Kogan’s terms and conditions of use of the App to see whether they were consistent with their policy (or, presumably, whether they were lawful). In fact, they failed to implement a system to carry out such a review. It was also found that the use of the App breached the policy in a number of respects, specifically:
Personal data obtained about friends of users should only have been used to improve the experience of App users. Instead, Dr Kogan and GSR were able to use it for their own purposes.
Personal data collected by the App should not have been sold or transferred to third parties. Dr Kogan and GSR had transferred the data to three companies.
The App required permission from users to obtain personal data that it did not need, in breach of the policy.
The FB companies also failed to check that Dr Kogan was complying with an undertaking he had given in May 2014 that he was only using the data for research, and not commercial, purposes. However, perhaps one of the worst indictments is that FB only became aware that the App was breaching its own policy when the Guardian newspaper broke the story on 11 December 2015. It was only at this point, when the story went viral, that FB terminated the App’s access rights to the Facebook Login. And the rest, as they say, is history.
Joint Data Controllers
The Commissioner decided that Facebook Ireland and Facebook Inc were, at all material times, joint data controllers and therefore jointly and severally liable. They were joint data controllers of the personal data of data subjects who are resident outside Canada and the USA and whose personal data is processed by or in relation to the operation of the Facebook platform. This was on the basis that the two companies made decisions about how to operate the platform in respect of the personal data of FB users.
The Commissioner also concluded that they processed personal data in the context of a UK establishment, namely FB UK (based in London), in respect of any individuals who used the FB site from the UK during the relevant period. This finding was necessary in order to bring the processing within scope of the DPA and for the Commissioner to exercise jurisdiction over the two Facebook companies.
The Use of Data Analytics for Political Purposes
The Commissioner considered that some of the data shared by Dr Kogan and his company with the three companies is likely to have been used in connection with, or for the purposes of, political campaigning. FB denied this as far as UK residents were concerned, and the Commissioner was unable to determine, on the basis of the information before her, whether FB was correct. However, she nevertheless concluded that the personal data of UK users who were UK residents was put at serious risk of being shared and used in connection with political campaigning. In short, Dr Kogan and/or his company were in a position where they were at liberty to decide how to use the personal data of UK residents, or who to share it with.
As readers will know, this aspect of the story continues to attract much media attention about the possible impact of the data sharing scandal on the US Presidential elections and the Brexit referendum. The Commissioner’s conclusions are quite guarded, given the lack of evidence or information available to her.
Much has been written about the complexities of the current legal regime relating to public sector data sharing. Over the years this blog has covered many stops and starts by the government when attempting to make the law clearer.
The Digital Economy Bill is currently making its way through Parliament. It contains provisions, which will give public authorities (including councils) more power to share personal data with each other as well as in some cases the private sector.
The Bill will give public authorities a legal power to share personal data for four purposes:
To support the well-being of individuals and households. The specific objectives for which information can be disclosed under this power will be set out in Regulations (which can be added to from time to time). The objectives in draft regulations so far include identifying and supporting troubled families, identifying vulnerable people who may need help re-tuning their televisions after changes to broadcasting bands, and providing direct discounts on energy bills for people living in fuel poverty.
For the purpose of debt collection and fraud prevention. Public authorities will be able to set up regular data sharing arrangements for public sector debt collection and fraud prevention but only after such arrangements have been through a business case and government approval process.
Enabling public authorities to access civil registration data (births, deaths and marriages) (e.g. to prevent the sending of letters to people who have died).
Giving the Office for National Statistics access to detailed administrative government data to improve their statistics.
The new measures are supported by statutory Codes of Practice (currently in draft) which provide detail on auditing and enforcement processes and the limitations on how data may be used, as well as best practice in handling data received or used under the provisions relating to public service delivery, civil registration, debt, fraud, sharing for research purposes and statistics. Security and transparency are key themes in all the codes. Adherence to the 7th Data Protection Principle (under Data Protection Act 1998 (DPA)) and the ICO’s Privacy Notices Code (recently revised) will be essential.
A new criminal offence for unlawful disclosure of personal data is introduced by the Bill. Those found guilty of an offence will face imprisonment for a term up to two years, a fine or both. The prison element will be welcomed by the ICO which has for a while been calling for tougher sentences for people convicted of stealing personal data under the DPA.
The Information Commissioner was consulted over the codes so (hopefully!) there should be no conflict with the ICO Data Sharing Code. The Bill is not without its critics (including Big Brother Watch), many of whom argue that it is too vague and does not properly safeguard individuals’ privacy.
It is arguably an oversight on the part of the drafters that the Bill does not mention the new General Data Protection Regulation (GDPR), which will come into force on 25th May 2018. This is much more prescriptive in terms of Data Controllers’ obligations, especially on transparency and privacy notices.
These and other Information Sharing developments will be examined in our data protection workshops and forthcoming webinar.
Illustration provided by the Office of the Privacy Commissioner of Canada (www.priv.gc.ca)
I bought a new car. On delivery day it was in the showroom draped in a royal blue cloth with a sign saying Reserved for Mr Onassis. The salesman before handing me the keys mumbled in an apologetic fashion “The Sales Manager likes to talk to every customer when they take delivery…”
The Sales Manager didn’t waste much time. He said that I’d shortly be receiving a call from a company who surveys new car buyers to find out what they thought of the dealership. Then he slipped in the hard sell. “They’ll ask you to mark us on a scale of 1 to 10. Only 9 and 10 are positive; anything below that is negative.”
The survey duly arrived. I declined to answer even though I was very happy with the car and the dealership.
Days later my bank called me. I was probably going to be asked to rate my bank. From a list of phrases from very displeased to very pleased I had to choose the phrase that best described my experience. “Please be sure to say you’re very pleased with our service. Anything else is considered negative”. Again I declined to do the survey even though my bank is pretty awful.
Last week a hotel that Act Now Training uses did the same thing. Please let us know what you think of our hotel. This time the hotel manager foolishly put his suggestion in an email: “Actually it’s a yes/no question; anything under 8 is negative. We need 9s and 10s.” Now we have the evidence that the practice exists. Previously the conspiracy had only survived by word of mouth.
I haven’t answered yet.
What value does a survey have when the surveyees are primed to deliver the response the company wants? Is every survey result the product of a self-selecting group – the group of people who like to give high scores in surveys? Or is there another group, like me, who never participate because they feel there is no value in a survey where the traditional Likert scale has been morphed into a 50/50 shot? Most Brits are stiff-upper-lip types who won’t take a survey if their views would have been critical, in case someone contacted them afterwards.
Is the information age producing better information, or is the value of a survey subjective, objective or merely the result of a carefully orchestrated customer manipulation?
This article already had 12,500 likes before I posted it. Find them on Ebay.
Paul Simpkins is a Director and Trainer at Act Now Training Ltd. He will be delivering the internationally recognized BCS certificate in Data Protection in June. If you are interested in this or any other Act Now training courses on Information governance, please visit our website www.actnow.org.uk
Last year the Law Commission launched a consultation on the law around sharing of personal information between public sector organisations. The paper outlined the current law and asked 22 broad questions. In July, following analysis of the consultation responses, the Commission recommended a full-scale, UK-wide reform project to consider how the current law can be simplified and modernised.
The legality of data sharing is a subject which often confuses public sector officials. Local authorities, in particular, are often stumped by the “To Share or Not to Share” question, even if the sharing is for very good reasons (e.g. child protection or crime prevention). More often than not, the Data Protection Act 1998 (DPA) is made the scapegoat for officials’ failure to fully understand the law. It is wrongly perceived as a barrier to data sharing despite offering a range of justifications (e.g. consent, legal obligation, protecting vital interests etc. (Schedule 2)). According to Nicholas Paines QC, the Law Commissioner responsible for public law:
“Data sharing law must achieve a balance between the public interest in sharing information and the public interest in protecting privacy,”
We recommend that a full law reform project should be carried out in order to create a principled and clear legal structure for data sharing, which will meet the needs of society. These needs include efficient and effective government, the delivery of public services and the protection of privacy. Data sharing law must accord with emerging European law and cope with technological advances. The project should include work to map, modernise, simplify and clarify the statutory provisions that permit and control data sharing and review the common law.
The scope of the review should extend beyond data sharing between public bodies to the disclosure of information between public bodies and other organisations carrying out public functions.
The project should be conducted on a tripartite basis by the Law Commission of England and Wales, together with the Scottish Law Commission and the Northern Ireland Law Commission.
The Commission suggests that the project could usefully include consideration of the functions of the Information Commissioner in relation to data sharing, including the Commissioner’s enforcement role (Read the ICO’s response to the consultation.)
The Cabinet Office and the Ministry of Justice will now decide together whether to refer a full law reform project to the Law Commission.
Don’t hold your breath. We have been here before! Furthermore, do we really need new laws on data sharing or a better awareness of the existing ones? As I have said before, the current law is adequate to regulate yet allow responsible data sharing. The DPA and the ICO Data Sharing Code can be very useful tools for allowing responsible data sharing if they are properly understood.
STOP PRESS – The final report of the Home Office research into “Multi Agency Working and Information Sharing” was published on 1st August. The report makes for interesting reading, and sets out a number of commitments the Home Office is making to continue to support multi agency working and information sharing.
ICO invites practitioners to feedback on its Data Sharing Code of Practice
The ICO is inviting feedback on its Data Sharing Code of Practice.
Published in May 2011 the publication continues to be one of our most popular pieces of guidance. We would like to hear about how you’re using the guidance and how it has helped your organisation meet its data protection and freedom of information obligations.
As many data protection practitioners are well aware, there is a whole raft of legislation affecting the sharing of personal data. There are laws to tell us we must share. There are laws to tell us we absolutely must not share. There are laws that say we can share specific personal data with specific named bodies. There are laws that suggest implicitly that we can share, maybe, if the wind is blowing in the right direction that day…but we could always be challenged on that sort of sharing. It’s a minefield for your data protection officer who dreads that question “Can we share that data?” The response inevitably, and somewhat unhelpfully, is often “Well, it depends….”. With monetary penalties available to the Information Commissioner of up to £500,000 if your organisation gets it horribly wrong, it’s hardly surprising such organisations are often risk-averse when it comes to data sharing with other third parties.
It seems almost impossible for any one individual to be knowledgeable about all of these different rules, hidden within numerous Acts of Parliament, Regulations and Statutory Instruments (there is no consistent way of publishing these). Take for example birth and death data. Local authorities need to know where new-born babies are, not only to plan future school places but also to meet statutory Ofsted reach targets which require them to contact the new parent or parents to offer services for the child. It would seem obvious to acquire that information from their local authority registration service. Yet the Office for National Statistics points out that the Statistics and Registration Service Act 2007 explicitly prohibits the local registration service from passing that information to its local council, its own employer, except for public health purposes.
The Law Commission, which has been studying this issue for the last year, says that it has only just begun to scratch the surface regarding the huge amount of different pieces of legislation that contain references to data sharing. It looks set to recommend this month to Government that there should be a full review of the law relating to information and personal data sharing.
The Government has for some time realised that this is a problem and wishes to “develop a better understanding of the economy and society, deliver more targeted and joined-up public services, and save public money lost through fraud, error and debt ” through effective, and legal, information sharing.
Current legislative and cultural barriers have resulted in a cottage industry of data sharing agreements between government departments and other partners, which can take time and resources to put in place. The government is well aware, at a time when Care.Data is the elephant in the room, that trust is a key issue in this process. How can the government ensure sensible data sharing to provide more efficient and less costly services for the public, which complies with all relevant legislation (the Data Protection Act 1998, the Human Rights Act 1998 and the trickier issue of the unwritten common law duty of confidentiality), while maintaining the trust of the public? To address this it has decided to launch an open policy making process:
“The intention is to embark upon an open policy process that brings together those inside and outside government interested in maximising the benefits and minimising the downsides to citizens of personal data sharing within government.”
The Cabinet Office, in collaboration with other government departments, is leading on the work, driven by Cabinet Minister Francis Maude, who is responsible for the Government’s transparency policy. This work must now dovetail with the Law Commission’s proposals and ultimately the new proposed EU Data Protection Regulation. The process is being coordinated by Involve, a civil society organisation, which has been awarded £20,000 to progress the policy-making process. Initial meetings have been held, a mailing list and website established, and input is now required from anyone interested in helping to shape future data sharing of public sector information.
The expected future developments of the process are as follows:
The Law Commission will report in April recommending a review of the law relating to data sharing.
Proposals will be developed under the new open policy making process until mid-August.
A policy document will be produced for mid-September for MPs to consider upon returning from recess.
Legal Counsel to produce key draft clauses and a White Paper for the Christmas break.
Open public consultation Jan – March 2015.
The next Government will consider any data sharing proposals in the first session of Parliament after the 2015 General Election.
Clearly there needs to be cross-party support for this project if it is to proceed past the next election. It is, however, very likely to be supported by all the main parties, so this should not be a showstopper. The engagement of the organisations mentioned above at this early stage is important. With such high profile organisations on board, ensuring that privacy concerns are addressed early, the government reduces the chance of problems later in the process… and stands to cash in on a potential £16 billion of estimated income from UK data assets.
More importantly for the public, this more inclusive process will hopefully genuinely address those privacy concerns that appear to have been wilfully ignored during the Care.Data process. This week has seen media reports focussing on HMRC selling our tax records next, and the fact that children’s records are already sold; a fact parents were no doubt unaware of and certainly never consulted on. It is therefore vitally important that practitioners and organisations on the ground, the ones physically sharing data on a daily basis, can feed into this process in its initial stages to highlight and bottom out real practical issues as well as legislative and cultural ones.
This is no small task, and the timetable is incredibly tight to keep those involved focussed. As Francis Maude said at the last meeting, we don’t even know if we will be able to come up with anything workable; it could simply be too difficult. It is however worth the effort if it simplifies the data sharing process and offers the public a better and cost-effective service, whilst taking into account privacy concerns.
It’s not too late to get involved. If you have experience of information sharing or are a data protection practitioner or privacy expert, you can sign up at the www.datasharing.org.uk website, or join the mailing list, and help shape the proposed White Paper.
Lynn Wyeth is the Information Governance Manager at Leicester City Council. Follow her on Twitter @LynnFoi.
The Law Commission has opened a consultation on the law around sharing of personal information between public sector organisations. Law Commissioner Frances Patterson QC says:
“It could be that more data sharing would improve public services but, if that is so, we need to understand why data is not being shared. Is there a good reason to prevent data sharing? Or is the law an unnecessary obstacle? Are there other reasons stopping appropriate data sharing? These are the questions we want to answer in this consultation.”
The legality of data sharing is a subject which often confuses public sector officials. Local authorities, in particular, are often stumped by the “To Share or Not to Share” question, even if the sharing is for very good reasons (e.g. child protection or crime prevention). In some cases, even internal departments have felt constrained from updating each other about a change of a service user’s address.
More often than not, the Data Protection Act 1998 (DPA) is made the scapegoat for officials’ failure to fully understand the law. It is wrongly perceived as a barrier to data sharing despite offering a range of justifications (e.g. consent, legal obligation, protecting vital interests etc. (Schedule 2)).
Many attempts have been made to resolve this “problem”. In May 2011, the Information Commissioner published a statutory Code of Practice on data sharing. The code explains how the DPA applies to the sharing of personal data both within and outside an organisation. It provides practical advice to the public, private and third sectors, and covers systematic data sharing arrangements as well as one off requests for information. Under Section 52 of the DPA, the code can be used as evidence in any legal proceedings and can be taken into account by the courts and the Commissioner himself when considering any issue.
Despite the clear guidance in the code, the Government has sometimes toyed with the idea of new laws. Last year, according to a story in the Guardian newspaper, proposals were to be published by the Cabinet Office minister, Francis Maude, which would make it “easier” for government and public-sector organisations to share confidential information supplied by the public:
“In May, we will publish proposals that will make data sharing easier – and, in particular, we will revisit the recommendations of the Walport-Thomas Review that would make it easier for legitimate requests for data sharing to be agreed with a view to considering their implementation,” said Maude, adding that current barriers between databases made it difficult for public sector workers to access relevant information.
“It’s clearly wrong to have social workers, doctors, dentists, Job Centres, the police all working in isolation on the same problems.”
The Guardian reported that the proposals are expected to include fast-track procedures for ministers to license the sharing of data in areas where it is currently prohibited, subject to privacy safeguards. I could not find the proposals on the web. Anybody know whether they were ever published?
Confusion around data sharing continues to reign! The tragic case of Daniel Pelka is one example. The recent report into the four-year-old’s death, published by the independent Coventry Safeguarding Children Board identified a number of missed opportunities where professionals across a number of agencies should have done more to protect Daniel. Amongst other things, it concluded that the sharing of information and communications between all agencies was not robust enough.
Ill-informed comments about the current law (especially the DPA) do not help. In a recent Daily Telegraph article, the Education Minister, Michael Gove, claimed that, whilst trying to understand the underlying causes of child exploitation, he discovered that OFSTED “was prevented by “data protection” rules, “child protection” concerns and other bewildering regulations from sharing that data with us, or even with the police.” There is nothing in the DPA which prevents this. Don’t just take my word for it: read the Information Commissioner’s riposte to the learned Mr Gove.
Do we really need new laws on data sharing, or just better awareness of the existing ones? My view is that the current law is adequate to regulate, yet allow, responsible data sharing. The DPA and the Data Sharing Code need to be properly understood; they can be a tool enabling, rather than blocking, responsible sharing. Most public sector data sharing will be lawful if organisations comply with the Eight Data Protection Principles, particularly the First Principle, which requires information to be processed fairly and lawfully. There are also numerous exemptions in the Act, including where sharing is required for the purpose of prevention or detection of crime (section 29).