New Podcast: The Government’s Plans For Our Children’s Data

“I think privacy is often given a bad name. We talk about it in abstract terms; we should abandon thinking about it in that way. What you do to my data, you do to me. There is no real distinction anymore between our online life and our offline life. So whatever you know about me through my digital footprint, you know about my real life.” 

Jen Persson, Director of Defend Digital Me 

Children today are growing up in a world where almost everything they do leaves a data trail. From the apps they use to the schools they attend and the healthcare they receive, data is being collected, analysed and, increasingly, connected and shared.
But at what cost? 

Recent initiatives from the UK Government, such as the Schools White Paper and the Children’s Wellbeing and Schools Act 2026, have major implications for children’s privacy, from age verification to plans for a “Data Spine” to link information across the public sector.

In our latest Guardians of Data podcast, we analyse the Government’s plans for our children’s data, discuss children’s privacy in the internet age and consider the role Big Tech is playing in the collection, storage and analysis of all our data. We ask whether the government is simply trying to do a better job of protecting children, or whether it is quietly building a surveillance system that will affect all of us.

Our guest is Jen Persson, Director of Defend Digital Me, a not-for-profit organisation that advocates for children’s privacy and digital rights in UK education and the wider public sector. Jen said:

“Everybody wants to keep children safe… I think the important thing in the Children’s Wellbeing and Schools [Act], is that there is so much going through it that is untested and unevidenced. So some of our work has been to analyse that as it went through Parliament. For example, the single unique identifier is only part of the data aspects of the [Act], but it’s very vague and there’s been very little explanation in writing or in Parliament.” 

Listen on your preferred platform via our podcast page, or download the episode directly.

This podcast is sponsored by Phaselaw – a purpose-built solution for document disclosures, like subject access requests and FOI requests. Instead of redacting PDFs one by one, or forcing litigation software to do a job it wasn’t designed for, with Phaselaw you get collection, review, and redaction in one workflow. Teams across the world are using it to cut response times from weeks to days. 

For Guardians of Data listeners, Phaselaw is offering a two-month free trial; run it on live requests, see what it does to your backlog, decide from there. No card, no commitment. 

Head to https://www.phase.law/guardians to claim your free trial.  

Previous episodes of the Guardians of Data podcast have featured Tahir Latif talking about responsible AI deployment, Naomi Mathews and Ibrahim Hasan explaining the law on filming people in public for social media, Maurice Frenkel looking back at 20 years of the Freedom of Information Act and Olu Odeniyi analysing recent cyber breaches and discussing the lessons learnt.

Government Consultation: Are you ready for UK GDPR 2.0?

On 10 September 2021, the UK Government launched a consultation entitled “Data: A new direction” intended “to create an ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data.” Cynics will say that it is an attempt to water down the UK GDPR just a few months after the UK received adequacy status from the European Union. 

Back in May, the Prime Ministerial Taskforce on Innovation, Growth, and Regulatory Reform (TIGRR) published a 130-page report setting out a “new regulatory framework” for the UK. Arguing that the current data protection regime contained too many onerous compliance requirements, it suggested that the government:

“Replace the UK GDPR with a new, more proportionate, UK Framework of Citizen Data Rights to give people greater control of their data while allowing data to flow more freely and drive growth across healthcare, public services and the digital economy.” 

Many of the recommendations made in the TIGRR Report can be found in the latest consultation document:

Research and Re-Use of Data

  • Consolidating and bringing together research-specific provisions in the UK GDPR, “bringing greater clarity to the range of relevant provisions and how they relate to each other.” 
  • Incorporating a clearer definition of “scientific research” into the legislation. 
  • Clarifying in legislation how university research projects can rely on tasks in the public interest (Article 6(1)(e) of the UK GDPR) as a lawful ground for personal data processing. 
  • Creating a new, separate lawful ground for research, subject to suitable safeguards. 
  • Clarifying in legislation that data subjects should be allowed to give their consent to broader areas of scientific research when it is not possible to fully identify the purpose of personal data processing at the time of data collection.
  • Stating explicitly that the further use of data for research purposes is always both compatible with the original purpose and lawful under Article 6(1) of the UK GDPR. 
  • Replicating the Article 14(5)(b) exemption (disproportionate effort) in Article 13 (privacy notice), limited only to controllers processing personal data for research purposes.
  • Amending the law to facilitate innovative re-use of data for different purposes and by different data controllers.
  • Creating a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test “in order to give them more confidence to process personal data without unnecessary recourse to consent.” 

AI, Machine Learning and Automated Decision Making

  • Stipulating that processing personal data for the purposes of ensuring bias monitoring, detection and correction in relation to AI systems constitutes a legitimate interest in the terms of Article 6(1)(f) for which the balancing test is not required. 
  • Enabling organisations to use personal data and sensitive personal data for the purpose of managing the risk of bias in their AI systems by amending/clarifying the legitimate interests ground under Art 6 and clarifying/amending schedule 1 of the DPA 2018 (Special Category Data Processing).
  • Removing Article 22 of UK GDPR (the right not to be subject to a decision resulting from solely automated processing if that decision has significant effects on the individual) and permitting solely automated decision making subject to compliance with the rest of the data protection legislation. 

Accountability

  • Allowing data controllers to implement a more flexible, risk-based accountability framework, based on privacy management programmes, that reflects the volume and sensitivity of the personal information they handle and the type(s) of data processing they carry out. 
  • To support the implementation of the new accountability framework, the government intends to remove the requirements to:
    • consult the ICO in relation to high-risk personal data processing where the risks cannot be mitigated (Article 36)
    • keep records of processing activities under Article 30
    • report a data breach where the risk to individuals is “not material”
  • Introducing a new voluntary undertakings process. 

International Transfers

  • Adding more countries to the adequate list by “progressing an ambitious programme of adequacy assessments.”
  • Creating easier and more flexible international transfer mechanisms.
  • Allowing repetitive use of Article 49 derogations.

PECR and Marketing 

  • Permitting organisations to use analytics cookies and similar technologies without the users’ consent. 
  • Permitting organisations to store information on, or collect information from, a user’s device without their consent for other limited purposes.
  • Extending “the soft opt-in” to electronic communications from organisations other than businesses where they have previously formed a relationship with the person, perhaps as a result of membership or subscription. 
  • Making it easier for political parties to use data for “political engagement”.
  • Increasing the fines that can be imposed under PECR to GDPR levels.

Other Proposals

  • Including “a clear test for determining when data will be regarded as anonymous” within the UK GDPR.
  • Introducing a fee regime (similar to that in the Freedom of Information Act 2000) for access to personal data held by all data controllers. 
  • Requiring the ICO to consider not just data protection but also “growth and innovation” as well as competition.

Businesses may welcome many of these proposals, seeing them as reducing the administrative burden of the current data protection regime, particularly around reporting data breaches and conducting DPIAs. The Government also seems intent on liberalising access to data to generate a broader market for it; this will suit the commercial interests of big business, but at what privacy cost? The consultation runs until 19 November 2021.

What are your thoughts? Let us know in the comment field.

Our GDPR Practitioner Certificate is our most popular certificate course, available both online and in the classroom. We have added more dates.