News / Blog

DUA Bill amendments and commitments made at HoL Report Stage #DataProtectionDay 2025

On Data Protection Day 2025, the Report Stage of the UK Data (Use and Access) Bill concluded late tonight. Peers worked tirelessly into the night, but much of the real work happens behind the scenes. We are hopeful, despite past experience, that this is the start of a new deal for learners and families on data protection and privacy in education, in a rights-respecting environment. Meaningful clarity, consistency and confidence will support industry, staff, learners and families. And where necessary, effective and dissuasive enforcement will follow from advocacy and understanding of everyday practices. The work starts now.

Here’s a short summary of outcomes most relevant in our scope.

What’s missing?

The Data (Use and Access) Bill was debated in the House of Lords on January 21st and 28th 2025. Few changes were made through amendments. Most notably absent are any safeguards or constraints on the new powers on automated decision-making, on the loss of a legitimate interests (LI) balancing test for certain new recognised conditions, and on the expanded powers not to inform data subjects of additional processing, or of processing at scale.

Baroness Jones of Whitchurch, who was until recently the Minister leading on this Bill, raised concerns in 2024, while in Opposition, about the broad nature of the new LI objectives. She rightly said: “There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests” [Hansard, 25/3/24; col. GC 106].

During the session on January 21st, Lord Clement-Jones proposed Amendment 17, aiming to remove the Secretary of State’s power to override primary legislation and modify key aspects of UK data protection law via statutory instruments. This proposal was supported by the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both expressing concerns about the broad scope of these powers. In response, the government acknowledged these concerns but stated only:

“I hope it will reassure the Committee that the power will be used only when necessary and in the public interest.”

Hope is not a safeguard for such strong powers, and we are concerned that similar Henry VIII powers appear in the Children’s Wellbeing and Schools Bill.

What’s new?

Through various iterations of this data bill we, among many others, shared Viscount Colville’s concern about the safeguards for defining scientific research. A positive amendment was passed to require that such research be in “the public interest”, which may prove an important safeguard; it will require an assessment, and no doubt further discussion in future.

On January 28th, many contributors opposed the Government’s consultation proposal for an opt-out copyright regime, which would require creators to proactively exclude their works from AI training datasets; peers instead supported stronger protections and an opt-in model. While the emphasis was on the importance of the UK’s creative industries, this is of particular concern to us and across the education sector, since the UK government plans to change copyright law to enable the repurposing of education data and content for AI development.

Baroness Kidron linked this to the copyright consultation (which closes on February 25th), saying:

“Children go to school for the state to enable their right to education, not to enable their exploitation for data mining. This is an absolute no.”

What happens next may depend on the consultation responses, although many people in discussions remain uncertain where these plans came from, announced as they were in South Korea last August.

What’s progressed?

On a positive note, we were pleased to hear a commitment from the government to require the Information Commissioner’s Office (ICO) to develop a code of practice addressing issues related to education technology (EdTech) and children’s data. This would include guidance on processing data transparently and with protections for children’s rights.

The proposed amendment read: “Code of practice on Children’s Data and Education (1) The Commissioner must prepare a code of practice which contains such guidance as the Commissioner considers appropriate on the processing of data in connection with the provision of education.” It did not pass onto the face of the Bill, but a firm commitment was made to the House.

We are used to disappointment: we have discussed an ICO code of practice with peers since Lord Clancarty proposed Amendment 117 in 2017, when the Labour peer Lord Stevenson said the accompanying explanations of what was going on with the National Pupil Database “gave him a chill”. In the Commons, Labour MPs Liam Byrne, Louise Haigh, Chris Elmore and Darren Jones subsequently proposed a “Code on processing personal data in education” as New Clause 16 in 2018, but it did not pass. A “Code of practice by Information Commissioner’s Office on data sharing in relation to post-16 education” was debated by peers again in 2021, with support from Lord Storey, Baroness Garden, Baroness Sherlock and Baroness Kidron, and the issue was raised again in 2022 and 2023. In 2024, peers called for a code in a debate led by Lord Clement-Jones, recalling the work of Data, Tech & Black Communities, expanded on by Baroness Kidron and Lord Russell, and supported in much more detail, with comments from Lord Jim Knight too, on December 18th 2024.

“My final cautionary tale, thanks to Defend Digital Me, is on the national pupil database, which was agreed in 2002 on the basis that children’s data would be kept private, protected and used only for research purposes—all the things that we are hearing in the debates on this Bill. Ten years later, that was all changed and 2,500 data-sharing arrangements followed that use that data, including for universal credit fraud detection. When parents allow their children’s data to be shared, they do not expect it to be used, down the line, to check universal credit entitlement. I do not think that was in the terms and conditions. There is an important issue here, and I hope that the Government are listening so that we make some progress.”

This year, we are of course pleased at any progress.

However, we will wait to see a public commitment made by the Department and the ICO. Why? Because we have been in a somewhat similar position before, albeit with a weaker commitment.

What’s promised?

In October 2021, after the code had been proposed and debated and might have been put to a vote, Baroness Barran told peers on behalf of the DfE that:

“The department’s response to this issue is to set up an education sector certification scheme, with the support of the ICO, that would allow the department to set standards in a wide range of areas. This would cover the data protection needs of the whole education sector, not just the 16 to 19 age group covered by the Bill. We feel that a certification scheme, rather than a code, gives us flexibility to deliver elements when they are ready. We will not have to wait until all elements are complete, which allows us to be flexible when responding to priority needs. In addition, as technology and the law change, we are able to update specific standards without having to update a full code, allowing us to remain flexible to future changes. As the noble Baroness, Lady Garden, mentioned, I have written to both the noble Lord Storey, and the noble Baroness, Lady Kidron, detailing the department’s ambition and next steps in tackling this issue, which will include writing both to the ICO and to the ed-tech companies by the end of the year.”

An “education sector certification scheme with support from the ICO” was never seen. We wonder what happened to it. And according to the ICO, they were never written to. In fact, the regulator seemed more critical that we asked than concerned about what had not happened.

Importantly, this fresh commitment must not be narrowed to cover only EdTech but must, as described in the code amendment the peers debated, cover all data processing across the education sector. The Minister did say, after all, that it would be for the expert panel, ICO audits, and new findings to shape the “breadth and focus of the code”.

“To complement any recommendations that come from the ICO and its audits, the Department for Education will continue to work with educators and parents to help them to make informed choices about the products and services that they choose to support teaching and learning,” said the Minister.

The ICO has audited only ten EdTech service providers in the last five years; we have not seen any of the findings published [unless this includes two cases in Scotland and Essex on FRT] or any dissuasive action. The ICO has confirmed plans to audit a further seven. We receive many more complaints than this directly from parents and learners, averaging roughly two a month. There are hundreds and hundreds of providers, even counting only those most commonly used in educational settings in England every day; their number is ever growing, and many come from abroad.

We’ve never seen the DfE work with parents on EdTech, nor with schools in any meaningful way, so we’re not sure what they plan to ‘continue’, although it may be interesting; and as seen above, the promises made before came to nothing. Lord Vallance talked about ICO audits, and as we recall, the ICO audit of the DfE is expecting an update in the coming weeks. These factors must also be taken into account for the DfE as joint controller of much of the data that comes out of the sector, which is distributed more widely by the government department, linked to welfare data, linked at pupil level to children-in-need and criminal records, and/or further distributed to commercial companies.

Five years after the audit took place, will that statement address the remaining open questions on the ICO Audit of the Department for Education?

  • There is still no fair processing: the people whose data it is are not told what it is used for, where it goes, for how long, or why (“as required under Articles 12, 13 and 14 of the GDPR”);
  • It is still not clear to schools, as data controllers, what their role is in telling families what is collected under which law and what is optional, one of the key failings highlighted in the summary of the audit published in October 2020;
  • There is no apparent change in the “over reliance on public task” without identified supporting legislation, or in the “limited understanding of the requirements of legitimate interest” necessary “to ensure the use of this lawful basis is appropriate and considers the requirements set out in Article 6(1)(f) of the GDPR” (page 6/6);
  • There is no right to object, no balancing test, and no opt-out offered on the collection, never mind the reuses, of sensitive and identifying pupil data from the NPD, at local or national levels;
  • There is still no user-friendly Subject Access Request process, none suitable for children at all, nor one that the 23 million people affected know about;
  • And there is no way to know whether your own data were included, or are still held, in any of the over 2,500 releases of identifying and sensitive data to third parties since 2012.

The Department for Education has answered a question from the House of Lords (UIN HL2566) asking when the full ICO audit findings would be published. Five years after the audit was carried out, the answer is that “The department will publish an updated audit closure report in Quarter 1 of 2025/26 on completion of the remedial actions.” [Ed. note: We hope they do mean 2025, and not ’26.]

And the DfE also confirmed that “it is not possible for a parent/guardian or an individual child to opt out of the school census collection”. So when it comes to the re-uses, no one offers any Right to Object (for processing based on public task), nor any opt-out (nor opt-in) for processing based on Legitimate Interests. Families are not told how pupils and parents may object to any data being collected in the school census, or by what process they may do so. [UIN HL2698] This remains in breach of the basic principles of the law.

A further commitment was made for a meeting between the ICO, the Department for Education (DfE), and other stakeholders, to ensure a balanced approach in developing the code. Let us see the action, and we shall believe. We have plenty of research gathered over the last ten years, and children’s and parental complaints, from which to offer lessons learned and recommendations.



21/1/2025 Part 1 https://hansard.parliament.uk/Lords/2025-01-21/debates/78DC6C49-32D7-40B9-B438-2FF3BC801A8D/Data(UseAndAccess)Bill(HL)#division-53446

21/1/2025 Part 2 https://hansard.parliament.uk/Lords/2025-01-21/debates/24423D96-CD94-4AFB-A47B-DCF3AB3B350B/Data(UseAndAccess)Bill(HL)

28/1/2025 Part 1 https://hansard.parliament.uk/lords/2025-01-28/debates/9BEB4E59-CAB1-4AD3-BF66-FE32173F971D/Data(UseAndAccess)Bill(HL)

28/1/2025 Part 2 https://hansard.parliament.uk/Lords/2025-01-28/debates/AABE348A-4DCF-4355-A420-6C5FC3DF0A52/Data(UseAndAccess)Bill(HL)