A Code of Practice on Children’s Data and Education: what’s next?

One month after the government made a commitment to require the ICO to create a Code of Practice for Data Protection in education, what’s going on?

After all, the ICO should not be starting from scratch: it can begin with the skeleton areas highlighted in the Convention 108 Data Protection Guidelines for Educational Settings, since the UK is a member state and signatory to that Convention (one that binds us more directly than the GDPR does today, post-Brexit).

The amendment proposed to the Data Use and Access Bill by Baroness Kidron was for a

“Code of Practice on Children’s Data and Education: The Commissioner must prepare a code of practice which contains such guidance as the Commissioner considers appropriate on the processing of data in connection with the provision of education.” (in full at the end of this post)

It did not pass onto the face of the Bill; peers were persuaded by the government after a firm commitment was made to the House that the government would instruct the ICO to make such a Code of Practice happen.

However, while it is true that no new law is needed to do so, related previous commitments have not materialised, and without law that pins it down, will its content do the job that schools’ staff and learners, as well as industry and researchers, all need? The intention must not be narrowed, for example, to cover only EdTech companies.

A Code of Practice must be about the pupils’ data, not the products

It makes no sense to have a Code of Practice issued by the ICO that covers only part of an interconnected process, and only some of the people responsible for it, in the state education system. It would be like writing a baking recipe that starts only with how to mix the ingredients together, leaving out the parts where you have to know the list of ingredients and how to source them, weigh and measure them first, or put the mixture in the oven afterwards. Such a Code would be half-baked: useless for protecting the rights of learners, families and teachers, and no real help for industry, whose re-use of data depends first on the lawful use of state-mandated collection for the school census and finance systems.

The same data also flows into edTech and out again, even into the companies themselves for developing their products (whether lawfully or otherwise). But how will they know whether data was lawfully collected if there is no guidance on what that looks like, or on whose responsibility it is at the point of collection? It would be like telling the school kitchen, “The DfE will mandate guidance on water safety, but only when you use the water to serve pupils at lunch, not at morning break.” The data uses the same pipes and the same collection source, gets the same treatment, and gets passed around as waste in the same way as the rest; you cannot separate guidance between the two, or start at the wrong point in the process and then wonder why no one knows whose job it is to make sure it is clean to start with.

A Code on data processing into, across, and out of state education systems, and any reuses, must cover the entire process: the data flow from collection to destruction, across the data life-cycle and the lifetime of a pupil. It must cover all key stakeholders, data controllers and processors, the role of parents, and the overlapping rights in law in a non-consensual environment. That runs from the start of the admissions process through to the retention by the DfE, in the National Pupil Database, of the first personal data submitted by a parent when a child was four years old, data which form the first parts of a child’s MIS record, retained by the school and submitted termly to the DfE in the school census.

How will Seesaw know that its international transfers of infants’ data are lawful if there is no guidance on the nursery’s collection under their joint controllership, because only the company’s half of the process is included and the setting, the other joint controller, is not? How could an edTech-only Code of Practice leave out the school setting’s responsibilities as the point of contact for Subject Access Requests, or what happens if the company keeps pupils’ data for research indefinitely?

Five years after the ICO audit of the Department for Education took place, the remaining open questions suggest that the work has still not been completed. The ICO has taken no enforcement action despite its serious and substantive findings, as described in the six-page summary (the full version has been refused release via FOI and is still hidden). As far as we can see:

  • There is still no fair processing: telling the people whose data it is what it is used for, where it goes, for how long, and why (“as required under Articles 12, 13 and 14 of the GDPR”);
  • It is still not clear to schools, as data controllers, what their role is in telling families what is collected under which law and what is optional; this was one of the key failings highlighted in the summary of the audit published in October 2020;
  • There is no apparent change in the “over reliance on public task” without identified supporting legislation, or in the “limited understanding of the requirements of legitimate interest” necessary “to ensure the use of this lawful basis is appropriate and considers the requirements set out in Article 6(1)(f) of the GDPR” (page 6/6);
  • There is no right to object, balancing test, or opt-out offered on the collection of, never mind the reuses of, any sensitive and identifying pupil data from the NPD, at local or national level. This is particularly concerning for commercial reuses, where the purpose cannot be said to be compatible with the reason for its collection by a school: to teach a child;
  • There is still no user-friendly Subject Access Request process, none suitable for children at all, nor one that the 23 million people affected know about;
  • And there is no way to know whether your own data have gone to, or are still held under, any of the over 2,500 releases of identifying and sensitive data to third parties since 2012.

The Department for Education answered a House of Lords question (UIN HL2566) asking when the full ICO audit findings would be published. Five years after the audit was carried out, the answer is that, “The department will publish an updated audit closure report in Quarter 1 of 2025/26 on completion of the remedial actions.” Assuming this means by the end of this month, we look forward to it, given that the DfE had already published a closure report in 2023 which did not cover the gaps we set out above.

The DfE has also recently confirmed in writing that “it is not possible for a parent/guardian or an individual child to opt out of the school census collection” [UIN HL2698]. So when it comes to the re-uses, no one offers any right to object (for processing based on public task), nor any opt-out (or opt-in) of processing based on legitimate interests. Families are not told how pupils and parents may object to any data being collected in the school census, or by what process they may do so. This remains in breach of the basic principles of the law. To create a Code of Practice about rights that apply to the same data only when it is in use by EdTech, but not in other processing such as the census, will cause confusion.

In fact, far from the picture suggested in the written answer to Lord Browne’s question, that “schools should ensure that they are made aware of their right to decline to provide these data items,” some schools we hear of, and even councils, still tell admin staff to make up what was refused, such as ethnicity.

If the school does not receive guidance on how to collect ethnicity and process it properly from start to finish, how can a Code of Practice cover its use only at the point of edTech reuse?

Ethnicity is collected once. Name is collected once. Data like these are asked of parents only once, before a child starts school, and almost never again. The same data get reused and passed around without any further permission being sought, any notification, or any chance to object or opt out. Once collected by a school, the school’s Management Information System (MIS) is the core record forever, and from there the flows are managed outwards by the school (often by one office administrator, who is offered little to no quality data protection training).

The school setting is the only place that parents and the child or adult learner have contact with, and where a human face is accessible. They neither know nor have any sight of what goes on at national level, or behind the screen at thousands of edTech companies. The process is inextricably linked, in systems and human processes, to the state education system, and a Code needs to cover it as one data landscape.

The Code will also need to address what ‘children’s data’ is. Does such a conditional description depend on the data subject’s age at the time of data collection, or at the time they want to exercise their rights, e.g. later as an adult? Does the condition remain with the data or expire, and what consequences might that have for exercising the rights attached?

A Code of Practice must be future-proofed

The Department has for some time been actively looking at national pupil data and content reuse for AI development, among other edTech reuses, in the Content Store, and included the question of IP in education in the AI consultation that just closed. The timeline for next steps is short.

The Bill Committee could ask what the plans are for that, in conjunction with pinning down the policy intent for the Code. There are other, broader issues in the Data Use and Access Bill, all of which affect children and learners as a sub-population of measures that will affect data processing population-wide.

The Minister, Chris Bryant, has proposed an amendment at this Committee Stage that will remove a safeguard the Lords put into the Bill: the requirement that the use be “in the public interest”. Without it, subclause 2 of clause 67 will redefine ‘scientific research’ and ‘scientific research purposes’ in the Bill to mean ‘any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity’. We go into more detail in our Briefing v1.7 here, for Committee Stage of the Bill, as of March 4, 2025. Today, data protection offers some safeguard against the government simply handing over all our children’s educational content and data to AI companies for development, as it plans to do later this year. If the Minister removes this safeguard, what else will protect us from endless commercial exploitation for product development (including AI) or targeted marketing labelled scientific ‘research’?

However, the biggest problem in the Bill in some ways is in the detail.

By being so focussed on the detail of this Data Use and Access Bill, few will now ask why these changes go to the heart of data protection legislation and undermine the very essence of what data protection law is for: to prioritise the protection of the person from arbitrary interference in their private and family life, and to ensure people have agency when it comes to ever more automated processing of personal data about them.

As the LIBE Committee noted at the end of 2022, the UK reform is “focused solely on economy, innovation and growth and did not make a link to protection of fundamental rights.” In fact, the UK is not only moving away from the GDPR; when it comes to education more specifically, this is one example among years of neglecting to put into practice any decent data and digital rights literacy across the population, across the teaching sector, and in lifelong learning for society. Without these skills we will never have well-functioning data practices, because people across society, in public life and in commercial business, rarely understand how the law works for them, why, and how to make it routine in everyday practice. Such literacy is desperately needed across the education sector and the curriculum. By creating a Code of Practice for all to understand, we start to teach people from the ground up. The aim is not to achieve this through enforcement, but through education and empowerment.

We hope the serious thinking on this will uphold both the spirit and substance of Baroness Kidron’s call, that, “The Commissioner must prepare a code of practice … on the processing of data in connection with the provision of education.”

A Code of Practice: its history

We have discussed an ICO Code of Practice in education with peers since Lord Clancarty proposed amendment 117 in 2017, when Labour’s Lord Stevenson said the accompanying explanations of what was going on with pupil data in the National Pupil Database “gave him a chill”. It was proposed and debated subsequently by Labour MPs in the Commons, where Liam Byrne, Louise Haigh, Chris Elmore and Darren Jones proposed a “Code on processing personal data in education”, but it did not pass as New Clause 16 in 2018.

A “Code of Practice by Information Commissioner’s Office on data sharing in relation to post-16 education” was debated by peers again in 2021, with support from Lord Storey, Baroness Garden, Baroness Sherlock and Baroness Kidron, and broadly again in 2022 and 2023.

Peers also debated the interconnected issues in a House of Lords debate on Children’s Private Information: Data Protection Law (Volume 826, 12 December 2022).

Now, in this Bill, peers called for a Code again on 18 December 2024, in a debate led by Lord Clement-Jones recalling the work of Data, Tech & Black Communities, expanded on by Baroness Kidron and Lord Russell, and supported in much more detail with comments from Lord Jim Knight, who again spoke about broad practice, not only edTech:

“My final cautionary tale, thanks to Defend Digital Me, is on the national pupil database, which was agreed in 2002 on the basis that children’s data would be kept private, protected and used only for research purposes—all the things that we are hearing in the debates on this Bill. Ten years later, that was all changed and 2,500 data-sharing arrangements followed that use that data, including for universal credit fraud detection. When parents allow their children’s data to be shared, they do not expect it to be used, down the line, to check universal credit entitlement. I do not think that was in the terms and conditions. There is an important issue here, and I hope that the Government are listening so that we make some progress.”

Alternative promises made: an education sector certification scheme

In a 2021 statement to the House of Lords, after the Code had been proposed and debated, and might well have been voted on and passed but for the government’s written commitment to peers, Baroness Barran said on behalf of the DfE that, “The department’s response to this issue is to set up an education sector certification scheme.” She confirmed again in 2024 (HL6145) that an education sector certification scheme was underway, but there is no concrete sign of it yet; at the time of writing, a question asked about it in the House of Lords is more than two weeks overdue for an answer. That is a separate piece of work that would be important to tie in. However, certification is hard for a wide variety of reasons, not least because it relies on knowing what you are certifying, and that can change at any point in time after issue.

What is consistent over time, despite conditions of age or use, is the flow of data, regardless of whether it is in edTech or not, and the legal obligations on duty bearers and the rights of rights holders that the law demands.

Baroness Kidron’s proposal was for

“Code of practice on Children’s Data and Education

(1) The Commissioner must prepare a code of practice which contains such guidance as the Commissioner considers appropriate on the processing of data in connection with the provision of education.

(2) Guidance under subsection (1) must include consideration of—
(a) all aspects of the provision of education including learning, school management and safeguarding;
(b) all types of schools and learning settings;
(c) the need for transparency and evidence of efficacy on the use of AI systems in the provision of education;
(d) the impact of profiling and automated decision-making on children’s access to education opportunities;
(e) the principle that children have a right to know what data about them is being generated, collected, processed, stored and shared;
(f) the principle that those with parental responsibility have a right to know how their children’s data is being generated, collected, processed, stored and shared;
(g) the safety and security of children’s data;
(h) the need to ensure children’s access to and use of counselling services and the exchange of information for safeguarding purposes are not restricted.

(3) In preparing a code or amendments under this section, the Commissioner must have regard to—
(a) the fact that children are entitled to a higher standard of protection than adults with regard to their personal data as set out in the UK GDPR, and the ICO’s Age Appropriate Design code;
(b) the need to prioritise children’s best interests and to uphold their rights under UN Convention on the Rights of the Child and General Comment 25;
(c) the fact that children may require different protections at different ages and stages of development;
(d) the need to support innovation to enhance UK children’s education and learning opportunities, including facilitating testing of novel products and supporting the certification and the development of standards;
(e) ensuring the benefits from product and service developed using UK children’s data accrue to the UK.

(4) In preparing a code or amendments under this section, the Commissioner must consult with—
(a) children,
(b) educators,
(c) parents,
(d) persons who appear to the Commissioner to represent the interests of children,
(e) the AI Safety Institute, and
(f) the relevant Education department for each nation of the United Kingdom.

(5) The Code applies to data processors and controllers that—

(a) are providing education in school or other learning settings;
(b) provide services or products in connection with the provision of education;
(c) collect children’s data whilst they are learning;
(d) use education data, education data sets or pupil data to develop services and products;
(e) build, train or operate AI systems and models that impact children’s learning experience or outcomes;
(f) are public authorities that process education data, education data sets or pupil data.

(6) The Commissioner must prepare a report, in consultation with the EdTech industry and other stakeholders set out in subsection (4), on the steps required to develop a certification scheme under Article 42 of the UK GDPR, to enable the industry to demonstrate the compliance of EdTech services and products with the UK GDPR, and conformity with this Code.

(7) Where requested by an education service, evidence of compliance with this Code must be provided by relevant providers of commercial products and services in a manner that satisfies the education service’s obligations under the Code.

(8) In this section—
“EdTech” means a service or product that digitise education functions including administration and management information systems, learning and assessment and safeguarding, including services or products used within school settings and at home on the recommendation, advice or instruction of a school;
“education data” means personal data that forms part of an educational record.
“education data sets” means anonymised or pseudonymised data sets that include Education Data or Pupil Data.
“efficacy” means that the promised learning outcomes can be evidenced.
“learning setting” means a place where children learn including schools, their home and extra-curricular learning services for example online and in-person tutors.
“pupil data” means personal data about a child collected whilst they are learning which does not form part of an educational record.
“safety and security” means that it has been adequately tested.
“school” means an entity that provides education to children in the UK including early years providers, nursery schools, primary schools, secondary schools, sixth form colleges, city technology colleges, academies, free schools, faith schools, special schools, state boarding schools, and private schools.”

Footnote

Edits post publication: March 4, to add mention of the change proposed by the government to Clause 67 of the Data Use and Access Bill, and a link to our Briefing.