A reflection on 2024 for pupil data in England and beyond
Blog / December 30, 2024
As we reflect on 2024, Defend Digital Me has remained steadfast in advocating for learners’ rights to privacy and family life within educational settings. Our mission to ensure safe, fair, and transparent data practices has been at the forefront of our efforts throughout the year.
This year has seen Generative AI and web scraping hit the public in ways not seen before. On Data Protection Day 2024 we called on the education sector to stop posting children’s faces online and to better protect children’s biometric data and privacy.
The DfE started to explore linking genomic data with pupil records, a move that would carry big risks and potential unintended consequences. The DfE and the Government Office for Science awarded a three-month, £50,000 contract to look at the implications of future genomic technologies for the education sector, as reported by Schools Week.
The DfE also announced in January that it would add yet another sensitive label to pupil records: “Down’s Syndrome” will be collected in the termly state School Census from 2025.
On World Book Day 2024 in March, the Guardian reported on a survey of 1,000 7- to 14-year-olds in the UK, which found that more than a third of children cannot choose what they want to read, and one in five feel judged for what they do read.
In April we were working on what would become the predecessor to today’s Data Use and Access Bill. In a joint effort led by Defend Digital Me, expert groups and academics with a focus on state education, data, technology, and human rights wrote an open letter urging the UK Government to drop the Data Protection and Digital Information Bill, and to support action aimed at better data protection across the education sector under existing law. We also supported Home Ed Cymru in their response to the Welsh Government consultation on “children missing education” databases and Regulations for Local Authorities and Local Health Boards (2024).
In May, Schools Week reported that “Pupil data is being used to check for benefit fraud and pursue parents under a secret deal between the education and work and pensions departments.” The DfE began regular data-matching exercises of pupil data for DWP purposes.
June was all about the coming General Election.
July saw the ICO issue a reprimand to yet another school that broke the law when it introduced facial recognition technology (FRT). And we participated in the UCL event, More or less technology in the Classroom?
In August, the government announced that pupil assessment data will be used in a ‘store’ to build AI products, about which many questions remain open.
In October, we closely monitored the Autumn Term School Census, emphasising the importance of data protection and the ethical use of learners’ personal information. Our Director, Jen, was involved in the delivery of the Council of Europe Working Conference on Regulating the use of AI systems in education in Strasbourg.
November marked a significant period as we engaged (again) with the (revised) Data Use and Access Bill. Peers advocated for a Code of Practice for educational settings, aiming to establish clear guidelines for data usage that prioritise learners’ rights and bring clarity for industry, school staff, parents, and learners of all ages.
We led an event in the House of Lords on AI and Education in the week of World Children’s Day and its motto for 2024, “Listen to the future. Stand up for children’s rights.” It included discussion of the risks of scraping and AI re-uses of data for model development with Dr Claire Bessant (Northumbria University Law School), contributions on children’s copyright matters from Dr Maria Frabboni (Sussex University Law School), and from Data, Tech, and Black Communities.
On Human Rights Day, December 10th, we continued our advocacy during the Committee Stage discussions of the Data Use and Access Bill, reinforcing our commitment to upholding human rights in data practices.
And we addressed initial concerns regarding the Children’s Wellbeing and Schools Bill, raising critical questions about its implications for privacy and family life, and the necessity for comprehensive engagement and consultation with those most affected.
There have been various events and discussions on the Ofcom codes of practice around “age appropriate” practices in the digital environment and the Online Safety Act, work that continues into 2025.
But what have we not seen? Concerted effort at the Department for Education to get the ICO audit issues sorted out once and for all: fair processing, opt out/in, and respect for data rights in practice. Meaningful, dissuasive action from the ICO. More and more parents contact us about 360° cameras and always-on voice recording in classrooms, while facial recognition and fingerprinting continue to be normalised in schools. Ever more intrusive systems claim ever more ‘safeguarding’ spaces, including school systems that can see children in the nude and take pictures in their bedrooms. Despite our reporting these gross failures of child protection, and of data protection law, to the ICO, DSIT Ministers, and the Department for Education, we see no changes, but instead a government that dreams of unicorns.
Meanwhile, the DfE seems to expect that more punitive approaches and surveillance through the Children’s Wellbeing and Schools Bill will be ‘better’ and ‘safer’ for children, without understanding that these kinds of approaches are the very thing that makes the vulnerable less disposed to participate in state systems, exacerbating the problems the state wants to solve.
Busy as they are, tied up with the easier and more populist ‘ban mobile phones’ agenda and pointing fingers at big tech, national data protection, data strategy, and edTech strategies all let children down. We will continue in the vein of the World Children’s Day motto for 2024, to stand up for children’s rights, and for everyone whose educational data and pupil records stretch long into adulthood.
Looking ahead, our objectives for the future include taking some of these long-standing efforts over the line:
- Advocating for robust legislation: We will persist in our efforts to strengthen legislation that safeguards children’s personal information going into, across, and out of educational settings in England, through a dedicated piece of legislation on digital rights and education, whether on the face of legislation or through data protection law. The two draft Bills above will need further attention.
- Promoting real routes to the realisation of rights: Ensuring that data collection and usage in schools are conducted transparently, with clear communication to students and parents about their rights, including how the right to object and to opt out of processing based on legitimate interests should be upheld, through an opt-in mechanism for data re-uses. National pupil data should be used only in ways people reasonably expect and are told about.
- Engaging in Policy Development: Actively participating in the creation of policies that prioritise learners’ privacy and promote safe, fair, transparent data practices. This includes national policies on pupil data reuse for AI development, pupil data secondary reuses by other departments, and edTech and SafetyTech policy and practice.
- Raising Awareness: Educating stakeholders, including staff, parents, and policymakers, about the importance of privacy and the potential risks associated with improper data handling in the digital environment, and supporting the development of tools to make this possible.
- Action: If the government and regulators will not take their responsibilities seriously to make digital spaces rights-respecting for children and to change state policies and practices, then we expect parents and staff will start taking more direct action.
We extend our gratitude to our Advisory Council, especially to those who have left at the end of 2024 after many years’ support, and to those who have joined, and to all our allies, supporters, and partners who have joined us in championing these vital issues throughout the year. Together, we will continue to strive for an educational landscape that respects and protects the privacy and rights of every child and learner into adulthood and across the public sector in England, and beyond.