
Open Letter to Ministers: Drop the damaging UK Data Protection Bill

In a joint effort led by Defend Digital Me, expert groups and academics with a focus on state education, data, technology, and human rights have written an open letter urging the UK Government to drop the Data Protection and Digital Information Bill, and to support action under existing law aimed at better data protection across the education sector.

The signatories include two teaching trade unions, as well as UK- and US-based organisations. We are concerned that the government’s proposals will seriously weaken data protection law in the UK, and could particularly harm children.

The Bill is now at Committee Stage in the House of Lords and will next be debated on April 15th and 17th. Amendments to the original Bill due to be discussed this week include a new Code of Practice for EdTech.

The letter can be downloaded and read in full here.

Summary and importance of the changes planned

The letter, addressed to the Secretary of State for Science, Innovation and Technology (DSIT), the Rt Hon Michelle Donelan, and copied to DSIT data and digital ministers and the Secretary of State for Education, emphasises the critical role the law plays in safeguarding the right to privacy and children’s data protection, and outlines the threats the proposed changes will bring. Seen through the lens of education, this has consequences for learners, families and staff, as well as for companies and their role in the UK as producers and exporters of safe, quality, and responsible EdTech in a global online environment.

EdTech, when used, must promote learners’ educational outcomes, social development, and human flourishing. But when it does not, the effects can be long-lasting (ICDPPC, 2018). While we want to see the Bill dropped, if it must go ahead we support in principle the proposal in amendment 146, which would require the Information Commissioner to create a Code of Practice for EdTech. Such a Code is something we at Defend Digital Me have supported for all data processing across education, not only EdTech, in previous parliamentary debates since 2017, from Data Protection laws to the Skills and Post-16 Education Bill in 2021 (and see pages 23-25 of our November 2023 DDM briefing on EdTech).

Stephen Cragg KC also suggested in his legal opinion in November 2023 that “The best way to protect children’s data is by the retention or introduction of specific safeguards in legislation. However, there is no doubt in my mind that, additionally, such a code of practice as previously advocated for by DDM would be a useful tool for ensuring that special care is taken when the processing of the personal data of children within the education and social care systems (especially) is under consideration.” (paragraphs 65 and 66)

Our recommendation, however, is that the regulator should create a code of practice for all pupil data processing, including by the Department for Education and the wider public sector, not only EdTech. The Bill makes far wider and more damaging changes to the current UK data protection regime that, on balance, will harm children’s and students’ data rights as a whole. The changes the Bill will bring to all data processing undermine the aims any Code could achieve before it has even started, including in EdTech.

The letter calls on DSIT (and the DfE) to drop the Data Protection and Digital Information Bill, and asks them instead to use existing data protection legislation to better protect children, families, teachers, and society from data harms today, for example based on the recommendations in the Convention 108 Guidelines for Data Protection in Education Settings as they apply to the UK, and to consult widely on building the legislative foundations needed across the digital environment in the education sector, underpinned by rights. This requires infrastructure and policy fit for the known and unknown future threat models facing public sector institutions and individuals, as well as in pursuit of the positive opportunities for the public sector and for EdTech exports, and above all to promote the inclusive education of every child.

The evidence of the need for change and support for the sector is clear, and has been debated often before. Members of the House of Lords held a short debate on Educational Technology in November 2023 (view transcript, Hansard Vol. 834, html); on Children’s Private Information and Data Protection Law in 2022 (Hansard, Vol. 826, December 2022); and on Biometric Recognition Technologies in Schools in 2021 (Hansard, Vol. 815), among many others.

With ransomware attacks on education infrastructure, unfavourable contract terms commonly offered at school level, poor availability of teacher training or space in the curriculum for professional development of digital skills, and ever more intrusive technologies and applications on the horizon, and with questions over copyright, plagiarism, and how to deal with misinformation and fake images and video from generative AI, now is the time to reinforce the guardrails of data protection in education, not take them away.

The key changes the new law will make

These changes go to the heart of data protection legislation, and undermine the very essence of what data protection law is for: to prioritise the protection of the person from arbitrary interference in their private and family life, and to ensure people have agency.

The LIBE Committee noted at the end of 2022 that the UK reform is “focused solely on economy, innovation and growth and did not make a link to protection of fundamental rights.” In fact, the UK is not only moving away from the GDPR but, when it comes to education more specifically, has not yet made efforts to put into practice the Council of Europe Data Protection Guidelines for Educational Settings (2020) adopted by the Committee of Convention 108, in whose drafting the UK was involved and to which it is a signatory.

The Bill fails to prioritise rights or to put in place protections and safeguards; instead it removes them, for example by moving a list of legitimate interests onto the face of the Bill without balancing tests, and by ending consistent obligations to keep records of processing activities (ROPA). The Bill redefines some of the fundamental definitions of UK data protection law, including personal data itself. This will be detrimental to children, Stephen Cragg KC found in his legal opinion, commissioned by Defend Digital Me before the Bill was debated in the House of Commons in November 2023: “If the new definition of personal data … is enacted that will also, of course, mean that fewer data of children will be protected under the new law.” (KC opinion, para 61)

We warn that the Bill undermines every one of the seven key data protection principles, lowering today’s standards of obligations on lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; confidentiality and security; and accountability.

Further significant changes include:

  • The redefinition of the terms ‘scientific research’ and ‘scientific research purposes’ by clause 2 of the Bill.
  • Loosening the requirements on purpose limitation. 
  • A list of ‘legitimate interests’ has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored.
  • The changes in Schedule A1 that would permit targeted political marketing at children aged 14-18 may open the floodgates to “a deluge” of direct email and misinformation sent to children’s inboxes and phones, including political extremism, with no fact-checking or oversight, since political ads are not regulated by the Advertising Standards Authority. In 2019, Sky News reported that hundreds of ads were shown to 13 to 17-year-olds on Facebook and Instagram at a time when it was not permitted. The [then, now former] Children’s Commissioner described the practice of targeting young people as “irresponsible”.
  • Unsafe technology products may be encouraged under the new ‘safeguarding vulnerable individuals’ umbrella in Annex 1.
  • Clause 14 does not offer children any protections from automated decision-making. Opaque, unfair algorithms and automated decisions can have life-changing outcomes without clear routes for redress.
  • Further changes have high-risk potential for detrimental effects on children and their carers, including DWP powers for bank account surveillance; however, our focus here is on education.


Jen Persson, Director at Defend Digital Me said:

“The Bill undermines the very essence of what data protection law is for. We do not send our children to school to be walking, talking, AI training datasets, and respect for our privacy and data rights must not be scrapped so that companies can treat us as that. Our personal data is the story of our lives, not a product to be privatised — and our children are not for sale.”

“The introduction of many common EdTech tools into the school setting means opening the door into a child’s life to hundreds, often thousands of strangers who may influence a child’s experience, behaviours, learning and development, through interactions with companies and their affiliates as they market, track and measure pupils through data analytics across the digital world. Parents have lost track of their family’s digital footprint created in school, by the end of the first year of state education.

“In 2018, Defend Digital Me carried out a review of school policies via ICT staff. 94% had no communication policy for school leavers to tell them how their personal data will be retained. And in a poll of 1,004 parents of state school aged children, Survation found over two thirds (69%) of parents said they had not been informed that the Department for Education may give away identifiable data from the National Pupil Database to third parties. 81% of parents agreed that parental consent should be required before a child’s special educational needs data is shared. It is time that the DfE respects that, and ends the commercialisation of childhood when a child walks through the school gates.”

#DPDIBill #DataBill

Related background references

(1) Legal Opinion from Stephen Cragg KC, November 2023 [download as pdf]
(2) Defend Digital Me Bill briefing to accompany the KC opinion by Stephen Cragg KC, December 2023 [download as pdf], and DPDI Briefing Letter as audio [listen to DPDI Bill DDM letter 12122023 audio version]
(3) Defend Digital Me Second Reading Briefing, April 2023 | Data Protection and Digital Information Bill (2) v1.5.pdf
(4) Defend Digital Me EdTech and pupil data Briefing: November 2023 v.1.9
(5) House of Lords short debate on Educational Technology, Vol. 834, 23 November 2023 [view transcript Hansard html] [view parliamentary TV from 16:00]
(6) In March 2023, Defend Digital Me, as part of a coalition of 25 further civil society organisations, wrote to the Secretary of State for Science, Innovation and Technology, Michelle Donelan MP, calling for the Data Protection and Digital Information (DPDI) Bill to be dropped. We reiterated this ask in October 2023 with the then DCMS Minister, John Whittingdale.
(7) 2018 findings from an informal DDM survey of ICT staff at 35 schools [download .pdf]
(8) October 2023: The ICO Audit of the DfE Three Years on https://defenddigitalme.org/2023/10/07/the-ico-audit-of-the-department-for-education-three-years-on/
(9) April 2023: Does your national school record reveal your sexual orientation or religion? DDM blogpost https://defenddigitalme.org/2023/04/02/does-your-national-school-record-reveal-your-sexual-orientation/
(10) Convention 108 Data Protection Guidelines for Educational Settings (2020) Council of Europe https://www.coe.int/en/web/data-protection/-/protect-children-s-personal-data-in-education-setting-
(11) May 2022: DDM research report The State of Biometrics 2022: a review of policy and practice in UK education https://defenddigitalme.org/2022/05/04/the-state-of-biometrics-2022-new-report/