
Peers call for Code of Practice in Educational Settings: Data Use and Access Bill

School data processing practices, including AI and data scraping from school social media, were hot topics in the debate on the reform of UK data protection law.

Baroness Kidron, Lord Clement-Jones and Lord Russell included a wide range of subjects in the DUA Bill Second Reading debate:

  • Pupil photos, risks to children from schools’ own websites, and schools’ use of Facebook (and other social media) in the context of web scraping (including abuse imagery in both real and AI-generated ‘fake’ content)
  • EdTech and apps, and whether the ICO Age Appropriate Design Code could be adapted for education, or whether the education sector needs a Code of Practice in its own right
  • Copyright, creativity and IP rights (in the context of web scraping)
  • The omission of the word “protection” from the title of the Bill

Lord Russell of Liverpool said [col.186]:

“We need to prioritise the creation of a code of practice for edtech. We know of many instances where children’s data has been collected in situations where the educational establishments themselves, although they are charged with safeguarding, are wholly inadequate in trying to do it, partly because they do not really understand it and partly because they do not necessarily have the expertise to do it.”

“It is unacceptable that children in school, a place that should be a place of safety, are inadvertently exposed to potential harm because schools do not have the power, resources and knowledge to protect the children for whom they are responsible.”

Baroness Kidron also made more detailed comments targeted at the education sector:

  • The Bill does not address the impact of generative AI on the lives and rights of children.
  • The Bill continues to allow tech companies to “use pupil data to build unproven edtech products based on drill-and-practice learning models”.
  • The Bill could and should introduce a requirement for evidence-based, pedagogically sound paradigms that support teachers and pupils.
  • The Bill extends research exemptions in the GDPR to “any research that can reasonably be described as scientific”, including commercial research. [This means data subjects lose the rights to be informed, to object, and to make Subject Access requests.]

Baroness Kidron said:

“In the recently announced scheme to give edtech companies access to pupil data [the DfE Content Library], I could not see details about privacy, quality assurance or how the DfE intends to benefit from these commercial ventures which could, as in my previous NHS example, end with schools or the DfE having to buy back access to products built on UK pupil data. There is a quality issue, a safety issue, and an ongoing privacy issue in our schools, and yet nothing in the Bill.”

She asked,

“Why not use the Bill to identify and fix the barriers the ICO faces in enforcing the AADC [Age Appropriate Design Code]? Why not use it to extend existing children’s privacy rights into educational settings, as many have campaigned for?”

Lord Stevenson quoted Lord Knight from a previous debate,

“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data” —[Official Report, 19/12/23; col. 2164.]

Lord Clement-Jones pointed out a key issue for children,

“Subjects cannot make use of their data rights if they do not even know that their data is being processed.”

“A formidable number of noble Lords spoke about web trawling by AI developers to train their models. It is vital that copyright owners have meaningful control over their content, and that there is a duty of transparency and penalties for scraping news publisher and other copyrighted content.”

The Bill debate can be read in full on Hansard, with Baroness Kidron speaking early on, and Lord Russell, Lord Clement-Jones and the Minister also referencing schools towards the end of the debate.

Baroness Jones, the Minister, wrapped up with the usual ‘the DfE is doing some things’ summary, which suggests the government has no appetite to address these issues in the Bill, but clearly many peers feel differently. Politicians have been saying similar things for eight years; what learners need now is for practice to change.

Our Defend Digital Me briefing is available to download, with many case studies of what has changed in the sector since the Data Protection Act 2018 came into law.

Our case is set out for an ICO Code of Practice to support schools in bringing what the law says on paper into everyday practice. Between the two lies a gaping chasm that allows harm: children’s data protection and rights are rarely considered effectively (or sometimes at all) in educational settings, whether by the state at local, regional and national levels, or by their contracted EdTech, research, or commercial third-party processing organisations.


Hansard https://hansard.parliament.uk/lords/2024-11-19/debates/6B196F71-312C-4957-AF14-98B66C5DBEE4/Data(UseAndAccess)Bill(HL)

DDM briefing with case studies to download (end of page) https://defenddigitalme.org/2024/11/17/data-use-and-access-bill-our-view-and-call-to-action/

TV https://parliamentlive.tv/Event/Index/4bc930ca-f81e-4928-a11d-f9d942014004

Dr Claire Bessant’s research on schools’ social media policy and pupil photos [download .pdf, 168 KB]. The press release from the parliamentary event of 20/11/2024 is also available to download.