News / National Pupil Database pupil privacy

Surveillance of every child’s online activity risks a spiral of silence

Every child’s Internet use will soon be placed under mandatory surveillance thanks to statutory guidance, “New measures to keep children safe online at school and at home”, proposed by the Department for Education in December 2015 [1] and due to take effect from September 2016.

Schools and other educational establishments will: “need to have appropriate filters and monitoring systems, so that no child can access harmful content via the school’s IT systems and concerns can be spotted quickly.”

The Consultation [2] closed mid February 2016 and we await the publication of submissions and the government response.

We have responded [3], asking for an independent assessment of the cost, benefit and risk of how these systems already work in schools today, before their use is made compulsory in all schools. This would support the suggestion in the consultation that the Department wants to “test that assumption” that this software is already widely used.

Effects need assessment before compulsory imposition

Web monitoring is not to be confused with risks associated with lack of web filtering.

Over-filtering, which leaves teens unable to access mature discussions such as the BBC Newsnight debate on the death penalty, is a separate question that this duty on schools also raises. For the purposes of this discussion, however, monitoring and keylogging are our key concern.

How this duty will work in practice is entirely unclear. Given that the software providers, local authorities and schools buying the systems are all different, we wonder whether there is any consistency in codes of practice or in their application.

How do schools that permit pupils to “bring your own device” (BYOD) from home ensure transparent and safe monitoring of pupils’ school time only, and not of private or parents’ use outside school hours?

How transparent is this monitoring and logging from “keyword libraries” to pupils and parents?

What due diligence is done on the providers of this software, who will have direct access to the equipment and data of millions of children?

What course of redress is there for a pupil whose password is used by another pupil, or who may be stigmatised by judgements made in error by remote third-party monitoring? Some providers offer a remote service, or a machine ‘flagging system’, removing the teacher’s judgement from the classroom.

Children have rights to education, to access to information, to freedom of thought and to privacy. Children have the right to have their opinions heard and respected, particularly when decisions are made about them.

This statutory duty will potentially put at risk children who feel unable to search for sensitive subjects at home. Children who cannot get online at home to look up health issues, questions of sexuality, or even certain types of risk, might be disadvantaged.

Confidentiality is key to young people’s trust in the teacher-pupil relationship. A pupil wondering “Will Teacher know I looked up transgender issues for teens?”, one who does not want to talk about it but does want information, will potentially be harmed, as will that relationship.

Through a chilling effect and a spiral of silence, surveillance reduces children’s ability to question and challenge the topics of the day, which need a safe, private space in which to be searched for.

Engagement and transparency of use are needed

The public deserves thorough, transparent scrutiny of not only the theory and purposes of these plans, but a practical understanding of codes of practice, ethical decision making and how they are compatible with human rights to privacy, interference with family life, and children’s rights.

At regional and local level, all educational organisations and those working with children’s personal data should be required to put in place clear public statements about their collection, use and analysis of data, and individuals should be informed about this, especially as regards automated decision-making and monitoring technology (including the role played by third-party services). We note a separate but similar call from the Education Technology Action Group (ETAG) in 2016 [4].

Secondly, independent privacy impact assessments should be mandatory, and published before children’s data are extracted and shared with any party other than the one to which they were given with consent for a specific purpose. Extensions broadening that purpose should require consultation and consent before wider sharing.

Thirdly, we ask that, at national level, all legislation should meet the call contained in the 2014 report ‘Responsible use of data’: “the Government has a clear responsibility to explain how personal data is being used.” [5]

Understanding how our data are used includes transparency and information on:

  • who may access what monitored information,
  • where information is stored and for how long,
  • what may happen as a consequence of the data being read, whether by algorithm or with human intervention, and
  • who will measure whether the goals have been met, and how both the financial and intangible costs for our children, teachers, staff and schools will be evaluated.

We support the call by academics for “a new framework for child protection, provision and participation online that results in clear and effective policy that is born of real needs, targets specific and evidence-based risks, includes measurable goals […] policy implementation is independently evaluated.” [6]

Schools and staff have responsibilities to all our children, but online and offline surveillance must not be imposed on all our rights without debate and transparency.

After all, 
“Children do not lose their human rights by virtue of passing through the school gates.” [7]
[3] Our response to the consultation: many open and practical questions