
Safer Internet Day 2019

How would you feel if someone shared something about you online without asking?

This is one of the questions in the education packs for teens, alongside the hashtag #OurInternetOurChoice, for #SaferInternetDay2019.

In the UK, the campaign says that it “particularly empowers young people to take control of their digital lives and consider how consent works in an online context.”

In the highly sensitive debate on children’s safety online, however, the organisations responsible for the UK Safer Internet Day do not talk about one area of children’s online experience in which their own activity is opaque, and in which some young people are denied control of their digital lives or consent in an online context.

We need to talk about safeguarding in schools

Safeguarding-in-schools has created an environment in which adults have decided that it is acceptable to record things children type, search for, or chat about online. This can include, for example, the address of the website visited and the words a child uses. If some of those words match entries in an opaque, multilingual library of thousands of keywords, the match may trigger a screen capture, a recording of the child’s words, or even a snapshot of the computer user. How would you feel if someone shared something about you online without asking? asks Safer Internet Day, while the organisations behind it promote tools which pass information on to third parties, drawn from confidential conversations between a child and, for example, Childline’s Ask Sam or a 1-2-1 counsellor.

Many of these systems operate without the child’s knowledge or understanding, because it is the exception for a system to display a screen notification that monitoring is in progress, and the keyword libraries change. It doesn’t matter if content is encrypted, or if staff are booking a medical appointment or using online banking in their lunch break. Everything might be monitored, and unknown keywords from multilingual libraries might trigger screen recordings or further action.
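The mechanism is simple to sketch. What follows is a deliberately minimal, purely illustrative example in Python, assuming a hypothetical keyword library and placeholder capture functions (KEYWORD_LIBRARY, log_flag and capture_screenshot are invented names); real products use proprietary multilingual libraries of thousands of terms, and their actual trigger logic is not public.

```python
# Illustrative sketch only: not any vendor's implementation.
# All names here are hypothetical; real libraries hold thousands
# of multilingual terms and are not disclosed to schools or children.

KEYWORD_LIBRARY = {"example-watchword", "another-term"}

def log_flag(user, text, matches):
    # Placeholder: a real system might store the typed text and matched
    # terms, sometimes off-site, and retain them for a year or more.
    print(f"[flag] {user}: matched {matches} in {text!r}")

def capture_screenshot(user):
    # Placeholder: a real system might save a screen image, or even
    # a webcam snapshot of the computer user.
    print(f"[capture] screenshot recorded for {user}")

def monitor_typed_text(user, typed_text):
    """Check typed or searched text against the keyword library."""
    words = set(typed_text.lower().split())
    matches = words & KEYWORD_LIBRARY
    if matches:
        log_flag(user, typed_text, matches)
        capture_screenshot(user)

monitor_typed_text("pupil-42", "searching for an example-watchword here")
```

Note that in this pattern the child sees nothing: the check runs on everything typed, regardless of context, and the consequences of a match happen entirely out of view.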

Childline, an NSPCC website, says, “Talking to one of our counsellors in 1-2-1 counsellor chat is private. This means that whatever you say stays between you and Childline.” Except it doesn’t, if your school uses one of the many monitoring tools installed on school-provided equipment or the school network, or requires one to be installed under a Bring Your Own Device policy. Some monitor 24/7, “every day of the year”, including on cloud-based systems.

Childline’s webpage even suggests, “If you’re worried that what you’re doing or saying on the Childline website might be being tracked by monitoring software that’s been installed on your computer, you can …use a different computer to view the Childline website or contact us online – like one at school, the library or in an internet café.”

One can only assume that the person who designed the webpage content, and whoever is accountable for its sign-off, does not know about the role in safeguarding in schools of Childnet’s Safer Internet Centre partners, SWGfL and the IWF.

Where do your data go?

Some of these software tools are only used locally. The provider doesn’t have access to the data captured, and it stays on site. Others enable data transfers beyond the school network to third parties, including to company-owned cloud-based systems abroad, such as in the US. Some retain sensitive data for over a year, including the recordings captured by keyword triggers, covering mental health, sexuality, or intimate personal conversations. These data flows beyond the school are opaque to the child, with no consistent communication of what is collected, by whom, why, how it is shared, or how long it can be retained.

Some companies may never see the data, since there are no data transfers outside a school, and their ownership sits inside the UK or Channel Islands. Others are owned by banks in Bahrain, private equity and property managers, or foreign investors. If agreements loosely permit data sharing within a company and its affiliates, school staff, and certainly families, might not be clear where those data go or what the agreement permits.

Data flows are opaque inside the safeguarding-in-schools system itself as well. The Designated Safeguarding Lead may be someone in a school, or in an academy Trust. They may or may not know the child about whom the many systems, in a variety of ways, create flags or heat maps of online activity, such as searches or typed words around sex, self-harm, or Syria.

Public debate is entirely lacking on this, perhaps partly because few parents know it happens. But staff are raising issues around consistency, deletion, error rates, and children wrongly labelled, or labelled for nothing at all.

So it’s high time some questions were clarified.

Not everything done in the name of safeguarding is the same. Will a teacher intervene if a child is known to have visited a mental health or gender counselling website? What effect might it have on a child’s access to the service? Trusted safe spaces must remain confidential if they are to be trustworthy. What if a child wants to chat about a teacher or a parent, or even ask in confidence about a friend’s experience and not their own?

Is a school entitled to know everything a child types on school-owned equipment? What about on the child’s own device? Only in school lessons, or out of hours? On school premises, or at home? When does someone intervene, and how? Are children asked about activity before it becomes a permanent record? Is the record shared before the child or parent is told? And if you feel your processes work well, let’s share that better practice with others.

Prevent

Separately but related, the IWF (Internet Watch Foundation), a Safer Internet Day partner, is responsible for some of the content which feeds into the broad variety of safeguarding-in-schools software. As a result of this, and of content coming from the CTIRU (Counter Terrorism Internet Referral Unit), the words a child uses online may trigger the child being flagged as at risk of radicalisation, or as a potential gang member, or being seen through the lens of terrorism. This is separate from the blocking and filtering of web content.

Information about a child can then flow into the internal school or local authority mechanisms that lead into Channel and Prevent referrals. Why aren’t schools keen to talk about this? Why do IT staff occasionally get downright lairy, complaining that children are no angels and that anyone asking for system scrutiny should *****r off? Why do neither companies nor schools have anything on their websites about which keywords trigger the flags, or information that makes the implications adequately accessible and foreseeable, so that a child can reasonably understand what the consequences of their conduct will be?

For those who think this must all stay secret, we must ask why. For Local Authorities that use the blanket FOI exemption for the ‘purposes of safeguarding national security’ and pass questions off to the police or Home Office, one wonders where the boundaries are drawn between responsibilities and accountability.

If your interventions affect children, you should be able to be open about how many children your area is referring into programmes, and how well those programmes work. Let’s talk about what you can talk about, and why there are parts you cannot.

How is privacy balanced with protection? What is OK to monitor? Where and when is it not? What happens when a child triggers a watchword, and who acts on it? How many errors are recorded, deleted, or passed on as false flags? How much do these systems cost, and where is the cost-benefit analysis of machine versus human systems? What are the risks, benefits, and harms for our children? How are these tools creating permanent profiles of children, and are those profiles accurate and fair? How many Prevent referrals out of education are created by this software, some of which claims to employ AI, adding to the opacity of human decision-making, and who is accountable for its accuracy and consistency?

Consistency, clarity and confidence

Some companies are starting to become much more responsive to the issues, engaging with us in open and constructive ways, and even changing their products and communications. But if the partnership of Childnet, the Internet Watch Foundation (IWF) and South West Grid for Learning (SWGfL) is serious about ‘empowering children to be in control of their digital lives’, and about consent, then it needs to enable a national conversation about policy and practice across this whole sector.

Where there is a lack of consistency, confidence, and clarity, it should be fixed with transparent guidance on use, with public information, in policy and in practice.

There must be consistent communication to children and parents during every part of a system’s implementation: in the risk assessment at introduction, in explanations to school leadership and governors, in explanations and routes of redress across the process, and in fair policies and communications into, through, and out of Prevent.

Who decides whether a piece of software is safe, fair, and transparent, or when a technology becomes harmful? Or where, when, and why it should not be used in schools? Safer Internet Day shouldn’t be selective about who decides what is safe, and its promoters and providers must start increasing transparency and fairness in their own activities.

 

Notes

Safer Internet Day 2019 will be celebrated globally on Tuesday 5th February with the theme ‘Together for a better internet’.

The UK Safer Internet Centre – a partnership of three leading charities: Childnet, Internet Watch Foundation (IWF) and South West Grid for Learning (SWGfL) – provides resources for children, schools and families, and tools for getting involved at www.saferinternet.org.uk.

Globally, Safer Internet Day is celebrated in over 130 countries, coordinated by the joint Insafe/INHOPE network, with the support of the European Commission, and national Safer Internet Centres across Europe.

Over fifteen companies supply a variety of monitoring software to schools in the UK. One company claims to supply 1 in 3 of England’s 23,000 schools. These products are all different, with different capabilities, data processes, and management methods.

The Keeping Children Safe in Education guidance on safeguarding in schools is issued to schools in England. Schools and colleges in England must have regard to it when carrying out their duties to safeguard and promote the welfare of children. There is no obligation for these duties to involve a technology-based solution.