What keeps children safe online? A reflection on #SaferInternetDay 2022

On #SaferInternetDay 2022 we reflected on what ‘safe’ means, drawing on this year’s theme, ‘Exploring respect and relationships online’, in the context of digital monitoring: so-called “safeguarding in schools” technology, or SafetyTech. It was also a topic discussed at the #BCSPolicyJam on Online Safety and myth busting, with @cyberbaines, Prof Andy Phippen and @DrChips.

SafetyTech and safe spaces

In 2017 the UK government claimed to want to make the UK “the safest place to go online.” In fact, the story is less about safety, and more about control.

Trusted relationships and safe spaces are vital to protect children, and are often most needed by those most at risk. The NSPCC’s Childline itself recognises that if monitoring software is being used on your device, it means “someone might be able to track the pages you’re looking at on the Childline website”, or see “what you’re saying in a 1-2-1 counsellor chat, what you type in your locker and what you say on the message boards.” (TW: abuse, CSAM) Around 1 in 12 of Childline’s counselling sessions relate to abuse or neglect. Over 90% of sexually abused children were abused by someone they knew. (NSPCC, 2021) Children need safe spaces where they can chat to counsellors, seek information, and be free from prying eyes, even those of their teachers, parents and carers, especially where there are conflicts over developing sexuality or religious views.

But a new taxpayer-funded campaign claims that making online spaces more secure will make children less safe. Such claims need to be challenged. Claims that taking away the security offered by encryption is the necessary way to keep children safe online, or claims of support for both encryption and ‘back doors’ simultaneously, are not realistic. They also leave out an important part of the big picture, one generally missing from the UK policy debate: children’s trust.

The Australian eSafety Commissioner carried out research in 2019 and found that, “when it comes to monitoring, young people were split in their views. While the majority (71%) believed that monitoring systems scanning messages and content in the background were helpful in preventing negative experiences, over half (57%) were uncomfortable with these features running in the background. A sizeable minority found monitoring features intrusive (42%) and were unsure about their effectiveness in ensuring online safety (43%).”

The significance of a breach of children’s safe spaces lies not only in the physical and financial security risks to everything done online, from interactions and transactions to gaming, but in the breach of trust and the effect that can have on young people. Interventions must be necessary and proportionate, and one-size-fits-all approaches are rarely either for a population that is not homogeneous. Once children’s communications have been breached, they will never know, nor be able to trust, that their conversations will be kept confidential in future. What harm that does to children’s development, mental health and trust in authority needs to be researched.

SafetyTech and surveillance

Today’s safety tech tools in schools often work around encryption. Some may capture bank details and passwords. Some match content against libraries of thousands of keywords. And remember the 2010 US legal case, Robbins v. Lower Merion School District, about a school taking covert webcam photos of students? How many covert webcam photos are taken in the UK using similar kinds of technology? What would a safe hiring process need to look like for the staff who work in companies with this kind of access? What oversight should it all have, and from whom?
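To illustrate why keyword matching deserves the scrutiny argued for here, consider a minimal sketch of a naive, context-free matcher of the kind described above. This is an invented illustration under our own assumptions, not any vendor’s actual algorithm: the keyword list, function name and example text are all hypothetical.

    # Illustrative sketch only: a naive keyword matcher of the kind school
    # monitoring tools are described as using. The keyword list and example
    # text are invented; no real vendor's library or code is shown here.
    import re

    KEYWORD_LIBRARY = {"drugs", "self harm", "weapon"}  # hypothetical entries

    def flag_text(text):
        """Return every library keyword found in the text, with no
        understanding of the context in which the word appears."""
        lowered = text.lower()
        return [kw for kw in sorted(KEYWORD_LIBRARY)
                if re.search(r"\b" + re.escape(kw) + r"\b", lowered)]

    # A history essay triggers the same flag as a genuine concern would:
    print(flag_text("The Opium Wars were fought over the drugs trade."))
    # -> ['drugs']

A matcher like this cannot tell a history essay from a cry for help, which is exactly why the questions below about accuracy, error rates and who reviews the flags matter.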

The UN Special Rapporteur’s 2013 report on the right to freedom of opinion and expression noted why communications surveillance is even more problematic when it is taken “out of the realm of judicial authorization and allow[ed] unregulated, secret surveillance, eliminating any transparency or accountability on the part of the State.”

Discussion of safety and respect online must include respect for human rights, but today’s guidance on digital safeguarding in schools leaves this out.

Also critical, and often overlooked: children cannot be thought of as standalone individuals, because their devices and behaviours are connected to other people in their family and friend networks. Children can be targeted by malicious actors in order to breach their parents’ security. Children need the protection of secure, encrypted spaces not only for themselves but for their connections too.

Some tools available to States that breach families’ safe spaces and target their personal communications are recognised as dangerous or unlawful. Why not ‘SafetyTech’?**

While the Home Office, part of the Five Eyes surveillance alliance and the driver of the campaign against encryption, strangely omits any mention of the tools already available, we are asking what human rights mean for an industry that suggests “privacy doesn’t exist anymore”, and that the DCMS is doing all it can to promote.

SafetyTech and accountability

Companies are inserting themselves into the relationship between school and child, and between child and family life, sometimes with Artificial Intelligence based solutions or with human staff interventions unbeknown to families. And while 24/7 monitoring of security firewalls may be appropriate to maintain system integrity, profiling children to this level of invasiveness, knowing exactly who did what and when, in and out of school, 365 days a year, even if it was a passing personal thought since deleted, is neither necessary nor proportionate.

What are their accuracy and error rates? What triggers a Prevent referral? Who are these companies, where are they based, and to whom are they accountable?

Some have financial ties to the Middle East. At least one has connections to Jersey. One rather publicly failed security analysts’ testing, more than once. One has used what it claims is a real rape case study in its marketing materials, naming the College. And at another company, with a mobile-phone-based approach, the CEO said in 2021 that his company’s lawyers had in fact warned him its technology was unlawful.**

As Joanna Adler wrote in 2017, taking a wider view of companies in the digital space generally, “None of them should be expected to be our personal ethical filters, and we can’t abrogate parental responsibilities to any entity that has its own duties to employees, shareholders or trustees.”

While making an environment less hostile may be desirable, making anything meet everyone’s definition of safe is impossible. In our view, children need to be able to cope and feel confident in digital settings, with agency and the ability to speak up, appropriate to their capacity. They should be able to flourish, free from interference of all kinds, and be able to seek support or redress when things go wrong. The importance of building children’s digital resilience is summed up by Przybylski et al. (2014) as

“best reached through supportive and enabling parenting, encouraging the development of digital skills and allowing children to take risks and develop coping strategies in the online world, just as we would in the offline world.”

That is what Safer Internet Day was once about. It is where the UK Online Harms agenda began, but most of the literacy focus has been lost along the way. Like the scope creep of SafetyTech, the draft legislation has become focused on control of content, and on identifying the individuals accessing that content.

State-sanctioned mass intrusion by strangers into children’s safe spaces is the opposite of making children safe. Since 2015 the UK’s priority has been building the market for these kinds of technology. On Safer Internet Day 2022 it is time the SafetyTech sector grappled with the problems of its own design.


Jen Persson, defend digital me

February 8th, Safer Internet Day 2022