Recommendations for Policy Makers for the Data Protection Bill #StateOfData2018
National Pupil Database surveillance / March 17, 2018
Seven Recommendations for Policy Makers from the State of Data 2018 Data Protection Bill Discussion, Committee Room 10, March 12.
The safeguards for which civil society advocates are intended to reduce harm and enhance human flourishing. But the policy makers shaping the UK Bill have not yet adopted many of those important safeguards. This leaves people in the UK, including our children, exposed to greater risks than necessary: of discrimination, of damage to reputation, participation, or person, and of loss of control and oversight of our identity, or of the power to seek redress if personal data are misused without our knowledge. These risks arise because the Bill implements weaker protections for our data and our right to privacy than the hundreds, perhaps thousands, of people who have shaped the GDPR to date understood to be necessary. They wrote those protections into the GDPR as derogations, but our Bill has left them out.
We all need to recognise that our thinking about contemporary and future threats, and about what data is and how it is collected and used, is often out of date. We need policy makers to put into the Data Protection Bill the safeguards that the GDPR describes, to uphold fundamental rights and freedoms well enough not only now, but in future. These are your rights, as well as your children’s.
As the State and corporations give ever more authority to machine-made decisions, and with it power to profit-led companies, it is vital that the effects on people stay at the centre of policy thinking. Human rights must stay foremost in the minds of those with responsibilities for technological change and legislation.
For children specifically, rights should reflect the gold standard as enshrined in the UN Convention on the Rights of the Child. GDPR Recital 38 demands that, “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing, or creating personality or user profiles, and the collection of personal data with regard to children when using services offered directly to a child.”
That recital should be the guiding principle as regards children in our Bill. There are around 11 million under-nineteen-year-olds in education in the UK today.
Here are the seven recommendations defenddigitalme has summed up from the event on Monday March 12.
Of particular relevance to children specifically:
- Support the proposed amendment New Clause 16 [p54/55], which would require the Information Commissioner to publish a UK-wide Statutory Code of Practice as regards children and pupils in education, taking age and capacity into account, and which would embody the Article 29 Working Party’s explicit recommendation to create guidance about children, profiling and automated decision-making. The WP29 noted, “Article 40(2)(g) explicitly refers to the preparation of codes of conduct incorporating safeguards for children.”
- Remove the immigration exemption (Schedule 2, Paragraph 4), or expect further loss of young people’s respect for state data collection about them, further loss of public and professional trust, and damage to administrative data quality.
- Recital 71 states that solely automated decision-making should not apply to a child. Put into our Bill the necessary safeguards that the GDPR describes in Article 23 and Article 22(2)(b). There is no need to create anything new: simply adopt what the GDPR suggests to protect fundamental freedoms.
- Those safeguards must be especially strong as regards children’s biometrics [now common in schools, even for buying a snack or borrowing a library book, often without the consent that UK law requires, or any explanation of retention], a special category set out in Article 9 but not in the Bill. We suggest that ICO registration of this special category processing should be required.
- Put the right to representation and collective redress, which the GDPR describes in Article 80(2) but which we have not yet adopted, into the Data Protection Bill. This affects everyone, but children more than most, as they are often not equipped to hear about, or to understand, where their rights have been infringed.
And indirectly, as they affect everyone:
- Increase transparency of, and trust in, the intelligence and security services’ necessary and proportionate powers. This may also increase the likelihood of the adequacy arrangement the UK needs.
- Remove or constrain the “government datasharing framework”, a provision so open that even the Information Commissioner herself has described it as “unnecessarily wide”.
Future democratic engagement, and our economic and social well-being, demand improved understanding of, and broader involvement in, how personal data are used by others. Help children and young people, and each one of us, understand our rights and responsibilities when it comes to personal data. Get young people involved in informed debate about data, digital rights and ethics, together with an inclusive range of organisations that represent and support them. Data protection law and policy should be non-partisan, and must be forward thinking, for the good of all, founded on human rights.
Blog: Discussion Highlights and Reflections
Matt Hancock, Secretary of State for Digital, Culture, Media and Sport, this week framed keeping children safe online as a world in which screen time is the threat. Policy makers think too narrowly about what today’s data experience looks like for children, and about the threat model. Their support for tools that are not fit for purpose, and this lack of perception, mean our Data Protection Bill is missing the necessary safeguards that the GDPR includes and recommends to protect the rights of children in practice.
We doubt that MPs have any idea where their own children’s personal data are, and who is using them. Not because of screen time, but because of how children’s data are used in the education sector. Parents have lost control of their child’s digital footprint forever by the child’s fifth birthday, thanks to local and national policy and practice.
What does data collection look like for children, and will legislation be adequate for the application of rights at international, national and local level?
Wendy M. Grossman described the sensor-filled future as approaching with the speed of an oncoming train. It means we need to address passive personal and behavioural data collection about us from our surroundings in public and private spaces much better in policy and practice. “We are talking about data as though it is merely an online phenomenon, but what we are seeing right now is offline companies who are looking at Google’s business models and saying, ‘I want some of that’.”
Patrick Geary, Corporate Social Responsibility Specialist at the United Nations Children’s Fund (UNICEF), asked how we can practically deliver the GDPR requirement that terms and conditions be accessible to children using gadgets that may not even have a screen. “How do you meaningfully explain an increasingly complex and ever-evolving digital ecosystem to a 13 year old?”
His key theme is one we want policy makers to keep central in the Data Protection Bill as it regards children: the need to recognise the contemporary and future threats that an ever more machine-driven environment poses to their human rights, as enshrined in the UN Convention on the Rights of the Child.
“Children have a right to privacy. They not only enjoy the same privacy rights that adults do, but they have a specific right to privacy under the Convention on the Rights of the Child. The special importance of children’s right to privacy is also recognised in legislation like the GDPR…the relationship between privacy and the internet is especially complex. On the one hand the internet gives children the ability to freely and independently connect, learn and communicate in ways that can dramatically enhance their sense of privacy. On the other hand the internet is also a place where children’s privacy is at serious and growing risk of interference. ”
His unit at UNICEF looks specifically at the responsibility of business, and is about to publish an industry toolkit on children’s online privacy and freedom of expression.
Part of bridging the disconnect between technology, legislation, public policy and data is designing machines well for people, for human interaction. Richard Gomer, Research Fellow at the University of Southampton, Interaction Design Consultant, and a school governor, spoke about the importance of building intelligible, controllable technology, so that people understand how the decisions they are making in a technology-rich world are going to have repercussions, what they mean, and have some hope of predicting what those actions might lead to.
Ailidh Callander, Legal Officer at Privacy International, reminded us that the reality for children today is that computing is embedded in things designed for data collection about behaviours, interactions with others, and even biometrics: in toys like the Cayla doll, which records and passes on children’s voice recordings, or in “smart watches” and other IoT [Internet of Things] devices aimed at children. The Norwegian Consumer Council research on smart watches, #WatchOut, found that a stranger could take control of the watch. They could track or eavesdrop on the child, or communicate with them. Privacy, security and safety issues converge in IoT passive data collection that often happens unseen and without consent.
How, in this world or in classrooms, do we ensure the practical application of the principles of data minimisation and data protection by design and default (GDPR Article 25)? Article 16 of the UN Convention on the Rights of the Child specifically enshrines that, ‘No child shall be subjected to arbitrary or unlawful interference with his or her privacy’, and that the child has the right to the protection of the law against such interference.
“When you take that into an education context, the rights context of youngsters, it becomes a really difficult question,” Carla Aerts, Director of Futures at the UCL Institute of Education and Deputy Director of EDUCATE, reminded us. It is a question that education must nonetheless grapple with, as the collection and use of data has expanded beyond reasonable expectations, and digital records retention is now starting to hit the 20+ year mark in some schools, with little idea of what is necessary, what should be deleted, or what should be anonymised.
Aerts also pointed out that, “Everybody claims they are anonymising data. Anonymising data is rather hard to do and to do well.” And she said we must be alert to the risks of commercial players increasingly becoming the owners of the AI, and that as the AI employs our personal data, we become commoditised.
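To illustrate her point about why anonymisation is hard, here is a minimal, hypothetical sketch of why simply removing names is not anonymisation: the quasi-identifiers left behind (postcode, date of birth, gender) can often be linked back to a named individual using a second dataset. Every name, value and field below is invented for the example; no real dataset is referenced.

```python
# Invented example: an "anonymised" pupil dataset with names removed,
# but quasi-identifiers retained.
anonymised_pupils = [
    {"postcode": "BS1 4ND", "dob": "2006-03-14", "gender": "F", "sen_status": "EHCP"},
    {"postcode": "BS1 4ND", "dob": "2007-11-02", "gender": "M", "sen_status": "None"},
]

# A second dataset that does carry names (e.g. a commercial or public list).
named_records = [
    {"name": "Jane Doe", "postcode": "BS1 4ND", "dob": "2006-03-14", "gender": "F"},
]

def reidentify(anonymised, named):
    """Link records on quasi-identifiers alone."""
    keys = ("postcode", "dob", "gender")
    return [
        (n["name"], a["sen_status"])
        for a in anonymised
        for n in named
        if all(a[k] == n[k] for k in keys)
    ]

print(reidentify(anonymised_pupils, named_records))
# [('Jane Doe', 'EHCP')] -- the sensitive attribute is re-attached to a name.
```

Doing anonymisation well means addressing exactly this kind of linkage, through aggregation, generalisation or suppression of quasi-identifiers, not just deleting the name column.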
A desire to be seen to welcome new and shiny smart systems can dull our appreciation of the limitations of technology, its biases and its propensity for discrimination. Lack of understanding is a barrier to developing adequate policy and legislation, and to enforcing standards for how technology is applied and affects our data rights in practice.
NHS hospitals have entered into contracts with Google DeepMind and handed over 1.6m identifying patient records without consent. The ICO ruled that the Royal Free NHS Foundation Trust failed to comply with the Data Protection Act. Where the same is already happening in education, we have yet to see. We didn’t get on to discussing that, but as Aerts said, “This is something that needs to be addressed in any education debate and any learning technology debate.”
Importantly, she said, debate must also involve more young people themselves, especially as we discuss the ethics of personal data (we would add, even de-identified data) used to make interventions with groups, and the public good. But we must ensure they are informed participants in debate, not left to be used as fig leaves of approval for projects after the fact, as “engagement”.
Everyone agrees that when it comes to children and young people, it matters how information about data collection and its use is communicated. As someone else working with young people pointed out, “It is one thing informing them, but another getting them to understand this is why you should care, instead of just, ‘This is the information you are presenting’.”
Toks Oladuti, Director of Information Systems at an independent school trust in London, presented what is possible in schools: how to do data privacy and protection well with young people in practice. He explained how he had improved communications in the data protection policies and practice at his school. The starting point a couple of years ago, to be GDPR-ready, was assessing what data they held and why, and that included changing the mindset around rights and security, among staff but also in suppliers. “They were not aware of where data was being stored, and whether it was being stored in the US, which we found out for quite a few services; some services were actually storing data in China, which was an issue; and we found one company that was storing data in Russia.”
The good thing about being in education, he went on to say, is that, “where you are working with a group of vocational people, the hearts and minds are in the right place: they are, in essence, wanting to do the right thing. Their whole concern is about child welfare, child education, and that is their focus. We wanted to address awareness, and get them to understand what was involved [in good data practice], and what the requirements were, to empower them to use that knowledge. It became clear, both in our own school and also talking to other schools, that awareness and understanding were quite low.”
“There was, when we started, a disjoint between schools, suppliers of school technology, and their understanding of or approach to data protection. We were looking at a lot of the suppliers that we as schools were using, and that were available to us as schools, and at how they managed data, what their policies were on data, and what they shared with us with regards to what they were doing with data. There was not a lot of information there; it was quite lacking, in quite a worrying way.”
He pointed out that children are often mislabeled as digital natives, and while they are, “brilliant digital consumers, they do not actually understand the underlying technology, how it works, and how their information is used.”
He went back to basics, overhauling policy and practice to deliver a much better standard at their independent school than we at defenddigitalme commonly hear of today in state schools. They asked, “Is this the right piece of technology, or should we be doing it this way? Why have we got this data? We do not actually need it, so let us not keep it, and let us not collect it. Why have we got so many copies of this data?”
While we expect the GDPR will increase passing awareness of data protection responsibilities, we must not forget that data protection law is mostly about processing and using what has been collected. Human rights should also remind us to ask the question, as Toks did: should the data be collected at all? In the UK we are moving in the direction of always making all data accessible, and rarely asking whether that meets people’s reasonable expectations.
Those questions should be part of Data Protection Impact Assessments, and of the data protection by design and default required by GDPR Article 25. But how will schools understand what the GDPR considers reasonable and “appropriate technical and organisational measures”?
Tony Sheppard of GDPR in Schools, a hands-on practitioner of what the legislation means for school staff, echoed those concerns from his experience of current state schools, and said that there needs to be consistent information to answer the questions schools are going to be asking, “because the majority of the information out there is quite generic. At the moment the stuff from the ICO, although invaluable, is generic and not necessarily in context.”
The poll of 1,004 parents we recently commissioned, carried out by Survation, and a separate small UK-wide sample of views from school IT staff, reinforce our conclusions of the last 12 months about how unaware parents are of how children’s data are collected, used and distributed from schools. It is rare to see schools meet even current data protection law well, before we look at the expected GDPR standards.
Big Brother Watch Director Silkie Carlo shared the concerns over the lack of information available to children and parents, and echoed the stories of poor security in the handling of children’s sensitive data.
The effects on children of the growth of surveillance culture in schools, using biometric fingerprint and facial recognition technology, and of the extent of automated decision-making and profiling, are poorly understood, she said, even as these practices become trivialised.
“Kids now have to give their fingerprints just to get their lunch. There is a casualisation of biometric data, which is incredibly sensitive. The use of CCTV within schools, in toilets, in changing rooms – but last week we saw some of those cameras got hacked and had been live streamed on the Internet.”
“Then the key logging as well… We are worried at Big Brother Watch that children are growing up with this culture being ingrained in them from school, the expectation that our every move should be watched and monitored for something like counter-extremism, for nine or ten year olds.”
She pointed out the conflict between Matt Hancock’s desire to micro-parent by taking control of children’s social media use into state hands, while at the same time the government refuses parents and children their subject access rights to see their own record on the National Pupil Database. That database gives away children’s personal confidential data to third parties; 69% of parents said they had never been told about it, and were never asked for consent, even though a two-thirds majority indicate that is what they would expect and want.
A similar apparent conflict between offline and online government policy for protecting the best interests of children was summed up by Giles Davidson, Head of Policy at Tech UK, not least, he said, given the government had just abandoned plans for compulsory PSHE. He asked how we expected to see the principles of data minimisation applied, and said the infrastructure for children’s digital understanding needs thinking through, “because a lot of these things require more data on children, not less.”
Another conflict we didn’t touch on is the extraordinary amount of linkage of pupil data with health, justice and police administrative data that has begun at the Department for Education.
How children flourish in this world of ever greater data collection and linkage of datasets, by companies or by the state, which can build up a picture of an individual’s life in ways the person cannot see, will depend on us building the odds in their favour.
Ailidh described how hard it is for children to navigate the world and be treated fairly where there is an imbalance of power. “The spaces and environments we inhabit and pass through perpetually generate and collect data from human behaviour. This data is used to identify and profile individuals, generate intelligence about us and our societies, to anticipate and shape future behaviour and make decisions about us and our world. This creates various problems and can result in abuse. We do not know or control what data is on our devices, and this can be used against us; for example, a young person could be arrested and not know that there is data on their phone that they cannot access, but the police can.” We often forget, but a gadget may always remember, who we were with, or near.
“As a result of design choices and technologies, individual and collective behaviour is increasingly traceable, meaning that the intelligence agencies of today will be industry tomorrow. For example, social media is used to generate insights and intelligence on us all; we just need to think back to Facebook’s proposal for Messenger Kids. The data that is observed, derived or predicted from behaviour is increasingly used to form consequential decisions about individuals; an example of that in recent times is Admiral Car Insurance, who wanted to use Facebook posts from first time drivers to analyse the personalities of car owners and set the price of their premiums.”
Rebalancing this power in children and young people’s favour is one reason that the right to representation and collective redress (Article 80(2)) is also important to adopt, so that civil society is able to represent groups, especially children, who do not have the capacity, the time, or the understanding of how these issues affect them. We urge the government to adopt the available derogation.
While we hope that the Data Protection Bill builds a framework for our children to safely share, store and secure their personal data with transparency, the government is simultaneously seeking exemptions in various areas that cause concern.
One such exemption that concerns Privacy International is around national security and defence. “It is a very broad exemption and allows for national security certificates. These are broad, forward-looking and timeless certificates that just provide a complete exemption to data protection without any transparency, oversight or accountability, and they just exacerbate the problems that we have seen under the current Data Protection Act. They have been issued in the past to the intelligence services and to TfL. Whilst not necessarily directly relevant to children, it is such a broad exemption that it has consequences for us all.”
The immigration enforcement exemption is another that is stacked against children, and could cause them harm and distress.
Liberty’s Policy and Advocacy Officer, Gracie Bradley, explained why it is an incredibly broad exemption. “It essentially says that when a data controller or processor is processing someone’s information for the maintenance of effective immigration control, or the prevention or detection of activities that would undermine it, essentially all of their rights under Chapter 3 of the GDPR can be set aside at the discretion of the data controller.”
Tying together the perfect storm of poorly understood automated decision-making, discriminatory AI, and a context of cuts to legal aid and poor official decision-making, she explained why it is vitally important to protect people, especially those most disadvantaged or already targeted by the state.
“Ultimately what we are moving towards is a system in which, at the click of a button, people can be locked out of all kinds of essential goods and services just because their immigration status is wrong. Ultimately people will not be able to live the dignified life all of them should be able to, if we do not get this sorted out.”
And she reminded the audience, many of whom are data users from academia and data bodies, that, “whatever good things you want to get done with data, you have to also be willing to fight with us about those nefarious uses of data, because otherwise the default position from people like me is going to have to be, ‘You cannot have the data, because the state is going to do this and that with it’.”
In Committee discussion of the Bill on Tuesday March 13, the day after the event, Victoria Atkins MP, speaking for the Government on the Bill Committee, explained explicitly that national pupil data is one area the exemption is intended to be used for, embedding the secret policy put in place in 2015 that went undiscovered for a year. “If we know they have a child, we will seek from the Department for Education assistance to find the whereabouts of the child. That child has not committed a criminal offence, so I would be very concerned to ensure that the Home Office, Border Force or whoever else acted lawfully when seeking that data in order to enable them to find the parent or whoever is the responsible adult, as part of the immigration system.” The policy’s discovery led to a highly successful campaign led by Against Borders for Children, and to a rate of refusal, or data not obtained, of over 25%.
That is the future damage that looms for all public administrative datasets, if the government does not withdraw the immigration exemption, and rein in some of its Henry VIII powers.
Public trust is the cornerstone of maintaining the public good that everyone wants from data. It is at perilous risk in the NHS for similar reasons, as well as from the overreach of commercial companies’ uses of data without informed consent.
Trust will be incredibly fragile if its basis is uncertain. That is a risk the government seems prepared to take, even to the point of jeopardising adequacy, by including its own addition to the Bill, the “government datasharing framework”, which gives Ministers new powers and future scope for more; a provision so open that even the Information Commissioner herself has described it as “unnecessarily wide”.
Public trust, and with it the public good of data used well, once lost is hard to regain. This clause may well be the undoing of adequacy, if not now, then in the future.
Dr Zeynep Engin suggested that providing a national infrastructure with appropriate controls of privacy and security embedded is a key responsibility of government. “That is not to say centralised data, but a distributed, secure, privacy-preserving infrastructure, so that local governments, businesses, and individuals can make their own plans accordingly. Similar to governments providing road infrastructure or the rail network for everyone else to basically make their plans accordingly.”
Ambitious forward thinking is needed, fit for future technology that is already around the corner and in our classrooms, based on democracy, fairness and public trust, with respect for fundamental rights and values not yet embodied in our Bill. We are not designing our systems, policy or legislation for children, and we are certainly not educating them. The onus is on us as adults to make it happen.
As the Children’s Commissioner said in her 2017 report, Growing Up Digital, ‘We are designing a world right now that takes away agency from our children’. We at defenddigitalme want government to design policies that give our children agency and understanding, underpinned with respect for their rights.
In a world that speaks ever more about greater personalisation, we are in fact being treated less and less as individuals, and increasingly by comparison and probability: predictions made from our every behaviour, analysed by machines to see how we measure up to profiles built up from other people’s historical data. It is not only about screens, but also about sensors. The sketch below illustrates the mechanism.
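As a minimal, hypothetical sketch (all profiles, features and numbers below are invented for illustration, not drawn from any real system), this is roughly what decision-by-comparison looks like: an outcome is assigned not from your data alone, but from whichever historical profile you most resemble.

```python
# Invented example: a decision about one person is derived from how closely
# their behaviour resembles profiles built from other people's historical data.

historical_profiles = {
    # profile label: (average behavioural features, historical outcome rate)
    "low_risk":  ((0.9, 0.1), 0.02),
    "high_risk": ((0.2, 0.8), 0.35),
}

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(person_features):
    """Assign the person the outcome rate of the nearest historical profile."""
    label = min(
        historical_profiles,
        key=lambda k: squared_distance(person_features, historical_profiles[k][0]),
    )
    return label, historical_profiles[label][1]

# The individual never sees the profiles, nor why they matched one of them.
print(predict((0.3, 0.7)))  # -> ('high_risk', 0.35)
```

The point is not the arithmetic but the opacity: the decision depends on other people’s data, which the person being judged can neither see nor contest.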
That is why we are pleased to see the proposed amendment that would require the Information Commissioner to set out an inclusive Statutory Code of Practice for data rights and responsibilities as regards children and young people in education, taking age and capacity into account, aligned with the UNCRC. It will offer improved clarity, consistency and confidence about data protection rights and responsibilities across the sector, for children, parents, schools and system providers.
We think it’s urgently needed. We hope Government will agree.
Event Full Transcript
#TheStateOfData2018: Rights and Responsibilities House of Commons, Committee Room 10, 12 March 2018
This transcript is an accurate reflection of the full discussion, to the best of our knowledge and ability but given its volume may be imperfect. Please contact us for any corrections needed.
Jen Persson, Director, defenddigitalme
Preamble
Good evening, everyone. Thank you for coming. My name is Jen Persson. I am Director of defenddigitalme. I would like to say thank you to our host Darren Jones MP, who unfortunately is not able to be here this evening due to other business pending, but wishes us well. It is very timely, because we are right in the middle of work on the Data Protection Bill, as many of you will know.
This evening I am going to kick off, and then we will hear from each of our invited speakers who will speak for five minutes. What I am hoping is that we will have a proper discussion afterwards, and we will have plenty of time for that, so that we do not only hear from the expertise of the invited speakers, but your expertise in the room. I know that I have learned an awful lot the last year in my work on data and data protection, particularly for children and in the education sector, from each of the people on this panel and from many of you here tonight. I believe we can only do data well in the UK if we do this kind of thing more often and work collectively to try and make our understanding of data policy and legislation and practice the best we can have it.
Scene Setting for The Data Protection Bill
To briefly set the stage of where each of us fits in. In the House of Lords last November discussing the fourth industrial revolution, Baroness Lane-Fox said, ‘Arguably the biggest challenge we have facing governments globally at the moment is the disconnection between policy, legislation, and the pace of technological change’. I believe there is a fourth element that is vital to that that is missing, and it is us, the people.
It is why I was delighted that Darren welcomed the suggestion to host this evening, to bring these four areas together. The Data Protection Bill that starts in the Committee Stage tomorrow really encompasses all four of those elements, and I am grateful to him, and to Ben in his team for making tonight possible.
It matters that we get this Bill right. It matters that we fix this disconnect between policy, legislation, technological change and people, and I am delighted that all of you are here tonight because you think that matters too. If I were at home I would be reading to my children tonight, probably about another woman’s life from Goodnight Stories for Rebel Girls, their favourite book of the moment. I have given that up tonight, as each of you have given up something and I appreciate your time this evening to come together, because this Bill, this legislation, this policy is not about data, it is about people. It is about us. It is about our lives, and other people’s lives, and we need that discussion to bring together this scope that is bigger than the few of us speaking this evening, or any of us in the room, or even the policymakers who are focusing on the Bill in the next few weeks.
The GDPR scope is bigger than all of this, and it builds on the Data Protection Act from 20 years ago, but it is already out of date with the pace of technological change. How do we all keep up, all of us tonight? I am delighted that we have academics from across the UK from different universities, experts who spend all their time on data, the UK Statistics Authority, Government Departments, and MPs, as well as anyone who is interested and not only the usual suspects.
Civil Society and Government
We have got three civil society organisations here tonight, with ourselves from defenddigitalme, who specialise in data rights for children in the education sector. We have Liberty and Privacy International. We are also expecting Pippa King from Biometrics in Schools, and Silkie Carlo, Director of Big Brother Watch. As civil society we need to come together to make our contributions heard, but we need your support in order to do that.
We believe in working together to make society fairer, more just, more equal, more free, and more about people than about power.
Government policy and legislation have always been about power, but now technological change and data are at the heart of that power, the power that knowledge brings. Knowledge that was once all about understanding or asserting what happened in the past is now obsessed with prediction, or with nudging us towards a changed future.
Unlike the industrial revolutions that have gone before, in which those in power may or may not have been concerned how people will adapt to change, and the results of change brought about by machines, this time our leaders and policymakers are also worried about the machines themselves. And if they are not worried about the machines, and the power of machines, and the corporations behind them, they should be.
Corporate Responsibility
That brings with it a responsibility for the type of power that we now have in Government and in corporations. Patrick, who joins us from UNICEF, will set the context of that corporate social responsibility tonight. How machines are being taught, what they learn, the implications of their decision-making as a result of what they have learned, and the power and scale of the knowledge that brings, are unprecedented. It is vital in this machine-driven age that human values stay at the centre of our thinking, and that human rights stay foremost in the minds of those with responsibilities for technological change. That is where civil society comes in.
But we in civil society cannot alone be the checks and balances that this type of power requires. Corporations know more about what you and I bought last week than we remember ourselves. From Tesco, which knows from its Clubcard what the nation buys, and the machine-led position of power that gives them, driving what farmers plant and how they harvest, and the distribution of food at scale; to computer decisions on whether you or I qualify for that flight discount or insurance discount, or make it through an employment screening, or not. All these machine-made decisions give that decision-outcome authority, and power, and it is power that is opaque. How the computer decided to say, ‘no,’ is based not only on your own personal data, but on that of your profile, and on how the computer compares your personal data with that of millions of other people, their profiles, built up from their personal data.
In a world that speaks ever more about greater personalisation, we are in fact being treated less and less as individuals, but rather by comparison and probability, and by how we measure up to each other’s profiles built up from historical data.
Power at scale and nudging population-wide change
With this increased level of knowledge comes an increased imbalance of power, and we need to have checks and balances in place to make sure that it is used well. In education we can look at one small example: a friend who knows their classmate well in the lunch queue has the ability to nudge that one child’s choice of crisps or fruit. They could help their friend, that one person, make a healthier choice; but a state-run Behavioural Insights Unit that knows exactly what the entire population does can nudge a whole population’s behaviours at scale.
You could help a whole nation make healthier choices, or political choices, but that type of population influence, that nudge, is opaque. Like that of the Department for Education, which over the last year has built up its own Behavioural Insights Unit, about which we know nothing at all.
Accountability
The General Data Protection Regulation, the GDPR, demands that people are accountable for their actions concerning the data management of others. While we may talk about social media companies failing in their responsibilities, we should be careful not to dehumanise the people behind these companies, and to hold those who make these decisions responsible for their actions. In the world of growing algorithmic decision-making, machine-made decisions, we must be careful to remember that these are computers designed and fed information by people. The face of algorithmic accountability is not Pepper or Sophia, but SoftBank Robotics and Hanson respectively.
I am delighted that Zeynep is here tonight to speak somewhat to algorithmic decision-making in government, but it also sets the context for the type of power in corporate use of algorithms today as well.
Involvement and Action
Kevin Courtney, Joint General Secretary of the National Education Union, has kindly agreed to chair our questions and discussion, which he can do so much better than I. He also brings neutrality to the debate, and I am delighted that he has taken an interest in data, and in how data is being used in education in particular.
I have asked our speakers tonight to talk about a particular area that they are specialist in and, where it concerns a civil society organisation, a particular area of the Bill that needs work. I am also asking you to do work tonight. I am going to ask you to take away work. Each of us in this room has a position of privilege, some of us more than others, and if we are not invited to the meetings, or do not usually take part in this type of discussion about legislation or policy, then we need to host our own. We need to get involved, because at the moment these types of decisions are being made by very, very few, and they are not necessarily decisions that are in the best interests of everyone.
If everyone is involved in these discussions, we have a better chance of those outcomes being fair and just and equal. Councilwoman Tamaya Dennard, in her recent Cincinnati campaign for Council, quoted Shirley Chisholm from the 1970s: ‘If they do not give you a seat at the table, bring a folding chair’.
It’s been my motto for the last two years, and it’s why we are all here tonight. Because the issues that we have today in data cannot be solved by a few policymakers alone, and some of their solutions have incredibly harmful side-effects, some by design, some by accident. We will speak to some of those this evening.
Each of you, far more expert than I am in all sorts of different ways, can bring a chair to this table tonight. Afterwards I hope that you will get involved if you have a particular area that catches your interest from one of the speakers, whether it is with your MP, or whether it is writing to the Bill Committee. If you think, ‘That is an issue I want to get involved with’, you could also get involved with one of our organisations afterwards.
So, we are going to talk across the scale of data and how children’s data fits into society and fits into the wider context. Then we will go progressively down a level, from different national government departments, right down through to the education sector and schools, my own organisation’s particular area of interest. For that, I am delighted to have Toks speak to us tonight, who has written extensively for TES on the GDPR and is Director of ICT in a London school. So we are going to start with Patrick speaking about corporate social responsibility from UNICEF’s perspective and how children’s privacy and participation fits into that international context.
Panel Session
Kevin Courtney, Joint General Secretary, National Education Union (NUT Section)
Thank you to Jen for setting the scene. I am chairing the discussion. Until September I was General Secretary of the National Union of Teachers. We amalgamated in September and I am Joint General Secretary with Mary Bousted of the National Education Union. Our first speaker is Patrick Geary from UNICEF talking at the global level.
Patrick Geary, Corporate Social Responsibility Specialist, United Nations UNICEF
It is a pleasure to be here tonight. I will do my best to give you a quick run-through of the international framework around children’s rights in a digital world. I use that phrase, because that is what we talk about when we talk about children’s rights in UNICEF in this sphere. We want to look at children’s rights holistically, and with a great understanding of the ways in which they are indivisible, interdependent, and interrelated.
By children’s rights we mean the UN Convention on the Rights of the Child. This is very much the gold standard. It is a holistic package of rights that recognises the special situation of children, from health and education to participation and expression, the Convention covers all aspects of children’s lives. It has been with us now for more than 25 years, so I guess it is no longer a child itself – it is well into adulthood, probably graduated from university, thinking about what it is going to do next. It was ratified by the UK in 1991, so it has also been in force here for almost that same amount of time.
While the Convention predates the internet as we know it today, it applies equally in the digital world. This means that children have the same rights online that they do offline.
When we think about children’s rights in the digital world it is important to think in the same way that the Convention does, which is, again, holistically. This requires understanding how children’s rights relate to one another, and striking the right balance between protecting and empowering children.
Children have a right to privacy. They not only enjoy the same privacy rights that adults do, but they have a specific right to privacy under the Convention on the Rights of the Child. The special importance of children’s right to privacy is also recognised in legislation like the GDPR and the Data Protection Bill. For children, the relationship between privacy and the internet is especially complex. On the one hand, the internet gives children the ability to freely and independently connect, learn and communicate in ways that can dramatically enhance their sense of privacy; on the other hand, the internet is also a place where children’s privacy is at serious and growing risk of interference.
Children’s personal data is collected even as they develop in the womb and the trails of information that children born today will leave behind become wider and deeper with each passing second. Although we do not yet know what the full implications of this will be, it is reasonable to imagine that the data gathered, collected, retained and processed about children will have significant, long-lasting and life-changing impacts.
Children’s right to privacy is affected by many different actors across many different contexts. Public actors may collect children’s personal data both expressly and covertly in order to deliver services, to track behaviour, to investigate potential illegal activity, and to facilitate surveillance programmes. Private actors may collect children’s personal data via websites, applications, products and services they offer, often with the aim of monetising this data for commercial purposes like marketing and advertising. Parents may set up systems to supervise and control how children use the internet, gaining access to records of children’s online activities and placing restrictions on the kinds of information they may seek online. Schools may use technology not only as a tool to aid learning, but also as a means to monitor children’s behaviour, both inside and outside the classroom.
There are undoubtedly legitimate reasons to interfere with children’s privacy. Children also have rights to be protected from violence, and to develop to the best of their potential. This gives good cause for adults to intervene and guide how children exercise their right to privacy online with the expectation that children will have greater autonomy over their rights as they mature.
There is also an imperative to protect children’s privacy online and to ensure that children’s best interests are taken into account in decisions that affect them. This means we must acknowledge the full sweep of risks and benefits of any measures that would interfere with children’s privacy. In the end, the ultimate goal must be to empower children’s growth into digital citizens who are able to confidently exercise their full range of rights online.
When we look at the legal and policy landscape today it is clear that children’s rights online, and children’s right to privacy in particular, have skyrocketed up the national, regional and international public policy agenda. The UN Special Rapporteur on Privacy highlighted children’s rights as an area of growing focus in February. The Council of Europe will be adopting guidelines on children’s rights in the digital environment this month. The EU’s General Data Protection Regulation will go into force in May, and the Information Commissioner’s Office here in the UK is in the midst of a consultation on children’s online privacy.
I think this is ample inspiration for the UK to re-examine the legal and policy frameworks for children’s privacy, and given the unique challenges and opportunities around education in the digital world I hope the conversation tonight helps all of us better respect, protect and realise children’s rights online. Thank you.
Kevin Courtney
Thanks Patrick for that fantastic international view. You started with the UN Convention on the Rights of the Child, and now we are moving straight over to Ailidh Callander, Legal Officer with Privacy International.
Ailidh Callander, Legal Officer at Privacy International
Thank you Kevin, Patrick and Jen. As Patrick said, the right to privacy is enshrined in national, regional and international human rights instruments, and it is specifically enshrined for children in Article 16 of the UN Convention on the Rights of the Child, which states that, ‘No child shall be subjected to arbitrary or unlawful interference with his or her privacy’, and that the child has the right to the protection of the law against such interference.
As the generation and collection of data, both online and offline, increases in our day-to-day lives, so too does the scope of interference with children’s right to privacy, through both state and corporate surveillance and data exploitation. One way in which the law seeks to protect children’s right to privacy is through data protection legislation, and that is why looking at the GDPR and the Data Protection Bill is so relevant. I will quickly outline what we see as the problem of data exploitation at Privacy International, how the GDPR and the Data Protection Bill can help to provide some safeguards, and what we are particularly concerned about in the Data Protection Bill that is going through Parliament at the moment.
What is the problem? As you know, the spaces and environments we inhabit and pass through perpetually generate and collect data from human behaviour. This data is used to identify and profile individuals, generate intelligence about us and our societies, to anticipate and shape future behaviour and make decisions about us and our world. This creates various problems and can result in abuse.
We do not know or control what data is on our devices, and this can be used against us; for example, a young person could be arrested and not know that there is data on their phone that they cannot access, but the police can, using tools such as Cellebrite. As computing becomes embedded everywhere, privacy, security and safety issues converge, and we see this in IoT. Clear examples have arisen with IoT devices aimed at children, and we have seen in the work of the Norwegian Consumer Council on smart watches that a stranger could take control of the watch, track or eavesdrop, or communicate with the child.
At the same time it offered a false sense of security to parents, with unreliable alerts and SOS functions. Similarly, their work on the Cayla doll found that a lack of security meant it was possible to talk and listen to children through the toy without being physically present, that information shared with the doll was also shared with third parties, and that children were being subjected to hidden marketing.
Furthermore, as a result of design choices and technologies, individual and collective behaviour is increasingly traceable, meaning that the intelligence agencies of today will be industry tomorrow. For example, social media is used to generate insights and intelligence on us all; we just need to think back to Facebook’s proposal for Messenger Kids. The data that is observed, derived or predicted from behaviour is increasingly used to form consequential decisions about individuals; an example of that in recent times is Admiral Car Insurance, who wanted to use Facebook posts from first time drivers to analyse the personalities of car owners and set the price of their premiums.
For example, someone that was reliable at arranging to meet friends at a specific time could be seen as conscientious, and someone who uses exclamation marks could be over-confident, and that is the way a first time driver would be profiled and they would pay the financial consequences.
Our existing rules and safeguards are unprepared for this new reality and the power imbalance is heavily weighted towards industry and Government, as opposed to individuals, particularly children.
However, GDPR does offer some important provisions to tackle these concerns. It strengthens the rights of data subjects including providing a detailed right to information, stronger consent provisions and limits on certain automated decision making. It outlines the concept of data protection by design and by default, putting into practice principles such as data minimisation and purpose specification. It provides for requirements as to security of personal data, and limits on its sharing, and provides us with tools, such as data protection impact assessments, records of processing, and the appointment of data protection officers.
This does not always go far enough, though. In order to go some way towards actually protecting individuals’ rights and addressing these power imbalances, the GDPR obligations that I mentioned need to be taken seriously and put into practice in designing, planning, producing, delivering and maintaining systems, goods and services, in particular those aimed at children.
Legislation, especially the GDPR, is the fruit of compromise. One element of this is that there are a number of derogations in GDPR which mean that the UK has discretion in how it implements those. We are concerned that the way in which the UK Government has chosen to implement those derogations undermines data protection. I will mention a couple of our concerns. There are broad exemptions and conditions for processing in the UK data protection bill that undermine the protections in GDPR and are open to abuse. One such exemption that we are concerned with is around national security and defence. It is a very broad exemption and allows for national security certificates. These are broad forward-looking and timeless certificates that just provide a complete exemption to data protection without any transparency, oversight or accountability, and they just exacerbate the problems that we have seen under the current Data Protection Act. They have been issued in the past to the intelligence services and to TfL.
Whilst not necessarily directly relevant just to children, it is such a broad exemption that it has consequences for us all, including the concerns that I mentioned. Finally, one of the other provisions we are concerned with is collective redress. For children especially it is very important that the concerns I mentioned are taken forward, and they will not necessarily have the resources or the awareness to do that. The GDPR provides an option for member states to allow civil society organisations to take cases without having a specific mandate from individuals. This has not been implemented by the UK Government and is not included within the UK Bill. Therefore for civil society it is very important that these collective redress provisions are included, particularly for cases like those of the smart watches that we have seen.
There are other concerns that we have around the delegated powers, around automated decision-making and, as we mentioned around some of the broad exemptions. I know that others here will elaborate on those. Data protection does not solve the issues I highlighted, but it is a start, and we will continue at Privacy International to push for better protections in this Bill and through other legislation.
Kevin Courtney
Brilliant, thank you very much. A really good overview of the Bill, some of the protections and some of the weaknesses, and it leads on really well into what Zeynep is going to talk to us about: algorithmic government, the use of data for automated decision-making, and so on.
Dr Zeynep Engin, Senior Research Associate, Urban Dynamics Laboratory of UCL
Thank you very much. I am very pleased to be here, so thank you for the invitation. I am the computer scientist on the panel, so it is going to be more about the technology side of the issue than policy, although I have some exposure to the policy end as well. It is going to be more generalist as well.
Technology, through multinational companies like Google, Facebook and Amazon, is changing every aspect of our lives, from the way we socialise, to the way we work and create value in the economy, everything, and the change is irreversible, so we have to find ways to tackle these changes. The success of the commercial sector, particularly the advertising and retail sectors, has opened up a lot of doors to all the other sectors. We see that the public sector is taking it slow for some really good reasons – my fellow speakers have explained that there are good reasons for the public sector to take it slow – but also we see that it is lagging behind what can be done and how it could benefit citizens. The government sector is obviously the most critical one, given its importance to [inaudible] organisation.
Within this change our perception of privacy is also changing. I remember when I first signed up to Facebook in 2007 I was really concerned that my private and family network was being exposed to everyone else, but soon we got used to it, and now I am on many other platforms that actually promote these types of things. We give away our data all the time. Google probably knows more about us than we could think of at any one time, and it does not forget; it has more information about us at any time. An application like WeChat in China has over 1 billion active users as of 2018. There is basically no privacy concern or data protection technology around that, and people are still quite happy to use it.
People are concerned about the services that Google and Airbnb are offering, with intermediaries making a lot of profit out of people’s assets and data; supermarkets collecting all our transaction data; and obviously the government’s front-end services, from health, to education, to policy.
Although we have been giving away our data to those commercial companies and everyone else, people are starting to realise the privacy value, in the first place, and also the commercial value, of their own data. The GDPR is obviously a very welcome framework for the way we are going to manage this, and I believe that it is an opportunity for Europe to lead in this, in parallel to the values that it stands for, in the era of big data. It is also good guidance for technologists, whose motivations and interests are often very different from those of policymakers. The principles of the right to access, the right to be forgotten, data portability, and privacy by design require technology solutions as well as culture changes and principles.
To me the key responsibility of governments is to provide a national digital infrastructure with appropriate controls of privacy and security embedded. That is not to say centralised data, but a distributed, secure, privacy-preserving infrastructure, so that local governments, businesses, and individuals can make their own plans accordingly. This is similar to governments providing road infrastructure or the rail network for everyone else to basically make their plans accordingly.
It is obviously not a very easy task to get to that sort of technology level, and we may need to accept that we cannot really expect small companies, individuals or small institutions to deal with all the requirements that GDPR is bringing. They cannot deal with them by themselves, given the type and size of resources that are required to get these things right.
I have got some technology topics that are quite keenly explored in computer science departments.
The first of those topics is secure personal data infrastructure: we are talking about blockchain-based systems, distributed-ledger-based systems, which are increasingly becoming the main carrier of such technologies. Researchers are talking about data being processed and encrypted at source only, so that what is transmitted is only encrypted data, not the raw data itself. They are talking about ways to keep track of all the processing of and access to data, and of what happens to that data along the way as well.
They are looking at automating data transaction processes through technologies like smart contracts. They are also talking a lot about transparency: how we can ensure that intelligence-based mechanisms, automated decision-making mechanisms, can be explained. Along with all of this there are obviously the user-friendly interfaces that we need, potentially through online platforms and mobile applications; but we also need to account for the fact that users have heterogeneous backgrounds, and although it should be relatively easy for more technologically literate people, it is only a small percentage of the population who are able to engage with these issues.
For me, we need to drive the culture change in two ways. First, everyone needs to really understand what our rights are in the digital space, who owns our data, and what the effects of these things are – Uber recently lost 57 million customers' data, and for the ordinary person there was no visible effect, so people really do not know what to make of that.
On the other side, we need to train technologists to understand these concerns as well. We are seeing engineers and computer scientists saying, 'We are just engineers; these issues do not concern us – basically, we design things'. This is not the way it should be; this is not that sort of area. The opportunity, obviously, is that the new generations are growing up as digital natives.
If we can get the principles right on both sides, future generations will benefit.
Kevin Courtney
We obviously really need to understand as much as we can about the technology, and to your point at the end – engineers often say that they are morally neutral, but they really do need to understand the concerns that civil society will have, in order to think about their work and build some of that in. I first came across Gracie not through Liberty, but as a campaigner for the rights of children, at the start of my interest in this area, when I considered that the Home Office was misusing data it got from the Education Department – in this case in order to make life more difficult for children who were immigrants or thought perhaps to be illegal immigrants. I am very pleased that Gracie is here today wearing her other hat, from Liberty.
Gracie Mae Bradley, Liberty, Policy Officer
I am here from Liberty. For those of you who do not know Liberty, we are one of the UK's oldest human rights organisations. We were founded in 1934 and our mission is to protect civil liberties and to promote human rights. We do a lot of work on data protection, privacy and surveillance, and we are doing a lot of work at the minute on the Data Protection Bill.
We have a lot of concerns with the Bill, many of which Ailidh has already raised. We are very concerned about delegated powers, which basically grant Ministers the power to amend the Bill in the future with minimal Parliamentary scrutiny. We are very concerned about exemptions from the right not to be subject to automated decision-making – an exemption that cuts across general data processing, but also law enforcement processing – and we think this sets a really worrying precedent.
We, and I in particular, are very concerned about an exemption that has been proposed on immigration control grounds. This is an incredibly broad exemption: it essentially says that when a data controller or processor is processing someone's information for the maintenance of effective immigration control, or the prevention or detection of activities that would undermine it, essentially all of their rights under Chapter 3 of the GDPR can be set aside at the discretion of the data controller. There is no definition of the maintenance of effective immigration control in the Bill or elsewhere, so it is pretty subjective and potentially incredibly broad.
The rights that are affected are numerous. In some sort of attempt to appear conciliatory in the House of Lords the Government very graciously removed the right to data portability and the right to rectification from the exemption. However, the right to know what information is held about you remains subject to the exemption, so it does not really mean much that you can move your data round or correct it, because you are not going to know what it is.
We are very concerned about the right of subject access, your right to object to your data being used in a particular way, and your right to know that your information has been passed from one controller to another. All of these fall under this exemption. As I say, there is no limit on the processors that might apply this exemption: as long as they are processing data for immigration control purposes, they are at liberty to apply it. While we often talk about it as a Home Office exemption for shorthand, it obviously applies to Home Office contractors like G4S and Serco, who are scandal-ridden in their own right. It could apply, for example, to NHS services, and it would apply to not-for-profit organisations like Cifas, which holds a list of disqualified persons that is then given to banks to refuse undocumented people bank accounts. We are looking at all of these controllers being untethered from their obligations on immigration control grounds.
The other thing that I want to emphasise is that this is not just about immigrants. There is nothing in the exemption that says this is limited in its application to immigrants or people who are not British. From my point of view this exemption is potentially the lynchpin in a system that allows for the digitisation of border control that makes it easy for all of our statuses to be sent from the Home Office to the NHS or to a bank, or wherever, so that they can be checked in an instant. Undocumented people will be excluded really easily, and none of us will have the right to say, ‘I object to my data being used in this way, this information is incorrect’.
As I said, this is a really big problem in terms of access to justice, and our colleagues at the Bar Council and Immigration Law Practitioners Association have raised their concerns. Essentially if people cannot make a subject access request to the Home Office they are very unlikely to be able to launch a successful appeal or to get new legal representation and progress their case properly. We have to remember that this is in a context of 41% of asylum decisions being overturned on appeal in 2016. I think the latest immigration stats show that 50% of cases at tribunal are succeeding on appeal, and yet the Home Office is proposing these swingeing cuts to people’s data protection rights.
It is also a big problem in terms of public services, because one other thing that the exemption does is it removes obligations on secondary controllers to tell a person that their information has been passed to another controller for immigration purposes. That basically means that if the Home Office is saying, ‘Hello GP, do you have this person’s address?’ and the GP then hands that over, they are not under an obligation to notify you.
Having spoken a little bit about the Home Office’s poor immigration decision-making, I can also say a bit about the Home Office’s truly diabolical data protection practice. Some of you might recall that in 2012 the Home Office contracted Capita to text people who it thought were unlawfully in the country. Obviously that is a really inappropriate way to talk to people about their immigration cases, but the Home Office gave faulty data to Capita, so hundreds of people who had the right to remain were texted saying, ‘You need to leave the country’, and that led to loads and loads of complaints.
In terms of refusals of bank accounts, the Independent Chief Inspector of Borders and Immigration audited 169 refusals to open bank accounts on immigration grounds, and found that 10% of those refusals had been made in error. Since January, quarterly immigration checks are being made on 70 million existing accounts; with a 10% error rate there is pretty massive scope for things to go horribly wrong.
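[To get a feel for the scale involved, a rough back-of-envelope sketch follows. Only the 70 million accounts and the 10% audited error rate come from Gracie's remarks; the fraction of accounts flagged at all is a purely hypothetical assumption.]

```python
# Back-of-envelope sketch of the scale of error. The flag rate is invented;
# the 70 million accounts and the 10% audited error rate are from the talk.
accounts_checked = 70_000_000      # existing accounts checked quarterly
assumed_flag_rate = 0.001          # hypothetical: 1 in 1,000 accounts flagged
audited_error_rate = 0.10          # from the Chief Inspector's audit of 169 refusals

wrongly_flagged = accounts_checked * assumed_flag_rate * audited_error_rate
print(f"~{wrongly_flagged:,.0f} people wrongly flagged per quarterly check")
# Under these assumptions: ~7,000 people, every quarter.
```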
I also want to say a bit about what was happening before this exemption was introduced, and Kevin has mentioned that briefly. The Home Office has essentially been hoovering up data on undocumented migrants from essential, trusted public services. Since 2015 there has been an agreement in place to allow the Home Office to collect the address details of up to 1,500 children and families a month from their school records. There is a similar data sharing agreement with the NHS that lets people’s addresses be pulled off their confidential medical records and handed over to the Home Office. There is a data sharing agreement between the Home Office, the DWP and HMRC that basically means that people’s addresses are being handed to the Home Office, it also means their employers are notified when they are unlawfully in the country and illegally working.
We have seen data collected by homelessness charities being used to create a map to help the Home Office do immigration enforcement against migrant rough sleepers. We have seen ad hoc data sharing on victims of crime with the Home Office – in one case a woman who was five months pregnant, who went to the police to report rape, was subsequently handed over to the Home Office for immigration enforcement purposes.
The problem with this exemption is that it massively increases the scope for this data sharing and processing to affect lawful migrants and British citizens. I sincerely hope we get this exemption out of the Bill, and I would ask you to use Liberty's template to write to your MP to say, 'Defend our data protection rights'. Let us get this exemption out of the Bill – we did it in 1983, we can do it again. But even if we do, the crime exemption in the Data Protection Bill, which has been in the existing Data Protection Act, has allowed for this data sharing on undocumented people, because so many aspects of their lives are criminalised, whether that is working, driving, or just being in the UK.
All of those things are unlawful, so what we are hoping to do over the next few years is to campaign for access without fear. We want that exemption in the Data Protection Act narrowed so that low level immigration crimes are not an excuse for taking away people’s data protection rights.
Ultimately, what we are moving towards is a system in which, at the click of a button, people can be locked out of all kinds of essential goods and services just because their immigration status is wrong, in a context of cuts to legal aid and poor decision-making. Ultimately people will not be able to live the dignified life they should all be able to live if we do not get this sorted out.
Kevin Courtney
Thank you very much Gracie. There were some really scary things in that presentation. People used to proudly say it is better for 100 guilty people to go free than for one innocent person to be jailed, but if 10% of the decisions that block bank accounts are made in error – that is a terrible statistic. For my union, we want to say that schools should not be immigration offices, so we are very worried about that data aspect. Thank you very much for that. Our final speaker from the top table is Toks, who is very knowledgeable about what is going on in schools, so we are very much looking forward to hearing from you.
Toks Oladuti, Director of Information Systems at an Independent School Trust in London
A few years back I decided that I was going to review how we were handling data in line with the Data Protection Act at the time. With GDPR going through the European Union, I decided to turn that review into a programme to align with GDPR, looking at how we were handling and managing data, and how we were communicating with pupils, parents and staff about what we were doing. I am going to talk about GDPR and data protection from that operational, on-the-ground point of view.
There were a couple of things that were very important for us to address and enshrine in what we were doing at the time. The big thing was awareness. It became quite apparent that there were many different understandings of what personal data was, how schools should actually be handling the data, and who the data belonged to. There were many conversations I had where, in essence, to paraphrase, it was, 'We should not share that, because it is the school's data', and getting past that stage was what came first.
The good thing about being in education, where you are working with a group of vocational people, is that their mind-sets and hearts are in the right place: they are, in essence, wanting to do the right thing. Their whole concern is child welfare and child education, and that is what their focus is on. We wanted to address the awareness and get them to understand what was involved and what the requirements were, to empower them to use that knowledge. We focused a lot on awareness, and it became apparent, both in our own school and in talking to other schools, that awareness was quite low, and understanding was quite low.
There is lots of guidance online from the ICO. There are various educational groups that our school is part of, but even amongst those, I remember writing to one of the groups saying, 'I have just come back from one of your conferences, and I am quite concerned about: a) what people are doing, given the amount of time that is left to actually start addressing it; and b) the levels of knowledge and awareness'. The reply I got back was, 'Everyone is on top of this; this was covered in our most recent newsletter'. That was the sort of treatment this was being given, so there was a lot of work to do internally to raise awareness and actually say, 'This is quite important, and it is not just something that matters to one or two people in senior leadership'. It was an issue the whole school needed to be aware of, and everyone needed to know that they were responsible for what we were doing.
Awareness was a big thing that I wanted to promote and that we spent a lot of time promoting. We then looked at what we were doing and, again, it became quite apparent that there was a disconnect between schools and the suppliers of technology to schools, in particular in their understanding of and approach to data protection. We looked at a lot of the suppliers available to us as schools – at how they managed data, what their policies on data were, and what they shared with us about what they were doing with data. There was not a lot of information there; it was lacking in a quite worrying way.
As for the way data was being processed, there were lots of services out there using cloud services to store data that had not really considered this from a data protection point of view; some had not even considered how it was being stored from a security point of view. They were not aware of where data was being stored. We found that quite a few services were storing data in the US, some were actually storing data in China, which was an issue, and we found one company that was storing data in Russia.
There were lots of red flags, and these were things that staff in schools do not think about. Their main focus is: this is a brilliant tool that is going to help me and help my children learn. That is what they want and, to be quite honest, that is what you want teachers doing, but it became apparent that we needed to get them to add another step to their thought process: yes, you want this tool, but is it fit for purpose, and is it the right tool?
We did awareness, and we did a lot of training where we spoke about what data was, what the fundamental rules were, and the fundamental guidance and ethos behind data protection and, in particular, GDPR. We spoke a lot about transparency, control and security. Our ethos was for whenever you have a question as a member of staff and are thinking, 'Is what I am doing correct? I am not going to read the regulation – it is a massive piece of text. I am not going to read the Bill when it becomes an Act – it is massive. I do not need to understand this. How do I go about doing this?'
This was a stumbling block for a lot of people in education – it was too big a thing to really deal with. We narrowed it down to transparency, control and security. We said that fundamentally we have to be transparent about what we are doing: staff need to know what we are doing with their data, parents need to know what we are doing with their data and their children's data, and pupils need to know what we are doing with their data. Underneath all of that, they need to know why. That was our first question.
The next thing was control, and we spoke about this thing called suitable control. Do individuals have a suitable level of control over their data? Is it data that we are using for something that is a nice-to-have? If it is, and it is not something we need to do, do they have control over that data? Can they tell us, 'No'? Do they know what we are doing with their data, and do they have a route to tell us, 'No'?
We also spoke about security. We spoke about that probably the most, to be quite honest, because that is where the technology side comes in. It was about making sure that, whether it was internal solutions or third-party provider solutions, the technology was secure in keeping personal data safe. That is where we spoke about reviewing what the suppliers were doing, how they were handling data, what controls and policies they had in place, what our internal processes were, and whether we were doing data protection impact assessments for everything we were doing.
It was getting into that mind-set; it was building up awareness and training staff around it. That was what we did in the run-up, before we went into implementation, and it was incredibly useful, because it then meant that we were having discussions like: this is what we do – is it right? Yes. Why are we doing this? We are doing it for this reason. Is this the right piece of technology, or should we be doing it this way? Why have we got this data? We do not actually need it, so let us not keep it, and let us not collect it. Why have we got so many copies of this data?
We needed to address how we actually administer the data: one, so that it makes our lives easier; and two, because if we do not know where data is, how can we show that we have control? So we focused on things like that.
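[The kind of supplier review Toks describes can be captured as a simple structured checklist. The sketch below is illustrative only: the questions paraphrase his transparency / control / security headings, and the example answers are invented.]

```python
# Illustrative supplier-review checklist built around the transparency /
# control / security headings described above. Questions and data are invented.
SUPPLIER_CHECKLIST = {
    "transparency": [
        "Does the supplier say what personal data it collects and why?",
        "Are pupils, parents and staff told what is done with their data?",
    ],
    "control": [
        "Can individuals opt out of nice-to-have (non-essential) processing?",
        "Is there a clear route to say 'no' and to request deletion?",
    ],
    "security": [
        "Where is the data stored (UK/EU, US, elsewhere)?",
        "Has a data protection impact assessment been completed?",
    ],
}

def outstanding_questions(answers: dict) -> list:
    """Return every checklist question not yet answered 'yes' for a supplier."""
    return [q for questions in SUPPLIER_CHECKLIST.values()
              for q in questions if not answers.get(q, False)]

# Example: a hypothetical supplier that has answered only one question.
todo = outstanding_questions(
    {"Has a data protection impact assessment been completed?": True})
print(f"{len(todo)} questions outstanding before this tool is approved")
```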
The other thing that was quite important to us, and quite important to me, was that as a school our purpose is education. We felt it was very important that we were educating and preparing children to become the young adults who are going to come out into this world, where there is a lot of technology and a lot of data being collected and used. My concern was that at schools we were doing a lot about e-safety, telling people, 'Do not do this. Do not do that. That is safe'.
You would have a conversation with teachers and they would say, 'They know about this a lot better than we do, because they use it all the time'. My push-back to that was that they are brilliant consumers, but they do not actually understand the underlying technology, how it works, and how their information is used.
It is very important that as part of this programme we also develop – and we are looking to push this out in our PSHE programme – a scheme of training for the students, so that we are teaching them to understand how technology works, and they can become young adults who take a risk-based approach to deciding what technology they use and how.
Kevin Courtney
Thank you very much. That made a lot of complicated things understandable for me; I am sure you are fantastic in your school on that question. Jen, do you want to say something to bring those things together?
Jen Persson
This, where Toks and his school are, is where we would like to get to across the education sector. We need this kind of practical, grass-roots understanding – of what data you hold, why you hold it, what you are doing with it, and what you will do with it when you no longer need it – to be at the heart of our data policy for children. That goes from the national level, through government's use of data and its national collections, through commercial practice, and down to the level of schools.
We would like to ask you tonight, if something in all of that has grabbed you, to take away an action.
We are delighted to see that Labour has proposed an amendment for a Statutory Code of Practice in education, for children and also for young people with special educational needs and education, health and care plans up to the age of 25, which will enable better clarity, consistency and confidence across the sector.
Toks has brought that to his school, and we need it across the whole education sector. Today we are launching our survey for the State of Data report, and we will be bringing out the full report online in the course of the next two weeks.
You can take away the highlights of the parental survey we carried out. Survation asked 1,000 parents on our behalf and, for example, 69% said they had never heard of the National Pupil Database. We have a National Pupil Database of over 23 million people's records. If you are under 36 you are in that database, and it holds absolutely everything about your education from age two to 19. The work we have done on the report brings together the picture of where the data is coming from. From the early years, from age two, your phonics tests and your behavioural reaction tests are all being documented, through to the age of 19. And those identifying data are given away.
As the Minister announced this morning on BBC Radio 4, there is a data-linkage project that has been going on already, using education data from age two to 19 on all of us, linked together with your tax records, your income, and DWP records, to see – as I see it, potentially very simplistically – the cost-benefit of education: how much did your education cost you, or cost the country? That information is then to be made available to the commercial sector. Now, why do we not know about that? Why is the first we hear of it a brief mention on the radio, when it has already been going on for some time? Why are we not being asked how these data about us are to be used? How can we continue to hand over our personal data if we do not trust what it might be used for in future, or the policy it may be shaping?
To sum up, the things we hope you take away are the actions needed on the Bill – in addition to the security certificate questions that Privacy International is concerned about, and the immigration exemptions that Gracie spoke of. There is also an "exemption-like mini Data Protection Act all of its own" in the form of the Government Data Framework. It is a very small clause, but it gives the Government, through its own Minister, powers over and above the Information Commissioner's, and she herself has written to say she is worried about this clause, which is pretty open-ended and leaves far too much to a general scope of powers, without oversight and without controls, and which, as Gracie has spoken to, raises concerns about future expansion of how data may be used.
The Right to Representation is also really important, so that civil society is able to represent groups, especially children, who do not have the capacity, the time, or the understanding of how these issues affect them.
We also need to make sure there is proper scrutiny of things that are being put through in the Bill at the very last minute – the Government has put through a new "Safeguarding of children and individuals at risk" clause. I am personally very concerned about this clause. It sounds great – safeguarding of children; who does not want the safeguarding of children? I have three girls and I am very concerned about their safeguarding in schools, but I am very concerned about how it is carried out today, which is through web monitoring.
That is imposed in almost every secondary school, and in our survey we found that about 55% of parents think their children's schools are using this to log their internet search terms and create flags on keywords. There are about eight major companies in the UK that provide this software. Children do not know that this software monitors everything they do online in school.
ChildLine is a vital service in the safeguarding of children. On their own website they have a page which says, in effect: in order to avoid web monitoring software – because we realise there are potentially individuals in your life who could control the use of your computer at home – use a computer at school, so that you can have private conversations, counselling services and confidential private chats. That can no longer be counted on, if schools are required to impose monitoring software through the safeguarding-in-schools statutory guidance that came in in 2016.
This software now records the screen as the child sees it, and records the content the child puts onto the screen. The latest update from one of the leading providers now enables the webcam to be switched on to record photographs of the child using the internet. These tools are not restricted to the school. They are often used outside school – at home, in the summer holidays, 365 days of the year – because they can be imposed on a bring-your-own device that the child has brought to school, and / or run as soon as the child logs into the school network, so they apply when the child is using their internet access at home as well as on the school network.
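[For readers unfamiliar with how such monitoring operates, here is a deliberately simplified, hypothetical sketch of keyword flagging of the kind described above. Real products are far more elaborate, and no vendor's actual wordlist or logic is reproduced.]

```python
# Deliberately simplified, hypothetical sketch of keyword flagging in school
# monitoring software. The watchlist terms are invented examples.
WATCHLIST = {"self-harm", "proxy", "vpn"}

def flag_search(pupil_id: str, search_terms: str) -> dict | None:
    """Return a flag record if any watched term appears in a logged search."""
    hits = WATCHLIST & set(search_terms.lower().split())
    if hits:
        # In deployed systems a record like this is stored and may be
        # escalated to staff, often without the pupil knowing.
        return {"pupil": pupil_id, "terms": sorted(hits)}
    return None

print(flag_search("pupil-042", "how to use a vpn at school"))
# -> {'pupil': 'pupil-042', 'terms': ['vpn']}
```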
Take away the highlights in our flyer, but also look out for perhaps more of that information to come. I would like to thank Child Rights International Network who have done a lot of work on understanding how this software is being used, especially in the Prevent programme, and also to Liberty and Rights Watch who have done work on that.
I would like you to take away actions: write to your MP about the immigration clause, the right to representation, the security exemptions, the Government Data Framework, and this brand-new safeguarding of children clause, and ask why it is necessary. Why do we need new conditions for processing for things that happen today and should already be perfectly legal, perfectly possible and perfectly fair? Are we embedding risks and poor uses of data protection around practices and policies that frankly we do not think are terribly well understood, when what we want to get towards is the good, exemplary practice – safe, fair, understood and transparent – that can happen in schools, as Toks has explained?
Let us have some discussion. Let us take away some actions and see what we can do to make things better.
Discussion
Kevin Courtney
Thank you very much Jen. Jen is a powerhouse, as people will know.
Alison Hart Wilson, UCL Institute of Education
I am working on a project with 240 education technology companies, helping them to become a little more thoughtful and research-minded about what they are developing. It seems to me that they are a key player in a lot of this discussion. There are two observations from working with the companies in this setting.
One is that, in the same way that teachers are not that minded about some of this, if you are busy developing your product, trying to grow it and get funding, you do not have the mind-set to think through the implications of the data that is being designed into tools – who it belongs to, where it goes, where it is stored. It is not irresponsible; for me, it is an education piece.
In the same way that Toks has talked about the formative role of working with the teachers and the community to become better at understanding the scenario, I think that is also needed here. It cannot be a blame culture. It has to be recognised that in most cases there is a lack of understanding, and we should therefore try to build codes of practice for the EdTech world that support companies in benchmarking where they are.
Your three headings of transparency, control, security – everyone understands them, you can start there, but it seems to me, and certainly from our experience, that they are a key player and we need to ensure that we are working with that EdTech community.
Participant 1
Thank you for your opinions. They are very significant. I am a Professor from Russia; I am a teacher from Russia. I began my research in education and have written books about the forming of political culture and civic education. I have written articles and books in Russia, and articles in English that you can read on Google Books.
I suppose that we have a lot of questions about civic education, political culture, civic identity, inter-ethnic cooperation and inter-personal belief. I think it is necessary to continue our work, for our people and our students. I invite you to cooperate with my university, schools and post-graduate students. My students are working on interesting aspects of this – safety, risk, civic leaders, inter-ethnic cooperation and [inaudible]. Thank you, I am happy today.
Participant 2
I am a member of the EdTech community, and what educators are doing is excellent – I totally agree. I know that we are normally cast as the baddies in this debate, so I thought I would give a more interesting perspective on this. I work for Up Learn, which is an all-in-one provision for A Level students; it uses cognitive science and AI to help students get A*s.
Mainly our students come to us via Facebook or YouTube advertising – they find out about us individually – but schools got interested in this and wanted to get involved. What we asked them to do was supply students' names, email addresses, year groups, and FSM [Free School Meals] status to sign them up. FSM status is necessary because we give free licences to those students.
It took about 20 teachers regularly supplying that information before one said, 'Actually, GDPR – this might not be great'. They sent us a template; we then sent that to every other school as our own template, so that we are compliant with GDPR. What is more interesting is that as soon as students sign up to our plan they start interacting with the technology, and we start collecting data on rate of recall, rate of forgetting, and speed and accuracy of answering questions – incredibly valuable data for teachers.
Teachers beg me on a daily basis to share that information about their students back with them – but it is not theirs, because the students have signed up by themselves, in a relationship with us outside of that dynamic. I want to know, particularly from Toks's perspective, what we should do in that situation. Should I be supplying the data that we are collecting to the UCL Institute of Education, so they can produce some wonderful future research on this? Should I be continuing to deny teachers the information I am collecting about their students? Should I be giving it to them? Does the teacher still have a monopoly over the student's education?
I do not really know what we should be doing or what the right thing to do is, but I know that I am fielding more attempts from teachers that fail to comply with GDPR every day than we are doing things wrong ourselves. I think it is quite an interesting perspective on where the blame lies, and on what we should be doing about sharing data amongst schools, teachers, parents and students, because each has their own agenda and the dynamic is complex.
Kevin Courtney
Thank you. The data is incredibly powerful, is it not, and none of us want to deny that or want to not make the best use of it. From my perspective I am scared of people who know the price of everything and the value of nothing, I am not putting you in that category though. You are in the right place to ask the question.
Richard Gomer, Research Fellow at University of Southampton, Interaction Design Consultant, and a School Governor
I have a couple of points around the idea of understanding how we are engineering these technologies – but more specifically, how we are designing them, because these technologies are designed, and design professionals are involved as well. I was pleased that user interfaces were mentioned, and I think you can construe that quite broadly. Really, we are thinking about the challenge of building intelligible, controllable technology, so that people understand how the decisions they make in a technology-rich world will have repercussions, what those decisions mean, and have some hope of predicting what their actions might lead to.
I think that is in scope for disciplines like human-computer interaction and service design, but we probably do not have design professionals at the table when we are making policy about this. It is often quite clear when we look at guidelines from data protection authorities on what things should look like – for instance, this obsession with privacy policies, which everyone knows do not work, but which we have never really moved on from.
What is ironic is that we talk about tech being ahead of the law, and that the law is scrabbling to catch up, but in the field where I work as an interaction designer for privacy and consent, what we see is the tech scrabbling to catch up with the law. We still do not know how to get people’s consent in a meaningful way. We do not understand how to design that process.
We see some progress with emerging infrastructure for consent management, and companies like Facebook have a huge innovation programme in this design space. The point is that we probably need to be prepared to continually improve how we are doing design, to make sure that design professionals and design practice are continually improving and that that is recognised as we set policies and guidelines. We need to be happy to come back and chuck away what we are doing now, because in 10 years’ time we ought to be able to do better, and what we are doing now will not be good enough.
Kevin Courtney
This is fascinating, thank you very much.
Carla Aerts, UCL Institute of Education
I work on the same project as Alison, but I have also been an EdTech practitioner for about 17 years and a digital practitioner for well over 30 years. The data issue has been around for ages and it is only getting worse. I would like to make two points. The first is that I think in this debate we have to include youngsters. I came to a debate here, The Internet on Our Own Terms, which was led by Baroness Kidron and 5Rights. These guys know what they are talking about and we should include them in this discussion. You cannot have a debate about their data without them being there.
The second point is an extension of what Alison and Toks have said, and it is more to do with artificial intelligence in education, which is obviously on our doorstep and to some extent already here. The AI needs the data to be the AI, and who owns that AI becomes a really big issue. Without the data graphs and the data maps, AI cannot function. The data can only be provided by the participants in the product, who propagate the data and make the AI work, and it becomes bigger and bigger.
Everybody claims they are anonymising data. Anonymising data is rather hard to do, and hard to do well. I do not need to tell you this – it is really, really hard. You have got commercial players who increasingly become the owners of the AI, and actually we become the product, because we are what makes the AI work.
When you take that into an education context, and a rights context for youngsters, it becomes a really difficult question. This is something that needs to be addressed in any education debate and any learning technology debate, especially when AI is getting involved, because that is going to amplify the whole data issue.
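[To illustrate Carla's point that anonymisation is hard: even with names removed, a handful of quasi-identifiers can often re-identify people by linkage against another dataset. A minimal sketch, with entirely invented data:]

```python
# Why 'anonymised' data is fragile: rows with names removed can often be
# re-identified by joining on quasi-identifiers. All data here is invented.
anonymised_records = [
    {"postcode": "NW1 2AB", "dob": "2004-03-17", "gender": "F", "score": 61},
    {"postcode": "E2 9XY",  "dob": "2003-11-02", "gender": "M", "score": 48},
]
named_register = [  # e.g. a public or leaked dataset that does carry names
    {"name": "A. Pupil", "postcode": "NW1 2AB", "dob": "2004-03-17", "gender": "F"},
]
QUASI_IDS = ("postcode", "dob", "gender")

def reidentify(anon_rows: list, named_rows: list) -> list:
    """Link 'anonymous' rows back to names via shared quasi-identifiers."""
    index = {tuple(r[k] for k in QUASI_IDS): r["name"] for r in named_rows}
    return [{**row, "name": index[key]}
            for row in anon_rows
            if (key := tuple(row[k] for k in QUASI_IDS)) in index]

print(reidentify(anonymised_records, named_register))
# -> the first 'anonymous' record comes back with a name attached
```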
Tony Sheppard, GDPR in Schools
We are an organisation helping schools with compliance. I am also here with a school governor hat on. There is a standard for suppliers setting out the questions schools are going to be asking, because the majority of the information out there is quite generic. At the moment the material from the ICO, although invaluable, is generic and not necessarily in context. We are trying to do what we can, and others are helping along with that. Although this is about the state of data, 25 May is just a milestone on the journey. We also have things like the standards for the design of age-appropriate websites, which suppliers and schools need to think about. We need to ensure that children are heavily involved in that discussion.
The same applies in education as well, because there are established technologies that will work, and will work within education – after all, if schools do not know the age of children, who does? There are other things we need to remember in the context of the wider discussions that are going on. We just need to pull some of it together.
Participant 3
Reflecting on the comment made about getting children and young people involved in understanding what their data is being used for: some of the conversations in my PSHE lessons 10 years ago were about using Facebook, Twitter and digital stamps, and nowadays it is more about the digital [inaudible] that can be used by companies for marketing, data analytics and all that kind of stuff.
I find that a lot of young people are quite apathetic towards such comments. When you sign up to Facebook and look at the terms of use, no one actually gets a lawyer to sit next to them and go through it all. Talking about young people being really good consumers – if you can pitch something really, really well, it is hard to then see the downsides. It is getting my data, but everyone else is doing it, so it seems fine. I guess my question is how you get young people to actually care about the information they are being given, if that makes any sense? One thing is informing them; the other is getting them to understand why they should care, instead of just, 'This is the information you are presented with'.
Wendy M. Grossman
I am not sure how this fits in, but one thing that is coming at us with the speed of an oncoming train is physical tracking. We are talking about data as though it is merely an online phenomenon, but what we are seeing right now is offline companies who are looking at Google’s business models and saying, ‘I want some of that’. There is a company called Blip Systems, which spends its life deploying sensors in airports and train stations. What they are interested in is passenger flow, making things more efficient.
These things are going to be in the world all round us. Every device is going to be part of a giant cyber physical system. When we are talking about GDPR now, this is what is going to be the next thing and it is much closer than we typically think. I wanted to throw in that unpleasant future.
Things like free WiFi in some areas – I think it was Camden that, a few years ago, signed a deal with Purple WiFi to provide free WiFi, and one of the conditions is that they get all the user data. That kind of thing is going on without our recognising it. You cannot imagine the Cookie Directive working in the physical world; you are not going to see a sign on every street saying, 'If you walk down this street you are subject to…' This is the world we are going into.
Participant 4
I come from the private sector. My question is more fundamental, and it might even be too late for this question now. I want to talk about the information-blindness behaviour of users. Policymakers have tried to make GDPR as large as possible in terms of scope, and have tried to bring every minor element of personal information within it. What this may result in is information blindness because, as an example, if you go to an estate agent and give your current address and the address where you want to buy a new house, they will have to get consent from you.
This is just one single player. Over a standard lifetime we will have to give consent all the time and, because the scope is huge, digital players will start sending out consent emails and showing consent pop-ups, which might result in information blindness in users. That will mean not reading the text, or the consent, at all. If we have users clicking 'yes' or 'no' without knowing what consent they are giving, this will give an advantage to bad players in the market, which we do not want.
If they start answering 'no' to all consent pop-ups or consent emails, this will also have a bad result for the community and for everyone, because we will stop creating value for our citizens, which is not good for global competition or even for our living standards. Do you see a larger scope, and information-blind users because of that huge scope, as a risk? Or do you not see this as a risk at all, and think that it will not happen, and that users will be trained enough to know how they should act?
Kevin Courtney
Are we going to get an answer to that from the floor?
Anna Fielder, Privacy International
I wanted to respond to that, if I may, because you are implying that we need consent for everything, but under the legislation consent is just one of the conditions for processing. There are other lawful bases for processing, like contractual relationships, and so on. Consent is only required in situations where you really, really need it, so I do not see how that is going to lead to the risk you mentioned.
I wanted to ask Zeynep a question. In the GDPR the provision on privacy by design and default is very vague, so nobody knows how it is going to be implemented in practice, or whether that will be via guidelines for everybody, or whether everybody is going to be left to do their own thing and explain it to users. I wondered whether you have any thoughts on how that can be implemented.
Silkie Carlo, Director, Big Brother Watch
I just wanted to chime in to express my solidarity with Jen, defenddigitalme, Liberty and Privacy International, who are all working in this space and all lobbying on the Data Protection Bill as well. There is so much detail that we can get caught up in in this area, but it is important to take a step back and look at the general direction of travel, which is a bit of what Jen was doing earlier. There is a general data-driven culture developing in schools in a fairly haphazard way, and a surveillance culture as well. Big Brother Watch has for several years been exposing the growth of that surveillance culture in schools – for example, the growth of biometrics, where kids now have to give their fingerprints just to get their lunch. There is a casualisation of biometric data, which is incredibly sensitive. And there is the use of CCTV within schools, in toilets, in changing rooms – last week we were seeing that some of those cameras [from schools in Blackpool] were hacked and live-streamed on the Internet.
Then there is the state-driven aspect of it as well, with data gathering on the national database, some of which has been used in very insidious ways – for example, in relation to immigration. Then there is the key logging as well, which is one of the most concerning. We are worried at Big Brother Watch that children are growing up with this culture ingrained in them from school: the expectation that our every move should be watched and monitored, for something like counter-extremism, for nine or 10 year olds.
While on the one hand this does seem to be haphazard, on the other hand some of it is inevitably by design. This weekend Matt Hancock was saying that the Government wants to play a role in basically controlling children's use of social media, which I would suggest is really clearly an attempt to micro-parent, taking control into state hands.
At the same time you as the parent cannot see your child’s record on the National Pupil Database.
That is a problem that has been designed by policy, and a lot of the surveillance framework and the surveillance culture is to some extent driven from the top. Our groups are working together to try and address that and I think that is something we have got to continue to do.
Giles Davidson, Head of Policy at Tech UK
It follows on from that question; there is a lot of stuff within the Internet Safety Strategy and other documents, which is about protecting children online. I am interested in where the balance lies, for members of the panel, between that and the principles of data minimisation, because a lot of these things require more data on children, not less. How do you see that framework and the role of Government in this? Following on from that there is the protection policy, what schools can do, etc.
How does that balance against the educational side? I am interested to understand that, particularly given a Government which has just abandoned plans for compulsory PSHE. What is the role of education in delivering this, rather than ever more rules, regulations and attempts to protect?
Kevin Courtney
Thank you very much. There is certainly a clear need for education on these questions. I think your questions have to be addressed in all sorts of ways. We are going to go back to the Panel.
Patrick Geary
My unit within UNICEF looks specifically at the responsibility of business, and we are on the cusp of publishing an industry toolkit on children's online privacy and freedom of expression. This has both general principles and a checklist. The checklist is a practical tool that raises important questions for companies about how they collect, retain and process children's personal data, with the aim of instilling better respect for children's privacy and expression rights. Both Jen and colleagues from Child Rights International have kindly contributed to its development, so we are excited to release it. We hope that this will help to address some of the concerns raised here about business responsibility.
When it comes to consent there are big challenges. The GDPR requires that terms and conditions be accessible, but how do you meaningfully explain an increasingly complex and ever-evolving digital ecosystem to a 13 year old? I do not know, and in some ways these questions seem to suggest a fundamental rethink of the ways in which we use children’s data.
There is one notable stakeholder missing in the room and that is children themselves. I think this is just a reminder that it is incredibly important that we bring children’s voices and experiences into this conversation.
Ailidh Callander
GDPR is just one piece of the legislative puzzle that people are trying to navigate, but it is so important to take even the basic data protection principles and apply them. You have the issue that that is not happening on the ground, through a lack of awareness, but you also have the top-down issues of institutional data sharing and surveillance. Together, those create a real problem.
I also agree with the comments around taking an inter-disciplinary approach. It is not just a policy issue, and it is not just a legal issue; you need to have the technologists, the designers and the developers around the table in order to make the rights provided in GDPR meaningful and to create actual protections as well.
GDPR and the Data Protection Bill are an evolution of the data protection framework and it is a moving process. It is so important to try and get it right now with the current Data Protection Bill and then to keep reviewing that and go back to that, and check these exemptions, check on data sharing that is taking place, and ensure companies and the state are held to account for the way they use data.
Dr Zeynep Engin
I will start with the privacy by design question. The key technology is the blockchain, and I am quite optimistic about it. If we can find a way to implement it on a large scale, for all the different types of transaction-related and data-transaction-related issues, then there is a huge opportunity to keep data where it is, so that whoever owns the data controls it and transmits only the information that is needed. The key thing before that is to make technologists understand all these issues, because we design systems around the data that is put in – AI is all about data, for example, and we have recently seen a lot of examples where AI systems got it wrong, which has created a lot of other issues.
It comes down to the type of data that designers are feeding into the system and the assumptions they are making when designing those systems. If you leave all those things to technologists, they are going to stay the way they are now. Data scientists obviously have a new-found social status, but education has to happen there as well. People are talking about privacy, but we do not yet fully know what it is or how we are going to achieve it.
Gracie Mae Bradley
I come at this from a slightly different angle to a lot of people here. Silkie and others have spoken about a broader surveillance culture and worries about surveillance. I am interested in how that intersects with what the state already does. There needs to be a really keen understanding of this, and I think it is exemplified in the controversy over the collection of nationality and country-of-birth data from schoolchildren, and the fact that that was going to be put to use for immigration enforcement. A lot of the conversations I had when that data collection started were with well-meaning people in education who were saying, 'Yes, but this data could be really useful. We need this data. All data is good data, and more data is better'.
For those of you who do not spend all your time looking at abuses of state power, it would be helpful to be aware of all the people who are already targeted by the state, and of the ways in which they are targeted – whether that is undocumented migrants, criminals, or young people of certain ethnicities who are supposedly associated with certain kinds of crime, as with the gangs matrix; whether it is counter-terror and counter-extremism, whatever extremism is; whether it is profiling people on benefits, or nudging troubled families.
Whatever good things you want to get done with data, you also have to be willing to fight alongside us against those nefarious uses of data, because otherwise the default position from people like me is going to have to be, 'You cannot have the data, because the state is going to do this with it and do that with it'.
I guess I would ask you to think in an intersectional way and get involved in those struggles, which are not shiny and glamorous, but they are incredibly important.
Toks Oladuti
I am going to whistle-stop through responses to the things that were raised. We were talking about web monitoring and logging in schools, which does happen – it happens in the schools where I work. We are very transparent and very open about it. All the pupils know that we have these tools and why we have them. The parents are told that we have these tools and why we have them. Things that were raised like ChildLine are very easy to exclude from logging. We exclude support services from logging; there is quite a big list, and it evolves over time, so you can use these things in a responsible manner.
There was a question about EdTech and about sharing data – I come back to our thing, which is about transparency and control. When pupils sign up to your service, what have they signed up to? Have they signed up to you saying, 'We will share your data with the school'? If they have not, then the answer is a very clear, 'You cannot'.
GDPR and its terminology are quite vague, and I think that introduces a bit of confusion, which is why I say come back to the underlying principles. If they have agreed to share their data with the school, go ahead; if they have not, then you cannot, and that is something you need to re-address with new consent and new terms.
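[A minimal sketch of the decision rule Toks describes: share data only if the purpose was within the consent the pupil actually gave. The consent record format is invented for illustration.]

```python
# Sketch of the rule above: only share for purposes covered by the original
# consent; anything else needs new consent. The record format is invented.
consent_record = {
    "subject": "pupil-007",
    "agreed_purposes": {"deliver_course", "progress_emails"},
}

def may_share(record: dict, purpose: str) -> bool:
    """True only if this purpose was within the consent originally given."""
    return purpose in record["agreed_purposes"]

print(may_share(consent_record, "share_with_school"))  # False: new consent needed
print(may_share(consent_record, "deliver_course"))     # True
```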
Coming back to key logging and surveillance in schools – again, monitoring does happen, and I think it has a place, but, again, it is about balance. As for key logging specifically, I cannot comment on it, and I cannot imagine a situation where it is actually necessary.
Our attitude, when we were talking in schools about what we were doing, was: if you are not comfortable telling a student or a parent that this is what we do, then something is not right. That should be a strong guiding principle for what you do. If you cannot comfortably justify it, then clearly something is not quite right there.
As for the balance of protection versus privacy, again, it is about getting that balance right. It comes down to what you are trying to protect, and how, weighed against privacy. Someone mentioned surveillance in toilets – that, to me, is ludicrous. If your toilet is in such a state that you need surveillance in it, I think you need to redesign your toilet, not stick in more cameras. It is about having that pragmatic approach, and I think the best way to achieve it is conversations like this. The more people you speak to, the more ideas, thoughts and experience you gather, the better you can be. Things like this are the way forward – continuously doing this and improving what we do.
Kevin Courtney
Thank you. That point that you should feel comfortable, I think schools would not feel comfortable with sharing data that is going to be used for immigration purposes. There are some things that bring us all together.
Closing Remarks
Jen Persson
Thank you all for great comments and questions. The key point that comes up time and time again is: what do people expect? What is fair? What is transparent? What is safe? What have you told people you are going to do with their data when you collect it? If you have not told them you are going to do something with it, don't do it. A core principle of data protection is that you must only use data for the purposes for which it was collected, and for what is compatible with those purposes.
The biggest challenge we have at Government and State level is that the Government is using data for all sorts of purposes for which it was not collected. This is our challenge: how do we manage those responsibilities together with our rights?
I hope we have somewhat addressed all of the points you have raised; this is a discussion that needs to continue. I am very aware of, and glad that you raised, the fact that we do not have enough young people in this discussion. We did invite different groups this evening who were not able to come, but it comes back to a question I asked a year or so ago of a leading academic who talks about data ethics in Europe. I said to him, 'Why do we not have this kind of debate on data ethics with, say, 16-25 year olds? Let us have this debate. It's their future.' He said, 'We cannot do that, it would be embarrassing'.
This is their future we are talking about, it is not only how we deal with children now, but what they in their lifetime are going to expect, as Wendy raised, in this world of sensors, where you cannot go out of a house without being on CCTV, and walking past a sensor that you cannot see in a shop that interacts with your phone, that sends your data to your transport company, that sends your data to your mobile company. You are tracked 365 days a year 24/7. Children cannot see that, and even if they think they are managing their own privacy they go into somebody else’s home where Siri and Alexa are picking up what they are talking about, while watching Netflix, and so on.
We are not designing our systems for children, and we are certainly not educating them, and far from it being embarrassing to have them involved, I think it is actually embarrassing for us that they are not; the onus is on us as adults to make it happen. As the Children's Commissioner said last year in her report, Growing up Digital, 'We are designing a world right now that takes away agency from our children'. We should be ashamed as adults that we are not setting up policies that give our children agency and understanding.
I would ask each of us – we have agency in this room, we can do something.
It is hard, and it is sometimes embarrassing for us, because we have got to stick our heads above the parapet, write to MPs, and say, 'This needs to change'. But we need your help, and we need each of you to do something. How we make things better is to keep these discussions going, and I do ask you to support us in any way you can – take away flyers. We have a judicial review under way right now over sensitive data that has been collected under another statutory instrument that the DfE has started.
If you can support us financially we would be delighted. If you can support us by writing to your MP and saying, ‘We support the Code of Practice for education’, we would be delighted. Take away something, and if it is just to speak to your own family and friends about how we start this conversation with younger people on an everyday basis do that, because by doing something together I really do think we can make changes for the better.
Thank you to all our speakers. Thank you to Kevin enormously, and thank you to everybody who has come and participated.