News / legislation / pupil privacy

Response to the Data Protection Bill debate and Green Paper on Online Strategy

In a July debate on artificial intelligence and ethics, Lord Jim Knight asked what plans the Government have to engage children in the discussion about their data. He picked this up again yesterday in the Second Reading of the Data Protection Bill (DP Bill), and again reminded the Government that it needs to get its own house in order before it can comply with the Bill.
The launch today of the government's Online Safety Strategy Green Paper consultation also puts children at the heart of the digital debate. How we do that needs careful discussion and an informed understanding of technology. This is understandably often lacking, as the range of technology rapidly outpaces both law and knowledge across society. But the strategy fails to join up much of what is needed: the thinking in the DP Bill, applied online activities, and the nuts and bolts of how the Internet and mobile phones work, with the reality of data collection and digital environments. Much of this data collection is passive, and it is growing, in smart cities for example.
There is a big gap for government in both, as it fails to address its own use of children's data across government and education, farming out school pupil and health data to commercial companies, while it is quick to point the finger at where others, notably social media and gaming, can improve.
The debate on changing the laws in this area too often assumes that requiring parental approval means children are safer online. Baroness Lane-Fox pointed out that this is fundamentally flawed. Discussion of protection has to be about more than sexting, bullying, porn, and the serious issues around exploitation in online grooming. It must also address the hidden commercial exploitation of children's digital identity: ad targeting, in-app sales by companies, and grooming children as a future customer base. These are risks that parents, and charities focused on physical protection, do not see.
Many speakers at Second Reading in the House of Lords yesterday (links to debate [part 1] [part 2]) picked up the child protection theme, but in a somewhat panicked rather than informed way. There was a widely stated belief that the age of 13 should be raised, but none mentioned the risk to child participation and privacy that age verification (AV) requirements create. While people are well intentioned in seeking to protect children from inappropriate material, AV will not do that. Attempts to enforce and over-police will likely harm participation, and there is a pervasive, unevidenced myth that children do not care about privacy. That needs to change in the coming discussions.
There were several suggestions that the age for AV in the Data Protection Bill should go out to consultation. The Online Strategy Green Paper and consultation do address AV, but only as it relates to blocking pornography. There are no real questions in the consultation; it is rather an invitation to comment by December 7. Perhaps general AV should have been included.

Highlights on the GDPR provisions in the DP Bill most relevant for children and education

When it comes to the use of 'information society services' (online services targeted at a child, except counselling services), the framing of the discussion on "consent" in Article 8, and parental verification of age, are vitally important.
The UK has set the requirement for parental approval and age verification at 13. But this does NOT enable children from age 13 to consent to the gathering of their personal data by companies. Consent is not age- but capacity-dependent. This age-based model to protect the virtual child [1] is fundamentally flawed.
It does not give children or parents new rights. It creates a responsibility for providers of 'information society services' to ask for age verification if their services are targeted at a child; if the child answers as under 13, the provider must ask for approval from the holder of parental responsibility before collecting the child's personal data in the use of its services. It's a shortsighted, if well-intentioned, tool: bad by design.
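To make the mechanics concrete, here is a minimal sketch of that flow, with names and types of our own invention; it illustrates the process the Bill describes, not any real service's implementation:

```typescript
// Hypothetical sketch of the Article 8 age-gate flow described above.
// All names are illustrative; no real service's API is implied.

interface SignupRequest {
  declaredAge: number;            // self-declared by the user at sign-up
  parentalApprovalToken?: string; // proof of approval from a holder of parental responsibility
}

const UK_DIGITAL_AGE_OF_CONSENT = 13; // the threshold the Bill sets under Article 8

function mayCollectPersonalData(request: SignupRequest): boolean {
  if (request.declaredAge >= UK_DIGITAL_AGE_OF_CONSENT) {
    // At 13 or over, the provider may treat the child's own "consent" as enough.
    return true;
  }
  // Under 13: no collection of personal data until a holder of
  // parental responsibility has approved.
  return request.parentalApprovalToken !== undefined;
}
```

Even in this toy form, the weakness is visible: everything hinges on a self-declared age and an unverified token of "parental" approval, and nothing in the flow measures the child's actual capacity to consent.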
Article 8(2) of the GDPR, on verifying parental consent, has not been explicitly transposed. Done poorly, it will mean a big data grab that does more to harm children's privacy than to help it, enabling exploitation and targeting by companies, because it will all be done with "consent", and it will likely undermine protections-by-design elsewhere. It will also put the onus for their own online protection on 13-18 year olds rather than on companies, relying on knowledge that companies must present in clear language and child-friendly privacy policies. That is not what we have today.
It risks the collection of personal data that would not otherwise have been necessary, essentially exploiting age verification to capture valuable data on familial relationships. If this becomes standard practice, this legislation will be harmful, not helpful, to children's privacy.
Age verification, if it's going to be meaningful and not just a nuisance like the new cookie law, must be "a narrow form of 'identity assurance' – where only one attribute (age) need be defined," as Phil Booth writes, and that needs to be carefully done.
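As a sketch of what that narrow, single-attribute form could look like, assuming an invented token shape and issuer model (not any real or proposed scheme):

```typescript
// Illustrative single-attribute age assurance: the service learns only
// whether the user meets the age threshold, and nothing else.
// The names, token shape and issuer model are assumptions for this sketch.

interface AgeAssertion {
  meetsAgeThreshold: boolean; // the ONE attribute disclosed
  issuer: string;             // a trusted attribute provider that checked the evidence
  signature: string;          // lets the service verify the issuer, not identify the child
}

// Deliberately absent: name, date of birth, address, parents' identities,
// payment details. The service receives a yes/no answer, nothing more.

function admitVisitor(assertion: AgeAssertion, trustedIssuers: Set<string>): boolean {
  // A real scheme would verify the signature cryptographically;
  // this sketch only shows which data cross the wire.
  return trustedIssuers.has(assertion.issuer) && assertion.meetsAgeThreshold;
}
```

The design point is data minimisation: proving an age attribute without handing over an identity, so that verification cannot itself become the data grab described above.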
We need to see clarification and safeguards for privacy and for safe child participation online, not blanket blocking with a focus only on child protection.
Article 17, Right to erasure: Our fear after the Queen's Speech and Statement of Intent was that the government would use the possible derogations to water down the rights that the GDPR gives all citizens. Much more clarity is needed to distinguish between the rights to object to processing, to correction, to erasure, and the right to be forgotten, which are often conflated, as in the Bill.
Profiling and automated decision-making, GDPR Recital 71: The Explanatory Notes and the Bill both fail to mention explicitly that such measures, in the words of the GDPR, "should not concern a child". We'd like this to be explicit on the face of the Bill.
Article 80(2) of the GDPR would grant various groups and associations the right to file a collective action on the grounds of a violation of data protection law, without a data subject having to ask them to do so first. However, the government has excluded that right from the draft UK Data Protection Bill. We believe it should be included, and we're glad to see the ICO does too.
Clause 15 of the UK Bill would leave the Bill open to amendment by Statutory Instrument, far too broad and open-ended a power, and one that, as we saw in the 2016 school census expansion to collect nationality data, can be misused.
The immigration exemption is ludicrously broad: a get-out-of-jail-free card for government to do anything it likes, taking away the protections each of us has today. Policy needs to be transparent and open with the people it affects, not permit secret goings-on with no right to understand or question them, especially at a time when immigration enforcement is notably and publicly riddled with errors and secrecy, and direct interventions are causing harm and distress. This clause must go.
The exemption exercised in GDPR Article 9(2)(i) requires provision for "suitable and specific measures to safeguard the rights and freedoms of the data subject, in particular professional secrecy", i.e. the duty of confidentiality that is built by design into the health and education professions, but that Part 5 of the Digital Economy Act removed. The GDPR requires that it be restored.
Two further clauses appear in Part 4, on the intelligence services. Exam scripts are made exempt in Schedule 2, Part 4 (23), and there is a broad exemption for economic well-being in Schedule 11 (Other exemptions), Part 4 (8); neither has anything to do with the GDPR. Both need clarification that they apply only to the intelligence services, and in what circumstances.

What will it mean in practice?

The GDPR sets out that the processing of personal data should be designed to serve mankind, considered in relation to its function in society, and balanced against other fundamental rights in accordance with the principle of proportionality: respect for private and family life, home and communications; the protection of personal data; freedom of thought, conscience and religion; freedom of expression and information; freedom to conduct a business; the right to an effective remedy and to a fair trial; and cultural, religious and linguistic diversity.
There is a large section of the Online Strategy devoted to schools, yet no mention at all of educating children about how their own data are used in schools, where the education system itself is fundamentally flawed through lack of the support, training and resources needed to properly address data protection, GDPR readiness, and children's digital rights.

The Online Strategy announces more than a change of name for the UK Council for Internet Safety (UKCIS): its remit is to be extended beyond children. Stretching its capacity risks its ability to properly address children's issues, at the very time when the stated intention is that more needs to be done for children.
We need to improve the nation's digital understanding, but the government seems unwilling to step up and resource that work, and instead wants exemptions that remove the need for it to tell people how their data are processed.
For schools and the education sector, practical clarity is needed on the legal basis for standard activities, and on how keeping children 'safe' online extends to their own classroom activities using AI, apps and biometrics.
The confusion and questions heard in the Second Reading debate, over whether universities would need consent for research, highlight this lack of clarity.
Research enjoys exemptions, but 'research' is used as a label to permit a wide range of uses, and the DfE and ICO appear to accept that research, no matter how broad, trumps all children's rights. As far as we're concerned, that is not compatible with the principle of recital 38: children merit special protection. Where consent applies, recital 33 of the GDPR enables people to consent to parts of projects, as sketched below. It won't improve the research uses that aren't consent-based, but the spirit of it is something the research community should start to engage with better.
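As an illustration of what that granular, per-area consent could look like in a system's records (field names invented for this sketch; not any existing register's schema):

```typescript
// Illustrative record of granular research consent, in the spirit of
// recital 33: given or withheld per area of a project, not all-or-nothing.
// All field names are invented for this sketch.

interface ResearchConsent {
  participantId: string;
  projectAreas: Map<string, boolean>; // each study area consented to separately
  givenAt: Date;
}

function hasConsentFor(consent: ResearchConsent, area: string): boolean {
  // Absence of an entry means no consent; never default to true.
  return consent.projectAreas.get(area) === true;
}

// Example: consent to an attainment study, but not to linkage with health records.
const example: ResearchConsent = {
  participantId: "p-001",
  projectAreas: new Map([
    ["attainment-analysis", true],
    ["health-record-linkage", false],
  ]),
  givenAt: new Date(),
};
```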
Further, the scattered exemptions in the Bill, if applied too broadly, could see the UK government snatch away almost all that the EU gives us; they need definition, narrowing, or better explanation to be easily understood. The immigration exemption needs to be cut completely.
We're still campaigning for every person under 36 to be told, effectively, how their national pupil data, collected in school, have been sent to the National Pupil Database without their consent or knowledge, and are being passed to third parties including commercial businesses, Fleet Street and television journalists, and charities, and, since 2015, even to the Home Office.
We must ensure that the laws are applied in practice across the education sector, the commercial sector, and government. The GDPR is designed to give people more knowledge of, and rights to understand and control, their own data, and to offer everyone involved in data supply chains, from collection through use, clarity on their responsibilities; that includes the State.
Attitudes to, and acceptable uses of, children's data need to change if today's poor practice is going to end. It's fine that the Online Strategy says government will encourage app developers to 'think safety first', but in schools that can still mean commercial exploitation through apps schools get for free, in which children and parents are encouraged to buy premium services during use. Is that an acceptable measure of "safe"? Ethics get no mention in the Online Strategy Green Paper. We're glad to see that smart toys do.


Children using things connected to the Internet, including toys, need to be safe, but how will the Data Protection Bill enable that in reality?
Parents don't always have the knowledge to keep children safe from commercial identity and privacy exploitation online, or from misuse of their data.
Parents themselves can be a source of conflict between protection and privacy invasion; apps that can be hidden by design on a child's phone highlight the questions of rights and boundaries.
Risks of the exploitation of personal data can be hidden online and in Internet-connected things, but harms shouldn't come from our policy makers. Sadly, for some children in society right now, that's not the case. This Bill and Strategy come at a time when government also has too little regard for children's data rights, and says one thing but does another.
The DCMS must be serious about change and take responsibility for requiring action across government, as well as looking to future technologies, if our data rights are to be meaningfully respected. There is a real risk that those best intentions for children misfire, and do nothing more than worry about children viewing adult web content. An overly narrow focus on their relationships with people, stranger-danger-style online protections, often comes at the expense of scrutiny of their relationships with organisations: limiting advertising and tracking (online and off), profiling and price discrimination, and hidden data misuse.
The Bill and the coming strategies could end up the unwanted children of Henry VIII powers and a contemptuous relationship with the EU. Regulation designed to protect people could turn out very ugly for our rights if these broad exemptions are not reined in, and if online access is over-filtered and over-policed in the name of protection.


Our briefing ahead of the Data Protection Bill House of Lords Second Reading [download .pdf 212 kB].


[1] Dowty, T. and Korff, D. (2009) Protecting the Virtual Child: the law and consent to sharing children's data. https://www.northumbria.ac.uk/static/5007/hces/virtual_child.pdf
[2] Murray, A. D., International Data Privacy Law, ipx015, https://doi.org/10.1093/idpl/ipx015
[3] Page 145 of 218 in the .pdf version of the Bill: https://publications.parliament.uk/pa/bills/lbill/2017-2019/0066/18066.pdf
[4] Memorandum of Understanding [link to wdtk], paragraphs 15.1.4 and 2.6.
[5] Baroness Anelay of St Johns confirmed in the Lords on September 12 that Brexit will mean leaving the jurisdiction of the CJEU.