
Children’s voice won’t be heard in DCMS consultation on data privacy

Children under 16 in the UK will lose the right to use children’s websites (other than those offering ‘counselling services’) without showing evidence of parental approval from May next year.
That is, unless the government acts to lower the age threshold for some services, from 16 to as low as thirteen.
This is just one of the areas in which the government must decide how the General Data Protection Regulation's derogations will be adopted in the UK.


While the government seems keen on claiming the positive right to erasure as its own**, that particular change is already a done deal.
**Although this has opened up the question of whether it is in fact as good as, or waters down, the child's right to erasure, by suggesting that in the UK it would apply from age 18, rather than, as it should under GDPR, regardless of age.
The GDPR is enforceable from May 25, 2018. What the UK government can control, and must take decisions on for children's rights, it is not yet addressing well. Although the UK was involved in the lengthy preparation of the Regulation published in April 2016, questions of how these flexibilities should be implemented here remain publicly undecided.
The significance of this is underestimated and not yet widely debated beyond academic circles and those with GDPR interests. The potential for making things worse than today for children's right to privacy is huge. We have asked the Department for Culture, Media and Sport for a review of how this will affect children.
The Department asked for views in a very short recent consultation, but offered nothing to consult on: no background information, no suggested proposals, and no guidance. It will have been impossible for charities, children's organisations, or young people themselves to participate meaningfully unless they had followed the development of the law over the last three years.

No one could understand the implications of the bare consultation without some familiarity with the 260-page Regulation, with its 173 recitals and 99 articles. As the NGO Open Rights Group pointed out this week, the consultation therefore appears to breach Cabinet Office guidelines, which state that consultations should:

“Give enough information to ensure that those consulted understand the issues and can give informed responses.”

So why is this so significant for children? The legislation is intended to better protect young people online from predatory practices. As Professor Sonia Livingstone of LSE points out,
“Currently the data practices targeting adults are the same as the ones used for teens – cross-platform, mobile location tracking, predictive analytics – these are unfair in the sense that they exceed reasonable expectations of young teens’ digital literacy. No wonder that the US’s Center for Digital Democracy, among others, supports the new regulation, arguing that teenagers especially deserve greater opt-in, transparency and individual control.”


However, in practice the intended improvement in data protection could turn out to mean not only a massive privacy grab by commercial companies, but also the introduction of new restrictions on online freedoms. Some methods could be deeply invasive of privacy if data is captured on the age of the child and on the child's relationship to the spending power of an adult.
Of what benefit is it to a child's data protection or privacy if the parent 'approves' use of a page? Providers may then capture the personal data of both parent and child, profile them, store it, share it, and use it to target users with commercial ads or 'in-app' promotion of paid content, for example.
Rather than simply verifying that your child has the right to use the web page, providers could start to demand parents’ credit card data, their email address, or even selfies or other evidence of the parental relationship. Conversely, teenagers could miss out on participation in online services if companies think the burden of collecting this evidence of parental consent is not easily manageable or cost-effective.


Janice Richardson, former coordinator of the European Safer Internet Network and expert to the ITU and the Council of Europe, published an open letter in December 2015 objecting to the change, pointing out the obvious incentive for children between the ages of 13 and 15 to lie about their age rather than ask their parents to consent. This perversely makes it difficult for online services to offer children age-appropriate guidance and tools to ensure a safe and privacy-protective experience online.
The single largest and most important data protection and privacy legislation in twenty years should put children front and centre, safeguarding their future online lives.
The UN Convention on the Rights of the Child emphasises that children's views should be heard and given due consideration within the family and in schools, including “from the most marginalised communities.” The Council of Europe Strategy on the Rights of the Child 2016–2021 further notes that, “Digital tracking and surveillance, the collection of personal data, including sensitive data related to health, for the purposes of profiling pose a threat to privacy and the general enjoyment of human rights including freedom of expression and access to information.”
If the age verification boundary in Article 8 is left at sixteen, this would be a significant difference from the similar US law, COPPA, which sets its threshold at thirteen, and may come as a surprise to many of those offering Internet content to young people.


Who will decide which sites qualify for the additional requirements and which do not?
We don’t know how this will be policed, or what the appeals process may be.
Will the codes of practice under Article 40 ensure that children are protected as intended, and not simply set up to be targeted as commercial fodder on their 16th (or 13th) birthday?
Will parental prejudice trump child rights? Parents cannot truly consent on behalf of a child. Parents may assume responsibility for the method of age verification but their prejudices could impede children’s human rights to participation and access to information, to develop skills and online resilience.
Will privacy organisations be able to make complaints to the ICO under Article 80, in defence of children’s privacy for example, or not?
The vital question is what age verification methods will be offered under Article 8, and how they will be compatible with these principles of special protection.
We should also be asking, early on, what implications this has for website users of all ages. It is possible that, in order to identify young people and determine whether parental consent is required, sites will claim they must require all users to identify as under or over the Article 8 age. See the way the BBC now requires registration of all viewers for its online services.


Recital 38 of the GDPR requires that any change to verifying age, and whether parental approval is required, also be seen in the context that children merit special protection. This applies at all ages and across all other parts of the Regulation.

Who counts as a child is not defined. The age threshold can be lowered as set out in Article 8(1), and only for the purposes of that Article, i.e. access to information society services, but it will have effects wherever it touches other articles, such as the right to erasure. What might this mean for interpreting other provisions and the rights of children, such as the Article 8(2) need to evidence parental consent?
Article 8(2) asks controllers to make ‘reasonable efforts’ to verify that use of the site has been consented to or authorised by the holder of parental responsibility, ‘taking into consideration available technology’. Will companies require some evidence of familial relationship or parental responsibility?
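The Article 8 mechanics can be sketched in a few lines of code. This is a hypothetical illustration only, not any provider's actual implementation: the function name, the `parental_consent_verified` flag, and the idea of a configurable threshold are all assumptions, but the decision logic follows Article 8(1) as quoted in the footnotes below, with the member-state threshold being exactly the 13-to-16 derogation the UK must decide on.

```python
# Hypothetical sketch of an Article 8 consent gate for an
# "information society service". Names are illustrative, not real.

UK_THRESHOLD = 16   # could become 13, 14 or 15 under the UK derogation


def consent_basis(age: int, threshold: int = UK_THRESHOLD,
                  parental_consent_verified: bool = False) -> str:
    """Return which consent basis, if any, makes processing lawful."""
    if age >= threshold:
        # Article 8(1): the child can consent in their own right.
        return "own consent"
    if parental_consent_verified:
        # Article 8(2): the controller must have made 'reasonable efforts'
        # to verify the holder of parental responsibility consented.
        return "parental consent"
    return "no lawful basis"


# Lowering the threshold changes who needs a parent's sign-off:
print(consent_basis(14))                # "no lawful basis" at a 16 threshold
print(consent_basis(14, threshold=13))  # "own consent" at a 13 threshold
```

The sketch makes the article's point concrete: the entire question of which 13-to-15-year-olds must produce a parent, and what a provider must collect to set that one boolean flag, hangs on a single number the government has yet to choose.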
Article 17(1)(f) gives children a special right to erasure and places an obligation on the controller where the data was collected when the person was a child, ‘in relation to the offer of information society services referred to in Article 8(1)’.
However, Article 17(3) also sets out grounds on which paragraphs 1 and 2 do not apply, and on which organisations could refuse the right to erasure, including (e) the establishment, exercise or defence of legal claims. Could these two provisions come into conflict? I believe that practical questions like this still need to be clarified.

Care and Change Required

UK derogations must be privacy-enhancing, not harmful, for young people. We must balance age-based restrictions intended to protect children from content against ensuring that their rights to participation, to access information and to use services are not limited.
Too often, today’s ‘think of the children’ web safety arguments are no more than knee-jerk responses to block and restrict what kids shouldn’t see. Decisions on what the GDPR will mean must be evidence-based, and the question is complex and nuanced.
If companies want to see their online businesses thrive they will need to be transparent to be trustworthy. It is also vital to get these legislative changes right for public trust in the education and commercial edTech sectors, as noted in the UK Digital Strategy.
For children, it is not only the legislation itself that has particular implications, but how and where it will apply in practice to children’s data capture today. Derogations include the use of sensitive data and biometrics. In the UK, for example, technologies capturing these data are commonly used for children where they are not for adults, or for children in other countries. In schools, a lack of ‘freely given’ consent still appears to be the norm: all data capture is treated as necessary and proportionate, and parts of the public sector have become blasé about sensitive data sharing. There is often no choice offered to children, despite legal requirements to do so. Fingerprinting technology has become ubiquitous in UK school canteens and libraries, a privacy invasion other EU countries and organisations consider a step too far.
It is vital that there is greater in-depth and ongoing discussion to ensure the derogations support the rights of the child, while also underpinning data practices for future technology and economic interests. It will matter to get this right for some time to come.
This is why we have proposed a report on current practices in schools, to present easy steps for getting ready for GDPR in one year.
For children, and parents, getting this wrong will mean our privacy is pimped out to private companies, just as our children’s personal lives, from their school records, already are by the British government.

  1. Our submission to the consultation calls for a separate review for children. Even our own reply was limited by the lack of proposals offered, and the extreme time limit.
  2. The term “information society services” is defined by Directive 98/34/EC as amended by Directive 98/48/EC, laying down a procedure for the provision of information in the field of technical standards and regulations.
  3. Pursuant to Article 1 point 2 of Directive 98/34/EC as amended by Directive 98/48/EC, an information society service is understood to be any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.
  4. Article 8 of the General Data Protection Regulation (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016): “Where point (a) of Article 6(1) applies, in relation to the offer of information society services directly to a child, the processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child. Member States may provide by law for a lower age for those purposes provided that such lower age is not below 13 years.”
  5. The controller shall make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.
  6. The special consideration children need under Recital 38 must also be seen carefully in this context, and Article 8 not treated as a stand-alone ‘be all and end all’.
  7. Update Monday 15 May: we have been directed to see also the published work of Djordje Krivokapic.