
Comment on Data Protection Bill DCMS Statement of Intent

Parliament may be in six weeks of summer recess, but the DCMS launched a statement of intent today for the Data Protection Bill expected in September.
What will it mean for children’s data privacy and data protection?
Matt Hancock, Minister of State for Digital, said:

“Our measures are designed to support businesses in their use of data, and give consumers the confidence that their data is protected and those who misuse it will be held to account.”

We want to see that start with government departments.
Five months ago the government passed the Digital Economy Act, in which the Cabinet Office refused to take account of the children’s rights that had already been set out in the GDPR twelve months earlier.
That Act put government policy first, and data users’ wants ahead of the rights of citizens. As recognised data experts Tennison and Bracken noted, preparation of the Digital Economy Act was, “more driven by the needs of Whitehall and its agencies, than data users accessing that data.” [11:05]

There is a need for honest debate about current data sharing agreements across all government departments, to restore public trust in data policy and data integrity.
The use of school census data for immigration enforcement highlights the gap between what we are told will be done with our data, and what is actually done. The expansion of the school census for this purpose is an ongoing and disastrous breach of trust by Government. The monthly use of children’s data against them and their family members continues without public transparency or oversight.
The Department for Education also still hands out identifying data about individual pupils to commercial companies, charities, data intermediaries and journalists [1] without consent, and without having told the 23 million people whose data they are. Data including, “the most sensitive personal information.” This blatantly contradicts the claim in this DCMS statement of intent about wanting to restore control and rights to citizens.
And the same government claims that, “Child online safety is one of the top priorities for this government.” How can it be, if its own data policies put children at risk? There should be no doubt: handing out identifying data into the wild, to a range of third parties, is a safeguarding risk.
These differences cannot be resolved by legislation alone, but through policy, practice, and enforcement of current and future data protection legislation.
Anyone examining the Data Protection Bill must look closely at what the GDPR and derogations permit, and where the UK government leaves gaps.
It is in those gaps, that misuse may occur, by mistake or design. It is in the gaps that the public’s “reasonable expectations” in GDPR and reality may differ. And it is in the gaps between what is said and what is done, that the public will lose trust in data policy makers.
The outcome of this bill must be better policy on collection, retention, and respect for children’s rights than today’s DfE use of children’s data, and better practice across the education sector.
It is not enough to say children’s rights are underpinned by data protection and privacy by design. We must all see safe, fair and transparent children’s data in practice.


First thoughts on key statements as regards children:
1. “We will require parents or guardians to give consent to information services where a child is under the age of 13.” [page 8]

This is a clear statement of intent that the age below which parental consent is required for a child’s data processing will be set at 13 for the UK. This refers to a provision in Article 8 of the GDPR which, as Joseph Savirimuthu, Senior Lecturer in Law at the University of Liverpool, wrote in March 2016, has its drawbacks and will need detailed thought on how it is implemented:
“in one swift move on 15 December 2015, turned the clock back to the days when decisions about children were firmly placed in the hands of adults.”
“Children’s rights campaigners and advocates such as Janice Richardson questioned the short-sightedness of policymakers in using data protection regulation to encroach into broader social and developmental issues.”
“What is missing from the way policymakers have drafted Article 8 is an ability to appreciate, at a practical level, that if children as individuals are taken seriously, respect for their human rights would mean that their interests and needs are not readily assumed to be aligned with those of their parents. This is a debate for another time.”
That time will be now, in this bill, or never. The nuance in the wording of the bill will matter: it must be fully aligned not only with Article 8, its requirement to consider ‘available technology’, children’s rights in the rest of the bill, and other human rights law, but also with the intent of other GDPR provisions: that children merit special protections, and that no more data should be collected about them than necessary, even to meet the requirement of Article 8.
There are big practical questions to answer if this is to be meaningful for children: it must offer them protection, not exclude them from services, and not simply be a way for companies to cover their own bad practices with a ‘get-out-of-jail-free’ card.
And there are unanswered questions that apply to the DP Bill and across the GDPR, such as the exception for counselling services: how are these to be defined, and by whom?
Page 16 of the DCMS statement must, however, be read carefully where it states that, “the GDPR allows the UK to set that threshold for the minimum age at which a child can consent to data processing to any age between 13 years and 16 years.”
It would be incorrect to read that as meaning all data processing. The Article 8 derogation is relevant only to the use of “information society services” targeted at children. Basically, online services, with exceptions for counselling. The definition is narrow and specific, and means a service as defined in point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council.
The statement of intent notes that, “Setting an age threshold is not the only - or best - way to protect children’s personal data.” and that, “the Internet Safety Strategy will show a better way.” [page 18]
That draft strategy must be put into the public domain for discussion as well, and to consider the whole picture at the same time.
We must be clear, the age of a child where data processing requires parental consent in GDPR is NOT about consent for all data processing, but ONLY in the use of certain online services, targeted at children.
2. “In view of all these considerations, we will legislate to allow a child aged 13 years or older to consent to their personal data being processed.” [page 18] In the section ‘notable derogations.’

This is the only part of the GDPR where this applies, explicitly for the use of ‘information society services’ targeted at children, and with exclusions, and we will want to see what it means in detail.
Again, it must be clear to policymakers reviewing this bill that if this is included as a generic principle, setting in legislation an age at which children aged 13-18 may consent for themselves to other data processing, it would not be part of the GDPR.
If this statement of intent turns out to be a broad principle, we will want to know very clearly where it came from and why, and on what evidence of need and purpose it is based.

3. “Improved data access: individuals will find it easier to require an organisation to disclose the personal data it holds about them at no charge.”
“People cannot determine whether information held about them is correct if they do not know what information is held. Data controllers will provide better information on how to access information and empower people to take ownership.”
We agree. So why does the Department for Education itself refuse Subject Access Requests [2] today? Is the new bill from DCMS disingenuous, or does the department not know what its neighbour does? We hope this will mean a new era of transparency and support for children’s empowerment at the Department for Education, and a change to its policy of refusal, which appears to contradict even its own guidance.
We are campaigning for children to be able to have their rights met by the DfE and to find out: what’s in my record?
4. “Right to be forgotten: individuals will be able to ask for their personal data to be erased. This will include provision to allow people to require social media platforms to delete information they posted during their childhood.”
This provision in the GDPR is not only about ‘social media platforms’. It is about personal data processing and retention in general, and control of our digital footprint as a whole.
We’ll be watching to ensure government is not trying to water down this right, which children will be able to exercise under the GDPR not only when they reach eighteen, and in other areas of data, not only social media.
5. “Profiling: individuals will have greater say in decisions that are made about them based on automated processing. Where decisions are based on solely automated processing individuals can request that processing is reviewed by a person rather than a machine.” [page 10]

The wording of the GDPR (Recital 71) goes on to say that such profiling and automated decision-making “should not concern a child”, i.e. children should not be profiled in this way where there is a significant effect on them as a result.
Is this missing in the statement of intent by omission or design? Again, we’ll be watching to check government is not trying to water down this right for children.
There are few consultation responses from child rights’ organisations. Most would have found it too hard to respond to the call for views, which was published in May without any information to help young people or organisations understand the bill. The Article 29 Working Party is still to make recommendations on consent, and these will need to be taken into account during the passage of the Data Protection Bill. The ICO’s consultation on consent earlier this year said that it would develop specific guidance on children’s privacy, on age verification and parental authorisation, and hold a separate consultation. That has not happened, and we now believe it might not.
All round, if this bill is to empower children as digital citizens, the detail of what has been set out, and what is missing, will need attention.


[1] Read our latest briefing on the use of national pupil data and five key questions. [download Briefing.pdf, 2.6 MB] Updated August 4, 2017.
[2] Read more about Subject Access Rights for children from the Information Commissioner’s Office.
 
[Published 7 August 2017, with links and minor edits for clarification, 8 August.]