News / legislation policy

Lords debate rights for children and pupils in education in the Data Protection Bill

Cross-party peers discussed better data protection safeguards for children last night, in the first debate at Report Stage of the Data Protection Bill.
“Children are currently disempowered in relation to their own personal data in schools,” said the Earl of Clancarty, as he introduced amendment 117 for a new code of practice on the processing of personal data in education where it concerns a child or pupil.
A code of practice for children and pupils: enforceable guidelines set out by the Information Commissioner

“A code ought to reflect the child’s right to be heard as set out in Article 12 of the UN Convention on the Rights of the Child. Among other things, it would allow children, parents, school staff and systems administrators to build trust together in safe, fair and transparent practice.”
He picked up on this sensitivity following the earlier discussion of a code concerning age-appropriate understanding by children. That code was accepted by the government in an amendment requiring the Information Commissioner to develop guidance that would apply where children use online “Information Society Services” targeted at them, excluding counselling services.
That code, originally proposed by Baroness Kidron, would expand on Article 8 of the GDPR, which governs the age at which a child can give the consent that provides the legal basis for processing personal data under Article 6(1); where a child is under 13, that consent must come from a parent.
The Earl of Clancarty welcomed it, and went on to explain the principle behind the second, new proposal: that the same importance should be given to the school setting, “where personal data given out ranges from a new maths app to data collected by the DfE for the national pupil database”.
Schools pass children’s personal data to many commercial companies during a child’s education. This is, however, rarely based on consent under Article 6(1)(a) or 8(1); it is instead assumed to be “for the performance of a task carried out in the public interest.” A code should clarify the boundaries of this legal basis where parents are under an obligation to provide the data, and what it means for the child on reaching maturity and after leaving education.
We would welcome a code of practice that gives the ICO stronger enforcement powers than guidance alone on what good practice looks like, although some changes should still be made on the face of the Bill to cover gaps for children, especially safeguards on automated decision-making and profiling. A code of practice should not be used to patch gaps in primary legislation.
It would, however, explain what that legislation should mean in practice. Helping edTech companies and developers to understand the requirements of the Articles on data protection and privacy by design and by default, and on security, would benefit companies and individuals alike.
It would give schools clarity on what policy and practice should look like, particularly around profiling, consent and the public interest, and it would bring consistency, enabling children and parents to understand their rights in an age- and capacity-appropriate manner.
It could cover how controllers and processors should design for data minimisation, and how best to explain where personal data are collected in education, bringing clarity to the exemptions and definitions currently missing from the face of the Bill. This is important for children in schools, where profiling and biometrics are commonplace and it is unclear what constitutes a “significant effect” on children.
Clarity on rights is missing for schools, children and pupils in education today

Issues of consent, the age of a child and their capacity to consent in an informed way came to a head in last year’s school census, noted the Earl of Clancarty: “when nationality and place of birth were introduced by the DfE last year, many schools demanded that passports be brought into the classroom. In effect, the DfE operated an opt-out system.”
He mentioned the exemption for “immigration purposes” in the Bill as it stands [Schedule 2, para 4]. We welcomed that he shared our concern that it “impinges on fundamental rights to privacy and the basic data protection principles of purpose limitation and fairness,” and we recommend it should be removed.

The child’s right of subject access was highlighted as lacking: a right that is refused today by the DfE, but which he said should change. “It is wrong too that a company may have more access to a pupil’s data than the pupil themselves, or indeed have such data corrected if wrong.”
National Pupil Data

He described the Department’s historic failure to track which pupils’ data are sent where, raised in a question by Darren Jones MP, as “unacceptable”: “There needs to be a proper auditing process, as data needs to be safe.” We agree the data must be safe. They should be accessed, not distributed.

Lord Clement-Jones spoke about the volume of data releases from the National Pupil Database for commercial re-use and other third-party purposes since 2012, saying the Department for Education has to take responsibility for it, and that “We cannot solely rely on the digital resilience of children. Industry and government must adapt the digital environment to make it fit for children by acting above and beyond commercial consideration.”
A code should improve not only practice at local level, but also what the Government does today with national pupil data, which is, as Lord Stevenson said, “shocking”: “several things seem to be going wrong” [with national pupil data handling], and it gave him “a chill”.
Current concerns

While the government talks about better data protections for children using social media, the state’s own use of personal data is far removed from the rights and freedoms it says children should be able to enjoy. We welcome the Earl of Clancarty’s comment on the disastrous new Alternative Provision data collection, which will be so damaging for pupil privacy and confidentiality.
“There is now also considerable concern about the further expansion of the census data in January next year to include alternative provision data on mental health, pregnancy and other sensitive information without consent from parents or children, with no commitment to children’s confidentiality and without ceasing the use of identifying data for third-party use.”
 
The crux of this Data Protection Bill is whether the government is serious about data protection and the rights of the child, or not.
We welcome peers’ interventions and will continue to work towards shaping this legislation so that it delivers safe, fair and transparent practice for children and their data.


Link to: Report Stage Briefing on amendment 117