
DUA Bill Committee Stage Day 2 #HumanRightsDay 2024 #DataProtection

On International Human Rights Day, December 10, 2024, peers in the House of Lords discussed the next stage of the Data (Use and Access) Bill, which will take the word protection out of UK data protection law. The wide-ranging discussion of probing amendments and questions included key areas that we, along with many others across civil society, have warned are flawed and need revision or removal.

The debate did not proceed in this order, but let us begin with a question central to upholding public trust in data protection law, and in a new government.

If the goalposts are moved, and what people were told was true at the time their personal data was collected becomes null and void across the whole of the Bill upon its commencement, how can anything people have been told in the past be trusted in future, not only under this law, but across all of the new government's actions?

If you want to #StandUp4HumanRights, you can write to peers, especially those who raised these issues at Second Reading, and/or who are raising them now.

Lord Cameron of Lochiel:
“My Lords, I want to ask the Minister and the noble Lord, Lord Clement-Jones, in very general terms for their views on retrospectivity. Do they believe that the changes to data protection law in the Bill are intended to be applied to data already held at this time or will the new regime apply only to personal data collected going forwards from this point? I ask that specifically of data pertaining to children, from whom sensitive data has already been collected. Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold, or will it apply only to data held going forward?”

The Minister did not answer the question, saying at 18:30:

“in response to the point raised by the noble Lord, Lord Cameron, that the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively.”

This is a key question. It applies not only to the new legitimate interests clause, but to all of the changes in the rest of the Bill: to purpose limitation, to new definitions of research, to the removal of obligations on fair processing, and to the extension of consent to new compatible purposes. The Minister will need to clarify: will the forthcoming changes to data protection law apply to data that controllers and processors already hold on day zero, or will they apply only to data first collected and processed after the commencement date of the Bill? It is the point of collection in time, and what data subjects were told then, that matters. The law must be fair and foreseeable.

The former Scotland Minister might care because of the controversy over the Health and Wellbeing Census, in which children were asked about their sexual experiences in highly intrusive detail, and the data is now kept by the Scottish Government. The children were told the data collected would not be identifying, but it was in fact sent to the national data storage using each child's unique email with pupil ID. The Children's Commissioner in Scotland asked for the census to be stopped but was ignored. Many called for it to be scrapped.

The new clauses could, for example, be used to re-use the data now held for purposes beyond those that pupils and parents were told about at the time of collection. Any future Scottish Government might consider using this highly sensitive data to train commercial AI products, which under current law would not meet the threshold of 'scientific' research.

On data processing in the Education Sector, including for research purposes

Lord Clement-Jones:

“Since the Data Protection Act was introduced in 2018, based on the 2016 GDPR, the education sector has seen enormous expansion of state and commercial data collection, partly normalised in the pandemic, of increased volume, sensitivity, intrusiveness, and high risk. Children need particular care in view of the special environment of educational settings, where pupils and families are disempowered and have no choice over the products procured, which they are obliged to use for school administrative purposes, for learning in the classroom, for homework and for digital behavioural monitoring.”

“The implications of broadening the definition of research activities conducted within the state education sector include questions of the appropriateness of applying the same rules where children are in a compulsory environment without agency or routine practice for research ethics oversight, particularly if the definition is expanded to commercial activity.”

“Parental and family personal data is often inextricably linked to the data of a child in education, such as home address, heritable health conditions or young carer status. The Responsible Technology Adoption Unit within DSIT commissioned research in the Department for Education to understand how parents and pupils feel about the use of AI tools in education and found that, while parents and pupils did not expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement.”

“Businesses already routinely conduct trials or profit from children’s use of educational technology for product development, without their knowledge or parental permission. This is contrary to the UNCRC Article 32 principle of a right to protection from economic exploitation, and to the public engagement [that this] work suggests parents want.”

Baroness Kidron:

“I turn briefly to Amendment 64, which would limit the use of children’s personal data for the purposes of research and education by making it subject to a public interest requirement and opt-in from the child or a parent. I will speak in our debate on a later grouping to amendments that would enshrine children’s right to higher protection and propose a comprehensive code of practice on the use of children’s data in education, which is [an area of] scandal and concern. For now, it would be good to understand whether the Government agree that education is an area of research where a public interest requirement is necessary and appropriate and that children’s data should always be used to support their right to learn, rather than to commoditise them.”

“During debate on the DPDI Bill, a code of practice on children’s data and scientific research was proposed; the Minister added her name to it. It is by accident rather than by design that I have failed to lay it here, but I will listen carefully to the Minister’s reply to see whether children need additional protections from scientific research as the Government now define it.”

On changes to the nature of purpose limitation where consent can be extended to new purposes

Viscount Camrose:

“Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.”

“We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects or even without them knowing. An independent ethics committee would be an excellent way to monitor scientific research that would be part of commercial activities, without capping data access for scientific research, which aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is also surely fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.”

“Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection.”

“Children’s unique vulnerabilities demand special consideration. Their personal data, whether collected through educational platforms, social media or health applications, requires the most stringent protections. It is clearly both our moral and legislative obligation to ensure that this data is used responsibly and ethically, without compromising their privacy or exposing them to harm. Moreover, by extending these protections beyond childhood, this amendment recognises that the consequences of data collection during childhood can stretch far into adulthood. This is an acknowledgment of the fact that privacy is a lifelong right; the data collected in our formative years should not be used in ways that could undermine our dignity or well-being later in life.”

“I also welcome Amendments 88 and 135, which underscore our collective responsibility to ensure that the personal data of children is treated with the highest level of care and respect. They would strengthen the existing frameworks of data protection. In today’s increasingly connected world, where personal data is crucial to the functioning of online services, we must recognise that, due to their vulnerability and developmental needs, children require special protection for their personal data.”

The Minister, Baroness Jones of Whitchurch:

“Amendment 82, moved by the noble Lord, Lord Clement-Jones, would exclude children’s personal data from the exemptions to the purpose limitation principles in Schedule 5 to the Bill. The new purposes are for important public interests only, such as safeguarding vulnerable individuals or children. Broader existing safeguards in the data protection framework, such as the fairness and lawfulness principles, also apply. Prohibiting a change of purpose in processing could impede important activities, such as the safeguarding issues to which I have referred.”

[Our comment: This seems unlikely, given that the Vital Interests basis for processing remains unaffected, and the research exemptions (not product development) remain as they are.]

“In addition, the proposed duty could have some unwelcome and unintended effects. For example, it could lead to questions about why other vulnerable people are not entitled to enhanced protections”. Col 450GC

[Our comment: This would indeed be a very welcome question for the Minister to consider further. The default position under the law should be privacy by design and by default, in particular for vulnerable communities and individuals, and lawmakers should not simply turn a blind eye to the fact that the protections are not as good as they should or could be for everyone. Furthermore, in education there would not need to be any age verification to comply with the new duty. The organisation type, or a flag of ‘in education’ in the data, should be entirely sufficient.]

On changes to conditions for processing: approved legitimate interests, scrapping the right to opt out and a need for a balancing test

Lord Clement-Jones:

“Amendment 85 seeks to ensure that transparency and accountability obligations are not removed from data controllers when processing personal data for the purposes of safeguarding vulnerable individuals based on undefined characteristics that may change and that may apply or not apply to any given individual at any point in time. Amendment 86 seeks to clarify whether and how the conditions for processing personal data based on the vulnerability of an individual should expire when the individual’s circumstances change.”

“The new recognised legitimate interest legal basis in Clause 70 and Schedule 4 should be dropped. Stephen Cragg KC, giving his legal opinion on the DPDI Bill, which, as I mentioned, has the same provision, stated that this list of recognised legitimate interests “has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned”. The ICO has also flagged concerns about recognised legitimate interests. In its technical drafting comments on the Bill, it said: “We think it would be helpful if the explanatory notes could explicitly state that, in all the proposed new recognised legitimate interests, an assessment of necessity involves consideration of the proportionality of the processing activity”. An assessment of proportionality is precisely what the balancing test is there to achieve. Recognised legitimate interests undermine the fundamental rights and interests of individuals, including children, in specific circumstances. When companies are processing data without consent, it is essential that they do the work to balance the interests of the people who are affected by that processing against their own interests.

“Removing recognised legitimate interests from the Bill will not stop organisations from sharing data with the public sector or using data to advance national security, detect crime or safeguard children and vulnerable people. The existing legitimate interest lawful basis is more than flexible enough for these purposes. It just requires controllers to consider and respect people’s rights as they do so. During the scrutiny of recognised legitimate interests in the DPDI Bill—I am afraid to have to mention this—the noble Baroness, Lady Jones of Whitchurch, who is now leading on this Bill as the Minister, raised concerns about the broad nature of the objectives.

“She rightly said: “There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests”.—[Official Report, 25/3/24; col. GC 106.] She never spoke a truer word.”

We suggest that interested parties who support the call for a Code of Practice in education write to peers, the Department for Education, and the Minister in charge of the Bill, here: https://defenddigitalme.org/support-better-data-protection-in-education/

 


Hansard Committee Stage Day 2 (December 10, 2024) https://hansard.parliament.uk/lords/2024-12-10/debates/0FC43302-4C6D-4676-A6B5-E369366CA99F/Data(UseAndAccess)Bill(HL)

Hansard Second Reading (November 19, 2024)  https://hansard.parliament.uk/lords/2024-11-19/debates/6B196F71-312C-4957-AF14-98B66C5DBEE4/Data(UseAndAccess)Bill(HL)

Defend Digital Me briefing for Second Reading (19 November 2024): policy page and briefing to download (.pdf, 319kB, v2.0)

TV catch-up: House of Lords Grand Committee, from 3.45pm