
New KC legal opinion on the Data Protection and Digital Information Bill

KC Legal Opinion

Defend Digital Me has commissioned, and is placing into the public domain, a Legal Opinion by Stephen Cragg KC of Doughty Street Chambers on the Data Protection and Digital Information Bill (2023).

This Opinion (a) summarises the main legal arguments and analysis; (b) provides a more detailed explanation of the Bill; and (c) lays out the legal opinion in full.

In summary, he found the following points of concern:

    • The proposed change to the definition of ‘personal data’ in the Bill has the potential to mean that some data currently defined as ‘personal’ will in future be excluded from protections in the DPA 2018 and UK GDPR. In particular, the definition of ‘personal data’ could change depending on who is processing the data, and the Bill removes a data controller’s ongoing duty to consider whether retained data has become ‘personal data’.
    • The terms ‘scientific research’ and ‘scientific research purposes’ would now be defined by clause 2 of the Bill to mean ‘any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity’.
    • Loosening of requirements on purpose limitation will assist commercial and non-commercial organisations involved in research and re-using personal data obtained from third parties, but will do nothing to increase protection for individual data subjects.
    • A list of ‘legitimate interests’ has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned. The Secretary of State can add to this list without the need for primary legislation, bypassing important Parliamentary controls. Business-friendly interests, such as direct marketing, are now listed without provisos as interests which may be seen as ‘legitimate’, giving succour to commercial organisations but no added protection to the personal data of individuals.
    • The powers of the Information Commissioner are diluted in a way which provides less protection to data subjects, but much more power to the government to restrict and interfere with the role of the Commissioner.

“Overall the Bill is a significant shift away from a rights-based regime towards a set of market standards which treats data as a product, raising concerns that the UK is moving away from international benchmarks and standards.”

“If the new definition of personal data is enacted, that will also, of course, mean that fewer data of children will be protected under the new law.”

“The best way to protect children’s data is by the retention or introduction of specific safeguards in legislation. However, there is no doubt in my mind that, additionally, such a code of practice as previously advocated for by DDM would be a useful tool for ensuring that special care is taken when the processing of the personal data of children within the education and social care systems (especially) is under consideration.”

Download the new legal opinion on the Data Protection and Digital Information (DPDI) Bill by Stephen Cragg KC [PDF, 282 KB]

About Stephen Cragg KC

Stephen is team leader for Doughty Street’s Data Protection and Information Rights Team and is listed in the 2024 Chambers legal directory as a top-ranked silk for data protection law.


The Defend Digital Me view on the Bill

Defend Digital Me is calling for the Data Protection and Digital Information Bill to be withdrawn. In March 2023, Defend Digital Me, as part of a coalition with 25 further civil society organisations, wrote to the Secretary of State for Science, Innovation and Technology, Michelle Donelan MP, calling for the Data Protection and Digital Information (DPDI) Bill to be dropped. The signatories included trade unions as well as human rights, healthcare, racial justice, migrants’ rights, workers’ rights and criminal justice organisations. We share concerns that the government’s proposals will seriously weaken data protection rights in the UK, and will particularly harm people from marginalised communities.

This under-scrutinised Bill makes significant and substantive changes to the safeguards in place for the protection of everyone, including children, in the digital environment.

This comes just as DfE Ministers are reportedly looking at pupil data and ‘what it’s worth’. In our view at Defend Digital Me, that sets us all up to be used as AI training datasets, and for wider commercial re-use of the stories of our lives through our digital records: held by the public sector, but used for purposes to which we have not consented.

It will make our personal data more available for commercial benefit, while putting our personal privacy at risk. Do we all want to be turned into training datasets for AI without our permission, or without even being told? For children it could mean a lifetime of direct marketing and political profiling, and data given away in childhood that, once they have capacity, they can no longer control using data protection law.

This Bill upends the principles of necessity, proportionality and data minimisation. It touches on and negatively affects every one of the seven foundational principles of data protection law.

In particular, Defend Digital Me believes it means that children will find it much harder to exercise their own rights to manage their personal data once they become adults, for example where consent was given on their behalf in the past by a legal guardian. This might include genomic data, DNA, biometrics, or increasingly sensitive bodily data collected by emerging technologies in gaming haptics, in education, and by voice-assisted tools.

One of the respected constitutional law scholars of the 20th century, the late Paul Freund, memorably said that the U.S. Supreme Court “should never be influenced by the weather of the day but inevitably they will be influenced by the climate of the era.” At Defend Digital Me, we see this Bill as set in a perfect storm: a post-Brexit, post-pandemic economy and the Home Office “Hostile Environment”. Data protection law is not something that should be driven by the political weather of the day. An ever-increasing volume of data about us, in an increasingly automated world, demands greater efforts to uphold protection for, and give power to, the people whose lives are affected by its use and abuse.

Nothing makes it necessary to change today’s law so as to make current data practices less safe, less fair and less transparent to the people whose lives are recorded in their digital activity, their likes and habits, and whose data footprints are already tracked across the globe by thousands of third parties vying for our online attention.

This Bill gives more power to data users and takes it away from the people the data are about. It will not only weaken protections by enabling more personal data to be excluded from protection under research exemptions, but over time will put vast amounts of processing outwith the definition of personal data entirely; relabelling the data does not change its nature, its sensitivity, or its threat model. It also restricts our access to see what is done to our digital selves, through interferences with our private and family life, and leaves fewer routes to remedy and redress when things go wrong. This Bill takes the UK in not only a new, but a completely wrong direction. It must be dropped.



Defend Digital Me is calling for:

  1. This Bill to be withdrawn.
  2. An ICO Code of Practice to be drafted for (existing) data protection law in educational settings.
  3. Regulations to ensure that Higher Education student monitoring data on equality of opportunity is reduced from identifying to statistical data beyond the point of collection and internal use at their setting.

An amendment for a Code of Practice on pupil data (‘Code on processing personal data in education where it concerns a child or pupil’) was proposed in 2017 (pages 25-26), again in 2018 (at col. 324, ‘Code on processing personal data in education’), and yet again in 2021 in the debate on the Skills and Post-16 Education Bill, which again gave new national data collection powers to government.


References

  1. Original post formatted to download as a .pdf file
  2. Data Protection and Digital Information Bill https://bills.parliament.uk/bills/3430/publications
  3. Defend Digital Me Second Reading Briefing | Data Protection and Digital Information Bill (2) v1.5 16/4/2023.pdf
  4. DPDI Briefing accompanying letter as audio [download DPDI Bill DDM letter 12122023 audio version].
  5. The Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data (CETS No. 108) https://rm.coe.int/1680078b37
  6. Open Rights Group-led coalition letter (March 7, 2023) https://www.openrightsgroup.org/press-releases/26-civil-society-groups-call-on-government-to-scrap-data-protection-and-digital-information-dpdi-bill/
  7. The 2018 resolution from the International Conference of Data Protection and Privacy Commissioners https://edps.europa.eu/sites/default/files/publication/icdppc-40th_dewg-resolution_adopted_en_0.pdf
  8. Defend Digital Me edTech and pupil data (2023) Briefing: November 2023 (WIP) v.1.9
  9. Defend Digital Me. (2021). The Words We Use in Data Policy https://defenddigitalme.org/research/words-data-policy/
  10. Research from 30 universities across the UK on their policies towards monitoring students’ equality of opportunity by sexual orientation, religion and disability. (Download Scotland) and (Download England and Wales).
  11. Added December 2023: link to the European Commission mission report of November 2022, on which Politico reported at the time: “It was appalling, it was all about growth and innovation and nothing about human rights,” said one French MEP about a meeting with U.K. government officials about the reform plans. “I never heard them say, protecting data is a fundamental right. Even in Hungary they say this.” Italian MEP Fulvio Martusciello from the centre-right European People’s Party said his impression from the visit was that Britain is “giving in on privacy in exchange for business gain.” French MEP Gwendoline Delbos-Corfield said she felt “we were taken for fools” after Digital Minister Julia Lopez quit the meeting halfway through, U.K. Home Office ministers didn’t bother to meet them, and the U.K.’s data regulator, the Information Commissioner’s Office, sent Acting Executive Director Emily Keaney rather than chief John Edwards.
  12. And on the question of adequacy: E-001790/2023 Answer given by Mr Reynders on behalf of the European Commission (2.8.2023) https://www.europarl.europa.eu/doceo/document/E-9-2023-001790-ASW_EN.pdf

What is missing, in more detail?

Given our remit in the education and children’s sector, we considered the effects on children in particular. In our 2021 report, The Words We Use in Data Policy: Putting People Back in the Picture, we explained that if the government wishes to simplify the law, it should not be rewriting another version of one part of our data protection law at all; acknowledging the failed transposition of the 1995 Directive, implemented in the UK through the Data Protection Act 1998, we should not be starting from here.

This Bill is an attempt to take us back in time. The GDPR came into being recognising the risks of emerging technologies and their implications for human rights, and recognising that what had gone before was not enough to meet growing public outcry over data misuses. The UK government should be consolidating the two amended laws (the UK GDPR and the Data Protection Act 2018) into one comprehensive piece of data protection law, not redrafting new, complex layers.


Children’s protection and the recognition of their needs are missing

This Bill contradicts the wide acceptance of today’s position in recital 38 of the GDPR that, “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing …”. Instead it says marketing should become a formally recognised legitimate interest that outweighs the rights and freedoms of individuals. This fails to understand the adTech ecosystem and the real-time bidding mechanisms running behind the scenes of today’s online marketing every nano-second of every day, wherever we go and whatever we do. It fails to recognise that children, as well as adults, should be protected from the undue influence those systems can have on our autonomy, our agency and the choices we make. That includes the weakening of protections against the repurposing of data for political persuasion: an area so desperately in need of protection, given mis- and disinformation, to uphold democratic aims.


Children’s ability to exercise the rights to rectification and erasure is removed

The right in Recital 65 of the GDPR will become near impossible for children to exercise later in life, if in future their personal data stops being ‘personal’ some way down the line. [A] “data subject should have the right to have personal data concerning him or her rectified and a ‘right to be forgotten’ where the retention of such data infringes the law. In particular, a data subject should have the right to have his or her personal data erased and no longer processed where the personal data are no longer necessary in relation to the purposes for which they are collected or otherwise processed, where a data subject has withdrawn his or her consent or objects to the processing of personal data concerning him or her, or where the processing of his or her personal data does not otherwise comply with this Regulation. That right is relevant in particular where the data subject has given his or her consent as a child and is not fully aware of the risks involved by the processing, and later wants to remove such personal data, especially on the internet. The data subject should be able to exercise that right notwithstanding the fact that he or she is no longer a child.” How is anyone supposed to exercise this right if the redefinition of personal data changes its status over time, and later in life the former child can no longer exercise rights they should have been able to exercise at the point of collection, but were then incapable of doing so?


Safeguards for children from profiling fall away under a list of Legitimate Interests

In particular, adding anything to do with children to the list of Legitimate Interests (LI) under a ‘safeguarding’ umbrella is completely unnecessary.

Legitimate interests is overwhelmingly a legal basis used by commercial companies for data processing about children, not one used by public authorities under their public task duties. Where an existing legitimate interest applies, it continues to apply. Adding an open-ended ‘safeguarding’ interest to a recognised list of LI, however, will mean companies no longer need to carry out a risk assessment for large-scale, high-risk processing. For example, it could cover some safetyTech companies, whose tools may enable millions of covert photographs of children to be taken by schools or by the companies themselves, and which may label children with suggested vulnerability to ‘risks’ of terrorism and extremism or self-harm, tag them as ‘oversharers’, or predict harm to others: technology already proven to have unintended consequences, including outing LGBTQ teens in the U.S. None of that would be changed by adding safeguarding to the LI list on the face of the Bill; it would instead mean such companies can continue, or change the scope of, covert activity that profiles and tracks children without oversight or changes to design, and without scrutiny of where children’s nudes or intimate records may be retained in the world, or of who can access them. This is an area that needs more scrutiny, transparency and limitation, not less.

The Bill contradicts the spirit of Recital 71, which states that children should have the right not to be subject to a decision, “which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her […] In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision.”

When it comes to political opinion and data used for political purposes, we have had no answer to the question of what the safeguards are for teenagers. Such safeguards would be required by Article 6 of Convention 108.

The desire to use the UK population as a resource for the private sector to exploit in building data products in the short term, with even less oversight and fewer controls, will come at the cost of long-term risks and harms to people through greater exposure of their digital footprints and identities. The effects can last a lifetime.


Data protection is a gatekeeper to access to digital identity

This Bill is a disaster for the public, in particular for children, and for the immediate and future security of UK identity as a national asset and a national security gatekeeper: weakening the protections around personal data and ID is a threat not only to individuals but to the State. Its soft wording makes the changes seem insignificant, but if so, why change the law at all? In fact this Bill brings in substantive and significant change to the UK data protection regime. The UK will diverge from existing data protection law, become a lone rider in a new wild west of data washing for data hoarders around the world, and risk the interoperability that comes from common high standards for personal data flows across borders.


Data protection is a gatekeeper to children’s lives with lifelong consequences

In an apparent u-turn on its recent obsession with children’s online safety, the government is now writing law that will reduce children’s digital protections. This runs contrary to everything anyone interested in the subject has recommended for a long time, including the 2018 resolution of the International Conference of Data Protection and Privacy Commissioners, which warned that:

“inappropriate data processing practices by e-learning platforms, opaque automated decision-making and misuse of learning analytics, risk undermining data protection and privacy rights. In the case of children and youth, this can have significant and long-term social, economic and professional consequences.”

This is, furthermore, a particularly high risk for genomic data, often captured without consent as a baby. There is no need for change by actors who want to operate to the high standards of protection expected by the public, and for public interest research; those who want this changed want to operate to lower standards, for example by not having to tell the public broadly what they will do in advance (at the point of collection), or by not giving people any choice over which re-use purposes to exclude, such as military or abortion research, to which some object as a matter of conscience.

GeneWatch UK agrees. Dr Helen Wallace, Director of GeneWatch UK, has said alongside their briefing: “This is a short-sighted and extremely dangerous attempt to tear up existing safeguards for people’s DNA and genetic information. If passed, these changes will damage people’s trust in health, research and police uses of their DNA, perhaps for generations”.


The Bill should be scrapped: it makes our research regime more amateur and puts people at greater risk of harm and unwanted interference.

There is no need to redraft the definition of personal data, nor the definition of research, nor the boundaries of legitimate interests. If those changes go ahead, they will not only weaken safeguards by enabling more personal data to be labelled as “exempt” under research exemptions; swathes of personal data will be put outwith the definition entirely. That relabelling will not change the nature of the data, its sensitivity, or its threat model when distributed or used. It serves only to enable more exploitation of the UK public by industries (domestic, international or otherwise), without the public’s ability to object, to be informed, or to express preferences, for example over being used as AI training datasets without permission. Rebranding personal data as research data simultaneously weakens the protections set around the qualifications of the people accredited to handle it in safe ways. It reduces recourse to redress for malicious and opaque re-uses, with effects for state and border security as well as for individuals, for example in the case of children’s DNA and biometrics. Overall, it makes our research environment more amateur and increases the risk, for public interest researchers, that public trust in the nature of their work will be lost.

In the UK we have already lost the protections of fundamental rights post-Brexit, because the government chose not to carry them over from the EU Charter of Fundamental Rights into UK law. That means, in particular, a weakening of the rules that uphold respect for private and family life, home and communications; the protection of personal data; freedom of thought, conscience and religion; freedom of expression and information; the freedom to conduct a business; the right to an effective remedy and to a fair trial; and cultural, religious and linguistic diversity. All this in a world in which automated and machine-led decision-making affects us more and more, at speed and scale, and in which we are under constant threat of identity theft and financial fraud. The Bill fails to think big or to think for the future. Instead it creates a future in which digital identity can only be a state-accredited ID needed to exercise housing and employment rights, while making the frameworks for protecting those identities far weaker. Data protection laws are used to uphold our human rights; they are not supposed to be designed to become a gateway to fraud at speed and scale, and to mass exploitation in secret.


Remember what harms we are talking about, from which people may need protection: GDPR Recital 75

The risk to the rights and freedoms of natural persons, of varying likelihood and severity, may result from personal data processing which could lead to physical, material or non-material damage, in particular: where the processing may give rise to discrimination, identity theft or fraud, financial loss, damage to the reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage; where data subjects might be deprived of their rights and freedoms or prevented from exercising control over their personal data; where personal data are processed which reveal racial or ethnic origin, political opinions, religion or philosophical beliefs, trade union membership, and the processing of genetic data, data concerning health or data concerning sex life or criminal convictions and offences or related security measures; where personal aspects are evaluated, in particular analysing or predicting aspects concerning performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, in order to create or use personal profiles; where personal data of vulnerable natural persons, in particular of children, are processed; or where processing involves a large amount of personal data and affects a large number of data subjects.