News / Blog

Data Protection and Digital Information Bill

Updated September 5, 2022 ––

The Data Protection and Digital Information Bill [Bill 143 2022-23] was introduced in the House of Commons on 18 July 2022. The Bill is scheduled to have its second reading on 5 September 2022.

Our briefing for Second Reading v1.5 (September 3, 2022) [download .pdf]

A big risk in this Bill is that UK MPs see it as being only about the protection of data as a product, when in fact a key aim of data protection is the protection of people, through the power of information that access to their personal confidential data grants. This Bill will shift that power balance further away from people, towards businesses and government. To readers with limited understanding of existing data protection law, some of the Bill's drafting may appear at first pass to have minimal or even positive effects, but on closer reading it opens the door to significantly weaker protection of children from harms caused by inappropriate use or misuse of their personal data, and reduces their routes to redress.

As we published in 2020 in our work, The Words We Use in Data Policy: Putting People Back in the Picture, and as the Office for the Regulation of National Statistics went on to publish in its own report, children's voices are left out of this debate about data concerning them. We must reconcile the focus of the UK national data strategy with a rights-based governance framework, to move the conversation forward in ways that work for the economy and research, and with the human flourishing of our future generations at its heart.

June 20, 2022 ––

The DCMS consultation, Data: a new direction, has led to proposals to change the UK Data Protection Act only four years after it first came into law. The consultation response sets out some of the expected proposals for a Bill yet to be published. What changes might it make compared with the law as it stands today?

Just over two years ago, the ICO completed an audit of the Department for Education (“DfE”). It made a total of 139 recommendations for improvement, over 60% of them classified as urgent or high priority, concerning the processing of the 21 million records in the National Pupil Database (“NPD”), in which the DfE began amassing detailed personal confidential data about school children and students over 20 years ago.

The NPD is now “one of the richest education datasets in the world,” holding a wide range of sensitive information about pupils and students dating back to 1996. Named records are retained indefinitely, long after children leave school, and the ICO audit raised the lack of weeding and disposal as an issue. Between March 2012 and June 2021, we calculate there were over 2,000 releases of sensitive, personal or confidential data at pupil level, each release containing millions of records.

The promise twenty years ago that names would only be used for statistics was broken. Ten years ago Michael Gove said, “Organisations granted access would need to…fully protect the identity of individuals.” Identifying data has been given away since 2012. Gambling companies were using the Learner Records Service to onboard new customers in 2020.

The Data Protection Act has offered pupils little meaningful protection so far, and it is about to get weaker. In 2022, it seems everyone wants to exploit our children’s education records. No one wants to be held accountable: the ICO does not appear to want to enforce what it found, and the government is not only re-writing the law to make what it got wrong lawful, or easier, but is spreading that practice to others.

In summary, “The government’s consultation recognises that respondents fear a potential lowering of standards.” We look forward to the draft Bill proving it will not mean exactly that.

Here are six ways in which we see the planned reforms undermining children’s protection in today’s law, using examples from the executive summary of the ICO’s DfE audit. We cannot find a single thing that makes rights or redress easier for children to access.

Instead, it looks like the government is writing itself, and its friends in the commercial sector, a get-out-of-jail-free card. Bad practice will get a boost, and infringements of the GDPR can be glossed over as ‘proportionate’. This is not ‘Data: a new direction’ but an amplification of today’s bad practices, heading in the wrong direction for defending children’s rights to privacy, identity, and security.

The costs of getting this wrong, by taking a short-term, fast-buck outlook for Brexit Britain, will be borne not only by individuals, but by business and the research community, through a loss of public trust in their data handling, as risk increases with more focus on transfers abroad and on treating people as the product.

Instead, as we concluded in our report last year, The words we use in data policy: putting people back in the picture (2021), “we must reconcile the focus of the UK national data strategy, with a rights-based governance framework to move forward the conversation in ways that work for the economy and research, and with the human flourishing of our future generations at its heart.”

For children themselves, the costs of taking this wrong direction may last a lifetime.


1. Lawfully, fairly and in a transparent manner

The ICO DfE audit found, “the DfE are not fulfilling the first principle of the GDPR, outlined in Article 5(1)(a), that data shall be processed lawfully, fairly and in a transparent manner.” They are not informing us, the people in the database, what they do with our personal confidential data, and especially with children’s. “The DfE are not providing sufficient privacy information to data subjects as required by Articles 12, 13 and 14 of the GDPR.”

The Data Reform proposals suggest, “To ensure that research is not prevented in situations where recontacting data subjects would constitute a disproportionate effort, the government will replicate the exemption currently found in Article 14(5)(b), but only for research purposes.” What will that mean in practice? “This change does not exempt controllers from providing all of the relevant information in Article 13(1)&(2) to data subjects at the point they collect the data.” Who will be responsible for fair processing, i.e. telling learners that their data is given to companies for “research purposes” to create heat maps for estate agents, or passed to think tanks or the press?

It might also become harder to get subject access requests met, to see transparently how your data is processed, if an organisation decides it would rather not be as open as you want, because “the government plans to proceed with changing the current threshold for refusing or charging a reasonable fee for a subject access request from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’.” Thankfully, the proposal to restore charges for every SAR seems to have been dropped.

Sadly, there seems to be no progress on the real direction of travel we should be taking, and no sign of turning GDPR Recital 63 into reality any time soon: “where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data.”

2. Accountability

The ICO DfE audit found, “the DfE cannot demonstrate accountability to the GDPR” (p3). “Information assets are not assessed with sufficient frequency to ensure that the process is effective and resulting risks are not recorded with sufficient granularity or detail on the Information Risk Log to enable meaningful control and monitoring. Not all information risks are recorded and where they are, they do not always identify actual risks or control measures.”

The Data Reform proposals suggest “that current requirements for data controllers to demonstrate how they are complying with the data protection legislation can put a disproportionate burden on some organisations.” The measures around ROPAs, DPIAs and Data Protection Officers all reduce UK compliance with the core premise of the GDPR: demonstrable accountability.

3. ROPA obligations to keep a record of processing

The audit found, “There is no clear picture of what data is held by the DfE and as a result there is no Record of Processing Activity (ROPA) in place which is a direct breach of Article 30 of the GDPR. Without a ROPA it is difficult for the DfE to fulfil their other obligations such as privacy information, retention and security. The requirement for a ROPA has been documented for over a year in audit reports and meeting minutes, however little progress has been made to address this.” (p4) and “Not all information risks are recorded and where they are, they do not always identify actual risks or control measures.” (p5)

The Data Reform proposals suggest, “Organisations will need to have personal data inventories as part of their privacy management programme which describe what and where personal data is held, why it has been collected and how sensitive it is, but they will not be required to do so in the way prescribed by the requirements set out in Article 30. For that reason, the government plans to proceed with removing the requirement for record keeping provisions.”

4. Data Protection Officers (DPO)

The audit found, “The organisational structure of the DfE means the role of the Data Protection Officer (DPO) is not meeting all the requirements of Article 37-39 of the GDPR.” (p4)

The Data Reform proposals suggest, “the government plans to proceed with removing the requirement to designate a data protection officer. Most of the tasks of a data protection officer will become the ultimate responsibility of a designated senior individual to oversee as part of the privacy management programme.” This is despite “the majority of respondents” having “disagreed with the proposal to remove the requirement to designate a data protection officer.”

In practice, according to the press announcement, this may mean that a “small business such as an independent pharmacist won’t have to recruit an independent DPO to fulfil the requirements of UK GDPR, provided they can manage risks effectively themselves, and they will not have to fill out unnecessary forms where the risk is low.” It is surprising that the DCMS seems to consider highly sensitive medical records low risk. It would also suggest that children’s settings, such as nurseries and small educational settings, are being considered for exemption, despite processing children’s disability, health or biometric data, for example.

However, the consultation response is more sweeping, proposing to remove the requirement for DPOs altogether.

This proposal will water down the obligation to understand data protection law, as described in the role of the DPO under GDPR Article 37: “The data protection officer shall be designated on the basis of professional qualities and, in particular, expert knowledge of data protection law and practices and the ability to fulfil the tasks referred to in Article 39.” (UK DPA 2018, sections 69-71.) Now the UK government proposes this might be only a “senior individual” instead. This would appear to remove the UK DPA 2018 section 70 obligations to support the DPO by “providing resources necessary to carry out those tasks and access to personal data and processing operations, and to maintain his or her expert knowledge.”

We can only assume that a new role like this would have no accountability to the GDPR if not defined as a DPO. The DPO role is a cornerstone of accountability, and among the specific tasks of a DPO omitted from the British ‘senior individual’ is a duty to ensure that the rights and freedoms of data subjects are not adversely affected by processing operations. That said, even having a DPO is not enough if a government department like the DfE chooses to ignore its recommendations.

5. Data Protection (Risk and Harm) Impact Assessment

The audit found, “Data protection impact assessments (DPIAs) are not being carried out at a stage of the project early enough to influence the outcome and in some cases prior to processing beginning altogether. Some processing which should have been subject to a DPIA is being carried out without any DPIA having been completed. The Privacy Assurance Team (PAT) are risk assessing projects they aren’t fully briefed on, resulting in high level content with no detailed risk assessment of the specifics of the processing activities, and the mitigation actions identified do not often have appropriate effects on the residual risk scores.

“The assignment of lawful basis in DPIAs is also high level and does not include a justification for the designated lawful basis for details of how it applies to each specific processing activity.” (p5) “The Commercial department do not have appropriate controls in place to protect personal data being processed on behalf of the DfE by data processors.” (p5) and “Consistent assessments to granularly inform the purpose, legality and risks of the application are not formally carried out and there is no requirement for a DPIA to be carried out for all sharing applications. As a result there is no formal assessment of applications for data protection compliance.” (p6)

The Data Reform proposals suggest (2.2.8), “the government plans to proceed with removing the requirement to undertake data protection risk assessments.” “Under the new privacy management programme, organisations will still be required to identify and manage risks, but they will be granted greater flexibility as to how to meet these requirements.” This is despite the fact that, “The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments.”

In the current risk-based approach embodied by the GDPR, carrying out a DPIA is already not mandatory for every processing operation. A DPIA is only required today where the processing is “likely to result in a high risk to the rights and freedoms of natural persons” (Article 35(1)). This change could therefore mean that high risks to the rights and freedoms of natural persons will not be identified or documented, nor open to scrutiny by others, including organisations like us at defend digital me.

6. Definitions of the lawful basis for processing

The ICO DfE audit found, “There is an over reliance on using public task as the lawful basis for sharing which is not always appropriate and supported by identified legislation. Legitimate interest has also been used as a lawful basis in some applications however there is limited understanding of the requirements of legitimate interest and how it should be applied, to assess the application and legalities of it prior to sharing taking place, to ensure the use of this lawful basis is appropriate and considers the requirements set out in Article 6(1)(f) of the GDPR.” (p6) The DfE has normalised the re-use of the school records of the 21 million people in the NPD for commercial purposes and for furthering the aims of the Hostile Environment, both incompatible with the purposes for which children’s data are processed by schools, and with what parents expect.

The Data Reform proposals suggest “Clarifying further processing for an incompatible purpose when based on a law that safeguards important public interest” (question 1.3.2), as well as “exemptions for legitimate interest balancing test” (questions 1.4.1, 1.4.2, 1.4.3, 1.4.4) … “uses of personal data for commercial purposes, for example…processing personal data for internal research and development purposes.” Our children go to school to exercise their right to education, not to become a digital product passed around an unlimited number of strangers. These purposes are incompatible, and removing this bar seems a stretch too far to be acceptable for adequacy, as in effect it scraps the second principle of data protection law: purpose limitation.

Under the GDPR, point (f) of the first subparagraph of Article 6(1), legitimate interests, shall not apply to processing carried out by public authorities in the performance of their tasks, though it remains available for processing which falls outside the tasks of a public authority. This proposed change appears to seek to go much further, and to legitimise the misuse of these lawful bases for the incompatible processing of children’s data seen to date at the DfE. Furthermore, throughout the consultation, ‘legitimate’ interests appear to be read through a lay understanding, as something like ‘justifiable’, not against the high bar defined in data protection standards.

This appears to ignore the second half of Article 6(1)(f): “Processing shall be lawful only if and to the extent that at least one of the following applies: (f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.”

We can only surmise from the overall tone and approach that safeguards on using personal data will be reduced. While there are some comments on requirements remaining for children, they are in odd places and look rather unworkable, such as on online behavioural tracking (cookies) and keeping a legitimate interests balancing test.