
#HandsOffOurData Protect families' data rights

Remember 4 years ago, when a “serious breach” of law meant the national database of 28 million learners’ details was used by gambling companies to onboard new customers?

If you want to stand up for your family’s and others’ safe digital life, can we ask you to write to your own MP and/or a member of the House of Lords with an interest in the debate so far? Ask them to recommend that the new Data Protection and Digital Information Bill be withdrawn and that government should leave data protection law well alone. Support the call for an edTech Code of Practice.

Read in full why, with a KC’s legal opinion, here on our blog. The government is rewriting UK data protection law again, and it threatens to:

1. Snoop on all our bank accounts and risk harm at scale through an algorithm-led welfare benefit fraud detection system. A similar system in the Netherlands resulted in harm to over 20,000 parents who were wrongly accused.
2. Seriously weaken data protection rights around automated (computer-led) decisions in favour of business and government, making recourse and redress harder (for reasons similar to those the sub-postmasters faced in the Post Office Horizon scandal), by weakening rules on the accountability of data controllers across the life cycle of data and on obligations for risk assessment and dedicated Data Protection Officers.
3. Make it harder to stand up to corporations and to find out about computer-made decisions (weakening Subject Access Rights, changing what counts as ‘personal data’ in scope of protection at all, what counts as ‘research’, and when companies can claim to ‘assume’ consent for future processing). The definitions of personal data, research, and legitimate interests should all stay unchanged, as in current law. The term ‘scientific research purposes’ would now be defined by clause 2 of the Bill to mean ‘any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity’. This narrows when people can exercise their rights to see a copy of their data, ask for corrections, and object to re-uses, and it can result in reduced data security when data is kept indefinitely in fully identifiable formats rather than anonymised, as it should be now.
4. Turn us all into walking AI-training datasets, allowing the stories of our lives held in public-sector digital records to be re-used widely for commercial purposes and further marketing without our consent (broad ‘legitimate interests’ will no longer need to offer you a Right to Object or a risk assessment, and the risk-assessment ‘balancing test’ is removed when ‘legitimate interests’ are claimed for loosely defined “safeguarding” uses of data).
5. Allow teenagers to be bombarded with unlimited political advertising from age 14 (routine profiling is banned today, and is something the Children’s Commissioner has in the past called ‘irresponsible’).
6. In principle we support the amendment 146 proposal, which would require the Information Commissioner to create a Code of Practice for EdTech. Action and enforcement are needed in many areas of practice today, including where apps and platforms determine how data is re-used in research and product development without consent, or without offering any ability to opt out. While we would welcome this Code on EdTech, our preference is that the regulator create a code of practice for education data covering all pupil data processing, including by the Department for Education and the public sector.

Therefore, our summary ask is for the government to drop the Bill, as it makes far wider damaging changes to the current UK data protection regime and on balance will harm children’s and students’ data rights as a whole, as Stephen Cragg KC found in his Legal Opinion.

All these parts of the Bill need to be removed or, at the very worst, entirely rewritten via amendments.

In particular, Defend Digital Me believes the changes will mean that children will find it much harder to exercise their own rights to manage their personal data once they become adults, and for life, for example where consent was once given in their past by a legal guardian. This might include ever more sensitive data collected in everyday activities, such as genomic data, DNA, or data about disabilities, or increasingly sensitive bodily data collected by emerging technologies such as gaming haptics and biometrics, like gait, fingerprints, and facial recognition data collected in schools and shops, and by voice-assisted tools.

This comes hot on the heels of Schools Week reporting that DfE Ministers are looking at children’s national pupil data records (the National Pupil Database holds hundreds of items of personal data against the names of around 28 million people) and ‘what it’s worth’.

In completely the wrong law for it, the DWP is trying to shoehorn new ‘anti-fraud powers’ for itself into a bill that is supposed to be about protecting people’s data rights. At the last minute before Christmas, and before it could be scrutinised by MPs, the government added new powers for the DWP to ask any UK bank to look for ‘signals’ of fraud in the accounts of people the government wants checked. It could massively scale up the problems created in 2020, when the DWP was wrongly instructing employers to deduct money directly from salaries.

Many MPs themselves object to how the government sneaked this part in late, after most debate in the Commons was already over, before the Bill went to the Lords. Chris Bryant suggested it be sent back for redoing, saying “this is no way to scrutinise a Bill, particularly one that gives the Government sweeping powers and limits the rights of our fellow citizens, the public.”

Everyone wants misuses of public money to be dealt with, and the government already has strong powers to check the bank statements of suspects. “But this is a major expansion of government power that takes away our financial privacy like never before and does away with the presumption of innocence – the democratic principle that you shouldn’t be spied on unless police suspect you of wrongdoing. People who are disabled, sick, carers or looking for work or who make innocent mistakes should not be treated like criminals by default. None of us should.” You can also take action to object through the Big Brother Watch petition here.

In the Netherlands, when the full extent of harms from the similar “Toeslagenaffaire” programme emerged, they ranged from bankruptcy to trauma for more than 20,000 parents wrongly accused of fraud.

Defend Digital Me is still calling for the Data Protection and Digital Information Bill to be withdrawn, as we first did in March 2023 as part of a coalition with 25 other civil society organisations.

If you too want to stand up for your family’s and others’ safe digital life, write very briefly to your MP and/or a member of the House of Lords. Some of the MPs and Lords most interested are those who spoke in the debates on 29th November and 20th March 2024. Ask them to recommend that the new Data Protection and Digital Information Bill be withdrawn and to leave data protection law well alone. You can write to any of the interested peers, or to those with an interest in your area. When it comes to MPs, it is best to write only to your own MP and to explain in your own words why it matters to you.

It will next be debated in the House of Lords from 15th April 2024, as Committee Stage continues.