News

UK Data Protection Law consultation. Speak up.

Did you know that the government has plans to weaken your rights in UK Data Protection Law, in a consultation closing in one week on 19th November? You can take part in the consultation until 11.45pm on Friday, and you don’t need to answer every question. Just tell the government what you think.

With one hand the same government department claims to want to better protect children and young people online through the Online Safety Bill, yet with the other it is writing proposals that will harm the protection of children's personal data.

This consultation mentions children only in passing: seven times in 146 pages, in respect of the Age Appropriate Design Code, with regard to child sexual exploitation, on keeping the ICO complaints process open for children, and four times with regard to scrapping the legitimate interests balancing test. Children are not mentioned in the impact assessment at all.

In brief, this is about reducing everyone's rights over how data about you is used as you go about your business: your data from going to school, applying for a new job, everyday tasks in everyday life. It will push even more of the risks of data processing onto us, the people, and away from companies, in the name of their convenience. But who benefits? Businesses and public services alike need trusted processes. They want people to choose to use them, and they need clear protection from the reputational risk posed by bad actors in an open playing field.

Do you want to object to an algorithm making decisions about your exam grades, or excluding you from a recruitment process for university or a job? You'd no longer be able to object if TIGRR, the Taskforce on Innovation, Growth and Regulatory Reform, gets its way. You might not even be told that a computer made the decision at all. And what about your private data being given away to unknown people, for purposes that the government will redefine as 'research'? More predictive algorithms without adequate oversight in children's social care with dangerous blind spots, or predictive reoffending rates in youth justice? Bring. It. On. is what these plans would mean in practice. Remember the access to 28 million national learner records by gambling companies in 2020? All these reductions in data governance, like keeping track of what data you hold, make it easier for bad things like that to slip past any oversight. Do you want to find out if the Met Police has put your name on its Gangs Matrix? Under the proposals, they could charge you for asking. Do you want an easy go-to person to find out if and how your personal details have been misused in marketing, or lost in a data breach? You'll find it harder after the government scraps the requirement to have a Data Protection Officer, which the UK only just introduced into law in 2018. The Biometrics Commissioner himself has rejected the plans to move his office under a weaker new ICO structure.

According to Michael Sanders, Chief Executive of What Works for Children's Social Care, writing in September 2020 about machine learning in children's social care, now is the time to stop and think, not to 'move fast and break things': "With the global coronavirus pandemic, everything has been changed, all our data scrambled to the point of uselessness in any case."

These proposals go against all of the concerns people have today about having too little control over the stories of our lives. Discrimination. Decisions made about us, without us. How companies and organisations use our personal data for too much marketing, excessive policing, unfair algorithms. Respond to the government's plans and call on them *not* to reduce the protections we all need every day, going about our daily business. It is also important to remember that the UK data protection regime is *already* weaker than the GDPR intended. We did not adopt the derogations that would have improved rights or added protections in 2018, such as representation under Article 80(2), but instead created the controversial and contested immigration exemption.

The consultation closes on November 19th, 2021. We’ve set out below some selected suggestions for thinking about how the key changes will affect children.

Our full child rights impact assessment is available on request by email to Info @ defenddigitalme dot org


Consultation direct link: https://www.gov.uk/government/consultations/data-a-new-direction


Download our submission in full (19/11/2021): Data a new direction consultation DDM response


Selected key points we consider the most important to address for children.

1.2 Not to weaken research definitions to give commercial re-use of data (for example, research to turn children's data into a data product) parity with scientific research purposes and public interest research.

1.2.10 Not to weaken people's right to know what is being done with their personal data by applying the Article 14(5)(b) exemption to Article 13. See the video. Do you think the government should take away your right to be told that they keep your data, who they give it to, and why?

1.3 Not to broaden the legitimate interests definition to give further processing of data for new purposes parity with public interest research. Q1.3.2 also appears to suggest redefining what could be "incompatible" purposes as acceptable, which would mean removing the purpose limitation on data already collected. Perhaps they want not to be held in breach of the law for deciding your school records can be given away to commercial companies, as the DfE does today?

1.4.4 Agree that the legitimate interests balancing test should be maintained for children's data, irrespective of whether the data is being processed for one of the listed activities. But the legitimate interests (LI) balancing test should be maintained for processing all personal data, regardless of age. This is not only because it is a layer of protection for good data processing: maintaining it only for children's data would likely mean a massive push towards age verification for all data subjects, because how else would a data user know to whom the LI test should apply? Does the consultation propose defining "children's data"? At the point of data collection, or only for as long as the person is a child? The Learner Records Service holds the records of over 28 million named individuals, a number that grows each year while the people in the records age. The same is true of several other DfE and public administrative datasets. Only around 8 million of those individuals at any one time will be aged under 18, yet almost all were children when the data was collected.

1.5.1 The consultation conflates the data protection principle of 'fairness', which is about telling people what you will do with their personal data at the point of collection, with a host of other ideas about being fair, i.e. not being discriminatory. This is not the intention of the data protection principle of 'fairness'. AI is no special case here.

1.5.5 No change is required to existing law unless you are trying to make what is currently unlawful, lawful. That requires more discussion than this consultation permits.

1.5.10-12 No change is required to existing law, which already permits processing for these bias monitoring purposes. The risk is far greater that this change would be used to excessively process sensitive data and protected characteristics. There is also a conflation of "outcome fairness" with equality. (See The Legal Education Foundation explainer: https://thelegaleducationfoundation.org/articles/leading-barristers-warn-that-government-proposals-to-reform-uk-data-protection-law-may-lead-to-unintentional-breaches-of-the-equality-act-2010)

1.5.17 Disagree strongly with the Taskforce on Innovation, Growth and Regulatory Reform's recommendation that Article 22 of the UK GDPR should be removed (see exams 2020). In particular, in relation to sensitive personal data, this can be where safeguards are most needed to maintain public trust, not where they most need to be removed.

1.5.19 Revealing the purposes and training data behind algorithms, as well as examining their impacts, should already be done today as part of risk assessment and fair processing. No change in law is needed to permit it.

1.5.5 Disagree that the Government should permit organisations to use personal data more freely for the purpose of training and testing AI. Any responsible work is already permitted under existing lawful bases. This could only result in more exploitation of children's personal data to be turned into products.

1.6 Data minimisation and anonymisation should be encouraged, but not by rewording the definition of what counts as minimised. Government could start with its own administration of public administrative datasets, addressing excessive data retention through weeding and destruction, and through aggregation and anonymisation.

2.2.5 Disagree strongly with removing the existing requirements to designate a data protection officer (DPO). Children need a clear go-to person for concerns. This can be a shared service today and does not put disproportionate demands on any business that wants to handle personal data well.

2.2.8 Disagree with the proposal to remove the requirement for organisations to undertake a data protection impact assessment (DPIA). If organisations were able to adopt different approaches to identifying risk, this would encourage a lowest common denominator based on worst practice. The DPIA already offers flexibility in how it is done, according to organisational needs and data processes. Removing it would mean bad actors could hide bad practice more easily, and would disproportionately disadvantage good actors left without common incentives to 'do the right thing'. In fact, the DPIA should, where appropriate, include a child rights impact assessment, to take into account situations of acute power imbalance, for example.

2.2.9 Disagree strongly with proposals to remove the requirement for prior consultation with the ICO, so that it would no longer be mandatory and organisations would face no direct penalties for failing to consult the ICO in advance of carrying out the processing. Disagree with proposals to remove record keeping requirements under Article 30. (See the ICO audit of the Department for Education for why both need to be kept, and perhaps the accountability they are hoping to avoid in future: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/10/statement-on-the-outcome-of-the-ico-s-compulsory-audit-of-the-department-for-education/)

2.2.11 Since most organisations routinely process some personal data, the requirement to keep a record of what data you process and why supports the obligation to provide this information under Articles 13/14. Rather than duplication (as per paras 176/77), you cannot do the latter without having done the former, and there is no need to duplicate the same processing information. The record of processing activities (ROPA) is necessary and should not be scrapped.

2.2.18 Is removing the obligation for public authorities to have a designated data protection officer intended to remove any duty to be accountable for the authority's data processing? If not, any change here would be on paper only and create confusion, unless something substantial is intended to change. All public authorities that process personal data need someone accountable for it: to carry out risk assessment, to be a 'go to' internally and externally for questions, and to avoid the internal confusion and potential duplication that would arise if each department had to perform the tasks rather than a single designated person. Appointing a data protection officer allows for great flexibility according to the types of processing done and need not be onerous. As is already widely done today in state schools, this role can be provided as a shared service and need not even be in-house. There is no need to change the law.

2.3.4 Disagree that there is a case for re-introducing a fee for processing subject access requests; any charges would disproportionately affect children. (See the Met Police Gangs Matrix, where subject access played a vital role in transparency for children and young people: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2018/11/ico-finds-metropolitan-police-service-s-gangs-matrix-breached-data-protection-laws/)

2.4.9 Nothing should weaken today's protections for children from advertising. They need strengthening, and all targeted advertising based on profiling children should end. Weakening this would inevitably mean families could be bombarded with even more ads, via school, from all the companies that engage with children through education, and would be detrimental.

3.2.4 Today’s mechanisms to protect transfers abroad are inadequate because they are not enforced. New enforcement of existing law would be welcome.

3.3.3 The proposals for reverse transfers must not create loopholes that enable the data washing of personal data collected unlawfully abroad, which would then receive a special exemption status.

3.5 Repetitive use of derogations is currently restricted by the UK GDPR recitals and by European Union regulatory guidance, because special cases are special for a reason. Making them routine removes that recognition and should not be done. This is simply saying that we will routinely weaken the protections placed on this by the EU, and it is therefore a risk to adequacy.

4.3 Nothing should weaken protections for children in their personal data used across public services. Nothing in Data Protection (DP) law prevented data sharing with a lawful basis in the pandemic. DP law has explicit exemptions for such situations, and we should not normalise weaker governance for convenience in times that fall outside a pandemic emergency. Not all private bodies are equal, and not every processor engaged in a public task should be given the same legal basis as the controller. Nothing today prevents processing by third-party or private organisations that process on behalf of the public sector. Why a change in law is needed is not set out. What it does appear to seek to do is make what is unlawful today, lawful (see the case study of DeepMind and the Royal Free: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/07/royal-free-google-deepmind-trial-failed-to-comply-with-data-protection-law/).

4.4 Good, consistent definitions will matter, but agree with introducing compulsory transparency reporting on the use of algorithms in decision-making by public authorities, government departments and government contractors using public data. In addition, clear routes for redress are required to contest automated decision-making and request a human-made decision. A ban is also needed on automated decision-making based on children's biometrics, and on AI in emotional and affective technology.

4.4.8 The police already have Part 3 of the UK DPA 2018 exclusively for law enforcement. These changes must not weaken protections for people's biometrics, a growing area of concern and a growing risk to public trust in the police. The consultation appears to suggest an increase in police powers at a time when police adoption of biometrics is a free-for-all, and when the adoption of emerging technologies means data from immigration, biometrics and national police databases are being merged, without any new protections. Para 302 is mistaken to pursue an ambition to align more closely the commercial, law enforcement and national security processing frameworks, because over ten years (see the list below) the public has consistently shown a stronger level of trust (albeit unevenly distributed across the population) in processing for purposes in the public interest by state bodies than in processing by commercial companies. Watering down "who is the police" is detrimental to public trust and further blurs the line between public and private actors. See also our submission to the Justice and Home Affairs Committee Inquiry, New technologies and the application of the law: https://defenddigitalme.org/wp-content/uploads/2021/11/Submission-to-the-Justice-and-Home-Affairs-Committee-Inquiry-New-technologies-and-the-application-of-the-law-%E2%80%94-defend-digital-me.pdf

5.2 Any changes to the governance of the ICO should strengthen its powers and independence, not reduce them. The push for the ICO to take on not only the existing duty to 'take account of economic growth' under the Deregulation Act 2015, but yet another new duty 'to have regard to competition', is all about business, not data protection for people and respect for our human rights. Greater enforcement is needed today, not ever more 'business-friendly' approaches, if business is to be seen as trustworthy and the reputation of good actors is to be protected by removing the bad.

5.6 On the proposal that the complainant must attempt to resolve their complaint directly with the relevant data controller before lodging a complaint with the ICO: this is not needed and is already routine practice. Where it fails is where people do not know about the processing that affects them. Do you know who has your confidential personal school records, given away by the Department for Education for commercial re-use since 2012? In addition, group representation akin to Article 80(2) of the GDPR should be adopted by the UK.

5.8 The role of the Biometrics Commissioner should not be moved under the ICO, and any changes should reflect his own response: https://www.gov.uk/government/publications/data-a-new-direction-commissioners-response. Data protection is about ensuring the free flow of data in a standardised governance framework; the oversight of emerging technology is about more than just data. People also have rights to privacy, as well as data protection. Children have rights embodied in the UNCRC that speak to human dignity, and the ability to fully develop and flourish into adulthood without undue interference. The government seems to have forgotten all this in the consultation. When Ayrshire schools adopted facial recognition in September, the public outcry, criticism by Scotland's First Minister in parliament and by members of the House of Lords in a debate, and widespread criticism in the media showed how sensitive the subject is. It is not adequate to treat this as a matter of data processing and protection alone, and the oversight needed on this subject should not sit solely within the ICO's enforcement role. (See https://defenddigitalme.org/2021/11/04/biometrics-in-schools/)


 

Public engagement demands better protections to uphold public trust

A 2010 study with young people, conducted by The Royal Academy of Engineering and supported by three Research Councils and Wellcome, discussed attitudes towards privacy and the use of medical records, and concluded that these questions and concerns must be addressed by policy makers, regulators, developers and engineers before progressing with the design and implementation of record keeping systems and the linking of any databases.

In 2014, the House of Commons Science and Technology Committee, in its report Responsible Use of Data, said the Government has a clear responsibility to explain to the public how personal data is being used.

The same Committee's Big Data Dilemma report (2015-16, p9) concluded: "data (some collected many years before and no longer with a clear consent trail) […] is unsatisfactory left unaddressed by Government and without a clear public-policy position."

Or see, from 2014, The Royal Statistical Society and Ipsos Mori's work on the data trust deficit, with lessons for policymakers.

2019: DotEveryone's work on Public Attitudes

2020: The ICO Annual Track survey results

There is also a growing body of literature demonstrating the implications of being a 'data driven' society for the datafied child, as described by Deborah Lupton and Ben Williamson in their research in 2017.

This year, our own work with young people, published in our report on data metaphors, "the words we use in data policy", found that young people want institutions to stop treating data about them as a commodity and to start respecting data as extracts from the stories of their lives.

The UK government and policy makers are simply ignoring the inconvenient truth that legislation and governance frameworks that exist today, such as the UN General Comment No. 25 on Children in the Digital Environment, demand that people know what is done with data about them. These frameworks must be applied to address children's right to be heard, and to offer children strong privacy as well as data protection rights, and ways to exercise them.


Last updated: November 19, to add our full submission.