
Baseline beaten back to 2021. Time for change in the Accountability system

We are delighted that a delay to the Reception Baseline Assessment (RBA) has been announced today. The new national test was to be taken by every 4-year-old starting school this September, in addition to the local assessments that teachers already make routinely as they get to know a new cohort of children. Teachers were *not* going to be given the test score results from this new RBA; instead, the results were to be sent to the National Pupil Database.

As Schools Week reports today, this comes after education secretary Gavin Williamson reiterated his support for the new test in the House of Commons only four days ago.

Asked by shadow Schools Minister Margaret Greenwood whether he would “do the right thing and abandon their plans to bring in reception baseline assessments”, Williamson replied: “No, we will not.”

The test is due eventually to replace Key Stage 1 SATs, and is claimed to provide a baseline against which pupil progress scores will be measured at Key Stage 2.

Congratulations to everyone involved in the More Than A Score coalition, the unions, teachers and parents who called for it to be scrapped. We are hoping it is kicked into the long grass for good. Here is why.

The RBA fails to meet necessary Data Protection standards

The processing of each data item must be necessary. There are ways to perform the data processing tasks reasonably in a less intrusive manner, such as through sampling, or by pseudonymising data before national collection. There is certainly no necessity, either for the child or for determining any progress measure, for these data to be sent to the national database on a named basis, and the DfE should be required to justify why a less invasive methodology was not considered.
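
To make that alternative concrete, the sketch below shows one way "pseudonymising before national collection" could work in practice. It is illustrative only: the field names, the keyed-hash approach and the locally held secret are our assumptions, not the DfE's or the NFER's actual design.

```python
# Illustrative sketch of pseudonymisation before national collection.
# Field names and keying scheme are assumptions for illustration only,
# not the DfE's or NFER's actual design.
import hmac
import hashlib

# A secret key held locally (e.g. by the school or Local Authority) and never
# shared with the national collection, so the centre cannot reverse the pseudonym.
LOCAL_SECRET_KEY = b"replace-with-a-locally-held-secret"

def pseudonymise(pupil_identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(LOCAL_SECRET_KEY, pupil_identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def prepare_for_national_collection(record: dict) -> dict:
    """Strip direct identifiers; pass on only a pseudonym and the score."""
    return {
        "pupil_pseudonym": pseudonymise(record["unique_pupil_number"]),
        "assessment_score": record["assessment_score"],
    }

# Example: the name and UPN stay with the school; only the pseudonym and
# the score would leave it.
local_record = {"name": "Child A", "unique_pupil_number": "A123456789012",
                "assessment_score": 31}
print(prepare_for_national_collection(local_record))
```

Under a design of this kind, progress could still be tracked over time by re-deriving the same pseudonym locally at the next assessment point, without a named record ever being created in the national database.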

The Department has to date been unable to explain *how* these data will be used as a progress measure. It says they will be, but cannot explain how. We have therefore challenged the processing as disproportionate to its aim, since it cannot be justified even as a means of fulfilling the publicly stated purpose.

We have raised our concerns about the test with the DfE, the NFER and the Information Commissioner, including the flawed assessment of risk and design that was omitted from the Data Protection Impact Assessment (DPIA):

  • This test data will create the very first record in the National Pupil Database for the thousands of children who did not attend Early Years settings and do not already have a record. (This is not communicated to parents.)
  • The National Pupil Database, to which the children’s data will be sent, is misused for immigration enforcement monthly by the Home Office.
  • There is *nothing* in place to stop the children’s personal details being given away for commercial re-use, as the rest of our children’s records are: not anonymised, but as raw data from the NPD. The National Pupil Database is the dataset from which children’s data are currently given away in over three hundred releases of identifying, pupil-level data a year, mainly for commercial re-use, as well as to charities, think tanks, members of the press and researchers.
  • Each child’s personal data will be stored in the National Pupil Database, yet the DPIA makes no mention of what that means in terms of linkage, secondary re-use and safeguards against scope creep in future uses of the data, or how indefinite retention will comply with GDPR requirements such as pseudonymisation, which does not happen today. The data will be kept forever, and are even linked at age 18, at the Department for Education and without students’ knowledge, to their equality-monitoring sexual orientation and religion *at named level* from UCAS applications.
  • There is no public information about the automated decision-making involved in the design of the Reception Baseline Assessment test; nor was this addressed in the NFER DPIA risk assessment.

On Monday, in answer to a written question in Parliament, the Schools Minister declined to publish the basis on which he determined that the Information Commissioner’s Office was “content with the proposals for the reception baseline assessment’s use of data”, as the Department had written to third parties.

The Minister also confirmed again, in answer to written question 60644, that the assessment results will not be shared with the school the pupil attends. We have asked repeatedly how, in that case, the Department expects schools to meet their data protection obligations to families, so that they, on behalf of the child, are able to exercise:

  1. The right to be informed
  2. The right of access
  3. The right to rectification
  4. The right to restrict processing
  5. The right to object
  6. Rights in relation to automated decision making and profiling

Automated decision making and why it is a problem

The test is adaptive, which means not every child is *offered* the same questions, or the chance to reach the same top score of 45; some will only be offered questions that could earn them 26 points. The test includes pre-designed “routing”, so the number of marks presented will vary from pupil to pupil.

“These routing rules will be automatically applied by the online recording system.” (See NFER guidance page 5).

The logic in the questions cannot be seen, controlled or influenced by the teacher administering the RBA test, so there is no human intervention during the test over whether, and how, a child can be offered further questions to reach the full range of points on offer: the system limits by design how many questions are available. This means the school staff member cannot explain the logic of why Child A was asked x questions with 45 marks on offer while Child B was not, beyond a very high-level grasp of the principles behind adaptive testing.
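
To illustrate the concern, here is a minimal toy sketch of pre-designed routing. It is not the NFER’s actual routing logic, which has not been published; the route names, thresholds and mark totals are assumptions chosen only to show how a couple of early answers can silently cap the marks on offer, with no teacher intervention.

```python
# Illustrative toy model of adaptive "routing" (NOT the NFER's unpublished rules).
# It shows how early answers can automatically cap the maximum score a child can
# reach, with no teacher control over, or visibility of, the logic.

# Hypothetical routes: each offers a different set of questions and therefore a
# different maximum number of marks on offer.
ROUTES = {
    "full_route": {"questions_offered": 30, "max_marks": 45},
    "restricted_route": {"questions_offered": 18, "max_marks": 26},
}

def route_pupil(early_answers):
    """Automatically choose a route from the first few answers (no human input)."""
    # Hypothetical rule: fewer than two correct early answers means only the
    # easier, lower-mark questions are offered for the rest of the test.
    if sum(early_answers) < 2:
        return "restricted_route"
    return "full_route"

# Two children with different early answers are given different ceilings:
for child, early_answers in [("Child A", [True, True, False]),
                             ("Child B", [False, True, False])]:
    route = ROUTES[route_pupil(early_answers)]
    print(f"{child}: offered {route['questions_offered']} questions, "
          f"maximum {route['max_marks']} marks available")
```

On a design like this, Child B can never reach the same score as Child A, however well they answer later questions, and the staff member administering the test has no way to see or explain why.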

The technical design pigeon-holes children by its very nature: on the basis of a couple of answers it starts to limit the questions to easier ones, using a profile of what the design assumes ‘a child like you’ *could* achieve, and restricting which questions are asked. This is *deeply flawed* as a comparative measure for a ‘baseline’ test. This method of testing fails to recognise that learning, especially in this age group, is not linear.

The outcomes will undoubtedly have an unconscious influence on staff, which could have a significant effect on their interventions and treatment of the child, in particular if a child was considered more or less ‘bright’ than the test suggested compared with their peers.

This kind of profiling by grouping, suggesting a child fits category X, needs particular safeguards: it should be explainable and challengeable, and should not routinely be applied to children.

No meaningful information has been provided by the NFER or the DfE in the Baseline pilots or trials since 2018. Organisations must inform individuals if they are using their data for solely automated decision-making with legal or similarly significant effects, whether the data were received directly from the individuals concerned or from another source.

Schools have been, and continue to be, unable to provide the meaningful information about the logic involved and the likely consequences for individuals that the GDPR requires: “the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and envisaged consequences of such processing for the data subject.”

Not only has the processing of the RBA failed to meet the necessary standards of the GDPR and the DPA 2018 in the pilot and trials, it has continued to do so in its current design and in the plans for its practical implementation.

Right to Object

At the very least, a child’s or parent’s right to object must be offered and respected, with urgency, in this very broken design model for mass data surveillance of 700,000 four-year-olds; to date, the Department has refused to do so. We will continue to press for this should the test go ahead at any time.

In February, the government announced it was continuing with the statutory introduction this year, despite a study finding that half of teachers thought it had a negative impact on children. Thousands of parents and school staff supported the More Than A Score campaign to oppose its introduction.

The government has not yet passed the new statutory regulations it needs to begin the test this autumn, and we have called for the right to object to be upheld in any drafting. We hope now that this may become unnecessary, if the RBA is gone for good.

Looking ahead

We will continue to challenge children’s profiling in the national Accountability System. Now is the time for it to change.

Our interest in this area is particularly acute because SATs scores from primary children, designed as *system*-level school accountability measures, are increasingly being used across Local Authorities and the public sector as *individual* indicators of achievement, linked to poor outcomes and to poor parenting, i.e. as proxies for risk. Local Authorities are feeding SATs results into linked multi-source datasets, including DWP, health and police data, which are then used to predict, for example, the risk of a child being taken into children’s social care.

This pressure on Local Authorities to do so is only growing under the COVID-19 crisis, with more children thrust into poverty and stress factors at home. Commercial companies are taking the chance to pitch their (independently unproven) Artificial Intelligence and predictive data analytics to the public sector, despite concrete academic research from the Rees Centre, the Turing Institute and What Works saying that such software is potentially “dangerous” and should not be used, not only on ethical grounds but because its efficacy is unproven.

It’s nonsense to use children’s statutory test scores for these purposes, and potentially unlawful.

It is also extremely questionable whether KS2 test scores fairly reflect ability, and we are concerned that they are used even as reference measures in GCSE scoring.

The work of the NAHT and others over the last five years suggests an alternative way forward for measuring accountability. The FED also has an up-to-date and similarly positive approach. Now may be the right time, like no other, to reshape the Accountability System and, with it, address staff workload, since SATs have been scrapped this year and Progress 8 will be affected; quite aside from the fact that assessment of achievement as it was pre-COVID-19 will be inadequate to take account of the effects of changed modes of education on pupils for years to come.


References

1. Providing educational accountability for Local Authorities based upon sampling pupils within schools: moving away from simplistic school league tables (2020) Goldstein, H., Leckie, G. and Prior, L. https://arxiv.org/abs/2002.09897

2. Should we adjust for pupil background in school value-added models? A study of Progress 8 and school accountability in England (2018) Leckie, G. and Goldstein, H. https://arxiv.org/abs/1811.09240

3. NEU 2019 Annual Conference
Source link: https://neu.org.uk/press-releases/accountability-and-assessment
“The resolution demonstrates the conviction of those who work in English education that our system of assessment and accountability is not fit for purpose.”
“At every stage, from reception class to GCSE, teaching to the demands of the test, and of Ofsted, dominates schools, imposing a culture of fear, overwork and compliance.”
“Today’s resolution points to growing criticism of the way accountability and assessment work in England. From the OECD to the CBI, from educational researchers to trade unions, the educational costs of the system are increasingly recognised. The case for change is overwhelming. The Union will work with a wide range of allies to bring change about.”

4. The NAHT (National Association of Head Teachers)
“Punitive accountability system to blame for teachers leaving the profession, says NAHT, responding to NFER teacher workforce report.” https://www.naht.org.uk/news-and-opinion/press-room/punitive-accountability-system-to-blame-for-teachers-leaving-profession-says-naht-responding-to-nfer-teacher-workforce-report/
2018: “The accountability system provides little benefit to the pupils, parents and staff at the vast majority of schools in this country that are not failing.”
2019: The principles of effective school-to-school peer review, NAHT, September 2019: http://bit.ly/2qh6t0e

5. UCL Institute of Education (February 2019) Taking back control of school accountability https://ioelondonblog.wordpress.com/2019/02/28/taking-back-control-of-school-accountability/
“The general view was that the teaching profession will need to step forward and take ownership of accountability, to push for these kind of changes.  A reconfigured and revitalised network of Teaching Schools was also mooted, as was a system of peer review (which the NAHT is currently working on taking to scale). A currently fashionable notion is that teachers should take back control of their profession by taking hold of the evidence. But, as one of our panellists asserted, this is arguably a naïve, if not flawed position. We live in a democracy, not a technocracy, and education is highly political. We need a confident, proud teaching profession that is closely engaged with the often ideological and value-laden political debate, to make the case for more intelligent and productive accountability measures, to policy makers and public alike, and to make local systems like peer review stand scrutiny.”