
Essex school reprimanded by ICO for facial recognition in canteen

In July 2024, the ICO issued a reprimand to another school that broke the law when it introduced facial recognition technology (FRT).

According to the data watchdog, Chelmer Valley High School in Chelmsford, England, first started using the technology in March 2023 to take cashless canteen payments from around 1,200 pupils. It failed to carry out a risk assessment, and it “assumed consent” rather than actively obtaining permission to process the biometric information, registering only objections where parents or carers had opted children out of the processing. The children themselves were not given the opportunity to decide whether or not they wanted their biometric data used in this way.

The school also failed to seek advice from its data protection officer (DPO) before introducing the facial recognition technology, nor did it consult parents or students. Both failures suggest that data protection is neither a whole-school approach nor embedded in the school’s thinking on safeguarding. We’ll be looking to see whether Ofsted makes any comment in future.

It is interesting to note that this ICO ruling refers only to data protection law, under which the school had been relying on assumed consent for facial recognition. The ICO rightly points out that, “Article 4(11) of the UK GDPR is clear that consent requires an affirmative action, and as such consent on an opt-out basis would not have been valid or lawful.”

The ruling makes no mention, however, of the Protection of Freedoms Act 2012, Chapter 2. This school broke not only data protection law but also the law specific to schools and biometric data. Under that Act, if either a parent or the child objects, the most protective outcome, the ‘no’, is the one upheld. Why is this not mentioned, and was this duty met?

The ICO’s recommendations say that, “while several areas of non-compliance are apparent, the Commissioner has decided to focus on the controller’s actions prior to the processing.” We therefore also wonder what else is still ‘apparent’ and is being left out or not enforced, in particular with regard to the supplier as the data processor.

The questions without answers include:

The distributor, CRB Cunninghams, also states that, “GDPR gives the right to individuals to access their personal data and supplementary information held about them. Currently this information is not held in a single report. CRB Cunninghams intend to make a tool available which allows all data to be supplied in a single report to help satisfy these requests should they arise. This tool will be issued as a scheduled upgrade but can be made available on request.”

Since there is always a requirement to offer a non-biometric alternative, it can never be *necessary* to use biometrics at all. If data processing cannot meet the test of necessity, it is not lawful. We therefore wonder why the UK’s data protection authority fails to live up to the standards for school children set by its counterparts in Sweden and France, where schools were stopped from using facial recognition.

Biometric technology does not belong in schools. We don’t support giving recommendations on how to use it ‘better’.


For background information on biometrics in schools in the UK see: