News / Press releases

Call on the education sector to stop posting children’s faces online

On Data Protection Day 2024 we are calling on the education sector to stop posting children’s faces online and to better protect children’s biometric data and privacy.

Educational settings’ websites and social media pages often contain a large number of children’s fully recognisable faces. Nurseries, schools and colleges in both the state and private sectors need to take action to stop exposing children to a high risk of image theft. We are writing to educational settings on Data Protection Day 2024 to recommend they change this practice in the best interests of the children in their care today. The risks and harms posed by excessive digital exposure can last a lifetime.

We know that educational settings routinely ask parents’ permission to post pupil photos online, but the risks of this digital exposure outweigh any advantages to the child from appearing in school marketing materials. We call on schools to stop using them.

We call on the government to halt the planned changes to UK data protection law currently going through Parliament, which reduce protection in the digital environment for everyone, including children.

We recommend that the education sector:

Stop posting children’s identifiable or full faces or features on your social media sites and website. Recommend to students and families that they do the same.

Better educate and train staff. Encourage staff to understand how adults’ decisions today affect children’s digital identity in future. Once distributed to companies or made available in the public domain, biometric data (face, voice, eye and gait data) may not only put children at risk of malicious misuse but also disempower them from controlling how those data may be used in their present and adult futures. Loss of biometric ID is for life (e.g. fingerprints and faces cannot be changed).

Stop using children’s biometric data in educational settings. Facial detection and facial recognition readers are not necessary for access to settings. Cashless payment systems and library book borrowing do not meet the necessity and proportionality tests required to justify using high-risk biometric data for trivial applications.

Acknowledge and act on the principle that children’s futures must be championed by educational settings over and above any commercial interest in using their images in marketing materials and on websites.

Call on the government to strengthen UK data protection law, which is currently under threat of being weakened.


Background: What are the problems and risks?

1. Children’s images posted online mean loss of control of those images forever

Web scraping uses automation to collect data from websites. Images can be collected at scale by third parties that you may not be able to see, identify or stop. Data protection law applies to personal information people post online, and organisations must have a lawful reason for collecting and using people’s data even when it is publicly available, but there is little effective enforcement action. Data scraping at scale is very easy to do and very hard to prevent.

2. Social media companies control and may change their terms and conditions at any time, deciding to whom online images are released, even from behind users’ log-in or ‘privacy’ settings

In January 2019, IBM released a collection of nearly a million photos scraped from the image hosting service Flickr to academic and corporate research groups to build facial recognition products. In 2023, IBM took over a Fujitsu-built ‘matcher’ platform in a UK Home Office contract to develop facial recognition tools. The Home Office is in the process of combining IDENT1 and IABS into a “mega-database.”

3. Emerging technology can manipulate images at speed and scale

Once collected, children’s images may be edited onto naked bodies or used to build AI training databases beyond your control. This leaves children’s identities vulnerable to digital abuse, even though the body depicted is not their own.

4. Online marketplaces for sharing AI models use real photos of real people

The training datasets used to instruct AI are built with real photos and may be used to enable the creation of nonconsensual sexual images and films of real people, and at least one company has recently introduced a feature that allows users to post “bounties” for deepfakes of real people.

5. Faces can be scraped to build and train facial recognition databases

Case study: Clearview AI is a company that collected 30 billion photos from Facebook and other social media users without their permission, creating a commercially available law enforcement identity database that critics call a “perpetual police line-up,” containing pictures of people who haven’t done anything wrong. Children have no idea how such a database might be used in the future, and this exposure could limit their own future choice of when and how to use their biometric data as a secure ID. At the same time as more facial images than ever may be open to misuse by people other than the person whose face it is, the UK government is trying to expand the adoption of facial recognition in public places, and the biometric access market continues to grow.

6. Biometric readers in schools have been ruled in breach of data protection law in France and Sweden, and the UK ICO found facial recognition use in Scottish schools ‘likely unlawful’ in 2023

The use of biometrics in school is an excessive, unnecessary and disproportionate interference with children’s privacy rights, which are enshrined in Article 16 of the UN Convention on the Rights of the Child and Article 8 of the European Convention on Human Rights. The standards for collecting and processing children’s biometric data should be even higher than for adults, as “they may be less aware of the risks involved” (Recital 38, UK GDPR). On 31st January 2023, the Information Commissioner’s Office (ICO) issued a letter to North Ayrshire Council (NAC) regarding the use of Facial Recognition Technology (FRT) in schools and found it “likely unlawful”. We recommend that educational settings do not use biometric data readers, face or fingerprint, at all.

7. Children can only rarely consent to the use of images in marketing due to their limited understanding and the power imbalance in the school setting

There is no necessity for a child’s face to be published online outside a school in order to deliver their education. The educational environment is one of significant power imbalance, which affects the nature of agency, autonomy and ‘freely given’ consent. Consent given “on behalf of” children by parents is nearly always invalid, even where children have a say in the decision, since it is rarely informed by the risks needed to weigh up a choice (see all of the above). The UN Convention on the Rights of the Child, Article 22 of the UN Convention on the Rights of Persons with Disabilities (UNCRPD), and the UN General Comment No. 25 on the rights of the child in the digital environment all recommend better protection for children and their privacy rights.

References

AI-generated naked child images shock Spanish town of Almendralejo (BBC) 2023
https://www.bbc.co.uk/news/world-europe-66877718

IBM didn’t inform people when it used their Flickr photos for facial recognition training (The Verge) 2019
https://www.theverge.com/2019/3/12/18262646/ibm-didnt-inform-people-when-it-used-their-flickr-photos-for-facial-recognition-training

IBM wins £54.7M project, taking business from Fujitsu as Home Office consolidates work (The Register) 2023
https://www.theregister.com/2023/08/02/ibm_biometric_deal/

Giant AI Platform Introduces ‘Bounties’ for Deepfakes of Real People (404 Media) 2023
https://www.404media.co/giant-ai-platform-introduces-bounties-for-nonconsensual-images-of-real-people/

Clearview AI scraped 30 billion images from Facebook and other social media sites and gave them to cops: it puts everyone into a ‘perpetual police line-up’ (Business Insider) 2023
https://www.businessinsider.com/clearview-scraped-30-billion-images-facebook-police-facial-recogntion-database-2023-4

Misapplied consent to justify FRT in schools: what is the legal basis? (Information Governance Services) 2023
https://www.informationgovernanceservices.com/misapplied-consent-to-justify-frt-in-schools-what-is-the-legal-basis/

Convention 108 Data Protection Guidelines for Children in Educational Settings (Council of Europe) 2020
https://www.coe.int/en/web/data-protection/-/protect-children-s-personal-data-in-education-setting-

The State of Biometrics 2022: A Review of Policy and Practice in UK Education (Defend Digital Me) 2022
https://defenddigitalme.org/research/state-biometrics-2022/


About Defend Digital Me

Defend Digital Me is a not-for-profit advocacy organisation based in the UK. It is a call to action to protect children’s rights to privacy and family life. We are working towards safe, fair and transparent data, as part of our vision of a rights-respecting environment in state education, in England and beyond. We are funded by philanthropic grants and our staff are volunteers. We do not accept any corporate funding. We are non-partisan and have no religious or political affiliations.