A school dining room at lunchtime must be a pretty chaotic environment, and anything that can be done to speed up the passage of pupils through the queue would surely be welcomed. Rather than each pupil having to provide a pass or money at the checkout, why not use a solution which captures a picture of each pupil, matches the identity with the pupil’s school record and deducts the relevant charge for the food selected from the pupil’s account (or makes no deduction, if the pupil qualifies for free school meals)? Installation of this “cashless catering” system in each school would require some upfront investment, and there would be a monthly service charge to be paid to the system provider, but this would be more than justified by the removal of queues blocked when a pupil has forgotten their pass, or of hold-ups when a pupil doesn’t have the right change. In September 2021, North Ayrshire Council decided to implement such a system across nine schools. What could possibly go wrong?

Facial recognition technology and GDPR

Implementation of facial recognition technology (FRT) in this way raises some potentially significant issues under UK data protection law (and would likely raise similar issues under EU data protection law as well). Collecting a facial image of each pupil, matching it to a school record, charging each account – all of this (and more) counts as processing of personal data under the UK’s data protection regime, and would need to comply with that regime’s requirements in order to be lawful. Crucially, FRT is likely to involve biometric data in the capture and use of facial imagery, so any processing will not only require one of the “normal” lawful bases but will also need to satisfy one of the more limited special category conditions.

The implementation of any new technology which is likely to result in a “high risk” to the affected individuals also requires a formal “data protection impact assessment” (DPIA) to be undertaken. This requirement isn’t limited to FRT; it applies to any new technology which meets the threshold. It is certainly good practice to undertake a DPIA: if the ICO (the UK’s data protection regulator) takes an interest in the implementation of a new piece of technology, it is likely to ask for a copy of the DPIA, or for an explanation of why there isn’t one.

FRT and privacy concerns

Some key points to consider when introducing a new piece of technology, such as FRT, which raises potential privacy concerns are:

  • Is it justified and proportionate? Is it the only/best option to achieve the aim? This will be particularly important where the personal data relates to children or vulnerable people (as in the case in North Ayrshire) but will be important for adults as well.
  • Have you told the relevant people? Burying a reference in a dense privacy policy is not enough; you must be open and transparent with the affected individuals, in language that they will understand, about what the technology does and how it affects them.
  • Are you sure it is lawful? What is your lawful basis for the processing?
  • Have you conducted a thorough DPIA? If not, why not? Be prepared to explain this to the ICO.
  • Have you consulted? A key part of a DPIA is consultation with the affected individuals.

So, what happened in North Ayrshire? Following press reports and complaints, the ICO began to investigate within weeks of the launch, and by 11 November 2021 the system had been withdrawn and all images deleted. The Council had conducted a DPIA, but the ICO was scathing about it: the Council’s approach to obtaining consent (the lawful basis which it finally identified as being used) was inadequate, the information provided to the pupils was insufficient, the data retention scheme wasn’t compliant, there was a distinct lack of consultation with the pupils and their parents, and there was no senior sign-off for the DPIA. All of the costs incurred in installing the FRT were wasted, together with the significant amount of management time that was surely taken up in implementing and removing the system and dealing with the ICO investigation. The Council appears to have escaped any further ICO action (such as a monetary penalty notice), and there do not appear to be any claims brought by affected individuals, but it can be assumed that the Council does not consider that a victory.

The North Ayrshire experience is a good reminder to get your house in order before implementation; the last thing you want is for that fancy new piece of technology to create more problems than it solves.

If you have questions about using facial recognition technology, please contact Daniel Tozer.


This article is for general information purposes only and does not constitute legal or professional advice. It should not be used as a substitute for legal advice relating to your particular circumstances. Please note that the law may have changed since the date of this article.