Last week, the Information Commissioner’s Office (ICO) issued a statement that North Ayrshire Council’s use of Facial Recognition Technology (FRT) in its schools (which involved processing the personal data of pupils ranging in age from 11-18 years) infringed UK data protection laws.
In line with the new Information Commissioner’s approach to regulation, which involves greater transparency and sharing of the rationale underlying the ICO’s approach to enforcement, the ICO’s publication of the letter sent to North Ayrshire Council provides useful insights into the level of transparency expected of all organisations working with children when using children’s personal data, and into what is required when relying on consent as a lawful basis.
Back in October 2021, the BBC and other news media reported how concerns had been brought to the ICO’s attention about how the Council was using this technology in school canteens.
This was during the Covid pandemic, and the technology was introduced to minimise physical contact and facilitate cashless transactions from online accounts for school meals. A camera at the cash till, operated manually by catering staff, took a single still image of a pupil, which FRT matched to a biometric facial template that had been captured previously.
Summary of ICO decision
Having investigated, the ICO concluded that the FRT was operated in a manner likely to infringe UK GDPR requirements. In particular, it found that:
- the Council could not establish a valid lawful basis for this processing; in particular, it could not rely on having obtained explicit consent to legitimise using special category data (which the biometric data was in this instance, since its use involved processing for identification purposes and therefore came within the legal definition of ‘special category’ data)
- information provided to the children affected and their parents was insufficiently transparent for them to understand, and not all of the information legally required was provided
- there were deficiencies identified with the Council’s data retention procedures (how long personal data was kept) and in how the legally required Data Protection Impact Assessment (DPIA) was carried out in this situation.
The ICO’s analysis included recommendations for the Council to make improvements to its school based FRT operations and ensure staff received training so that they understood and applied data protection law and ICO guidance.
Explicit consent – what went wrong?
It was the Council’s responsibility to ensure it had a valid lawful basis for processing children’s data. The ICO’s view was that explicit consent was the appropriate lawful basis in this case, but that the Council was unlikely to have met requirements for valid consent. Its rationale for this conclusion included how:
- the consent statement was not specific (it was worded generally so that it could apply to a broad range of processing activities)
- the consent wording did not present FRT as an option, but instead implied the system would be introduced irrespective of the wishes of the children or their parents.
The ICO highlighted the inherent power imbalance between the Council and the parents/pupils affected, and how those affected might have felt compelled to consent because of how the information was worded and how the introduction of FRT was being presented. In such circumstances, the ICO argued, consent was unlikely to have been freely given. In its view, the Council should have made clear to pupils and parents that there was no requirement to consent to FRT in order to obtain a school lunch, and alternative options, as easy to use as the FRT, should have been made available.
Another issue flagged by the ICO was how the Council relied on parental consent for pupils aged 12-14, instead of obtaining consent directly from the pupils themselves. It reminded the Council of UK law on this point, which makes it clear that in Scotland pupils aged 12 or over are presumed to be of sufficient age and maturity to give the necessary consent, unless the contrary is shown. In practice, schools may still need to verify parental consent for those children without competence to consent for themselves, but the onus was on the Council to demonstrate that the child in question was unable to provide their own consent, and to make that decision on a case-by-case basis (rather than applying a blanket approach to an entire group of 12–14-year-olds).
The ICO also highlighted that consideration should have been given to protecting children’s rights. In this context, it flagged that it was unclear whether the pupils aged 12-14 knew or understood that parental consent had been sought on their behalf, or whether the pupils affected understood how to object, or were given the opportunity to do so.
Lack of transparency
Many of the ICO’s comments about how the Council was unlikely to have obtained valid consents in this situation related to lack of transparency.
The ICO considered it vital that the Council explained in age-appropriate language how children’s data would be collected, used, stored and retained. It flagged, for example, how information on retention was not sufficiently transparent, containing a broad generic reference to personal data being retained “for as long as necessary.” In its view, pupils should have been given a clearer indication of how long their biometric data would be retained.
While the Council had taken steps to alert pupils and parents to the processing of biometric data via a number of channels (which included direct emails, social media and through the provision of FAQs) it had fallen short of data protection law requirements because it had not ensured the content of its privacy notice was provided to children in a concise, transparent, intelligible and easily accessible form, using clear and plain language. In particular, it had not tried to explain, in child-friendly terms, the potential impact of the processing of biometric data.
The ICO’s approach in this case will come as no surprise to those who have followed emerging trends in the past year.
Last July the regulator published its three-year action plan, ICO25; one of its priorities for the first year is to safeguard and empower the public, particularly the most vulnerable groups (such as children), through a better understanding of how their information is used and can be accessed.
The privacy implications of new technologies also continue to be a concern for the regulator, although, as its statement on this case makes clear, the ICO also wants to ensure that educational organisations can access the benefits of new and emerging technologies. It is confident that, with appropriate assessment and care, FRT and similar technologies can be used lawfully, whilst also protecting children’s data and safeguarding their rights.