Through the ‘eyes’ of the ICO: facial recognition technology in the public sector

The use of Facial Recognition Technology (FRT) has recently come under scrutiny from the Information Commissioner’s Office (ICO) following the use of FRT in a school. Faizah Patel and Zena Stephenson look at the key issues.

FRT is a means of identifying or confirming an individual’s identity from their face. FRT systems can identify people in photographs, video footage or in real time, and the facial data they process is biometric data, which constitutes ‘special category personal data’ under data protection law and therefore must be treated with particular care. FRT is largely used for security and law enforcement; however, there is a growing surge of interest among public authorities in using it in other areas, including education, transport and public spaces.

ICO review

The ICO recently issued a statement on the use of FRT by a local authority. North Ayrshire Council (NAC) had deployed FRT in its schools to manage cashless catering in their canteens. The ICO assessed the processing of biometric data and whether FRT had been lawfully deployed. Its conclusion was that, whilst it may be possible to use FRT in schools lawfully, the technology as set up in this instance was likely to have infringed data protection laws. Notably, because the biometric data processed concerned children, and ‘children merit specific protection with regards to their personal data’ under the GDPR (recital 38), the investigation attracted heightened scrutiny.

Special category personal data must be treated with greater care because processing it is more likely to interfere with fundamental rights and/or give rise to discrimination. A lawful basis under Article 6 of the GDPR must be established and, in addition, one of the Article 9 conditions must be satisfied. The ICO has highlighted why the processing of biometric data is a particular cause for concern, given that it is ‘more permanent and less alterable than other personal data; it cannot be changed easily’. Other characteristics, such as age, sex, gender or ethnicity, can also be inferred from it.

The ICO advised NAC to clearly identify its lawful basis for processing personal data to ensure that it meets the requirements of the UK GDPR. The ICO also found that NAC had not taken appropriate measures to provide transparent information, failing to communicate their rights to data subjects. Finally, the data protection impact assessment (DPIA) was found wanting, as it failed to fully assess and mitigate the risks to individuals’ rights and freedoms.

Other uses of FRT in the Public Sector

FRT has to date proved a difficult technology to implement in the public sector. Whilst FRT can be used lawfully, it may not always be the right solution to achieve an authority’s aims.

In Bridges v Chief Constable of South Wales [2020], the police force’s use of FRT was challenged, and the Court found that the force’s use of automated facial recognition was not “in accordance with the law”. The Court held that: (1) there was no clear guidance on where the technology could be used and who would be subject to it; (2) the DPIA was inadequate and non-compliant; and (3) reasonable steps had not been taken to investigate whether the technology had a racial or gender bias.

Another example of use in schools arose in 2019, when a school in Sweden implemented FRT to monitor pupil attendance. The Swedish data protection authority took a strict approach, fining the school for this use of FRT. It found that data protection rules had not been followed in respect of data minimisation, the protection of special category personal data and DPIA requirements. Overall, it concluded that school attendance can be recorded by less intrusive methods.

Key takeaways

The ICO’s investigation highlights the caution that ought to be exercised when utilising FRT in the public sector. The use of FRT must be justified and proportionate; if there are other means by which to achieve the objective, they should be considered. We have summarised below some of the key takeaways from this case.

Consent

Consent is often the lawful basis most relied upon. Consent must be informed, specific and freely given, and the conditions under Article 7 of the GDPR must be met for organisations to rely on this basis. The ICO found that simply signing a consent form was, in this instance, unsatisfactory. Organisations using FRT must be aware of the right to withdraw consent and the right to object, and must ensure that they respect individuals’ objections to the processing of their personal data.

Transparency

Data subjects must be made aware of how their personal data is being processed and what their rights are under data protection law. It should be clear to individuals what personal data is being processed, why it is being processed and how long it will be retained. Where children are the data subjects, their age ought to be considered and privacy notices tailored appropriately.

Accountability

The use of FRT can be intrusive, and the benefits of the technology need to be weighed against the impact it could have on individuals. When completing DPIAs, thorough and appropriate risk assessments ought to be conducted; it is essential that suitable and adequate DPIAs are in place before the technology is deployed.

Faizah Patel and Zena Stephenson are Solicitors at Sharpe Pritchard LLP.
