English Court of Appeal provides guidance on the lawful use of automatic facial recognition technology
The Court of Appeal has ruled that the automatic facial recognition (‘AFR’) technology used by South Wales Police was unlawful. The judgment provides useful guidance and a legal framework for all users, and potential users, of this important technology.
This is a key decision on the use of live automatic facial recognition (‘AFR’) by law enforcement, and addresses the important tension between individual rights and the benefits of crime prevention. In Bridges v South Wales Police [2020] EWCA Civ 1058, the Court held that the use of AFR by South Wales Police (‘SWP’) was unlawful, as it was unlawfully intrusive, contrary to Article 8 of the European Convention on Human Rights. The ground-breaking case involved consideration of the interface between data protection, human rights and equalities legislation, and the Court’s decision sets out a legal framework for the deployment of AFR by police forces and other potential users.
Background to the case
State-operated surveillance is not new in the UK, with CCTV cameras deployed since the 1960s. More recently, in 2017, SWP trialled a form of facial recognition technology known as ‘AFR Locate.’ In simple terms, AFR Locate is a form of biometric identification that uses the live feed of CCTV cameras focused on a specific area, for example a crowd or an area of high pedestrian footfall. The system can capture facial images at a rate of up to 50 faces per second, which are streamed directly to the live facial recognition software system. Police then compare these images with ‘watchlists.’
The system measures the structure of each face, including the distances between the nose, mouth and jaw, to create a facial template. These digital templates are compared in real time to the facial templates of individuals on the ‘watchlist’ of offenders. When a live match is identified by the system, police officers on the ground are alerted, and they then decide whether to approach and apprehend the individual. If no match is detected, the software automatically deletes the facial image captured from the live feed.
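The capture-match-alert-or-delete cycle described above can be modelled in outline. The sketch below is purely illustrative: the function names, the toy distance-based templates and the similarity threshold are all hypothetical stand-ins for SWP's proprietary biometric system, and are included only to make the described workflow concrete.

```python
# Illustrative sketch only: a minimal model of the AFR Locate matching loop
# described above. All names, templates and thresholds are hypothetical;
# the real system uses proprietary biometric measurements.

from math import dist

MATCH_THRESHOLD = 0.6  # hypothetical similarity cut-off


def face_template(landmarks):
    """Reduce facial landmark points to a simple numeric template
    (a toy stand-in for geometric measurements such as the
    nose-mouth-jaw distances mentioned in the text)."""
    return tuple(dist(a, b) for a, b in zip(landmarks, landmarks[1:]))


def similarity(t1, t2):
    """Inverse-distance similarity between two templates, in (0, 1]."""
    return 1.0 / (1.0 + dist(t1, t2))


def process_frame(captured_faces, watchlist):
    """For each face captured from the live feed: raise an alert on a
    watchlist match; otherwise the template simply goes out of scope,
    mirroring the automatic deletion of non-matching images."""
    alerts = []
    for face in captured_faces:
        template = face_template(face)
        for name, listed_template in watchlist.items():
            if similarity(template, listed_template) >= MATCH_THRESHOLD:
                alerts.append(name)  # officers on the ground are notified
                break
    return alerts
```

The key design point the judgment turns on sits outside this loop: who goes on the watchlist, and where the cameras are placed, are discretionary choices, and it was the absence of clear criteria constraining those choices that the Court of Appeal found unlawful.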
The civil rights group Liberty brought a legal challenge on behalf of Ed Bridges, a resident of Cardiff. Mr Bridges was caught on camera by SWP on two occasions; first during a Christmas shopping trip in 2017, and then at a peaceful anti-arms protest in 2018. Mr Bridges argued that SWP’s use of AFR Locate contravened his human rights, as well as data protection and equality legislation.
The decision of the High Court
In Bridges v South Wales Police [2019] EWHC 2341 (Admin), Mr Bridges claimed that the use of AFR Locate by SWP had interfered with his right to respect for private and family life under Article 8 of the European Convention on Human Rights (‘ECHR’). Mr Bridges also claimed that SWP’s use of the technology had breached the Data Protection Act 1998 (‘DPA 1998’), and its successor, the Data Protection Act 2018 (‘DPA 2018’), which supplements the GDPR. He also claimed that AFR software was unable to accurately identify women and non-white individuals, and so its use did not comply with the Equality Act 2010. In summary, Mr Bridges’ claim was dismissed by the High Court on all grounds and SWP’s use of AFR Locate was found to be proportionate, lawful and consistent with human rights and data protection legislation.
Despite the loss in the High Court, there were important findings on Article 8 ECHR. The Court determined that Article 8 was engaged because, like fingerprints and DNA, AFR enables the extraction of unique information and identifiers about an individual, thereby enabling their identification. The High Court did not consider that the short retention period of the data was sufficient to avoid the engagement of Article 8, but found that the interference with Article 8 rights was justified on public policy grounds, as it was in accordance with the law for the purposes of crime detection and prevention. Specifically, it held that the Data Protection Act, the Surveillance Camera Code and SWP’s own policy documents demonstrated that the infringement of Article 8(1) rights consequent on SWP’s use of AFR Locate occurs within a legal framework sufficient to satisfy the ‘in accordance with the law’ requirement in Article 8(2).
Consequently, the data protection legislation had not been breached, as the processing of biometric facial data was necessary and proportionate for SWP’s legitimate interest in detecting and preventing crime. SWP’s Data Protection Impact Assessment (‘DPIA’) was found not to contravene data protection legislation, as safeguards were in place to identify what personal data was being retained, and under what circumstances. However, although the High Court found the current legal framework to be adequate, it warned that it should be subject to periodic review.
The Court of Appeal
Mr Bridges appealed on five grounds and was successful in three:
Ground 1: Sufficient legal framework
The unanimous decision of the Court of Appeal was that the use of AFR Locate had breached Mr. Bridges’ rights under Article 8 ECHR because although a legal framework was in place (i.e. the DPA 2018 and secondary legislation such as the Surveillance Camera Code, and various guidance including SWP’s own policy documents), there were ‘fundamental deficiencies.’
In particular, the Court found that there was insufficient guidance on where AFR Locate could be used, and the criteria for placing individuals on the ‘watchlist.’ As a result, the Court found that the legal framework was not sufficiently foreseeable to be compatible with Article 8 ECHR, as there were insufficient constraints on the discretion of police officers to use the technology.
Ground 3: Compliance with Section 64 of the DPA 2018 (deficiencies in the DPIA)
The Court of Appeal found that the Data Protection Impact Assessment carried out by SWP was insufficient, as it did not adequately consider the risks to the rights of data subjects as it had been prepared on the basis that Article 8 ECHR was not breached. However, the Court rejected certain arguments advanced in conjunction with the Information Commissioner’s Office (ICO), an intervener in the proceedings, as falling outside the alleged material error of law specified. These arguments included the assertion that a false positive would result in individuals having their biometric data retained for longer periods.
Ground 5: Public Sector Equality Duty
The High Court’s finding that SWP had complied with its Public Sector Equality Duty (PSED) under Section 149 of the Equality Act 2010 was also overturned. The Court of Appeal noted that there was no clear evidence that the software was in fact biased on the grounds of race and/or sex.
However, the Court held that the PSED required public authorities to consider the potential discriminatory impact of their policies. SWP was found not to have complied with its PSED obligations because it had not taken reasonable steps to enquire whether the AFR Locate software was biased on racial or sex grounds.
Two of the grounds advanced during the appeal were not successful:
Ground 2: SWP’s use of AFR was not a proportionate interference with rights under Article 8 ECHR
The Court of Appeal addressed this ground even though it had held that the interference with rights under Article 8 ECHR was not in accordance with the law. The Court of Appeal found that the High Court had properly conducted a weighing exercise between the benefits of using AFR and the potential impact on Mr Bridges. The Court concluded that any impact on Mr Bridges was minimal and so the use of AFR was proportionate under Article 8(2) ECHR.
Ground 4: Compliance with Section 42 of the DPA 2018 (‘an appropriate policy document’)
The Court of Appeal rejected the argument that SWP lacked ‘an appropriate policy document’ when carrying out sensitive processing, as required by Section 42 of the DPA 2018. The Court considered that, as the DPA 2018 was not in force at the time of deployment, there had been no failure to comply with the law. Furthermore, the Court referred to the fact that SWP had since updated its appropriate policy document in light of subsequent guidance issued by the ICO.
SWP has confirmed that it will not appeal the Court of Appeal’s judgment. The police force has said that the decision helpfully points to a number of policy areas requiring attention, and that it intends to continue deployment of AFR once it is satisfied that it can properly address the issues raised.
AFR technology is an important tool for police authorities and its use is likely to become more widespread. Although focused on the specific deployment of AFR by SWP, the ruling has significant implications for police forces nationwide and provides much needed legal certainty to those police forces intending to use AFR Locate and similar technology.
The decision highlights that whilst AFR can be deployed lawfully, careful and ongoing consideration of the privacy implications is required both before and after deployment.
Indeed, earlier this year, the Metropolitan Police began to use a similar form of AFR technology on the streets of the capital, but then announced that it was considering pausing expansion of the deployment as it had identified privacy concerns similar to those addressed by this case.
The ICO, which has produced detailed guidance on the use of AFR, has welcomed the Court of Appeal’s decision. In a statement, the ICO said that the judgment is a useful step in providing a clear legal framework in order for the public to have trust and confidence in the use of AFR technology.
The Surveillance Camera Commissioner’s Office also welcomed the Court of Appeal’s decision and has urged all parties, including the Home Office, to reflect on the judgment and to act in the public interest by updating the Surveillance Camera Code of Practice. It is likely this will now finally be addressed by Government, since the Commissioner has been raising these concerns since 2015, and it is hoped that the Court of Appeal’s decision will lead to an update of the guidance and codes for surveillance.
In the current pandemic, public authorities will also need to consider how the wearing of face masks by a large section of the public may affect the lawful deployment of AFR and the carrying out of compliant DPIAs, given the impact of obscured faces on the system’s ability to function, and the appropriateness of using the system to ensure face masks are being worn where legally required.
- The decision of the High Court can be found here.
- The Court of Appeal’s judgment can be found here.
- The statement by the Surveillance Camera Commissioner’s Office can be read here, and the ICO’s Opinion on the use of live facial recognition technology by law enforcement in public places can be found here.
For more information please contact Partner, James Tumbridge at email@example.com.