The Court of Appeal has given judgment in Bridges v South Wales Police, on the lawfulness of the force's arrangements for using facial recognition software on members of the public. The Court declared that aspects of the way in which the system was set up and deployed were unlawful.

The case raises issues of human rights, data protection and equalities law, and is relevant to police forces and to the wider public sector wherever automated facial recognition and similar 'surveillance' technologies are being considered.

What was the technology South Wales Police used?

The force was piloting 'AFR Locate', which automatically compares biometric data in faces 'seen' by the system in a live feed from a vehicle-mounted CCTV camera to faces on a 'watchlist' of images prepared by the police. If no match against the watchlist is detected, the software automatically deletes the facial image captured from the camera feed. If a match is detected, the technology produces an alert and a police officer reviews the images to decide whether to intervene. Data from that element of the system was retained for various periods of up to one month. The system was deployed overtly, in the street and at events, on about 50 occasions. Mr Bridges was a member of the public who was 'seen' twice by the system in 2017, but fell into the 'no match' category (i.e. he was not on a watchlist and his data was deleted near-instantaneously). The Court commented that the retention framework used by the force was crucial.
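The judgment describes this pipeline only at a high level. Purely as an illustration of the decision flow the Court examined (no match: immediate deletion; match: alert and human review; time-limited retention of alert data), here is a minimal Python sketch. Every name and value in it (the helper functions, the similarity threshold, the retention constant) is a hypothetical stand-in, not SWP's actual system:

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical values; the judgment does not disclose SWP's settings.
MATCH_THRESHOLD = 0.9                 # illustrative similarity cut-off
ALERT_RETENTION = timedelta(days=31)  # judgment: retention of up to one month

def extract_biometric_template(face_image: bytes) -> bytes:
    # Stand-in for a real face-embedding model; NOT a biometric algorithm.
    return hashlib.sha256(face_image).digest()

def similarity(a: bytes, b: bytes) -> float:
    # Stand-in similarity measure: fraction of equal bytes.
    return sum(x == y for x, y in zip(a, b)) / len(a)

@dataclass
class Alert:
    watchlist_id: str
    created_at: datetime

def process_face(face_image: bytes, watchlist: dict) -> "Alert | None":
    """Illustrative AFR Locate-style flow: compare a face 'seen' in the live
    feed against the watchlist; delete immediately on no match, raise an
    alert for human review on a match."""
    template = extract_biometric_template(face_image)
    scores = {wid: similarity(template, t) for wid, t in watchlist.items()}
    best_id = max(scores, key=scores.get, default=None)
    if best_id is None or scores[best_id] < MATCH_THRESHOLD:
        del template  # no match: biometric data deleted near-instantaneously
        return None
    # Match: an officer reviews before any intervention; alert data is kept
    # only within the retention period.
    return Alert(watchlist_id=best_id, created_at=datetime.utcnow())
```

Even in this toy form, the feature the Court treated as crucial is visible: for the overwhelming majority of people scanned, the only processing is the near-instantaneous compare-and-delete step.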

The High Court had previously held that the deployment of the technology was lawful. Mr Bridges appealed on five grounds, three of which were successful. 

What were the successful grounds of challenge?

  1. Mr Bridges alleged that the legal framework governing the use of AFR was not comprehensive enough to be compatible with human rights law (in particular Article 8 of the European Convention on Human Rights), which requires the legal framework to be 'accessible' and 'foreseeable' to those it could affect. The Court of Appeal held that although the framework comprised primary legislation (the Data Protection Act 2018), secondary legislation (the Surveillance Camera Code of Practice) and local policies promulgated by South Wales Police ('SWP'), there was no clear guidance on where AFR Locate could be used or who could be put on a watchlist. Those gaps left police officers too broad a discretion over how the technology could be used to meet the requirements of Article 8, and (in the absence of legislation or national guidance dealing with those issues) SWP's policies were not prescriptive enough to provide certainty. The AFR legal framework was therefore not sufficiently foreseeable to be compatible with Article 8.
  2. A 'technical knockout': the force's Data Protection Impact Assessment (DPIA) was deficient because it did not properly assess the risks to the rights and freedoms of data subjects, and failed to address the measures envisaged to mitigate the risks arising from the deficiencies in the legal framework which the Court of Appeal found existed.
  3. The force had not done enough to comply with its duties under the Public Sector Equality Duty (PSED) in s. 149 of the Equality Act 2010. The PSED imposes positive, proactive duties, which required the force to make enquiries into whether the AFR Locate software was biased on grounds of race or sex. The Court noted that there was no clear evidence that the software was in fact biased on either ground; but because the force had not taken reasonable steps to investigate this ahead of time, it had not complied with the PSED. (A sketch of the kind of testing in question appears after this list.)
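The judgment does not prescribe how such an investigation should be carried out. Purely as a hypothetical illustration of the kind of enquiry the PSED contemplates, a force (or its software vendor) might compare false-match rates across demographic groups on a controlled test set; the group labels and figures below are invented for the example:

```python
from collections import defaultdict

def false_match_rates(trials):
    """Per-group false-match rate. `trials` is an iterable of
    (group, matched) pairs from runs where the probed face is known
    NOT to be on the watchlist, so any reported match is false."""
    counts = defaultdict(lambda: [0, 0])  # group -> [false matches, total]
    for group, matched in trials:
        counts[group][1] += 1
        if matched:
            counts[group][0] += 1
    return {g: fm / total for g, (fm, total) in counts.items()}

# Invented results for illustration; real figures would come from
# controlled trials on a demographically balanced image set.
trials = ([("group_a", False)] * 970 + [("group_a", True)] * 30
          + [("group_b", False)] * 940 + [("group_b", True)] * 60)
print(false_match_rates(trials))
# {'group_a': 0.03, 'group_b': 0.06} -> a disparity needing investigation
```

A material disparity between groups would not of itself make deployment unlawful, but on the Court's reasoning this is the kind of question the force needed to have asked, and documented, in advance.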

What grounds were not successful?

The Court commented that the actual interference with Mr Bridges' (and everyone else's) human rights from the AFR was minimal. Nobody was wrongly arrested (and the system had led to a number of legitimate arrests, which the Court listed). Nobody apart from Mr Bridges complained about the technology. Any interference with Mr Bridges' own Article 8 rights would have been very limited, because his data was deleted near-instantaneously: no personal information relating to him was available to any police officer or other human agent, or retained. Had there been an Article 8-compliant underlying legal framework, it therefore appears that SWP's use of the technology could have been a lawful interference with Mr Bridges' Article 8 rights.

The Court rejected an argument that the force's 'appropriate policy document' was insufficient to comply with data protection law: 'appropriate policy documents' are a requirement of the Data Protection Act 2018, and there was no statutory requirement to have one in place in 2017, when Mr Bridges was 'seen' by the AFR system. The force's current appropriate policy document complied with the ICO's template.

What to take away

Any deployment of new technology needs to be carefully considered. The judgment is clear that, under the existing legal regime, AFR can potentially be deployed lawfully, but the case is a useful illustration of the range of factors that need to be taken into account, documented, and subjected to rigorous internal challenge and review before and during any deployment.

In the absence of a clearer legislative framework set out by national or central bodies, questions of how AFR works, and where and on whom it is used, need to be carefully assessed and tightly defined at a local level. The Court commented that it expected police forces (and, by extension, any other public body considering AFR) would wish to satisfy themselves that everything reasonable had been done to make sure that the software used does not have a racial or gender bias.

You can find a copy of the judgment here.