This case concerns whether South Wales Police's use of Automated Facial Recognition (AFR) technology was lawful.
South Wales Police (SWP) were deploying AFR Locate, a type of facial recognition technology. It scans live CCTV footage and compares the faces captured against facial biometric data stored on a police watchlist. Those on the watchlist included: persons wanted on warrant, individuals escaped from custody, persons suspected of crimes, those in need of protection (missing persons), individuals whose presence at events may be of particular concern, those of intelligence interest to SWP and, finally, vulnerable persons.
If a match was detected, an alert was created and a police officer would review the images to determine whether to make an intervention. If there was no match, the facial image captured from the live feed was deleted automatically.
SWP used AFR Locate 50 times between May 2017 and April 2019 at a variety of large public events. The use of AFR was not secret: SWP took steps to inform members of the public about AFR and its use at each event, for example through social media, posters and handing out cards.
The First Instance Proceedings:
The claim was filed by the appellant, Mr Bridges, a civil liberties campaigner supported by Liberty. Mr Bridges was in the vicinity of two deployments of AFR Locate by SWP. He claimed that AFR Locate was not compatible with the right to private life under Article 8 of the European Convention on Human Rights, because it was not ‘in accordance with the law’ as required by Article 8(2) ECHR. Additionally, he argued that the data processing entailed by the use of AFR infringed the Data Protection Act (DPA) 1998 and the DPA 2018. Finally, he claimed that the use of AFR Locate breached the Public Sector Equality Duty (PSED) owed by SWP, because SWP had not considered that the technology might produce results that were indirectly discriminatory on grounds of sex and race.
At first instance the Divisional Court dismissed the claim on all grounds. They found that the right to privacy was engaged for all those scanned by AFR Locate. However, the interference with Article 8 was in accordance with the law and proportionate. They considered that like DNA and fingerprints, AFR technology ‘enables the extraction of unique information and identifiers about an individual allowing [their] identification… in a wide range of circumstances’. They rejected Mr Bridges’ claim that there was no legal basis for using AFR technology.
Secondly, the court rejected the claim that SWP had acted unlawfully under the DPA 1998, since the legal basis for AFR Locate had already been addressed in its analysis of the Article 8 issue. As for the DPA 2018, it held that AFR Locate met all the requirements for processing the data lawfully and for a proper purpose.
Finally, the PSED argument was dismissed because there was no suggestion, when the use of AFR commenced, that the technology produced a higher rate of positive matches for female faces and/or for black and minority ethnic faces.
Mr Bridges appealed on five grounds to the Court of Appeal (CA). Three of those five were successful. The most important are discussed below.
The first successful ground addressed Article 8. The CA found that Article 8 was not only engaged but had been infringed: the use of AFR Locate was not ‘in accordance with the law’. Although there was a legal framework of primary legislation, namely the DPA 1998 and 2018, supported by secondary instruments such as the Surveillance Camera Code of Practice, there was no clear guidance on where AFR could be used and who could be added to a watchlist. These two ‘deficiencies’, where the technology could be deployed and who could be placed on a watchlist, left too much discretion to individual police officers.
However, the appeal failed on the next ground. The CA agreed with the Divisional Court that the use of AFR was a proportionate interference with Article 8 rights under Article 8(2). There was a negligible impact on Mr Bridges’ Article 8 rights, and ‘an impact that has very little weight cannot become weightier because other people were affected’.
Finally, the appeal succeeded on the PSED claim. SWP were in breach of the PSED because they did not have due regard to the need to eliminate discrimination on those two grounds, which may arise from the use of AFR Locate. There is scientific evidence that facial recognition software can be biased and generate false identifications of those from BAME backgrounds and of women. Although it was not suggested that AFR Locate does have this effect, SWP did not take reasonable steps to fulfil the Public Sector Equality Duty by investigating whether AFR had this bias.
This judgment has significant implications for the use of facial recognition technology. Whenever the technology is now used, Article 8 is engaged and the individual’s right to privacy is interfered with. In this case SWP’s use of facial recognition was proportionate, but there may be instances where an individual’s right to privacy outweighs the benefits of facial recognition. Any use of facial recognition technology must now show that it has balanced the individual’s right to privacy against those benefits and considered the risks through a Data Protection Impact Assessment. Otherwise, liability for infringement of Article 8 is possible.
Additionally, users must consider whether the facial recognition software exhibits any bias or produces discriminatory results. The PSED was engaged here. Whilst that duty does not apply to the private sector, the judgment indicates that the courts are alert to the potential for discrimination and will expect users to take steps to prevent bias.