Commercial Awareness Update – W/C 5th February 2024
February 5, 2024
Introduction
In a rapidly evolving technological landscape, where data holds ever-increasing value, many citizens are becoming more aware of their fundamental rights surrounding privacy and data protection. Edward Bridges (Ed Bridges) of Cardiff was one such citizen, who in 2019 became the first to challenge police use of a live facial recognition technology known as Automatic Facial Recognition Locate, ‘AFR Locate’. South Wales Police’s ongoing trial of this technology uses surveillance cameras to capture digital images of individuals in public spaces and compares them against a watchlist.
Background
Since May 2017, South Wales Police has deployed this technology numerous times, potentially gathering facial biometric data from around 500,000 individuals without their consent. In September 2019, the High Court ruled that although facial recognition interferes with the privacy rights of those scanned, the Surveillance Camera Code of Practice and data protection legislation offered sufficient regulation. Nevertheless, Ed Bridges was granted permission to appeal on all grounds in November 2019, and the appeal was heard in June 2020.
Facts
The appeal thus concerned the legality of South Wales Police’s use of the live AFR Locate technology in public places. Specifically for Ed Bridges, paragraph 25 of the judgement states ‘there were two instances of AFR Locate usage by South Wales Police in Cardiff: one on December 21, 2017, at Queen Street, and the other on March 27, 2018, during the Defence Exhibition at the Motorpoint Arena.’ South Wales Police maintained that its use of the technology was within its powers and lawful, and that attendees of public events were notified of its usage. The force has since used the technology around 70 times and regularly deploys it on crowds at major public events. However, Ed Bridges contended that South Wales Police failed to provide prior notice of its intention to deploy AFR Locate on either occasion.
He appealed the September 2019 decision on the following grounds:
- The Divisional Court erred in holding that South Wales Police’s use of AFR Locate on the dates in question was in accordance with the law for the purposes of Article 8 of the European Convention on Human Rights (ECHR).
- The court failed to assess the proportionality of South Wales Police’s AFR use and its effect on the privacy rights of other members of the public whose facial features were recorded.
- The court erred in finding that South Wales Police’s Data Protection Impact Assessment (DPIA) complied with the requirements of section 64 of the Data Protection Act 2018 (DPA 2018), despite errors in how it addressed individuals’ privacy rights under Article 8 and the processing of biometric data.
- The court failed to consider whether South Wales Police had an appropriate policy document in place for the lawful and fair processing of personal data.
- The court was wrong to hold that South Wales Police had complied with the Public Sector Equality Duty, because its assessment of AFR’s impact was inadequate and failed to consider possible discrimination issues properly.
Proceedings and Judgement Analysis
The Court of Appeal allowed the appeal on grounds 1, 3, and 5. With regard to Ground 1, it overturned the High Court’s decision due to deficiencies in the legal framework governing the use of AFR. It criticised the police policies for granting excessive discretion to officers in determining watchlists and deployment locations, stating that the policies lacked the necessary clarity and ‘quality of law’.
The court emphasised that deployment locations should be based on reasonable grounds related to the presence of individuals on watchlists, highlighting a need for more specific guidelines in police policies.
In para 85, the court suggests that the use of facial recognition technology falls somewhere between the extremes represented by two previous cases, Silver v United Kingdom (1983) and R (Catt) v Association of Chief Police Officers [2015]. This implies that while the technology may not be as intrusive as the interference in Silver, where the European Court of Human Rights found a violation of privacy rights, it also may not be as acceptable as traditional surveillance methods such as taking photographs, as considered in Catt. The court therefore indicates that facial recognition technology poses unique considerations that must be addressed within the legal framework.
Putting forward his arguments on ground 3 of the appeal, Ed Bridges argued that the DPIA had two main deficiencies:
- It failed to consider the engagement of Article 8 of the ECHR properly for individuals whose facial biometrics were captured by AFR but are not on police watchlists.
- It failed to adequately assess the risks to individuals’ rights and freedoms posed by using AFR, such as potential biases based on gender or race and the retention of innocent people’s biometric data for more extended periods due to false positives.
The court accepted that the DPIA acknowledged some privacy concerns related to AFR but ultimately ‘failed properly to assess the risks to the rights and freedoms of data subjects’, especially in light of the deficiencies found in the AFR system’s compliance with Article 8 requirements. The DPIA therefore did not meet the legal obligations in section 64 of the DPA 2018, and the appeal was allowed on this ground.
In terms of ground 5, the court found that South Wales Police did not do ‘…all that they reasonably could to fulfil the Public Sector Equality Duty’, failing to gather enough evidence to determine whether AFR Locate was inherently biased, for two reasons: the automatic deletion of data relating to individuals not on watchlists, and a lack of knowledge of the dataset used to train the software. Despite no allegations of biased results, the court criticised the police for not verifying the software’s potential bias on grounds of race or sex, emphasising the importance of ensuring unbiased technology, especially for novel and controversial tools like AFR.
Final Judgement
The court unanimously held that South Wales Police’s use of AFR Locate breached Article 8 of the ECHR, meaning that, as the police force leading this technology trial, it could not lawfully continue deploying the technology in its current form.
Commentary
The ruling establishes a precedent whose influence extends beyond England and Wales: while it binds only the courts of that jurisdiction, it carries persuasive weight elsewhere. The court concluded that the legal framework governing this technology granted excessive discretion to the police, which will likely shape future legislative or regulatory efforts concerning the use of AFR and similar technologies by public bodies. The ruling therefore safeguards the privacy of individuals and indirectly reinforces the rights to freedom of expression and assembly. Constraining the capture of biometric data during large events, such as social protests, will likely lead to further restrictions on technologies that could infringe upon our fundamental human rights. This is, therefore, a welcome decision.
Written by Anieka Ali