
Why facial recognition technology was deemed unlawful

10 Sep 2020

Legal complexities surround the use of such tools after a UK police pilot scheme was ruled unlawful.

In a landmark decision on 11 August 2020, the Court of Appeal ruled that an ongoing pilot of facial recognition technology was unlawful.

The pilot scheme, carried out by South Wales Police (SWP), involved the use of an Automated Facial Recognition system called AFR Locate, which is designed to match the faces of members of the public against those held on a ‘watchlist’ of potential suspects. Where no match was made with the watchlist, the images were automatically deleted.

The Appellant, Edward Bridges, a civil liberties campaigner, claimed that his image was captured and processed by AFR Locate on two occasions – in December 2017 and March 2018. He sought to challenge the legality of the use of this system, both on those two occasions and on a continuing basis, by bringing a claim for judicial review. His challenge was dismissed by the High Court in September 2019, but the Court of Appeal overturned that ruling on the following grounds:

  1. SWP’s use of AFR Locate constituted a breach of Article 8 of the European Convention on Human Rights (ECHR) because individual police officers had too much discretion in deciding which individuals were placed on the watchlist and where the technology could be deployed; in other words, SWP’s policies did not sufficiently set out the terms on which police officers’ powers were exercised;
  2. SWP failed to carry out an adequate Data Protection Impact Assessment (DPIA) prior to commencing the use of AFR Locate, as required by the Data Protection Act 2018 (DPA 2018), because its DPIA failed to properly assess the risks to the rights and freedoms of data subjects; and
  3. SWP did not comply with the Public Sector Equality Duty (PSED) under section 149(1) of the Equality Act 2010 (EA 2010) because SWP ‘never sought to satisfy themselves’ that AFR Locate did not have ‘an unacceptable bias on grounds of race or sex’.

An end for AFR technology?

Although SWP’s use of AFR Locate was found to be unlawful on three grounds, the case does not appear to signal the end of police use of AFR technology. In fact, the Court found that SWP’s use of the technology was a proportionate interference with human rights because the benefits of using AFR Locate outweighed the negligible impact on Mr Bridges and other relevant individuals whose images were momentarily captured by the AFR technology.

This is an important decision for SWP and for the wider use of AFR technology by the police because the three grounds on which SWP’s use was held to be unlawful are all procedural. They could be remedied by SWP placing stricter controls and frameworks on the use of the technology and carrying out more detailed assessments, both of the risks to the rights and freedoms of data subjects and of the potential for inherent bias in the AFR system.

Consequences for the police

The case brought more good news for the police, as the Court did not impose punitive sanctions on SWP, deciding instead that declaratory relief would be sufficient. Therefore, SWP can – and has indicated that it will – continue to use AFR Locate. Indeed, Jeremy Vaughan, South Wales Police’s deputy chief constable, told the Independent that there was ‘nothing in the Court of Appeal judgment that fundamentally undermines the use of facial recognition’.

Meanwhile, the Home Office stated that they ‘note the outcome of this case and are carefully considering the details’, whilst remaining committed to ‘empowering the police to use […] facial recognition safely, within a strict legal framework.’

Consequences for the private sector

The case only considered the use of AFR technology by the police and did not address private sector uses. However, the Court of Appeal’s judgment might cause companies that have been considering whether to use facial recognition technology in a B2B or B2C context to put the brakes on. Indeed, some were doing so even before this ruling.

In June, in the United States, IBM announced that it would no longer research or develop facial recognition technology at all, due to the ethical concerns surrounding it, and that it was opposed to its use by law enforcement agencies.

The issue seems a complicated one for businesses to untangle alone, and both supporters and opponents of AFR technology have been calling on the Government for greater regulation and guidance on the use of AFR and other biometric technologies.

The UK’s Biometrics Commissioner, Paul Wiles, has said that ‘given that these new technologies have multiple and widespread uses the question of whether we allow such systems to be used, and for what purposes and within what legal control will shape the nature of our social and political world well into the future’. He has called for a public debate ‘in order to examine how the technologies can serve the public interest whilst protecting the rights of individual citizens to a private life without the unnecessary interference of either the state or private corporations’.

Until such a debate takes place and greater guidance is issued, it is clear that any organisation seeking to use AFR technology needs to consider such use carefully, and to document it with clear policies and processes, in order to comply with the requirements of the DPA 2018, the EA 2010 and Article 8.

Reprinted with permission from the “September 2020” edition of “UK Legal Week” © 2020 ALM Media Properties, LLC. All rights reserved. Further duplication without permission is prohibited; contact 877-257-3382 or reprints@alm.com.

Kathryn Rogers

Partner, Commercial
