
Policing at the AI Frontier: Face it - facial recognition is here to stay

05 March 2026

The judicial review in R (Thompson and Carlo) v The Commissioner of Police of the Metropolis (“Thompson”) lands amid the rapid expansion of facial recognition tools across UK policing and an active government push to codify how biometrics and AI are used. With Home Office consultations and proposed reforms pointing to a more structured legal framework for policing technologies, Thompson is more than a snapshot of one police force’s policy: it offers a glimpse into the UK’s next phase of AI‑enabled policing and the legal frameworks that will govern it.

Facial recognition technology: what is it?

UK policing operates at the technological frontier, with routine use of AI-enabled facial recognition technology (FRT) that can accelerate suspect identification and help locate vulnerable or missing people at scale.

The Home Office identifies three policing uses of FRT:

  • Retrospective facial recognition (RFR) – used post‑incident to compare a still image (e.g., CCTV) against custody images on the Police National Database;
  • Live facial recognition (LFR) – compares faces from a live feed against a targeted watchlist to locate individuals of interest; and
  • Operator‑initiated facial recognition (OIFR) – enables officers to photograph a person and check their identity against police databases.

All forces can use RFR. LFR is deployed by a smaller number, typically at transport hubs, shopping districts, and major events. OIFR is the newest capability and is expanding. These tools offer speed, efficiency and enhanced identification when compared with manual processes.

The Government White Paper (26 January 2026) on policing reforms includes proposals for expanded use of AI-enabled technology, together with major investments: £26 million for a national facial recognition system, £11.6 million for LFR expansion, and £115 million over three years for a National Centre for AI in Policing, together supporting wider operational rollout including up to 50 LFR vans across England and Wales.

Public trust and confidence

LFR has become a focal point for civil‑liberties groups, academics, and regulators, with concerns centred on privacy, misuse, over‑broad deployments, levels of human oversight, and the risk of misidentification, particularly affecting minority groups.

Legal context

Recent judicial decisions have shaped the boundaries governing police use of live facial recognition, particularly under the European Convention on Human Rights (ECHR), the Data Protection Act 2018 (DPA) and the public sector equality duty. Courts have confirmed that LFR involves processing biometric data and therefore requires a clear and legally adequate basis for such processing. This necessitates careful preparation of Data Protection Impact Assessments (DPIAs), Equality Impact Assessments (EIAs), transparent governance arrangements, and strict criteria for compiling and managing watchlists.

In R (Bridges) v Chief Constable of South Wales Police (“Bridges”), the Court of Appeal held that police use of facial recognition technologies required clear and foreseeable rules to satisfy proportionality and equality obligations. As a result, the College of Policing issued national Authorised Professional Practice (APP) guidance in March 2022, establishing that LFR should be targeted, intelligence‑led, time‑bound, and used only where less intrusive methods would not achieve the policing aim.

This guidance is now widely used, but whether it is sufficient to meet legal standards in contemporary deployments is the question that Thompson now places before the courts.

Thompson case overview

The claimants in Thompson, supported by Big Brother Watch, challenge the Metropolitan Police’s policy for deploying LFR, primarily on grounds that it is incompatible with Articles 8, 10 and 11 ECHR (privacy, expression and assembly). They argue that the policy does not sufficiently constrain where LFR can be used, when deployments are justified, or how individuals are selected for inclusion on watchlists. Their challenge highlights alleged deficiencies in oversight, transparency and the handling of false matches, particularly during large‑scale deployments where thousands of people may be scanned.

Thompson marks the next major test of the UK’s LFR governance framework. Whereas Bridges established the principle that police forces must operate under “clear and foreseeable” rules, Thompson examines whether the post‑Bridges professional guidance and internal LFR policies meet that standard in practice. This is an especially significant question, as deployments have since grown in frequency, sophistication and scale.

The case raises a fundamental question: is guidance alone, even when detailed and widely adopted, sufficient to regulate this evolving technology?

Judgment has not yet been handed down, so the answer will have to wait.

What lies ahead?

Police forces currently rely on common‑law powers and data‑protection principles, but the direction of travel, based on a recent Home Office consultation, is toward a bespoke statutory framework governing when and how biometric technologies can be used and how accountability is maintained, including:

  • codified authorisation pathways for LFR deployments;
  • standardised watchlist governance, including documented criteria, verification and retention policies;
  • increased reliance on quantitative accuracy evidence to justify operational thresholds;
  • capability‑specific impact/equality assessments across facial recognition capabilities; and
  • routine transparency measures, including publication of deployment rationales, locations and outcomes.

These developments would enhance accountability and build public trust while supporting lawful and proportionate use of a powerful policing technology.

Please get in touch with the team below if you would like to discuss any of the issues raised in this article.

Thank you to Jenny Leonard, Gabriella Rasiah and Natalie Parnaby for their contribution in producing this article. 
