A UK police force has suspended its deployment of live facial recognition (LFR) technology after a study revealed it was statistically more likely to flag Black people against a watchlist database. Essex Police said it had paused use of the technology while working with the software provider to update the system.
The findings expose a tension that has dogged the rollout of facial recognition technology across British law enforcement. The study used 188 volunteers to act as members of the public in a controlled field experiment during a real police deployment. Black people were 27 per cent more likely to be identified than all other ethnicities, and 31 per cent more likely than white people. The technology was also 14 per cent more likely to spot men than women.
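Figures such as "27 per cent more likely" describe a relative disparity between group identification rates. As a minimal sketch of how such a figure could be derived from controlled-trial data, the per-group rates below are purely hypothetical illustrations, not numbers from the Essex study:

```python
def relative_disparity(rate_group: float, rate_baseline: float) -> float:
    """Return how much more likely one group is to be identified than the
    baseline, as a fraction (e.g. 0.31 means 31 per cent more likely)."""
    return rate_group / rate_baseline - 1.0

# Hypothetical per-group identification rates (identifications per walk-past)
# chosen only to illustrate the calculation.
rate_black = 0.1397
rate_white = 0.1066

disparity = relative_disparity(rate_black, rate_white)
print(f"{disparity:.0%} more likely")  # prints "31% more likely"
```

The same calculation applies to any pair of groups, which is why a study can report one disparity against white participants and a different one against all other ethnicities combined.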

What matters legally is how Essex Police responds. A spokesperson for the force said that, as part of its commitment to the Public Sector Equality Duty, it had commissioned two independent studies carried out by academic researchers. The first indicated a potential bias in the positive identification rate, while the second suggested there was no statistically significant bias in the results. Given the potential bias, the force decided to pause deployments while working with the software provider to review the results and update the software.
The contradictory findings of the two studies underscore the complexity of measuring algorithmic bias. The force sought further academic assessment and, as a result, has revised its policies and procedures. It says it is now confident it can resume deploying the technology in policing operations to trace and arrest wanted criminals, and will continue to monitor all results to ensure there is no risk of bias against any one section of the community.
Yet the suspension comes as the government moves in the opposite direction. Earlier this year, the British government decided the police in England and Wales should increase their use of live facial recognition and artificial intelligence under wide-ranging government plans to reform law enforcement. In a white paper, the Home Office launched plans to fund 40 more LFR-equipped vans in addition to ten already in use. It said they would be used in town centres and high crime hotspots with the government planning to spend more than £26 million on a national facial recognition system and £11.6 million on LFR capabilities.
The disparity between expansion plans and the practical evidence of bias problems reflects a wider challenge in British policing. People name accuracy (53%), proper officer training (35%) and safeguards against bias (33%) as the top factors in regulating police use of facial recognition technology, according to research from the UK's Information Commissioner's Office. All forces should be conducting routine testing for bias and discriminatory outcomes, whether arising from technology design, training data, or watchlist composition. Without this, there is a real risk of unfairness.
The operational effectiveness of the technology adds another dimension. About 1.3 million faces were scanned between August 2024 and February 2025, yielding 48 arrests, roughly one for every 27,000 faces. The low conversion rate raises questions about proportionality, particularly when the technology being deployed has documented bias characteristics.
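The "one for every 27,000 faces" characterisation follows directly from the reported figures; a quick check, using only the scan and arrest totals given in the article:

```python
# Sanity-check the reported scan-to-arrest ratio.
scans = 1_300_000   # faces scanned, Aug 2024 - Feb 2025 (as reported)
arrests = 48        # arrests over the same period (as reported)

faces_per_arrest = scans / arrests
print(f"one arrest per ~{faces_per_arrest:,.0f} faces scanned")
# prints "one arrest per ~27,083 faces scanned"
```

The ratio rounds to the article's figure of about 27,000 scans per arrest.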
Surveillance systems have historically been used to monitor marginalised groups, and recent studies suggest the technology itself contains inherent bias that disproportionately misidentifies women, people of colour, and people with disabilities. Cambridge researchers have gone further, arguing that police use of facial recognition in public spaces should be banned.
Essex Police's suspension, and its planned resumption with updated systems, suggests institutional accountability mechanisms are functional, at least when academic scrutiny surfaces problems. Whether software updates alone resolve the underlying algorithmic biases, or whether the technology requires fundamental rethinking, remains contested among experts. The government's expansion plan will test whether these safeguards hold as deployment accelerates.