Essex Police has suspended its live facial recognition (LFR) cameras. A University of Cambridge study found the force's Corsight Apollo 4 system was more likely to correctly identify Black participants than people from other ethnic groups. The suspension comes as the UK government plans to expand LFR vans from 10 to 50 and make them available to every force in England and Wales.
Researchers tested 188 volunteers in Chelmsford on March 12, 2026. Participants walked past active cameras mounted on a marked police van. Roughly 50% of volunteers on the watchlist were correctly matched. False positives (innocent people wrongly flagged) stayed under 1%.
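Those two figures use different denominators, which is easy to miss: the match rate is measured against people on the watchlist, the false positive rate against everyone else. A minimal sketch with purely illustrative numbers (the trial's exact counts are not published here):

```python
# Two rates, two denominators (our reading of the trial's figures):
# - match rate: correctly flagged volunteers / volunteers on the watchlist
# - false positive rate: wrongly flagged people / people NOT on the watchlist
def match_rate(true_matches: int, watchlist_size: int) -> float:
    return true_matches / watchlist_size

def false_positive_rate(false_alerts: int, non_watchlist_faces: int) -> float:
    return false_alerts / non_watchlist_faces

# Illustrative numbers only, not the trial's actual counts:
# 40 of 80 watchlist volunteers matched; 9 false alerts among 1,000 other faces.
print(f"match rate: {match_rate(40, 80):.0%}")                     # 50%
print(f"false positive rate: {false_positive_rate(9, 1000):.1%}")  # 0.9%
```

A system can therefore post a sub-1% false positive rate while still missing half the people it is meant to find.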
Performance varied by demographic group. Men were identified at roughly 55% versus 42% for women. Black participants were matched at a higher rate than any other ethnicity. Of the six false positives recorded, four involved Black participants despite Black people making up only 24% of the sample. Cambridge researchers said this distribution was "unlikely to be due to chance alone."
If you're an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you're Black. To me, that warrants further investigation.
— Dr Matt Bland, criminologist, University of Cambridge
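The researchers' "unlikely to be due to chance alone" remark can be sanity-checked with a simple binomial calculation. This is our own back-of-envelope illustration, not the study's published method; it assumes each of the six false positives is an independent draw from a sample that is 24% Black:

```python
from math import comb

# Back-of-envelope check of the "chance alone" claim. Assumption (ours):
# each false positive is an independent draw from a sample that is 24% Black.
n, k, p = 6, 4, 0.24  # 6 false positives, 4 involving Black participants

# Probability that 4 or more of the 6 false positives land on the 24% group
tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
print(f"P(>= {k} of {n} at base rate {p:.0%}) = {tail:.3f}")  # ~0.033
```

Under those assumptions the observed skew has roughly a 3% probability, consistent with the researchers' wording.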
The bias found in the Cambridge study differs from the commonly discussed failure mode. Prior research (NIST 2019, MIT Media Lab) documented algorithms misidentifying individuals of colour at higher rates, leading to wrongful stops. The Essex finding is the reverse. The system catches Black suspects more reliably while letting others pass unrecognised. Both outcomes produce racially unequal enforcement of the same watchlist.
A second evaluation came from the National Physical Laboratory (NPL). NPL found Black men were most likely to be matched and White men least likely, directionally consistent with Cambridge, but concluded the gap was "not statistically significant." The difference in conclusions reflects methodology: NPL used ISO-standard lab conditions with static images, while Cambridge tested real-world conditions with moving volunteers.
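Sample size compounds the methodological differences. As a rough illustration (assuming a hypothetical even gender split, since subgroup sizes are not given above), a standard two-proportion z-test on the 55% versus 42% match rates reported for men and women shows how even a 13-point gap in a 188-person trial can fall short of conventional significance:

```python
from math import sqrt, erf

def two_prop_z(p1: float, n1: int, p2: float, n2: int) -> tuple[float, float]:
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical even split of the 188 volunteers (actual subgroup sizes
# were not published here): 94 men matched at 55%, 94 women at 42%.
z, p = two_prop_z(0.55, 94, 0.42, 94)
print(f"z = {z:.2f}, p = {p:.3f}")  # z ~ 1.78, p ~ 0.075: not significant at 0.05
```

Small samples can leave real disparities statistically unresolved, which is one way two evaluations of the same system can reach opposite conclusions about the same directional finding.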
Essex Police acted on the study that found bias, not the one that cleared it.
Between August 2024 and February 2025, Essex Police scanned approximately 1.3 million faces using Corsight Apollo 4. Those scans produced 48 arrests. One arrest per 27,000 individuals processed. For every person arrested, 27,000 had their biometrics captured and compared against a watchlist without consent.
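The headline ratio follows directly from the two published figures:

```python
scans = 1_300_000  # faces scanned by Essex Police, Aug 2024 to Feb 2025
arrests = 48

print(f"{scans / arrests:,.0f} scans per arrest")      # 27,083
print(f"arrest rate: {arrests / scans:.4%} of scans")  # 0.0037%
```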
A wrongful arrest last month showed these errors have real consequences. UK police detained a man for a burglary in a city he had never visited, 100 miles from his home. Retrospective face-scanning software confused him with another person of South Asian heritage.
The Essex pause collides with aggressive government expansion. In late January 2026, Home Secretary Shabana Mahmood announced plans to make LFR vans available to every force. The three-year AI policing budget totals £115 million. An additional £26 million goes to a national facial recognition platform. A new National Centre for AI in Policing, branded "Police.AI," will build and test models for law enforcement.
At least 13 UK police forces already run live facial recognition. Between January 2024 and September 2025, the technology contributed to over 1,300 arrests in London alone.
Mahmood described her criminal justice vision at a Tony Blair Institute event in December 2025. She said her goal was "to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his panopticon. That is that the eyes of the state can be on you at all times." Her words drew a sharp reaction from within her own party. Labour peer Baroness Chakrabarti, former Liberty director, told the Lords she had assumed the quotes were "fake news."
The Information Commissioner's Office (ICO) conducted its own audit and found Essex Police provides "reasonable data protection assurance" for LFR. It warned all forces that bias testing must be routine. "Without this, there is a real risk of unfairness," the regulator said.
The fact that Essex Police has put its use of live facial recognition cameras on hold due to concerns over racial bias should serve as a wake-up call for the Government and police. The body of evidence continues to grow on the discrimination within the technology, which is in widespread use by police forces across the UK.
— Akiko Hart, Director, Liberty
Police across the country must take note of this fiasco. AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.
— Jake Hurfurt, head of research, Big Brother Watch
Essex Police said it has "revised its policies and procedures." The force is "now confident" it can resume LFR to "trace and arrest wanted criminals." A remediation plan will be published within 30 days. The force has not said what changes, if any, were made to the algorithm during the pause.
Based on the fact there was potential bias, the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software.
— Essex Police spokesperson
No mandatory bias-testing framework exists for UK police use of facial recognition. Each force makes its own procurement decisions, commissions evaluations (or does not), and interprets the results independently. The expansion budget will scale the technology to every force in England and Wales. Cambridge tested 188 volunteers at one site. The 1.3 million faces Essex scanned before that study were never checked for demographic bias at all.