Essex Police Halts Live Facial Recognition Amid Concerns Over Accuracy and Bias
Essex Police has temporarily suspended its deployment of live facial recognition (LFR) technology following the identification of potential issues related to accuracy and discriminatory bias. The biometric system, supplied by Israeli company Corsight, was put on hold in response to findings in an audit report released by the Information Commissioner’s Office (ICO). The ICO emphasized that the force must address and mitigate these risks before resuming use.
Background and Initial Concerns
Records indicate that Essex Police last utilized the LFR system on 26 August 2025, with operations already paused by the time the ICO conducted its audit in November of the same year. The exact reasons behind the suspension remain unclear, but investigative reporting in May 2025 revealed that the force had not adequately assessed the technology’s potential for discriminatory impact. This conclusion was drawn after privacy advocacy group Big Brother Watch obtained an equality impact assessment (EIA) through Freedom of Information requests, which was widely criticized for its superficial and inconsistent analysis.
Experts condemned the EIA for failing to examine systemic equality implications and for relying on data from unrelated facial recognition algorithms used by other police forces, which were trained on different demographic groups. Additionally, Essex Police faced scrutiny for uncritically echoing Corsight’s claims that its system was free from bias. Notably, the National Institute of Standards and Technology (NIST), whose publicly available test results are a global benchmark for facial recognition accuracy, held no evidence supporting Corsight’s assertions of superior accuracy or minimal bias.
Big Brother Watch argued that these shortcomings likely meant Essex Police did not fulfill its public sector equality duty, which mandates consideration of how policies and practices might disproportionately affect certain groups.
Independent Evaluations Shed Light on Bias Issues
In response to criticism, Essex Police commissioned independent assessments from the National Physical Laboratory (NPL) and Cambridge University to evaluate the LFR system’s performance. The Cambridge study, published in March 2026, revealed that the technology was more accurate in identifying men compared to women and was statistically more likely to correctly recognize Black individuals than other ethnic groups. Criminologist Matt Bland, involved in the research, remarked, “If you’re an offender passing facial recognition cameras as deployed in Essex, your chances of being identified are higher if you are Black. This finding demands further scrutiny.”
Conversely, the NPL’s analysis, also released in March 2026, found that Black men were most accurately matched by the system, with white men being the least accurately identified. However, this disparity was deemed statistically insignificant, suggesting no conclusive evidence of bias.
When questioned about the suspension, Essex Police stated that the conflicting results from the two academic studies prompted it to pause LFR use while collaborating with Corsight to refine the software. The force said that updated policies and procedures have been implemented, and that it is now confident in reintroducing the technology to assist in locating and apprehending wanted individuals, adding that continuous monitoring would ensure no group is unfairly targeted.
Jake Hurfurt, head of research at Big Brother Watch, responded to the suspension by warning, “Law enforcement agencies nationwide must learn from this debacle. Experimental, unproven, or biased AI surveillance technologies have no place in public spaces.”
Escalating Use of Facial Recognition Amid Limited Public Dialogue
Since the Metropolitan Police’s initial use of LFR at the Notting Hill Carnival in 2016, the adoption of this technology by UK police forces has increased significantly. Despite this growth, public consultation and debate have been minimal. The Home Office has long maintained that existing legislation provides a comprehensive framework for LFR use.
However, in December 2025, the Home Office initiated a 10-week public consultation to gather input on regulating police use of LFR, acknowledging that the current legal landscape is fragmented and insufficient. The department admitted that the patchwork of laws governing retrospective and operator-initiated facial recognition does not inspire confidence among police or the public for broader deployment.
The Home Office highlighted the complexity of existing regulations, noting that an average citizen would need to navigate multiple statutes, national police guidelines, and data protection policies from various forces to fully understand how LFR is applied in public spaces.
Despite the ongoing consultation, the Home Office announced in January 2026 plans for a substantial expansion of AI and facial recognition technologies within policing. This includes increasing the number of LFR-equipped vans from 10 to 50, establishing a National Centre for AI in Policing (dubbed Police.AI) to develop and validate AI models for law enforcement, and investing £115 million over three years to accelerate AI innovation in policing.
The ‘Panopticon’ Ambition: A Vision for Constant Surveillance
In a recent discussion with former Prime Minister Tony Blair, UK Home Secretary Shabana Mahmood articulated her goal to harness AI and facial recognition to realize Jeremy Bentham’s concept of the “panopticon.” Bentham’s design envisioned a prison where a single unseen guard could observe all inmates simultaneously, creating a psychological effect of constant surveillance that encourages compliance.
Today, the panopticon metaphor is often associated with authoritarian oversight. Mahmood expressed her aspiration for the criminal justice system to emulate this model through technology, stating, “My ultimate aim was to use AI and technology to ensure the state’s gaze is upon you at all times.”