Home Office Admits Racial Bias in Police Facial Recognition Technology


    The UK Home Office has acknowledged a significant issue with its facial recognition technology, admitting it is more prone to misidentifying Black and Asian individuals compared to white people. This admission follows testing by the National Physical Laboratory (NPL) on the police national database, raising serious concerns about inherent bias and the potential for discriminatory outcomes in law enforcement.

    Key Takeaways

    • Facial recognition technology shows a higher false positive identification rate for Black and Asian subjects.
    • The Information Commissioner’s Office (ICO) is seeking "urgent clarity" from the Home Office.
    • Police and crime commissioners are urging caution on national expansion plans.
    • The Home Office states a new, unbiased algorithm has been procured and will be tested.

    Inbuilt Bias Revealed

    Recent testing of the facial recognition technology used within the police national database has revealed a concerning inbuilt bias. The National Physical Laboratory (NPL) found that on certain settings, the technology is "more likely to incorrectly include some demographic groups in its search results." Specifically, the false positive identification rate (FPIR) for white subjects was 0.04%, significantly lower than for Asian subjects (4.0%) and Black subjects (5.5%). The report highlighted particularly high false positive rates for Black women, with an FPIR of 9.9% compared to 0.4% for Black male subjects.

    Calls for Safeguards and Transparency

    Police and crime commissioners have described the findings as "shedding light on a concerning inbuilt bias" and are urging caution regarding plans for a national expansion of the technology. They questioned why these findings were not released earlier or shared with affected communities. Civil liberties groups, like Liberty, have echoed these concerns, stating that the "racial bias in these stats shows the damaging real-life impacts of letting police use facial recognition without proper safeguards in place." Calls are being made for a halt to the rapid rollout until robust safeguards, transparency, and meaningful oversight are established.

    Regulatory Scrutiny and Home Office Response

    The UK’s data protection watchdog, the Information Commissioner’s Office (ICO), has requested "urgent clarity" from the Home Office regarding the racial bias findings. The ICO stated that further steps could be considered, potentially including enforcement action such as fines or legally binding orders. A Home Office spokesperson acknowledged the report’s findings, stating that action has already been taken: a new algorithm that "has no statistically significant bias" has been procured and is scheduled for further evaluation early next year. Additionally, the police inspectorate and the forensic science regulator have been asked to review law enforcement’s use of facial recognition technology.
