The Information Commissioner’s Office (ICO) has criticised the Home Office for failing to inform it about historical bias in the facial recognition algorithms used within the Police National Database.
Last week the National Physical Laboratory published a report, commissioned by the Home Office, on its independent testing of facial recognition algorithms.
In its report, it noted that the police had made efforts to reduce bias by establishing training and guidance.
The ICO’s deputy commissioner, Emily Keaney, responded to the report by expressing “disappointment” that, despite the regulator’s regular engagement with the Home Office and police bodies, it had not been informed of this historical issue.
“Last week we were made aware of historical bias in the algorithm used by forces across the UK for retrospective facial recognition within the Police National Database,” said Keaney.
“We acknowledge that measures are being taken to address this bias. However, it’s disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.”
Keaney said that “any perception of bias and discrimination can exacerbate mistrust” and has asked the Home Office for “urgent clarity on this matter so we can assess the situation and consider our next steps”.
