News

‘Urgent clarity’ sought over racial bias in UK police facial recognition technology

News Room · Published 6 December 2025 (last updated 5:58 PM)

The UK’s data protection watchdog has asked the Home Office for “urgent clarity” over racial bias in police facial recognition technology before considering its next steps.

The Home Office has admitted that the technology was “more likely to incorrectly include some demographic groups in its search results”, after testing by the National Physical Laboratory (NPL) of its application within the police national database.

The report revealed that the technology, which is intended to be used to catch serious offenders, is more likely to incorrectly match black and Asian people than their white counterparts.

In a statement responding to the report, Emily Keaney, the deputy commissioner for the Information Commissioner’s Office, said the ICO had asked the Home Office “for urgent clarity on this matter” in order for the watchdog to “assess the situation and consider our next steps”.

Those next steps could include enforcement action, such as fines or a legally binding order to stop using the technology, as well as working with the Home Office and police to make improvements.

Keaney said: “Last week we were made aware of historical bias in the algorithm used by forces across the UK for retrospective facial recognition within the police national database.

“We acknowledge that measures are being taken to address this bias. However, it’s disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.

“While we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust. The ICO is here to support and assist the public sector to get this right.”

Police and crime commissioners said publication of the NPL’s finding “sheds light on a concerning inbuilt bias” and urged caution over plans for a national expansion, which could include cameras being placed at shopping centres, stadiums and transport hubs, without putting in place adequate safeguards.

The findings were released on Thursday, hours after Sarah Jones, the policing minister, had described the technology as the “biggest breakthrough since DNA matching”.

Facial recognition technology scans people’s faces and cross-references the images against watchlists of known or wanted criminals. It can be used to examine live footage of people passing cameras, comparing their faces with those on wanted lists, or to enable officers to target individuals as they walk past mounted cameras.

Police officers can also retrospectively run images of suspects through police, passport or immigration databases to identify them and check their backgrounds.
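In broad terms, both the live and retrospective modes reduce to the same operation: compare a probe face against a gallery and flag anything above a similarity threshold. The sketch below illustrates that idea only; the function name, embedding vectors and threshold are hypothetical and are not drawn from the police national database system or the NPL report.

```python
import numpy as np

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return indices of watchlist face embeddings whose cosine
    similarity to the probe embedding exceeds the threshold.

    Every hit is only a *candidate* match for human review; a hit
    against someone who is not actually the probe subject is a
    false positive, the error type measured in the NPL testing."""
    probe = probe / np.linalg.norm(probe)          # normalise probe vector
    wl = watchlist / np.linalg.norm(watchlist, axis=1, keepdims=True)
    scores = wl @ probe                            # cosine similarities
    return [i for i, s in enumerate(scores) if s > threshold]
```

Lowering the threshold surfaces more candidates but raises the false positive rate, which is why the NPL results quoted below are reported "at a lower setting".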

Analysts who examined the police national database’s retrospective facial recognition technology tool at a lower setting found that “the false positive identification rate (FPIR) for white subjects (0.04%) is lower than that for Asian subjects (4.0%) and black subjects (5.5%)”.

The testing found that the number of false positives for black women was particularly high. “The FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%),” the report said.
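The FPIR figures above are simple ratios: the fraction of searches for people who are not in the database that nonetheless return an incorrect match. A minimal sketch of that arithmetic, with illustrative counts that are not taken from the NPL report:

```python
def false_positive_id_rate(false_positives, non_matching_searches):
    """FPIR: share of searches for subjects NOT in the database
    that still return at least one (incorrect) candidate match."""
    return false_positives / non_matching_searches

# Illustrative only: at the reported rates, 1,000 searches would
# yield roughly 0.4 false matches for white subjects (0.04%) but
# about 99 for black female subjects (9.9%).
```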

Responding to the report, a Home Office spokesperson said the department took the findings “seriously”, and had already taken action, including procuring and testing a new algorithm “which has no statistically significant bias”.

“Given the importance of this issue, we have also asked the police inspectorate, alongside the forensic science regulator, to review law enforcement’s use of facial recognition. They will assess the effectiveness of the mitigations, which the National Police Chiefs’ Council supports,” the spokesperson said.

Illustration: Guardian Design / Rich Cousins

Copyright © All Rights Reserved. World of Software.