London Assembly member: Police should halt facial-recognition technology use | Computer Weekly

News Room | Published 11 February 2026
The Metropolitan Police’s rapid “unchecked” expansion of live facial-recognition (LFR) technology is taking place without clear legal authority and with minimal public accountability, says Green London Assembly member Zoë Garbett in a call for the force to halt its deployments of the controversial technology.

Made during an ongoing government consultation on a legal framework for the technology, Garbett’s call for the force to immediately halt its deployments of LFR is informed by concerns around its disproportionate effects on Black and brown communities, a lack of specific legal powers dictating how police can use the tech, and the Met’s opacity around the true costs of its deployments.

Garbett’s intervention also comes as the High Court is considering the lawfulness of the Met’s approach to LFR, and whether it has effective safeguards or constraints in place to protect people’s human rights from the biometric surveillance being conducted.

“Live facial-recognition technology subjects everyone to constant surveillance, which goes against the democratic principle that people should not be monitored unless there is suspicion of wrongdoing,” said Garbett, adding that there have already been instances of “real harm”, including children being wrongly placed on watchlists, and the disproportionate targeting and misidentification of Black Londoners.

“These invasive tools allow the police to monitor the daily lives of Londoners, entirely unregulated and without any safeguards. The Met repeatedly claim that live facial recognition is a success, yet they continue to withhold the data required to scrutinise those claims.

“It makes no sense for the home secretary to announce the expansion of live facial recognition at the same time as running a government consultation on the use of this technology. This expansion is especially concerning given that there is still no specific law authorising the use of this technology.”

Highlighting in a corresponding report how facial-recognition technology “flips the presumption of innocence” by turning public spaces into an “identification parade”, Garbett also outlined ways in which both the Met and the Home Office can make its use safer in lieu of a full-blown ban.

This includes creating primary legislation with “strict controls” that limits LFR to the most serious crimes and bans its use by non-law enforcement public authorities or the private sector; and openly publishing deployment assessments so that watchlist creation, location choice and tactical decisions are publicly available for Londoners to review.

On watchlist creation specifically, Garbett dismissed the police claim that LFR is a “precise” tool, highlighting how nearly every watchlist used is larger than the one preceding it.

Highlighting how the number of faces being scanned by the Met is “increasing at a near exponential rate”, Garbett likened the force’s watchlist tactics to a “fishing trawler” that the Met keeps adding to in the hope of finding people.

“Data suggests that rather than making a new unique watchlist for each deployment based on the likelihood of people being in the area of the deployment, it seems from the outside that the MPS is just adding additional people on to a base watchlist [it has],” she said.

Garbett also called on the Met to publish the true financial and operational costs of all LFR deployments, arguing that the force has not only failed to provide a compelling business case for the technology, but is actively obfuscating this information.

“The MPS has a history of a lack of transparency. This is perhaps best summarised by Baroness Casey in her review of the MPS where she said, ‘The Met itself sees scrutiny as an intrusion. This is both short-sighted and unethical. As a public body with powers over the public it needs to be transparent to Londoners for its actions to earn their trust, confidence and respect’,” said Garbett.

She added that while freedom of information requests returned in mid-2023 revealed the force had, up until that point, spent £500,000 on the tech, it is impossible to verify the Met’s claims that LFR is delivering a greater impact on public safety without up-to-date, reliable figures.

“The NHS wouldn’t be able to roll out a new treatment without being able to prove it was worthwhile and effective, but it seems that the police operate under their own rules and seemingly answer to no one,” said Garbett.

Computer Weekly contacted the Met about Garbett’s report. A spokesperson said that LFR “has taken more than 1,700 dangerous offenders off the streets since the start of 2024, including those wanted for serious offences, such as violence against women and girls. This success has meant 85% of Londoners support our use of the technology to keep them safe.

“It has been deployed across all 32 boroughs in London, with each use carefully planned to ensure we are deploying to areas where there is the greatest threat to public safety. A hearing into our use of live facial recognition has taken place and we look forward to receiving the High Court’s decision in due course. We remain confident our use of LFR is lawful and follows the policy which is published online.”

A lack of meaningful consultation so far

While the use of LFR by police – beginning with the Met’s deployment at Notting Hill Carnival in August 2016 – has already ramped up massively in recent years, there has so far been minimal public debate or consultation, with the Home Office claiming for years that there is already a “comprehensive” legal framework in place.

The lack of meaningful engagement with the public by police and government over facial recognition is reflected in Garbett’s report. She highlights, for example, that Newham Council unanimously passed a motion in January 2023 to suspend the use of LFR throughout the borough until biometric and anti-discrimination safeguards are in place.

While the motion highlighted the potential of LFR to “exacerbate racist outcomes in policing” – particularly in Newham, the most ethnically diverse of all local authorities in England and Wales – both the Met and the Home Office said that they would press forward with the deployments anyway.

“Since that motion was passed, LFR has been used 31 times in Newham by the MPS,” said Garbett.

On the deployment of permanent LFR cameras mounted to street furniture in Croydon, Garbett added that while the Met promised it would consult with the local community, councillors there have told her the force did not follow through with this consultation.

The technology was similarly rolled out in Lewisham without meaningful consultation, despite the Met’s claims to the contrary.

However, in December 2025, the Home Office launched a 10-week consultation on the use of LFR by UK police, allowing interested parties and members of the public to share their views on how the controversial technology should be regulated.

The department has said that although a “patchwork” legal framework for police facial recognition exists (including for the increasing use of the retrospective and “operator-initiated” versions of the technology), it does not give police themselves the confidence to “use it at significantly greater scale…nor does it consistently give the public the confidence that it will be used responsibly”.

It added that the current rules governing police LFR use are “complicated and difficult to understand”, and that an ordinary member of the public would be required to read four pieces of legislation, police national guidance documents and a range of detailed legal or data protection documents from individual forces to fully understand the basis for LFR use on their high streets.

Consultation responses

In a section on how people can respond to the Home Office’s LFR consultation, Garbett urged people to call for its ban, adding that further protections in lieu of one could include requiring a warrant before someone is placed on a watchlist, and limiting its use to “the most serious and urgent crime purposes”.

She noted that, as it stands, the Met has not used LFR to make any terror-related arrests, with the most common offences being variations on theft or breaches of court orders.

“In a recent press release, the lead example the MPS give for how they have used LFR is using it to arrest a 36-year-old woman who was wanted for failing to appear at court for an assault in 2004 when they were probably 15 years old,” she said. “The public might feel differently about LFR if they knew it was being used on cases such as these.”

On the permanent installation of LFR cameras in Croydon, Garbett added that while the police have said they are only switched on when an operation is taking place, “there is still the potential for 24/7 monitoring, with Londoners unable to tell if the cameras are operational or not. This makes the feeling of being under surveillance in London feel routine and begins to be a slippery slope to preventative policing and a blurry line between safety and social control.”  

Garbett concluded that the rapid deployment of LFR must stop until safeguards are in place to protect people’s rights: “I urge everyone to respond to the government consultation and use the guide I’ve prepared to make sure we have a say in how this technology is used going forward.”

Computer Weekly contacted the Home Office about the contents of Garbett’s report and its decision to massively expand facial-recognition deployments before concluding its consultation, but received no response.
