Brian Sims
Editor

Facial recognition technology report finds no risk of bias or discrimination

NATIONAL PHYSICAL Laboratory testing of facial recognition system algorithms employed by the Metropolitan Police Service and South Wales Police has concluded that the technology can be used in a way that does not discriminate based on gender, age or ethnicity. As a result, live facial recognition cameras will now be brought back into use.

Thanks to prior testing conducted by the National Institute of Standards and Technology, the Metropolitan Police Service and South Wales Police knew that the facial recognition technology deployed was underpinned by a high-performing algorithm.

The aim of the testing was to develop an in-depth understanding of the performance of the algorithms when used in operational environments. The three policing use cases were live facial recognition, retrospective facial recognition and operator-initiated facial recognition.

In terms of live facial recognition, a live camera feed captures facial images which are then compared with a pre-determined ‘Watch List’ in order to pinpoint individuals who appear on that list. Such lists feature people suspected of involvement in crimes and deemed likely by the police service to be at a location where the facial recognition cameras are being deployed. Data associated with a match is held for a period of up to 24 hours. In the event of no match, the data is immediately and automatically deleted.

The National Physical Laboratory test plan was specifically designed to help identify any impact this technology may have on any protected characteristics, with a particular focus on race, age and gender.

Entitled ‘Facial Recognition Technology in Law Enforcement: Equitability Study’, the National Physical Laboratory’s 34-page report – authored by Dr Tony Mansfield – delivers an “impartial, scientifically underpinned and evidence-based analysis” of the performance of the facial recognition algorithm currently used by the Metropolitan Police Service and South Wales Police.

It emerges that there are settings at which the algorithm can be operated whereby there is “no statistical significance” in performance differences between demographic groups. There’s no demographic performance variation for retrospective facial recognition or operator-initiated facial recognition.

According to the National Physical Laboratory, the test results will assist both forces in further understanding how to use facial recognition technology fairly in order to detect and prevent crime, safeguard national security and keep people safe.

Statement from New Scotland Yard

The Metropolitan Police Service has issued a detailed statement on the National Physical Laboratory’s report, which was published on Wednesday 5 April.

The response from New Scotland Yard reads: “When used at a threshold setting of 0.6 or above, correct matches (ie the True Positive Identification Rate) were 89%. The incorrect match rate (ie the False Positive Identification Rate) was 0.017%. The chance of a false match is just one in 6,000 people walking in front of the camera.”
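For context, the two quoted figures are consistent with one another: a False Positive Identification Rate of 0.017% works out at roughly one false match per 6,000 faces scanned. A minimal sketch of that arithmetic (the variable names are illustrative, not taken from the report):

```python
# Figures quoted by the Metropolitan Police Service at a 0.6 threshold
false_positive_rate = 0.017 / 100   # 0.017% expressed as a probability

# Expected number of faces scanned per single false match
faces_per_false_match = 1 / false_positive_rate

print(round(faces_per_false_match))  # about 5,882, i.e. roughly 1 in 6,000
```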

The statement continues: “When used at a threshold setting of 0.6 or above, any differences in matches across groups were not statistically significant. This means performance was the same across race and gender. In terms of retrospective facial recognition, the true positive identification rate for high-quality images was 100%. This is the best performance possible.”

Lindsey Chiswick, director of intelligence for the Metropolitan Police Service, commented: “Live facial recognition technology is a precise tool for community crime fighting. Led by intelligence, we place our effort where it’s likely to have the greatest effect. This technology enables us to be more focused in our approach to tackling crime.”

Chiswick continued: “This is a significant report for policing as it’s the first time we have had independent scientific evidence to advise us on the accuracy of – and any demographic differences generated by – our facial recognition technology. We commissioned the work for a better understanding of that technology. Certainly, the National Physical Laboratory’s scientific analysis has given us a greater insight into its performance for future deployments.”

Further, Chiswick noted: “All matches are manually reviewed by an officer. If the officer thinks it’s a match, a conversation will follow in order to check the detail. The study was certainly large enough to ensure that any demographic differences would be seen.”

In conclusion, Chiswick said: “We understand the concerns raised by some groups and individuals about emerging technologies and the potential for bias. We’ve listened to these voices. This research means that we better understand the performance of our algorithm. We understand how we can operate to ensure the technology’s performance across race and gender is equal.”

Pause in usage

South Wales Police had paused its use of facial recognition technology following a Court of Appeal judgement handed down in 2020 that highlighted the need for more work to be undertaken in order to confirm the software does not discriminate based on race or gender.

South Wales Police chief constable Jeremy Vaughan has reaffirmed the force’s commitment to the use of facial recognition technology, which is now going to be reintroduced in the wake of the report.

“My priority will always be to protect the public,” asserted Vaughan, “while relentlessly pursuing those determined to cause harm in our communities. It’s important to use new technology to help us achieve that aim. It’s also right and proper that our use of technology is subject to legal challenge and scrutiny. The work that has been carried out to scrutinise and test this technology affords me confidence that we are meeting our equality obligations.”

Vaughan went on to state: “The National Physical Laboratory study confirms that the way in which South Wales Police uses the technology does not discriminate on the grounds of gender, age or race. This reinforces my long-standing belief that the use of facial recognition technology is a force for good. It will help us to keep the public safe and assist us in identifying serious offenders in order to protect our communities from individuals who pose significant risks.”

Concluding his observations, Vaughan said: “I believe members of the public will continue to support our use of all the available methods and technology to keep them safe. Thanks to the work of the National Physical Laboratory and the results of its independent evaluation, we are now in a stronger position than ever before to be able to demonstrate that the use of facial recognition technology is fair, legitimate, ethical and proportionate.”

Prior to the Court of Appeal challenge three years ago, live-time deployments of facial recognition in the force area resulted in 61 people being arrested for offences including robbery, violence, theft and failure to respond to court warrants. The deployments took place at major sporting and public events in Cardiff and Swansea and were also used in support of operations orchestrated to tackle local criminality.

Liberty and Big Brother Watch respond

Responding to the National Physical Laboratory report on facial recognition technology, Katy Watts (lawyer at Liberty) said: “We should all be able to live our lives without the threat of being watched, tracked and monitored by the police. Facial recognition technology is a discriminatory and oppressive surveillance tool that completely undermines this basic right.”

Watts went on to comment: “The National Physical Laboratory’s report tells us nothing new. We know that this technology violates our rights and threatens our liberties. We are deeply concerned to see the Metropolitan Police Service ramp up its use of live facial recognition systems. The expansion of mass surveillance tools has no place on the streets of a rights-respecting democracy.”

Focusing on the technology itself, Watts explained: “Facial recognition doesn’t make people safer. Rather, it entrenches patterns of discrimination and sows the seeds of division. History shows us that surveillance technology will always be disproportionately used on communities of colour and, at a time when racism in UK policing has rightly been highlighted, it’s unjustifiable to use technology that will make this even worse.”

On a political note, Watts concluded: “This Government is intent on wrecking privacy rights and monitoring us as well as ripping up existing protections. It’s impossible to regulate for the dangers created by a technology that’s oppressive by design. The safest – and only – thing to do with facial recognition is ban it.”

Madeleine Stone (legal and policy officer at Big Brother Watch) has commented: “Live facial recognition is suspicion-less mass surveillance that turns us into walking ID cards, subjecting innocent people to biometrics-based police identity checks. This report confirms that live facial recognition does harbour significant race and gender bias, but then goes on to state that the police service can use settings to mitigate this. If rolled out right across the UK at some point in time, this could mean tens of thousands of us will be wrongly flagged as criminals and forced to prove our innocence.”

*Read ‘Facial Recognition Technology in Law Enforcement: Equitability Study’ online at https://science.police.uk/research/resources/operational-testing-of-facial-recognition-technology/
