Report Blasts London Police Use of Facial Recognition Cameras

(CN) — A team of researchers found troubling flaws in facial recognition cameras deployed by London police over the past three years, including a very high rate of wrongly identifying people as criminals. The researchers concluded that the use of real-time facial scanning is probably illegal under British law.

The University of Essex issued its critical report on Wednesday after spending months working with London's Metropolitan Police Service as independent researchers. The police force commissioned the study.

London police experimented with real-time facial recognition on 10 occasions between 2016 and 2019, the report found. British police have used the cameras to scan faces in public spaces, including at soccer matches and street festivals, matching them in real time against the faces of wanted criminals.

The cameras are becoming more popular around the world, especially in China, where they are used to catch not just criminals but also jaywalkers. In China, it is also common for customers to pay for purchases by looking into such cameras.

But human rights experts warn that the spread of their use is a dangerous infringement on privacy and basic rights and will lead to mistaken detentions by police.

The University of Essex team said they found “significant flaws” with the cameras used by London police. Most disturbing was the high rate of false matches.

During the 10 occasions when the cameras were used, they flagged 46 people as matches to wanted criminals, the report said. Of these matches, police felt confident enough to stop 22 people and ask for identification. Fourteen of those 22 stops were wrong matches and eight were correct, the report said. In all, the report concluded, the cameras picked the wrong person 81% of the time.

Additionally, the report found that police stopped people who had already had their criminal cases resolved, revealing that the watch lists used by the police were out of date.

The report said it is “highly possible” that the use of the cameras “would be held unlawful if challenged in court.”

The lead researchers, criminologist Pete Fussey and human rights law expert Daragh Murray, advised the London police to stop using the cameras.

“It is essential that human rights compliance is ensured before deployment, and that there be an appropriate level of public scrutiny and debate on a national level,” the university said in presenting the report.

Duncan Ball, the deputy assistant commissioner of London police, said in a statement to Courthouse News on Friday that it was “right that we trial emerging technology” to catch violent criminals.

He added: “We are extremely disappointed with the negative and unbalanced tone of this report.”

He said the cameras have “the potential to help our officers locate criminals who are wanted for serious and violent offenses, such as knife and gun crime, and the sexual exploitation of children.”

Ball said officers were careful to carry out “checks and balances” before “any police action was taken.” He praised the cameras for being able to detect some criminals.

“The final decision to engage with an individual flagged by the technology was always made by a human,” Ball said.

He said the London police acted lawfully in deploying the technology on a trial basis.

“We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer,” he said.

But Murray, the human rights law expert at the University of Essex, warned against use of the technology.

“This report raises significant concerns regarding the human rights law compliance of the trials (of the cameras' use),” Murray said. “The legal basis for the trials was unclear and is unlikely to satisfy the ‘in accordance with the law’ test established by human rights law.”

The researchers added that “there is no explicit legal authorization for the use of live facial recognition in domestic law.” In other words, the British Parliament has not passed a law specifically allowing the use of the technology by police.

Murray said the police failed “to identify human rights harms or to establish the necessity of” the technology before deploying it.

“Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset, and was not an integral part of the process,” he said.

The researchers were given broad access to police operations during the last six occasions when police used the cameras, between June 2018 and February 2019.

By June 2018, use of the cameras had become a subject of controversy and faced legal challenges in London and in Wales.

Police in London used the technology on several occasions, including during the Notting Hill Carnival, a major street festival that celebrates Britain’s Caribbean heritage.

Critics of the technology say the cameras have a chilling effect on democracy and freedom of expression and erode the right to privacy. They call it an Orwellian form of surveillance, a reference to George Orwell’s novel “1984,” in which citizens are watched over by an authoritarian regime ruled by the all-powerful figure Big Brother.

The cameras use facial recognition software that measures a scanned face and converts it into a numerical map. Once this numerical map of a face is created, an algorithm seeks a match with facial images in a database.
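
For readers curious about the matching step described above, here is a minimal sketch in Python. It is not the Metropolitan Police’s actual system: the 128-number “face map,” the watch-list names, the distance metric and the 0.6 threshold are all hypothetical stand-ins for whatever the deployed software uses.

```python
# Minimal illustrative sketch of numerical face-map matching. Everything here
# (the 128-number "face map," the watch-list names, the Euclidean distance
# metric, and the 0.6 threshold) is a hypothetical stand-in, not the
# Metropolitan Police's actual system.
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical watch list: each wanted person's face reduced to 128 numbers.
watchlist = {f"wanted_person_{i}": rng.normal(size=128) for i in range(5)}

def best_match(probe, database, threshold=0.6):
    """Return the closest watch-list entry if it is within the threshold."""
    names = list(database)
    distances = [np.linalg.norm(database[name] - probe) for name in names]
    i = int(np.argmin(distances))
    if distances[i] < threshold:
        return names[i], distances[i]   # candidate match: flag for officers
    return None, distances[i]           # no match: the face map is discarded

# A face scanned from the crowd -- here, a noisy copy of one watch-list entry.
probe = watchlist["wanted_person_2"] + rng.normal(scale=0.01, size=128)
print(best_match(probe, watchlist))  # ('wanted_person_2', ~0.11)
```

In outline, a real system works the same way: convert each face in view to a numeric code, compute distances to every code on the watch list, and flag the nearest entry only if it clears a confidence threshold. The false matches described in the report occur when an innocent passerby’s face happens to fall inside that threshold.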

“The face of each and every person passing by an automated facial recognition camera will be scanned and analyzed, effectively subjecting every person within view to a biometric identity check,” Big Brother Watch, a London-based nongovernmental privacy advocacy group, said in a report last year.

That report also said that real-time use of the technology is akin to having the general public “asked for their papers without their consent.”

(Courthouse News reporter Cain Burdeau is based in the European Union.)
