
Study Highlights Race & Gender Flaws in Facial Recognition

(CN) — Asian Americans and black women are among the groups disproportionately likely to be misidentified by facial-recognition software, a federal study published Thursday reports.

Using more than 18 million photos drawn from criminal mugshots, visa applications and two other government data sets, researchers with the National Institute of Standards and Technology evaluated 189 software algorithms from 99 developers.

They found that the algorithms are more likely to produce a false-positive result, one that incorrectly associates two different people as the same person, for West African, East African and East Asian faces. False positives are less common among people of Eastern European descent.

When using algorithms developed in China, however, the rate of false positives dropped for East Asian faces.

Patrick Grother, a NIST computer scientist and the report's lead author, pointed out that the findings from the Chinese algorithms may reflect the data used to train them. That is a good sign for future algorithms.

“These results are an encouraging sign that more diverse training data may produce more equitable outcomes, should it be possible for developers to use such data,” Grother said in a statement.

Faces of children and the elderly also showed higher rates of false positives, with the very young and the very old faring worst.

For mugshots, the algorithms showed high false-positive rates for American Indian, African American and Asian faces, according to the report.

Across all algorithms and photo sets, false positives occurred more often for women than for men, with black women showing the highest rate.

The report also examined cases in which an algorithm failed to match two different photos of the same person, referred to as a false negative.

According to the report, mugshots of Asian and American Indian faces had higher error rates than those of Caucasian or African-American individuals.

False negatives were more common for African faces in photos taken at U.S. border entry points. The study notes that camera quality could be a factor, since the cameras used for mugshots typically produce higher-quality images.

NIST tested the algorithms on two tasks. The first was to confirm that a photo matches a different photo of the same person in a database, known as “one-to-one” matching. The second was to determine whether a photo of a person has a match anywhere in a database, known as “one-to-many” matching.
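To make the distinction concrete, here is a simplified, hypothetical sketch in Python; it is not code from the NIST study, and the names, embedding vectors and threshold are invented for illustration. A face-recognition system typically converts each photo to a numeric vector and declares a match when two vectors are similar enough.

    import numpy as np

    # Hypothetical illustration of the two matching tasks described above;
    # not code from the NIST study. Faces are stood in for by made-up
    # embedding vectors, and a match is declared when cosine similarity
    # exceeds an arbitrary threshold.

    THRESHOLD = 0.9  # arbitrary cutoff for declaring a match

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def one_to_one(probe, reference):
        # Verification: does the probe photo show the same person as the reference?
        return cosine_similarity(probe, reference) >= THRESHOLD

    def one_to_many(probe, gallery):
        # Identification: return every gallery entry similar enough to warrant review.
        return [name for name, emb in gallery.items()
                if cosine_similarity(probe, emb) >= THRESHOLD]

    # Made-up embeddings standing in for the output of a face-recognition model.
    alice_visa   = np.array([0.90, 0.10, 0.30])
    alice_border = np.array([0.88, 0.12, 0.28])
    bob_mugshot  = np.array([0.20, 0.95, 0.10])

    print(one_to_one(alice_border, alice_visa))   # True: same person verified
    print(one_to_many(alice_border, {"alice": alice_visa, "bob": bob_mugshot}))  # ['alice']

In this toy setup, a false positive would be a stranger's photo clearing the threshold against someone else's record, and a false negative would be two photos of the same person falling below it.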

Making sure the algorithms do not err in “one-to-many” matching is important to prevent consequences such as false accusations, Grother explained.

“In a one-to-one search, a false negative might be merely an inconvenience — you can’t get into your phone, but the issue can usually be remediated by a second attempt,” Grother said in a statement. “But a false positive in a one-to-many search puts an incorrect match on a list of candidates that warrant further scrutiny.”

Grother further stressed the consistency of the findings and their importance as facial-recognition technology becomes more widespread.

“While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” Grother said in a statement. “While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.”

Though the study did not investigate the causes of the mismatches, it did point to possible technical factors such as camera limitations and photo quality.

The photos were accompanied by information about each person’s age, sex and either race or country of birth.

NIST conducted the study through its Face Recognition Vendor Test program, which evaluates face-recognition algorithms submitted by academic and industry developers.

It is the first time the agency has examined how well facial-recognition technology identifies people across different demographic groups.
