Facial-Recognition Tools Banned for Boston Police

A video surveillance camera hangs on a pole outside City Hall in Springfield, Mass., last year. (AP Photo/Matt O’Brien)

BOSTON (CN) — The Boston City Council voted unanimously Wednesday to prohibit police from using facial-recognition technology, making Boston the second-largest city in the country, after San Francisco, to do so.

The move comes as the American Civil Liberties Union and other critics sharpen their evidence that the tool carries inherent racial biases, returning false positives for black people at disproportionately high rates.

Just this morning, the ACLU lodged a complaint in Detroit over the Jan. 9 arrest of a black man after a police computer mistakenly identified the culprit behind a Shinola watch store burglary.

Detroit police sent a blurry surveillance camera image of the 2018 incident to Michigan State Police, which identified a match with Robert Williams after running it through a system that compared it to a database of driver’s license photos.

Detroit police arrested Robert Williams in January based on a faulty ID from facial-recognition technology linking him to a 2018 robbery. The ACLU brought a complaint Wednesday, June 24, with the city. (Photo by ACLU via Courthouse News)

At the time of the robbery, however, Williams was leaving work in a town 40 minutes from the Shinola store in Detroit.

Boston’s vote will have little practical effect because the police department doesn’t currently use facial recognition. “Until this technology is 100%, I’m not interested in it,” Boston Police Commissioner William Gross said during a hearing earlier this month.

“I didn’t forget that I’m African American and I can be misidentified as well,” Gross added.

Facial-recognition technology is being developed on a large scale by companies such as IBM, Google, Facebook and Microsoft.

Although several companies have recently announced that they will suspend or limit sales to law enforcement because of controversy, the ACLU says about a quarter of the country’s 18,000 police departments have access to the technology.

In Florida, where police search the system some 8,000 times a month, it can access millions of mugshots and driver’s license photos, the ACLU says.

In San Diego, police can snap anyone’s photo on the street with a phone and immediately determine if the person has a mug shot on file.

Apart from false matches, there is a potential for misuse of the system. The Georgetown Center on Privacy and Technology reports that few law enforcement agencies audit their systems’ usage for abuse such as stalking and targeting of political protesters.

“In a time where we’re seeing so much direct action in the form of marches and protests for rights, any kind of surveillance technology that could be used to essentially chill free speech or … more or less monitor activism or activists is dangerous,” Boston City Councilor Ricardo Arroyo said.

But racial bias has drawn the most attention. One study at the MIT Media Lab found that Amazon’s technology, called Rekognition, misclassified black women as men 31% of the time. An IBM system had a similar error rate and Microsoft’s system, while better, made the same mistake about 20% of the time.

In December, a 1,200-page report from the nonpartisan National Institute of Standards and Technology reported similar findings.

“Lawmakers need to stop allowing law enforcement to test their latest tools on our communities, where real people suffer real-life consequences,” said ACLU senior legislative counsel Neema Singh Guliani in a statement. “It’s past time for lawmakers to prevent the continued use of this technology. What happened to the Williams family should never happen again.”

In addition to San Francisco, bans similar to Boston’s have been enacted in five other Massachusetts cities — Springfield, Cambridge, Northampton, Brookline and Somerville — and in Berkeley and Oakland, California.

Beyond police use of the technology, there are concerns about companies making it available to private individuals. For instance, a company called Clearview AI has captured billions of faceprints from social media and other sources and has offered its face-recognition technology to businesses including Best Buy, Walmart and Macy’s.

Several weeks ago the ACLU sued Clearview, claiming the company violates citizens’ right to privacy by allowing secret tracking of domestic-violence survivors, undocumented immigrants, sex workers, and anyone who attends a protest rally or an Alcoholics Anonymous meeting, for instance.

Clearview “will end privacy as we know it if it isn’t stopped,” the ACLU warned.

The Boston City Council vote was 13-0 in favor of banning the technology. The measure now goes to Mayor Marty Walsh; it will automatically become law if he takes no action within 15 days.

One councilman, Michael Flaherty, said that he would be open to the technology if it worked. “In the event that it does get perfected, in the interest of public safety, we’re going to have to take a long hard look at it,” he said.
