San Francisco DA Turns to AI to Tame Racial Bias

SAN FRANCISCO (CN) – Hoping to eliminate the unconscious bias that influences criminal charging decisions, San Francisco District Attorney George Gascón unveiled a new open-source tool that promises to “take race out of the equation” for prosecutors.

San Francisco District Attorney George Gascón (L) and Alex Chohlas-Wood explain a new tool designed to help remove bias from the process of deciding whether to file criminal charges. (Maria Dinzeo / CNS)

“We wanted to create something that is above the human touch, something that can actually filter the work to ensure that race is not going to play a role in our decision-making process,” Gascón told a room of reporters at his Hall of Justice headquarters Wednesday.

A new tool developed at no cost to the city by the Stanford Computational Policy Lab scans the police incident reports prosecutors use to decide whether to charge someone with a crime. The program uses an algorithm to remove any information that could be used to determine the suspect’s race – hair and eye color, specific neighborhoods and the names of suspects, victims and the arresting officer – replacing those identifiers with generic terms.
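The article does not publish the lab's code, but the filtering step it describes can be sketched as a simple pattern-based redaction pass. The patterns and neighborhood names below are illustrative assumptions, not the Stanford Computational Policy Lab's actual rules:

```python
import re

# Hypothetical redaction patterns: each maps a race-correlated descriptor
# to a generic placeholder, as the article describes. A production system
# would use a far larger, vetted rule set plus an NER model.
PATTERNS = {
    r"\b(black|brown|blond|blonde|red|gray)\s+hair\b": "[HAIR COLOR]",
    r"\b(brown|blue|green|hazel)\s+eyes\b": "[EYE COLOR]",
    r"\b(Bayview|the Tenderloin|the Mission)\b": "[NEIGHBORHOOD]",  # example names only
}

def redact(report: str) -> str:
    """Replace race-identifying descriptors in a report with generic terms."""
    for pattern, placeholder in PATTERNS.items():
        report = re.sub(pattern, placeholder, report, flags=re.IGNORECASE)
    return report

print(redact("Suspect has black hair and brown eyes, last seen near Bayview."))
# -> Suspect has [HAIR COLOR] and [EYE COLOR], last seen near [NEIGHBORHOOD].
```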

Alex Chohlas-Wood, deputy director of the Stanford Computational Policy Lab, explained that the tool uses both pattern recognition and named entity recognition (NER), which identifies words and classifies them into defined categories such as names and locations. He said the tool will also color-code each classified word, making muddled narratives more coherent.

“So you don’t have to, as an intake attorney, keep track of someone’s name. It’s much easier to read,” Chohlas-Wood said.
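The consistency Chohlas-Wood describes, where an attorney reads a stable generic label instead of tracking a real name, can be sketched as a mapping step applied after NER. The entity list here is hand-written for illustration; in the real system it would come from the NER model, and none of these names or labels are from the actual tool:

```python
def anonymize(report: str, entities: list[tuple[str, str]]) -> str:
    """Replace each entity with a numbered generic label, reused on repeats."""
    labels: dict[str, str] = {}   # entity text -> assigned label
    counts: dict[str, int] = {}   # entity kind -> how many seen so far
    for text, kind in entities:
        if text not in labels:
            counts[kind] = counts.get(kind, 0) + 1
            labels[text] = f"[{kind} {counts[kind]}]"
    for text, label in labels.items():
        report = report.replace(text, label)
    return report

# Illustrative report and entity spans (as an NER model might tag them).
report = "Officer Lee interviewed Jane Roe. Jane Roe said Officer Lee arrived first."
entities = [("Officer Lee", "OFFICER"), ("Jane Roe", "PERSON")]
print(anonymize(report, entities))
# -> [OFFICER 1] interviewed [PERSON 1]. [PERSON 1] said [OFFICER 1] arrived first.
```

Because every mention of the same person resolves to the same label, the reader never needs to remember a name to follow the narrative.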

Example of a police report, which prosecutors use to decide whether to bring criminal charges. (Maria Dinzeo / CNS)

The tool will be used on reports electronically submitted through the office’s general intake department. Prosecutors will make a preliminary charging decision based on the filtered report, then a final decision after reviewing any supplemental information, such as body-camera or other video footage.

Gascón said the tool will also highlight biases within the San Francisco Police Department and other agencies.

“They know that this is going to happen. At this stage it doesn’t really impact them at all because they’re going to send their reports the same way they always have, but I think it’s going to be a tool for them to see their own work,” Gascón said.

Gascón said that while technology must have safeguards in place to protect against misuse, it can also enhance the work police and prosecutors do.

“Technology is a genie that is out of the bottle,” he said. “Anybody who says we’re going to take this or that kind of technology and we’re going to ignore it and put it back in the bottle, guess what, you may as well smoke that new legal stuff that we have here plenty, because that’s not going to happen. Let’s deal with what we have today and take advantage of what we have.”

He added that he hopes prosecutors around the country will also be interested.

“Anybody who wants to use it will have access to the algorithms and certainly we’re making an open invitation to any prosecutor in the country who wants to come here and see how it works here so that whatever mistakes we’re making they can avoid them,” he said.

Example of a police report in which an AI tool has purged information that could spur bias when deciding on criminal charges. (Maria Dinzeo / CNS)
