San Francisco D.A. unveils program aimed at removing implicit bias from prosecutions
The San Francisco district attorney’s office said Wednesday it plans to launch a program that would allow prosecutors to make charging decisions in some cases without knowing the race or background of the suspects and victims, a move aimed at reducing the potential for implicit bias in prosecutions.
Dist. Atty. George Gascon said the program, using software engineered by Stanford University, will remove references to race and other identifying factors from cases presented by the San Francisco Police Department, allowing prosecutors to review each case only “through the lens of the behavior that is being alleged, and whether or not that is criminal behavior.”
The program will redact identities from a police narrative and replace them with generic terms — such as “suspect” or “victim” — and it will remove references to the locations of crimes and the identities of arresting officers in order to prevent prosecutors from making assumptions, Gascon said. The program could roll out by early July, he said.
“If you were in Los Angeles, and the crime occurred in Watts, and you use the word ‘Watts,’ you’d think the prosecutor is very likely going to assume the perpetrator is black or Latino,” Gascon said.
The software was developed by Sharad Goel, an assistant professor at Stanford’s School of Engineering, who hopes to see the program employed by other prosecutors in California and beyond.
“I’m hoping that we can make this very easy for other agencies to use. I’d like to see this used across the country. We plan on making all our code available freely,” he said. “We’re hoping to mitigate the role of implicit bias in charging decisions, and hopefully that will lead to more equitable decisions for everyone.”
The review process does have its limitations. It cannot eliminate racial or identifying factors from photo or video evidence, and Gascon said the “bias mitigation reviews” will not be conducted in homicide, sexual assault or domestic violence cases at first. The program cannot be used in officer-involved shooting cases or other use-of-force reviews, as there are different legal thresholds for what constitutes a crime in those situations and a prosecutor would need to know a person’s law enforcement background.
In qualifying cases, prosecutors will be asked to make an initial charging decision through the blind review process, then again once they are given access to evidence that had not been run through the mitigation tool, including police officer body camera video. If a prosecutor’s decision changes between the two phases, they will be expected to document what led to the change “in order to refine the tool and to take steps to further remove the potential for implicit bias to enter our charging decisions,” according to a statement issued by the district attorney’s office.
Eugene O’Donnell, a former New York City prosecutor who now teaches at the John Jay College of Criminal Justice in Manhattan, said that while Gascon’s goal is laudable, prosecutors may also find themselves blind to certain contextual information under the program. He was especially irked by the potential removal of geographical information from a case file.
“You can’t ignore crime maps in a city,” he said. “A huge amount of crime takes place on certain streets, while there are streets in San Francisco that have never had a shooting.”
Gascon’s announcement is part of a broader push for prosecutors to take stock of racial disparities in prosecutions. Earlier this month, the Connecticut Legislature passed a law that will require prosecutors to collect data on the number of defendants receiving prison time, plea deals or access to pretrial intervention programs. Those categories will be separated out by race, ethnicity, sex and age.
Eric Schweitzer, a defense attorney and vice president of the California Attorneys for Criminal Justice, said he hopes to see other prosecutors make use of the software developed at Stanford.
“Implicit and explicit racial bias undermines our criminal justice system at every step in the process. This innovative approach that Stanford appears to have designed might be a good first step for prosecutors to root out their own bias,” he said. “Everybody has biases, we know this from years and years of study. If you say I have no bias, you’re mistaken.”