Santa Cruz becomes the first U.S. city to ban predictive policing
Nearly a decade ago, Santa Cruz was among the first cities in the U.S. to adopt predictive policing. This week, the California city became the first in the country to ban the policy.
In a unanimous decision Tuesday, the City Council passed an ordinance that bans the use of data to predict where crimes may occur and bars the city from using facial recognition software.
In recent years, both predictive policing and facial recognition technology have been criticized as racially prejudiced, often contributing to increased patrols in Black or brown neighborhoods or false accusations against people of color.
Predictive policing uses algorithms that encourage officers to patrol locations identified as high-crime based on victim reports. The software “replicates and supercharges bias in policing by sending police to places that they’ve policed before — that is often going to be Black and brown communities,” said Matt Cagle, a technology and civil liberties attorney with the American Civil Liberties Union of Northern California.
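The feedback loop Cagle describes can be made concrete with a deliberately simplified sketch. This is not PredPol's actual model; the two neighborhoods, the single patrol, and the numbers are all hypothetical, chosen only to show how dispatching police to wherever past records are highest lets a small initial disparity compound.

```python
# Two neighborhoods with the SAME underlying daily crime rate.
true_rate = 0.3

# Neighborhood A starts with one extra historical report on the books.
recorded = {"A": 2.0, "B": 1.0}

for day in range(365):
    # The patrol is sent to wherever past recorded incidents are highest...
    target = max(recorded, key=recorded.get)
    # ...and crime only enters the data where an officer is present,
    # so the patrolled neighborhood accumulates still more records.
    recorded[target] += true_rate

print(recorded)  # A's record count keeps growing; B's never changes
```

Because the data reflect where police have looked rather than where crime actually occurs, the initially over-policed neighborhood ends the year with a hundredfold more recorded incidents despite an identical underlying rate.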
The Santa Cruz Police Department, which began predictive policing with a pilot project in 2011, had put a moratorium on the practice in 2017 when Andy Mills started as police chief. The new city ordinance bans the practice permanently.
Mills said predictive policing could have been effective if it had been used to work with community members to solve problems, but that didn't happen. Instead, the policy was used "to do purely enforcement," leading to unavoidable conflicts, he said.
“You try different things and learn later as you look back retrospectively,” Mills said. “You say, ‘Jeez, that was a blind spot I didn’t see.’ I think one of the ways we can prevent that in the future is sitting down with community members and saying: ‘Here’s what we are interested in using. Give us your take on it. What are your concerns?’ ”
Like predictive policing, facial recognition systems have come under fire, with critics arguing they are less accurate on faces that differ from those used to develop the technology, particularly along racial lines.
A Santa Cruz City Council report said that “despite purported technological advances, a recent National Institute of Standards and Technology study found that some forms of face recognition technology were 100 times more likely to misidentify people of African and Asian descent.”
While Santa Cruz is the first city in the nation to ban predictive policing, it follows other cities that have banned facial recognition technology — notably San Francisco in May 2019 and Oakland in July 2019.
In one ACLU test, face-scanning software screened legislators' photos against a criminal database and erroneously matched roughly 20% of them to people who had been arrested.
The City Council also voted Tuesday to evaluate additional police reforms following the Minneapolis killing of George Floyd.
“We’re really taking this situation seriously, and we are trying to be proactive in continuing this momentum toward actual systemic change,” Mayor Justin Cummings said during the meeting.
In the last month, he has convened three meetings with Black community members to understand their experience with Santa Cruz police. Cummings said he hopes to continue hearing “from communities that are the most impacted and understand how we can support those communities moving forward.”
Nearly 400 local community members had signed an ACLU petition calling for the predictive policing ban, and a host of organizations also backed the ordinance — the NAACP Santa Cruz chapter, American Friends Service Committee and Asian Americans Advancing Justice, to name a few.
Cagle said he didn’t hear a single speaker oppose the initiative before the long-anticipated ordinance passed.
Even Santa Cruz-based predictive policing company PredPol, whose software was used by the Santa Cruz Police Department until 2017, backed the new rules.
“Any government agency that applies technology to its operations should have a process to ensure that it does not result in racially inequitable outcomes,” PredPol CEO Brian MacDonald wrote in an email to The Times. The company is confident its software is not racially biased, MacDonald said, and therefore meets the conditions in the city’s ordinance.
Given the ban's widespread support and the research backing it, Cagle doubts the policy could ever be reversed.
"Santa Cruz's ban is ironclad, and it requires new public legislation and finding that the technology cannot perpetuate bias" before such technology could ever be used again, he said.
Early each morning, computers spit out maps of Los Angeles, marked with red squares where a complex algorithm has judged that property crimes are most likely to occur.
Southern California has been the home of predictive policing since the late 2000s. The concept is the brainchild of former LAPD Chief Bill Bratton, who wanted to determine whether past data could predict crime locations.
As police departments and universities tweaked the model, they found it could forecast crime types, locations and times. In 2012, the software company PredPol chose Santa Cruz as its home, citing the local success of predictive policing.
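The hotspot forecast described above, the "red squares" drawn on the morning maps, can be sketched at its simplest as counting past incidents per grid cell and flagging the busiest cells. The coordinates below are invented, and real systems are far more sophisticated (PredPol's boxes were reportedly 500 feet on a side, fed by years of geocoded reports); this only illustrates the basic count-and-rank idea.

```python
from collections import Counter

# Hypothetical past property-crime reports as (x, y) map coordinates.
reports = [(1, 1), (1, 2), (1, 1), (4, 4), (1, 2), (1, 1), (3, 0)]

CELL = 2  # illustrative grid-cell size

def cell_of(x, y):
    """Snap a report location to its grid cell."""
    return (x // CELL, y // CELL)

# Tally how many historical reports fall in each cell.
counts = Counter(cell_of(x, y) for x, y in reports)

# The day's "red squares": the cells with the most past reports.
hotspots = [cell for cell, _ in counts.most_common(2)]
print(hotspots)
```

Because patrols are then directed to those cells, the forecast is only as unbiased as the report data feeding it, which is the crux of the critics' objection.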
Since then, some studies have shown the policy reduces crime, but others have found it to have a negligible effect.
While civil liberties organizations flag predictive policing as racially biased, MacDonald said he stands by PredPol’s algorithm.
“If the command staff provides no explicit guidance, officers have to rely on their ‘judgment’ or ‘intuition’ as to where to patrol. Human judgment is of course prone to error and subject to bias, whether conscious or subconscious,” he wrote.
The LAPD announced in October that it would make changes to its PredPol program, including the creation of a data-driven policing unit to oversee all crime-fighting strategies and seeking input from community groups before implementing new data programs.
The department also said it would develop a system to provide periodic reports about data programs and outcomes with statistics on people and locations targeted for intervention.
The changes were made seven months after an inspector general couldn’t determine whether the LAPD’s predictive-policing program helped reduce crime.
Though MacDonald maintains that well-formulated algorithms can mitigate racial bias in policing, Cagle questions the policy.
“Private for-profit companies shouldn’t be dictating policing in any American community,” he said. “Banning facial recognition won’t dismantle the racism and bias that pervades policing in America, but it does take away a system that we know will further exacerbate that problem.”