The Los Angeles Police Department misclassified an estimated 14,000 serious assaults as minor offenses in a recent eight-year period, artificially lowering the city’s crime levels, a Times analysis found.
With the incidents counted correctly, violent crime in the city was 7% higher than the LAPD reported in the period from 2005 to fall 2012, and the number of serious assaults was 16% higher, the analysis found.
When presented with the findings, top LAPD officials acknowledged the department makes errors and said they were working to improve the accuracy of crime data reporting.
“We know this can have a corrosive effect on the public’s trust of our reporting,” said Asst. Chief Michel Moore, who oversees the LAPD’s system for tracking crime. “That’s why we are committed to ... eliminating as much of the error as possible.”
The misclassified cases often involved attacks that resulted in serious injuries, such as a 2009 incident in which April L. Taylor stabbed her boyfriend in the stomach with a 6-inch kitchen knife during a domestic dispute, police and court records show.
Police arrested Taylor, who later was found guilty of assault with a deadly weapon. In the LAPD’s crime database, however, the attack was recorded as a “simple assault.” Because of this, the case — like other misclassified incidents — was left out of the department’s tally of violence in the city.
The errors occurred during a time when the LAPD was reporting major drops in crime across the city. The Times analysis found the misclassified cases were not numerous enough to alter the overall downward trend.
Still, the findings are a mark against a department that has long been viewed as a national leader in using data to help deploy officers and set crime-fighting targets. When Mayor Eric Garcetti took office in 2013, he held up the LAPD’s data-tracking system as a model the rest of city government should emulate.
The findings follow a Times investigation last year that examined LAPD crime data from a 12-month period ending in the fall of 2013 and found widespread errors in the way serious assaults were classified.
In response, the civilian Police Commission, which oversees the LAPD, instructed its inspector general, Alex Bustamante, to conduct a wide-ranging audit of the department’s crime data. Officials said they expected Bustamante to release his findings later this year.
The reforms implemented last year center on a newly formed team of detectives responsible for improving the quality of the department’s crime reporting. The team, called the Data Integrity Unit, has retrained hundreds of officers who have a role in classifying crimes.
The unit also now conducts frequent spot checks on crime reports from across the department’s regional divisions in search of mistakes and signs of confusion among officers and their supervisors, said John Neuman, senior crime analyst.
In addition, detailed flowcharts — called “decision trees” — were created to provide step-by-step instructions for station supervisors, who are responsible for approving officers’ crime reports and deciding how incidents should be classified.
An internal police audit released this week found the crime categorization problems persisted last year — auditors found enough errors in 2014 data that aggravated assaults would have been 23% higher than previously reported. LAPD officials emphasized that the new audit reviewed data captured before the reforms took effect and said error rates are expected to improve in the future.
The LAPD continues to search for ways to remove as much human error as possible from the reporting process, Moore said.
Replacing the department’s badly outdated records management system with new, more automated technology would help. However, city officials denied a department request for money to conduct an assessment of what a new system would cost, he said.
The efforts to improve data accuracy within the LAPD have unfolded amid a sharp rise in the city’s violent crime over the last year. Moore and other police officials have said that some of the increase may be due to more accurate reporting, but that much of it reflects an actual increase in crime.
For example, homicides and shootings — categories less susceptible to classification errors — are both up by double digits this year.
To analyze the data, The Times used a computer algorithm that learned, from crime records verified in the Times’ earlier investigation, which key words identified an assault as serious or minor. The algorithm then analyzed nearly eight years of incident data, and reporters sampled the results and manually checked them for accuracy.
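The Times has not published its code, but the approach described — learning which words in an incident narrative signal a serious versus a minor assault from previously verified examples — can be sketched as a simple word-frequency (naive Bayes-style) text classifier. The training examples and word lists below are invented placeholders, not the Times’ actual data.

```python
from collections import Counter
import math

# Hypothetical labeled narratives standing in for the Times' verified
# training data; the real data set and keyword list are not public.
TRAINING = [
    ("stabbed victim with kitchen knife serious injury", "serious"),
    ("pointed gun at victim during argument", "serious"),
    ("pushed victim during argument no injury", "minor"),
    ("slapped victim minor bruising", "minor"),
]

def train(examples):
    """Count word frequencies and word totals per label."""
    counts = {"serious": Counter(), "minor": Counter()}
    totals = {"serious": 0, "minor": 0}
    for text, label in examples:
        words = text.split()
        counts[label].update(words)
        totals[label] += len(words)
    return counts, totals

def classify(text, counts, totals):
    """Score a narrative under each label using log probabilities
    with add-one smoothing; return the higher-scoring label."""
    vocab = set(counts["serious"]) | set(counts["minor"])
    best_label, best_score = None, float("-inf")
    for label in counts:
        score = 0.0
        for word in text.split():
            p = (counts[label][word] + 1) / (totals[label] + len(vocab))
            score += math.log(p)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

counts, totals = train(TRAINING)
print(classify("stabbed with knife", counts, totals))  # → serious
```

In a real application, the automated labels would only flag candidate misclassifications; as the article notes, reporters sampled and manually verified the algorithm’s output rather than trusting it outright.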
As in its earlier examination, The Times found examples of serious crimes such as the Taylor stabbing that were dismissed as minor offenses.
In 2012, for example, William Wisdom pulled a gun on a man, according to police and court records. Although Wisdom was arrested and found guilty of brandishing a firearm, police counted his crime as a minor one.
As part of its reforms, the department now counts all such brandishing cases as serious crimes.
Moore said he was confident misclassified crimes were still receiving appropriate attention from detectives.
Some current and retired LAPD officers have complained about what they saw as top-down pressure from division captains to meet crime reduction goals, which they said could lead to data manipulation.
Moore said holding captains accountable for addressing crime trends in their divisions is a tool the LAPD and many other police departments find useful.
He acknowledged, however, that some supervisors complained the intense grilling senior officials deliver at weekly meetings on crime trends was “too pressure-cooked” and “condescending.” Moore said he had worked to change the tone of the meetings.
“Is there pressure today? Absolutely,” Moore said. “We hold our people to high standards. Our issue is to do so respectfully and in a manner that provides people with their dignity.”