Some hospitals are better than others. But for many years all patients had to go on was reputation, doctors’ advice, word of mouth and advertising. Today, California follows some other states, the federal government and a few private groups in offering a window on hospital quality.

The state officials' study of hospital death rates shows that rates for eight common conditions and procedures, including stroke, hip fracture and brain surgery, vary widely from hospital to hospital.

The study examined mortality rates for 2006 and 2007. It found that in 2007, 25 hospitals had death rates significantly better than the state average on at least one indicator, while 94 were significantly worse on at least one.

In 2006, 33 hospitals had mortality rates that were significantly better on at least one indicator, while 98 hospitals rated significantly worse on at least one indicator.

Los Angeles County hospitals fared especially well in acute stroke care, based on mortality statistics in 2007. Of 97 hospitals in the county, 13 had significantly better than average mortality ratings for stroke, while only one was worse than average on the indicator.

The county hospitals with significantly better than average stroke death rates in 2007 are: Citrus Valley Medical Center; Foothill Presbyterian Hospital-Johnson Memorial; Garfield Medical Center; Glendale Memorial Hospital; Kaiser hospitals in Baldwin Park, Bellflower, Panorama City and on Sunset Boulevard in Los Angeles; Long Beach Memorial Medical Center; Olympia Medical Center; Pacific Alliance Medical Center; USC University Hospital; and Whittier Hospital Medical Center.

Officials plan to post the study today at www.oshpd.ca.gov and said they hoped it would help improve care.

“It is our hope that the timely release of these new indicators will encourage California’s hospitals to examine their practices and improve their quality of care and help inform consumers and patients about their healthcare choices,” said David Carlisle, director of the Office of Statewide Health Planning and Development.

But the study was immediately criticized. Torrance Memorial Medical Center, which received a worse than average mortality rating for gastrointestinal hemorrhage, said the information was badly flawed.

The hospital’s own review of the 40 deaths in 883 gastrointestinal hemorrhage cases during the two-year study period “revealed a startling result: 15 of the 40 patients did not expire at Torrance Memorial,” the hospital said in a statement. “In fact, many of the patients listed by OSHPD as deceased are still known to us to be alive.”

The hospital said it discovered a programming error in the electronic data transfer from its medical record system to the state. A recalculation without the 15 cases inadvertently classified as deaths would result in a mortality rate well within the state average, the hospital said.

“We are very concerned about the validity of all mortality studies for the period 2006-2007 because the programming error extended to all patient types, not just deaths from GI hemorrhage,” the hospital said.

Pat Sullivan, a spokesman for the statewide health office, said that the state worked with hospitals to ensure the data they submitted were accurate and that Torrance signed off on the results in December.

“This is the first we’ve heard of this,” Sullivan said.

State officials said the data were adjusted for risk, taking into account patients’ preexisting health problems, to level the field and allow fair comparisons among hospitals.

Cedars-Sinai Medical Center was not significantly better or worse than the statewide average in any category in 2007. The study found its stroke mortality rate was better than average in 2006, and, in 2008, it was designated a Primary Stroke Center by the Joint Commission, the nation’s oldest and largest standards-setting body.

Neil Romanoff, vice president for medical affairs at Cedars, said the study offered a limited view of hospital care because it failed to take into account deaths that occurred shortly after hospitalization.

“If a hospital . . . transfers their patients out alive earlier and they die at the next level of care, what does that tell you?” Romanoff said. “These are complicated questions that are not clearly answered by one measure of quality.”

Joseph Parker, director of the statewide health office’s Health Outcomes Center, said a study that took into account deaths after hospitalization would be less timely. “There’s a trade-off here,” he said. “We wanted to get information here that is more recent and actionable.”

The state plans to update the study annually and to expand the categories. The federal Centers for Medicare and Medicaid Services and about 15 states publicly report various hospital quality indicators. Some report how well hospitals adhere to model practice standards, while others look at mortality and other outcomes.

--

lisa.girion@latimes.com
