Editorial: California’s new school rating tool is better, but still flawed

Students leave school in Inglewood, Calif., on April 5.
(Gina Ferazzi / Los Angeles Times)

After years of work and some ludicrous missteps, California’s annual report card on schools is finally up and measuring educational performance. It’s improved from its early iterations, and there’s a fair amount to like about it. But the new system is still lacking in many areas; the state shouldn’t consider its work done here. The reports can be hard to parse, and they make schools look like they’re doing a lot better than they are.

That’s not helpful to parents or the public. There are few things parents are more interested in than the quality of the schools to which they entrust their children; providing that information is a key responsibility of the state Department of Education. The general public has a stake in this too, given the investment taxpayers are making in public education.

The new California School Dashboard replaces the old Academic Performance Index, which provided simple numerical scores for each school, based almost solely on results from the state’s annual proficiency tests. The API was abandoned a few years ago for a legitimate reason: Judging a school’s entire performance on two tests, each given once a year, was a blinkered way to measure educational efforts. Besides, it didn’t reveal much.

In contrast, the dashboard provides information about many more aspects of education — including graduation rates, suspension rates, parent engagement and the like. And for parents willing and able to throw themselves at the reports, there’s a mountain of information contained therein. The question is how many parents will do the work. If they rely on the simple color charts instead of looking closely at the information, they might get a misleading idea of how their children’s schools are doing.

Early versions of the dashboard were a lesson in confusion. The state started with a grid of color-coded squares that were almost indecipherable. Those gave way to a lineup of pie charts with colored pieces that were just about as bad.

The new and official version improves on that with a series of colored graphics that look like fuel gauges. They make a lot more sense and show, at a glance, where a school stands on the various measures: red at the "empty" end where performance is low, up through the rainbow to excellent performance in blue, a full tank of gas. Parents can then link to more in-depth versions of each measurement, and the reports are easy to find online.

Serious problems remain, though, especially in the metrics that make schools look better than they should. If that were the state's goal here, it's doing the public a real disservice. One example: Schools can have low test scores and still come out looking great on "college and career readiness."

A big part of the problem is that there is no reasonable or objective measure of what good performance consists of. Schools are measured against each other, not against a standard of excellence, and they get extra credit for improvement.


So, for example, the Los Angeles High School of the Arts has low test scores in both English and math. Its graduation rate is also down in the below-average orange zone. Only 42% of its students graduate ready for college or a career. Yet it receives a nice, above-average green rating on the fuel gauge because the statewide average also is only 42%, and it got extra credit for improving significantly over last year. Most parents wouldn't view 42% as a good readiness rate, especially considering that it doesn't take into account all the dropouts who certainly aren't college- or career-ready.

Graduation rates and suspension rates are measured the same way on the colored gauge — how they come out against other schools, not whether they’re objectively doing well. The numbers are there, for parents who dig below the surface, but they shouldn’t have to go that deep to get an accurate reading.


To get a stellar score on "parent engagement," it's enough for schools to conduct an annual survey of their parents and report the results. The parents might have very negative things to say in those surveys. They might say that no one talks to them and that they feel unwelcome. But as long as the school reports the survey results, it's in good stead.

The same kind of thinking applies to whether schools have enough qualified teachers. Schools need only report how many teachers aren't certified for the jobs they're doing; it doesn't matter how many are "mis-assigned." In other words, these scores are about compliance with filling out forms, not about meeting the needs of students and families.

The dashboard itself earns a middling yellow rating on the fuel gauge, mainly for depth of information and continuous improvement. But the state should recognize that it has a long way to go for the dashboard to reach stellar blue.
