Editorial: California’s baffling new approach to evaluating public schools

It’s not easy to measure the performance of a school, because there are so many things that go into providing a good education. But neither should it be as hard as the State Board of Education is making it.

After three years of work, the board recently revealed a draft of its new annual accountability system for California schools. These are the report cards, in effect, that are to replace the old single-number Academic Performance Index by which schools have been judged for the last decade and a half. The API was based almost solely on the results of the annual standardized tests taken by students.

The board’s determination to measure schools by more than merely test scores is laudable and has led national thinking on the topic. But the new system isn’t just overly warm and fuzzy; making sense of it is practically impossible.

There’s a series of colored boxes, with the colors designed to reflect both the school’s actual performance on a given measurement — such as how many students are suspended or what surveys say about the school’s atmosphere — and whether that performance is getting a little better or a lot better or … well, you get the picture.

Or more likely you don’t, because we’re sure having problems with it. There are nine different categories for measuring schools, only one of which is how a school’s students scored on the standardized tests. Others include “basics” (such as having adequate textbooks and facilities) and “implementation of academic standards.” Each category is ranked by how high a priority it is for that particular school. And each category has two colored boxes. And there are six possible colors for each box. We’ve got this much: Green is good. Red is bad. Yellow is somewhere in between. Just like traffic lights. It’s hard to ascertain what the rest mean, but there’s a separate chart showing all the colors that is supposed to give you an idea of what they stand for. It doesn’t, really.

Then there are extra boxes for “equity reports,” which show how well subgroups such as Latino, black and low-income students are doing. But those numbers — also color-coded — confuse more than they inform.

If you’re a parent trying to figure out whether one school in your district is better than another, well, there’s no clear way to do it. If you’re a voter who wants to determine how much the local schools have improved over time, good luck.

The state board and its staff — educated and deeply immersed in the minutiae of school policy — have lost touch with those of us who aren’t experts in educational statistics. They would have seen the gap if they had taken these charts for a test drive, showing them to parents waiting outside schools to pick up their kids or to voters waiting in line at the supermarket. More than likely, these multicolored charts would have elicited a well-deserved “Huh?”

One reason the charts are so over-complicated is that they’re being larded with too many factors that don’t reveal how well students are learning. Among the elements being judged: parent involvement, suspension rates, graduation rates and the like. These may be important aspects of school life, but they are a means to an end — better-educated students. If a school doesn’t suspend any students, but also doesn’t improve academic outcomes, why should it get credit for hollow achievements?

Compare this with the API: one number, reflecting test performance, with a clear target of 800 for academic proficiency. With a couple of additional figures, parents could tell at a glance where a school stood in its efforts to improve.

The API was simplistic. It defined excellence too narrowly, which in turn led to “teaching to the test.” But this new method swings way too far in the opposite direction.

Meanwhile, the new federal Every Student Succeeds Act requires states to identify the lowest-performing 5% of schools and intervene to improve achievement. How will California ever identify those schools under this muddle? How will parents?

The board is scheduled to adopt its final plan in September, just two months away. But these charts are unadoptable. The board must start over and create a simpler, clearer, more rigorous way to measure achievement.
