The analysis was based on test scores in Grades 2 through 5 at 450 of Los Angeles' approximately 500 elementary schools. It substantially changes the picture of which schools are succeeding and which are not.
Value-added has many critics who consider it unreliable and a narrow gauge of performance. It looks, in this instance, only at math and English scores, and it ignores many other factors that parents consider when choosing a school. Most of the controversy over value-added, however, has centered on whether it should be used to assess individual teachers, not schools.
Last Sunday, The Times published findings from a value-added analysis of more than 6,000 teachers in L.A. Unified, which found that it matters far more which teacher a child gets than which school he or she attends. But parents don't usually pick a school for a single teacher; this analysis points to schools where teachers overall tend to be more successful at raising scores year after year.
Troubled by the exclusive focus on achievement under the federal No Child Left Behind law, the Obama administration has made analysis of student progress a priority for both teachers and schools. Several states, including California, are moving in that direction.
"I'm much less interested in absolute test scores and more interested in how kids are improving," U.S. Education Secretary Arne Duncan told The Times last week.
The results of such a shift are sure to be surprising.
"It's really shocking. I had no idea," said Nicole Miller, one of the Wilbur campers, upon hearing how the school fared in the Times analysis. "I would have definitely taken a really good look at other schools had I known those numbers."
The Academic Performance Index holds great sway in California education.
Principals tout high API scores and scramble to explain low ones. Real estate agents know to keep the numbers handy for house-hunting parents.
In elementary and middle school, the 1,000-point index is based entirely on how high students score on the state's annual tests, given in Grades 2 through 12. According to state data, 81% of the differences in scores among schools reflect socioeconomic factors such as poverty and parents' education.
The benefit of the API is that it reflects state standards, helping to maintain clear and common goals for all schools. California third-graders, for instance, are expected to be able to add and subtract simple fractions.
But even those who designed the API more than a decade ago say it was never meant to be used alone. They recommended measuring student progress as soon as possible.
"The superiority of looking at student growth was recognized from the very beginning," said Ed Haertel, a Stanford professor and testing expert who helped develop the API for the state. "It's much more sensitive and accurate than the current system."
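Haertel's point can be sketched with invented numbers: a school with high absolute scores may show little growth, while a lower-scoring school can post strong gains. The sketch below uses hypothetical data and a crude score-gain comparison as a stand-in for value-added; real value-added models also adjust for student demographics and prior achievement.

```python
# Illustrative only: hypothetical average scores for the same cohort of
# students in two consecutive years at two invented schools.
schools = {
    "School A": {"last_year": 850, "this_year": 855},  # high scores, flat growth
    "School B": {"last_year": 650, "this_year": 700},  # lower scores, strong growth
}

# Ranking by absolute score (the API-style view) favors School A.
by_absolute = max(schools, key=lambda s: schools[s]["this_year"])

# Ranking by year-over-year growth (a rough proxy for value-added)
# favors School B instead.
by_growth = max(schools, key=lambda s: schools[s]["this_year"] - schools[s]["last_year"])

print(by_absolute)  # School A
print(by_growth)    # School B
```

The same data thus yield opposite rankings depending on whether the measure is where students end up or how far they moved.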
But California, like most states, isn't doing it. Developing the sophisticated tracking systems necessary for value-added analysis takes time and money. Budget constraints and political infighting, among other things, have stood in the way.
L.A. Unified is in a better position to act. It has had the data and computer systems in place to measure student progress for more than a decade, but it has repeatedly ignored the advice of its own experts to do so.
In 2006, for example, district researchers and outside consultants proposed including value-added scores on a new "report card" for each school.
The idea was rejected by administrators as too complicated for parents, said Julie Slayton, the district's former head of research and planning, who is now an education professor at USC.