Next month, the district will take the more controversial step of providing thousands of teachers with confidential ratings of their performance using the same approach, known as value-added. The district is also negotiating with the teachers union to include such measures in teachers' formal performance reviews, an effort the union bitterly opposes.
The new measure of academic success has been a top priority for incoming Supt. John Deasy, who formally takes over Friday. It comes as districts throughout the country are wrestling with the reliability and the proper use of the value-added approach, which estimates school and teacher performance by analyzing students' improvement on standardized tests in math and English.
The district has had the data to conduct its own analysis for years but had never done so. Officials have said their adoption of the approach was hastened by a Times series and database released in August that rated elementary schools and about 6,000 elementary school teachers according to their value-added scores. The paper will release an updated database with the scores of 11,500 elementary teachers in the coming weeks, and later this year plans to expand it to include middle schools.
The value-added approach focuses on how much progress students make from year to year rather than solely on their achievement level, as the Academic Performance Index, or API, does. The API is heavily influenced by factors outside a school's control, including poverty and parental involvement; value-added analysis instead compares a student with his or her own prior performance, largely controlling for such outside-of-school influences.
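To make the idea concrete, here is a deliberately simplified sketch of the logic described above. It is illustrative only, not the district's or The Times' actual statistical model (which controls for additional variables): each student's current score is predicted from his or her prior-year score, and a teacher's rating is the average amount by which that teacher's students beat or miss their predictions. The data and teacher labels are hypothetical.

```python
# Illustrative toy value-added calculation. NOT the district's actual model;
# real analyses use more years of data and additional controls.
from statistics import mean

# Hypothetical data: (prior_year_score, current_score, teacher)
students = [
    (600, 640, "A"), (550, 585, "A"), (700, 710, "A"),
    (600, 605, "B"), (550, 548, "B"), (700, 695, "B"),
]

# Fit a simple least-squares line predicting current score from prior score.
xs = [prior for prior, _, _ in students]
ys = [cur for _, cur, _ in students]
x_bar, y_bar = mean(xs), mean(ys)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

def teacher_rating(teacher):
    """Average residual (actual minus predicted score) for one teacher."""
    residuals = [cur - (intercept + slope * prior)
                 for prior, cur, t in students if t == teacher]
    return mean(residuals)

print(round(teacher_rating("A"), 1))  # 14.5: students beat their predictions
print(round(teacher_rating("B"), 1))  # -14.5: students fell short of them
```

Note that the two teachers here serve students with identical prior scores, yet get opposite ratings; that is the point of the approach, since comparing each student against his or her own predicted trajectory largely removes differences in where students started.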
Because value-added is based on standardized test scores, most experts agree it should be one of several measures to determine school or teacher performance.
Some critics say the value-added approach is too volatile to be used for teacher evaluations, but most experts say it is more accurate for campuses because it is based on the performance of hundreds, if not thousands, of pupils.
The district's ratings, dubbed "Academic Growth Over Time," can send parents a very different signal about a school's performance. Take, for example, 3rd Street Elementary School in Hancock Park, which has an API score of 938, putting it among the highest-scoring schools in the district. Under the new growth measure, 3rd Street is one of the lowest-performing elementary schools in the district.
"We've got to do a better job and reexamine," said 3rd Street Principal Suzie Oh, adding that she was shocked by the results.
Beginning Wednesday, parents can view the new ratings on the district's website, covering elementary and middle schools in math and English, and ninth grade in English only. L.A. Unified officials have said they will use the school ratings, along with the confidential teacher ratings, to determine whether students are making expected progress and whether teachers need help. The school score will be included in future campus report cards.
Deasy presented the approach to the Board of Education on Tuesday. It was greeted with some excitement, some confusion and some skepticism.
"How do we not get obsessed with testing and test scores?" asked board member Steve Zimmer.
Board member Richard Vladovic later said, "I think this is going to be a great tool to help parents."
Members of the teachers and administrators unions raised no objections at the meeting to the school-level ratings.
But A.J. Duffy, outgoing president of United Teachers Los Angeles, said in an interview that he suspects that administrators will use the new information punitively.
Duffy and other union leaders have said they will not agree to a new teacher evaluation system that includes student test score data because they believe it is unreliable and will narrow the curriculum.
The district scores are based on an analysis conducted by a nonprofit research group affiliated with the University of Wisconsin, which has a three-year, $1.5-million contract with L.A. Unified. The group has also worked with public school districts in New York City, Chicago and Milwaukee.
The district and The Times used a largely similar statistical approach to develop ratings but made some different decisions about which variables to include. The rankings are also presented differently. A Times analysis of the elementary school ratings under the two methods found a roughly 90% correlation between the results, and district officials agreed there was a large degree of overlap.