Rankings for Many Schools Recalculated

TIMES EDUCATION WRITER

The state on Wednesday revised a portion of the data in its new school accountability system and changed the rankings for about half of California campuses.

The recalculations affect the “similar school” rankings, which compare each school’s Stanford 9 test scores with a group of schools with similar demographics and resources.

The similar school rankings were first issued in January as part of the Academic Performance Index, but it quickly became evident there were errors and gaps in the information that formed the basis of the comparisons.

Each school also received a rank of 1 to 10 based solely on its students’ scores on the Stanford 9. Those rankings have not changed.

Since February, 4,100 schools in 600 districts have given the state revised data on the number of students participating in the federal lunch program, which is the key measure of poverty among schoolchildren. As a result, 18% of schools saw their rankings change by more than two rungs on the 10-point scale; 38% of schools experienced one- or two-point shifts.

The demographic data include numerous components such as parents’ education level, percentage of students fluent in English and average class size. In most cases, schools had underreported the proportion of their students participating in the school lunch or breakfast programs. That meant their scores were compared with schools serving students who were, on average, more affluent.

Glendale Unified School District, for example, had been dismayed to see low rankings in the similar school index. After the revisions, 18 of its schools improved their rankings, one went down and the rest remained the same.

“We went back over the data . . . very carefully and found that some of the information submitted originally was inaccurate or had not been sent, so we worked very hard to get it clarified,” said Supt. James R. Brown.

In the Los Angeles Unified School District, only 67 schools saw their relative standing change by two or more points on the scale.

The new rankings will be posted on a state Internet site this morning. Also included for each school are the names of 100 other campuses with similar poverty levels. The address is https://www.cde.ca.gov/psaa/api.

The state devised the similar school ranking in an attempt to neutralize the powerful effect of income and school resources on academic performance. Policymakers felt it was useful and fair to compare schools to peers with similar advantages or disadvantages.
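The two mechanisms described above — a statewide 1-to-10 rank based on test scores alone, and a second rank computed only against the 100 demographically closest campuses — can be sketched roughly as follows. This is an illustration, not the state's actual formula: the field names, the tiny data set, the use of lunch-program participation as the lone demographic measure, and the three-school peer group are all assumptions for the sake of a short example.

```python
# Illustrative sketch of two decile-style ranks: one statewide, one against a
# "similar schools" comparison group. All data and field names are invented.

def decile_rank(score, comparison_scores):
    """Rank 1-10 by where `score` falls among `comparison_scores` (10 = top)."""
    below = sum(1 for s in comparison_scores if s < score)
    percentile = below / len(comparison_scores)
    return min(int(percentile * 10) + 1, 10)

# Hypothetical schools: a test score and the share of students in the
# federal lunch program (the poverty measure the article describes).
schools = [
    {"name": "A", "score": 720, "lunch_pct": 0.10},
    {"name": "B", "score": 650, "lunch_pct": 0.55},
    {"name": "C", "score": 600, "lunch_pct": 0.60},
    {"name": "D", "score": 580, "lunch_pct": 0.58},
    {"name": "E", "score": 540, "lunch_pct": 0.90},
]

all_scores = [s["score"] for s in schools]

for school in schools:
    # Statewide rank: compared against every school.
    statewide = decile_rank(school["score"], all_scores)

    # Similar-schools rank: compared only against the demographically
    # closest campuses (here, nearest by lunch-program participation).
    peers = sorted(
        schools, key=lambda s: abs(s["lunch_pct"] - school["lunch_pct"])
    )[:3]
    similar = decile_rank(school["score"], [p["score"] for p in peers])
    print(school["name"], statewide, similar)
```

The sketch also hints at why the data errors mattered: a school that underreports its lunch-program participation is matched with more affluent peers, whose higher scores push its similar-schools rank down.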

But the more crucial ranking is the one based strictly on the test scores. That’s the scale the state will use to judge whether schools are making adequate progress. Those that make a predetermined amount of progress will be eligible for financial rewards. Schools that fail to meet improvement targets could have remedies forced on them and, in extreme cases, face takeover by the state.

Many testing experts advise against using standardized exams for such high-stakes purposes because it is difficult to correlate changes in results with actual changes in what is being taught or how. They caution that improvements could be the result of cramming or subtle forms of cheating rather than substantive changes in how much students actually know. Or, score fluctuations could reflect the changes in the student body from year to year.

The inaccuracies in the similar school rankings were one of several foul-ups that plagued the state’s testing program. Last summer incorrect scores were reported for students not fluent in English and for some schools that operate year-round.

Despite the revisions in the similar schools data, state officials said they still cannot vouch for the accuracy of the new information.

Unlike many states, California lacks the capacity to maintain a computerized record for each student. Therefore, the state must rely on the accuracy of data compiled by teachers and principals, who in some cases collect it from students.

“We do recognize that this is going to be a continuing concern,” said Paul Warren, who is in charge of accountability programs for the state Department of Education.

In an effort to improve the quality of the information it receives, the state this year doubled the amount of money it gives to school districts to cover their costs in compiling the data. The state education department also is requesting more money so that it can deploy personnel to do its own spot checks of the data.

Still, said Warren, “we’re never going to eliminate these problems. This is just going to be part of the accountability system.”

The state’s accountability system is a centerpiece of Gov. Gray Davis’ education reform efforts. Currently the Stanford 9 is the only element of the index, although other measures are being developed.

School districts are in the process of giving this year’s tests. After the tests are scored by a contractor, Harcourt Educational Measurement, the state will post the scores and begin calculating next year’s rankings.

*

Times director of computer analysis Richard O’Reilly and data analyst Sandra Poindexter contributed to this story.