Flaws Found in Rankings of Schools

TIMES EDUCATION WRITER

California Department of Education officials acknowledged Tuesday that a widely reported statewide ranking that compared the test scores of schools with similar student demographics was faulty, primarily because of poor data provided by school districts.

In a letter to superintendents dated Monday, the department noted that about 400 schools had contacted state officials to request that their similar-schools rankings be recomputed.

In most cases, these schools underreported the percentages of low-income students who qualify for free or reduced-price lunches under federal laws.

As a result, those schools were compared with schools in more affluent areas, where students tend to score higher.

Several Orange County school districts had previously expressed concerns about the rankings of similar schools because the state did not specify the exact formula used to derive the rankings. Nor did local schools know which 100 schools they were being compared against.

“I commend the state on the fact that they have ’fessed up to the problem,” said Jeff Bristow, testing director for the Capistrano Unified School District. “The problem isn’t created by the state, but by the poor data coming in. Where I fault the state is, if they knew about [the problems], but didn’t yank” the similar-schools rankings.

The similar-schools rankings were part of the state’s unprecedented Academic Performance Index, the cornerstone of the state’s $242-million campaign to make schools accountable for students’ learning.

Although only one of several components of the index is in place--and although the index confirms the already widely known fact that affluence boosts academic performance--the rankings have assumed great importance.

The index, reported Jan. 25, is aimed at measuring academic performance and establishing a base for gauging future progress. For now, it is based solely on results of the Stanford 9 basic skills test, which was given last spring to most public school students. Each school was given an API score from 200 to 1,000, calculated according to a seven-step formula.

Those scores were then ranked statewide, with schools sorted into 10 groups of roughly equal size and assigned a rank from 1 to 10. Schools were separated by type--elementary, middle and high schools--and ranked within those categories.
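
For illustration, here is a minimal Python sketch of the decile idea described above: sort schools by API score, then split the sorted list into 10 roughly equal groups. The function name, data layout and example scores are assumptions; the state's actual grouping and tie-breaking rules are not described in this article.

    # Illustrative sketch only; the state's exact grouping and
    # tie-breaking rules are not described in this article.
    def decile_ranks(api_scores):
        """Assign each school a 1-10 rank by sorting API scores and
        splitting the sorted list into 10 roughly equal groups."""
        ordered = sorted(api_scores, key=lambda pair: pair[1])
        n = len(ordered)
        # Position i (0-based) in sorted order falls in decile
        # i * 10 // n, which runs from 0 through 9; add 1 for 1-10.
        return {school: i * 10 // n + 1
                for i, (school, _) in enumerate(ordered)}

    # Hypothetical example with three schools:
    print(decile_ranks([("A", 480), ("B", 720), ("C", 615)]))
    # {'A': 1, 'C': 4, 'B': 7}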

A second ranking of 1 to 10 compared each school’s 1999 API score with the scores of 100 schools with similar socioeconomic traits and other factors. Many schools made much of that similar-schools ranking because they fared better against peer schools than in the overall statewide ranking.
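
Because the state did not disclose its formula, the sketch below is only a guess at the general shape of such a comparison: select the 100 schools closest on one demographic measure, then place a school's API score on a 1-to-10 scale within that peer group. The single-variable similarity measure and all names here are assumptions, not the state's method.

    # A guessed sketch only; the state never published its formula.
    def similar_schools_rank(school, schools, api, poverty_pct):
        """Rank one school 1-10 against the 100 schools closest to it
        on one demographic measure (here, the percentage of students
        qualifying for subsidized lunches)."""
        peers = sorted((s for s in schools if s != school),
                       key=lambda s: abs(poverty_pct[s] -
                                         poverty_pct[school]))[:100]
        # Share of peers this school outscores, mapped onto 1 to 10.
        outscored = sum(1 for s in peers if api[s] < api[school])
        return min(10, outscored * 10 // len(peers) + 1)

Under any scheme of this general shape, a school that underreports its share of low-income students gets matched against more affluent, higher-scoring peers, so its similar-schools rank sinks. That is the distortion the districts describe.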

State officials said Tuesday that the letter would probably prompt perhaps 100 more of the 7,000 schools included in the rankings to request a revision.

But they indicated that the changes would not affect the overall statewide rankings or growth targets that schools received.

Many schools showed a wide divergence in their rankings. It was possible, for example, for a school to have a statewide ranking of 2 but a 10 in its group of peer schools.

The idea of ranking similar schools against one another is a “wonderful thing,” said Al Sims, administrative assistant for research and evaluation in the 47,000-student Garden Grove Unified School District. “It’s the best way in the world to compare schools, but you’ve got to have hard data to do that.”

Some Orange County districts had trouble providing certain data the state requested--including the level of education of a test-taker’s parents--so students themselves had to supply that information, Sims said.

In other words, students in grades two and up were expected to know--and fill in--whether their parents had finished high school or gotten a doctorate.

“We had to rely on what students, teachers or clerks bubbled in” for their parents’ level of education, Sims said. “. . . It’s a very difficult thing. Some parents feel it’s an invasion of privacy to be sending home surveys about parent education. It’s hard to get good, hard data for that particular area.”

Districts have until Feb. 25 to review each school’s data and to request that the similar-schools ranking be reconfigured. The new rankings for those schools will be released about mid-April, the department said. At that time, the department will also release the names of the similar schools against which each of the schools was ranked.

The problem resulted from erroneous data compiled during last spring’s Stanford 9 testing, the agency said.

“We’ve paid schools to work with Harcourt [Educational Measurement, the test publisher] to get more accurate data,” said Paul Warren, the state’s deputy superintendent for accountability. In the future, he added, “schools would be on the hook for cleaning it up.”

Times staff writer Kate Folmar contributed to this report.
