New Flaw Is Found in State Test Results
Even as a school testing contractor scrambles to correct a scoring error on statewide achievement tests, an additional problem in Long Beach has resulted in a warning for districts throughout the state to check their results.

San Antonio-based Harcourt Educational Measurement acknowledged the problem but said it could not be corrected before next week, when statewide data are expected to be posted on the Internet.

Ed Slawski, a senior research scientist for the company, urged school officials across the state to double-check their results for similar errors, which involve how scores at year-round schools are ranked against a national average.

“We can’t check the results for each of 1,100 districts,” Slawski said. “It’s invaluable for folks at the local level . . . to check to see if there’s anything that looks suspicious.”
It was unclear Tuesday whether the problem was widespread or isolated.

Alerted about the problem by Harcourt, the state Department of Education on Tuesday began spot-checking other districts. “So far the situation has been related to Long Beach and Long Beach only,” said Doug Stone, the education department spokesman.

The state has about 1,300 year-round schools. A testing official in the Los Angeles Unified School District said it was unaffected by the latest headache, and districts in Fresno and San Diego also appeared to be in the clear.

Statewide scores for the second year of the state’s Stanford 9 Achievement Test were to have been issued last week. But they were delayed when school officials in Anaheim and San Jose discovered that the scores for students not fluent in English had been misreported. Scores for most districts in the state were affected.
Officials in the Long Beach Unified School District, which has 21 year-round schools, were alerted to the newest discrepancy last Friday by a third-grade teacher at Grant Elementary School. The teacher noticed that her students answered more questions correctly this year than they did the year before, but that their rankings against a national average went down instead of up.

The explanation for that has to do with the intricacies of standardized testing. Essentially, testing companies create norms, or averages, by giving a test in the spring to a representative sample of students across the nation. Students and schools that take the test in subsequent years are then compared with those averages and ranked on that basis.

The situation is trickier for year-round schools, though, because students at those campuses may have been in class far fewer days, and therefore received less instruction, than the comparison group. To make up for that, the norm is adjusted downward slightly for year-round schools.
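The mechanics described above can be sketched in a few lines of code. This is purely an illustrative toy model, not Harcourt's actual scoring method; the norm sample and the 3-point year-round adjustment are invented for the example. It shows how ranking the same raw score against the wrong norm table understates a student's percentile, which is the kind of discrepancy the Grant Elementary teacher noticed.

```python
# Toy model of norm-referenced percentile ranking (NOT Harcourt's method).
from bisect import bisect_left

def percentile_rank(score, norm_scores):
    """Percent of the norm sample scoring below `score`."""
    norm_scores = sorted(norm_scores)
    below = bisect_left(norm_scores, score)
    return round(100 * below / len(norm_scores))

# Hypothetical national norm sample of raw scores (spring testing).
national_norm = list(range(20, 80))

# Year-round norm: shifted down slightly because those students had
# fewer instructional days before testing (3 points is an assumption).
year_round_norm = [s - 3 for s in national_norm]

raw = 40
print(percentile_rank(raw, national_norm))    # against the unadjusted norm
print(percentile_rank(raw, year_round_norm))  # against the adjusted norm
```

A year-round student with a raw score of 40 ranks a few percentile points higher against the adjusted norm than against the national one; scoring the tests against the wrong table would depress every reported ranking, consistent with the corrections Long Beach describes below.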

Lynn Winters, the assistant superintendent for research, planning and evaluation in Long Beach, said the error occurred when the tests for about 9,000 students were scored. Because the incorrect rankings for those students affect district data, the company has agreed to rerun reports for all 65,000 students tested in the district.

Winters said that the reports had not yet been sent to parents when the scoring error was discovered and that corrected reports will be delayed.

The new reports will show students doing better, she said. A third-grader ranked at the 32nd percentile in math, for example, actually should have been at the 39th percentile, and a student at the 45th percentile in reading should have been listed as being at the 53rd percentile.