
Low Test Scores Add to District’s Problems : Education: Most L.A. schools on the Westside declined slightly, but some administrators say the state program gives misleading data for judging school systems.

TIMES STAFF WRITER

Release of the latest distressingly low California Assessment Program (CAP) scores for California eighth-graders this week couldn’t have come at a worse time for Westside schools in the Los Angeles school district.

Many parents in the area are already scurrying to get their children into private schools on the eve of a two-month winter break that many expect to be followed by a teachers strike in late February.

Fewer than half of the more than 300,000 youngsters tested statewide last spring performed “adequately” or above on CAP tests of reading, writing, math, history/social studies and science. An adequate score is an indication that they had absorbed the state curriculum.


In the Los Angeles Unified School District, only 32.3% of students showed adequate performance, with an average score of 213, down 5 points from two years ago. The district ranked 35 on a scale of 1 (worst) to 99 (best) in comparison with other school districts.

On the Westside, most LAUSD schools registered slight declines. However, Palms Junior High showed a gain of 16 points, with a dramatic increase in writing scores and 54.9% of its students performing adequately. Mount Vernon Junior High gained 11 points, although only 23.7% of its students turned in adequate performances.

In the Beverly Hills Unified School District, 70% tested at adequate-or-above levels, with an average raw score of 341, up 9 points. The district was ranked 86.

In the Culver City Unified School District, where all eighth-graders attend the same school, 56.2% were deemed adequate, with an average raw score of 280, up 12 points, and a rank of 66.

And in the Santa Monica-Malibu Unified School District, 58% tested at adequate levels, with an average raw score of 289, up 2 points, and a rank of 64.

Educators say, however, that low scores are misleading because they can mean either that a given school is doing a less-than-average job or that its students bring a wide range of backgrounds and skills to junior high.

Each school is also assigned a relative rank on the 1-to-99 scale, comparing it with the 20% of schools determined to be “most similar” to it in terms of student background factors. This means that each is ranked within a group of 320 schools, rather than against all 1,600 schools throughout the state.


Like the raw scores, the rankings of similar schools are also potentially misleading, particularly in the case of Los Angeles’ Westside schools.

Though located mostly in affluent or middle-income, predominantly Anglo neighborhoods, the Los Angeles Westside schools draw about 70% of their students from predominantly Latino or African-American inner-city neighborhoods, from which the children are bused because the nearby schools have no room for them. Under the CAP ranking system, however, factors used to compare schools are based in part on the demographics of the area where the schools are located, rather than where their students come from.

A case in point is Paul Revere Middle School in Brentwood. Revere draws only 30% of its students from the wealthy neighborhoods that surround it. The remaining 70% are bused in, and 28% of the students at the school have limited proficiency in English.

Revere’s overall raw score was 252 out of a possible 450, indicating that 45.6% of its 304 test-takers met minimal standards. The score earned Revere a rank of 26, significantly lower than some other schools with identical scores that were located in less affluent neighborhoods.

“I don’t know what it means,” Revere principal J. D. Gaydowski said of the results. “It certainly doesn’t give you a picture of the school . . . The score may mean you have a totally average eighth grade or that you are teaching a full range of students.”

At Revere, that range runs from students with only minimal basic skills and understanding of English to an unusually large proportion of gifted youngsters (23%) and high achievers who require honors classes and shine on national tests, he said.


“We don’t pine over these CAP test scores,” Gaydowski said. “They are a political football, unfortunately used for school shopping.”

State officials, however, contend that they are useful in comparing schools’ performances and progress.

The existing CAP tests give a school-wide score, but do not yield scores for individual students.

Next spring, a new “performance-based” test that replaces multiple-choice CAP questions with inquiries that measure reasoning ability, writing skill and problem solving will be given to fourth-, eighth- and 10th-graders. It will generate individual and composite scores, and is expected to replace traditional CAP testing as well as the Comprehensive Test of Basic Skills, a test that gives individual scores and allows comparisons with students nationally.

State education officials acknowledge that there are significant flaws with the CAP tests.

But there had been no CAP testing for two years because of state budget cuts, and dozens of groups that fund education projects require some kind of school assessment. For those reasons, said Sacramento education consultant Jim Miller, the state Department of Education decided the test should be given one last time, to eighth-graders only, at a cost of $1.35 million.

Besides the overall scores published here, individual schools receive breakdowns that show scores adjusted for such variables as parents’ level of education, language proficiency and student mobility.


To determine the similarity groupings that are used as a basis for the individual school rankings, four factors are considered, according to Pat McCabe, an administrator in the state education office’s program evaluation and research division in Sacramento. They are: parent education, the percentage of students with limited English proficiency, student mobility, and percentage of students on welfare (Aid to Families with Dependent Children).

“Of the factors used to group the schools and predict scores, level of parent education turns out to be the most important,” McCabe said, “followed by language proficiency. AFDC and mobility do not play a major role.”

However, some schools have complained to the state that the data used to arrive at rankings is inaccurate. Consultant Miller told of a San Bernardino school that suspected its students had overestimated the level of their parents’ formal education. The school asked the same students who supplied the information and took the test to write essays about what their parents did and where they went to school, then had teachers check the information in the essays against what was written on the CAP forms. The two didn’t match. “When placed in a lower (by education) comparison group,” said Miller, “their performance and rank went up.”

State officials and local educators suspect that something similar may be dragging down Westside schools’ rankings, although the schools are not told exactly which schools they were compared with.

“It is inconceivable to me,” said Revere’s Gaydowski, “that more than half our students said their parents were college-educated, when we know that a fourth are not proficient in English and that more than two-thirds are bused in from poor, inner-city neighborhoods.”

The information on welfare families does not reflect students who are bused to Westside schools, but only those students who actually live within a school’s geographic attendance boundaries, state officials said. This means that Westside schools face comparison with schools in other affluent, predominantly white areas, even though their student bodies may differ markedly.


Los Angeles school board member Mark Slavkin, whose district includes much of the Westside, said he was concerned that the latest CAP test results could propel parents who are already frustrated and anxious about the public schools into flight.

“Every time these scores are published, without context, the data tends to be misunderstood and used to make decisions that are often not well-considered,” he said.

School officials note that there can also be wide but inexplicable swings from year to year at a school, depending on the eighth-grade class. At Horace Mann Elementary School in Beverly Hills, for example, the average score dropped by 41 points this year, although it was still above 300 and 64% of the students were judged to be performing adequately or better.

“From year to year, the scores may not be the same, but we are still high,” said Mann principal Arthur Fields. “We just had an extremely bright class two years ago.” He said the staff reviews the results, assesses weaknesses and tries to come up with new ideas, but does not agonize over changes.

Educational experts warn against attaching too much importance to the scores. They need to be broken down by race, ethnicity, gender, income level and the like to provide a sense of how different groups are faring academically and where resources should be targeted, said Linda Wong of the Achievement Council in Los Angeles, a nonprofit group that works with low-achieving schools in Southern and Central California.

“It is evident that we need other kinds of benchmarks to measure school programs,” Wong said. “For test scores to go up, you need to have certain conditions in place. Piecemeal school reform will not be effective, and gains you do see won’t be sustained over the long haul.”


How the Schools Ranked

This table shows the percentage of students at each school whose CAP performance was at an “adequate” level, as determined by state standards. This level is intended to indicate a basic mastery of the state curriculum.

The second number is the school’s ranking on a scale of 1 (worst) to 99 (best) among other supposedly similar schools in the state. Each school is compared with the 20% of schools determined to be “most similar” to it in terms of student background.

Educators caution that the adequacy percentage and the rankings are potentially misleading for a variety of reasons, particularly in the case of Los Angeles’ Westside schools.

             Students   Percent    Relative
             Tested     Adequate   Rank
Statewide    301,190    46.5%      N.A.
L.A. County   79,192    39.3%      N.A.

Beverly Hills Unified

               Students   Percent    Relative
               Tested     Adequate   Rank
Districtwide   308        70.0%      86
Beverly Vista   86        76.3%      94
El Rodeo        73        72.4%      90
Hawthorne       75        66.8%      76
Horace Mann     74        64.0%      68

Culver City Unified

                     Students   Percent    Relative
                     Tested     Adequate   Rank
Districtwide         330        56.2%      66
Culver City Middle   330        56.2%      66


Los Angeles Unified

                             Students   Percent    Relative
                             Tested     Adequate   Rank
Districtwide                 31,972     32.3%      35
Bancroft                     308        31.0%      20
Burroughs                    496        53.2%      64
Emerson                      392        43.2%      35
LeConte                      267        27.2%      37
Marina Del Rey               174        30.8%      45
Mark Twain                   220        30.2%      51
Mt. Vernon                   495        23.7%      22
Palms                        441        54.9%      74
Revere                       304        45.6%      26
Webster                      241        26.9%      21
West Hollywood Opportunity    15        19.0%       7
Westside Alternative          49        37.9%      19
Wright                       356        45.3%      14

Santa Monica-Malibu Unified

               Students   Percent    Relative
               Tested     Adequate   Rank
Districtwide   665        58.0%      64
Alternative     11        69.6%      92
John Adams     228        41.7%      50
Lincoln        321        64.1%      63
Malibu Park    105        69.9%      82
