Reading the CAP test scores was a devastating experience. Pacific Palisades Elementary, which my children have attended since 1983, is ranked below the 20th percentile in both third and sixth grades, despite raw scores higher than those of several schools ranked in the 50th-to-70th percentile range. One school with a score of 251 ranks at the 67.9th percentile, while our identical score of 251 earns a 7.9th-percentile ranking. The same is true of many schools in the published results. Something is wrong. Raw scores are simple data; rank is a construct. The criteria used here must be challenged.
The key word in ranking is comparison to "similar schools." Yet the "comparable" local schools in our immediate area have bused populations of 1% to 20%, against our 50% bused rate. Because busing accommodates enrollment overflow from crowded schools, these populations often include the most transient students, with the least educational continuity and the greatest needs. Many of these children remain at a given school for only one or two years before moving on to another. In addition, our population includes about 30% ESL (English as a second language) students (my estimate), unlike our "similar" schools. Test-taking skills are not the academic priority for these students; they have more pressing needs, and good teachers meet those needs.
Pacific Palisades Elementary is a fine school with high standards. It is academically sound and culturally diverse, with a staff of dedicated, responsible, top-quality educators. Our CAP test "rank" does not reflect the quality of education that exists here. One week before the scores were published, our school received a glowing report from the Los Angeles Unified School District school site review team after a weeklong intensive evaluation of teachers, programs, administration, resources and organization. We choose to send our children here because we value an inclusive, ethnically enriched environment in which children can thrive socially and educationally. This has been our fortunate experience for several years.
This ranking system is inaccurately and unfairly constructed. It damages a school's reputation by presenting a false and degrading quantitative analysis that does not measure what is really taking place in the classroom. Prospective parents review these published "facts" as a standard by which to judge a school. How many of the other schools listed have suffered defamatory injury? How many more wonderful teachers have been dealt an undeserved insult to their professional reputations? How many children are chided by friends at other local schools about their school's inexplicable and indefensible low percentile rank?
Maybe now we can better understand some of the rampant frustration with administrative bureaucracies among parents and teachers alike. Beating good schools down with publicized misinformation is a bizarre mission for public education administrators. Playing fast and loose with the facts, to the detriment of our public schools and our children, engenders resentment toward the system and does nothing to enhance or encourage the academic process. Our kids deserve fair representation. How about it?