Pressure for High Scores Blamed in Test Cheating

Times Staff Writers

It was Christmas Eve, 1986, a time when most school officials were easing into the holiday vacation. But hidden away in a tiny cubicle at state Department of Education headquarters in Sacramento, Linda Pursell, a 19-year veteran of the Los Angeles Unified School District staff, was methodically going through pile after pile of peach- and green-colored California Assessment Program test answer booklets.

Only weeks before, the district had been alerted by state officials that computers grading the tests had kicked out the names of 18 of its elementary schools with unusually high numbers of erasures on the multiple choice answers, an indicator that cheating may have taken place.

Pursell’s job was to see if the erasures had been made to change wrong answers to right, and whether the changes were in handwriting other than the student’s.

After hours of painstaking study, a pattern emerged and she suddenly “felt almost physically ill.” In a phone conversation with her bosses in Los Angeles, Pursell, the head of the district’s testing unit, related her findings:

“They really did it,” she told them.

And so began a highly unusual investigation into who “they” were.

Who changed hundreds of answers on the tests? Teachers anxious to please principals or parents? Principals or administrators climbing the professional ladder and under pressure from the district? School aides and clerical workers who helped teachers “clean up” messy test booklets? Proctors--sometimes parents or other teachers--who provided security during the exams?

Under fire at last week’s school board meeting, Supt. Leonard Britton admitted that after investigating what is believed to be the district’s most widespread cheating scandal, only one thing seemed clear. “It was someone other than the students,” Britton said.

In all, at least 50 elementary schools statewide, including 24 in Los Angeles, have been accused of cheating on third- and sixth-grade California Assessment Program (CAP) tests over the last three years, with scores at some highly questionable schools invalidated. But the Los Angeles district’s investigation of tests taken during the 1985-86 academic year was abandoned without confronting any suspects.

Situation Too Murky

The district cited legal and ethical reasons for throwing in the towel on the 1985-86 incidents--concluding that the situation was too murky to hold specific individuals responsible for such widespread tampering. The district said it will continue to investigate, with help from a school police officer, the six schools thought to have cheated on the 1986-87 tests and one suspected case in 1987-88.

But in the wake of the cheating probe, district officials have found themselves--not the cheaters--at the center of controversy. Critics charge that high-level administrators not only set the scene for cheating to occur by placing heavy pressure on schools to get high scores, but then failed to provide fail-safe testing security, and finally, curtailed the 1986-87 test investigation without even directly questioning those most likely to have done the cheating.

And in another ironic twist, the very system the state and district developed to link erasure marks on tests to cheating is now under fire. San Francisco Unified School District officials, saying that they did not want to become a “lynch mob,” have fired off a 20-page letter to the state Department of Education, insisting that their questionable test scores be validated because the detection process was faulty.

Meanwhile, the fact that the San Bernardino City Unified School District successfully investigated cheating turned up by the state has led some critics to question why Los Angeles could not have done the same.

Based on interviews with dozens of teachers, principals and administrators who handled the tests, a picture begins to emerge of how and why the cheating may have occurred at Los Angeles district schools in 1985-86--and how it might have been avoided.

Intense Pressure

At the top of the list of concerns is pressure from the office of state Supt. of Public Instruction Bill Honig as well as from the district office to do well on the CAP tests--mathematics, reading and writing exams that are given annually to third-, sixth-, eighth- and 12th-graders in public schools statewide. There are no individual student scores, but school scores are publicized and used widely by parents to judge the quality of their children’s education. Even real estate agents cite high test scores in sales pitches to prospective home buyers.

Two kinds of cheating appeared to have taken place--wholesale changes where answers for nearly an entire class were altered to dramatically boost test scores, and scattered instances in which someone changed only a few answers.

One retired Los Angeles district teacher said she saw teachers change answers in a CAP test in 1985.

“We were all together in the testing coordinator’s classroom cleaning up tests. We were told to fill in (answer) circles if they were only half filled in. But some teachers were saying, ‘Oh, my God, William knows that answer.’ So they changed it.”

‘Make Stupid Mistakes’

Said another teacher: “A lot of the kids just want to get the damn thing over with . . . so (they) make stupid mistakes. Teachers know their kids. If I’ve got a kid I know made real dumb mistakes, I’d say, look at it again and fix your mistake.

“Is that cheating or not? Some people might say you can’t do that. I don’t see it as wrong.”

Many of those interviewed talked about an atmosphere of paranoia caused by pressure to raise test scores. One veteran teacher in a part of the district with a cluster of schools that tampered with tests recalled the pressure his principal was under to raise CAP scores. The principal would come back from meetings with his regional assistant superintendent “so badly shaken that the staff would have to bolster him up.”

District officials acknowledged that persistently low test scores can be a factor when principals are demoted or transferred. And, teachers get annual memos before test time showing how their students have performed compared to earlier years and to other classes.

‘Test Scores, Test Scores’

“You can be as mature as you want about it, but when you get bombarded by people saying, ‘Aha, I saw your (school’s) test scores’ and then you have pressure, politicians like Honig, people saying ‘test scores, test scores,’ you start thinking it’s important,” said one veteran Los Angeles district teacher.

“No one ever told you to cheat,” said another teacher who, like nearly all of his colleagues, requested anonymity. “But they hand you your past test scores, and say, ‘Gee, your scores are low; we need to get a certain percentage on reading,’ or whatever. It makes it tempting to do something.”

Alfred Moore, a Los Angeles assistant superintendent, said that when he was an administrator of an area that includes parts of mid-Wilshire and West Los Angeles, six schools received state and national distinguished-school awards. That region had a cluster of suspected tampering cases too.

He attributed the honors in part to a special program for principals and teachers that, among other things, emphasized doing well on tests.

“Some might have been pressured by such programs. To be successful, the program required extra work and striving to do better professionally,” Moore said. However, he added, he does not know why cheating occurred in his region and said he never condoned it.

Honor Taken Away

In another part of the district, Franklin Avenue School was stripped of its national distinguished-school status by the district because of cheating allegations. District investigators found that in two sixth-grade classes, 49 of the 52 tests had erasures on them, and that 85% of the erased answers in one class and 92% in the other were changed from wrong to right.

Some educators note that administrators and principals--who are not tenured or protected by unions--might feel the most pressure to have their schools do well on standardized tests. District officials said poor test scores by themselves would not be sufficient to demote or transfer a principal. But they acknowledge that test performance could be a factor in evaluating principals.

“Administrators might even be more susceptible to pressure,” said Paul M. Possemato, an associate superintendent who was in charge of the district’s CAP probe. “Teachers really have very little to benefit from a school score.”

How changing answers can affect a school’s ranking was dramatically illustrated at Bandini Elementary School in San Pedro, where third-grade reading scores jumped 66% on the 1985-86 CAP test. According to statewide ranking, that score placed the school higher in reading than 94% of all third-grade classes in the state--a reversal from the previous year when Bandini scored worse than 95% of all third-graders statewide.

Educators note that there were other programs, such as the state “Cash for CAP,” which put even more emphasis on test scores. The program rewarded high schools with thousands of dollars for raising their CAP scores. The program was canceled because of lack of financing by the Legislature. There were no reports of cheating in that program, however, officials said.

Said one teacher: “There was a tremendous push when there was cash motivation. They would create practice material during the summer, and every day in homeroom they’d give the kids test times and have them practice.”

System Was Haphazard

Over the years, the state had used dramatic increases in test scores to find occasional instances of cheating. But that system was haphazard, and officials say they were never sure that they were catching all the tampering cases. So in 1986, the state contracted with Questar Data Systems, a data-processing firm in Minnesota, not only to grade tests but also to provide a way to electronically detect cheating.

A computer scanner was able to detect erasures on tests, and that was the key to the new sleuthing system. Using statistical data, the state determined that, on average, children taking the CAP test erased answers approximately 3% of the time. State officials notified districts about schools that greatly exceeded the rate.
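In essence, the screening reduced to comparing each school’s erasure rate with the roughly 3% statewide average and flagging the outliers. The sketch below is only an illustration of that kind of check; the three-times-the-average cutoff, the data layout and the school names are hypothetical, not details of the state’s actual system.

```python
# Illustrative sketch of an erasure-rate screen like the one described above.
# The 3% statewide average comes from state officials; the cutoff multiplier,
# data format and school names below are hypothetical.

BASELINE_ERASURE_RATE = 0.03   # average share of answers that students erase
FLAG_MULTIPLIER = 3            # assumed cutoff for "greatly exceeded the rate"

def erasure_rate(school):
    """Fraction of a school's answers that show erasures."""
    return school["erasures"] / school["answers"]

def flag_schools(schools):
    """Return the names of schools whose erasure rate far exceeds the baseline."""
    cutoff = BASELINE_ERASURE_RATE * FLAG_MULTIPLIER
    return [s["name"] for s in schools if erasure_rate(s) > cutoff]

# Example with made-up figures: about 2.8% erasures at one school, 15% at another.
sample = [
    {"name": "School A", "answers": 5000, "erasures": 140},
    {"name": "School B", "answers": 5200, "erasures": 780},
]
print(flag_schools(sample))    # ['School B']
```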

The first district to respond was Los Angeles, which sent Pursell to Sacramento to examine the suspicious tests. She was amazed to find that an overwhelming number of erasures on the suspect tests were to change wrong answers to right ones.

She then looked further at individual tests and noted that answers appeared to be filled in by different people. For example, some individuals use a spiral motion to darken the multiple choice circles; others use horizontal or vertical strokes. It was obvious, Pursell said, that some answers were changed by someone other than students. The district later went through all the tests at schools where questions were raised.

District officials believed that they had proof that cheating occurred but concluded that they lacked evidence about who actually made the changes and therefore decided not to directly confront anyone--teachers, principals and others--about cheating.

Direct Confrontation

San Bernardino City Unified, on the other hand, confronted its suspects, and one teacher resigned.

“Even if the evidence was circumstantial, it was overwhelming,” said John Woodford, the district’s head of employee relations, who questioned the teacher who later resigned.

Commenting on Los Angeles’ reluctance to question suspects, Woodford said: “Not only do you have the right but the obligation. . . . Whether it’s lying, cheating, child molesting, at some point you have to confront the suspect.”

But Los Angeles officials said that after a meeting with district legal advisers, CAP investigators and other administrators, they concluded that they could not go ahead with their investigation. Britton, Los Angeles’ superintendent, told the school board that because of a “lack of solid and clear evidence, disciplinary action against any individual staff member would be difficult, if not impossible, to substantiate.”

Los Angeles district manuals for test-taking in 1985-86 were geared to preventing student cheating, not tampering by others. The state’s examination manual that year said it was advisable to have proctors for large groups.

Last-Minute Warning

It was not until the state had reason to believe that widespread cheating had occurred that Los Angeles in March, 1987, added a last-minute warning to instruction sheets for the May tests. The notice said simply: “It is important for you to know that the state Department of Education will be monitoring schools to determine whether tests are properly administered and directions in the manuals are followed.”

In the fall of 1987, the district finally issued detailed security guidelines, which spelled out that the tests were to be kept in locked cupboards, not in teachers’ rooms, that teachers were not to take tests home, and that teachers were to provide only “appropriate” assistance to students. The eight-page bulletin emphasized that the “integrity of the test and district” was to be maintained.

None of the new safeguards instituted in Los Angeles have addressed “cleanup,” a practice in which test-givers are directed to erase extraneous pencil marks and doodles on tests so that the computer would not get confused and read them as intended answers. Those who were involved in the testing told The Times that this was where cheating most likely occurred.

In each class, different forms of the test were used so that in general no two students answered exactly the same questions. This meant that there was no quick way to alter the tests. But tests often were left in classrooms or offices for several days before being sent on to a central district pickup point for grading later by the computer company in Minnesota. Officials speculate that, in many instances, individuals who cheated took the tests home so they could read each test and make changes. (Officials estimate that it would take about 10 minutes per test to go over the answers.)

Cleanup Unnecessary

Los Angeles officials are still telling test-givers to clean up answer booklets. But Dennis Dillon, president of Questar, which grades the tests, said new equipment, used for the first time on this year’s tests, has made cleanup unnecessary.

Meanwhile, the state last year directed districts for the first time to provide independent proctors for all CAP tests. Officials at some Los Angeles district schools told The Times, however, that they did not have enough proctors at this year’s exams. And, in an added measure to beef up security, the state this year also began providing special bags and seals for use in storing the tests.

Times staff writers Elaine Woo, Carlos V. Lozano, John L. Mitchell, Patricia Ward Biederman, Bob Williams and Mary Barber contributed to this story.

HANDLING AND GRADING OF THE TESTS

Route followed when administering and grading a typical California Assessment Program test in 1985-86. The tests were given between April 23 and May 9. While some schools turned in their tests before the May 9 deadline, others kept the tests the full 17 days.

STEP 1--Teachers distribute test booklets to third and sixth graders. Students write their names on the booklet cover. Each gets a different version of the test, which takes an average of 35 minutes to complete.

STEP 2--Teachers collect tests. Depending on the school:
A. The teacher transfers the tests immediately to a school test coordinator or the principal for safekeeping.
B. The teacher keeps the tests, checking through the pages and legitimately “cleaning up” extraneous pencil marks that might be misread by the scoring computer.
C. The teacher improperly alters test answers from wrong ones to right ones.

STEP 3--Students who were absent take makeup tests over a number of days until as many students as possible have completed the exams. Teachers give some of the tests; the testing coordinator gives others to small groups of students gathered in a single room, such as the library.

STEP 4--After teachers pass on the tests to a coordinator or principal, a number of things can happen:
A. The tests are locked in a special file cabinet or other safe place.
B. They are left on the principal’s desk for a few days.
C. The principal and testing coordinator legitimately “clean up” extraneous pencil marks that could be misread by the scanning computer.
D. The tests are improperly tampered with while under the control of the principal or testing coordinator.

STEP 5--By the May 9 deadline, all tests go to a regional collection point within the district, generally under the control of an assistant superintendent. Some principals have suggested that improper changes were made by higher-ranking officials at the regional centers. The tests remain at the collection points for about three days and then are shipped to a district warehouse, where they stay for about a week.

STEP 6--The tests are sent to Minnesota, where they are graded by Questar Data Systems, a data-processing firm that handles school tests and corporate surveys.
