
Nationwide Writing Skills Test Deemed Unreliable

TIMES EDUCATION WRITER

Results from a long-running nationwide assessment designed to track students’ writing skills over time are unreliable and have been scrapped.

The board that sets policy for the test, called the National Assessment of Educational Progress, voted at its quarterly meeting early this month to exclude the results of the 1999 long-term writing assessment from a trends report to be released this summer.

In addition, the data for the long-term writing test given in 1994 and 1996 will be removed from the statistics section of the U.S. Department of Education’s Web site (https://www.nces.ed.gov).


The long-term writing assessment, which has been administered seven times since 1984, was the only aspect of the National Assessment of Educational Progress found to be faulty. The problem does not affect any of the program’s many other tests, including a new writing exam first given in 1998.

*

National Assessment of Educational Progress results are widely followed and tend to grab headlines nationwide. California has turned in dismal showings in recent assessments. The state ranked second to last among 39 states in the 1998 federal assessment of fourth-grade reading skills, which showed that only 20% of the state’s students were considered proficient readers.

In rejecting the long-term writing results, the board acted on the recommendation of Gary W. Phillips, acting commissioner of the National Center for Education Statistics, which administers the assessment. He said he had “lost confidence” that the long-term writing data from those three years were reliable after a testing contractor revealed problems with the method used to compare results from one testing year to the next.

The contractor is Educational Testing Service of Princeton, N.J.

The problem, officials said, resulted from the small number of questions, or writing prompts, used in the test. Just six prompts were used in each of three grades: fourth, eighth and 11th. After comparing several years' worth of data, test company officials realized that the results bounced around too much to be reliable.

“We’re talking six prompts here . . . just humongously small,” said Peggy G. Carr, associate commissioner for assessment at the center. “There was not enough information to establish how well students were really performing.”

National Assessment of Educational Progress results are based on representative samples of students and are provided only for groups, not individuals.


The writing trend test, which has used the same questions from assessment to assessment, has caused confusion. Among other puzzling results, the 1992 test showed a sharp jump in eighth-graders’ scores over the 1990 results. That was followed by a downward drift in the 1994 and 1996 assessments.

*

The trend tests, also given in science, math and reading, are administered only to national samples and to far fewer students than the program's main exams, which are used to make state-to-state comparisons.

The new writing exam that was administered for the first time in 1998 has 20 prompts per grade, a number that officials said was more than enough to ensure dependable results.

Carr said states that have only recently begun to administer writing assessments could face similar problems in their efforts to report on trends.

The usual ways of calculating trends don’t work, she said. “We know that now.”

As a result, she said, states will benefit from what the national assessment effort learns about which approaches work and which do not.

Sharif Shakrani, deputy director of the National Assessment Governing Board, said officials will attempt to find a way to salvage the long-term writing results. That might involve adding more writing topics in the future or devising new ways to compare year-over-year results.


“We’re going into research and development mode to come up with another way,” Carr said.
