ACT Weighs Ways to Create Essay Portion of College Test

Associated Press Writer

Hunkered down in a windowless conference room, five scholars analyzed a high school student’s essay with a scrutiny normally reserved for the likes of Hemingway or Dickens.

“We need to be more forgiving about the endings,” said Roseanne Cook, a program manager for ACT, examining the handwritten pages. “We need to recognize what happens when they run out of time, or what happens when the proctor says there are only three minutes remaining.”

Cook and ACT know something about deadlines.

The clock has been ticking down for the testing service since last year, when ACT announced that it would add an optional essay to the second-most-popular college entrance exam in the nation starting in spring 2005.

Since then, test-makers on ACT’s leafy campus have been working on the painstaking task of crafting precise, thoughtful questions that will inspire great writing from America’s teenagers. The researchers also are figuring out how to grade an exam for hundreds of thousands of students that’s not multiple choice.

“What we’re interested in is measuring the writing skills,” ACT President Richard Ferguson said. “Do the sentences flow logically? Do they hang together?”

A similar process is under way in Princeton, N.J., where the Educational Testing Service, which produces the SAT entrance exam for the College Board, is also preparing an essay to be introduced in 2005.

Pressured by the University of California, the biggest public higher education system in the nation, to expand the scope of its exam, the College Board announced in June 2002 that it was adding an essay to the SAT. ACT's announcement came two months later.

While ACT decided to give students the option to forgo the essay -- because many colleges already require a writing sample with their applications -- the SAT essay will be mandatory.

ACT is unsure how many of the 1.1 million high school students who take its tests each year will decide to write an essay. A survey found that, so far, about half of the colleges that accept the ACT have said they will not require students to take the writing assessment.

To make sure that it’s ready for students who do exercise the option, hundreds of ACT employees and outside consultants are immersed in a labor-intensive process that includes reviews of subject matter and drawn-out debate on the phrasing of essay questions -- known as prompts.

“We’ve never had the perfect item,” said Sherri Miller, ACT’s director of elementary and secondary school measurement and research. “But we get pretty close.”

David Duer, a senior language arts specialist, noted the change in going from a multiple-choice exam to an essay: “It’s a whole different experience to sit down with the writing. You get to understand who they are and what they are thinking.”

To put all the students at the same starting point, ACT is identifying general-interest essay topics relevant to teenagers of every religion, race, ethnicity and state in the union.

“The bottom line is, whatever we wind up with needs to be a prompt that is clear to all students, fair to all students and gives each student a chance to show what they’ve learned in school,” said Cyndie Schmeiser, ACT’s vice president of development.

Company officials stressed that students will not be asked to elaborate on religion, politics, cataclysmic events or personal issues. So, for example, the essay will not be a forum for students to share their views on the Sept. 11 terrorist attacks.

“You certainly don’t want to create a situation where a student will be distracted by a strong emotional reaction to a prompt,” Duer said.

Once the subject for a question is chosen, the fine-tuning begins.

Each of ACT’s multiple-choice questions is subjected to a 2 1/2-year review process that assesses content, phrasing, fairness and bias.

Similar benchmarks will be applied to the writing sample.

“Nobody in the country knows that we sometimes spend up to 12 hours working on a single word,” Miller said. “But if my son was taking the test, I wouldn’t want that one word to be vague.”

Working from a pile of essays written by students in a high school pilot program earlier this year, Cook, Duer and a trio of academics that included college professors and an ACT testing expert brought the next phase of the process to the conference room.

Members of ACT’s “range-finding team,” the scholars are responsible for establishing grading guidelines that will be used for the exam. Eventually, upward of 400 professional “graders” will score ACT essays on a scale of one to six.

Although the same principles of grammar, spelling and punctuation used in the English section of the multiple-choice exam will be applied to the essay, the writing will not be scored solely on those components.

“They won’t get punched out if they put an apostrophe in ‘it’s’ when they shouldn’t,” Duer said.

Indeed, recognizing that a 30-minute timed essay allows for a rough draft at best, ACT officials emphasize that it is what the students say -- and not how they say it -- that will count the most.

“We are looking for some evidence of critical thinking with an emphasis on writing to convey that thinking to someone else,” Cook said.

On a recent morning at ACT headquarters, the team’s goal was to reach agreement on the grading scale: In other words, what makes one essay worth a “two” and another worth a “six.”

To do that, the team turned its attention to a pilot essay written by a student who responded to a test question that asked whether schools should intervene to stop nonviolent negative behavior, such as teasing.

To an outside observer, the essay was grammatically correct, but seemed to lack focus. There was no strong thesis.

The team was split on its first reading, unable to decide whether the essay rated a four or a five. Then, page by page, line by line and, sometimes, even word by word, the members reviewed the content again.

“I don’t know; I think this paper showed very haphazard movement,” said Barbara Kroll, an English professor at Cal State Northridge.

Duer also noted a lack of “coherent logic.”

“It started out fine, but became repetitious and stalled,” Cook said. “I didn’t have a sense of what she was trying to say.”

Kroll said that, despite its faults, the essay demonstrated creativity.

“When you’re faced with a topic that you don’t know what to do with, or how to address, the only choice is to waffle. And she waffled pretty good,” Kroll said.

When it came to the vote after 10 minutes of discussion, the team didn’t waffle at all; they dropped the essay’s score from five to four.

Then Cook plucked another paper from a stack of test essays and announced the name of the author. The team members began the process again.