Stacey Schwartz got straight A's last semester.
A graduate student in psychology at California State University, Long Beach, she has written grant proposals and term papers. Last December she even received a letter from two of her professors commending her for, among other things, the "high quality" of her writing.
Yet Schwartz, 23, has come close to dropping out of school. Twice she has failed the essay portion of the Writing Proficiency Exam, a test she must pass before receiving her degree.
"What's the point of continuing?" she said recently in a moment of despair. "I'm not going to be a writer, I (want) to be a psychologist. This is garbage."
The test is required of all Cal State Long Beach students wishing to graduate or receive a master's degree. Yet last year only 64% of those who took it passed. For the rest, there are two alternatives: they can keep taking the writing exam, at $25 a crack, until they succeed, or they can transfer to another campus to complete their education.
That unrelenting fact has created a schism on the campus. On one side are administrators and faculty members who view the writing exam as a model of how universities should screen their graduates for English writing proficiency. On the other side are critics, mostly students, who say that at least half of the test is an irrelevant measure of a specific kind of writing skill, that it's graded too subjectively and in a potentially discriminatory way, and that it has far too much bearing on an individual's future.
Testing Begun in 1979
The university began administering the writing test six times a year in 1979, after the state board of trustees decreed the need to certify the writing proficiency of California State University graduates. Exactly how that was to be done was left up to each of the 19 individual campuses in the system. While most now offer students the option of taking a course or passing an exam, according to Linda Bunnell Jones, state dean for academic programs and policy studies, three campuses in addition to Long Beach--in Hayward, Pomona and Northridge--require passage of writing proficiency exams.
At Cal State Northridge, an average of 80% of the students who take the test pass it the first time, a spokesman said, while Hayward has a pass rate of 73%. Pomona's rate was 67.8% this year.
Cal State Long Beach's exam was developed over more than 1 1/2 years by a team of faculty members.
The test, which students generally take in their junior year, is in two parts. In the first, students are asked a series of objective, multiple-choice questions designed to measure general knowledge of grammar and syntax.
In the second part, they are asked to write two essays--one in 20 minutes, the other in 40--on subjects of general or personal interest chosen specifically for each exam session. Recent topics have included favorite ways of relaxing and the phenomenon of women entering professions previously reserved for men.
Students must pass both the objective and the essay portions in order to pass the Writing Proficiency Exam. While few have difficulty with the multiple-choice questions, the essays have often proven to be major stumbling blocks.
Method of Grading Criticized
Part of what makes some of the students mad, they say, is the way in which the essays are graded.
Following each exam--which may be attended by as many as 2,600 students--faculty members gather in a large room to engage in what a mimeographed informational handout calls a "holistic" grading session.
The idea, according to Eileen Lothamer, a professor of English who helped develop the exam, is to form quick "total impressions" of the essays before deciding whether to give each one a pass or fail grade. Spending an average of one to four minutes on each short essay and three to five minutes on each long one, she said, readers look for organization, detail, word choice, development, completion and flow. They grade each piece on a scale of zero to six; a score of four or above is passing, Lothamer said.
Faculty readers, who serve on a voluntary basis, come from virtually all disciplines on campus. Their only preparation, according to Alice Brekke, another English professor who administers the program, is a four-hour workshop under her supervision. But the system works, she said, because it has a series of built-in checks and balances. Each essay, she said, is read independently by two readers whose combined score of 0-12 determines its ultimate disposition. Separate readers are used on each of a student's two essays. And whenever a discrepancy of more than one point occurs between the two scorers of a single essay, she said, a third reader intervenes to make the final judgment.
"We're not expecting a whole lot," said Lloyd Hile, a professor of chemical engineering who has been grading the essays for five years. "Anyone who does an adequate job in an English composition course should be able to pass this exam."
Yet students like Schwartz point to several facts as evidence that the system is flawed. For one, they say, their names are on the front of the test sheets, within view of the scorers, thus opening the door to possible bias. For another, the critics complain, the fact that tests are returned unmarked and unsigned makes it virtually impossible to determine why a particular essay was graded the way it was. Brekke said students who fail the exam are encouraged to visit the campus Learning Assistance Center, where a battery of tutors is on hand to go over their essays and suggest areas of improvement.
Tutors Are Students
But the tutors are themselves students, and not all of them are even English majors, according to Allison Smith, a graduate student in linguistics who oversees them. Their preparation, she said, consists of 20 to 30 hours of instruction under her supervision. And though the tutors are free to make suggestions, they, like the people they serve, have no way of knowing specifically why a given essay received a particular score.
"All they can do is guess," Schwartz said.
After failing the exam for the first time, Schwartz said, she worked with three different tutors over several weeks to improve her score. "Each one had a different opinion on why I had failed," she said. The next time she took the test, she didn't do much better.
Problem for EOP Students
Especially hard hit are students in the Educational Opportunity Program, only 40% of whom pass the test the first time they take it.
"It's probably the No. 1 obstacle to graduation for many of our students," said Jaun Mestas, who, as associate director of student development programs, oversees the state-mandated program in which selected low-income students, mostly black and Latino, enter the university through special admissions procedures.
Alarmed by those students' generally low performance on the Writing Proficiency Exam, Mestas and other faculty members last year formed a special task force to look into the problem. Its recommendation: establish a special semester-long writing course culminating in the actual administration of the Writing Proficiency Exam. Mestas said the proposal was in the process of being written up and would eventually be submitted for consideration by the vice president for academic services.
Course Taken Years Ago
"We should be teaching the skills that will be tested," Mestas said.
Though most students are already required to take at least one course in which writing skills are emphasized, he said, it is often completed early in their college careers. "The evidence is that there are many who take the course and then do not pass the proficiency exam," he said. "They take the exam years after taking the course, so nothing is fresh in their minds."
Other groups facing special challenges in demonstrating writing proficiency are students for whom English is a second language, and students in majors such as engineering and chemistry, in which writing skills are not emphasized.
Until last summer, Lothamer said, the exams of students for whom English is a second language were routinely separated from all others and graded by readers specially "sensitized" to the language problems of foreign-language speakers. The practice was discontinued, she said, due to concerns regarding its legality.
Even today, according to Brekke, foreign-language students who can demonstrate special need are placed in a separate room and given extra time on the essay portion of the test. In rare instances, said John Haller, acting vice president for academic affairs, the proficiency requirement is waived altogether for students who have failed repeatedly due to untreatable emotional or physical learning disorders that have been documented by a psychologist or physician.
For most, however, the pressure to get by this last hurdle to graduation is pervasive. "I can write to communicate, but in 20 minutes I can't think," said Jim Anderson, 21, a business major who recently took the test for the second time. "I'm sure I could do great in business courses and never know how to write."
Another young man, who would not give his name, said he had already left school and accepted an engineering internship at Hughes Aircraft Co., where he was succeeding admirably in a position that included some writing responsibilities. But his career was blocked, he said, because he had failed the writing test five times and nearly a year after completing his course work had not yet received a degree.
"My managers have been really understanding," he said, but without the parchment there can be no advancement to a regular staff position with its engineer's title and salary. "I think it's unfair. Other schools don't require this. What they should do instead is require more English classes."
Brekke attributes the low pass rate, which administrators say has improved by about 5% in the seven years since the test was first introduced, to what she calls the historical lack of writing instruction and the extraordinarily large class sizes in secondary schools throughout the state. "It's alarming," she said.
But she added that the situation is improving. "Most students do eventually pass the test," she said. "They know what the requirements are and they take appropriate action to (satisfy) them."
And the exam itself? "It's a very practical kind of assessment for people completing a degree," Brekke said. "No matter what field they are in, they must know how to write."