Confessions of an ex-pollster

NICHOLAS GOLDBERG is editor of the Op-Ed page and Current section of The Times.

THE WEEK I became a pollster, the following paragraph appeared in a New Yorker article about President Clinton, written by the magazine’s then-political columnist Joe Klein:

“The president [is] an absolute slave to data: nothing is left to chance (or sadly, to moral imperative). Each new idea is market tested before it is presented to the public.... Other presidents have been poll-obsessed, but none quite so microscopically as this one.... Such micromarketing may be remembered as this president’s most lasting, and most dubious, contribution to the art of governance.”

Some people might have been disheartened, but I was delighted. At last, I was entering a field just as it was coming into its own! From its earliest days, political polling had its critics -- those who thought it was too intrusive or too unscientific, or who worried (like Klein) that political candidates would become excessively dependent on public opinion at the expense of their personal principles -- but I was inclined to give it the benefit of the doubt.

I knew, of course, that candidates might take polls too much to heart -- and might even cynically change their positions based on them. But I tried to keep in mind what Abraham Lincoln supposedly said: “What I want to get done is what the people desire to have done, and the question for me is how to find that out exactly.”

In retrospect, my understanding of the business when I began the job in early 1999 was extremely unsophisticated (though perhaps not as unsophisticated as that of my mother, who suggested at one point -- perhaps in jest? -- that I was becoming an “upholsterer”). I believed, as most Americans probably do, that polls are simply a tool for finding out where people stand on issues and who they are going to vote for.

What I failed to grasp was that the primary purpose of our business was not to learn what voters think -- but to determine how they could best be persuaded.

The surveys I created in the years that followed would have little in common with the public opinion polls I had read in newspapers all my life (the kind that tell you, as Arianna Huffington once put it, that “59% of all Americans think ‘Ed’ is an ‘OK’ name, while 64% put on their pants left leg first”).

Yes, our clients wanted to know whether the voters would support them. And they certainly needed to know which issues mattered most to voters -- schools, taxes or crime.

But at their core, our polls were not about taking the proverbial pulse of the voters. As we used to explain in our pitch letters, we had no interest in providing clients with a useless “data dump.” We were seeking “actionable” information to prepare a detailed, quantitatively tested “blueprint” that in turn would help us craft the arguments that would resonate most forcefully with voters.

What did that mean exactly? It meant pinpointing how the public felt about our clients -- and then figuring out how to transform those perceptions among the voters we needed most. It was about driving up our candidate’s positive attributes while inoculating him or her against potential attacks. And the same with our opponents: We’d probe for their vulnerabilities and determine how they could be exploited.

Say, for example, our client was a 20-year veteran of the House of Representatives who wanted to run for the Senate. But after two decades in office, he wasn’t sure whether he was perceived as an energetic fighter for his constituents or as a lazy, aging political hack.

Enter the pollster! Could we buff up our client’s image by leveraging his votes in favor of healthcare coverage and his vehement opposition to raising taxes (which we would hammer home in uplifting television ads set against a background of comforting guitar music and photos of him with his kids)? Or would his opponents be able to outmaneuver us by harping on the 25 House votes he missed last year while vacationing in Bermuda?

Our goal was to run the campaign in theory before it started. To call 600, 800 or 1,000 test voters on the phone and begin to play out the arguments that would later be heard in union halls, direct-mail attacks, candidate forums, public debates and -- most important by far -- in millions of dollars of paid television ads to be aired in the closing weeks of the campaign.

A typical poll would open by screening out representatives of the media and screening in likely voters. We’d ask which issues were most important. We’d ask the critical “horse race” question: “On Nov. 3, there will be an election for U.S. Senate. Who are you going to vote for, Democrat John Smith or Republican Tom Jones?” We’d ask about our candidate’s “image”: Would you say he “cares about people like me” and “is effective,” or is he “in the pocket of the special interests”?

I learned the rules for writing questionnaires. Avoid loaded wording that encourages a particular answer (“Would you prefer to go to the movies or just stay home and watch television?”). Don’t “double-barrel” the questions so that they include more than one idea, confusing the respondent. (“Do you favor welfare reform if it requires an increase in taxes?”) Keep questionnaires short; respondents will only stay on the phone 15, 20, maybe 25 minutes -- and every hang-up is money down the drain.

But at the heart of our polls were the “messages.” These might include, say, a paragraph touting the candidate by naming all the environmental groups that endorsed him. A separate paragraph explaining that he “shares our values” because he voted against raising taxes. Another telling the story of his hardscrabble upbringing. Still another laying out his universal healthcare proposal.

After each message, we’d ask whether it was a convincing or unconvincing reason to vote for our candidate -- and we’d retest the horse race after each one. Maybe we’d also try out some negative arguments against our guy (“lazy, aging hack!”), and maybe some positives and negatives about our opponent.

Then we’d look at the results: Which messages were the most persuasive to voters -- especially the elusive swing and undecided ones? Which messages moved the vote most? If message A (“won’t raise taxes”) moves the race six points in our favor but message B (“backs universal healthcare”) moves it 12 points, then it’s a no-brainer, right?
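
To make that arithmetic concrete, here is a minimal sketch, in Python, of how a message’s “movement” might be computed from the horse-race retests. The candidates, percentages and function names are invented for illustration; they are not drawn from any real poll:

```python
# Hypothetical sketch of message testing. All candidates, numbers
# and labels are invented; nothing here comes from a real survey.

def margin(ours: float, theirs: float) -> float:
    """Our candidate's lead over the opponent, in percentage points."""
    return (ours - theirs) * 100

# Initial horse-race question, asked before any messages are read.
baseline = margin(ours=0.44, theirs=0.46)  # trailing by 2 points

# The horse race re-asked after each message is read to respondents.
retests = {
    "A: won't raise taxes": margin(ours=0.47, theirs=0.43),
    "B: backs universal healthcare": margin(ours=0.50, theirs=0.40),
}

# A message's "movement" is the change in the margin it produces.
for message, lead in retests.items():
    print(f"Message {message}: moves the race {lead - baseline:+.0f} points")
```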

By asking a series of demographic questions at the end of the poll -- including, but not limited to, age, income, race, religion and party registration -- we ensured that all the messages could be broken down by category: Do older African Americans respond to positive message A, but independent women respond to attack message B? Then maybe we need to put a 15-second ad on black drive-time radio making one point, while sending a four-color, glossy piece of mail into the homes of independent women saying something else? Tailor the message!
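
That breakdown, sketched roughly in Python with invented respondent records and group labels, amounts to a simple cross-tabulation of each message’s “convincing” rate by subgroup:

```python
# Hypothetical sketch of the demographic breakdown: tally how often
# each subgroup found each message convincing. Respondent records,
# group labels and message names are invented for illustration.
from collections import defaultdict

respondents = [
    # (demographic group, message heard, found it convincing?)
    ("older African Americans", "positive message A", True),
    ("older African Americans", "attack message B", False),
    ("independent women", "positive message A", False),
    ("independent women", "attack message B", True),
    # ...hundreds more records in a real 600- to 1,000-voter sample
]

tallies = defaultdict(lambda: [0, 0])  # (group, message) -> [convinced, asked]
for group, message, convinced in respondents:
    tallies[(group, message)][0] += int(convinced)
    tallies[(group, message)][1] += 1

for (group, message), (convinced, asked) in sorted(tallies.items()):
    print(f"{group} / {message}: {convinced / asked:.0%} found it convincing")
```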

During the years I was a pollster, I worked for gubernatorial candidates, U.S. Senate candidates and mayoral candidates across the country. I worked on political races in Serbia, South Korea and South Africa, among other countries (on the theory that our fundamental methodology would be effective anywhere democratic elections were held -- and occasionally even where the election was anything but democratic, as in Serbia).

I wasn’t always pleased with what I was doing or with the candidates I was helping. But I didn’t feel that my work was terribly sinister either. Politics is like football: At the end of the day, you win or you lose. It’s a high-stakes game and often a nasty game, in which a lot of money, a lot of jobs and a lot of important policies are on the line. Few candidates will turn down the help they need to hone their messages effectively.

Besides, we didn’t (usually) encourage candidates to betray their principles. Mostly, we helped them plot a strategy, marshal their best arguments and target the audience they needed to reach to win the race. Did it work? Not always, and not with scientific precision. But we won a lot more races than we lost.

I worked for some slimebag politicians, but I worked for some smart, dedicated and effective ones as well. Most of my clients, frankly, fell somewhere in between.

I lasted four years, and then I left the business. I don’t regret at all having tried it and, despite the turmoil in my current line of work, I rarely regret the decision to quit.
