
Polls apart

Phil Trounstine and Jerry Roberts report regularly on California politics at calbuzz.com.

Daily Kos, an influential liberal website, recently released a poll showing that San Francisco Mayor Gavin Newsom was just nine points behind Atty. Gen. Jerry Brown in the Democratic primary race for governor.

Within minutes, the San Francisco Chronicle posted a blog item saying the poll showed the race was “narrowing,” comparing it to a June survey, conducted by a different company, that gave Brown a 20-point lead over Newsom. The item was quickly picked up and posted by Rough & Tumble, California’s premier political news aggregator. Then it was reported and re-blasted by “The Fix” at the Washington Post, one of the top political sites in the country. Within 12 hours, this characterization of California’s race for governor became received wisdom.

There was only one problem with this wisdom: It was wrong.

The incident illustrates how political misinformation and misinterpretation can be more viral than the truth in the Internet News Age, as reporting on polls pulses through the electronic highway, launched by news organizations with little time to evaluate and sift the quality of research. In recent weeks, a series of California political surveys has produced a cacophony of often conflicting analysis, opinion and reporting that served to confuse readers and distort political perceptions.


For example, comparing and measuring the Daily Kos poll, conducted by Research 2000, against the previous poll -- done with a completely different methodology by Moore Methods Research of Sacramento -- created a false equivalency. In fact, a recent follow-up survey by pollster James Moore, who has long experience in California, found that, far from tightening, Brown’s lead over Newsom has grown to 29 percentage points.

A poll’s methodology -- including the sample size, method of selection and phrasing of questions -- is crucial. The Daily Kos survey, for example, used random digit dialing to reach California adults. To identify them as “likely voters,” pollsters asked respondents several questions, including whether they considered themselves Democrats or Republicans. But identifying 600 likely voters didn’t provide the number of Democrats and Republicans statistically necessary to measure the primaries, so pollsters called more people until they had 400 self-identified Republicans and 400 self-identified Democrats. Then, as they put it, “quotas were assigned to reflect the voter registration distribution by county.”

After this statistical slicing and dicing, the survey produced a final sample of alleged likely voters that included 18% under age 30 and 19% age 60 and older. But according to a real-world screen of likely voters -- based on actual voting histories -- the June 2010 primary electorate is expected to include about 6% people under 30 and 38% people over 60.
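To see why that age mismatch matters, here is a minimal sketch, in Python, of how the same group-level opinions produce different toplines under the two age mixes. The age shares are the ones cited above; the candidate-support rates are invented purely for illustration, not taken from any actual poll.

```python
# Illustrative only: the age shares come from the figures cited above;
# the support rates by age group are hypothetical, chosen to show how
# sample composition alone can move a topline number.

support_by_age = {"under_30": 0.55, "30_to_59": 0.40, "60_plus": 0.30}

# Age mix of the Daily Kos/Research 2000 sample (18% under 30, 19% 60+)
poll_sample_mix = {"under_30": 0.18, "30_to_59": 0.63, "60_plus": 0.19}

# Age mix of the expected June 2010 primary electorate (6% under 30, 38% over 60)
likely_voter_mix = {"under_30": 0.06, "30_to_59": 0.56, "60_plus": 0.38}

def topline(support, mix):
    """Weight each age group's support by its share of the sample."""
    return sum(support[group] * mix[group] for group in support)

print(f"Topline with the poll's age mix:       {topline(support_by_age, poll_sample_mix):.1%}")
print(f"Topline with the likely-voter age mix: {topline(support_by_age, likely_voter_mix):.1%}")
```

In this hypothetical, the age mix by itself shifts the topline by several percentage points before any question wording or quota decisions come into play.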

These issues alone would be enough to distort the state of the Brown-Newsom contest. But will any of them surface when the next reporter Googles the California governor’s race, looking for standings? Not a chance. Why does it matter? Because misreporting of polls allows campaign spinners not only to boost or suppress candidate fundraising but also to manipulate news coverage, frame campaign narratives and shape public perceptions.

The Daily Kos poll is far from an isolated incident, as misreading and misinterpretation of survey research have become endemic on the Web. Consider the following:

* A recent poll by the widely respected Public Policy Institute of California reported that 53% of registered voters now favor more oil drilling off the California coast, a finding trumpeted by supporters of the policy. But respondents were asked their view on drilling as one of several approaches “to address the country’s energy needs and reduce dependence on foreign oil sources,” a question likely to elicit a much different response than one about the environmental impact of drilling.


* A recent NBC/Wall Street Journal poll reported that only 43% of those surveyed supported a “public option” for healthcare reform -- an apparently dramatic swing from its previous poll, which found 76% support for the policy. On closer examination, though, it turned out that pollsters in the first survey asked people if they wanted the “choice” of a public option. In the later poll, they omitted the key word “choice,” asking simply whether respondents favored a public option. When Survey USA used the original language a short time later, 77% of respondents said they favored the public option, confirming the finding of the first NBC/WSJ survey.

* Some political analysts, citing an increase in the number and proportion of “independent” voters who decline to affiliate with a major party, have argued that California is becoming a post-partisan “purple state.” But the recent release of 30 years of surveys by the Field Poll showed how wrong this analysis is. On a host of divisive issues, such as abortion rights and same-sex marriage, independents have much the same attitudes as Democrats, keeping California a very blue state.

As established news organizations increasingly cut costs, first-rate, independent, nonpartisan polling is becoming scarcer. So polling stories should be viewed by readers -- and voters -- with great skepticism, and news outlets should use greater care in analyzing and disseminating survey data. Reducing political views to a number does not necessarily make them scientific. Caveat emptor.
