Findings of wine contest study hard for critics to swallow


Wineries covet gold medals, spending millions of dollars a year on competition entries and fees in hopes of bragging about awards on their bottles and in pitches to tasting room customers and wine club members.

But a study of U.S. wine contests published this week suggests consumers should not always assume that gold medal winners are outstanding wines.

Writing in the Journal of Wine Economics, retired Cal State Humboldt professor Robert Hodgson said he looked at the results for more than 4,000 wines entered in 13 U.S. competitions in 2003 and found little consistency in what wines won gold medals.


The findings were dismissed as “hogwash” by the organizer of the Los Angeles County Fair’s giant wine contest.

The study said that of the almost 2,500 wines entered in more than three competitions, 47% won a gold medal in at least one contest.

However, of those gold medal winners, 98% were regarded as just above average or below in at least one of the other competitions. Hodgson said that demonstrated how little consistency there was.

“Of the wines that entered five competitions and got at least one gold, about 75% also received no award in at least one of the remaining competitions,” he said.

“How can you explain this huge discrepancy?” the professor asked. “Either the wineries are sending non-uniform samples to competitions or the judges are simply unreliable instruments for assessing quality. What is the consumer to think?”

Hodgson, who taught oceanography and statistics, owns the small Fieldbrook Winery north of Eureka in Humboldt County. He decided to study wine competitions after seeing his wines win in some events and garner no awards in others.
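Hodgson's "unreliable instruments" hypothesis can be illustrated with a toy simulation. The sketch below treats each competition result as an independent coin flip; the per-competition gold rate and the number of competitions per wine are assumed values chosen only to show that pure chance can produce figures of the same magnitude as those the study reports, not numbers taken from the study itself.

```python
import random

random.seed(0)

N_WINES = 2500   # wines entered in multiple competitions (from the article)
N_COMPS = 5      # competitions per wine (assumed for illustration)
P_GOLD = 0.12    # assumed per-competition gold rate, tuned so that chance
                 # alone yields roughly the article's 47% figure

# Null hypothesis: each wine's result in each competition is an
# independent random draw, i.e. judges carry no consistent signal.
results = [[random.random() < P_GOLD for _ in range(N_COMPS)]
           for _ in range(N_WINES)]

gold_winners = [r for r in results if any(r)]
frac_gold = len(gold_winners) / N_WINES
frac_missed_elsewhere = (sum(1 for r in gold_winners if not all(r))
                         / len(gold_winners))

print(f"won a gold somewhere:         {frac_gold:.0%}")
print(f"...but missed gold elsewhere: {frac_missed_elsewhere:.0%}")
```

Under these assumptions roughly half the wines pick up a gold somewhere, and nearly every gold winner also fails to medal in at least one other event, which is the qualitative pattern Hodgson describes.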

Hodgson drew the ire of many wine contest organizers earlier this year when he published a four-year study of the California State Fair Wine Competition that found judges often rated the same wine differently when they tasted it twice in a blind group of wines.


His latest study used a compendium of wine contest entries and awards collected by California Grapevine, a wine enthusiast newsletter.

He’s also taking heat for this report.

“The conclusion that everything is just chance is hogwash,” said Robert Small, chairman of the Los Angeles County Fair’s wine contest.

“Our mission is to provide factual good information to consumers,” he said, adding that the contest selects winemakers and others who are skilled at judging the quality of a wine.

The competition also provides many wineries with an “affordable way” to get their wine noticed in an industry dominated by giant corporations with large marketing budgets, Small said.

Wineries entered about 3,600 wines in this year’s competition, paying a $75 entry fee. All but the makers of limited-production wines provided six bottles for tasting.

Joe Roberts, who writes the 1WineDude blog and is a certified wine educator, said Hodgson’s study failed to address wide differences in the way the contests were managed and in the tasting skills of judges.


A gold medal from a well-designed contest would be “meaningful” but looks “random” when lumped together with data from all the other competitions, he said.

Still, Roberts, who does not judge wine contests, said it’s wise for consumers to be cautious: “There is no place where a consumer can go to understand whether an award at one competition is any better than an award at another.”

The randomness of gold medals is no surprise to Hildegarde Heymann, a professor and sensory scientist at the UC Davis Department of Viticulture and Enology.

The problem, she said, is that the wine contest judges don’t have enough training in sensory perception to be able to taste wines and offer consistent and repeatable evaluations.

“The facts are that the results are not repeatable across competitions,” Heymann said. “Basically, this is a beauty contest rating Miss Universe as she walks down the runway.”

The study can be found online at