Everybody’s a critic

Kyle Pope, a former writer and editor for the Wall Street Journal, writes about business and the media.

LAST WEEKEND’S opening of “Pirates of the Caribbean: Dead Man’s Chest” was the latest installment in what has become an almost surefire summer script: Big-studio movie opens to much marketing hoopla, critics slam it to the ground -- and audiences trample the turnstiles to get in.

In the case of “Pirates,” audiences loved the movie to the tune of $132 million, setting a weekend box office record. This despite reviews that were, at best, lukewarm (“It would take more than just one bottle of rum” to while away this movie, sniped Slate).

But the gap between the lackluster “Pirates” reviews and its remarkable box office is no different from what we’ve seen with Hollywood’s other popcorn hits this summer: “The Da Vinci Code,” “Mission: Impossible III” and “X-Men: The Last Stand” were all generally dismissed by reviewers, yet they’ve done well enough in theaters to challenge last year’s conventional wisdom -- that Hollywood was entering a prolonged slump.

You could chalk all this up to a predictable, long-established summertime standoff between high-brow critics and low-expectation moviegoers who want little more than an easy escape that gets them out of the heat. But there’s something more going on here. What we are watching is nothing less than the demise of the mainstream entertainment critic -- a reordering that will last well beyond the return of more serious movies this fall.

In movies, books and television, critics who once dominated the national discussion are being marginalized, replaced by a confetti of micro-tastemakers on the Web and elsewhere.

Although this empowers consumers -- who now can go out in the world and seek their own critics -- it soon may affect the quality of what we see, hear and read -- and not necessarily in a good way.

There will be no critical consensus of what constitutes quality. Rather, art and entertainment will be judged largely on the basis of popularity, an “American Idol” sensibility that could spread the summer-blockbuster phenomenon into every corner of pop culture.

Already, the profusion of critical opinions -- on Wikipedia.com and Amazon.com and Huffpost.com, among other places -- has become more influential collectively than any movie or TV critic in the New York Times. It’s no anomaly that John Updike’s recent novel, “Terrorist,” is both his bestselling and worst-reviewed book in years, or that crowds lined up around the block in New York to see “Three Days of Rain,” a drama starring Julia Roberts that was blasted by critics.

Sensing the prevailing wind, movie studios increasingly are bypassing critics, doing away with review screenings before opening day. Nearly a dozen movies have premiered so far this year without critics getting a sneak peek.

Producers of “The Da Vinci Code” didn’t go quite that far, but they did sharply limit early press screenings. Movie reviewers wailed at the slight. Moviegoers ignored their bad reviews -- “Da Vinci” already has raked in more than $350 million globally.

“They voted with their feet,” said Amy Pascal, movie chief of Sony Pictures, which released “Da Vinci.”

Meanwhile, in Pasadena this week, some of the nation’s most prominent television critics will be AWOL from the TV industry’s twice-annual schmooze-fest, victims of cost-cutting at their newspapers. As someone who has attended these TV “press tours,” as they’re called, I can attest that the number-crunchers didn’t make an entirely bad choice. The two-week event is notorious for mind-numbing dullness, bad PR spin and gratuitous boozing by reporters.

It’s also possible that the editors at home realize what the audience already knows -- that their critics aren’t the arbiters of taste they used to be and probably won’t ever be again.

Complaints by critics that they are being ignored or marginalized are as old as newsprint.

In the 1970s and 1980s, New Yorker writer Pauline Kael earned her critical stripes by writing about the sorry state of the movie business -- a decline caused at least in part, she contended, by filmmakers who ignored the advice of Kael and her acolytes.

A decade ago, Susan Sontag picked up the mantle, publishing her much-reproduced essay “The Decay of Cinema.” That piece, and a slew of others that followed, laid down the themes that help define the current disconnect between movie reviewers and the people who go to the cineplex. “The commercial cinema has settled for a policy of bloated, derivative filmmaking,” Sontag wrote, calling the 1996 batch of commercial flicks “astonishingly witless.”

But it was the release of “Titanic” a year later -- and the critical undertow that followed -- that marked the beginning of the end of conventional media criticism. Reviewers found the movie, already famously over-budget and over-hyped, to be overwrought. Writing in The Times, critic Kenneth Turan said that audiences had become so desperate for entertainment that they were “sadly eager to embrace a film that, putting the best face on it, is a witless counterfeit of Hollywood’s Golden Age, a compendium of cliches, that add up to a reasonable facsimile of a film.”

“Titanic” director James Cameron was having none of it. In a 1,200-word response, he went after Turan, setting forth many of the frustrations that have since led audiences to search out their own entertainment and analysis. His complaint -- that audiences are treated “like little children who do not have the sense or experience to know what is good for them” -- is the foundation of the long-standing disconnect between consumers and the mainstream media.

What’s changed in the last 10 years is that technology now allows the audience to do something about it.

Steve Jobs has fashioned a career as a corporate windmill-tilter, helping Apple Computer become a technology titan. Marketing for the iPod and iTunes borrows heavily from Napster, which sought to foment a music-sharing revolution among its users. By letting people collect and assemble their own music rather than buy it prepackaged from a store, these services give consumers a sense of being in control.

Similarly, satellite radio and television companies give their customers such a wide array of choices that users feel like they’re in command of what they see and hear, as opposed to having to listen to the records or watch the shows that critics deem most worthy.

Today, it’s mass opinion that’s deciding what we see. “American Idol” and “America’s Next Top Model” do this in a raw and obvious way. But soon, videos on YouTube.com will morph into movies, all because of the number of clicks they receive on the website.

The question now is what this user-generated decision-making will mean for our entertainment.

In the 1960s and early 1970s, filmmakers actually cared what critics thought, and some of them adjusted their next film accordingly. You could argue that movies were better as a result.

But now, with raw popularity increasingly calling the entertainment shots, there is no uniform standard of quality. It is the prevailing theory of our blogging-Wikipedia age that enough opinions, cobbled together and collated, congeal into a broader truth.

Sometimes that’s legit. But sometimes all we get is trash that happens to be popular.

As much as we like to think of ourselves as independent thinkers, resistant to the taste-making imposed by others, sometimes having a thoughtful outsider tell us what to think can make us all just a little bit smarter.
