By overwhelming critical consensus, 2013 was a banner year for movies. End-of-the-year lists, that dependable fruitcake of entertainment journalism, arrived with festive unanimity. It was a "tremendous" (the Atlantic's Christopher Orr), "amazing" (the New Yorker's Richard Brody), "flat-out, stone-cold, hands-down spectacular year in movies."
As a theater critic who loves spending his free nights plunged in cinematic darkness, I couldn't have been more excited to get these reports amplifying the raves that came fast and furious all fall. There's nothing I like better than bingeing on Oscar bait in late December when I play Scrooge and take my holiday from the theater and its myriad revivals of "A Christmas Carol."
But as I caught up with most of the likely trophy contenders, I found that I was arriving at a different estimate from my reviewing brethren: What others proclaimed great, I would have begrudgingly called good — leaving me to wonder whether this might have been a more uplifting year for critics, relieved that the studios were remembering that adults buy tickets too, than for regular moviegoers, who judge a film solely on its merits.
For all its groovy '70s style, "American Hustle" couldn't overcome its tedious plot and flatly flamboyant characters. "Gravity" was as narratively absurd as it was visually spectacular. "12 Years a Slave," the best of the over-touted bunch, provided a harrowing and necessary historical corrective, which is invaluable but not the defining quality of a cinematic masterpiece. Other celebrated efforts left me similarly unpersuaded.
Can 2013 really be such a remarkable year when there's not one best picture Oscar nominee that I would be eager to see again?
Naturally, I have my own favorites. I found "Blue Is the Warmest Color" to be a mesmerizing study of love and social identity.
As someone who receives his share of ticked-off emails from readers with dissenting opinions, I don't expect my likes and dislikes to match another critic's. I learned this lesson from years on awards panels, in which deliberations often turned into trench warfare.
My suspicions can't help being raised by the way a rather debatable idea has been so widely adopted. "Is 2013 the Greatest Year for Movies Since the Gone with the Wind Era?" Indubitably! Why else would Vanity Fair bother to raise the question in an online headline?
It just so happens that this last year on Broadway was also considered a historic one. Critics kept rhapsodizing about a cornucopia that in previous decades might have seemed like the usual mixed bag.
Rather than single out 2013 as an exceptional year for drama on stage and screen, I posit a different reason for its noteworthiness: It marked the period in which grade inflation by critics became a commonly deployed strategy for dealing with the cultural and economic insecurity that shows no signs of abating in post-recessionary America.
George Bernard Shaw, as preeminent a critic as he was a playwright, cautioned against the idealization of the critics of yore. "Criticism is, has been, and eternally will be as bad as it possibly can be," he unceremoniously declared at a luncheon of the London Critics' Circle in 1929.
In voicing this characteristically mischievous opinion, Shaw was attacking a profession in which he argued authority is arbitrarily granted and subsequently protected no matter the degree of incompetence or irresponsibility. He did, however, note that there is "one check on the badness of dramatic criticism, and that is the talent of the critic."
Yet the talent of the critic, the quality that makes an individual voice worth attending to, is confronting new pressures undreamed of in Shaw's pre-Internet philosophy. The earthquake that has rocked print media and thinned the ranks of newspaper and magazine staff critics has undermined the job security that historically gave these writers the freedom to express themselves boldly.
As journalism has been reeling from the changes wrought by the digital age, so too has the entertainment industry, which has struggled to reinvent business models that can accommodate new modes of spectatorship and on-demand consumption.
Critics aren't oblivious to the fact that their livelihoods are tied to the health of the disciplines they cover. (How could they be with every box-office blip setting off a breaking news alert?) A strong film industry means not only more robust advertising but also continued cultural relevance.
Social media, moreover, have created a deafening echo chamber in which opinions are confirmed ad nauseam until a de facto truth has been established. Dissent from the status quo can be attention-grabbing for a time, but a limit is quickly reached at which point the dissenter becomes marginalized as a crank.
With so much editorial emphasis placed on readership numbers (hence all the award show overkill), there is the furtive temptation for critics to align themselves with marketing forces. A rave review will be widely circulated by the studio distributing the film or the theater producing the musical. The danger here isn't so much conscious as unconscious collusion.
Popular sites such as Rotten Tomatoes, while useful to consumers, are detrimental to critics for two reasons: By tallying up the consensus of reviewers, they throw into relief the loneliness and vulnerability of the outlier position, and by reducing criticism to a negative or positive assessment, they are the enemy of nuance.
The decline in weekly and alternative publications of influence has endangered the long-view perspective, as has the demand for all journalistic outlets to keep pace with the 24/7 media cycle. There was a time when critics such as Stanley Kauffmann and Pauline Kael offered correctives to the haste of daily reviewers. What one of my editors calls "slow criticism" has long been banished to the quarterly fringe.
One might think that with so many voices competing for attention in the Twitter-sphere, the renegade would be encouraged. But one of the many ironies of the oceanic Web is the way its cherished mode of snark has been used to enforce homogeneity. To take an unpopular stand is to subject oneself to a withering cyberattack, and conversely, to extol what everyone else adores is to be made a 15-minute hero.
A critic criticizing the state of criticism is implicitly engaging in an act of self-criticism. By airing these concerns I'm not trying to set myself apart so much as bring out into the open some of the insidious traps of modern-day reviewing.
The critics I know are an honorable, overworked lot, and there are cogent reasons for proclaiming 2013 to be a good year for the movies. But paradoxically the case seems rather less convincing when superlatives are being used to reinforce one another. (On a related note, if one or two TV critics would relent in calling this a Golden Age, I just might be persuaded to try out "House of Cards.")
It has always been easier to love something than to analyze the reasons for that love and point out where the love is weakest. To choose one example, you might be so seduced by the celestial choreography of "Gravity" as to overlook its ludicrous screenplay, but it's hard for me to credit your affection if you refuse to acknowledge glaring shortcomings.
As I write this, book critics are arguing over whether there's any point in publishing negative reviews given the wobbly state of book publishing, and "The Lego Movie" is riding high at the box office and on Rotten Tomatoes' Tomatometer. The commercialization of our culture is nearly complete. A relaxation of critical standards is hardly going to slow this process. Resistance can only lie in the unbeholden voice. To love an art form means not being afraid to tell its corporate overlords what they probably don't want to hear.