Mixed reviews

Disney’s newspaper advertisement for “Up” featured the kind of blurbs typically associated with a critical hit, with Roger Ebert, Leonard Maltin, and the Wall Street Journal’s Joe Morgenstern among the reviewers quoted. But a closer look at Saturday’s full-page ad revealed a more unusual endorsement: a 98% “fresh” rating from the website Rotten Tomatoes.

The studios are always searching for new ways to sell movie tickets, and they are now looking to review aggregators such as Rotten Tomatoes, Metacritic and newcomer Movie Review Intelligence to generate box-office buzz by amplifying the sound of the critical chorus. As the sites grow more prominent, however, they also are attracting questions about their methodologies and about who, exactly, qualifies as a film critic in the Internet age.

In a way, the review aggregators are to movies what TripAdvisor is to hotels and the Zagat guides are to restaurants -- one-stop sites for consensus opinion. Whereas TripAdvisor and Zagat base their marks on consumer ratings, the movie aggregators generally rely on professional critics, although Rotten Tomatoes also counts a number of citizen-reviewers who write for obscure websites, such as Georgia’s self-proclaimed “entertainment man” Jackie K. Cooper.

Movie marketers say they like the sites because strong aggregate scores can boost ticket sales.

“Are they the driver? No. Can they help drive business? Yes,” says Mike Vollman, MGM’s marketing chief. “People want to know what the consensus is. I am a huge believer that in today’s culture, people don’t pay as much attention to individual voices as to the aggregate score.”

Adds Peter Adee, the marketing head at Overture Films: “I definitely think that if you’re fresh, it helps sell the movie. Reviews matter for an older audience, and certainly matter for an older female audience.”

Rotten Tomatoes (a reference to what moviegoers once hurled at screens showing bad movies) is by far the most popular of the three aggregator sites, attracting about 1.8 million unique visitors monthly, according to comScore. As soon as the site deems a movie “certified fresh” -- meaning that at least 60% of the roughly 400 critics it surveys give a movie a favorable notice -- studio executives call Rotten Tomatoes, asking for the small trophies the website dispenses to commemorate the accomplishment. Rotten Tomatoes also is considering adding its mark to DVD packages for movies scoring well.

But as rivals Metacritic and Movie Review Intelligence point out, Rotten Tomatoes can bestow its coveted “fresh” rating on a film that many of its counted reviewers -- hypothetically, all of them -- don’t really love. And though all three sites present numerical averages in their ratings, the calculations involve subjective scoring by the aggregators themselves, not just the critics.

Rotten Tomatoes’ scores are based on the ratio of favorable to unfavorable reviews. If a film lands 10 positive reviews and 10 negative ones, it’s 50% fresh; if the ratio is 15 good to five bad, it’s 75%. But if all 20 of those critics give the same film the equivalent of a B-minus letter grade, it’s 100% fresh, because every review was positive, even if only barely so.
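To make that arithmetic concrete, here is a minimal Python sketch of ratio-based scoring, using the article’s own examples; the function name and data are illustrative, not Rotten Tomatoes’ actual code.

```python
def fresh_score(reviews):
    """Ratio-based score: the share of reviews judged favorable.

    Each review is reduced to a single yes/no call, so a lukewarm
    positive counts exactly as much as a rave.
    """
    positive = sum(1 for is_positive in reviews if is_positive)
    return round(100 * positive / len(reviews))

print(fresh_score([True] * 10 + [False] * 10))  # 10 good, 10 bad -> 50
print(fresh_score([True] * 15 + [False] * 5))   # 15 good, 5 bad  -> 75
print(fresh_score([True] * 20))                 # 20 barely positive B-minuses -> 100
```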

“Our goal is the extension of thumbs-up and thumbs-down,” says Shannon Ludovissy, Rotten Tomatoes’ general manager. “It’s not a measure of the degree of quality.”

Metacritic and Movie Review Intelligence try to come up with an average reflecting how much critics actually like a movie, rather than a ratio of raves to pans. If a movie gets a 50% score on either site, it means the consensus of all the reviews the site read was 50% positive -- the average review, put another way, was two out of four stars.
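The contrast with the ratio method is easiest to see side by side. A minimal sketch, assuming each review has already been converted to a 0-100 number -- two out of four stars scoring as 50, and a B-minus hypothetically mapped to 67:

```python
def average_score(scores):
    """Average-based score: the mean of per-review marks on a 0-100 scale."""
    return round(sum(scores) / len(scores), 1)

# Twenty two-star (out of four) reviews average to 50 ...
print(average_score([50] * 20))  # -> 50.0

# ... and twenty mildly positive B-minus reviews (hypothetically 67
# apiece) average to 67, even though a ratio-based system would call
# the same film 100% fresh.
print(average_score([67] * 20))  # -> 67.0
```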

Like Rotten Tomatoes, Metacritic and Movie Review Intelligence assign every review they read a numerical score, a sometimes tricky endeavor because many leading critics (including those who write for the Los Angeles Times) don’t award letter grades or stars as part of their reviews.

And that’s where the subjectivity comes in.

David Gross, a former market research executive and 20th Century Fox studio executive who launched Movie Review Intelligence a month ago, says he and his staff read (or watch and listen to) reviews from about 65 top U.S. and Canadian outlets, including media companies as varied as Newsweek and National Public Radio, excluding the little-known Internet critics Rotten Tomatoes includes.

Gross says about three-quarters of the appraisals his site tracks already carry letter grades or star ratings, and that two analysts from his company assign letter grades to the notices that don’t. “It’s the clearest way in the human mind to differentiate” a review’s enthusiasm, Gross says. “And if the analysts differ in their grades, we have a discussion about it.”

Those assigned grades are then translated into numerical scores -- a B-plus is an 83, a C-minus rates a 42, and so on. Rather than simply average those scores, Gross applies a weighting system based on a reviewer’s circulation, with People magazine (circulation: 3.7 million) receiving the greatest weight. “All that counts is what moviegoers are reading and seeing,” Gross says. “I think the most important thing is to reflect what is going on in the market -- in the real world.”
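Gross’s two-step method -- translate grades into numbers, then weight by audience size -- can be sketched in a few lines of Python. The B-plus and C-minus values and People’s circulation come from the article; the rest of the grade table and the smaller outlet’s circulation are hypothetical placeholders.

```python
# Grade-to-score table: B+ = 83 and C- = 42 are the article's figures;
# the other entries are illustrative placeholders.
GRADE_SCORES = {"A": 96, "B+": 83, "B": 75, "C+": 58, "C": 50, "C-": 42}

def weighted_score(reviews):
    """Circulation-weighted average of grade-derived scores.

    `reviews` is a list of (grade, circulation) pairs; a bigger
    readership pulls the consensus further toward that review.
    """
    total_weight = sum(circ for _, circ in reviews)
    weighted_sum = sum(GRADE_SCORES[grade] * circ for grade, circ in reviews)
    return round(weighted_sum / total_weight, 1)

# People's B-plus rave (3.7 million circulation) swamps a small
# outlet's C-minus pan (a hypothetical 150,000 circulation).
print(weighted_score([("B+", 3_700_000), ("C-", 150_000)]))  # -> 81.4
```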

Metacritic, which was launched in 2001, uses a similar methodology to assess the reviews of the 43 critics it surveys, about half of whom don’t use stars or letter grades. But rather than translate a review into a letter grade, the site’s staff scores notices on a 0-100 scale in 10-point steps. “It’s still often hard to distinguish between what’s an 80 and what’s a 90,” says Marc Doyle, one of the founders of Metacritic. Some critics will contact the site to say its scoring of their reviews was wrong, and Metacritic will amend its mark.

Whereas Movie Review Intelligence weights reviews for audience size, Metacritic tips the scales for “prestige,” a calculation it calls its “secret sauce” that it won’t disclose. “Roger Ebert is weighted more than someone you’ve never heard of,” Doyle says.

Doyle says that when critics are consistently 75% favorable in their reviews of a movie, its Metacritic score is a 75. But that same movie could be 100% fresh on Rotten Tomatoes. “That’s a fundamental difference,” he says.

Gross says that because Rotten Tomatoes gives equal weighting to Time and tiny websites, it penalizes circulation. “What’s going on hurts critics, it hurts moviegoers and it hurts the industry,” Gross says. “What difference does it make if some fan boy says thumbs down to ‘Terminator Salvation’?”

Unlike many of the filmmakers who get poor scores on his website, Rotten Tomatoes’ Ludovissy says he doesn’t mind the criticism. Even if the site simply scores reviews yea or nay, Rotten Tomatoes endeavors to get its marks right. When a critic writes a mixed review, the site will call the author to see if they actually liked the film. (When a film is scored 59%, publicists will call Rotten Tomatoes to lobby for a reevaluation of mixed reviews that could push their movie over the magical 60% mark.) And the site does have an application process to admit its critics.

“All of them are accredited by some organization,” Ludovissy says. “It’s not just that Joe Blow starts a blog and then he’s a critic for us.”

--

john.horn@latimes.com

--

Good, bad or who really knows?

Metacritic and Movie Review Intelligence assign a numerical score (0 through 100) to each critic’s review and calculate an average across all sampled reviews. Rotten Tomatoes’ percentage reflects the ratio of positive to negative reviews -- if four out of five critics like a movie (even if not very much), for example, it receives a score of 80%.

‘The Taking of Pelham 123’

Metacritic: 58%

Movie Review Intelligence: 68.2%

Rotten Tomatoes: 48%

--

‘Imagine That’

Metacritic: 55%

Movie Review Intelligence: 60.7%

Rotten Tomatoes: 43%

--

‘Terminator Salvation’

Metacritic: 52%

Movie Review Intelligence: 55.7%

Rotten Tomatoes: 33%
