Depending on who you listen to, Rotten Tomatoes is out to destroy the world.
Fine, that might be a slight exaggeration. But the produce-themed movie review site has taken some flak in recent years — some felt the site unfairly sabotaged DC's "Batman v Superman" in 2016 — and if your financial well-being is directly tied to annual box office receipts, Rotten Tomatoes could be a sore spot.
The basic argument is that even though Rotten Tomatoes is meant to be a reflection of critical and audience reaction, the site is having a pre-emptive effect on ticket sales.
Of course, promoters are more than happy to use the site when it serves their purposes. All kinds of well-reviewed movies will incorporate certified "fresh" scores into their advertising. It's only when a movie stinks that the complaints come along. I think the bigger issue — if you can call it that — is that most people don't understand how Rotten Tomatoes works.
When we see a percentage score out of 100, we naturally associate the result with the grades we got in school: scores in the 90s were A's, scores in the 80s were B's and so on. So when a movie such as "Black Panther" gets a certified "fresh" 97 percent on Rotten Tomatoes, we interpret it as a film that is near perfection, while a film such as last year's "The Emoji Movie," which lands around 9 percent, must be a steaming pile of garbage.
Now, most of the movies that score that low are steaming piles of garbage. But things get tricky around the 50-60 percent area (anything below 60 percent is considered "rotten"), where a percentage that might merit a D or an F in high school could mean something else entirely online.
This is because the Rotten Tomatoes score turns whatever metric the critic is using (stars, letter grades, thumbs) into a simple pass/fail. It doesn't represent how much critics liked it or how much they hated it. It merely separates all the likes from all the dislikes.
In that light, a film that scores 98 percent might not be all that far from one at 59 percent. It just means the 98 percent movie got mostly positive, though not necessarily glowing, reviews, while the 59 percent of critics who liked the second film might think it's the best movie of the decade.
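For readers who like to see the arithmetic, here is a minimal sketch of how that pass/fail aggregation works. The function, the 2½-star cutoff and the sample scores are all assumptions made for illustration, not Rotten Tomatoes' actual data or rules:

```python
# Hypothetical illustration of a Tomatometer-style score: each review is
# reduced to pass/fail against a cutoff, then the passes are averaged.
# The 2.5-star cutoff and the sample ratings below are assumptions.

def tomatometer(star_ratings, cutoff=2.5):
    """Percent of reviews at or above the cutoff ("fresh")."""
    fresh = sum(1 for stars in star_ratings if stars >= cutoff)
    return round(100 * fresh / len(star_ratings))

# A film of merely "good, solid" 3-star reviews scores the same as one
# drawing rapturous 4-star raves: both land at 100 percent.
solid = [3, 3, 3, 3.5, 3, 3]
raves = [4, 4, 4, 4, 3.5, 4]
print(tomatometer(solid), tomatometer(raves))  # both print 100

# A divisive film that half the critics love and half hate lands at 50.
divisive = [4, 4, 4, 1, 1, 1]
print(tomatometer(divisive))  # prints 50
```

The point of the sketch is that the score measures breadth of approval, not depth of enthusiasm: the individual star values vanish the moment each review is flattened to a yes or a no.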
Going back to the "Black Panther" example, I gave it 3 stars out of 4, which is my way of saying it was a good, solid movie, nothing wonderful. But my positive gets counted the same as every other positive, which in this case added up to 97 percent. I certainly wouldn't give it an "A" in that sense, but I would recommend it.
It's always interesting to see how Rotten Tomatoes interprets the reviews I feel a bit more divided on. If I wind up giving a movie 2½ stars, it's because there's something I clearly don't like about it but can't completely condemn it for. For example, "The 15:17 to Paris" had a great finish but stumbled quite a bit getting there, so I gave it 2½ out of 4.
To their credit, the Rotten Tomatoes reps will often email me to double-check before making a call on one of my more ambiguous reviews, and once you factor in the "audience score" and the "average rating," there's more than enough information to get a clear consensus.
None of this addresses the question of whether seeing that aggregate is going to pre-emptively steer audiences from "bad" movies, but isn't that kind of what the site is supposed to do? My guess is the people who don't like Rotten Tomatoes are probably the same people who don't think much of critics anyway, and that's fine. It's your 10 bucks — just don't say I didn't try to warn you.