To what degree do you check movie review metrics these days? Personally, I went through a Rotten Tomatoes phase and burned out on it pretty quickly, mostly when I realized that flawless scores were often bought and that “rotten” really just meant divisive. I abandoned Metacritic three years back, when it started offering metascores that rated a person’s cumulative work, effectively giving a score to a human life, which seemed barbaric even to a numbers cruncher like me. If you’re still a part of this world, take note, because things are about to get even more flawed.

The folks over at FiveThirtyEight have been looking at overall movie ratings and comparing them across sites, and their October 2015 report on their findings does not bode well for Fandango as a reputable source.

From the piece:

“…Fandango, an NBCUniversal subsidiary that uses a five-star rating system in which almost no movie gets fewer than three stars, according to a FiveThirtyEight analysis. What’s more, as I’m writing this, scores on Fandango.com are skewed even higher because of the weird way Fandango aggregates its users’ reviews. And while other sites that gather user reviews are often tangentially connected to the media industry, Fandango has an immediate interest in your desire to see a movie: The company sells tickets directly to consumers.”

“What started all this? A couple of months ago, a colleague noticed that a bad film had received a decent rating on Fandango and asked me to look into it. When I pulled the data for 510 films on Fandango.com that had tickets on sale this year, something looked off right away: Of the 437 films with at least one review, 98 percent had a 3-star rating or higher and 75 percent had a 4-star rating or higher. It seemed nearly impossible for a movie to fail by Fandango’s standards. When I focused on movies that had 308 or more user reviews, none of the 209 films had below a 3-star rating. Seventy-eight percent had a rating of 4 stars or higher.”

Sure. This is just Fandango, and the skewing seems like it’s only an integer or partial integer above the standard mean of viewer scores. But here’s the twist: in mid-February, Fandango acquired both Rotten Tomatoes and Flixster.

From the press release, Fandango president Paul Yanover had this to say: “Flixster and Rotten Tomatoes are invaluable resources for movie fans, and we look forward to growing these successful properties, driving more theatrical ticketing and super-serving consumers with all their movie needs.”

Yes. Super-serving.

What does this mean for Rotten Tomatoes? Well, to refer back to the original breakdown from FiveThirtyEight, the big takeaway was that a combination of button placement, the end-user purchase experience, and a somewhat twisted aggregation algorithm results in nearly all films landing on a scale between three and five stars, rather than one to five. It’s very technical and upsetting, and the end result is a ratings distribution squeezed into the top of the scale.
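To make the “twisted algorithm” idea concrete, here is a minimal sketch of one way an aggregator could quietly inflate displayed scores: rounding every average up to the next half star instead of to the nearest one. This is an illustration, not Fandango’s actual code; the function names are mine.

```python
import math

def round_up_to_half_star(avg: float) -> float:
    """Hypothetical inflating aggregator: always round UP to the next half star."""
    return math.ceil(avg * 2) / 2

def round_to_nearest_half_star(avg: float) -> float:
    """Conventional aggregator, for comparison: round to the NEAREST half star."""
    return round(avg * 2) / 2

# A 4.1 average displays as 4.5 under the inflating rule, 4.0 under the normal one.
for avg in (2.1, 3.1, 4.1):
    print(avg, round_to_nearest_half_star(avg), round_up_to_half_star(avg))
```

Applied across an entire catalog, a rule like this alone adds up to half a star to every film’s displayed rating, which compounds with the selection effects described above.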

So with Fandango buying Rotten Tomatoes, which has historically sat somewhere in the mid-range of user review scores, expect some users to get super-served by an altered algorithm and style changes that may shift the very experience that built the community. It’s also worth noting that Fandango is a marketplace, so some reviews are inevitably skewed by what customers are using them for; perhaps Amazon user scores should be factored in, or at least we should acknowledge the difference between long-term reflection and the immediate post-purchase experience.

This is all to say that if you start to feel like you’re suddenly more curmudgeonly than the average movie viewer, it isn’t you; computers are trying to make us look happier than we are. You aren’t getting older, it’s just Skynet telling you to cheer up.