This BMT:CSI:SVU was written around October 1, 2015 during the beginning of preparations for the Razzies. It is always difficult to determine which movies are more important to watch in theaters or right as they come out on DVD, so this short study was just an initial look at how we might connect the BMeTric to real Razzie results.
The problem the Bad Movie Twins face every year during Razzie preparations is the difficult choice of which movies are bad enough and big enough to earn the almost-meaningless dishonor of being nominated for a Razzie. As voting members we take our duty far more seriously than we should. So how best to determine which movies, prior to nominations, deserve our attention? That is where this comes in.
Alright, to start, the most important point during Razzie Prep is the moment the prenominations arrive. That is when you actually know what smaller group of movies you are dealing with (as opposed to the ~600 movies released to theaters in a given year, it is whittled down to around 30 per category). I'll have to go to the Wayback Machine (thanks Internet Archive) to determine vote/rating counts on January 1 of a given year of study, because that is roughly when prenominations are known.
The method: get the BMeTric for "all" released movies based on IMDb votes and ratings approximated for 1/1/2015 via the Internet Archive (using a simple linear extrapolation from the two archived snapshots nearest that date). Then separate out the movies prenominated, nominated, and winning at the 2015 Razzies and do a side-by-side ranking based on how well they did in the Razzies versus our BMeTric.
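The extrapolation step can be sketched like this; the function name, snapshot dates, and vote counts below are hypothetical, just to illustrate the "two nearest archived points" idea:

```python
from datetime import date

def extrapolate_votes(d1, v1, d2, v2, target):
    """Linearly extrapolate (or interpolate) an IMDb vote count to a
    target date, given the two nearest Wayback Machine snapshots.

    d1, d2 -- snapshot dates; v1, v2 -- vote counts at those dates.
    """
    slope = (v2 - v1) / (d2 - d1).days          # votes gained per day
    return v1 + slope * (target - d1).days      # project to target date

# Hypothetical snapshots bracketing January 1, 2015:
votes = extrapolate_votes(date(2014, 12, 20), 41000,
                          date(2015, 1, 10), 45200,
                          date(2015, 1, 1))
# -> 43400.0 (12 days past the first snapshot at 200 votes/day)
```

The same routine applies to the rating itself, though ratings drift so slowly that the two-point line is nearly flat.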
In order to do this I also needed to define a Razzie Score. I decided that all movies in a given year should have scores that sum to 100, split into three equal parts: 33.3 for all the winners, 33.3 for all the nominees, and 33.3 for all the prenominees. In 2015 there were 108 prenominees, 45 nominees, and 9 winners (I counted combinations, like Cameron Diaz being nominated for both The Other Woman and Sex Tape, as 0.5 wins/nominations/prenominations for each of those movies). So a win was worth 3.7, a nomination 0.74, and a prenomination 0.3083. I'll adjust this in the future if it doesn't seem to work, but I think there is far too little data to build a real model. Here are the results for 2015:
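The scoring scheme above can be sketched in a few lines; the helper name and the tiny example inputs are mine (not real Razzie data), but the logic matches the description: each tier splits one third of 100 points across its movies, with shared nominations entered as 0.5 credits:

```python
def razzie_scores(prenoms, noms, wins):
    """Compute Razzie Scores summing to 100 across a year.

    Each argument maps movie -> credit, where a combined
    nomination (e.g. one actress in two movies) counts 0.5
    for each movie. Each tier is worth 100/3 points total.
    """
    def tier_points(credits):
        total = sum(credits.values())
        if total == 0:                       # empty tier gets no points
            return {}
        return {m: (100 / 3) * c / total for m, c in credits.items()}

    scores = {}
    for credits in (prenoms, noms, wins):
        for movie, pts in tier_points(credits).items():
            scores[movie] = scores.get(movie, 0.0) + pts
    return scores

# Toy example: two prenominees, one of which is also nominated and wins.
scores = razzie_scores({"A": 1, "B": 1}, {"A": 1}, {"A": 1})
# "A" earns points in all three tiers; "B" only in the prenom tier.
```

With the real 2015 counts (108/45/9), each win is worth 33.3/9 ≈ 3.7, each nomination 33.3/45 ≈ 0.74, and each prenomination 33.3/108 ≈ 0.31, matching the numbers above.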
So I have two main takeaways, which are really one big takeaway. First, note the over-performers (movies that scored high on the Razzie Score and lower on the BMeTric): Saving Christmas, Transformers 4, TMNT, A Million Ways to Die in the West, and Expendables 3, mainly. These are all what I call "easy targets": Kirk Cameron, Michael Bay, Megan Fox, Seth MacFarlane, Sly Stallone. Having a big name attached boosts a movie's score in the Razzie voters' eyes. On the flip side, look at the unnominated list: those are the unnominated movies with a BMeTric over 25. The yellows highlight horror films and the greens are Christian films. First, we need to stay away from horror films, Jesus Cristo. But to get back on track, basically all those films are low budget, and low budget really means: no big targets!
So really there is one big thing that gets you a Razzie Score: Targets … BMTargets. I'll leave it there. Where I'll want to look in the future is perhaps a Predicted Razzie Score. This involves two things. Mainly I'll have to determine BMTargets and how they contribute to the score. I'll also need to actually work on the time-independent BMeTric to get a popularity rating without knowing the vote/rating count ahead of time (obviously very important). Once I have those I think I'll be able to determine with … accuracy is a strong word. But I think I might be able to identify "likely" Razzie targets.