Photo Courtesy of the Alamo Drafthouse
The Alamo Drafthouse is presenting a summer movie event, kicked off in early May, called “The Summer of ‘82”. They believe the summer of 1982 was the best summer ever for movie releases. And in case you can’t immediately wrap your mind around that claim, take a look at the films that came out that summer, in order:
- Conan the Barbarian
- The Road Warrior
- Rocky III
- Star Trek II: The Wrath of Khan (Khan!!!!)
- The Thing
- Blade Runner
- The Secret of NIMH
- The Dark Crystal
- Fast Times at Ridgemont High
And that’s not a complete list. The Alamo will be showing these films (so far, sadly, no announcement for Blade Runner) in the same order, and on the same dates, as they were released 30 years ago. When it was first announced, I think I may have guffawed. Not a loud guffaw, but that’s what it was. I’ve been organizing my movies by theatrical release date for a while now and couldn’t immediately appreciate what they were saying. I love the Alamo and have discovered some great films through their recommendations, but I was still a little skeptical. So I looked it up and couldn’t believe how many of those films are favorites of mine. So many films from the summer of ‘82 are iconic and relevant to this day.
But it got me thinking: how do the other summer movie seasons compare? Can this be quantified in any way? So I started looking through IMDB and collecting data. Could there be something to the Alamo’s claim of the best summer ever?
Before we go any further, I must make it abundantly clear that something like this is impossible to truly quantify. If we were just looking at the most profitable summer ever, that would be easy. The best summer for movies is completely subjective, and this effort is just to add to the fun of it all.
This wouldn’t be a quantified article without some data! So I turned to my trusted sources, IMDB and The-Numbers, to help narrow down a data set I could start massaging. First, I wanted to get the time frame right, so I selected movies with a US theatrical release date falling anywhere from the start of May through the end of August. I further restricted the data to movies with known domestic gross figures. Since I consider Jaws to be the first true summer blockbuster, I started the data set in 1975, the year Jaws was released. That left me with, well, a very large set of movies, weighted heavily toward recent years thanks to more mature record-keeping standards. Looking over the list, I felt a number of the films just weren’t “summer” movies. Sure, the release dates were accurate, but when we talk about summer movies, we’re talking about blockbusters. So I decided to cut any film that made less than $12,000,000. This doesn’t eliminate every non-blockbuster, but it comes close. I then adjusted that cutoff for inflation (1975 was left alone, and every later year was adjusted from that base), so a film in 2011 needed to make a minimum of just over $40,000,000. That may not seem like a lot of money, but it did a reasonably good job of keeping the number of films each year about the same.
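If you’re curious, the cutoff logic works something like this little Python sketch. The inflation multipliers below are invented stand-ins for illustration; a real version would use actual inflation figures keyed to the 1975 base year:

```python
# Sketch of the inflation-adjusted gross cutoff. The multipliers are
# invented placeholders, not real inflation data.
BASE_CUTOFF = 12_000_000  # minimum domestic gross, in 1975 dollars

# cumulative inflation multiplier for each year relative to 1975 (illustrative)
inflation_vs_1975 = {1975: 1.00, 1982: 1.80, 2011: 3.37}

def cutoff_for_year(year):
    """Minimum domestic gross a film must earn in `year` to be included."""
    return BASE_CUTOFF * inflation_vs_1975[year]

# With a ~3.37x multiplier, the 2011 cutoff lands just over $40,000,000,
# in the neighborhood described above.
print(round(cutoff_for_year(2011)))
```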
Once I knew which movies would be included, I hunted through IMDB for the average score of each film. While doing this, I noticed a very large range in the number of users rating each title: new movies were rated far more often. Older movies may also see a boost to their ratings from bias; if a movie hasn’t been seen in a long time, it may be remembered through rose-tinted glasses, as the saying goes. So keep that in mind. Newer movies, on the other hand, may have a terrible time being correctly evaluated, since longevity is impossible to predict. There is, of course, the occasional film that has been reviewed more kindly with the passage of time (The Thing), so it’s possible some modern films will stand out more in another 20 or 30 years, but sheesh, we can only accommodate so many things at once.
Please, please remember: the amount of money a film makes has nothing to do with its quality. I included films based on domestic gross only because the scope of this article is summer blockbusters.
After all the data was collected, I decided to cut the years 1975 and 1976 from the final analysis due to incomplete data. The-Numbers just didn’t have box office information for enough of the summer movies in those years, and they became outliers in the data. I still used the individual movie information from those years, so any data that isn’t year-specific will still account for these films.
What Makes a Great Summer at the Movies?
Summer movies are fun. They’re spectacular feasts for the eyes and ears, and when a summer movie hits all the right notes, everyone’s talking about it. This summer is already looking pretty great, with The Avengers seemingly pitch-perfect (I haven’t seen it…I live freakishly far away from a movie theater). But what makes a fantastic summer season at the movies?
Clearly, the films released have to be good. It’s also important that the good movies not all clump together, creating a great start but a lackluster ending; good, even distribution is what we’re looking for. The films should also be memorable, which isn’t quite the same thing as good. Memorability requires a certain amount of time to appreciate, and is nearly impossible to correctly quantify for newer films.
In analyzing the data, I’ve narrowed things down to a few distinct categories to quantify each summer. First, I looked at the average IMDB user rating. This gives us a pretty easy way to compare summers and provides a baseline.
From there, I’ve included the percent of movies within each summer that exceed the average IMDB score across all summer movies. Similarly, I found the percent of movies that reach an IMDB score of 7, and of 8.
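These threshold counts are simple arithmetic. Here’s a sketch with invented ratings (the function name and every number below are purely illustrative):

```python
# Percent of a summer's films at or above a rating threshold,
# using invented IMDB scores for a hypothetical summer.
ratings_1982 = [8.2, 7.8, 6.9, 8.1, 7.2, 6.5]  # hypothetical film scores
overall_average = 6.6                           # hypothetical all-summers mean

def pct_above(scores, threshold):
    """Percent of films scoring at or above `threshold`."""
    return 100 * sum(s >= threshold for s in scores) / len(scores)

print(pct_above(ratings_1982, overall_average))  # films above the overall mean
print(pct_above(ratings_1982, 7))                # films at 7 or above
print(pct_above(ratings_1982, 8))                # films at 8 or above
```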
The next area is distribution. I took the movies with a score of 7 or higher and calculated the average number of days someone would have had to wait between them. I also found the longest stretch of time in each summer a person would have to wait. If two or more movies rated 7 or greater opened on the same day, they were counted as one. The rationale for this is mostly competition: ideally, the films would be spread out a bit. Of course, if all the films were great, that would be a whole other problem, one I think studios and moviegoers would enjoy sorting out for themselves.
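The gap math works like this, sketched with hypothetical release dates (the same-day collapsing happens in the `set` step):

```python
# Average and maximum gaps, in days, between releases rated 7 or higher.
from datetime import date

# hypothetical release dates of films rated 7+ in one summer
releases = [date(1982, 5, 14), date(1982, 6, 4), date(1982, 6, 4),
            date(1982, 6, 25), date(1982, 8, 13)]

unique_dates = sorted(set(releases))  # same-day films count as one
gaps = [(b - a).days for a, b in zip(unique_dates, unique_dates[1:])]

avg_wait = sum(gaps) / len(gaps)  # average days between 7+ films
max_wait = max(gaps)              # longest dry spell of the summer
print(avg_wait, max_wait)
```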
Lastly, I went through each summer and subjectively counted the number of films I thought fit the description of both good and memorable. This was the only way I could approximate an evaluation of memorability. Newer movies are much more difficult to predict, and older movies could suffer from my either not having seen them at all or not having seen them in a very long time.
Best Summer of Film Criteria
| Criterion | Definition |
| --- | --- |
| Summer Average | The average IMDB score of all films in a given summer. |
| Films Above Average | The number and percent of films in a given summer that exceed the average IMDB score of all combined summer movies. |
| Films at 7 or Above | The number and percent of films in a given summer that reach a minimum IMDB score of 7. |
| Films at 8 or Above | The number and percent of films in a given summer that reach a minimum IMDB score of 8. |
| Average Distribution | The average number of days between the release dates of films that reach a minimum IMDB score of 7. |
| Maximum Distribution | The largest number of days in a summer between the release dates of films that reach a minimum IMDB score of 7. |
| Subjective Count | The number of films in a given summer that are subjectively chosen as both good and memorable. |
| Rank Average | The average rank of all criteria for a given summer. |
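The Rank Average criterion can be sketched like so, with invented numbers for two of the criteria. In this toy version a higher value is treated as better for every criterion; for the distribution criteria, where shorter waits are better, the sort direction would simply be flipped:

```python
# Rank each summer on every criterion, then average its ranks.
# All numbers below are invented for illustration.
criteria = {  # criterion -> {summer: value}; higher is better here
    "summer_average":  {1982: 6.9, 1999: 6.5, 2008: 6.7},
    "films_at_7_plus": {1982: 40.0, 1999: 30.0, 2008: 35.0},
}

def rank_average(criteria):
    """Average rank per summer across all criteria (rank 1 = best)."""
    totals = {}
    for values in criteria.values():
        ordered = sorted(values, key=values.get, reverse=True)
        for rank, summer in enumerate(ordered, start=1):
            totals.setdefault(summer, []).append(rank)
    return {summer: sum(ranks) / len(ranks) for summer, ranks in totals.items()}

print(rank_average(criteria))
```

A lower Rank Average means a summer performed consistently well across the board.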