How did [album] become AbsolutePunk's record of the year?


The staff put up their Best of 2015 list today, and Sufjan Stevens' Carrie & Lowell took the No. 1 overall spot. That may seem somewhat odd, and certainly some people were confused by it, but it's not all that surprising if you know how the staff's rankings work. 

As an aside, the idea of the site being a place for only pop-punk and/or emo bands is somewhat outdated. Plenty of the staff and user base still primarily focus on pop-punk and/or emo, but many of the newer staff members added in the past couple of years have broader listening palettes than the staff of a few years ago. Even if this hasn't yet changed the wider public perception of the site (with the name being what it is, that probably won't ever happen anyway), it does wind up shining through more than one might imagine in the end-of-year list tallying.

The difference in tastes between the lists I linked and what people expect to see on the site may not seem all that significant if you don't know the general perception of the site well. But a list like Jake Jenkins' becomes significant on a year-to-year basis, and even more so as more and more staff members follow suit and rank albums considered outside the site's usual fare very highly on their lists.

This year, 20ish people contributed votes in the staff's final ballot. The percentage of those voters with more varied listening tastes is much larger than it was even as recently as 2012, when there were arguably zero surprise albums in the collective staff's top 5. A larger percentage of people with varied tastes results in closer voting among the top albums, and as more new staff with varied tastes are brought in, the potential for more evenly spread voting at the top only grows.

I wrote for AbsolutePunk from early 2010 until mid 2015, and I tallied up the EOTY ballot votes in four of those six years. Here's how it works: each staff member builds their own top 30 list (you can submit fewer if you want), and points are assigned so that No. 1 is worth the most (100 points) and Nos. 26-30 are worth the least (5 points, I think). A No. 2 vote is worth 80, No. 3 is worth 70, and eventually the values become blocked (i.e., Nos. 11-15 are all worth the same number of points, Nos. 21-25 are all worth the same, etc.).
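The tallying described above can be sketched in a few lines. Note that only the top few point values are stated in the text (and the 5-point floor is the author's own "I think"); the values I've used for the blocked middle ranges are placeholders, not the real ballot's numbers.

```python
from collections import defaultdict

# Hypothetical points-per-rank table: individual values for the top spots,
# then "blocked" ranges where every rank is worth the same. Only 100/80/70
# and the 5-point floor come from the text; the rest are assumed.
POINTS = {1: 100, 2: 80, 3: 70, 4: 60, 5: 50}
POINTS.update({r: 40 for r in range(6, 11)})    # Nos. 6-10
POINTS.update({r: 30 for r in range(11, 16)})   # Nos. 11-15
POINTS.update({r: 20 for r in range(16, 21)})   # Nos. 16-20
POINTS.update({r: 10 for r in range(21, 26)})   # Nos. 21-25
POINTS.update({r: 5 for r in range(26, 31)})    # Nos. 26-30

def tally(ballots):
    """ballots: one ordered top-30 (or shorter) list per staff member."""
    totals = defaultdict(int)
    for ballot in ballots:
        for rank, album in enumerate(ballot, start=1):
            totals[album] += POINTS[rank]
    # Highest point total takes the overall No. 1 spot.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

Because each ballot past the first few spots contributes flat, blocked values, an album's total is driven far more by how many lists it appears on than by where exactly it lands in anyone's teens or twenties.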

Those top few votes become important very quickly. The bottom of the overall staff list (Nos. 25-30) is usually occupied by albums that received under 200 points in total. Getting to 300 total points is usually good for around top 10ish or 15ish depending on the year and how many staff members are voting. So one 100-point vote, or one top 5 vote in general, can be a huge boost for an album and make it so an album with a few high placements (but no other votes at all) gets on the overall list. This isn't very common, but I think the potential for something like this to happen is greater when you have a staff with more diverse tastes.
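As a quick sanity check on that arithmetic: assuming point values of No. 1 = 100, No. 4 = 60, and No. 5 = 50 (the values below the top three are my assumption, not stated in the text), an album on only three ballots can still clear the rough 200-point floor of the overall list's bottom spots.

```python
# Hypothetical scenario: an album that appears on just three staff ballots,
# but very high on each (assumed values: No. 1 = 100, No. 4 = 60, No. 5 = 50).
high_votes = [100, 60, 50]
total = sum(high_votes)
print(total)        # 210
print(total > 200)  # True: clears the rough floor of the overall Nos. 25-30 range
```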

In years where there is not a dominant overall No. 1 (unlike, say, the year My Beautiful Dark Twisted Fantasy was a runaway No. 1 and nothing else stood a real chance of challenging it), the key to an album taking the No. 1 overall spot is appearing on a lot of individual staff lists in the general top 15-20 range, not racking up the most No. 1 votes.

Looking at the individual staff lists this year, Carrie & Lowell was voted into the top 5 by four staff members: Zac Djamoos had it at No. 1, Jake Jenkins and I had it at No. 2, and Drew Beringer had it at No. 3. Those four heavy votes alone are probably worth enough to put Carrie & Lowell on the fringe of the top 10-15 overall, but it's buoyed all the way to No. 1 by a few top 10 votes and several top 15-20 votes.

To me, in years like this where the staff doesn't have a dominant No. 1 album, it's better to look at the top 5ish as a whole. This provides a better representation of what was popular on the website throughout the full year: No. 2 was Foxing's Dealer, No. 3 Turnover's Peripheral Vision, No. 4 Kendrick Lamar's To Pimp A Butterfly, and No. 5 TWIABP's Harmlessness. The vote was close enough that a few changes on a few lists could have given any of these albums a shot at No. 1 overall.

However, this is the beauty of how the scoring system works: everyone builds their own list, and no one knows the total point count for any album until all the lists are submitted. With the help of that blind voting method, the scoring system does a very good job of determining the true best album in the eyes of the staff; it avoids any "editorial spin" or outside influence creeping into the overall staff list. In years where it may seem to some people like it'd "make more sense" for the site to rank a certain album over another, you just have to realize that line of thinking never has the opportunity to come into play.