At RepostExchange, we believe in fairness and transparency across the whole platform. This includes reposts, feedback, charts and, of course, competitions.
As our first remix competition, for Neev's Seawall, draws to a close, we wanted to shed some light on how we judge these competitions and how the winners are chosen.
Tracks are primarily judged on three criteria: production quality, creative flair, and originality. Judges are instructed to be as objective as possible and to avoid bias towards personal taste.
Competitions are decided by multiple judges acting independently, in order to minimise the effect of subjectivity and personal preference on the result.
RepostExchange competitions will always involve a minimum of three judges, but sometimes more. There will always be at least one independent judge who is not an employee of RepostExchange. We do not announce the identity of judges, for privacy reasons.
The Neev - Seawall competition had four judges (including one independent judge), along with a fifth contributor, a representative from Trapped Animal (the record label), who was involved after the final round.
RepostExchange competitions have three rounds. Here is how they work, using the Neev competition as an example:
All entered tracks are divided equally between the judges.
Neev - Seawall had 192 entries, which meant that each judge was randomly assigned 48 entries to listen to in round 1.
Each judge must listen to ALL of the entries in their list, and the interface allows them to add notes and ratings to help them keep track of and evaluate the tracks.
Each judge can put through ten tracks to the next round.
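The round 1 assignment described above can be sketched in a few lines of Python. This is an illustrative sketch, not our actual platform code; the names and structure are assumptions made for the example.

```python
import random

def assign_entries(entries, judges):
    """Randomly divide entries equally between judges (round 1)."""
    shuffled = entries[:]
    random.shuffle(shuffled)
    per_judge = len(shuffled) // len(judges)
    return {judge: shuffled[i * per_judge:(i + 1) * per_judge]
            for i, judge in enumerate(judges)}

# 192 entries, four judges -> 48 entries each
entries = [f"track_{n}" for n in range(192)]
judges = ["A", "B", "C", "D"]
assignments = assign_entries(entries, judges)
print(len(assignments["A"]))  # 48
```

Shuffling before slicing ensures each judge's list is a random sample rather than, say, the earliest submissions.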
In round 2, there will be 40 tracks (10 put through by each judge in round 1).
The tracks in Round 2 are listened to by ALL judges.
Each judge independently provides a rating from 1 to 10 for each track.
These ratings are averaged and the tracks with the ten highest average ratings go through to the final round.
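The round 2 averaging step amounts to a simple sort by mean rating. Here is a minimal sketch (the track names and scores are invented for illustration):

```python
def round2_shortlist(ratings, top_n=10):
    """ratings: {track: [one score per judge]} -> top_n tracks by average rating."""
    averages = {track: sum(scores) / len(scores)
                for track, scores in ratings.items()}
    return sorted(averages, key=averages.get, reverse=True)[:top_n]

ratings = {
    "alpha":   [8, 7, 9, 8],  # average 8.0
    "bravo":   [6, 7, 6, 7],  # average 6.5
    "charlie": [9, 8, 9, 9],  # average 8.75
}
print(round2_shortlist(ratings, top_n=2))  # ['charlie', 'alpha']
```

In the real competition the input would be 40 tracks and the default `top_n=10` would apply.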
There will now be ten tracks in the final shortlist.
Each judge now independently ranks these tracks, from 1st to 10th place.
A track will receive 10 points for a 1st place ranking, 9 points for a 2nd place ranking, and so on down to 1 point for a 10th place ranking.
The scores for each track will be added together to produce the final rankings.
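The final scoring works like a points-based poll: each judge's ordered list is converted to points and the totals are summed. A sketch, using a three-track example for brevity (with the ten finalists, a 1st place is worth 10 points and a 10th place 1 point):

```python
def final_ranking(per_judge_orders):
    """per_judge_orders: one ordered list per judge, best track first.
    Position p in a list of n tracks earns n + 1 - p points."""
    points = {}
    for order in per_judge_orders:
        n = len(order)
        for position, track in enumerate(order, start=1):
            points[track] = points.get(track, 0) + (n + 1 - position)
    return sorted(points, key=points.get, reverse=True)

rankings = [
    ["alpha", "bravo", "charlie"],
    ["bravo", "alpha", "charlie"],
    ["alpha", "charlie", "bravo"],
]
print(final_ranking(rankings))  # ['alpha', 'bravo', 'charlie']
```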
In situations where the winning track will receive an official release, the record label will often have a say in choosing the 1st place position.
We will provide a representative from the record label with the top 3 ranked tracks in no particular order.
They are then able to choose the winning track for release. The other two tracks will get 2nd and 3rd place, in the order they were originally ranked by the main judges.
Each competition may have up to ten special mentions. Each judge can choose a selection of tracks from any round which they particularly enjoyed or felt stood out.
Special mentions might receive a small prize, such as 500 credits or a piece of merchandise. This will be decided on a competition-by-competition basis.
Special mentions are listed on the Winners page in no particular order.
If two tracks are tied, and the tie would affect either progression to the next round or a winning position, the tracks will enter arbitration.
Judges will discuss the tracks in question, examine ratings and notes, and decide together which track will progress.
In the unusual situation where the standard deviation of ratings for a track is very high (e.g. three judges rate a track 9, and one judge rates it 1), this might be highlighted to establish the reason for the disparity, and whether the track's ratings need to be revisited.
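The "high disparity" check described above can be expressed as a standard-deviation filter. A minimal sketch; the threshold of 2.5 is an arbitrary value chosen for illustration, not the one we use:

```python
import statistics

def flag_disputed(ratings, threshold=2.5):
    """Flag tracks whose judge ratings diverge sharply (high standard deviation)."""
    return [track for track, scores in ratings.items()
            if statistics.stdev(scores) > threshold]

# [9, 9, 9, 1] has a standard deviation of 4.0, so it is flagged for review;
# [7, 8, 7, 8] (stdev ~0.58) is not.
print(flag_disputed({"calm": [7, 8, 7, 8], "disputed": [9, 9, 9, 1]}))
```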
If you have any questions or feedback about the judging process, please drop us a line at email@example.com