A total of 46 teams made it to Judgment Week, each confronted with the challenge of summing up their work in only 10 minutes. That is a daunting task not only for the teams, but also for the judges. We sat down with VP of Engineering Michael Kamprath to better understand the judging process and what it takes to be named a Quantathon champion.
Who were the judges for this year’s competition?
The judging panel was composed of leaders from all areas of Quantcast. They were:
Co-Founder & CEO, Konrad Feldman
SVP of Product Management, Jag Duggal
SVP of Business and Legal Affairs, Michael Blum
VP of Engineering, Michael Kamprath
Head of Corporate Sales, Rick Boyce
Director of Mobile Targeting, Crispin Flowerday
VP of Research and Development, Jim Kelly
Head of Marketing, EMEA, Amit Kotecha
Director of Business Operations, Derek Moulaison
Senior Finance Director, Imad Tareen
What is the judging process?
Like so many things at Quantcast, data is involved at every level of the process. The four judging categories are Strategy Relevance, Innovation, Execution, and Impact. Teams are scored on a scale of 1 to 10 in each category. The judges' scores are then averaged and used to calculate a ranking for each team. As a company, Quantcast highly values Execution and Impact, so those categories are weighted more heavily in the ranking. It's important to note that the rankings alone do not determine the winners; the scores are used to guide the judging panel's discussion of the teams. Balancing subjective content with objective numbers was intentionally designed into the judging process.
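The weighted-average ranking described above can be sketched in a few lines of Python. The category weights here are illustrative assumptions only (the article says Execution and Impact are weighted more heavily but does not publish the actual values), and the function names are hypothetical:

```python
CATEGORIES = ["Strategy Relevance", "Innovation", "Execution", "Impact"]

# Hypothetical weights: Execution and Impact count more, per the article.
WEIGHTS = {"Strategy Relevance": 1.0, "Innovation": 1.0,
           "Execution": 1.5, "Impact": 1.5}

def team_score(judge_scores):
    """judge_scores: one dict per judge, mapping category -> 1-10 score.
    Returns the weighted average of the per-category judge averages."""
    total = 0.0
    for category in CATEGORIES:
        # Average the judges' scores for this category, then apply the weight.
        avg = sum(s[category] for s in judge_scores) / len(judge_scores)
        total += WEIGHTS[category] * avg
    return total / sum(WEIGHTS.values())

def rank_teams(scores_by_team):
    """Order team names from highest to lowest weighted score."""
    return sorted(scores_by_team,
                  key=lambda t: team_score(scores_by_team[t]),
                  reverse=True)
```

A team that scores consistently higher, especially in the heavily weighted Execution and Impact categories, rises to the top of the ranking, which then seeds the panel's discussion rather than deciding the winners outright.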
What was one of the greatest challenges in designing this part of Quantathon?
Parsing out the biases of each judge to produce the most accurate scores possible. One example of scoring bias is a judge whose effective dynamic range was 6 to 9. The challenge is comparing those scores to a judge who used the full range of 1 to 10. Is the first judge's "6" equivalent to the second judge's "2"? The solution made it possible to determine who was the "hardest" or "easiest" judge on a given day and in each category. Calculating normalized scores allowed us to account for each judge's scoring bias. This meticulous process resulted in an extremely accurate assessment of team performance and led to a very worthy group of winners.
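One common way to normalize across judges with different dynamic ranges is to convert each judge's raw scores to z-scores (distance from that judge's own mean, in units of that judge's own spread). The actual Quantathon formula was not published, so this is only a sketch of the general idea:

```python
from statistics import mean, pstdev

def normalize_judge(raw_scores):
    """Map one judge's raw scores to z-scores so that a 'hard' judge's
    numbers and an 'easy' judge's numbers become comparable.
    A sketch only; not the actual Quantathon method."""
    mu = mean(raw_scores)
    sigma = pstdev(raw_scores)
    if sigma == 0:
        # The judge gave every team the same score: no signal to rescale.
        return [0.0 for _ in raw_scores]
    return [(s - mu) / sigma for s in raw_scores]
```

For example, a judge who only used 6 through 9 and a judge who spread scores from 2 to 10 both end up with their lowest score mapped to a negative z-score of similar magnitude, so the "hard" judge's 6 and the "easy" judge's 2 can be compared on the same scale.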
What does it take to be a winning team?
When it gets to that point of the discussion, the judges have to remember you. They remember a good presentation that frames a problem and demonstrates a clear solution with a path forward. Teams that leave the judges to figure out what they were trying to accomplish don't do well. It's a competition.
Who was victorious? We’ll reveal the winning teams this week and also share more team profiles from Quantathon MMXIV!
Posted by Capri Mali LaRocca, Marketing Communications Associate