Sizing up Christmas

Mustard’s Gareth (Director) can’t help but think about our responsibilities as researchers, even when planning a Christmas party.

Yes, I know… nobody wants to hear the word ‘Christmas’ before we’ve even had our summer… but they do in Mustard’s office!

Why? Because every June we have a vote on where we are going for our Christmas party.

As part of making Mustard a great place to work, every year we take the whole team abroad for two days of fun, booze, culture, booze, food, booze, team bonding, booze, and booze. It’s a little perk at the end of the year to thank everybody for their hard work and show them how much we appreciate them.*

It’s a democratic process. From a list of potential locations, the team individually award 3 points to their preferred option, 2 points to their second option and 1 point to their third option. We then tally up the points and the winning city gets the pleasure of our company in December!
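For the record, that scoring system is a simple Borda-style count, and it only takes a few lines of Python to tally (the ballots below are made up for illustration, not our actual votes):

```python
from collections import Counter

# Each ballot lists a voter's 1st, 2nd and 3rd choices (example data only).
ballots = [
    ("Madrid", "Milan", "Berlin"),
    ("Berlin", "Madrid", "Amsterdam"),
    ("Madrid", "Amsterdam", "Milan"),
]

POINTS = (3, 2, 1)  # 3 points for 1st choice, 2 for 2nd, 1 for 3rd

def tally(ballots):
    scores = Counter()
    for ballot in ballots:
        for city, points in zip(ballot, POINTS):
            scores[city] += points
    return scores.most_common()  # highest-scoring city first

print(tally(ballots))
```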

Christmas markets

This year’s voting was closer than ever, and it really hammered home the importance of sample sizes and data accuracy (how about that for a segue?… luring you into a blog about quantitative fundamentals with tales of booze and holidays in one easy step).

22 members of the team were voting in total, and after the first 14 votes (64% of the electorate) the results were…

  • Madrid – 23 points
  • Milan – 18 points
  • Amsterdam – 17 points
  • Berlin – 12 points
  • Dublin – 8 points
  • Katowice – 6 points
  • Blackpool – 0 points

With 27% of the available points being awarded to Madrid, it looked like they had it in the bag. With just 14% of the available points so far, Berlin looked like an also-ran.

I’d personally awarded Madrid 3 points, but I wasn’t getting carried away and celebrating too early as I tallied up the incoming votes. That’s because I know (off the top of my head, of course) that Madrid’s 27%, based on a sample size of 14 from a population of 22, was only accurate to +/- 14.35%. Berlin’s proportion could also have swung 11.22% either way.

OK, most of that last paragraph is a lie. I was already thinking about what colour castanets to buy and planning accommodation near Fábrica Maravillas. And to be honest, I was surprised that the potential variance was so large given that almost two-thirds of the votes had already come in from such a small population.

And lo and behold, once the final 8 votes came in, Berlin was sat at the top of the tree with 28 points to Madrid’s 27. Berlin now had 21% of the available points and Madrid had 20%. Both were within the potential variance after 14 votes, but this time the results were accurate to +/- 0%: with every vote counted, there’s no sampling error left.
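For anyone who wants to check my working, those margins come from the standard margin-of-error formula for a proportion, with a finite population correction applied because 14 voters out of 22 is a big chunk of the whole electorate. A quick sketch of the sums, assuming (as the figures suggest) a 95% confidence level with z = 1.96:

```python
import math

def margin_of_error(p, n, N=None, z=1.96):
    """95% margin of error for a proportion p observed in a sample of n.

    If the population size N is given, apply the finite population
    correction sqrt((N - n) / (N - 1)), which shrinks the margin when
    the sample is a large share of the population.
    """
    moe = z * math.sqrt(p * (1 - p) / n)
    if N is not None:
        moe *= math.sqrt((N - n) / (N - 1))
    return moe

# Madrid: 27% of points after 14 of 22 votes -> roughly +/- 14.35 points
print(round(100 * margin_of_error(0.27, 14, 22), 2))
# Berlin: 14% after the same 14 votes -> roughly +/- 11.22 points
print(round(100 * margin_of_error(0.14, 14, 22), 2))
# Once all 22 votes are in, n == N and the margin collapses to zero
print(margin_of_error(0.21, 22, 22))
```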

As an example, it helps demonstrate the importance of thinking about sample sizes and data accuracy. Sure, people think about it when they’re digging into cross-tabs looking for significant differences to build their story, or looking for YoY changes in tracking, but how often does a number get presented (and taken) as a fact when it might not be one?

If a client has a satisfaction survey with 200 respondents and their satisfaction score is 50%, does the bar chart, dial or bubble (whatever is being used) ever say “somewhere between 43.07% and 56.93%”, or does it just say “50%”? After seeing hundreds of reports from hundreds of agencies over the years, I think I know the answer to that one. People will reference the statistical accuracy of the sample size in the introduction, but in reality, nobody is going to do that for every figure they present in a report.
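For anyone wondering where those bounds come from: that’s the plain 95% margin of error on 50% with a base of 200, with no population correction this time (again assuming z = 1.96, which the round numbers suggest):

```python
import math

def interval_95(p, n, z=1.96):
    """95% confidence interval for a proportion p on a sample of n."""
    moe = z * math.sqrt(p * (1 - p) / n)
    return p - moe, p + moe

low, high = interval_95(0.50, 200)
print(f"somewhere between {100 * low:.2f}% and {100 * high:.2f}%")
```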

People understand and accept that there will be a level of statistical variance, but do people fully appreciate how much in certain situations? In an age where clients want less and less information in their outputs, this shouldn’t be something which slips, especially when findings are diluted further for internal stakeholder consumption. It can’t be good for decision making!

If I’d thought “oh, two-thirds is probably representative of our staff, and Madrid’s well clear anyway” and cut the voting off at the first 14 members of staff, we’d all have been heading for the wrong destination in December.

* For reference, these trips typically cost less than a recruitment consultant’s fee to hire a mid-level member of staff. In related news, we haven’t lost a single member of the team directly to another agency since 2013. I’d highly recommend it!