If you work with consumer surveys, you’ve probably at some point been told to “keep it simple, silly” (or maybe they used a meaner term), and you probably rolled your eyes. Yes, what a wonderful principle, so long as it’s not being leveled at us. Still, there are times in research when simplification is so obviously required that it can be surprising how much disagreement it provokes.
A great example of this is when designing customer feedback surveys. Here’s a rhetorical question everyone who’s ever done this needs to ask themselves: have you ever finished a 30-question survey and felt the urge to do more shopping with the company that sent it? To simplify: have you ever finished a 30-question survey? There ought to be an inviolable rule in research that “no survey shall ever harm the brand about which it’s inquiring”.
If that’s a reason marketers can appreciate, here’s a reason data scientists should get behind: longer surveys are more likely to produce poorer-quality data. We know attention spans are a scarce resource. Quantifying this is not always easy, though. For that reason we’re going to spend the next few posts doing some quick analyses to understand a few ways surveys can be designed to spend that resource more wisely.
We begin this series by looking into how people respond to surveys, and specifically whether demographic differences are predictive of the time it takes people to get through a survey. For this purpose we used a research dataset that was 45(!) questions long and included 10,000 completes by a random sample of US adults. Here are a few notable findings:
Surprisingly, most demographic traits, including education, sex, and occupation, are NOT strong predictors of survey response time. Some are statistically significant, but their effects are so minute relative to other factors that they should have no bearing on survey design or interview length.
Age, however, IS a strong predictor of response time. The older a respondent is, the more likely it is they will take longer to complete each question. This might not be an earth-shattering insight on its own, but the magnitude of the effect was quite surprising. For every 20 years of difference in age we found a ~24% increase in question-level response time!
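A multiplicative finding like “~24% longer per 20 years of age” is what you’d get from a log-linear model of response time. To make that concrete, here’s a minimal sketch with simulated data (the dataset, noise level, and baseline time are illustrative assumptions, not the actual study data): we bake the implied per-year slope into fake respondents, then recover it by regressing log response time on age.

```python
import numpy as np

# Assumption for illustration: the ~24% increase per 20 years of age
# corresponds to a log-linear model, log(time) = a + b * age,
# where exp(20 * b) = 1.24.
rng = np.random.default_rng(0)

b_true = np.log(1.24) / 20           # per-year slope implied by the finding
ages = rng.uniform(18, 80, 10_000)   # simulated respondent ages
# Hypothetical baseline of ~8 seconds per question, plus noise.
log_time = np.log(8.0) + b_true * ages + rng.normal(0, 0.3, ages.size)

# Ordinary least squares fit of log(time) on age.
b_hat, a_hat = np.polyfit(ages, log_time, 1)

# Convert the fitted slope back into a "% increase per 20 years" figure.
pct_per_20y = (np.exp(20 * b_hat) - 1) * 100
print(f"estimated increase per 20 years of age: ~{pct_per_20y:.0f}%")
```

With enough respondents, the recovered figure lands close to the 24% we built in; the same regression on real timing data is one way an effect like this could be quantified.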
In the next post we’ll look at the effect that different types of questions have on respondents.