We’re counting down our Top 10 blog posts of 2013. Coming in at #8 is this webinar recap originally published August 8.
As part of the CASRO webinar series, Inna Burdein, Ph.D., director of panel analytics with the NPD Group, discussed her research into survey length and effort.
The session moderator – John Bremer, the chief research officer of Toluna – began by discussing the conventional understanding of respondent engagement. “We know that a respondent who is engaged in a survey that they are taking is fully reading question text, considering all the choices, and thoughtfully providing an answer. We are able to show that respondent engagement provides quality data. Because of this, we have a variety of techniques and rules of thumb to determine if a respondent is reasonably engaged in the survey.”
Some of these heuristics:
- Keep a survey under 20 minutes
- Design it to be visually appealing
- Measure the length of time a respondent takes and compare it to a baseline to determine if they are taking the survey too quickly
- Insert trap questions to make sure they are paying attention
- Look for patterns that indicate satisficing, straight-lining or randomly choosing answers
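As a rough sketch of how two of these heuristics are often operationalized — flagging "speeders" against a baseline and flagging straight-lined grids — consider the following Python fragment. The thresholds, function names, and data layout are illustrative assumptions, not rules from the webinar:

```python
from statistics import median

def flag_speeder(duration_sec, baseline_durations, cutoff=0.5):
    """Flag a respondent who finished in under half the median baseline time.
    The 0.5 cutoff is an illustrative assumption."""
    return duration_sec < cutoff * median(baseline_durations)

def flag_straightliner(grid_answers):
    """Flag a grid where every row received the identical answer."""
    return len(set(grid_answers)) == 1

# Example: baseline completions around 9 minutes, one 3-minute respondent.
baseline = [540, 600, 480, 520, 610]   # seconds
print(flag_speeder(180, baseline))       # True: well under half the median
print(flag_straightliner([3, 3, 3, 3]))  # True: same choice on every row
```

Real panel-quality systems typically combine several such signals rather than disqualifying on any single one.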
“We rarely challenge these rules of thumb in the industry,” said John. “There has been little recent research into these areas because they have become lore. George Terhanian and I produced research in 2000 that had the earliest reference to the 20-minute rule. I still see that paper cited, yet our industry has changed dramatically. I am not sure that I myself subscribe to that rule anymore: is it the right thing to be thinking about?”
Introducing Inna, John said, “We need to challenge what we know and keep sacrosanct.”
Different Opinions on Ideal Survey Length
“How long should my survey be?” Inna began by asking. Reviewing researchers’ websites, she noted that the most common advice was “no more than 20 minutes (or else!)”, but that recommendations ranged from a generic “keep it short” to specific ideals of 5 minutes, 7-8 minutes, 8-10 minutes, or 15 to 30 questions.
The dilemma faced by researchers is how to refactor long surveys that capture everything the client wants to know but far exceed 20 minutes to do so.
2 Key Questions About Survey Length
“Is length really as important as conventional wisdom holds?” As Inna noted, “Length has many confounds – the number of tasks, the number of questions, the difficulty of the tasks. And perceived length is different than actual length: an unenjoyable task seems to take longer.”
Her second key question was “Do some panelists not mind a longer survey?” She pointed out, “Professional respondents seem to have a high capacity for various surveys.”
To research these questions, Inna fielded an experiment with 5 conditions, with different lengths and different degrees of difficulty:
- 12-question survey about personality and fraud propensity
- 24-question survey that added 12 easy questions about the economy to questionnaire #1
- 36-question survey that added 12 more easy questions about the economy to questionnaire #2
- 24-question survey that used the same first 12 questions but added more difficult variations of the 12 questions about the economy in questionnaire #2
- 36-question survey that used the same first 12 questions but added more difficult variations of the 24 questions about the economy in questionnaire #3
For surveys #4 and #5, the question topics were the same, but each question was replaced with a harder, more cognitively challenging version. For instance, instead of asking people to think about what they had done in the past month, they had to recall the past three months. Inna’s table provides a convenient contrast of the cognitive effort required by different types of questions:
| Less cognitive effort | More cognitive effort |
| --- | --- |
| One-month recall | Three-month recall |
| Unipolar scales | Bipolar scales |
| Shorter instructions | Longer instructions |
| Check all that apply | Yes/No grid |
| Short list | Long list |
| Agree or disagree | Pick one statement |
| Asking about now | Asking about the future |
The survey was fielded to a demographically representative sample of 5,182 respondents from the NPD panel.
Some key results are summarized below.
| | 12Q | 24Q Easy | 36Q Easy | 24Q Diff | 36Q Diff |
| --- | --- | --- | --- | --- | --- |
| Median completion time (minutes) | 9 | 15 | 21 | 19 | 30 |
| Completed and found worthwhile | 85% | 82% | 77% | 72% | 63% |
| Response rate to next NPD survey | 80% | 81% | 78% | 81% | 79% |
As Inna observed from this data:
- “Difficult questions took longer to complete”
- “Difficulty causes abandonment more than length does”
- “Extra questions and difficulty lower survey satisfaction”
- “Current experience has little impact on future experience”
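To make the first observation concrete, the results table can be reduced to seconds per question. This short Python sketch uses the figures from the table above (the layout and rounding are mine):

```python
# Median completion times from the results table, as (questions, minutes).
conditions = {
    "12Q":      (12, 9),
    "24Q Easy": (24, 15),
    "36Q Easy": (36, 21),
    "24Q Diff": (24, 19),
    "36Q Diff": (36, 30),
}

for name, (questions, minutes) in conditions.items():
    print(f"{name}: {60 * minutes / questions:.1f} sec/question")
# The easy conditions come out around 35-45 seconds per question,
# the difficult conditions around 47-50 seconds per question.
```

Normalizing by question count is what isolates difficulty from sheer length: the 24- and 36-question difficult conditions cost more time per question, not just more questions.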
Perceived Survey Length
Survey researchers have long known that perceived questionnaire length often varies dramatically from actual length. Inna crosstabulated how long respondents perceived each survey to be against its actual length.
Respondents perceived the easier versions of the surveys as 2 or 3 minutes shorter, regardless of the actual time spent completing them. “Anything under 15 minutes was estimated to be close to 15 minutes, and anything longer was estimated to be close to 20 minutes.” As Inna concluded, “You don’t get much gain in people’s perceptions of how long the survey took by lowering it below 20 minutes.” [A survey twice as long – 40 minutes – is perceived as just 4 to 7 minutes longer than the 20-minute experience.]
Survey Length by Type of Panelist
Inna also analyzed responses by 4 categories of panelist behavior:
- Experienced, highly diligent panelists
- Experienced, moderately diligent panelists
- Experienced, unfocused panelists
- Inexperienced, diligent panelists
The survey experience had little impact on future participation except among the inexperienced, diligent panelists: only ~50% of those who took the 24- or 36-question surveys took the next survey they were invited to, compared to ~85% of the experienced panelists.
Regarding the inexperienced panelists, Inna said, “This group is not invested, just curious—give them a short and sweet survey, or don’t bother: you won’t get much payoff unless it’s really short.”
Experienced, diligent panelists can “handle longer, more difficult surveys within reason”. They score “high on introversion, have a high need for cognition, enjoy thinking, and prefer complex to simple problems, and they like surveys – they find surveys worthwhile.”
In fact, experienced panelists differ in key ways from market researchers, who are less introverted, less conventional, less likely to enjoy mental effort [!] and less likely to find surveys worthwhile – only 16% of market researchers find surveys worthwhile, which Pete Cape of SSI commented was the most depressing statistic of the year.
Inna’s conclusion? “Those of us trying to create a better survey experience are not like those experiencing the surveys. Some hypotheses to create greater engagement may not pan out.”
Recommended Survey Length
“So what do we do with the 45 minute survey?” Inna asked. Her recommendations:
- “Simplify, before cutting”
- “If grids have straightlining over 20% – even over 10% – they need to go; it’s just too much.”
- “Consider a short (not shorter) version for new panelists – cutting 40 minutes to 30 isn’t going to appeal to them.”
- “When doing analyses on survey engagement, consider the audience.”
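The straight-lining threshold above is the only quantified rule in Inna’s list, and it is easy to apply. As a hypothetical illustration (the 10-20% threshold is hers; the data layout and function names are my assumptions), a grid’s straight-line rate could be computed like this:

```python
def straightline_rate(grid_responses):
    """grid_responses: one answer list per respondent for a single grid.
    Returns the share of respondents who gave the same answer to every row."""
    flat = sum(1 for rows in grid_responses if len(set(rows)) == 1)
    return flat / len(grid_responses)

def should_cut_grid(grid_responses, threshold=0.10):
    """Apply the cut rule at the stricter 10% end of the quoted range."""
    return straightline_rate(grid_responses) > threshold

responses = [[4, 4, 4, 4], [2, 3, 4, 1], [5, 5, 5, 5], [1, 2, 2, 3]]
print(straightline_rate(responses))  # 0.5 -> this grid would be cut
```

Note that a flat answer pattern can also be a legitimate opinion, which is one reason the rule targets the grid (the instrument) rather than the individual respondent.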
Is the ideal survey length 20 minutes? No, but that’s probably how long your respondent will think the survey is!
Jeffrey Henning, PRC, is president of Researchscape International, which provides “Do It For You” custom surveys at Do It Yourself prices. He is a Director at Large on the Marketing Research Association’s Board of Directors. You can follow him on Twitter @jhenning.