Quality Starts with Survey Design: Tips for Better Surveys

Marketing researchers face two important challenges to data quality. First is the question of representativeness: with response rates plummeting, we need to make surveys shorter, more engaging, and easier for respondents to complete. Second is the issue of data accuracy: we must make sure that survey questions measure what we think they measure.

Making surveys more accurate and representative comes down to survey design. Think carefully about how quality is built in and affected throughout all phases of the research project. When you get in the habit of thinking about quality at every phase of a study, from design to implementation to analysis of results and feedback, the payoff will be clear.

First Steps First

It sounds obvious, but the first quality check is to take your survey. Clients, researchers, analysts—everybody on board should complete the survey. Be sure to ask some people who are not familiar with the project to complete it as well. How does it feel to be on the other side? Talk through the questionnaire as a group. Look for areas that need more focus or clarification. Seek recommendations to improve the survey. Encourage the group to advocate for changes and explain why the changes are important. And be sure to use a variety of devices and operating systems to understand how the survey performs in different situations.

Conduct a pretest to get feedback from respondents. You don’t have to complete many pretest surveys, but you should have at least a few “real,” qualified respondents complete the survey. They should also answer questions about the survey’s design, ease of use, and any other issues they ran into. By all means, use survey engagement tools when feasible, but don’t fall into the trap of letting the droids rule the analysis. You need the human touch from real respondents as well. (Don’t forget to remove pretest respondents before you fully launch the survey, or else clear their responses or filter them out of the final results.)

Use technology to test for data quality. A computer application is great at metrics, scoring and summarizing responses. It can measure survey engagement by tracking rates of abandonment and speeding and measure experience quality via respondent ratings. The average length of time to complete the survey is also a key metric. Use technology as a predictive tool before launching the survey to evaluate engagement levels and suggest improvements.
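
As a rough illustration of the kind of summary such a tool produces, here is a minimal sketch in Python, assuming a hypothetical export with per-respondent status and timing fields (the file and column names are made up for the example):

```python
import pandas as pd

# Hypothetical export: one row per respondent, with a "status" column
# ("complete" or "abandoned") and a "minutes" column (time to finish).
responses = pd.read_csv("pretest_responses.csv")

completes = responses[responses["status"] == "complete"]
abandonment_rate = 1 - len(completes) / len(responses)
median_minutes = completes["minutes"].median()

# A common speeding heuristic: finishing in less than half the median time.
speeders = completes[completes["minutes"] < 0.5 * median_minutes]

print(f"Abandonment rate: {abandonment_rate:.1%}")
print(f"Median length of interview: {median_minutes:.1f} minutes")
print(f"Possible speeders: {len(speeders)} of {len(completes)} completes")
```

Thresholds like “half the median” are judgment calls, not industry standards; the point is simply to look at these numbers before full launch rather than after.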

Frequent Challenges

As you get in the habit of performing quality checks, be on the lookout for these common issues; addressing them will improve your survey design:

Is the survey user-friendly?

  • Beware of “survey fatigue.” Split long surveys into many short pages.
  • Make survey language more consumer-friendly and conversational and less “research-y.”
  • Does the language used on buttons and error messages match the survey language?
  • Validate questions for the correct data type and tie validation to relevant error messaging that tells the respondent how to fix the response (a minimal example follows this list).
  • Use a progress indicator to show how far the respondent is from completion.
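
Most survey platforms handle validation for you, but the idea from the validation bullet above is easy to sketch. Below is a hypothetical validator for a numeric age question that pairs the data-type check with an error message telling the respondent exactly how to fix the answer; the function, limits, and wording are illustrative, not taken from any particular platform:

```python
def validate_age(raw_answer: str) -> tuple[bool, str]:
    """Return (is_valid, message) for a hypothetical required 'age' question."""
    text = raw_answer.strip()
    if not text:
        return False, "Please enter your age before continuing."
    if not text.isdigit():
        return False, "Please enter your age as a whole number, for example 42."
    if not 18 <= int(text) <= 120:
        return False, "Please enter an age between 18 and 120."
    return True, ""

# The error message tells the respondent what to do, not just that something failed.
ok, message = validate_age("forty-two")
if not ok:
    print(message)  # "Please enter your age as a whole number, for example 42."
```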

Does the survey flow?

  • Improve the logical flow of the questions and watch out for redundancies.
  • Make sure the question type matches what you are looking for. Closed-ended questions are ideal for analysis and filtering.
  • Test your logical paths. When designing page skips, you don’t want unexpected branching to happen (a simple way to exercise every path is sketched after this list).
  • Use required questions for “must get” answers, so the respondent can’t move on without completing them. Be careful about making too many questions required, however, as respondents can become frustrated and break off before completing the survey.
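
One low-tech way to test those paths is to write the skip rules down as a table and push a handful of made-up answer sets through it, confirming that each one lands on the page you expect. A minimal sketch, with invented questions, pages, and rules:

```python
# Hypothetical skip rules: (question, answer) -> page to jump to.
skip_rules = {
    ("owns_car", "No"): "page_public_transit",
    ("owns_car", "Yes"): "page_car_brands",
}

def next_page(question: str, answer: str, default: str) -> str:
    """Apply a skip rule if one matches; otherwise continue to the default page."""
    return skip_rules.get((question, answer), default)

# Each made-up answer should land on the page we expect; the last case
# has no rule and should fall through to the default page.
test_cases = [
    ("owns_car", "Yes", "page_car_brands"),
    ("owns_car", "No", "page_public_transit"),
    ("owns_car", "Prefer not to say", "page_demographics"),
]

for question, answer, expected in test_cases:
    actual = next_page(question, answer, default="page_demographics")
    assert actual == expected, f"{question}={answer!r} went to {actual}, expected {expected}"

print("All skip-logic paths behaved as expected.")
```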

Is your survey design mobile-capable? While 40% of all survey responses are completed on a mobile device, a recent study reported that half of surveys are not mobile-capable, much less mobile-optimized. Design your survey to work on mobile devices:

  • Keep pages short so that questions fit on the screen.
  • Minimize scrolling whenever possible.
  • Check for comment box sizing problems and row widths for matrix question labels (a rough label-length check is sketched below).
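
There is no substitute for opening the survey on an actual phone, but a crude first pass can flag matrix row labels that are likely to wrap badly on a narrow screen. The character budget below is an arbitrary assumption for the sake of the example, not a standard:

```python
# Assume roughly 30 characters fit on one line of a matrix row label
# on a narrow phone screen (an illustrative guess, not a rule).
MAX_CHARS_ON_MOBILE = 30

row_labels = [
    "Price",
    "Customer service",
    "Overall value for money compared with similar products",
]

for label in row_labels:
    if len(label) > MAX_CHARS_ON_MOBILE:
        print(f"Likely to wrap on mobile ({len(label)} chars): {label!r}")
```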

Remember, quality control should never be an afterthought; establish a quality control process for surveys that specifies the review responsibilities of each survey reviewer. One or more team members should be responsible for evaluating respondent-level data. The process should review the survey design end to end, focusing on both technological efficiency and respondent experience for optimal data quality.

Smart Survey Design: 3 Forgotten Pain Points to Avoid

“Smart Survey Design” is a loose term (bordering on a catch-all) that you’ve probably heard pitched to you.  Maybe you have used it yourself when piecing together a study.

It’s not a hollow term, by any means. Smart design has advantages for both designers and respondents. Designing “smart” simply means maintaining data integrity: capturing statistically relevant data while reducing the amount of bad data caused by poor survey takers (straight-liners, short responders on open ends, speeders, cheaters, etc.).
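
As a rough sketch of what catching that bad data can look like in practice, the snippet below flags straight-liners (no variation across a rating grid), very short open-end responses, and speeders in a hypothetical respondent file; the column names and thresholds are assumptions for the example:

```python
import pandas as pd

# Hypothetical respondent-level data: grid ratings q1-q5, one open end, and timing.
df = pd.read_csv("completes.csv")
grid_columns = ["q1", "q2", "q3", "q4", "q5"]

# Straight-liners: identical answers across the entire rating grid.
df["straight_liner"] = df[grid_columns].nunique(axis=1) == 1

# Short open-end responders: fewer than three words in the verbatim.
df["short_oe"] = df["open_end"].fillna("").str.split().str.len() < 3

# Speeders: finished in under half the median interview length.
df["speeder"] = df["minutes"] < 0.5 * df["minutes"].median()

flagged = df[df[["straight_liner", "short_oe", "speeder"]].any(axis=1)]
print(f"{len(flagged)} of {len(df)} completes flagged for review")
```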

That’s the basic idea, but there is one factor that often gets forgotten or ignored in a “Smart” design: the respondent’s experience. You want your respondents to have a positive user experience and a survey with a human touch. They should feel good about taking the survey.

I’m not just talking about survey length or incentives, though those are certainly key tools for addressing the problem. What I am referring to is the very way we talk to the respondent: the questions we ask and how many times we ask them.

It is easy for us as researchers to become so lost in our need for quality data that we forget the source of it—human beings. People are rational and emotional creatures. How do they feel about their participation?  It’s an important consideration, all too often ignored.

Identifying and avoiding potential pain points may not only help to reduce the number of scrubs and drop-outs, but also deliver better, more reliable data.

Pain point #1: Being too repetitive

Have you ever been on a conference call where the speaker repeats the same point 5 times? Did you like it? Did you continue to pay attention, or did you look at your phone or check your email? Now imagine that same conference call. The speaker drones on with 4 more points that are roughly one hair’s width different from the original ones. Frustrating!

Plenty of studies out there get too repetitive in hopes of garnering nominal, ordinal, interval, and ratio data just to present the client with 4 different charts. But ask yourself: how reliable are the opinions offered by a respondent you have just bored or annoyed?

Some repetition may be unavoidable, especially when you want to determine which of a group of stimuli is most attractive to your target, but you should not bludgeon the people who are meant to be helping you.

Pain point #2: Being too clever

“If you could be a tree, what tree would you be and why?”

This may be a good opener for your therapist to crawl around the workings and motivations of your mind, but some respondents may find such questions intrusive, or worse, dismiss them as “hogwash.” They have signed up to take part in survey research, but they’re not lab rats!

We come back to the reliability question: how reliable is the data you are gathering if your respondent has been made uncomfortable and just wants to finish the ordeal and get out?

The prospect of getting “deeper data” out of your survey may be very alluring, but consider how appropriate those questions are for your audience.  Does a panelist really need to imagine their favorite restaurant as a spirit animal in order to tell you what their favorite sandwich is?

Pain point #3: Being too “research-y”

When gathering data, or even when trying to cut time off the length of the interview out of consideration for respondents, questions can come across as impersonal or curt. These rapid-fire “cold” questions, though focused, clear and concise, run the risk of boring a respondent into unintentional mental lethargy.

Quality-check questions can eliminate respondents who have lost interest from your data set, but wouldn’t it be more beneficial to prevent the need for them in the first place? You don’t have to write a narrative or tell a knock-knock joke to keep people engaged with the process. Panelists are people. Just remember to “speak” to them conversationally, instead of clinically prompting and probing for responses.

By being more aware of the respondent’s pain points and making a few tweaks to your surveys, you can improve completion rates, the quality of open-ended responses and data integrity. Better yet, all of this comes without incurring any additional costs.

Why Volunteer?

The research industry needs volunteers. Here’s why you should consider playing a part.

Many of us here at MSG serve as active volunteer members of market and survey research industry organizations. It’s part of our company culture to get involved and make a difference. Recently, I attended back-to-back chapter events, and I began to reflect on the benefits of volunteering. Was it really worthwhile to devote my time to a local chapter organization?

It’s true, the amount of time you need to devote to volunteering can feel like a second job, and it is crucial that you be able to balance your primary and secondary activities. It’s definitely a juggling act, and it isn’t always easy.

That being said, there are loads of good reasons to become a volunteer. Here’s what influenced me to get involved:

Networking. Serving as an industry volunteer will get you talking to people and is a wonderful means of creating and maintaining relationships. I want to meet people I can work with, but I also want to build a network of long-lasting professional relationships. In my roles as a local chapter volunteer and committee member, I have encountered industry pros whom I never would have met otherwise.

Learning best practices. Education doesn’t end with a degree, a certification, or on-the-job training. It should be seen as a lifelong habit of mind. By attending events and seminars outside the orbit of your day-to-day business, you will be exposed to new ideas and pick up on new trends within your industry and related industries.

Organic growth. A natural goal we all have is to grow our business. When you volunteer, the cultivation of business growth can tend to happen more organically, as a function of developing relationships within the membership environment. As you discover ways to collaborate and partner with others, those seeds will sprout.

I firmly believe that volunteers are the lifeblood of an association. They keep our communities engaged and informed. Volunteering can take up a lot of spare time, but when I reflect and ask myself whether I should have volunteered, the answer is always a resounding YES!

MSG in a Bottle

Welcome to our brand new blog for customers and industry observers. We’re calling it MSG in a Bottle, and I know you’re going to love following it.

One of the best things about my job as president of Marketing Systems Group is the opportunity to work with an inspiring, committed team of professionals. They constitute a brain trust of talent and experience. Their collective market research expertise and dedication to quality truly make a difference across our entire product line — GENESYS® sampling system, PRO-T-S® dialer software and ARCS® all-in-one panel manager.

Since 1987, we’ve been delivering innovative solutions to the survey research community, and our staff continues to do amazing things, year over year.

I’m proud of these professionals. They are what make our products great. We learn from each other every day.

That’s all well and good for a company president, you might be saying, but what about me?

That’s why we’ve started this blog. We want you to benefit from the collective wisdom I see in action every day at MSG. I’ve asked this talented team of pros to share their insights and expertise with you.

At the MSG in a Bottle blog you will get our expert analysis of hot industry trends, stay informed with news on the survey research industry and the latest standards updates, learn how we’re positioning our products to meet customers’ needs, and get practical advice and tips on how best to use our products. Think of it as “news you can use.” We also hope to have some fun along the way. We want you to get to know us better, and we want to hear from you, too. You’ll be able to join the discussion and share your feedback and suggestions in the comments section after each blog post.

I’m excited about this new channel for reaching our customers and the survey research community, and I’m confident that our media and marketing team will keep you up-to-date on the survey research industry topics that truly matter. We hope you’ll bookmark us and stop in frequently. New pieces will appear on a bi-monthly basis. Better yet, subscribe or follow us here and never miss a new post.

Thanks and happy reading,

Jerry Oberkofler