Quality Starts with Survey Design: Tips for Better Surveys

Marketing researchers face two important challenges to data quality. First is the question of representativeness: with response rates plummeting, we need to make surveys shorter, more engaging, and easier for respondents to complete. Second is the issue of data accuracy: we must make sure that survey questions measure what we think they measure.

If you want to make surveys more accurate and representative, it all comes down to survey design. Think carefully about how quality is expressed and impacted throughout all phases of the research project. When you get in the habit of thinking about quality at every phase of a study—from design to implementation to analysis of results and feedback—the payoff will be clear.

First Steps First

It sounds obvious, but the first quality check is to take your survey. Clients, researchers, analysts—everybody on board should complete the survey. Be sure to ask some people who are not familiar with the project to complete it as well. How does it feel to be on the other side? Talk through the questionnaire as a group. Look for areas that need more focus or clarification. Seek recommendations to improve the survey. Encourage the group to advocate for changes and explain why the changes are important. And be sure to use a variety of devices and operating systems to understand how the survey performs in different situations.

Conduct a pretest to get feedback from respondents. You don’t have to complete many pretest surveys, but you should have at least a few “real,” qualified respondents complete the survey. They should also answer questions about the survey’s design, ease of use, and any other issues they encountered. By all means, use survey engagement tools when feasible, but don’t fall into the trap of letting the droids rule the analysis; you need the human touch from real respondents as well. (Don’t forget to remove the pretest respondents before you fully launch the survey, or else clear their responses or filter them out of the final results.)

Use technology to test for data quality. A computer application is great at computing metrics, scoring, and summarizing responses. It can measure survey engagement by tracking rates of abandonment and speeding, and it can measure experience quality via respondent ratings. The average length of time to complete the survey is also a key metric. Use technology as a predictive tool before launching the survey to evaluate engagement levels and suggest improvements.
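
To make these checks concrete, here is a minimal sketch of how such engagement metrics might be computed from session records. The field names and the speeder threshold are illustrative assumptions, not any particular platform’s API.

```python
# Hypothetical sketch of basic survey-engagement metrics; the session
# fields and the speeder cutoff are illustrative assumptions.
from statistics import median

def engagement_report(sessions, speed_ratio=0.48):
    """sessions: dicts with 'completed' (bool) and, for completes,
    'seconds' (time taken to finish the survey)."""
    completed = [s for s in sessions if s["completed"]]
    abandonment_rate = 1 - len(completed) / len(sessions)
    times = [s["seconds"] for s in completed]
    med = median(times)
    # Flag "speeders": completes far faster than the median respondent.
    speeders = [t for t in times if t < speed_ratio * med]
    return {
        "abandonment_rate": round(abandonment_rate, 3),
        "median_seconds": med,
        "speeder_rate": round(len(speeders) / len(completed), 3),
    }

sessions = [
    {"completed": True, "seconds": 410},
    {"completed": True, "seconds": 150},   # suspiciously fast
    {"completed": True, "seconds": 505},
    {"completed": False, "seconds": None}, # abandoned mid-survey
]
print(engagement_report(sessions))
# {'abandonment_rate': 0.25, 'median_seconds': 410, 'speeder_rate': 0.333}
```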

Frequent Challenges

As you get in the habit of performing quality checks, be on the lookout for these common issues that will lead you to improve your survey design:

Is the survey user-friendly?

  • Beware of “survey fatigue.” Split long surveys into many short pages.
  • Make survey language more consumer-friendly and conversational and less “research-y.”
  • Make sure the language used on buttons and error messages matches the survey language.
  • Validate questions for the correct data type and tie validation to relevant error messaging that tells the respondent how to fix the response (see the sketch after this list).
  • Use a progress indicator to show how far the respondent is from completion.
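
As a simple illustration of the validation point above, this sketch pairs a data-type check with an error message that tells the respondent how to fix the answer. The question types and message wording are hypothetical examples, not a standard.

```python
# Illustrative sketch: tie each data-type validation to an actionable
# error message. Question types and messages are hypothetical examples.
def validate_answer(answer, qtype):
    """Return (value, None) on success or (None, error_message) on failure."""
    if qtype == "integer":
        try:
            return int(answer), None
        except (TypeError, ValueError):
            return None, "Please enter a whole number, e.g. 3."
    if qtype == "email":
        if isinstance(answer, str) and "@" in answer and "." in answer.split("@")[-1]:
            return answer, None
        return None, "Please enter a valid email address, e.g. name@example.com."
    return answer, None  # free text: accept as-is

value, error = validate_answer("3.5", "integer")
print(error)  # Please enter a whole number, e.g. 3.
```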

Does the survey flow?

  • Improve the logical flow of the questions and watch out for redundancies.
  • Make sure the question type matches what you are looking for. Closed-ended questions are ideal for analysis and filtering purposes.
  • Test your logical paths. When designing page skips, you don’t want unexpected branching to happen.
  • Use required questions for “must get” answers, so the respondent can’t move on without completing them. Be careful about making too many questions required, however, as respondents can become frustrated and break off before completing the survey.

Is your survey design mobile-capable? While 40% of all survey responses are completed on a mobile device, a recent study reported that half of surveys are not even mobile-capable, much less mobile-optimized. Design your survey to work on mobile devices:

  • Keep pages short and make sure the questions fit on the screen.
  • Minimize scrolling whenever possible.
  • Check for comment box sizing problems and row widths for matrix question labels.

Remember, quality control should never be an afterthought; you must have an established quality control process for surveys. That process should specify the review responsibilities of each survey reviewer, and one or more team members should be responsible for evaluating respondent-level data. It should also cover the survey design end to end, with an eye toward maximizing both technological efficiency and respondent experience for optimal data quality.

Smart Survey Design: 3 Forgotten Pain Points to Avoid

“Smart Survey Design” is a loose term (bordering on a catch-all) that you’ve probably heard pitched to you.  Maybe you have used it yourself when piecing together a study.

It’s not a hollow term, by any means. Smart design has advantages for both designers and respondents. Designing “smart” simply means maintaining data integrity: capturing statistically relevant data while reducing the amount of bad data caused by poor survey takers (straight-liners, short responders on open ends, speeders, cheaters, etc.).
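
To make that concrete, here is a minimal sketch of how those bad-data patterns might be flagged at the respondent level. The record layout and the cutoff values are assumptions for illustration, not industry standards.

```python
# Hypothetical sketch of respondent-level "bad data" flags; the record
# layout and the cutoff values are illustrative assumptions.
def flag_respondent(record, grid_keys, min_seconds=120, min_oe_chars=10):
    flags = []
    # Straight-lining: identical answers across every item in a grid.
    if len({record[k] for k in grid_keys}) == 1:
        flags.append("straight_liner")
    # Speeding: finished implausibly fast for the questionnaire length.
    if record["seconds"] < min_seconds:
        flags.append("speeder")
    # Short open ends: little effort on the open-ended question.
    if len(record["open_end"].strip()) < min_oe_chars:
        flags.append("short_open_end")
    return flags

record = {"q1": 5, "q2": 5, "q3": 5, "seconds": 95, "open_end": "ok"}
print(flag_respondent(record, ["q1", "q2", "q3"]))
# ['straight_liner', 'speeder', 'short_open_end']
```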

That’s the basic idea, but there is one factor that often gets forgotten or ignored in a “Smart” design: the respondent’s experience. You want your respondents to have a positive user experience: a survey with a human touch. They should feel good about taking the survey.

I’m not just talking about survey length or incentive, though those are certainly key tools in addressing the problem. What I am referring to is the very way we talk to the respondent: the questions we ask and how many times we ask them.

It is easy for us as researchers to become so lost in our need for quality data that we forget the source of it—human beings. People are rational and emotional creatures. How do they feel about their participation?  It’s an important consideration, all too often ignored.

Identifying and avoiding potential pain points may not only help to reduce the number of scrubs and drop-outs, but also deliver better, more reliable data.

Pain Point #1: Being too repetitive

Have you ever been on a conference call where the speaker repeats the same point 5 times? Did you like it? Did you continue to pay attention, or did you look at your phone or check your email? Now imagine that same conference call, except the speaker drones on with 4 more points that are roughly one hair’s width different from the original ones. Frustrating!

Plenty of studies out there get too repetitive in hopes of garnering nominal, ordinal, interval, and ratio data just to present the client with 4 different charts. But ask yourself: how reliable are the opinions offered by a respondent you have just bored and/or annoyed?

Some repetition may be unavoidable, especially when you want to determine which of a group of stimuli is most attractive to your target, but you should not bludgeon the people who are meant to be helping you.

Pain Point #2: Being too clever

“If you could be a tree, what tree would you be and why?”

This may be a good opener for your therapist to probe the workings and motivations of your mind, but some respondents may find such questions intrusive or, worse, “hogwash.” They have signed up to take part in survey research, but they’re not lab rats!

We come back to the reliability question: how reliable is the data you are gathering if your respondent has been made uncomfortable and just wants to finish the ordeal and get out?

The prospect of getting “deeper data” out of your survey may be very alluring, but consider how appropriate those questions are for your audience.  Does a panelist really need to imagine their favorite restaurant as a spirit animal in order to tell you what their favorite sandwich is?

Pain Point #3: Being too “research-y”

While gathering data, or even when trying to cut the length of interview out of consideration for respondents, we can end up presenting questions impersonally or curtly. These rapid-fire “cold” questions, though focused, clear, and concise, run the risk of boring a respondent into unintentional mental lethargy.

Trap questions can weed respondents who have lost interest out of your data set, but wouldn’t it be more beneficial to prevent the need for them in the first place? You don’t have to write a narrative or tell a knock-knock joke to keep respondents engaged with the process. Panelists are people. Just remember to “speak” to them conversationally, instead of clinically prompting and probing for responses.

By being more aware of the respondent’s pain points and making a few tweaks to your surveys, you can improve completion rates, quality of open-ended responses and data integrity.  Better yet, it does all this without incurring any additional costs.

4 Surefire Ways to Increase ABS Response Rates Without Breaking the Bank

So you found the perfect sampling source: address-based sample (ABS), with nearly 100% coverage and the ability to reach cell-phone-only homes. You can expect to get the completes you need, but realistically, what response rate will you achieve? And how can you boost it?

Depending on the steps taken, the response rate can vary greatly. You may only realize 10-15% without a big-name company endorsement to go with your survey and/or a pre-notification postcard, but such an endorsement can kill the budget before the study even begins!

Here are 4 surefire tips to increase response rates…

Tip #1: Append phone numbers and names to the addresses using commercial databases to personalize the mail pieces and allow for reminder calls. Even where a name is appended, include “or current resident” to reduce the return rate.

Tip #2: Add a creatively designed piece with the web link to drive recipients to participate online. This allows respondents to take the survey anytime and on a device of their choice. Offering a multi-mode approach can increase participation and representation.

Tip #3: Repeat the message. Contacting potential respondents multiple times via mail, phone, media, or social networking sites will increase awareness and help entice them to participate. Messages are more effective when repeated!

Tip #4: Offer an incentive to motivate your respondents. Be sure the value represents a balance between the effort and time spent on the survey, within budget of course.

With some large, heavily endorsed studies, we have seen response rates of up to 50%, along with long field times, reminder calls, multiple postcards, and refusal conversions. Use the tips that your study and budget allow, and you can experience a higher response rate too!

Remembering Dale Kulp

On what would have been Dale Kulp’s 66th birthday, we wanted to take a moment to remember a man who not only founded Marketing Systems Group but also made innumerable contributions to the statistical sampling and survey research fields. Dale’s career was already flush with accomplishment before he founded Marketing Systems Group in 1987. He previously worked for industry stalwarts Chilton, Bruskin, and ICR (now SSRS). With MSG, he envisioned the development of a PC-based, in-house RDD sample generation system (GENESYS) that would become the cornerstone product of the company.

Aside from being the driving force behind the industry’s first in-house sampling system, Dale was integral in developing list-assisted RDD sampling methodology at a commercial level, which revolutionized the process of reaching probability-based samples of households. Through his many technical notes and publications, he remained vigilant about addressing the operational issues challenging the viability of this methodology, particularly those resulting from the unfolding changes in U.S. telephony.

Dale also started several Omnibus telephone surveys that not only continue to thrive 20 years after their launch but in at least one case spawned their own company. Centris Marketing Science was created by Dale along with Paul Rappaport after they realized the value of the census-block-level data that the Omnibus survey collected.

Realizing that MSG should not be a one-trick pony, Dale continued to pursue other product lines that would benefit the survey research industry. He assembled a team that included current MSG President Jerry Oberkofler and Vice President Reggie Blackman to develop the first automated screening process: GENESYS-ID. By applying the technology and philosophy of GENESYS-ID to the survey research industry, the team created PRO-T-S, the first predictive dialer built exclusively for the research industry. In 2004, Dale brought the ARCS panel management software under the MSG umbrella. ARCS is now one of the leading software packages in the sensory and pharmaceutical industries, as well as a recruitment tool for large civic organizations.

Since Dale’s passing in late 2009, MSG has grown substantially but has remained true to the vision, products, and protocols that Dale Kulp laid out back in 1987. Not only do the MSG folks wish Dale a happy birthday, but we also thank him for his vision, contributions, and foresight.

Reflections on the Presidency of the Greater NY MRA

Trade associations are essential to the marketing research industry. They provide us with an opportunity to learn about best practices, new research methods, and industry trends. They provide us with an opportunity to network. One of the most overlooked roles of industry associations is their ability to be our voice in government. Their lobbying and advocacy efforts are vital to the future strength of the marketing research industry.

Over the years, Marketing Systems Group has embraced this and encouraged us to take an active role in these associations. Over the last 10 years I have been very active in the Marketing Research Association, at both the chapter and national levels, culminating this past year in serving as president of the Greater New York Chapter.

I found it to be an incredible opportunity to put my fingerprints on the direction and message of the largest chapter in the association. What I didn’t realize is how much goes into running the chapter at the high level that everyone has come to expect. You don’t realize how much work is involved in getting one event planned, let alone 5 events. Of course, I was not alone. I was surrounded by nine of the most dedicated and hardworking board members I could ask for. Their hard work made my job so much easier, and I am forever grateful for everything they did.

By far my favorite event was the Price is Right event, a twist on the old game show in which attendees guessed the costs of industry-related expenditures. It was a fun and interactive way for attendees to volunteer and learn about parts of the research industry that they are not involved in day to day.

I have often been asked why I am always trying to stay involved with the MRA. The answer is simple: you can only get out of an association what you put into it. My suggestion to anyone would be to get involved. That can mean so many different things. Start by attending events. Maybe you learn something new. Maybe you meet someone who will become a client, a vendor, or a friend. Once you are comfortable, volunteer. Our industry associations LOVE volunteers, and my years of volunteering have helped me in so many professional and personal ways. I have developed client relationships and secured new vendors. I have learned so much about the research industry outside the world of sampling that I might never have been exposed to otherwise, and this has only helped me understand my clients’ needs better. I have met many people who I now count among my closest friends. I am very appreciative and thankful to those who encouraged me to get involved all those years ago, as well as everyone I have met and worked with over those ten years. And, of course, I am grateful to work for a company that understands the importance of our associations as well as our involvement in them. I truly think that we are a better company for it.

Split-Frame Sampling

Oftentimes, researchers are faced with the challenging task of targeting rare domains in a population while maintaining the probability-based nature of the employed sample.  For instance, in a national RDD sample it might be necessary to oversample households with small children or those with even less prevalent attributes.  While an epsem sampling design, whereby all numbers have the same chance of selection, will provide the most efficient sample with respect to the precision of survey estimates, from a cost perspective such a design can be completely prohibitive due to the required level of screening for reaching eligible households.  This is where a cleverly designed stratified sampling alternative that employs disproportional allocation can prove highly valuable.

In practice, an optimal sample allocation scheme takes into account the unit cost per interview in each sampling stratum.  As such, a stratum with a high incidence of reaching members of the target population will receive a higher allocation as compared to other strata.  This disproportionate sample allocation should be exercised while providing a non-zero chance of selection for all telephone numbers to ensure a probability-based sample.
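
As a rough illustration, the textbook cost-aware (“optimum”) allocation gives stratum h a sample size proportional to N_h * S_h / sqrt(c_h), where N_h is the stratum size, S_h the within-stratum standard deviation, and c_h the unit cost per interview. The sketch below applies it to a two-stratum frame; all figures are hypothetical.

```python
# Sketch of textbook cost-aware optimum allocation:
# n_h proportional to N_h * S_h / sqrt(c_h). All figures are hypothetical.
from math import sqrt

def optimum_allocation(strata, n_total):
    score = {h: s["N"] * s["S"] / sqrt(s["cost"]) for h, s in strata.items()}
    total = sum(score.values())
    return {h: round(n_total * v / total) for h, v in score.items()}

strata = {
    # High-incidence "top" stratum flagged by the database match:
    # cheap completes, so it is sampled at a much higher rate.
    "top":  {"N": 20_000,  "S": 0.45, "cost": 20.0},
    # Low-incidence remainder: heavy screening makes completes costly.
    "rest": {"N": 180_000, "S": 0.15, "cost": 90.0},
}
print(optimum_allocation(strata, n_total=1_000))
# {'top': 414, 'rest': 586} -- the top stratum holds 10% of the frame
# but receives roughly 41% of the interviews.
```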

The objective of this stratification is to provide a means for oversampling the target populations by segregating higher-incidence households into distinct sampling strata. This is done by matching all numbers against commercial databases, which contain household- and individual-level demographic data, and identifying the numbers that meet the specified target.

With access to all the top commercial databases, Marketing Systems Group can provide cost-effective solutions for sample surveys that aim to target rare domains. By placing such telephone numbers in the “top” or high-incidence stratum and the remaining telephone numbers covering the geography of interest in another, you can create a complete sampling frame. Subsequently, an optimization procedure determines a higher sampling fraction for the top stratum, cognizant of the design effect that will result from a disproportional sample allocation and will need to be adjusted for when weighting.
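
To see the weighting consequence, here is a short companion sketch: each stratum’s base weight is N_h / n_h (the inverse selection probability), and Kish’s approximation deff ≈ n * sum(w_i^2) / (sum(w_i))^2 quantifies the variance inflation those unequal weights produce. The figures carry over from the hypothetical allocation above.

```python
# Sketch of the design effect from disproportionate allocation, using
# Kish's approximation; stratum figures are the hypothetical ones above.
def kish_deff(strata):
    weights = []
    for s in strata.values():
        base_weight = s["N"] / s["n"]  # inverse selection probability
        weights += [base_weight] * s["n"]
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

strata = {
    "top":  {"N": 20_000,  "n": 414},
    "rest": {"N": 180_000, "n": 586},
}
print(round(kish_deff(strata), 2))  # ~1.41: variance inflation to adjust for
```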

MSG in a Bottle

Welcome to our brand new blog for customers and industry observers. We’re calling it MSG in a Bottle, and I know you’re going to love following it.

One of the best things about my job as president of Marketing Systems Group is the opportunity to work with an inspiring, committed team of professionals. They constitute a brain trust of talent and experience. Their collective market research expertise and dedication to quality truly make a difference across our entire product line — the GENESYS® sampling system, PRO-T-S® dialer software and ARCS® all-in-one panel manager.

Since 1987, we’ve been delivering innovative solutions to the survey research community, and our staff continues to do amazing things, year over year.

I’m proud of these professionals. They are what make our products great. We learn from each other every day.

That’s all well and good for a company president, you might be saying, but what about me?

That’s why we’ve started this blog. We want you to benefit from the collective wisdom I see in action every day at MSG. I’ve asked this talented team of pros to share their insights and expertise with you.

At the MSG in a Bottle blog you will get our expert analysis of hot industry trends, stay informed with news on the survey research industry and the latest standards updates, learn how we’re positioning our products to meet customers’ needs, and get practical advice and tips on how best to use our products. Think of it as “news you can use.” We hope to have some fun along the way, too. We want you to get to know us better, and we want to hear from you. You’ll be able to join the discussion and share your feedback and suggestions via the comments section after each blog post.

I’m excited about this new channel for reaching our customers and the survey research community, and I’m confident that our media and marketing team will keep you up to date on the survey research industry topics that truly matter. We hope you’ll bookmark us and stop in frequently. New pieces will appear on a bi-monthly basis. Or better yet, subscribe or follow us here and never miss a new post.

Thanks and happy reading,

Jerry Oberkofler