Reap the Rewards: Finding the Right Incentive Mix for Your Panelists

Pretty much everyone in the survey business understands the value of a satisfied panel. We want our surveys to be well-received and satisfying. We want our panelists to be engaged, and when we invite them again, we want them to participate eagerly.

To achieve these goals, you must work to build loyalty among your panelists. What does loyalty mean in this context? A panelist should think of your panel as their panel. They belong there, and it’s a place they will want to revisit.

One tried-and-true method for building loyalty is offering incentives, also known as rewards. An incentive reinforces positive behaviors and reminds panelists who your brand is and why it’s worthy of their loyalty.

A panelist who is kept happy will, by and large, be a loyal one. Here again, incentives can play a major role in building goodwill. When you reward respondents, you not only offer them something of value, you let them know that you value them.

In the abstract, an incentive program should drive growth in three areas:

  • Acquisition of new panelists
  • Frequency of participation
  • Retention of existing panelists

When we reward panelists for good behavior, the happy (thus loyal) panelists are much more likely to share their positive experience with their friends. In this way retention (satisfied panelists) can feed back into acquisition (new participants).

Let’s briefly examine how incentives can be structured to address these aims.

A reward that isn’t worthwhile to the participant isn’t worth much.

The value of the reward should be matched to two factors: the time invested by the participant and the complexity of the tasks you ask them to complete.

Beware of offering too lavish a reward.

This can trigger fraudulent actions, as in “I’ll say or do anything to get the prize.” An incentive program should NEVER compromise the integrity of the research.

Watch out for the redundancy problem.

Offering the same reward again and again can have a negative impact: participant boredom leading to lack of engagement.

Weigh the benefits of adding diverse incentives.

Are there ways to tailor or customize the panel experience? Is your panel management system able to accommodate changes to the incentive package over time, as needs change?

You might, for instance, design a tiered system for qualifying and non-qualifying participants. Why should non-qualifiers be rewarded with a token gift too? Because today’s non-qualifier could be tomorrow’s qualifying participant. Retention is the name of the game. With typical conversion rates tending toward the low range of 10% to 15%, rewarding the non-qualifier helps discourage gaming of the system and incentivizes honest repeat participation in the next survey.
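To make the idea concrete, here is a minimal sketch of a tiered reward rule in Python. The point values and the time-based bonus are hypothetical placeholders, not recommendations; calibrate them to your own survey economics.

    # Hypothetical tiered incentive rule: a token reward for non-qualifiers,
    # a base reward plus a time-scaled bonus for completed surveys.

    def award_points(qualified: bool, minutes_invested: int) -> int:
        """Return incentive points for one survey attempt."""
        if not qualified:
            return 25  # token gift keeps non-qualifiers engaged for next time
        base = 100                                 # base reward for a complete
        time_bonus = 10 * (minutes_invested // 5)  # scale with time invested
        return base + time_bonus

    print(award_points(qualified=False, minutes_invested=3))   # 25
    print(award_points(qualified=True, minutes_invested=20))   # 140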

Be flexible.

A good incentive program will have some flexibility built in, such as tiered rewards that trigger at different levels depending on specified factors. The levels could consist of gift cards, merchandise, PayPal payments, charitable donations, games, and other exclusive benefits. The key is to match the reward with the panelist. One size does not fit all.

Consider delivering digital rewards by email.

Digital rewards have a couple of advantages: (1) the recipient gets immediate satisfaction (they can redeem it right away) and (2) you reduce overhead for inventory and fulfillment management.

Weigh the costs and benefits.

Tiered rewards can add cost, but they really help to cement the bond with your most loyal panelists. Points-based rewards are a popular approach that can be cheaper than cash rewards.

Give them a choice.

Using the idea of “Reverse Preference”, you offer the panelist a choice of reward type other than the default option, and you might use this as a motivational factor for targeting a particular demographic.

Can your technology handle what you need to do? 

You want the system to accommodate multiple projects and programs across different demographics at the same time, each with its own custom incentive approach. An integrated application programming interface (API) can deliver rewards automatically. Fast incentive fulfillment not only increases efficiency, it keeps panelists happier. Make sure your panel management system is robust enough to handle the granularity of analytics you need, and adaptable enough when needs change.
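To illustrate, automated delivery through a rewards-fulfillment API might look like the sketch below. The endpoint, payload fields, and API key are hypothetical placeholders rather than any real provider’s interface; substitute the documented calls from your own vendor.

    # Hypothetical rewards-delivery call; endpoint and fields are placeholders.
    import requests

    def deliver_reward(panelist_email: str, amount_usd: float) -> bool:
        """Ask the rewards provider to email a digital reward; True on success."""
        response = requests.post(
            "https://api.example-rewards.com/v1/deliveries",  # placeholder URL
            json={
                "recipient": panelist_email,
                "amount": amount_usd,
                "reward_type": "digital_gift_card",
            },
            headers={"Authorization": "Bearer YOUR_API_KEY"},
            timeout=10,
        )
        return response.status_code == 201  # created: reward queued for delivery

    if deliver_reward("panelist@example.com", 5.00):
        print("Reward queued for immediate email delivery")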

Measuring the Results.

The key to improving an incentive program is to test and adjust.

You should always be tracking and measuring respondent satisfaction, which can be gauged via satisfaction surveys, social media feedback, and helpdesk interactions.

Doing this will show panelists that you are there for them, are interested in their feedback, and are willing to act to improve their experience with each iteration.

Measurement is necessary for another reason. To gain approval for an incentive program, you will need to demonstrate to management that you have the metrics to show a clear return on investment. Plan to show them the positive feedback loops between completion rates and satisfaction metrics.
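As an illustration, that kind of tracking can start from a simple per-invite log. The column names and numbers below are hypothetical, purely to show the shape of the calculation.

    # Hypothetical per-invite log; track these rates wave over wave
    # as you adjust the incentive mix.
    import pandas as pd

    log = pd.DataFrame({
        "wave":      [1, 1, 1, 2, 2, 2],
        "completed": [1, 0, 1, 1, 1, 0],   # finished the survey?
        "returned":  [1, 0, 1, 1, 1, 1],   # came back for the next wave?
    })

    by_wave = log.groupby("wave").agg(
        completion_rate=("completed", "mean"),
        retention_rate=("returned", "mean"),
    )
    print(by_wave)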

With these considerations in mind, you can expect an improved rewards system that boosts acquisition rates, leads to greater participation, and secures higher retention rates.

Smart Survey Design: 3 Forgotten Pain Points to Avoid

“Smart Survey Design” is a loose term (bordering on a catch-all) that you’ve probably heard pitched to you.  Maybe you have used it yourself when piecing together a study.

It’s not a hollow term, by any means. Smart design has advantages for both designers and respondents. Designing “smart” simply means maintaining data integrity, both in terms of capturing statistically relevant data and reducing the amount of bad data caused by poor survey takers (straight-liners, short open-end (OE) responders, speeders, cheaters, etc.).

That’s the basic idea, but there is one factor that often gets forgotten or ignored in a “Smart” design: the respondent’s experience. You want your respondents to have a positive user experience: surveys with a human touch. They should feel good about taking the survey.

I’m not just talking about survey length or incentive, though those are certainly key tools in addressing the problem. What I am referring to is the very way we talk to the respondent: the questions we ask and how many times we ask them.

It is easy for us as researchers to become so lost in our need for quality data that we forget the source of it—human beings. People are rational and emotional creatures. How do they feel about their participation?  It’s an important consideration, all too often ignored.

Identifying and avoiding potential pain points may not only help to reduce the number of scrubs and drop-outs, but also deliver better, more reliable data.

Pain point #1: Repetition

Have you ever been on a conference call where the speaker repeats the same point 5 times? Did you like it? Did you continue to pay attention, or did you look at your phone or check your email? Now imagine that same conference call, except the speaker drones on with 4 more points that are roughly a hair’s width different from the original ones. Frustrating!

Plenty of studies out there get too repetitive in hopes of garnering nominal, ordinal, interval, and ratio data, just to present the client with 4 different charts. But you should ask yourself: how reliable are the opinions offered by a respondent you have just bored and/or annoyed?

Some repetition may be unavoidable, especially when you want to determine which of a group of stimuli is most attractive to your target, but you should not bludgeon the people who are meant to be helping you.

Pain point #2: Being too clever

“If you could be a tree, what tree would you be and why?”

This may be a good opener for your therapist, a way to crawl around the workings and motivations of your mind, but some respondents may find such questions intrusive, or something worse: “hogwash.” They have signed up to take part in survey research, but they’re not lab rats!

We come back to the reliability question: how reliable is the data you are gathering if your respondent has been made uncomfortable and just wants to finish the ordeal and get out?

The prospect of getting “deeper data” out of your survey may be very alluring, but consider how appropriate those questions are for your audience.  Does a panelist really need to imagine their favorite restaurant as a spirit animal in order to tell you what their favorite sandwich is?

Pain point #3: Being too “research-y”

When gathering data, or even when trying to cut time off the interview out of consideration for respondents, we can end up presenting questions impersonally or curtly. These rapid-fire “cold” questions, though absolutely focused, clear, and concise, run the risk of boring a respondent into unintentional mental lethargy.

Quality-control questions can weed respondents who have lost interest out of your data set, but wouldn’t it be more beneficial to prevent the need for them in the first place? You don’t have to write a narrative or tell a knock-knock joke to keep people engaged with the process. Panelists are people. You should just remember to “speak” to them conversationally, instead of clinically prompting and probing for responses.

By being more aware of the respondent’s pain points and making a few tweaks to your surveys, you can improve completion rates, quality of open-ended responses and data integrity.  Better yet, it does all this without incurring any additional costs.

How I Learned to Love AAPOR’s ResearchHack 3.0

It was my first year attending the American Association for Public Opinion Research (AAPOR) Annual Conference, and I was feeling a little nervous. AAPOR is one of the most influential conferences in the survey industry. My goal was to actively participate in the events and networking opportunities on the conference program. ResearchHack 3.0 was one of them.

ResearchHack is AAPOR’s version of a “hackathon”, where teams of participants (a.k.a. “hackers”) were asked to devise a plan for a mobile app that would inform various uses of the Census Planning Database.

I looked at the blank ResearchHack 3.0 registration form and hesitated. To be honest, I’m a statistician whose focus has been on survey research methodology. Except for the statistical programming language R, which I’ve used for my projects, I know very little about coding or making an app. Me, a hacker? A coder? I don’t think so! I didn’t know whether I could make any meaningful contribution. I was a little scared, but I knew that it would be a great chance to learn, to work with great people, to get out of my comfort zone, and to truly challenge myself. I signed up. “ResearchHack 3.0…bring it on!”

I was paired with three professionals: a health researcher, a health policy program research director, and a director of an institute for survey research. Our team decided to work on a Census Planning Database-based mobile app to help any survey firm or researcher trying to design a sampling and operational plan for a hard-to-survey population.

Surveying a hard-to-survey population usually results in a very low response rate. The “main idea” of our app proposal was to utilize the Low Response Score in the Census Planning Database to help identify areas with possible low response rates for the targeted population. Then we would “customize” sampling and operational plans for areas with different degrees of predicted response, with the assistance of big-data analysis results or shared experiences from other researchers.
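For illustration, flagging hard-to-survey tracts with the Low Response Score might look like the short Python sketch below. The file name, column name, and the top-quartile cutoff are my assumptions for this example, not part of our hackathon prototype; check the Planning Database codebook for the exact field names in your vintage.

    # Flag likely low-response tracts using the Planning Database's
    # Low Response Score (LRS). File and column names are assumptions.
    import pandas as pd

    pdb = pd.read_csv("pdb_tract_extract.csv")  # placeholder PDB extract

    # Treat the top quartile of predicted non-response as "hard to survey"
    # so sampling and field plans can be customized for those areas.
    cutoff = pdb["Low_Response_Score"].quantile(0.75)
    hard_to_survey = pdb[pdb["Low_Response_Score"] >= cutoff]

    print(f"{len(hard_to_survey)} tracts flagged for tailored plans")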

Actually, we had no problem creating heat maps to identify areas with possible low response rates, but when we had to create an app prototype to demonstrate how the app could help survey researchers “customize” their research plans, we ran into a problem. None of us knew if our proposed ideas were even applicable in an app! We didn’t know what adjustments we should make to implement those ideas at the app level. None of us had the related experience needed to make those calls. It’s like that feeling you get when you have an awesome idea for decorating a cake, but you don’t know the needed ingredients. I have to admit, it was a frustrating realization, and I believe my team members had a similar feeling.

The clock was ticking. We had to present our ideas to the public only 24 hours after our first meeting. The pressure was huge, but no one gave up. We sacrificed sleep to work on our slides and outputs. We wanted to be sure that our “main proposal idea” would be clearly explained.

Next, we adopted a role-playing strategy in our presentation to show the audience what kind of difficulties a researcher might face when trying to survey a hard-to-survey population, and how “customized” research plans could help, provided the app received the technical assistance it needed.

Although our ideas didn’t wow the judges (totally understandable due to our app-level technical shortcomings), we did win the “audience pick” award. We were grateful to them for appreciating the effort we put in to help relieve the pressure on all the hardworking survey researchers who have to collect responses from hard-to-survey populations.

ResearchHack 3.0 was certainly tough, but very rewarding, too. You couldn’t ask for more from this crazy and unforgettable experience!

After the conference when I got back to the office, I shared my ResearchHack experience with the programmers in the Geo-Dem group. We had some great discussions. They gave me creative ideas that I had never thought of before. This is one of the great benefits of going to conferences like AAPOR. You share new knowledge and insights with your colleagues, which sparks more creative innovation. One day we will continue in the spirit of ResearchHack 3.0 and make great products for survey researchers, together. When that day comes, our blog readers will know the news. Stay tuned!

Kelly Lin | Survey Sample Statistician | Marketing Systems Group

Taking Aim with Consumer Cellular Sample

How Consumer Cellular Sample Can Give You a More Accurate Geographic Fit of your Target Population and Improve Coverage

Geo-targeting. We all know what it means, but for the sake of this article, let’s get at the essence of the concept. Geo-targeting is a way to pinpoint an audience based on location. Accuracy is everything. Geography is the fundamental basis for every sample frame – be it individual streets, Census geography or Postal geography.

Certain sample frames such as Cellular RDD tend to be difficult to target geographically due to inward and outward migration of individuals who retain their cell phone numbers.  It’s important to be aware of these limitations when using a Cellular RDD sample, especially when targeting small geographies.

Here’s how you can miss the target: a cellular RDD sample will include people who have moved outside your target geography (a.k.a. outward migration). Conversely, some respondents live in the area of interest but never have an opportunity to be included in the RDD frame (inward migration): they have a cell number that corresponds to an entirely different geography, so they wouldn’t be included in a traditional cellular RDD frame. The result? Under-coverage due to inward migration and increased data collection costs due to outward migration.

So how can we account for the under-coverage and take better aim? One option is to supplement from a relatively new convenience frame called Consumer Cell. This frame is based on a multi-source model of consumer databases linked to publicly available cellular telephone numbers. It is updated monthly.

The Consumer Cell database is built from literally hundreds of sources, including:

  • Public records
  • U.S. Census data
  • Consumer surveys
  • Telephone directories
  • Real estate information (deed & tax assessor)
  • Voter registration
  • Magazine subscriptions
  • Survey responses
  • E-commerce
  • Proprietary sources

Geographic targeting using Consumer Cell can be highly accurate, zooming in on small geographies such as census blocks or even the household level. Further stratification can be done for numerous person- and household-level demographic variables.

One limitation of the database is that it is a convenience frame (a non-probability compilation of households). It does not offer the same coverage as an RDD frame. It is probably best utilized as a supplement, to sample respondents who live in a targeted geography. One of the benefits is that you now include respondents who otherwise would not have been sampled.

If your area of interest is at the state or local level, consider whether a Consumer Cell supplement can address the under-coverage issues in your RDD cell sample.

DIY Web Scraping: Fetching and Extracting Data

Given the sheer multitude of data accessible on the world wide web, the “web scraping” phenomenon has caught on like wildfire. Web scraping is a method for extracting data from websites. The scraping can be done manually, but it is preferably done programmatically.

Many free programs are out there to assist you with your forays into web scraping. For a recent project we used iMacros to automate the fetching and extraction of needed data from the Residential Construction Branch of the U.S. Census Bureau. This website provides data on the number of new housing units authorized by building permits. Data are available monthly, year-to-date, and annually at the national and state levels, and for most counties and county subdivisions. Prior to January 9, 2017, all building permit data at the county level or below were available only as individual text files. This meant downloading 3,142 individual text files in order to obtain the data for all the counties in the U.S. It was a tedious task, to say the least.

Such a manual process would have been too labor-intensive to take on without automation via web scraping. Automating the entire process using iMacros was pretty straightforward and simple. Here’s an outline of the steps (a Python sketch of the same fetch-and-merge idea follows the list):

  • Install the iMacros extension in the Firefox web browser.
  • Test the iMacros recording function by going through the process of selecting and downloading the first file.
  • View the recorded code and wrap it in a loop that increments by 1, so the macro repeats itself and downloads each text file.
  • Save the files in the same folder location to make the process of merging data files into a single file much easier.
  • Extract data easily for every county, with the ability to roll up by state, region, and nation.
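If you prefer a scripted approach to browser automation, the same fetch-and-merge loop can be expressed in a few lines of Python. The base URL and FIPS-based file names below are placeholders, not the Census Bureau’s real paths; inspect the site (or its FTP directory) for the actual ones before running anything like this.

    # Sketch of the fetch-and-merge loop in Python. BASE_URL and the
    # FIPS-based file names are placeholders, not real Census paths.
    import pathlib
    import requests

    BASE_URL = "https://example.census.gov/permits/{fips}.txt"  # placeholder
    OUT_DIR = pathlib.Path("permit_files")
    OUT_DIR.mkdir(exist_ok=True)

    county_fips = ["01001", "01003", "01005"]  # in practice, all 3,142 counties

    for fips in county_fips:
        resp = requests.get(BASE_URL.format(fips=fips), timeout=30)
        resp.raise_for_status()                    # fail loudly on a bad fetch
        (OUT_DIR / f"{fips}.txt").write_text(resp.text)

    # With every file in one folder, merging into a single file is trivial.
    merged = "\n".join(p.read_text() for p in sorted(OUT_DIR.glob("*.txt")))
    pathlib.Path("all_counties.txt").write_text(merged)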

Like many data sites, the Building Permits website now provides access to an FTP directory where you can navigate and download all 3,142 text files without having to enter specific parameters for each file. However, if you come across websites that do not, we recommend that you get familiar with the site to determine what format the data is in: e.g., tables, individual pages, etc. If you need to scrape numerous websites, take the time to get familiar with each one, because any change in formatting from site to site can wreak havoc if you are not aware of the potential for downloading misaligned or incorrect data. Never forget the rule: garbage in, garbage out. Test before you scrape!

Why Volunteer?

The research industry needs volunteers. Here’s why you should consider playing a part.

Many of us here at MSG serve as active volunteer members of market and survey research industry organizations. It’s part of our company culture to get involved and make a difference. Recently, I attended back-to-back chapter events, and I began to reflect on the benefits of volunteering. Was it really worthwhile to devote my time to a local chapter organization?

It’s true, the amount of time you need to devote to volunteering can feel like a second job, and it is crucial that you be able to balance your primary and secondary activities. It’s definitely a juggling act, and it isn’t always easy.

That being said, there are loads of good reasons to become a volunteer. Here’s what influenced me to get involved:

Networking. Serving as an industry volunteer will get you talking to people and is a wonderful means of creating and maintaining relationships. I want to meet people I can work with, but I also want to build a network of long-lasting professional relationships. Through my volunteer roles with a local chapter organization and on committees, I have encountered industry pros whom I never would have met otherwise.

Learning best practices. Education doesn’t end with a degree, a certification, or on-the-job training. It should be seen as a lifelong habit of mind. By attending events and seminars outside the orbit of your day-to-day business, you will be exposed to new ideas and pick up on new trends within your industry and related industries.

Organic growth. A natural goal we all have is to grow our business. When you volunteer, the cultivation of business growth can tend to happen more organically, as a function of developing relationships within the membership environment. As you discover ways to collaborate and partner with others, those seeds will sprout.

I firmly believe that volunteers are the lifeblood of an association. They keep our communities engaged and informed. It can take up a lot of spare time, but when I reflect and ask myself, “Should I have volunteered?”, I always answer with a resounding YES!


Who Really Owns the Cell Numbers on Your List?

Say you have a list of consumers’ cell numbers and you want to message them. You use an automatic dialing system to send text messages out to those numbers. This simple and apparently innocuous action could have drastic consequences, potentially costing millions of dollars.

This could happen to you

Take the case of Philadelphia-based frozen treats company Rita’s Water Ice, which settled a class action lawsuit for three million dollars in May 2016.

The reason? The plaintiff claimed that Rita’s had violated the Telephone Consumer Protection Act (TCPA).

The TCPA requires you to have prior express written consent before using an automatic telephone dialing system for messaging cellular numbers on a list.

In the Rita’s Water Ice court case, the company strongly denied the accusations, but they agreed to a settlement so as to avoid a prolonged lawsuit.

The plaintiffs comprised two groups:

  1. Those who had given original consent but changed their minds and asked to be removed from the distribution list, a request that was never honored.
  2. Those who claimed that they had never given consent to receive text messages.

What’s most interesting for those of us in the research industry is the second group. Upon analysis, it was discovered that certain plaintiffs owned cell phone numbers that had previously been assigned to consumers who HAD in fact agreed to receive text messages from Rita’s. In effect, written consent had been given originally, then the cell number was reassigned to a new consumer who had no clue about any of that.

Navigating the murky waters of compliance  

All of this points to an issue of great concern to researchers: the vagueness of the TCPA. And it raises a major question: how much due diligence should a company or researcher have to perform to ensure that the cell phone numbers on their list are in fact registered to the names on the list? It’s murky. A grey area. Undoubtedly, more litigation will have to occur before the question is answered definitively.

In the meantime, if TCPA compliance is at the forefront of your data collection, you should contact an MSG account manager. We have the ability to mitigate TCPA risk: we can identify wireless numbers for you, and we can offer identity verification to confirm called-party consent.

Until the TCPA is amended, clarified, or scrapped, the second golden rule always applies: “better safe than sorry.”

“Hope for the best, prepare for the worst”: salvaging the client list

You’ve probably heard the story before. It begins, “The study started with a client list….”

I can’t tell you how many times a client has called and told me that. The stories follow a pattern. The client says it’s a great list and you should be able to complete the study easily with it. Sounds great, right?

Here comes the plot twist. They forgot to tell you the list is 4 years old and hasn’t been touched since. Oh, and by the way, only 30% of the records have a phone or email address. Suddenly, easy street is filled with potholes.

This isn’t the end of the story, and it can have a happy ending. A sub-standard client list can be rescued with these investigative approaches and performance enhancements:

• Flag any cell phone numbers so they can be separated out and dialed manually, which also ensures TCPA compliance.

• Ask yourself: what is most important on their list? What is the key sampling element? Is it the individual (contact name)? If so, the file can be run against the National Change of Address (NCOA) database to see if the person has moved. If the person has moved, a search can be run for the new address. The next step is to identify the landline and (or) cellular telephone numbers associated with that individual at the new address.

• If location/address is the key element, check for the most up-to-date telephone numbers (either landline or cellular) and name associated with that address.

• Send the call list to a sample provider for verification. Does the information in your list match the sample provider’s database?

• If information doesn’t match, can you append a new phone number or email address?

• Do you still have open quotas? See if you can append demographics to target for open quotas.

• When you’ve exhausted all options on the client list and the study still isn’t completed, order an additional custom sample that meets the ultimate client’s specifications (or at least comes close). Then dedupe the client list from any custom sample orders; a minimal sketch of that dedupe step follows.
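For illustration, the dedupe step can be as simple as keying both files on a normalized phone number, as in the sketch below. The record layout and field names are hypothetical.

    # Dedupe a custom sample order against the client list, keyed on a
    # normalized phone number. Field names here are hypothetical.
    def normalize_phone(raw: str) -> str:
        """Keep the last 10 digits so formatting differences don't hide dupes."""
        return "".join(ch for ch in raw if ch.isdigit())[-10:]

    client_list = [{"name": "A. Smith", "phone": "(215) 555-0147"}]
    custom_sample = [{"name": "A. Smith", "phone": "215-555-0147"},
                     {"name": "B. Jones", "phone": "610-555-0199"}]

    already_on_list = {normalize_phone(rec["phone"]) for rec in client_list}
    deduped = [rec for rec in custom_sample
               if normalize_phone(rec["phone"]) not in already_on_list]

    print(deduped)  # only B. Jones remains; A. Smith was already on the list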

With the help of a good sample provider, even a subpar client list can be salvaged and the study brought to completion on time.

Using this software? Your data is NOT secure!

Almost every week I hear in the news about another data compromise, whether it’s a large corporation having its customers’ credit cards and personal information stolen, or even computer hacks rumored to be affecting the election. Personally, I find it very scary to think about my data and my family’s data getting into the wrong hands.

If I were managing a large database of panelists or other sensitive data on which my business relied, I am not sure I would sleep well at night knowing that there may be many potential access points through which this information could be exposed or accessed by someone else.

I was speaking with one of the world’s largest pharmaceutical companies recently and was told that use of the “brand x” survey platform is now 100% forbidden, with the potential of losing your job if you are caught using this off-the-shelf tool.

I think we all need to take a look in the mirror, evaluate the different software, applications, and tools we use in our everyday business and personal lives, and ask if there is a better, more secure way to manage these processes and our data.

In the business world, it is very important to involve IT and your SECURITY team when choosing a new critical software platform, to make sure that the tools you will use every day meet the security requirements of your business as well as the productivity requirements for your research needs. I know this adds time and most likely cost to the bottom line, but not as much as it would if your database wound up in the wrong hands.

On which side of the firewall do your database, survey data, and other critical research tools reside?

Sports Adventure in the UK

It’s been my dream for quite a while to visit the UK. I’ve been following tennis since I was a kid, as well as Premier League soccer for the past six years. I finally decided to bite the bullet and take a solo trip to London last week.

I took the red-eye out of Philly and landed at Heathrow Airport at 7:30 a.m. The first day was spent sightseeing and catching up on some sleep, since the passenger next to me had decided to use me as a shoulder rest the entire flight.

The following day was the highlight! Tottenham vs. Everton at White Hart Lane, a stadium almost 120 years old. I didn’t see much of the hooliganism you hear about, but rather a no-nonsense respect for the game. If there were 34,000 fans in the stadium that day, 32,000 were dressed in navy blue (Tottenham’s colors), with the visitors’ section in bright blue (Everton’s colors). I had purchased a rain jacket back in the States and didn’t realize until boarding the train to the match that it matched Everton’s bright blue jersey. The looks I was given were equivalent to someone dressed in Cowboys colors at an Eagles home game. I quickly removed my jacket and kept it rolled up until I could analyze the vibe at the stadium. I noticed a few other Tottenham supporters wearing alternate colors, so I put my jacket back on, and I’m glad to say there were no hassles whatsoever. I had wondered why fans were literally shoveling food down their throats outside the grounds; inside, no one, and I mean no one, was eating, drinking, or turning their eyes from the action until halftime.

The following day was spent touring Wimbledon and Shakespeare’s Globe Theatre. The most interesting tidbit of the entire trip came on the Wimbledon tour: when you entered the grounds, there stood a giant contraption that looked like a dough roller.

When The All England Club first opened, they wanted the grass to look immaculate and needed to raise money for a grass flattener, a.k.a. the giant dough roller. To raise the money, they decided to hold a tennis tournament, and since it was so profitable, they continued holding it from that point on. Talk about the cart before the horse!