County Level Cell Phone Only Estimates

Probability-based telephone surveys must use a dual-frame approach to capture the ever-increasing cell-phone-only population. Until the day comes when a single frame of only cellular numbers is enough, researchers need to ensure they get the appropriate blend of cell-only vs. dual-phone users in their sampling allocations.

Marketing Systems Group (MSG) produces quarterly estimates of cell-phone-only (CPO) rates for every county in the country[1]. These rates are based on telephone households (as opposed to occupied housing units) and lag one quarter. A cell-phone-only household is defined as one in which there is no operational landline telephone and cell phones are used exclusively to make and receive calls. The CPO estimates are derived from a combination of survey-based information and large commercial and administrative data sources. This triangulation approach allows us to create CPO estimates at the state and county levels. Moreover, the primary sources used to create these estimates are updated continuously, enabling us to refresh the CPO estimates each quarter.

MSG is currently the only source of county-level CPO estimates available. Having this granular level of information can benefit survey planning and weight computation/calibration at the sub-state level. We analyzed our CPO estimates at the county level and observed that many states have large differences in the CPO rate from county to county (see Figure 1). This could have a substantial impact on studies that infer a given county's CPO rate from the state-level estimate.

Figure 1:  CPO variations among counties (minimum vs. maximum) within a state compared to the state-level estimates.

One limitation of using administrative data in our triangulation methodology is that it does not account for the behavior patterns of dual-phone households. Dual-phone households are those that have both landline and cellular numbers but make and take calls only on their cell phones. That said, these estimates are the closest approximations available at the county level.

Whether for statistical or productivity reasons, rely on MSG's CPO estimates as a criterion for obtaining the appropriate mix of landline vs. cellular numbers in sub-state dual-frame RDD surveys and for weighting calculations.
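As a concrete illustration, a county-level CPO rate can drive the frame split directly. The sketch below is hypothetical Python, not MSG's methodology: the `dual_cell_share` design choice, the example rate, and the target size are all invented for the example.

```python
# Hypothetical illustration: splitting a dual-frame RDD sample target
# using a county-level cell-phone-only (CPO) rate. The rate, target,
# and dual-phone design share below are invented for the example.

def allocate_dual_frame(n_total: int, cpo_rate: float,
                        dual_cell_share: float = 0.5) -> dict:
    """Split a target sample between the cell and landline frames.

    cpo_rate: estimated share of telephone households that are cell-only.
    dual_cell_share: assumed share of dual-phone households to reach via
    the cell frame (a design choice, not an MSG recommendation).
    """
    cell_share = cpo_rate + (1 - cpo_rate) * dual_cell_share
    n_cell = round(n_total * cell_share)
    return {"cell": n_cell, "landline": n_total - n_cell}

# e.g., a county with a 62% CPO rate and an 800-complete target
print(allocate_dual_frame(800, 0.62))  # {'cell': 648, 'landline': 152}
```

Using a county's own CPO rate here, rather than the state-level figure, is exactly where large county-to-county variation (Figure 1) changes the allocation.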

[1] A total of 32 counties were combined with other counties or removed due to lack of information.

Access & Control – Just Like the Family Fridge

Complying with the new GDPR rules means giving panelists more access to their data

When I was growing up, just about everything my family or someone watching us needed to know would be affixed to our refrigerator with tape or magnets. This included a calendar of events, important phone numbers, report cards, receipts, images, to-do lists and more. The fridge was the central repository for upcoming events for our family.

If you wanted to see what was going on in our lives, first you needed to be invited into our home (or have a key to gain access). Only trusted friends, relatives or service providers could get in and see the refrigerator to learn what we were up to.

Just as access to the family fridge was limited, the European Union General Data Protection Regulation (GDPR) has been designed to enhance an individual’s control over their data and restrict outside access. Now, allow us to read your rights! You have the right to be informed when your data is being processed, the right to access your data and confirm its lawful processing. You have the right to be forgotten, the right to data portability, rectification, objection to direct marketing, restriction of processing of personal data, and safeguards against automated decision-making. One of the primary aims of GDPR is to give individuals control of their data, and organizations with access must comply with their demands.

In ARCS we have something a lot like that family refrigerator. We call it the Panelist Portal. This is the individualized home page within ARCS for each member of your participant panel. The Portal gives users control over their core data (along with the ability to update this stored data). Users can also opt out, and all of this can be done within a single system, supporting GDPR compliance.

Once someone is invited to join your participant database, they are given a unique “key” (e.g., a username and a configurable password that you control). This is the place where a panelist can make changes to their name and personalized password.

When my parents would go away, they would leave their itinerary and special instructions on the refrigerator. In the same way, you can post privacy policies, NDA agreements and other information that panel members might need to see.

Let’s say you have someone in your database who must “accept” your terms before being allowed to participate in your research studies.  You can provide the documentation, instructions, and mechanisms for them to read and acknowledge. This could be for the original acceptance or a change in terms that requires database members to acknowledge and confirm agreement with the new language.

Within the Panelist Portal, your database members have access to many important pieces of information about themselves, their history, and their upcoming research study schedule.  This information, referred to as participant data, is organized into two areas:

  • Core data. This includes items such as name, age, birthdate, address, email address, phone number, preferred contact method, household makeup, and more.
  • Attributes or custom data points. ARCS allows you to create, ask and track unlimited questions about particular panel members. You can then query on those custom attributes and data points. Some examples could be product usage, demographic information such as education, salary, marital status, and more.

The ability to view and update PII and sensitive data is critical to GDPR compliance. Using the Panelist Portal, your database members can access selected data fields and update these attributes themselves, as their product, brand and usage change over time. This will ensure that you have accurate and up to date information, which will help you invite and qualify the right panel members for your studies. This is also where your panel members can complete any necessary required paperwork (such as NDA forms). All of this information is date and time stamped as well as trackable.

All of the above capabilities are presented in one place, and just like the family fridge, the Panelist Portal provides centralized visibility, auditing and tracking.

By giving database members more control and visibility into their data, you will be compliant with the applicable GDPR requirements, protecting yourself and protecting your most important asset, the participants. With greater access and control, they are likely to feel more comfortable with your organization. This can then lead to referrals of additional family members and friends.

Breaking up shouldn’t be hard to do

Lastly, GDPR compliance asserts the participant’s right to be forgotten. They may ask that their data be wiped, either completely or partially. Your participant engagement process needs to: (a) permit such a request, (b) quickly respond to the request and (c) identify the user and types of data to be eliminated.
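The three requirements above (a–c) can be sketched in code. This is a hypothetical Python illustration, not a real system’s API: the data model, the category names, and the `process_erasure_request` helper are all invented for the example.

```python
# Hypothetical sketch of a GDPR "right to be forgotten" workflow:
# (a) accept the request, (b) process it promptly, (c) identify the
# panelist and the categories of data to erase. All names are invented.

from datetime import datetime, timezone

ERASABLE_CATEGORIES = {"core", "attributes", "participation_history"}

def process_erasure_request(panel, panelist_id, categories):
    """Wipe the requested data categories for one panelist record."""
    record = panel.get(panelist_id)
    if record is None:
        raise KeyError(f"No panelist with id {panelist_id}")
    unknown = set(categories) - ERASABLE_CATEGORIES
    if unknown:
        raise ValueError(f"Unknown data categories: {sorted(unknown)}")
    for category in categories:
        record[category] = None                  # wipe the requested data
    record["erasure_log"] = {                    # keep an auditable trace
        "categories": sorted(categories),
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }
    return record

panel = {42: {"core": {"name": "A. Panelist"},
              "attributes": {"salary": "50k"},
              "participation_history": [],
              "erasure_log": None}}
result = process_erasure_request(panel, 42, {"attributes"})
print(result["attributes"])  # None: partial erasure; core data retained
```

Note the erasure log: even a "forget me" action should leave a minimal, non-personal audit trace so you can demonstrate compliance later.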

Key Questions

What types of controls and tools does your participant engagement process have to handle these items? Do panel members constantly need to call your staff to update their information?  Would you like to have the visibility and controls to meet the ever-changing data protection needs your participants deserve and meet new regulations like GDPR?

Call our ARCS specialists today to discuss your unique research participant needs.  

Verifying Important Data Points

I trust my kids. No really, I do. My oldest is at the age where she spends a lot of time with her friends—getting picked up to go shopping, going out to eat, hanging out at parties, sleepovers—in all cases, she is not at home and not with me. Like I said, I do trust her, but for the sake of my own sanity (and of course her safety), I have taken the necessary steps to verify that she is doing what she says she is doing.

Yes, I take a quick look at the cell phone GPS while she is out, I text to make sure her location matches, and I call her friends’ parents to verify that she is spending the night. Not every time she goes out, but enough. Just so you don’t think I’m snooping on her unawares, my daughter understands perfectly well that I am verifying all these details. I want her to know, and it is good that she knows. Not that she is looking to intentionally deceive, but honestly, weren’t we all kids once? We know the script. Just knowing that someone is checking in—well, that keeps everyone honest. I hope it brings her a sense of security, too.

Data is no different, really. In particular, I mean data collected on panelists, participants or respondents.  No, of course you won’t be calling your panelists’ parents or tracking them on GPS, but certain information can and should be verified. Data verification is a key step to GDPR compliance, too.

Data verification sounds well and good, and GDPR expects it, but you might have lingering questions: Why does this data need to be verified? What information should I verify? What kind of verification should be performed with the collected data?

Why does data need to be verified?

First off, it’s a quality issue. You want quality data, reliable data, data you can use. Verification ensures that the information collected and/or stored is accurate. This is especially important because the data will no doubt be used for projections, direction or insights. Inaccurate data can have severe negative effects and possible cost implications.

Secondly, it’s a legal issue. Data needs to be verified because of laws, rules or regulations. For example, if data on minors is being stored, systems need to be in place to verify individuals’ ages or to confirm that parental/guardian consent has been obtained. Regulations such as GDPR have specific rules designed to ensure that accurate, verified data is stored in the database.

What information should I verify?

This is completely up to any internal quality control procedures in place for your operations.  Many organizations verify and update demographic data every time they come into contact with panelists in their database.  Others have a schedule of verifying data at regular intervals. Either method is acceptable, so long as the procedure is followed carefully. Some major demographic data points to verify are:

  • Gender
  • Age
  • Income
  • Education
  • Ethnicity

Actually, any data critical to the company or fielded projects should be verified.  Some verify geographic data, psychographic data and behavioral habits in addition to demographic attributes—all with the purpose of having the most accurate data available at their fingertips.

A word of caution on psychographic and behavioral attributes: purchase habits change quickly, loyalties shift, and specific usage can fluctuate greatly. Keep these points in mind when collecting or storing psychographic and behavioral data; its shelf life is short, so it needs to be verified more frequently.

What kind of verification should be performed on my data?

Just as there are multiple ways to check on the whereabouts of your children, there are multiple ways to verify panelists’ information. The software package that you use to store or collect data should have the tools readily available to verify any pertinent data. At MSG, we’ve built the ARCS panel management system to allow for custom-defined rules to perform data verification that matches your organizational procedures. Here are a few methods that a software package should feature:

  • Verify key data points before the panelist is entered into the database.
  • Define specific procedures as to exactly what data needs to be verified and how often.
  • Streamline the verification process by updating data each time a panelist is contacted or send specialized surveys to update needed information at regular intervals.
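An interval-based verification schedule like the one described above can be sketched as follows. This is an illustrative Python sketch under invented assumptions; the fields, shelf-life intervals, and dates are examples, not recommendations or ARCS behavior.

```python
# Illustrative sketch of interval-based verification: flag fields whose
# last verification is older than a chosen shelf life. The fields,
# intervals, and dates below are invented examples, not recommendations.

from datetime import date, timedelta

# demographic data ages slowly; behavioral data has a short shelf life
VERIFY_EVERY = {
    "gender": timedelta(days=365),
    "income": timedelta(days=180),
    "purchase_habits": timedelta(days=30),
}

def fields_due_for_verification(last_verified, today):
    """Return the fields whose verification interval has lapsed."""
    return sorted(
        field for field, verified_on in last_verified.items()
        if today - verified_on > VERIFY_EVERY.get(field, timedelta(days=90))
    )

last = {"gender": date(2017, 1, 1),
        "income": date(2018, 5, 1),
        "purchase_habits": date(2018, 5, 15)}
print(fields_due_for_verification(last, date(2018, 7, 1)))
# ['gender', 'purchase_habits']
```

Behavioral fields get the tightest interval here, reflecting the shelf-life caution above; anything not explicitly scheduled falls back to a default.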

I know you trust your panelists just like I trust my kids, but it doesn’t hurt that they know you are checking in on them. Verifying important data points is not only a wise move for your company, but it is increasingly necessary when it comes to government regulations like GDPR. They are expecting you to take care of this business anyway. It’s an all-around good idea.

Understanding Authentication & Authorization

Learn the Magic Words: Authentication & Authorization

“Open Sesame!” “Abracadabra!” In fairy tales, magic words like these are invoked before the treasure chest unlocks and doors open. In the software realm, we don’t call it “magic”, we call it “authentication” and “authorization”. Authentication is needed to protect access to precious data. Authorization tells you what you can do once you are inside.

My mom taught me the magic of access protection in my early childhood. As a kid, I wanted to grow up and shop on my own. So one fine day she let me go to the grocery store and pick up the items on her shopping list. I learned how the process worked:

  • She gave me the customer identification card.
  • She gave me the list of items she needed.
  • She also sent the list to the store and paid for it in advance.
  • When I handed over the customer identification card to the billing clerk, he confirmed that I was authenticated to purchase for my mom.
  • Her list (sent before I got there) authorized what I could purchase.

Today, in the research world, we come in contact with large groups of people who are providing us various data points about themselves, their families, likes, dislikes, medical information, etc. Every time I look at a data extract, I remind myself that it is very important to take all necessary precautions to protect this data. To ensure it is protected, you need to authenticate identity and authorize users who have access to the data.

The two terms are frequently used interchangeably in conversation, and they are certainly closely associated. Together they are key pieces in protecting data. But the two are really distinct concepts, often implemented entirely separately. Authentication is the process whereby an individual’s identity is confirmed. Authorization is the association of that identity with rights and permissions.

What is Authentication?

A user needs access to data. The software offers a challenge. Identify yourself, say the magic words, and you get in. The software not only needs to know who the user is, it needs to track the changes the user makes to the data.

Once you have an authentication mechanism built, you also want to start thinking of strengthening the mechanism. Some of the things to consider are:

  • Password policy – Which administrators are allowed to set strong policies for the password, and what policies (characters, numbers, case sensitivity) should be enforced?
  • Expiration Policy – How frequently should the password be changed?
  • SSO (Single Sign-On) – Should you integrate your systems with a single sign-on system, so users do not have to manage passwords for multiple applications?
  • Two-factor Authentication – Should you allow users to have another mechanism (SMS, Email) to receive a soft token needed, in addition to their username and password, to authenticate into the system?

Selecting the appropriate authentication policy and mechanism is a vital first step towards securing the privacy of your participants.
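As a rough illustration of the first two considerations, a configurable password policy check might look like the following Python sketch. The specific rules and thresholds are invented examples, not ARCS settings.

```python
# Minimal sketch of enforcing a configurable password policy
# (length, digits, mixed case). The rules are invented examples.

POLICY = {"min_length": 10, "require_digit": True, "require_mixed_case": True}

def password_violations(password: str, policy: dict = POLICY) -> list:
    """Return a list of human-readable policy violations (empty = OK)."""
    violations = []
    if len(password) < policy["min_length"]:
        violations.append(f"must be at least {policy['min_length']} characters")
    if policy["require_digit"] and not any(c.isdigit() for c in password):
        violations.append("must contain a digit")
    if policy["require_mixed_case"] and not (
        any(c.islower() for c in password) and any(c.isupper() for c in password)
    ):
        violations.append("must mix upper and lower case")
    return violations

print(password_violations("abc"))             # three violations
print(password_violations("Str0ngPassw0rd"))  # []
```

Returning the full list of violations, rather than failing on the first, lets the interface show the user everything to fix at once.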

What is Authorization?

Once “Abracadabra” opens the door (authentication grants a user access), Authorization lets you control what is available to the user within the framework (like my mother’s shopping list). Authorization is the function that enables security system administrators to specify user (operator) access rights and privileges. Specifically, administrators restrict the scope of activity by:

  • Giving access rights to groups or individuals for resources, data, or applications.
  • Defining what users can do with these resources.

Authorization is usually implemented through the following elements:

Privileges. Privileges grant access to specific operations. For instance, administrators have the privilege to create or disable other user accounts, while normal users will only be granted the privilege to change their own password and profile information.

Access Roles. A role-based authorization framework should let administrators govern what level of access an authenticated user has to the data. This is usually established by feature- or data-specific rights (read only, can edit, invisible) assigned to users of the system.
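A minimal sketch of such a role-based check, in hypothetical Python: the role names, features, and rights below are invented for illustration and are not ARCS configuration.

```python
# Hypothetical role-based authorization check: each role maps features
# to a right ("read", "edit", or None for invisible). Names are invented.

ROLES = {
    "administrator": {"accounts": "edit", "panel_data": "edit"},
    "panelist":      {"own_profile": "edit", "panel_data": None},
}

def can(role: str, feature: str, action: str) -> bool:
    """True if the role's right on the feature permits the action."""
    right = ROLES.get(role, {}).get(feature)
    if right is None:
        return False        # feature hidden, or role/feature unknown
    if action == "read":
        return True         # both "read" and "edit" rights allow reading
    return right == "edit"

print(can("panelist", "own_profile", "edit"))    # True
print(can("panelist", "panel_data", "read"))     # False (invisible)
print(can("administrator", "accounts", "edit"))  # True
```

Defaulting to "deny" for any unknown role or feature is the safe choice: an authorization gap should fail closed, not open.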

Under normal circumstances, an authenticated user is allowed to perform all operations they’re authorized to do. In my case, after showing the store card to the clerk, I could purchase items on the list (and only those items). However, when a user wishes to access an especially sensitive resource or operation, additional steps must be taken to authorize the request. For instance, when users want to make a payment, they will be asked to re-enter their credentials, essentially repeating the authentication process.

ARCS uses industry standard protocols to create a secure environment to store and protect your participants’ data. With simplified user management, you can focus on your business and let ARCS better manage and protect your participants.

Facing the regulatory challenge of Centralized Data

In the new world of the European Union’s General Data Protection Regulation (GDPR), organizations that process or store information on EU Data Subjects must comply with new uniform data privacy requirements. Did you know that GDPR effectively requires all the information you hold to be centrally managed and accounted for? In this article we focus on centralized data, the looming challenges we face, and the solutions to them.

Many research departments and companies struggle to keep all of their data in one central location. Beyond this challenge, there is the fact that sometimes the only way to accomplish project tasks is to use different software packages – each requiring specific and unique skills to manage them.

For example, companies might have information like panelist data residing in one software package, while the ability to collect data from those panelists is performed in a completely different package.

When you add the challenges of securely storing Personally Identifiable Information (PII), tracking participation, managing incentives and sharing data, an organization could be using three, four or five separate software platforms.

As if that weren’t problem enough, there is one more challenge – increasing industry and government regulations like GDPR are now requiring any information that you hold to be centralized. With all of the disparate software required to complete your projects, the challenges can seem daunting.

What steps can you take to mitigate some of these challenges?

Find a single software platform that covers most, if not all, requirements to meet not only industry and government regulations, but also to streamline internal processes. By unifying on one platform, you will save time, costs and resources related to a number of the challenges mentioned previously.

Arm yourself with the proper questions to ask any software provider. Here are some must-ask questions:

  • Is all of the collected data stored in a secure and centralized database?
  • Does the data have the ability to be searched and shared across different tasks? For instance, can data collected via a questionnaire automatically update in the specific panelist record?
  • Are all panelist data, PII, participation data, incentives and questionnaire responses stored in a central location?
  • Can the software track how data was collected or changed within the database?
  • Can the software produce information for auditing purposes to assure regulation compliance?

These are just a few of the many questions to be asked when evaluating a software platform. Contact a representative at Marketing Systems Group, and we can partner to identify the specific areas that most need shoring up in your organization.

Our team can also assist in formulating the many questions to investigate while searching for the best software platform for your company. Let the experts at Marketing Systems Group help you navigate the difficult and ever-changing regulation landscape.

Finding Your Niche: Specialization and the Future of Market Research Firms

Trend watchers no doubt have noticed that traditional players in the market research space are having an increasingly hard time keeping up and scaling their business practices with changing times. In a recent article, analysts reported a flat growth rate of 2% in the market research industry. This is not good, especially when you factor in inflation.

Perhaps the highest impact change involves the digitally-savvy consumer who demands more data-driven analysis and quicker results. Traditional market research firms can be slow to adapt. Are there ways for the old dog to learn some new tricks that can help to evolve a firm and get it moving towards positive growth?

Sure. Some experts suggest that specialization might be the answer.

Think about the niche your firm’s business fits best. What is it that you do uniquely in terms of data delivery or technology? The firms bucking the flat growth trendline are the ones offering expertise within a domain along with strong data inventories that set them apart.

So what do we really mean by specialization? In a market research context, it might mean improvements with infrastructure, speed, size, creative solutions, or methodology.

Identify one or two core competencies and focus on them. This is your niche, the “special sauce,” the unique value your firm adds and the substance you explain to prospective clients: here is why they should hire you. The goal is to become known for something. This is the niche, your “word on the street” reputation.

Fully embrace new technologies. Data-hungry, digitally-savvy clients are looking for speed and quality results. They expect all decisions to be backstopped with data and research. Evidence-based criteria are king. Is your firm moving beyond traditional data collection and analysis? Are you collecting user “experiences” and offering support for measuring impact and taking action? Consider investments in services geared towards high-end analytics and customized tools that deliver data and analysis quicker than ever.

Alongside the new technology, realize the importance of expert advice. Guidance is where your domain expertise comes into play, and it shouldn’t be underestimated as a selling point. Odds are that clients have already researched the vendor space before they ever pick up the phone and probably have a sense of what your expertise is already (if not, take the opportunity to inform them). This means your main objective should be assuring them that yes, you have the expert know-how to serve the client’s needs.

For client needs outside your “competency zone”, consider turning to acquisitions and partnerships.

Acquisitions. If you can’t beat them, buy them. This can work for a large firm, but medium-sized firms often struggle to compete because they may not have the capital to acquire niche firms, relying instead on in-house research rather than partnerships.

Strategic partnerships. Competition isn’t just occurring at the level of operations, services, and pricing. Can you perform the turnaround faster? Sometimes the only way to meet those goals is to establish strategic partnerships in any number of areas like setup, automation, or infrastructure, to name a few.

To recap, the specialized market research firm becomes great at one or two core competencies, is willing to embrace new technology, makes acquisitions and forms strategic partnerships when needed. Flexibility and agility are key.

Keep in mind that being nimble is no guarantee of success, however. Being agile means you are primarily in reactive mode. Why not get proactive? The growth-based firms are the ones who are innovating. Prepare your firm for innovation by incubating your core competencies, which should lead to a creative problem-solving approach. Think more about the kinds of problems you can solve for your clients and less about the categorical label. Remember, you are in the solutions business.

What to Expect from the New European Data Protection Regulations

D-Day is coming to Europe next spring, and no, we’re not talking about World War II. For us in the here and now, the “D” in D-Day stands for Data. In May 2018, new data protection regulations will take effect in the EU, and the impact on businesses and consumers will be enormous.

The EU’s data protection rules, which the GDPR replaces, have not been updated since 1995. More than twenty years is a lifetime in the world of technology. So much has changed. Most of our lives are intertwined with technology now, and our digital alter-ego (data profiles of who we are and what we do) is living somewhere in the cloud, traveling the earth in milliseconds. Amid the looming chaos of privacy exploitation and hacking, consumers are justifiably concerned and doubtful. Does privacy even exist anymore? Is anything secure? People want their privacy protected. They want to trust that their data is secure, but at the same time they want the convenience of personalized consumption and instant access. It’s a tough balance to strike.

Fundamentally, the goals of GDPR are to reassert individual privacy rights, foster a more robust EU internal market, strengthen law enforcement, streamline international transfers of personal data, and unify data protection standards across the EU. The new data protection regulations will consist of a two-part implementation. The first part is the General Data Protection Regulation itself, the new rules. The second part involves the enforcement arm, a Data Protection Directive for police and criminal justice entities.

According to public information released by the European Commission, we are going to see some interesting outcomes from GDPR.

Privacy rights make a comeback. The privacy regulation aims to improve individuals’ right to virtually “be forgotten.” When they don’t want their data held anymore, it must be deleted (with exceptions: data may be retained for contractual or legal compliance reasons until no longer needed). Individuals’ access to their personal data will be easier to obtain. They will have a right to port their data between different providers and the right to be notified when their data has been breached. In addition, companies must inform the authorities of which accounts were hacked in a timely fashion.

The European Commission says “data protection by design and by default” will become the norm. Products and services must be safeguarded via built-in data protection. Privacy will become the primary focus and could lead to new business innovations. This includes new techniques for data encryption, removing personal data identification from data sets, and replacing PII fields in data records with artificial identifiers. All of these could restore trust between individuals and companies holding their data by limiting exposure.

Costs. Yes, it will require investment to upgrade apps and services, but the tangible and intangible payoffs of compliance with the new regulations are real.  According to estimates, Europeans’ personal data value could be worth upwards of €1 trillion by 2020. With stronger data protection regulations in place, opportunities will grow.

Streamlined regulations. There are currently 28 separate national laws on data protection that are incoherent and unwieldy. The plan is to consolidate these 28 individual laws into one. Estimated savings for companies and organizations could be as much as €2.3 billion per year. After the new data protection regulations take effect, companies will deal with one single supervisory authority only, making it easier to do business in the EU. This will level the playing field by applying the same rules to all companies – regardless of size or location. Companies outside of Europe must follow the same rules when doing business in the EU.

Negative reinforcement: Be prepared or pay up! The EU is expecting merchants to be more responsible for protecting customer data. Those who experience data breaches will face severe sanctions. Beginning May 25, 2018, the EU will impose heavy fines levied as a percentage of revenue on companies violating the GDPR rules.  Smaller companies doing business in the EU may be unaware how soon these regulations are coming online. Liability is an obvious concern, so active steps to achieve compliance must be taken. Products and infrastructure must be reviewed and updated. A sustainable cyber security program must be in place. The cost of compliance must be accounted for, and the ROI should initially be measured against the preparedness and protection from fines and liability. In the long term, as mentioned above, the new regulations could result in a more level playing field and increased business opportunities.

Organizations should act now:

  • Review and analyze the GDPR. Seek advice. Leave nothing to chance. Learn the precise meaning of “personal data”.
  • Update your documentation for personal information and security practices. Update policies and procedures for breaches, incident reports and risk assessments. Review all relevant contract and agreement language.
  • Figure out how to best mitigate risks of noncompliance.

For more information about the European Union General Data Protection Regulations (GDPR) check out this European Commission website, with press releases, questions and answers, factsheets, legislative texts, the current legal framework, and public opinion surveys.

Quality Starts with Survey Design: Tips for Better Surveys

Marketing researchers are all facing two important challenges to data quality. First is the question of representativeness: with response rates plummeting, we need to make surveys shorter, more engaging, and easier for respondents to complete. Second is the issue of data accuracy: we must make sure that survey questions measure what we think they measure.

If you want to make surveys more accurate and representative, it all comes down to survey design. You need to think carefully about the survey design and how quality is expressed and impacted throughout all phases of the research project. When you get in the habit of thinking about quality at all phases of a study—from design to implementation to analysis of results and feedback—the payoff will be clear.

First Steps First

It sounds obvious, but the first quality check is to take your survey. Clients, researchers, analysts—everybody on board should complete the survey. Be sure to ask some people who are not familiar with the project to complete it as well. How does it feel to be on the other side? Talk through the questionnaire as a group. Look for areas that need more focus or clarification. Seek recommendations to improve the survey. Encourage the group to advocate for changes and explain why the changes are important. And be sure to use a variety of devices and operating systems to understand how the survey performs in different situations.

Conduct a pretest to get feedback from respondents. You don’t have to complete many pretest surveys, but you should have at least a few “real,” qualified respondents complete the survey. Additionally, they should answer questions about the survey’s design properties, ease of use, and any other issues they had with the survey. By all means, use survey engagement tools when feasible, but don’t fall into the trap of letting the droids rule the analysis. You need the human touch from real respondents, as well. (Don’t forget to clear the pretest respondents before you fully launch the survey, or filter their responses out of the final results.)

Use technology to test for data quality. A computer application is great at metrics, scoring and summarizing responses. It can measure survey engagement by tracking rates of abandonment and speeding and measure experience quality via respondent ratings. The average length of time to complete the survey is also a key metric. Use technology as a predictive tool before launching the survey to evaluate engagement levels and suggest improvements.
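As a rough sketch of the metrics described above, abandonment and speeder rates can be computed from per-respondent timing data. The record fields and the speeding cutoff here are assumptions for illustration, not a standard:

```python
# Sketch: basic engagement metrics from per-respondent records.
# The field names ("completed", "seconds") and the speeding cutoff
# (one-third of the median completion time) are hypothetical choices.

def engagement_metrics(responses, speed_cutoff_ratio=0.33):
    """Return abandonment rate, speeder rate, and median completion time.

    `responses` is a list of dicts like {"completed": bool, "seconds": float}.
    A "speeder" is a completed response faster than speed_cutoff_ratio
    times the median completion time.
    """
    completed = [r for r in responses if r["completed"]]
    abandonment_rate = 1 - len(completed) / len(responses)

    times = sorted(r["seconds"] for r in completed)
    median = times[len(times) // 2]
    speeders = [r for r in completed if r["seconds"] < speed_cutoff_ratio * median]
    speeder_rate = len(speeders) / len(completed)
    return abandonment_rate, speeder_rate, median
```

Run against pretest data, these numbers give you a baseline to compare against once the survey is live.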

Frequent Challenges

As you get in the habit of performing quality checks, be on the lookout for these common issues; addressing them will improve your survey design:

Is the survey user-friendly?

  • Beware of “survey fatigue.” Split long surveys into many short pages.
  • Make survey language more consumer-friendly and conversational and less “research-y.”
  • Make sure the language used on buttons and error messages matches the survey language.
  • Validate questions for the correct data type and tie validation to relevant error messaging that tells how to fix the response.
  • Use a progress indicator to show how far the respondent is from completion.
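The validation point above can be made concrete with a small sketch. This is a generic illustration, not any particular survey platform’s API; the question types and error messages are invented:

```python
# Sketch: validate an answer's data type and return actionable feedback.
# The type names and messages are hypothetical placeholders.

def validate_answer(answer, expected_type):
    """Return (is_valid, message); the message tells how to fix the response."""
    if expected_type == "integer":
        if answer.strip().isdigit():
            return True, ""
        return False, "Please enter a whole number, e.g. 3."
    if expected_type == "email":
        if "@" in answer and "." in answer.split("@")[-1]:
            return True, ""
        return False, "Please enter a valid email address, e.g. name@example.com."
    return True, ""
```

The point is that every failed validation returns a message a respondent can act on, rather than a bare “invalid input” error.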

Does the survey flow?

  • Improve the logical flow of the questions and watch out for redundancies.
  • Make sure the question type matches what you are looking for. Close-ended questions are ideal for analysis and filtering purposes.
  • Test your logical paths. When designing page skips, you don’t want unexpected branching to happen.
  • Use required questions for “must get” answers, so the respondent can’t move on without completing them. Be careful about making too many questions required, however, as respondents can become frustrated and break off before completing the survey.
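One lightweight way to test logical paths before launch is to encode skip logic as data and assert the expected routes. A minimal sketch, with made-up question IDs:

```python
# Sketch: skip logic as a routing table, so branching can be unit-tested.
# Question IDs (Q1, Q2, ...) and answers are hypothetical.

SKIP_LOGIC = {
    # (question, answer) -> next question; None means end of survey
    ("Q1", "yes"): "Q2",
    ("Q1", "no"): "Q4",
    ("Q2", None): "Q3",   # a None answer key is the default next question
    ("Q3", None): "Q4",
    ("Q4", None): None,
}

def route(answers, start="Q1"):
    """Return the ordered list of questions a respondent would see."""
    path, q = [], start
    while q is not None:
        path.append(q)
        q = SKIP_LOGIC.get((q, answers.get(q)), SKIP_LOGIC.get((q, None)))
    return path
```

With the paths expressed this way, every branch can be asserted in advance, so unexpected branching shows up before respondents ever see it.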

Is your survey design mobile-capable? Although 40% of all survey responses are completed on a mobile device, a recent study reported that half of surveys are not mobile-capable, much less mobile-optimized. Design your survey to work on mobile devices:

  • Make sure that short page questions fit on the screen.
  • Minimize scrolling whenever possible.
  • Check for comment box sizing problems and row-width for matrix question labels.

Remember, quality control should never be an afterthought; you must have an established quality control process for surveys. This process must specify the quality review responsibilities of each survey reviewer. One or more team members should be responsible for evaluating respondent-level data. The quality control process should review the survey design end-to-end to focus on maximizing both technological efficiency and respondent experience for optimal data quality.

Reap the Rewards: Finding the Right Incentive Mix for Your Panelists

Pretty much everyone in the survey business understands the value of a satisfied panel. We want our surveys to be well-received and satisfying. We want our panelists to be engaged, and when we invite them again, we want them to participate eagerly.

To achieve these goals, you must work to build loyalty among your panelists. What does loyalty mean in this context? A panelist should think of your panel as their panel. They belong there, and it’s a place they will want to revisit.

One tried-and-true method for building loyalty is offering incentives, also known as rewards. An incentive reinforces positive behaviors and reminds panelists of who your brand is and why it is worthy of their loyalty.

A panelist who is kept happy will in large measure be a loyal one. Here again, incentives can play a major role in building good will. When you reward respondents, you not only offer them something of value, you are letting them know that you value them.

In the abstract, an incentive program should contribute to the growth of

  • Acquisition
  • Participation frequency
  • Retention of participants

When we reward panelists for good behavior, the happy (thus loyal) panelists are much more likely to share their positive experience with their friends. In this way retention (satisfied panelists) can feed back into acquisition (new participants).

Let’s briefly examine how incentives can be structured to address these aims.

A reward that isn’t worthwhile to the participant isn’t worth much.

The value of the reward should be paired with two factors: time invested by the participant and the level of complexity of the tasks you ask them to complete.

Beware of offering too lavish a reward.

This can trigger fraudulent actions, as in “I’ll say or do anything to get the prize.” An incentive program should NEVER compromise the integrity of the research.

Watch out for the redundancy problem.

Offering the same reward again and again can have a negative impact: participant boredom leading to lack of engagement.

Weigh the benefits of adding diverse incentives.

Are there ways to tailor or customize the panel experience? Is your panel management system able to accommodate changes to the incentive package over time, as needs change?

You might, for instance, design a tiered system for qualifying and non-qualifying participants. Why should non-qualifiers be rewarded with a token gift too? Because today’s non-qualifier could be tomorrow’s qualifying participant. Retention is the name of the game. With typical conversion rates tending toward the low end of 10% to 15%, rewarding non-qualifiers helps you discourage gaming of the system and incentivize honest repeat participation in the next survey.

Be flexible.

A good incentive program will have some flexibility built-in, such as tiered rewards that trigger at different levels depending on specified factors. The levels could consist of gift cards, merchandise, PayPal payments, charitable donations, games, and other exclusive benefits. The key is to match the reward with the panelist. One size does not fit all.
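As an illustration, a tiered trigger can be as simple as a threshold table. The point levels and reward names below are invented; a real program would tune both to its own economics:

```python
# Sketch: pick a reward tier from accumulated points.
# Thresholds and reward names are hypothetical examples.

TIERS = [
    (500, "exclusive benefit"),
    (250, "gift card"),
    (100, "merchandise"),
    (0, "charitable donation"),
]

def reward_for(points):
    """Return the highest tier whose threshold the panelist has reached."""
    for threshold, reward in TIERS:
        if points >= threshold:
            return reward
    return None
```

Because the tiers live in data rather than code, the mix can be adjusted per program or per demographic without rewriting the logic.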

Consider delivering digital rewards by email.

Digital rewards have a couple of advantages: (1) the recipient gets immediate satisfaction (they can redeem it right away) and (2) you reduce overhead for inventory and fulfillment management.

Weigh the costs and benefits.

Tiered rewards can add cost but they really help to cement the bond to your most loyal panelists. Points-based rewards are a popular approach that can be cheaper than cash rewards.

Give them a choice.

Using the idea of “Reverse Preference”, you offer the panelist a choice of reward type other than the default option, and you might use this as a motivational factor for targeting a particular demographic.

Can your technology handle what you need to do? 

You want the system to accommodate multiple projects and programs across different demographics at the same time, each with its own custom incentive approach. An integrated application programming interface (API) can deliver rewards automatically. Fast incentive fulfillment not only increases efficiency, it keeps panelists happier. Make sure your panel management system is robust enough to handle the granularity of analytics you need and adaptable enough as needs change.

Measuring the Results.

The key to improving an incentive program is to test and adjust.

You should always be tracking and measuring respondent satisfaction, which can be gauged via satisfaction surveys, social media feedback, and helpdesk availability.

Doing this will show panelists that you are there for them, are interested in their feedback, and are willing to act to improve their experience with each iteration.

Measurement is necessary for another reason. To gain approval for an incentive program, you will need to demonstrate to management that you have the metrics to show a clear return on investment. Plan to show them the positive feedback loops between completion rates and satisfaction metrics.

With these considerations in mind, you can expect an improved rewards system that boosts acquisition rates, leads to greater participation, and secures higher retention rates.

Smart Survey Design: 3 Forgotten Pain Points to Avoid

“Smart Survey Design” is a loose term (bordering on a catch-all) that you’ve probably heard pitched to you.  Maybe you have used it yourself when piecing together a study.

It’s not a hollow term, by any means. Smart design has advantages for both designers and respondents. Designing “smart” simply means maintaining data integrity, both by capturing statistically relevant data and by reducing the amount of bad data caused by poor survey takers (straight-liners, short responders on open ends, speeders, cheaters, etc.).

That’s the basic idea, but there is one factor that often gets forgotten or ignored in a “Smart” design:  the respondent’s experience. You want your respondents to have a positive user experience, surveys with a human touch. They should feel good about taking the survey.

I’m not just talking about survey length or incentive, though those are certainly key tools in addressing the problem. What I am referring to is the very way we talk to the respondent: the questions we ask and how many times we ask them.

It is easy for us as researchers to become so lost in our need for quality data that we forget the source of it—human beings. People are rational and emotional creatures. How do they feel about their participation?  It’s an important consideration, all too often ignored.

Identifying and avoiding potential pain points may not only help to reduce the number of scrubs and drop-outs, but also deliver better, more reliable data.

Pain point #1: Repetition

Have you ever been on a conference call where the speaker repeats the same point five times? Did you like it? Did you continue to pay attention, or did you look at your phone or check your email? Now imagine that same conference call, but the speaker drones on with four more points that are roughly a hair’s width different from the original ones. Frustrating!

Plenty of studies get too repetitive in hopes of garnering nominal, ordinal, interval, and ratio data just to present the client with four different charts. But ask yourself: how reliable are the opinions offered by a respondent you have just bored and/or annoyed?

Some repetition may be unavoidable, especially when you want to determine which of a group of stimuli is most attractive to your target, but you should not bludgeon the people who are meant to be helping you.

Pain point #2: Being too clever

“If you could be a tree, what tree would you be and why?”

This may be a good opener for your therapist to explore the workings and motivations of your mind, but some respondents may find such questions intrusive or something worse: “hogwash.”  They have signed up to take part in survey research, but they’re not lab rats!

We come back to the reliability question: how reliable is the data you are gathering if your respondent has been made uncomfortable and just wants to finish the ordeal and get out?

The prospect of getting “deeper data” out of your survey may be very alluring, but consider how appropriate those questions are for your audience.  Does a panelist really need to imagine their favorite restaurant as a spirit animal in order to tell you what their favorite sandwich is?

Pain point #3: Being too “research-y”

While gathering data, or even when trying to cut the length of the interview out of consideration for respondents, questions might be presented impersonally or curtly. These rapid-fire “cold” questions, though absolutely focused, clear and concise, run the risk of boring a respondent into unintentional mental lethargy.

Quality-check questions can eliminate responders who have lost interest from your data set, but wouldn’t it be more beneficial to prevent the need for them in the first place?  You don’t have to write a narrative or tell a knock-knock joke to keep respondents engaged with the process.  Panelists are people. Just remember to “speak” to them conversationally, instead of clinically prompting and probing for responses.

By being more aware of the respondent’s pain points and making a few tweaks to your surveys, you can improve completion rates, quality of open-ended responses and data integrity.  Better yet, it does all this without incurring any additional costs.