Featured

Defining Small Area Geographies with Radius Sampling

In this blog we look at different ways to define small area geographies. How can we define a trade area or location using radius sampling (defining a geography around a particular point at different size ranges)? It sounds simple enough. You pick a point and draw a circle around the location. Done, right? Not exactly. The issue is how you identify geography within the circle. How precisely can we get at those addresses?

You can look at census blocks or block groups, or you can look at zip codes. It all depends on how large the radius is. Typically, we use census blocks for 5-, 8-, and 10-mile radii, and because the blocks tend to fit well around the edges, we only need to do some minor carving.

With smaller radii, though, it becomes more of a challenge. Census blocks or block groups tend to extend well beyond the circle, or we get under coverage because not enough of the blocks or block groups fit inside it.

In episode 7 of our Coffee Quip video series, subject matter experts David Malarek (Senior Vice President, Sampling & Database Services) and Dennis Dalbey (Manager, Geodemographic Services) demonstrate the two basic ways to define radius geography using census blocks within a 10-mile radius: area inclusion using polygons, and block centroids.

For example, we can start with a 10-mile radius around a point and then overlay census blocks (or block groups). The overlay shows every block group that intersects or otherwise relates to the 10-mile radius. Where the color overlay falls outside the radius, we can decide which block groups to keep in or out of the sample frame.

The Polygonal Approach

By using polygon geometry, we can apply area inclusion. We can make a cut on the percentage of the polygon area that falls within the radius. For instance, any block group with at least 50% of its geographic area within the radius can be selected. When this is mapped, you will find holes along the edges of the circle where geography once existed but was cut out because the inclusion was under 50%. Note that no matter how you carve the fringe, there will always be some overstatement of the sample frame or some under coverage, depending on how well the block group geography fits the radius.
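To make the mechanics concrete, here is a minimal sketch of the area-inclusion rule. The function names are ours and hypothetical, planar projected coordinates are assumed, and the overlap share is estimated by Monte Carlo sampling rather than the exact polygon clipping a GIS package would perform:

```python
import random
from math import hypot

def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) inside the polygon (list of (x, y) vertices)?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def fraction_inside_circle(poly, center, radius, n_samples=20000, seed=1):
    """Monte Carlo estimate of the share of the polygon's area that falls
    within the circle. Samples the polygon's bounding box and counts hits."""
    rng = random.Random(seed)
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    hits_poly = hits_both = 0
    for _ in range(n_samples):
        x = rng.uniform(min(xs), max(xs))
        y = rng.uniform(min(ys), max(ys))
        if point_in_polygon(x, y, poly):
            hits_poly += 1
            if hypot(x - center[0], y - center[1]) <= radius:
                hits_both += 1
    return hits_both / hits_poly if hits_poly else 0.0

def include_block(poly, center, radius, cutoff=0.5):
    """Area-inclusion rule: keep the block if >= cutoff of its area is inside."""
    return fraction_inside_circle(poly, center, radius) >= cutoff
```

The 50% cutoff is just a parameter; tightening it to, say, 0.75 shrinks overstatement at the cost of more under coverage along the fringe.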

The Block Centroid Approach

Another way to do this is to use the center point of a polygon to assign geography to a radius. It’s a different type of geometry being put to use. Instead of using polygons where we make an area inclusion, we use what’s considered the center of the polygon—the census block centroid. This method allows us to select blocks that are theoretically 50 percent or more within the radius without applying an inclusion cut like we did with the polygons earlier. It is a more efficient way of doing it, but keep in mind that we are making a 50 percent cut across the board. You will encounter tradeoffs here, too, such as introducing under coverage or over coverage.
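A sketch of the centroid rule, under the same assumptions (hypothetical helper names, planar coordinates; in practice one would typically use the Census Bureau’s published internal points rather than computing centroids from scratch):

```python
from math import hypot

def polygon_centroid(poly):
    """Centroid of a simple polygon via the standard shoelace-based formula."""
    a = cx = cy = 0.0
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        cross = x1 * y2 - x2 * y1
        a += cross
        cx += (x1 + x2) * cross
        cy += (y1 + y2) * cross
    a *= 0.5
    return cx / (6 * a), cy / (6 * a)

def blocks_in_radius(blocks, center, radius):
    """Centroid rule: keep every block whose centroid lies within the radius.
    `blocks` maps block_id -> polygon (list of (x, y) vertices)."""
    kept = []
    for block_id, poly in blocks.items():
        cx, cy = polygon_centroid(poly)
        if hypot(cx - center[0], cy - center[1]) <= radius:
            kept.append(block_id)
    return kept
```

Note the efficiency gain: a single distance comparison per block replaces the area-overlap computation entirely.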

One of the benefits of using the block centroid is that we can vary the distance. Let’s say we are not really sure whether the 10-mile radius is going to meet the quota. We may want to overstate the geography at 20 miles. With the block centroid we can apply the distance and go from 10 to 15 to 20 miles until we meet the population or household quota. Note that this can be applied to polygons as well, but it is much easier to do with the block centroid.
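The stepped expansion can be sketched the same way. This is an illustrative loop with hypothetical names, not production logic; the per-block household counts would come from census data:

```python
from math import hypot

def expand_to_quota(blocks, center, radii, quota):
    """Try successively larger radii (e.g. 10, 15, 20 miles) until the
    households in the selected blocks meet the quota.
    `blocks` maps block_id -> ((centroid_x, centroid_y), household_count)."""
    for r in radii:
        kept = [bid for bid, ((cx, cy), hh) in blocks.items()
                if hypot(cx - center[0], cy - center[1]) <= r]
        total = sum(blocks[bid][1] for bid in kept)
        if total >= quota:
            return r, kept, total
    return None  # quota not met even at the largest radius
```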

The Address Level Approach

Yet another way to do this is to plot all the known addresses that fall within the blocks or block groups touching the circle and exclude any address that falls outside the circle. This is the most precise way of obtaining household counts, as it eliminates both under and over coverage. It’s a two-step process and a more involved methodology, but it is also the most accurate. One downside to this approach is that demographic data are only available at larger aggregations of geography, such as the block group, and not for individual addresses.
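Here is a sketch of the second step, filtering geocoded addresses against the circle. The field names are hypothetical; the haversine formula is one common choice for point-to-point distance on the earth’s surface:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MI = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MI * asin(sqrt(a))

def addresses_in_radius(addresses, center, radius_mi):
    """Keep only the geocoded addresses within the radius of the center point.
    `addresses` is a list of dicts with (at least) 'lat' and 'lon' keys."""
    lat0, lon0 = center
    return [addr for addr in addresses
            if haversine_miles(addr["lat"], addr["lon"], lat0, lon0) <= radius_mi]
```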

Dave and Dennis compare the polygon and block centroid methods and explain why they sometimes yield different results. Sometimes we actually need to plot ABS address locations in the blocks within the radius and remove the ones outside the radius, so we get a better fit for the overall target population. This is really the best methodology for making sure that households are completely within a radius, without having to worry so much about the regular geography, where part of it is in and part is out.

Find more details and visualizations of these methods in our Coffee Quip #7 video, the first in a series of geodemographic Coffee Quips. For additional information on our geodemographic services, click here.

Hybrid Sampling: Why a Blended Sampling Approach Is a Sensible Option

In an ideal survey research world, it is preferable to work with a single probability-based sample, as it provides the best representation of the target population. In the real world, however, cost and feasibility often prohibit the luxury of using purely probability-based samples. This is where different sampling methods come into play to reduce cost and improve feasibility, especially those that rely on online panels. Online sampling alone isn’t ideal, though, since such samples are devoid of “organic” representation. If you can’t get generalizable results from your surveys, then what’s the point?

A blended (hybrid) sampling approach can offer an effective and practical alternative, through which multiple frames are used for sample selection—oftentimes a combination of probability-based and convenience samples from online (opt-in) panels. Further, we might start with a fully probability-based sample from a telephone or address frame, but then tap into online panels to supplement what we get from the main probability sample.

Taking a hybrid sampling approach sounds all well and good, but going hybrid doesn’t necessarily equate to unbiased survey results. Sampling from online panels is always a little tricky: if you don’t know what you’re doing, you can take a seemingly inexpensive sample component, mix it with your precious probability-based sample, and end up with a poor combination.

Sure, theoretically it’s preferable to have all or most of the sample be probability-based, but probability samples are expensive. At the same time, you don’t want samples from opt-in panels dwarfing your probability-based sample. As a general rule of thumb, no more than about 50% of your sample should come from opt-in panels, though budget and other factors may dictate a higher or lower contribution.

The selection of samples from opt-in panels needs to be carried out sensibly. Equally important is the way you blend the probability-based and nonprobability-based sample components to produce a single database capable of supporting reliable conclusions. It’s a bit like metallurgy, where different materials are tossed into the mix to produce an alloy with superior properties; you have to be measured about it and get the ratios just right using correct weighting and calibration adjustments.
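As a purely hypothetical illustration of ratio control (not a description of MSG’s actual blending procedure), one simple approach is to rescale the opt-in component’s weights so it contributes no more than a chosen share of the combined weighted total:

```python
def blend_weights(prob_weights, optin_weights, optin_share_cap=0.5):
    """Rescale opt-in panel weights so that component contributes at most
    `optin_share_cap` of the combined weighted sample. The probability-based
    weights are left untouched. Returns the adjusted opt-in weights."""
    prob_total = sum(prob_weights)
    optin_total = sum(optin_weights)
    if optin_total == 0:
        return list(optin_weights)
    current_share = optin_total / (prob_total + optin_total)
    if current_share <= optin_share_cap:
        return list(optin_weights)  # already under the cap: no change needed
    # Solve for f in: f*optin_total / (prob_total + f*optin_total) = cap
    f = optin_share_cap * prob_total / ((1 - optin_share_cap) * optin_total)
    return [w * f for w in optin_weights]
```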

As response rates continue to decline into single-digit territory, even for fully probability-based samples, geodemographic weighting of survey data becomes essential, because nonrespondents systematically differ from respondents. The issue is magnified with hybrid sampling, where part of the sample may come from opt-in panels. Hence, in addition to basic weighting, further calibration adjustments become necessary: going beyond geodemographics and applying corrections based on attitudinal and behavioral characteristics to ensure respondents represent their population.
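One widely used calibration technique is raking (iterative proportional fitting), which repeatedly adjusts weights until the weighted margins match population targets on each calibration variable. A bare-bones sketch with hypothetical inputs, not MSG’s production procedure:

```python
def rake(weights, categories, targets, n_iter=50, tol=1e-6):
    """Iterative proportional fitting (raking).
    `categories[var]` lists each respondent's category on that variable;
    `targets[var][cat]` is the population proportion for that category.
    Returns weights whose margins match the targets (if raking converges)."""
    w = list(weights)
    for _ in range(n_iter):
        max_change = 0.0
        for var, cats in categories.items():
            total = sum(w)
            # Current weighted total per category on this variable.
            cur = {}
            for wi, c in zip(w, cats):
                cur[c] = cur.get(c, 0.0) + wi
            factors = {c: targets[var][c] * total / cur[c] for c in cur}
            for i, c in enumerate(cats):
                new = w[i] * factors[c]
                max_change = max(max_change, abs(new - w[i]))
                w[i] = new
        if max_change < tol:
            break
    return w
```

Attitudinal or behavioral corrections of the kind described above amount to adding those characteristics as extra calibration variables alongside the geodemographics.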

If you are looking to enhance your phone or address-based surveys and supplement them with samples from online panels, the survey research scientists at MSG have decades of know-how and hands-on experience to support your hybrid sampling methods. Our experts can assist you with sample selection, survey administration and questionnaire design, as well as state-of-the-art weighting and calibration procedures. Additionally, we can support you with reporting and analysis of data from complex surveys.

To learn more about our hybrid sampling products and services, click here, or contact one of our specialists.

For a deeper dive, watch Episode 06 of our Coffee Quip YouTube series, wherein the panelists discuss the intricacies and benefits of Hybrid Sampling!

Smart Survey Design: 3 Forgotten Pain Points to Avoid

“Smart Survey Design” is a loose term (bordering on a catch-all) that you’ve probably heard pitched. Maybe you have used it yourself when piecing together a study.

It’s not a hollow term, by any means. Smart design has advantages for both designers and respondents. Designing “smart” simply means maintaining data integrity: capturing statistically relevant data while reducing the amount of bad data caused by poor survey takers (straight-liners, short open-end responders, speeders, cheaters, etc.). Continue reading “Smart Survey Design: 3 Forgotten Pain Points to Avoid”
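As a toy illustration of that kind of data-integrity check (our own hypothetical rules and thresholds, not a prescription), flagging straight-liners and speeders can be as simple as:

```python
def flag_bad_respondents(responses, grid_cols, min_seconds):
    """Flag likely low-quality completes: straight-liners (identical answers
    across a grid of rating items) and speeders (finished implausibly fast).
    `responses` is a list of dicts holding grid answers plus a 'duration' key.
    A respondent who is both gets the 'straight-liner' label."""
    flagged = []
    for i, r in enumerate(responses):
        grid = [r[c] for c in grid_cols]
        straight_liner = len(set(grid)) == 1
        speeder = r["duration"] < min_seconds
        if straight_liner or speeder:
            flagged.append((i, "straight-liner" if straight_liner else "speeder"))
    return flagged
```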

Remembering Dale Kulp

On what would have been Dale Kulp’s 66th birthday, we wanted to take a moment to remember a man who not only founded Marketing Systems Group but also made innumerable contributions to the statistical sampling and survey research fields. Dale’s career was already flush with accomplishments before he founded Marketing Systems Group in 1987. He had previously worked for industry stalwarts Chilton, Bruskin and ICR (now SSRS). With MSG, he envisioned the development of a PC-based, in-house RDD sample generation system (GENESYS) that would become the cornerstone product of the company.

Aside from being the driving force behind the industry’s first in-house sampling system, Dale was integral to the development of list-assisted RDD sampling methodology at a commercial level, which revolutionized the process of reaching probability-based samples of households. Through his many technical notes and other publications, he remained vigilant about addressing the operational issues challenging the viability of this methodology, particularly those resulting from the unfolding changes in US telephony.

Dale also started several Omnibus telephone surveys that not only continue to thrive 20 years after their launch but, in at least one case, spawned a company of their own. Centris Marketing Science was created by Dale, along with Paul Rappaport, after they realized the value of the census block-level data the Omnibus survey collected.

Realizing that MSG should not be a one-trick pony, Dale continued to pursue other product lines that would benefit the survey research industry. He assembled a team that included current MSG President Jerry Oberkofler and Vice President Reggie Blackman to develop the first automated screening process: GENESYS-ID. Applying the technology and philosophy of GENESYS-ID to the survey research industry gave birth to PRO-T-S, the first predictive dialer built exclusively for the research industry. In 2004, Dale brought the ARCS Panel Management software under the MSG umbrella. ARCS is now one of the leading software packages in the sensory and pharmaceutical industries, as well as a recruitment tool for large civic organizations.

Since Dale’s passing in late 2009, MSG has grown substantially but has remained true to the vision, products and protocols that Dale Kulp laid out back in 1987. Not only do the MSG folks wish Dale a happy birthday, but we also thank him for his vision, contributions and foresight.

TCPA Compliance

My name is Tim Antoniewicz and I am not a lawyer. I am not legal counsel. I once played Clarence Darrow in a junior high school production of Inherit the Wind, but what I am about to say should in no way be considered legal advice. However, it may be considered helpful in sorting out the quandary that many researchers face when conducting research with a cellular sample frame.

In light of the new FCC regulations (6/18/15) that expand the TCPA, CASRO has provided the following guidelines:

The Federal Communications Commission approved new regulations that expand the Telephone Consumer Protection Act (TCPA). The new FCC rules broaden restrictions on autodialed calls to cell phones without differentiation for caller intent. 

ALL calls to cell phones made using an auto-dialer are PROHIBITED.

DO:

Manually dial ALL calls to cell phones.

Regularly update number databases to identify numbers ported to cell phones.

DON’T:

Use a predictive dialer OR an auto-dialer to call cell phone numbers.

Assume that human proximity to, or intervention in, the placement of an autodialed call to a cell phone provides exemption from the TCPA.

Staying educated and taking proactive measures are key to compliance with the law. Here are some high-level guidelines to follow:

  • Mitigate risk by verifying the types of phone numbers on your list.
  • Get the consent of the current wireless subscriber of each number.
  • Be aware of ethical considerations, including respondent safety and privacy.

Marketing Systems Group can assist you in flagging likely wireless numbers, identifying landline-to-wireless ported numbers, and screening active cell phone numbers in real time.

MSG is committed to helping you navigate the TCPA’s ever-changing regulatory landscape and remain in compliance while accomplishing your research goals.

For more information about TCPA compliance, see http://www.tcpacompliance.us/

MSG in a Bottle

Welcome to our brand new blog for customers and industry observers. We’re calling it MSG in a Bottle, and I know you’re going to love following it.

One of the best things about my job as president of Marketing Systems Group is the opportunity to work with an inspiring, committed team of professionals. They constitute a brain trust of talent and experience. Their collective market research expertise and dedication to quality truly make a difference across our entire product line: the GENESYS® sampling system, PRO-T-S® dialer software and the ARCS® all-in-one panel manager.

Since 1987, we’ve been delivering innovative solutions to the survey research community, and our staff continues to do amazing things, year over year.

I’m proud of these professionals. They are what make our products great. We learn from each other every day.

That’s all well and good for a company president, you might be saying, but what about me?

That’s why we’ve started this blog. We want you to benefit from the collective wisdom I see in action every day at MSG. I’ve asked this talented team of pros to share their insights and expertise with you.

At the MSG in a Bottle blog you will get our expert analysis of hot industry trends, stay informed with news on the survey research industry and the latest standards updates, learn how we’re positioning our products to meet customers’ needs, and get practical advice and tips on how best to use our products. Think of it as “news you can use.” We hope to have some fun along the way, too. We want you to get to know us better, and we want to hear from you. You’ll be able to join the discussion and share your feedback and suggestions via the comments section after each blog post.

I’m excited about this new channel for reaching our customers and the survey research community, and I’m confident that our media and marketing team will keep you up to date on the survey research industry topics that truly matter. We hope you’ll bookmark us and stop in frequently; new pieces will appear on a bi-monthly basis. Better yet, subscribe or follow us here and never miss a new post.

Thanks and happy reading,

Jerry Oberkofler