New Census Data Available for Computer Ownership and Internet Subscription.

The United States Census has long been a treasure trove of data for market researchers, and the trove just got richer: it now offers data on computer usage and internet access.

On December 6, 2018, the United States Census Bureau released its Summary File for the 2013-2017 American Community Survey (ACS) Five Year Estimates. For the first time, this data product contains tables for computer ownership and internet subscription. The ACS assists governments, community leaders, and companies in understanding how their communities are changing, and it contains a wealth of information on U.S. population and housing. The new Five Year Estimates for computer use are further broken down by estimated characteristics such as household income, age, educational attainment, and labor force status.


To fully appreciate the significance of this release, you have to go back 10 years. In 2008 Congress enacted the Broadband Data Improvement Act, with the goal of identifying geographical areas of the country that did not have broadband service. Legislators hoped to promote deployment of services within underserved areas and, beyond that, to bring affordable service to all parts of the country.

In 2013 the Census Bureau started asking questions about computer and Internet use in its ongoing American Community Survey (ACS). Each year the ACS randomly samples approximately 3.5 million addresses, and the information from that survey is released annually in two distinct datasets:

  • One Year Summary File (SF)
  • Five Year Summary File (SF)

The key difference between the datasets is that the Five Year SF is backed by five years of respondents and thus includes estimates down to the very detailed Census Block Group geography. Block Groups are statistical divisions of census tracts that are defined to contain a minimum of 600 people or 240 housing units and a maximum of 3,000 people or 1,200 housing units. The One Year SF, created from one year of respondents, only includes estimates for large geographies with populations greater than 65,000. Examples of large geographies are census regions and census divisions, individual states, and metropolitan areas (which are groups of cities and surrounding counties). Incidentally, there are 501 metro areas with populations greater than 65,000.

One Year Summary File Drilldown

Below is an example of the level of detail that can be produced from the One Year SF, looking at Presence of Internet Subscriptions in US Households.

2017 American Community Survey One Year Estimates – Presence of Internet Subscriptions in Households (United States)

                                            Estimate     Percentage
Total US Households                      120,062,818
With an Internet subscription            100,662,676         83.84%
Internet access without a subscription     3,395,581          2.83%
No Internet access                        16,004,561         13.33%
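As a quick arithmetic check, the percentages in the table follow directly from the published counts (a minimal sketch in Python; the figures are the 2017 one-year estimates quoted above):

```python
# 2017 ACS one-year household estimates quoted above
total = 120_062_818
counts = {
    "With an Internet subscription": 100_662_676,
    "Internet access without a subscription": 3_395_581,
    "No Internet access": 16_004_561,
}

# The three categories partition all households
assert sum(counts.values()) == total

for label, n in counts.items():
    print(f"{label}: {n / total:.2%}")
# With an Internet subscription: 83.84%
# Internet access without a subscription: 2.83%
# No Internet access: 13.33%
```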

We see that 19,400,142 households (16.16%) in the U.S. either have Internet access without a subscription or no Internet access at all, and as the map shows, the highest percentages of no-subscription households tend to be concentrated in Appalachia and the Deep South.

The Five Year Survey

With the 2013-2017 Five Year current release, the ACS has surveyed more than 17.5 million addresses, which is enough to accurately provide estimates down to a detailed level of geography.

However, there is a catch. Just prior to this release, the Census Bureau announced the removal of the Block Group estimates, and it is unclear at this point whether they will be available before the next release, which is scheduled for December 2019. This means that currently the lowest level of geography available is the Census Tract. A census tract is an area roughly equal to a neighborhood. Census tracts are smaller than a city but larger than a block group, and generally have a population between 1,200 and 8,000 people. There are 73,056 tracts across the 50 states and DC.

Below is an example showing how census tract level geography helps to pinpoint particular target households. This analysis can be valuable because it allows users to target specific detailed areas.  In addition, clusters or groups of neighboring areas that have similar characteristics can then be used to define a study area.

New Data Available at Finer Geographic Levels

For the first time, the Five Year Summary File offers new tables and categories that are available at detailed geographic levels.

One key website containing census and demographic data is the American Fact Finder.  This site contains an abundance of information but can be tricky to navigate, manipulate, and comprehend.  With over 35 years of combined experience, MSG’s Geo-Demographic team are experts at working with this data and are here for your project needs.

Visit the Resource Center on the MSG website for National estimates of the new data categories.

Explore our geo-demographic capabilities at

Resources and links used in this article:

The ACS 5-Year Estimates.
Broadband Data Improvement Act.
American Community Survey (ACS).
Computer and Internet Use in the United States: 2013.
ACS Webinar.
American FactFinder.

Stay in compliance with ever-changing governmental regulations.

Alert! Alert! Alert!

In this new age of “always on” technology and communication, it seems like something is always vying for our attention. We get so many alerts: weather alerts, traffic alerts, health alerts, vehicle recall alerts, food safety alerts. I could go on and on. Our first tendency might be to get a little irritated by all of these alerts, as they interrupt our daily flow and can produce anxiety. But then again, think of why we are receiving the alerts in the first place. They are there for our benefit: the safety and security of ourselves, our families and our neighbors. By getting that important information to us quickly, they let us act immediately and take the necessary steps to be prepared or protected.

When we use alerts in the context of handling personal data, the benefits are the same. We are vigilant, looking out for the safety and security of the data collected on our panelists.  The safety and security of data is a major concern for many countries around the world. That is why regulations such as the Federal Information Security Management Act (FISMA), Health Insurance Portability and Accountability Act (HIPAA) and General Data Protection Regulation (GDPR) have been put in place. They help to ensure that personal data is being handled properly.

Stay alert to panelist data changes

Having the ability to receive alerts on the system used to store and/or collect data can be an essential tool.  An alert system can perform two kinds of notifications:

  • Inform proper personnel of changes that have been made in the data
  • Alert personnel of the need to make changes to the data.
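A minimal sketch of these two notification kinds (all names here are hypothetical, not taken from ARCS or any real alerting API):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Alert:
    kind: str        # "data_changed" (change already made) or "change_needed" (action required)
    record_id: str
    message: str

def dispatch(alert: Alert, handlers: Dict[str, Callable[[Alert], None]]) -> None:
    """Route an alert to the personnel handler registered for its kind."""
    handler = handlers.get(alert.kind)
    if handler is None:
        raise ValueError(f"no handler for alert kind {alert.kind!r}")
    handler(alert)

# Example: collect both kinds of notifications into a simple inbox
inbox: List[str] = []
handlers = {
    "data_changed": lambda a: inbox.append(f"CHANGED {a.record_id}: {a.message}"),
    "change_needed": lambda a: inbox.append(f"ACTION {a.record_id}: {a.message}"),
}

dispatch(Alert("change_needed", "P-1042", "panelist requested removal"), handlers)
```

In practice the handlers would email or page the proper personnel; appending to a list keeps the sketch self-contained.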

Alerts can help you make necessary changes in an expedited time frame, keeping you in compliance with ongoing regulations.

What are some examples of alerts that can prove beneficial?

Let’s take a look at a few instances where alerts would provide helpful information.

Example: A panelist wants to be removed from the database.

Let’s say a request comes in from a panelist who no longer wishes to receive invitations to participate in studies, screeners or research. The request can come from the panelist portal, from the invitation or from within the questionnaire.  Whatever the source, an alert can be sent to the proper personnel to make sure that the panelist is removed. By receiving an alert, the removal can be handled quickly, therefore limiting the chances of the panelist becoming frustrated by receiving additional notifications.

Example: A panelist is not able to participate in research for a period of time (extended travel, expectant mothers, temporary health conditions, etc.).

Sometimes a panelist needs a little time off. They can inform you of the start date for the period of non-participation. During that interval, the panelist record can be set to inactive, and they will no longer receive notifications of available research studies. An alert can be created for the proper internal personnel at the resume date so the panelist record can be set back to active status. At that point, they would again get notifications to participate in any available research.

Example: While filling out a questionnaire or screener, the panelist indicates that they would like to be contacted.

Companies always want to know if their consumers are happy. To ensure that any perceived issues are handled quickly, a question can be added to a questionnaire or screener asking the panelist if they would like to be contacted.  Additional information can also be collected as to the nature of why they would like to be contacted. Once this information is collected, an alert can be generated that informs the proper internal personnel of not only the request to be contacted but also the reason. The alert allows companies to respond to any issues or requests for contact and handle them without any hesitation.

These are just a few possible suggestions showing how alerts could be a great benefit. Having the ability to create custom alerts based on the particular operations of your organization can provide innumerable benefits to the company, department and panel members. The ARCS system, for example, allows you to define custom alerts that match your organizational needs. Contact one of our ARCS specialists today to discuss how to setup specific data alerts.

Keep in mind that alerts should be used judiciously, only when something really calls for immediate attention and action. When used for truly critical notifications, alerts will help your organization to stay in compliance with ever-changing governmental regulations.

County Level Cell Phone Only Estimates

Probability-based telephone surveys must utilize a dual frame approach in order to capture the ever-increasing cell-phone-only population. Until the day comes when a single frame of only cellular numbers suffices, researchers need to ensure they get the appropriate blend of cell-only vs. dual phone users in their sampling allocations.
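As a rough illustration of how a county-level CPO rate can feed a sampling allocation (hypothetical rates and a simplistic proportional rule, not MSG's estimation methodology):

```python
def allocate_dual_frame(n_total: int, cpo_rate: float) -> dict:
    """Split a target number of interviews between the cell and landline frames
    in proportion to the cell-phone-only share of telephone households.
    Hypothetical rule of thumb for illustration only."""
    n_cell = round(n_total * cpo_rate)
    return {"cell": n_cell, "landline": n_total - n_cell}

# Two hypothetical counties in the same state with very different CPO rates
print(allocate_dual_frame(800, 0.62))  # {'cell': 496, 'landline': 304}
print(allocate_dual_frame(800, 0.31))  # {'cell': 248, 'landline': 552}
```

The point of the example: inferring both counties' allocations from a single state-level rate would misallocate hundreds of interviews.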

Marketing Systems Group (MSG) produces quarterly estimates of Cell-Phone Only (CPO) rates for every county in the country[1]. These rates are based on telephone households (as opposed to occupied housing units) and lag one quarter. A Cell-Phone only household is defined as one in which there is no operational landline telephone and cell phones are used exclusively to make and receive calls.  The CPO estimates are derived by using a combination of survey-based information along with large commercial and administrative data sources.  This triangulation approach provides us the ability to create CPO estimates at the state and county levels.  Moreover, the primary sources used to create these estimates are updated on a continuous basis.  This enables us to create updated CPO estimates each quarter.

MSG is currently the only source of county level CPO estimates available.  Having this granular level of information can benefit survey planning and weight computation/calibration at the sub-state level.  We analyzed our CPO estimates at the county level and observed that many states have big differences in the CPO rate from county to county (see Figure 1). This could have a big impact on studies that infer a given county level CPO rate using the state level estimate.

Figure 1:  CPO variations among counties (minimum vs. maximum) within state compared to the state level estimates.

One limitation of utilizing administrative data in our triangulation methodology is that it does not take into account behavior patterns of dual phone households.  Dual phone households are those that have both land line and cellular numbers but only make and take calls on their cell phones.  With that being said, these estimates are the closest approximations at the county level available.

Whether it be for statistical or productivity reasons, rely on MSG’s CPO estimates as a criterion to obtain the appropriate mix of land line vs. cellular numbers for sub-state dual frame RDD surveys and for weighting calculations.

[1] A Total of 32 counties were combined with other counties or removed due to lack of information.

Access & Control – Just Like the Family Fridge

Complying with the new GDPR rules means giving panelists more access to their data

When I was growing up, just about everything my family or someone watching us needed to know would be affixed to our refrigerator with tape or magnets. This included a calendar of events, important phone numbers, report cards, receipts, images, to-do lists and more. The fridge was the central repository for upcoming events for our family.

If you wanted to see what was going on in our lives, first you needed to be invited into our home (or have a key to gain access). Only trusted friends, relatives or service providers could get in and see the refrigerator to learn what we were up to.

Just as access to the family fridge was limited, the European Union General Data Protection Regulation (GDPR) has been designed to enhance an individual’s control over their data and restrict outside access.  Now, allow us to read your rights!  You have the right to be informed when your data is being processed, the right to access your data and confirm its lawful processing. You have the right to be forgotten, the right to data portability, rectification, objection to direct marketing, restriction of processing of personal data, and safeguards against automated decision-making. One of the primary aims of GDPR is to give individuals control of their data, and organizations with access must comply with their demands.

In ARCS we have something a lot like that family refrigerator. We call it the Panelist Portal. This is the individualized home page within ARCS for each member of your participant panel. The Portal gives users control over their core data (along with the ability to update this stored data). Users can also opt out, and all can be done within a single system, complying perfectly with GDPR.

Once someone is invited to join your participant database, they are given a unique “key” (e.g., a user name and a password whose rules you can configure and control). This is where a panelist can make changes to their name and password.

When my parents would go away, they would leave their itinerary and special instructions on the refrigerator. In the same way, you can post privacy policies, NDA agreements and other information that panel members might need to see.

Let’s say you have someone in your database who must “accept” your terms before being allowed to participate in your research studies.  You can provide the documentation, instructions, and mechanisms for them to read and acknowledge. This could be for the original acceptance or a change in terms that requires database members to acknowledge and confirm agreement with the new language.

Within the Panelist Portal, your database members have access to many important pieces of information about themselves, their history, and their upcoming research study schedule.  This information, referred to as participant data, is organized into two areas:

  • Core data. This includes items such as name, age, birthdate, address, email address, phone number, preferred contact method, household makeup, and more.
  • Attributes or custom data points. ARCS allows you to create, ask and track unlimited questions about particular panel members. You can then query on those custom attributes and data points. Some examples could be product usage, demographic information such as education, salary, marital status, and more.

The ability to view and update PII and sensitive data is critical to GDPR compliance. Using the Panelist Portal, your database members can access selected data fields and update these attributes themselves, as their product, brand and usage change over time. This will ensure that you have accurate and up to date information, which will help you invite and qualify the right panel members for your studies. This is also where your panel members can complete any necessary required paperwork (such as NDA forms). All of this information is date and time stamped as well as trackable.

All of the above capabilities are presented in one place, and just like the family fridge the Panelist Portal provides centralized visibility, auditing and tracking.

By giving database members more control and visibility into their data, you will be compliant with the applicable GDPR requirements, protecting yourself and protecting your most important asset, the participants. With greater access and control, they are likely to feel more comfortable with your organization. This can then lead to referrals of additional family members and friends.

Breaking up shouldn’t be hard to do

Lastly, GDPR compliance asserts the participant’s right to be forgotten. They may ask that their data be wiped, either completely or partially. Your participant engagement process needs to: (a) permit such a request, (b) quickly respond to the request and (c) identify the user and types of data to be eliminated.
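A minimal sketch of those three requirements (the class and field names here are invented for illustration, not any real ARCS interface):

```python
from datetime import datetime, timezone

class ErasureQueue:
    """Toy right-to-be-forgotten workflow covering (a), (b) and (c) above."""

    def __init__(self):
        self.requests = []

    def submit(self, panelist_id: str, scope: list) -> dict:
        """(a) permit the request; (c) record who it is and which data categories."""
        req = {
            "panelist_id": panelist_id,
            "scope": scope,  # e.g. ["contact_info"] for partial, ["*"] for full erasure
            "received": datetime.now(timezone.utc),
            "status": "pending",
        }
        self.requests.append(req)
        return req

    def process(self, req: dict) -> None:
        """(b) respond quickly: perform the erasure and timestamp completion."""
        req["status"] = "erased"
        req["completed"] = datetime.now(timezone.utc)

queue = ErasureQueue()
r = queue.submit("P-204", ["contact_info", "survey_history"])
queue.process(r)
```

The timestamps matter in practice: they are what lets you demonstrate to an auditor that requests were handled promptly.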

Key Questions

What types of controls and tools does your participant engagement process have to handle these items? Do panel members constantly need to call your staff to update their information?  Would you like to have the visibility and controls to meet the ever-changing data protection needs your participants deserve and meet new regulations like GDPR?

Call our ARCS specialists today to discuss your unique research participant needs.  

Verifying Important Data Points

I trust my kids. No really, I do. My oldest is at the age where she spends a lot of time with her friends—getting picked up to go shopping, going out to eat, hanging out at parties, sleepovers—in all cases, she is not at home and not with me. Like I said, I do trust her, but for the sake of my own sanity (and of course her safety), I have taken the necessary steps to verify that she is doing what she says she is doing.

Yes, I take a quick look at the cell phone GPS while she is out, I text to make sure her location matches, and I call her friends’ parents to verify that she is spending the night. Not every time she goes out, but enough. Just so you don’t think I’m snooping on her unawares, my daughter understands perfectly well that I am verifying all these details. I want her to know, and it is good that she knows. Not that she is looking to intentionally deceive, but honestly, weren’t we all kids once? We know the script. Just knowing that someone is checking in—well, that keeps everyone honest. I hope it brings her a sense of security, too.

Data is no different, really. In particular, I mean data collected on panelists, participants or respondents.  No, of course you won’t be calling your panelists’ parents or tracking them on GPS, but certain information can and should be verified. Data verification is a key step to GDPR compliance, too.

Data verification sounds well and good, and GDPR expects it, but you might have lingering questions: Why does this data need to be verified? What information should I verify? What kind of verification should be performed with the collected data?

Why does data need to be verified?

First off, it’s a quality issue. You want quality data, reliable data, data you can use. Verification ensures that the information collected and/or stored is accurate. This is especially beneficial because the data will no doubt be used for projections, direction or insights.  Inaccurate data can have severely negative effects and possibly cost implications.

Secondly, it’s a legal issue. Data needs to be verified because of laws, rules or regulations. For example, if data on minors is being stored, systems need to be in place to verify individuals’ ages or to verify that parental/guardian consent has been obtained. Regulations such as GDPR have specific rules designed to ensure accurate, verified data is being stored in the database.

What information should I verify?

This is completely up to any internal quality control procedures in place for your operations.  Many organizations verify and update demographic data every time they come into contact with panelists in their database.  Others have a schedule of verifying data at regular intervals. Either method is acceptable, so long as the procedure is followed carefully. Some major demographic data points to verify are:

  • Gender
  • Age
  • Income
  • Education
  • Ethnicity

Actually, any data critical to the company or fielded projects should be verified.  Some verify geographic data, psychographic data and behavioral habits in addition to demographic attributes—all with the purpose of having the most accurate data available at their fingertips.

A word of caution related to psychographic and behavioral attributes: purchase habits change quickly, loyalties adjust and specific usage can fluctuate greatly. Keep these points in mind when collecting or storing psychographic and behavioral data, as the shelf life of the information is very short and it needs to be verified more frequently.

What kind of verification should be performed on my data?

Just as there are multiple ways to check on the whereabouts of your children, there are multiple ways to verify panelists’ information. The software package that you use to store or collect data should have the tools readily available to verify any pertinent data. At MSG, we’ve built the ARCS panel management system to allow for custom-defined rules to perform data verification that matches your organizational procedures. Here are a few methods that a software package should feature:

  • Verify key data points before the panelist is entered into the database.
  • Define specific procedures as to exactly what data needs to be verified and how often.
  • Streamline the verification process by updating data each time a panelist is contacted or send specialized surveys to update needed information at regular intervals.
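A small sketch of the interval-based approach from the bullets above (the categories and shelf lives are hypothetical examples, not prescribed values):

```python
from datetime import date, timedelta

# Hypothetical re-verification intervals; tune these to your own QC procedures.
SHELF_LIFE = {
    "demographic": timedelta(days=365),   # gender, age, income, education, ethnicity
    "geographic": timedelta(days=365),
    "behavioral": timedelta(days=90),     # purchase habits change quickly
    "psychographic": timedelta(days=90),  # loyalties and usage fluctuate
}

def needs_verification(category: str, last_verified: date, today: date) -> bool:
    """True when a data point is past its shelf life and should be re-confirmed."""
    return today - last_verified > SHELF_LIFE[category]

# After five months, behavioral data is stale but demographic data is not
print(needs_verification("behavioral", date(2018, 1, 1), date(2018, 6, 1)))    # True
print(needs_verification("demographic", date(2018, 1, 1), date(2018, 6, 1)))   # False
```

A scheduled job running a check like this is one way to trigger the "specialized surveys at regular intervals" mentioned above.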

I know you trust your panelists just like I trust my kids, but it doesn’t hurt that they know you are checking in on them. Verifying important data points is not only a wise move for your company, but it is increasingly necessary when it comes to government regulations like GDPR. They are expecting you to take care of this business anyway. It’s an all-around good idea.

Understanding Authentication & Authorization

Learn the Magic Words: Authentication & Authorization

“Open Sesame!” “Abracadabra!” In fairy tales, magic words like these are invoked before the treasure chest unlocks and doors open. In the software realm, we don’t call it “magic”, we call it “authentication” and “authorization”. Authentication is needed to protect access to precious data. Authorization tells you what you can do once you are inside.

My mom taught me the magic of access protection in my early childhood. As a kid, I wanted to grow up and shop on my own. So one fine day she let me go to the grocery store and pick up the items on her shopping list. I learned how the process worked:

  • She gave me the customer identification card.
  • She gave me the list of items she needed.
  • She also sent the list to the store and paid for it in advance.
  • When I handed over the customer identification card to the billing clerk, he confirmed that I was authenticated to purchase for my mom.
  • Her list (sent before I got there) authorized what I could purchase.

Today, in the research world, we come in contact with large groups of people who are providing us various data points about themselves, their families, likes, dislikes, medical information, etc. Every time I look at a data extract, I remind myself that it is very important to take all necessary precautions to protect this data. To ensure it is protected, you need to authenticate identity and authorize users who have access to the data.

The two terms are frequently used interchangeably in conversation, and no doubt they are tightly associated; together they are key pieces in protecting data. But they are really two distinct concepts. Authentication is the process whereby an individual’s identity is confirmed. Authorization is the association of that identity with rights and permissions.

What is Authentication?

A user needs access to data. The software offers a challenge. Identify yourself, say the magic words, and you get in. The software not only needs to know who the user is, it needs to track the changes the user makes to the data.

Once you have an authentication mechanism built, you also want to start thinking of strengthening the mechanism. Some of the things to consider are:

  • Password policy – Which administrators are allowed to set strong policies for the password, and what policies (characters, numbers, case sensitivity) should be enforced?
  • Expiration Policy – How frequently should the password be changed?
  • SSO (Single Sign on) – Should you integrate your systems with a single sign on system, so users do not have to manage passwords for multiple applications?
  • Two-factor Authentication – Should you allow users to have another mechanism (SMS, Email) to receive a soft token needed, in addition to their username and password, to authenticate into the system?
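The first two bullets can be sketched as simple checks (the policy values below are hypothetical placeholders for whatever your administrators enforce):

```python
import re
from datetime import date, timedelta

# Hypothetical policy values, set by administrators
MIN_LENGTH = 10
MAX_AGE = timedelta(days=90)   # expiration policy

def password_ok(password: str) -> bool:
    """Character / number / case rules from the password policy above."""
    return (
        len(password) >= MIN_LENGTH
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"\d", password) is not None
    )

def password_expired(last_changed: date, today: date) -> bool:
    """Expiration policy: force a change after MAX_AGE."""
    return today - last_changed > MAX_AGE

print(password_ok("Abracadab7a"))   # True: long enough, mixed case, has a digit
print(password_ok("abracadabra"))   # False: no upper case, no digit
```

SSO and two-factor authentication involve external identity providers and token delivery channels, so they are out of scope for a self-contained sketch.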

Selecting the appropriate authentication policy and mechanism is a vital first step towards securing the privacy of your participants.

What is Authorization?

Once “Abracadabra” opens the door (authentication grants a user access), Authorization lets you control what is available to the user within the framework (like my mother’s shopping list). Authorization is the function that enables security system administrators to specify user (operator) access rights and privileges. Specifically, administrators restrict the scope of activity by:

  • Giving access rights to groups or individuals for resources, data, or applications.
  • Defining what users can do with these resources.

Authorization is usually implemented through the following elements:

Privileges. Privileges grant access to specific operations. For instance, administrators have the privilege to create or disable other user accounts, while normal users will only be granted the privilege to change their own password and profile information.

Access Roles. A role-based authorization framework should let administrators govern what level of access an authenticated user has to the data. This is usually established by feature- or data-specific rights (read only, can edit, invisible) assigned to users of the system.
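A minimal sketch of such feature/data-specific rights (the roles and resources here are illustrative only):

```python
# Rights per role and resource: "edit", "read", or absent (invisible)
RIGHTS = {
    "admin": {"panelist_data": "edit", "user_accounts": "edit"},
    "analyst": {"panelist_data": "read"},   # user_accounts is invisible to analysts
}

def authorized(role: str, resource: str, action: str) -> bool:
    """'edit' implies 'read'; a missing entry means the resource is invisible."""
    level = RIGHTS.get(role, {}).get(resource)
    if level == "edit":
        return action in ("read", "edit")
    return level == "read" and action == "read"

print(authorized("analyst", "panelist_data", "read"))    # True
print(authorized("analyst", "panelist_data", "edit"))    # False
print(authorized("analyst", "user_accounts", "read"))    # False (invisible)
```

Defaulting to "deny unless a right is explicitly granted" is the safe design choice for this kind of table.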

Under normal circumstances, an authenticated user is allowed to perform all operations they’re authorized to do. In my case, after showing the store card to the clerk, I could purchase items on the list (and only those items). However, when a user wishes to access a specifically sensitive resource or operation, additional steps must be taken to authorize the request. For instance, when users want to perform a payment, they will be asked to reenter their credentials, or basically repeat the authentication process.

ARCS uses industry standard protocols to create a secure environment to store and protect your participants’ data. With simplified user management, you can focus on your business and let ARCS better manage and protect your participants.

Facing the regulatory challenge of Centralized Data

In the new world of the European Union’s General Data Protection Regulation (GDPR), organizations that process or store information on EU Data Subjects must comply with new uniform data privacy requirements. Did you know that GDPR requires all information you hold to be centralized? In this article we focus on centralized data, the looming challenges we face, and the solutions to them.

Many research departments and companies struggle to keep all of their data in one central location. Beyond this challenge, there is the fact that sometimes the only way to accomplish project tasks is to use different software packages – each requiring specific and unique skills to manage them.

For example, companies might have information like panelist data residing in one software package, while the ability to collect data from those panelists is performed in a completely different package.

When you add the challenges of securely storing Personally Identifiable Information (PII), tracking participation, managing incentives and sharing data, an organization could be using three, four or five separate software platforms.

As if that weren’t problem enough, there is one more challenge – increasing industry and government regulations like GDPR are now requiring any information that you hold to be centralized. With all of the disparate software required to complete your projects, the challenges can seem daunting.

What steps can you take to mitigate some of these challenges?

Find a single software platform that covers most, if not all, requirements to meet not only industry and government regulations, but also to streamline internal processes. By unifying on one platform, you will save time, costs and resources related to a number of the challenges mentioned previously.

Arm yourself with the proper questions to ask any software provider. Here are some must ask questions:

  • Is all of the collected data stored in a secure and centralized database?
  • Does the data have the ability to be searched and shared across different tasks? For instance, can data collected via a questionnaire automatically update in the specific panelist record?
  • Are all panelist data, PII, participation data, incentives and questionnaire responses stored in a central location?
  • Can the software track how data was collected or changed within the database?
  • Can the software produce information for auditing purposes to assure regulation compliance?

These are just a few of the many questions to be asked when evaluating a software platform. By contacting a representative at Marketing Systems Group, we can partner to identify the specific areas that most need shoring up in your organization.

Our team can also assist in formulating the many questions to investigate while searching for the best software platform for your company. Let the experts at Marketing System Group help you navigate the difficult and ever changing regulation landscape.

Finding Your Niche: Specialization and the Future of Market Research Firms

Trend watchers have no doubt noticed that traditional players in the market research space are having an increasingly hard time keeping up with changing times and scaling their business practices. In a recent article, analysts reported a flat growth rate of 2% in the market research industry. This is not good, especially when you factor in inflation.

Perhaps the highest impact change involves the digitally-savvy consumer who demands more data-driven analysis and quicker results. Traditional market research firms can be slow to adapt. Are there ways for the old dog to learn some new tricks that can help to evolve a firm and get it moving towards positive growth?

Sure. Some experts suggest that specialization might be the answer.

Think about the niche your firm’s business fits best. What is it that you do uniquely in terms of data delivery or technology? The firms bucking the flat growth trendline are the ones offering expertise within a domain along with strong data inventories that set them apart.

So what do we really mean by specialization? In a market research context, it might mean improvements with infrastructure, speed, size, creative solutions, or methodology.

Identify one or two core competencies and focus on them. This is your niche, the “special sauce,” the unique value your firm adds and the substance you explain to prospective clients: here is why they should hire you. The goal is to become known for something. This is the niche, your “word on the street” reputation.

Fully embrace new technologies. Data-hungry, digitally-savvy clients are looking for speed and quality results. They expect all decisions to be backstopped with data and research; evidence-based criteria are king. Is your firm moving beyond traditional data collection and analysis? Are you collecting user “experiences” and offering support for impact measurement and taking action? Consider investments in services geared toward high-end analytics and customized tools that deliver data and analysis more quickly than ever.

Alongside the new technology, realize the importance of expert advice. Guidance is where your domain expertise comes into play, and it shouldn’t be underestimated as a selling point. Odds are that clients have already researched the vendor space before they ever pick up the phone and probably have a sense of what your expertise is already (if not, take the opportunity to inform them). This means your main objective should be assuring them that yes, you have the expert know-how to serve the client’s needs.

For client needs outside your “competency zone,” consider turning to acquisitions and partnerships.

Acquisitions. If you can’t beat them, buy them. This can work for a large firm, but medium-sized firms often struggle to compete because they may not have the capital to acquire niche firms, and they tend to rely on in-house research instead of partnerships.

Strategic partnerships. Competition isn’t occurring just at the level of operations, services, and pricing; turnaround time matters too. Can you deliver results faster? Sometimes the only way to meet those goals is to establish strategic partnerships in areas like setup, automation, or infrastructure, to name a few.

To recap, the specialized market research firm becomes great at one or two core competencies, is willing to embrace new technology, and makes acquisitions or forms strategic partnerships when needed. Flexibility and agility are key.

Keep in mind, however, that being nimble is no guarantee of success. Being agile means you are primarily in reactive mode, so why not get proactive? The growing firms are the ones that are innovating. Prepare your firm for innovation by incubating your core competencies, which should lead to a creative problem-solving approach. Think more about the kinds of problems you can solve for your clients and less about the categorical label. Remember, you are in the solutions business.

What to Expect from the New European Data Protection Regulations

D-Day is coming to Europe next spring, and no, we’re not talking about World War II. For us in the here and now, the “D” in D-Day stands for Data. In May 2018, new data protection regulations will take effect in the EU, and the impact on businesses and consumers will be enormous.

The European Union’s data protection rules have not been updated since the 1995 Data Protection Directive, which the new General Data Protection Regulation (GDPR) replaces. More than twenty years is a lifetime in the world of technology. So much has changed. Most of our lives are intertwined with technology now, and our digital alter ego (the data profile of who we are and what we do) is living somewhere in the cloud, traveling the earth in milliseconds. Amid the looming chaos of privacy exploitation and hacking, consumers are justifiably concerned and doubtful. Does privacy even exist anymore? Is anything secure? People want their privacy protected. They want to trust that their data is secure, but at the same time they want the convenience of personalized consumption and instant access. It’s a tough balance to strike.

Fundamentally, the goals of GDPR are to reassert individual privacy rights, foster a more robust EU internal market, strengthen law enforcement, streamline international transfers of personal data, and unify global data protection standards. The new data protection regulations will consist of a two-part implementation. The first part is the General Data Protection Regulation itself, the new rules. The second part is a separate Data Protection Directive governing how police and criminal justice authorities process personal data.

According to public information released by the European Commission, we are going to see some interesting outcomes from GDPR.

Privacy rights make a comeback. The regulation aims to strengthen individuals’ right to virtually “be forgotten”: when they no longer want their data held, it must be deleted (with exceptions: data may be retained for contractual or legal compliance reasons until no longer needed). Individuals will find it easier to access their personal data. They will have the right to port their data between different providers and the right to be notified when their data has been breached. In addition, companies must notify the authorities of breaches in a timely fashion.

The European Commission says “data protection by design and by default” will become the norm. Products and services must be safeguarded via built-in data protection. Privacy will become a primary focus and could lead to new business innovations. This includes new techniques for data encryption, anonymization (removing personal identifiers from data sets), and pseudonymization (replacing PII fields in data records with artificial identifiers). All of these could restore trust between individuals and the companies holding their data by limiting exposure.
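To make the last of those techniques concrete, here is a minimal sketch of pseudonymization: replacing a PII field with a stable artificial identifier derived from a keyed hash. The field names and the key handling are illustrative assumptions, not from any particular GDPR guidance; in practice the secret key would be stored in a secure vault so the mapping cannot be reversed by anyone holding only the data set.

```python
import hashlib
import hmac

# Assumption for illustration only: in production this key lives in a
# secrets manager, never in source code.
SECRET_KEY = b"example-secret-key"

def pseudonymize(value: str) -> str:
    """Return a stable artificial identifier for a PII value.

    The same input always maps to the same token, so records can still
    be joined and analyzed, but the original value is not stored.
    """
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# Replace the PII field while keeping the analytic fields intact.
record = {"email": "jane@example.com", "purchase_total": 42.50}
safe_record = {**record, "email": pseudonymize(record["email"])}
```

Because the token is deterministic, repeat respondents still link up across data sets; because it is keyed, the token alone reveals nothing about the person.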

Costs. Yes, it will require investment to upgrade apps and services, but the tangible and intangible payoffs of compliance with the new regulations are real.  According to estimates, Europeans’ personal data value could be worth upwards of €1 trillion by 2020. With stronger data protection regulations in place, opportunities will grow.

Streamlined regulations. There are currently 28 separate national laws on data protection that are inconsistent and unwieldy. The plan is to consolidate these 28 individual laws into one. Estimated savings for companies and organizations could be as much as €2.3 billion per year. After the new data protection regulations take effect, companies will deal with a single supervisory authority, making it easier to do business in the EU. This will level the playing field by applying the same rules to all companies, regardless of size or location. Companies outside of Europe must follow the same rules when doing business in the EU.

Negative reinforcement: be prepared or pay up! The EU expects merchants to take more responsibility for protecting customer data. Those who experience data breaches will face severe sanctions. Beginning May 25, 2018, the EU will impose heavy fines on companies violating the GDPR rules, levied as a percentage of revenue (up to 4% of annual global turnover or €20 million, whichever is greater). Smaller companies doing business in the EU may be unaware of how soon these regulations come online. Liability is an obvious concern, so active steps to achieve compliance must be taken. Products and infrastructure must be reviewed and updated. A sustainable cybersecurity program must be in place. The cost of compliance must be accounted for, and the ROI should initially be measured in terms of preparedness and protection from fines and liability. In the long term, as mentioned above, the new regulations could result in a more level playing field and increased business opportunities.

Organizations should act now:

  • Review and analyze the GDPR. Seek advice. Leave nothing to chance. Learn the precise meaning of “personal data”.
  • Update your documentation for personal information and security practices. Update policies and procedures for breaches, incident reports, and risk assessments. Review all relevant contract and agreement language.
  • Figure out how to best mitigate risks of noncompliance.

For more information about the European Union General Data Protection Regulation (GDPR), check out the European Commission website, which offers press releases, questions and answers, factsheets, legislative texts, the current legal framework, and public opinion surveys.

Quality Starts with Survey Design: Tips for Better Surveys

Marketing researchers are all facing two important challenges to data quality. First is the question of representativeness: with response rates plummeting, we need to make surveys shorter, more engaging, and easier for respondents to complete. Second is the issue of data accuracy: we must make sure that survey questions measure what we think they measure.

If you want to make surveys more accurate and representative, it all comes down to survey design. You need to think carefully about the survey design and how quality is expressed and impacted throughout all phases of the research project. When you get in the habit of thinking about quality at all phases of a study—from design to implementation to analysis of results and feedback—the payoff will be clear.

First Steps First

It sounds obvious, but the first quality check is to take your survey. Clients, researchers, analysts—everybody on board should complete the survey. Be sure to ask some people who are not familiar with the project to complete it as well. How does it feel to be on the other side? Talk through the questionnaire as a group. Look for areas that need more focus or clarification. Seek recommendations to improve the survey. Encourage the group to advocate for changes and explain why the changes are important. And be sure to use a variety of devices and operating systems to understand how the survey performs in different situations.

Conduct a pretest to get feedback from respondents. You don’t have to complete many pretest surveys, but you should have at least a few “real,” qualified respondents complete the survey. Additionally, they should answer questions about the survey’s design, ease of use, and any other issues they encountered. By all means, use survey engagement tools when feasible, but don’t fall into the trap of letting the droids rule the analysis; you need the human touch from real respondents as well. (Don’t forget to clear the pretest responses before you fully launch the survey, or filter them out of the final results.)

Use technology to test for data quality. A computer application is great at metrics, scoring and summarizing responses. It can measure survey engagement by tracking rates of abandonment and speeding and measure experience quality via respondent ratings. The average length of time to complete the survey is also a key metric. Use technology as a predictive tool before launching the survey to evaluate engagement levels and suggest improvements.
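The engagement metrics described above are straightforward to compute once each response records completion status and duration. The sketch below is a hypothetical example, assuming a simple list of response records and a speeding threshold chosen by the researcher; neither the field names nor the threshold come from any particular survey platform.

```python
from statistics import mean

# Hypothetical response log: whether each respondent finished,
# and how long they spent in the survey.
responses = [
    {"completed": True,  "seconds": 420},
    {"completed": True,  "seconds": 95},   # suspiciously fast: a "speeder"
    {"completed": False, "seconds": 60},   # abandoned mid-survey
    {"completed": True,  "seconds": 380},
]

# Assumption for illustration: completions under 2 minutes count as speeding.
SPEEDING_THRESHOLD = 120

# Abandonment rate: share of all starts that never finished.
abandonment_rate = sum(not r["completed"] for r in responses) / len(responses)

# Speeding rate and average length, computed over completed responses only.
completed = [r for r in responses if r["completed"]]
speeding_rate = sum(r["seconds"] < SPEEDING_THRESHOLD for r in completed) / len(completed)
avg_completion_seconds = mean(r["seconds"] for r in completed)
```

Run on pretest data before launch, numbers like these flag problems early: a high abandonment rate suggests fatigue or broken flow, while a high speeding rate suggests disengaged respondents and questionable data quality.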

Frequent Challenges

As you get in the habit of performing quality checks, be on the lookout for these common issues that will lead you to improve your survey design:

Is the survey user-friendly?

  • Beware of “survey fatigue.” Split long surveys into many short pages.
  • Make survey language more consumer-friendly and conversational and less “research-y.”
  • Does the language used on buttons and error messages match the survey language?
  • Validate questions for the correct data type and tie validation to relevant error messaging that tells how to fix the response.
  • Use a progress indicator to show how far the respondent is from completion.
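The validation point in the checklist above pairs a data-type check with an error message that tells the respondent how to fix the answer. Here is a minimal sketch of that idea for an age question; the function name, limits, and message wording are illustrative assumptions, not taken from any survey tool.

```python
def validate_age(raw: str):
    """Validate a free-text age answer.

    Returns (value, None) on success, or (None, message) where the
    message tells the respondent exactly how to correct the response.
    """
    if not raw.strip().isdigit():
        return None, "Please enter your age as a whole number, e.g. 34."
    age = int(raw.strip())
    if not 0 < age < 120:
        return None, "Please enter an age between 1 and 119."
    return age, None
```

The payoff is that a failed check never dead-ends the respondent: every rejection comes with a concrete instruction, which keeps frustration (and break-offs) down.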

Does the survey flow?

  • Improve the logical flow of the questions and watch out for redundancies.
  • Make sure the question type matches what you are looking for. Closed-ended questions are ideal for analysis and filtering purposes.
  • Test your logical paths. When designing page skips, you don’t want unexpected branching to happen.
  • Use required questions for “must get” answers, so the respondent can’t move on without completing them. Be careful about making too many questions required, however, as respondents can become frustrated and break off before completing the survey.

Is your survey design mobile-capable? While 40% of all survey responses are completed on a mobile device, a recent study reported that half of surveys are not mobile-capable, much less mobile-optimized. Design your survey to work on mobile devices:

  • Make sure questions on short pages fit on the screen.
  • Minimize scrolling whenever possible.
  • Check for comment box sizing problems and row-width for matrix question labels.

Remember, quality control should never be an afterthought; you must have an established quality control process for surveys. This process should specify the quality review responsibilities of each survey reviewer, with one or more team members responsible for evaluating respondent-level data. The quality control process should review the survey design end-to-end, focusing on both technological efficiency and respondent experience for optimal data quality.