
Scrubbing Unintended Bias from Research Surveys

Unintended bias can undermine survey research, rendering findings unreliable. Following best practices, starting with clear and objective questions, is the best way to ensure survey findings are useful and actionable.


“Bias is the mortal enemy of all surveys,” says SurveyMonkey. It turns out bias is a sly enemy that can sabotage meaningful research findings.

SurveyMonkey offers tips on how to “promote honest answers” from surveys, which all depend on the good intentions and sound practices of the survey creator.

“One of the leading causes of misleading survey data is researcher bias that comes directly from the survey writer,” according to SurveyMonkey. “This bias is sneaky. It’s caused by survey creators who innocently influence the results to reach an outcome they hope or expect to reach. It’s sneaky because survey creators are typically unaware it’s happening.”

Bias is reflected in the wording of questions. Just as attorneys are taught not to lead witnesses, researchers should avoid leading questions in surveys. This applies to all types of research, from online surveys and telephone polls to stakeholder interviews.

SurveyMonkey says unintended bias can be sneaky and sabotage research findings.


Unintended bias also can occur by asking the wrong or incomplete questions, SurveyMonkey says. If you ask respondents to name their favorite kind of pizza, then list a few options, you may skew the results by appearing to limit the range of choice. An open-ended question that allows respondents to name their own choice, whether pepperoni or pineapple, would be better.

Another survey flaw is interviewing the wrong people. An unrepresentative sample can generate findings that don’t reflect the views of the audience you are targeting.

Related to that is excluding a significant cohort from your sample. This can happen when surveys are conducted at times or places where some people can't participate. For example, a telephone poll that relies only on randomly selected landline phone numbers is bound to underrepresent young people, minorities and lower-income households. Similarly, a focus group draws only from people in the immediate area where it is held.

Bias can rear its head by misreading survey data. “Bias can come into play when a survey creator gets excited about a finding that meets their hypothesis, but overlooks that the survey result is only based on a handful of respondents,” SurveyMonkey says. A common mistake is trying to quantify findings from qualitative research such as focus groups or stakeholder interviews.

The key takeaway is that bias can creep into research at just about every phase of survey work. Tools such as SurveyMonkey make online surveys broadly available to anyone who wants answers. However, following research best practices is essential to ensure you get useful, actionable answers.

Best practices start with clear, objective questions and include a representative sample and a faithful reading of results.

“By remaining true to your survey’s purpose and having a firm understanding of the topics of your research,” SurveyMonkey says, “you’ll be well on your way to eliminating researcher bias from your survey.”


Health Care Embraces Panel Research

Hospitals and health systems are embracing new ways to improve patient engagement and communication by using panel-based research techniques.

The way people communicate is changing rapidly. Almost all households have access to the Internet. Smartphone and tablet use is widespread. Patients want to communicate with service organizations they trust and they want to do it at times that are convenient for them. Panel research allows this to happen.

Panel research uses web-based research tools. Customers are invited by email to participate in online surveys. Participants are asked if they want to continue to participate in future research. Typically, 60 to 70 percent say yes. This group forms the panel for future research.
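The recruitment arithmetic described above is straightforward. As a rough sketch (the invitation count and response rate below are illustrative assumptions, not figures from the article; only the 60 to 70 percent opt-in rate comes from the text):

```python
# Hypothetical sketch: estimating how large a research panel grows
# from a single email-invitation wave. The invitation count and
# response rate are illustrative assumptions.

def estimated_panel_size(invited, response_rate, opt_in_rate):
    """Respondents who complete the survey AND agree to join the panel."""
    respondents = invited * response_rate
    return int(respondents * opt_in_rate)

# 20,000 customers invited, 25% respond, 65% of respondents opt in
print(estimated_panel_size(20_000, 0.25, 0.65))  # 3250
```

Even a modest email list can seed a panel of several thousand, which is why the article favors this approach over one-off telephone samples.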

Talking to Your Own

If you had a choice between interviewing 600 strangers on the telephone versus an online dialogue with 5,000 of your own customers, which would you choose?

Stated like that, it doesn't even seem like a fair choice. Yet, organizations continue to put their trust in telephone surveys and bypass the research-rich engagement with their own customers.

Tapping the power of customer databases is a sensible way to find out what consumers or stakeholders are thinking, while affording them an opportunity to offer opinions or suggestions that inform real-time decision-making.

Telephone surveys have been, and will continue to be, an important tool in the marketing and public opinion arsenal. However, online research expands the universe of what's possible, moving you from simple information-gathering to intentional engagement.

The biggest obstacle to harnessing the energy of a database is ignorance. Many CEOs and even marketing directors aren't aware of how much information their organizations already have on file — and how relatively easy it is to accumulate more.

There is consumer and stakeholder eagerness to share their views — if given the chance in a legitimate, credible forum, such as online research. Even more telling, data indicates that online respondents tend to give more honest feedback than they do in telephone surveys. That could be because telephone surveys invariably interrupt dinner or a favorite TV show. Online surveys can be completed on the respondent's timetable.

Smartphone Rage Prompts New Research Tactics

The accelerating switch from landline to mobile phones is undermining the ability of traditional telephone surveys to capture accurate reflections of many target groups, from young shoppers to likely voters.

As anyone watching the just completed 2012 London Olympics could readily tell, mobile phones are omnipresent. Recent data indicates more than half of all Americans with mobile phones have smartphones, which opens an expanding world of smartphone apps. More than a quarter of all smartphone users say they would rather surrender their computer than their smartphone.

That doesn't bode well for landline phones, which lack cameras and games, and it has caused pollsters to scramble to adjust.

Because the demographics of smartphone users differ from the mix of landline phone users, pollsters are having to juggle their samples. In an article last week, The New York Times reported that veteran GOP and Democratic pollsters who conduct surveys for NBC News/Wall Street Journal decided to increase their percentage of exclusive mobile phone users to 30 percent of their overall sample.
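One common way pollsters handle a mismatch between who they reach and who they need is post-stratification weighting. The sketch below is illustrative, not the NBC News/Wall Street Journal pollsters' actual method; the 22 percent raw share is an assumed example, while the 30 percent target comes from the article:

```python
# Illustrative post-stratification sketch: if cell-only respondents
# make up 22% of the raw sample (assumed) but should be 30% of the
# weighted sample, each group's weight is target share / sample share.

def group_weights(sample_shares, target_shares):
    return {g: target_shares[g] / sample_shares[g] for g in sample_shares}

sample = {"cell_only": 0.22, "landline": 0.78}   # achieved sample
target = {"cell_only": 0.30, "landline": 0.70}   # desired composition
print(group_weights(sample, target))
```

Each cell-only respondent counts a bit more than one person, and each landline respondent a bit less, so the weighted sample matches the target mix.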

The Times reported other polling firms supplement traditional telephone surveys with online surveys in an attempt to capture the same variable demographic of mobile phone users.

Mark Mellman, who polls for Democratic candidates, told the Times, "That group is not only younger, but also attitudinally different from other people of all ages." Mellman said they are "disproportionately urban, African-American, on either the high or low side of the economic ladder and Democratic."

While it is legal to make randomly selected polling calls to landline phones, regulations prevent such calling on cell phones.

"I have yet to see a standard that I believe is anything more than a guesstimate," Republican pollster Whit Ayres admitted to the Times. Ayres predicted the research industry would eventually shift more toward Web-based surveys.

CFM is already well-versed in this conversion. Yes, we still conduct telephone surveys. However, we counsel that web-based surveys using representative and large databases can produce equally reliable results, with the added benefit of enabling ongoing engagement with a panel of respondents.

Old Poll Echoes Today

Cleaning out your garage can be clarifying on multiple levels. You create space for new stuff and you discover old, nearly forgotten stuff — like polling data from 1982.

"I am very saddened by the priorities that were cut. I am not convinced that a balanced budget is an absolute necessity..."

"Basically big companies are not paying taxes, and we're paying more taxes."

"I think the worst in the economy is still ahead…"

"We're going to appreciate the important things more and do away with some of the frivolous things."

"I believe in greater good than in greater harm. It's going to hurt. It's going to hurt fort a long time, but we must begin."

"Defense should be cut and welfare, social programs, education and the arts should have more added back."

Sound familiar? These could be quotes from a focus group last week instead of one held in April 1982 in Richmond, Virginia. Back then the debate centered on Reaganomics. Today it touches on mortgage securities, derivatives and hedging. And, of course, ObamaCare.

Maybe as the world changes, it stays the same — or at least the problems stay the same.

However, as I reflected on these long-stored nuggets of polling wisdom, I was struck by a sense of voter bewilderment about what course to take to preserve jobs in America, liberty at home and peace abroad. That bewilderment continues today, spiced by more partisan and rancorous rhetoric.

Use of Margin of Error Can Be Misleading

As CFM increases the use of online research, more people are asking about the statistical reliability of Web-based surveys. Well, for a variety of reasons, the rules of statistical testing don’t apply to online surveys.

That said, the term “margin of error” is misleading and can build a false sense of confidence in telephone surveys.
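For context, the margin of error pollsters quote is the textbook figure for a simple random sample of a given size. It captures only sampling error; it says nothing about coverage, nonresponse or question-wording problems. A quick sketch of the standard formula:

```python
import math

# Textbook margin of error for a proportion from a simple random
# sample at 95% confidence (z = 1.96), using the worst case p = 0.5.
# This reflects sampling error only, not other sources of error.

def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# A typical 600-person telephone sample
print(round(margin_of_error(600) * 100, 1))  # ~4.0 percentage points
```

A 600-respondent poll thus carries roughly a plus-or-minus 4 point margin even under ideal sampling conditions, which is why the number alone can create a false sense of precision.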

There are multiple sources of errors that can occur in traditional survey research, including:

Caption:  Is the term "margin of error" still relevant or accurate when it comes to research?