As Telephone Surveys Fade, Online Research Remains an Option

Fewer Americans are willing to answer their phones to participate in telephone public opinion surveys, which poses a big problem for political operatives who use results to fashion campaign strategies. As pollsters scramble for alternatives, online research stands out as a viable and valuable option.

Telephone surveys have been the gold standard for public opinion polling for decades. That’s about to change.

“Fewer Americans than ever are willing to pick up the phone and talk to pollsters, sending costs skyrocketing to roughly double what they were four years ago,” writes Steven Shepard in Politico.

Pollster Scott Keeter told fellow pollsters recently that telephone surveys are in “wheezing condition” and efforts to find a suitable replacement are like “a great party on the deck of the Titanic.”

These sober assessments about the ill health of public opinion polling come on the eve of the 2020 presidential election and have many political operatives scrambling to find sources of reliable information on which to base campaign strategies. 

The slow fade of telephone surveys isn’t really news. CFM’s resident researcher, Tom Eiland, explains, “Challenges with phone surveys started with the use of caller ID and voice mail, then Do Not Call lists and really accelerated with the use of cell phones and smartphones.”  

“Telephone surveys have been a great tool that produced high-confidence findings when representative samples were achieved,” Eiland says. “However, telephone use has gone digital and polling has to adjust to that reality.”

Eiland noted that CFM’s research sample designs have adapted as respondent behavior has changed. 

For general population and voter surveys, Eiland recommends using multi-modal sample designs. “This entails using a combination of telephone interviews and online web-based surveys,” he explained. Telephone numbers and email addresses are acquired from trusted third-party vendors to make the combined sample random.

“The trick,” Eiland said, “is to use sample quotas for demographic characteristics, such as age, gender and area, to ensure survey participants are representative of the community.”
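The quota approach Eiland describes can be sketched in a few lines of code. This is a minimal illustration, not CFM's actual system; the demographic cells and target counts are hypothetical numbers chosen for the example.

```python
from collections import Counter

# Hypothetical quota targets for a 300-person survey. The age bands,
# genders and counts are illustrative assumptions, not a real design.
QUOTAS = {
    ("18-34", "F"): 50, ("18-34", "M"): 50,
    ("35-54", "F"): 55, ("35-54", "M"): 55,
    ("55+",   "F"): 48, ("55+",   "M"): 42,
}

filled = Counter()

def accept(age_group, gender):
    """Accept a respondent only if their demographic cell still has room."""
    cell = (age_group, gender)
    if cell not in QUOTAS or filled[cell] >= QUOTAS[cell]:
        return False
    filled[cell] += 1
    return True
```

Once a cell fills, further respondents with that profile are screened out, which keeps the completed sample proportioned like the community being studied.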

 

New Book Says Polls Provide Indications, Not Predictions

Anthony Salvanto with CBS News has written a new book that explains what polling can and can’t do. It’s a good place to begin to become a savvy research consumer.

Political polls give indications of voter attitudes, not predictions of election outcomes, says Anthony Salvanto in his new book, Where Did You Get This Number?

Salvanto, the director of elections and surveys for CBS News, says he wrote his book to explain how polling works after skepticism arose following the 2016 presidential election that polls suggested was a lock for Hillary Clinton. She did win the popular vote, but lost in states critical to a victory in the Electoral College. The polls were right and wrong at the same time.

In an interview on Face the Nation, Salvanto said he is often asked how national poll numbers are generated based on as few as 1,000 ten-minute telephone interviews. He explains representative samples can produce reliable results. Pollsters may not interview you, but they interview people like you.

A representative sample is just part of the best practices followed by professional pollsters. Clear, objective questions must be asked. Individual questions should test a single variable. Conclusions should be tempered by statistical validity. For example, a national poll with a 1,000-respondent sample may provide a valid national picture, but not a statistically valid picture of voters in Colorado.
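The statistical-validity point above can be made concrete with the standard margin-of-error formula for a simple random sample. The function below is a textbook sketch, not anything from Salvanto's book.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error at 95% confidence (z = 1.96) for a simple
    random sample of size n; p = 0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-respondent national poll carries roughly a +/- 3 point margin:
national = margin_of_error(1000)   # about 0.031

# But a 100-person subgroup (say, one state's respondents) is far fuzzier:
subgroup = margin_of_error(100)    # about 0.098, nearly +/- 10 points
```

That is why a national sample can be valid nationally yet say little about Colorado: the Colorado slice of the sample is too small to be statistically meaningful on its own.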

Even the most scrupulous professional pollsters don’t always get the numbers exactly right. There often is a slight but significant skew that results from the specific methodology a pollster uses. For example, failure to include a representative number of random sample calls to cell phone users could under-represent younger people, low-income families and minorities.

Nate Silver of FiveThirtyEight.com argues it is more reliable to look at groups of polls through the lens of a probability model. He claims analyzing a pool of polls and weighting each one by its history of accuracy can burp out more accurate polling results. Even then, Salvanto would say, it is not a prediction, just a snapshot in time.
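The core of the poll-averaging idea is a weighted mean. The sketch below assumes made-up poll numbers and accuracy weights purely for illustration; FiveThirtyEight's actual model is far more elaborate.

```python
def weighted_average(polls):
    """Combine poll results, weighting each by its pollster's
    historical accuracy. Weights and results here are hypothetical."""
    total_weight = sum(weight for _, weight in polls)
    return sum(result * weight for result, weight in polls) / total_weight

# (candidate support %, accuracy weight) -- illustrative numbers only
polls = [(48.0, 0.9), (51.0, 0.6), (47.0, 0.8)]
estimate = weighted_average(polls)   # pulled toward the higher-rated polls
```

A pollster with a strong track record moves the average more than a weak one, which is the intuition behind weighting a pool of polls rather than trusting any single survey.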

Then there are the polls that aren’t really polls. Push-polls ask questions less to get an answer and more to deliver a message, often a negative one, about a political opponent. Cheap robopolls get lower-than-average response rates, which can skew results. Because robopolls are prohibited by law from calling cell phone users randomly, they have a built-in bias.

The bottom line: Purchasers need to be smart consumers of research. Before looking at results, look at the sample so you know whose views are represented in the results. Understand the methodology being used and the statistical confidence it will yield. Know the benefits and limitations of different types of research, especially the difference between qualitative and quantitative research. Collaborate with a pollster on the questions that need to be asked, and let him or her advise you on how to ask them fairly so you get usable responses, not just what you want to hear.

Salvanto’s book may be the place to start on your journey to understanding polling’s potential and limitations.

 

Why Representative Samples Really Matter

If you want market research that matters, make sure the sample of people in your survey matches the audience you want to reach with your product or message.

A favorite story involves meeting with a client interested in promoting first-time homeownership. I mentioned the need for market research. No problem, the client said, we have that covered. I was handed the research summary and, as a matter of habit, jumped to the page about the telephone survey sample. It was very revealing. 

More than 50 percent of the respondents were 65 years or older. They were the majority of people who answered the phone and were willing to spend 15 or 20 minutes talking to a stranger about owning a home. Unfortunately, they weren’t the people the client had in mind as first-time homebuyers. 

Survey data is worthless unless the sample of people you interview reflects the audience you seek to reach. The sample in my client’s survey would have been terrific if the subject had been reverse mortgages. It stunk as a guide to reaching potential first-time homebuyers. 

Conversations between clients and research professionals must start with who to interview. If you have the wrong sample, the answers you get from the questions you pore over won’t matter a lick. 

Too often, the question of who to interview is glossed over. Sometimes the most obvious sample goes overlooked. When I was a lobbyist, a client hired me to “fix” his message that wasn’t gaining any traction with legislators. I started by interviewing about a third of the legislature, including virtually all of the lawmakers on the committees that were most engaged on my client’s issue. 

The interviews produced a wealth of insight. My client’s issue had latent support, but needed to be explained and demonstrated in a far different way. Lawmakers basically wrote the script my client and I used to lobby them. And it worked. 

Representative samples are harder to achieve for a mix of reasons. For example, increasing numbers of people don’t have landline phones and, if they do, they shield themselves from unsolicited calls with Caller ID. It takes a lot more calls, at greater expense, to collect a representative sample. Market research must cope with growing segmentation, which adds extra layers of complexity in selecting the right group of people to survey. 

The value of representative samples goes beyond quantitative research. Focus groups must be representative, too. And why would you do a customer satisfaction intercept survey for Nordstrom by interviewing people coming out of a rival department store? Representative samples matter in public opinion polling. A poll of New York voters wouldn’t be all that useful in projecting election results in Indiana. 

Despite the difficulty, solid research is grounded on good samples. Who you talk to matters if you want findings that mean something for your marketing.  

Gary Conkling is president and co-founder of CFM Strategic Communications, and he leads the firm's PR practice, specializing in crisis communications. He is a former journalist, who later worked on Capitol Hill and represented a major Oregon company. But most importantly, he’s a die-hard Ducks fan. You can reach Gary at garyc@cfmpdx.com and you can follow him on Twitter at @GaryConkling.

Death of Telephone Surveys Exaggerated

Telephone surveys face challenges, but aren't dead. When done right, using new techniques and technologies, they still can produce reliable research results.

Telephone survey research isn't dead, but it is undergoing some serious surgery. This reliable research tool faces challenges that have forced work-arounds, new techniques and partnerships.

Businesses, nonprofits and political candidates continue to rely on results produced by telephone surveys to introduce products, make decisions and craft marketing messages. However, the savviest users of research recognize the problems facing telephone surveys and are pushing pollsters to solve them.

The most obvious challenge is the exploding use of cell phones, which has led many people to abandon their landline telephones. This trend is especially prevalent among younger people and African-Americans. Failing to include cell phone users for a telephone survey can lead to a skewed sample that under-represents those cohorts.

One fix is to combine random digit dialing with a random sample of listed phone numbers. This increases the potential to reach people with unlisted or newly listed numbers, as well as cell phone users. Another strategy is to team telephone interviews with web-based interviews aimed specifically at hard-to-reach target audiences.
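The dual-frame fix described above can be sketched simply: draw part of the sample from a listed-number frame and generate the rest by random digit dialing. The area code, number format and 50/50 split below are illustrative assumptions, not a prescribed design.

```python
import random

def rdd_number(rng, area_code="503"):
    """Random digit dialing: generate a random local number.
    The area code and US-style format are illustrative assumptions."""
    return f"{area_code}-{rng.randint(200, 999)}-{rng.randint(0, 9999):04d}"

def draw_sample(listed_numbers, n_total, rdd_share=0.5, seed=None):
    """Blend a random draw from a listed-number frame with RDD numbers,
    so unlisted, newly listed and cell phone numbers can enter the sample."""
    rng = random.Random(seed)
    n_rdd = int(n_total * rdd_share)
    listed_part = rng.sample(listed_numbers, n_total - n_rdd)
    rdd_part = [rdd_number(rng) for _ in range(n_rdd)]
    return listed_part + rdd_part
```

The listed draw keeps the sample anchored to known households, while the RDD portion gives unlisted and cell phone numbers a chance of being called.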

The next challenge is to get people to answer their phone. Caller ID allows people to filter their calls and call back only the people they want to talk with. As more people adopt the technique of not answering their phone unless they know who is calling, telephone surveys now require many more calls to achieve a representative sample. More calls translate into more time and more expense.

Pollsters have learned that even when someone picks up the phone and agrees to answer survey questions, the window of opportunity is getting shorter. Once upon a time, respondents may have been willing to submit to an interview lasting a half hour. Nowadays, their patience peters out after 10 minutes. That means surveys must contain fewer questions and fewer complex or open-ended questions. Researchers need to use that window wisely, with well-crafted, clear and revealing questions.

We live in an era where people have become reliant on visual media – whether in the form of images or text messages. It isn't as natural as it once was to answer a series of questions over the phone. Complicating the problem, some interviewers employed by pollsters have accents or speak English as a second language. Clarity can suffer.

Despite the challenges, telephone surveys can still do the job – if done right. That means insisting on a representative sample, employing new technology and techniques to reach often under-represented populations and integrating with web-based surveys. For the harried person working two jobs and trying to raise a family, responding to a telephone survey call that may come at dinnertime just isn't in the cards. A web-based survey allows those and other people to answer questions on their own schedule.

The argument that counts isn't whether telephone surveys are better than their online counterparts. What matters is the skill in executing a survey to achieve high-confidence results, regardless of what channel is used. Hire the research professional who is committed to getting accurate findings, not wedded to a technique or tool.