As Telephone Surveys Fade, Online Research Remains an Option

Fewer Americans are willing to answer their phones to participate in telephone public opinion surveys, which poses a big problem for political operatives who use results to fashion campaign strategies. As pollsters scramble for alternatives, online research stands out as a viable and valuable option.

Telephone surveys have been the gold standard for public opinion polling for decades. That’s about to change.

“Fewer Americans than ever are willing to pick up the phone and talk to pollsters, sending costs skyrocketing to roughly double what they were four years ago,” writes Steven Shepard on Politico.

Pollster Scott Keeter told fellow pollsters recently that telephone surveys are in “wheezing condition” and efforts to find a suitable replacement are like “a great party on the deck of the Titanic.”


These sober assessments about the ill health of public opinion polling come on the eve of the 2020 presidential election and have many political operatives scrambling to find sources of reliable information on which to base campaign strategies. 

The slow fade of telephone surveys isn’t really news. CFM’s resident researcher, Tom Eiland, explains, “Challenges with phone surveys started with the use of caller ID and voice mail, then Do Not Call lists and really accelerated with the use of cell phones and smartphones.”  

“Telephone surveys have been a great tool that produced high-confidence findings when representative samples were achieved,” Eiland says. “However, telephone use has gone digital and polling has to adjust to that reality.”

Eiland noted CFM’s research sample designs adapted as respondent behavior changed. 

For general population and voter surveys, Eiland recommends using multi-modal sample designs. “This entails using a combination of telephone interviews and online web-based surveys,” he explained. Telephone numbers and email addresses are acquired from trusted third-party vendors to make the combined sample random.

“The trick,” Eiland said, “is to use sample quotas for demographic characteristics, such as age, gender and area, to ensure survey participants are representative of the community.”
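Quota management of this kind can be sketched in a few lines: each incoming respondent is accepted only if every demographic cell they fall into still has room. The categories, proportions and target size below are illustrative assumptions, not CFM's actual sample design.

```python
from collections import Counter

# Hypothetical quota targets for a community survey (illustrative numbers).
TARGET_N = 400
QUOTAS = {
    ("age", "18-34"): 0.30,
    ("age", "35-54"): 0.35,
    ("age", "55+"): 0.35,
    ("gender", "female"): 0.51,
    ("gender", "male"): 0.49,
}

def quota_open(counts: Counter, respondent: dict) -> bool:
    """Accept a respondent only if every quota cell they fall into
    still has room; otherwise screen them out."""
    for dim, value in respondent.items():
        cap = QUOTAS.get((dim, value))
        if cap is not None and counts[(dim, value)] >= cap * TARGET_N:
            return False
    return True

counts: Counter = Counter()
respondent = {"age": "18-34", "gender": "female"}
if quota_open(counts, respondent):
    for cell in respondent.items():  # one tally per demographic dimension
        counts[cell] += 1
print(counts)
```

In practice the same check runs across telephone and web modes, so the combined sample fills the quotas rather than each mode separately.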


Plunging Participation Rates Plague Telephone Surveys

Robocalls, caller ID and impatience with dinner-time calls have shrunk the number of people willing to be respondents for telephone public opinion surveys. Pew Research and others have shifted to online panel research, an alternative CFM has recommended for years.

Response rates to telephone public opinion surveys continue to decline, making them more expensive and less attractive than online panel research. We’ve been pointing to this trend for years. Now Pew Research confirms it.

The response rate on landline phones to survey calls was 36 percent in 1997. By 2018, it had fallen to 6 percent. The pool of potential telephone survey respondents has shrunk further recently because of the surge in robocalls. Phones with caller ID also discourage people from answering unfamiliar numbers and sometimes flag survey calls as spam.

This isn’t the end of public opinion research. Online panel research has represented an attractive and versatile alternative for some time. Participation rates tend to be higher, researchers can follow up with some or all respondents, and participants appear to be more candid online than on the phone. Participation is higher because respondents can answer survey questions when it is convenient for them, as opposed to when someone calls on the phone.


According to Pew Research, low response rates on telephone surveys, especially ones that include cell phones, don’t equate to lower accuracy of findings. The real impact is higher costs. “This reality,” Pew says, “often forces survey organizations to make trade-offs in their studies, such as reducing sample size, extending field periods or reducing quality in other ways in order to shift resources to the interviewing effort.” Those trade-offs can lessen confidence in results.

Lower participation rates on telephone surveys aren’t new. They have steadily declined since at least 1997. Rates stabilized around 9 percent in 2013, then started plunging again in 2016. Lower participation rates have persuaded Pew to conduct most of its US polling online using its American Trends Panel.

CFM has recommended online panel research to skeptical clients. To ease skepticism, we have benchmarked online results with results from telephone surveys, showing that results are comparable. 

As telephone survey participation rates have declined and sample sizes have been trimmed, panel research offers an affordable opportunity for larger sample sizes, often larger than even healthy telephone survey samples.

Larger sample sizes can increase the confidence level for panel research by helping ensure the samples are representative of the audience being polled. Larger sample sizes have another practical value – they allow for greater segmentation of respondent results, which can be valuable in reading poll results. For example, in political polling, it is useful to have reliable results by congressional district as well as statewide.
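The sample-size point can be made concrete with the standard margin-of-error formula for a proportion. The statewide and district sample sizes below are hypothetical, chosen only to show how confidence tightens as samples grow.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of a 95 percent confidence interval for a proportion,
    assuming a simple random sample (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical statewide sample of 2,000 vs. a 250-person district slice:
print(f"{margin_of_error(2000):.1%}")  # about 2.2% statewide
print(f"{margin_of_error(250):.1%}")   # about 6.2% in one district
```

This is why segmenting results by congressional district demands a much larger total sample than a single statewide read does.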

Maybe the greatest value of online research over telephone surveys is the ability to follow up with respondents. This can take the form of sharing findings, asking follow-up questions or seeking views on subsequent, related information.

[Infographic: CFM Panel Research]

Segmentation of panels allows segmented research. Follow-up questions can be directed at respondents based on their answers to questions. Online focus groups can be organized with respondents voicing a particular view. When we assisted Oregon officials in building a transportation funding proposal, we conducted online focus groups with respondents who expressed opposition to a gas tax increase, which produced useful information and an insightful dialogue among opponents that guided how the funding proposal was presented.

Telephone surveys have been a reliable research tool and still have utility. The ubiquity of cell phones, the surge of robocalls and the reluctance of people to interrupt dinner to answer survey questions are challenges that make telephone surveys a less effective option than before. The challenges are significant enough that panel research skeptics should put aside their doubts and talk to the firms that have spent time honing the use of online panel research.

One-on-One Interviews: The Rodney Dangerfield of Research

One-on-one interviews can produce uniquely informative and insightful findings.

One-on-one interviews are the Rodney Dangerfield of research. They don’t get the respect they deserve, even though they can produce uniquely informative and insightful findings.

As a form of qualitative research, one-on-one interviews can penetrate issues more effectively than focus groups. One-on-one interviews are more conversational and flexible; potential participants are selected precisely, and the one-on-one environment yields candid insights.

The advantage of one-on-one interviews lies in who is interviewed. One-on-one interviews typically are scheduled from lists of customers, key stakeholders, managers or elected officials. In most cases, potential participants are recognized as “influentials” who impact opinions of others.

Participation rates are high even though you are targeting a very specific group of busy people. Why? The interviews are arranged to fit participants’ schedules, rather than at a specific day, time and place chosen for the client’s convenience. Also, key stakeholders like sharing their opinions, especially when assured comments are not for attribution.

Focus groups and one-on-one interviews both rely on discussion guides to propel discussion. Focus groups allow researchers – and clients – to observe a group reaction to a discussion guide consisting of questions, value propositions, logos or advertising messages. One-on-one interviews are more like confessionals, where subjects feel comfortable sharing their personal beliefs and attitudes. You get unfiltered viewpoints directly from the people you interview.

With these virtues, why do clients purse their lips when asked about one-on-one interviews? Maybe they doubt how 20 well-conceived one-on-one interviews with a representative sample can outdo 500 randomly selected telephone surveys or a series of well-facilitated focus groups. They should erase their doubts and have faith. One-on-one interviews can deliver the goods.

Here are some excellent uses of one-on-one interviews:

Confirming alignment on objectives: One-on-one interviews are a discreet way to see whether your managers or board members are in sync with a new overarching policy or strategic plan and, if not, to learn why not. Using a skilled third-party interviewer who will treat the interviews confidentially can generate a wealth of candid observations. Employing one-on-one interviews before full implementation can save a lot of frustration and embarrassment.

Floating trial balloons: If you have a radical idea, one-on-one interviews can give you an advance read on how a defined audience will regard your out-of-the-box concept. The interviews will expose the most salient arguments opposing your idea and reveal strongest arguments supporting it. Findings can provide clues as to whether your trial balloon will soar or crash. More important, findings offer bread crumbs of how to proceed to avoid a crash.

Evaluating New Branding: Creating a new name, logo and visual identity is at its core subjective. One-on-one interviews can triangulate some perspective from stakeholders, customers or competing brand managers. Findings won’t magically produce a name, logo or visual identity, but can point to a productive direction and identify some key concepts. Findings also can warn of dead ends or bad ideas, which can save a lot of wasted time, energy and money.

Auditing media attitudes: Media audits can be valuable ways to assess relationships with reporters and editors who cover your business, products and services. The most effective way to conduct media audits is through one-on-one interviews. A third party, preferably someone with his or her own rapport with reporters and editors, can fetch the most candid observations and useful suggestions for improving media relationships.

Tapping Influencer insights: People who influence the behavior, preferences and consumer choices of others can be a valuable source of insight. One-on-one interviews may be the only viable way to capture that insight. Coincidentally, the outreach can establish or enhance relationships with key influencers.

Sampling diverse perspectives: Diversity and inclusion are increasingly important in organizations, but that priority doesn’t always scale down to understanding diverse perspectives within a group or team. One-on-one interviews with a diverse range of employees can suss out subtle and not-so-subtle differences in perspective. That knowledge can lead to greater cohesion in a unit and broader understanding of the range of viewpoints and cultural lenses in an organization.


Why Representative Samples Really Matter

If you want market research that matters, make sure the sample of people in your survey matches the audience you want to reach with your product or message.

A favorite story involves meeting with a client interested in promoting first-time homeownership. I mentioned the need for market research. No problem, the client said, we have that covered. I was handed the research summary and, as a matter of habit, jumped to the page about the telephone survey sample. It was very revealing. 

More than 50 percent of the respondents were 65 years or older. They were the majority of people who answered the phone and were willing to spend 15 or 20 minutes talking to a stranger about owning a home. Unfortunately, they weren’t the people the client had in mind as first-time homebuyers. 

Survey data is worthless unless the sample of people you interview reflects the audience you seek to reach. The sample in my client’s survey would have been terrific if the subject were reverse mortgages. It stunk as a guide to reaching potential first-time homebuyers.

Conversations between clients and research professionals must start with who to interview. If you have the wrong sample, the answers you get from the questions you pore over won’t matter a lick. 

Too often, the question of who to interview is glossed over. Sometimes the most obvious sample goes overlooked. When I was a lobbyist, a client hired me to “fix” his message that wasn’t gaining any traction with legislators. I started by interviewing about a third of the legislature, including virtually all of the lawmakers on the committees that were most engaged on my client’s issue. 

The interviews produced a wealth of insight. My client’s issue had latent support, but needed to be explained and demonstrated in a far different way. Lawmakers basically wrote the script my client and I used to lobby them. And it worked. 

Representative samples are harder to achieve for a mix of reasons. For example, increasing numbers of people don’t have landline phones and, if they do, they shield themselves from unsolicited calls with Caller ID. It takes a lot more calls, at greater expense, to collect a representative sample. Market research must cope with growing segmentation, which adds extra layers of complexity in selecting the right group of people to survey. 
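One common corrective when an achieved sample skews, as in the homeownership story above, is post-stratification weighting: each respondent is weighted by the ratio of their group's population share to its share of the sample. The shares below are made-up numbers for illustration only.

```python
# Post-stratification weighting: over-represented groups are weighted down,
# under-represented groups up. All shares here are illustrative assumptions.
population_share = {"18-64": 0.80, "65+": 0.20}  # assumed census shares
sample_share     = {"18-64": 0.48, "65+": 0.52}  # achieved sample shares

weights = {g: population_share[g] / sample_share[g] for g in population_share}
for group, w in weights.items():
    print(f"{group}: each respondent counts as {w:.2f} interviews")
```

Weighting narrows the gap, but it cannot conjure opinions the sample never collected – which is why getting the sample design right up front still matters most.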

The value of representative samples goes beyond quantitative research. Focus groups must be representative, too. And why would you do a customer satisfaction intercept survey for Nordstrom by interviewing people coming out of a rival department store? Representative samples matter in public opinion polling. A poll of New York voters wouldn’t be all that useful in projecting election results in Indiana. 

Despite the difficulty, solid research is grounded on good samples. Who you talk to matters if you want findings that mean something for your marketing.  

Gary Conkling is president and co-founder of CFM Strategic Communications, and he leads the firm's PR practice, specializing in crisis communications. He is a former journalist who later worked on Capitol Hill and represented a major Oregon company. But most importantly, he’s a die-hard Ducks fan. You can follow Gary on Twitter at @GaryConkling.

Political Polling Validity Becomes Shaky

Political polling is getting less reliable in predicting actual election outcomes. Reasons include the growing use of cell phones, reluctance to participate in telephone surveys and the rising cost of representative research samples.

Political polling doesn't seem to be as spot on as it used to be. Greater use of cell phones, wariness to participate in surveys and unrepresentative samples are among the reasons that political polls and election results turn out differently.

Cliff Zukin, a Rutgers political science professor and past president of the American Association for Public Opinion Research, writes in the New York Times that "polls and pollsters are going to be less reliable," so voters and the news media should beware.

"We are less sure how to conduct good survey research now than we were four years ago, and much less than eight years ago," says Zukin. "Don't look for too much help in what the polling aggregation sites may be offering. They, too, have been falling further off the track of late. It's not their fault. They are only as good as the raw material they have to work with."

Polling failures were exposed in the largely undetected 2014 midterm election sweep in which Republicans captured both houses of Congress, Prime Minister Benjamin Netanyahu's solid victory in Israel and British Prime Minister David Cameron's relatively easy re-election win.

Cell phones are everywhere and increasingly have replaced landline telephones. Pollsters can find cell phone numbers, but federal law prevents calling them with automatic dialers. According to Zukin, "To complete a 1,000-person survey, it's not unusual to have to dial more than 20,000 random numbers, most of which do not go to working telephone numbers." That adds budget-busting cost to telephone surveys, which in turn leads to "compromises in sampling and interviewing."
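Zukin's 20,000-dials figure follows from multiplying a few attrition rates together. A back-of-the-envelope sketch, with assumed rates (not measured figures) chosen so the arithmetic matches his example:

```python
import math

def dials_needed(completes: int, working_rate: float, contact_rate: float,
                 cooperation_rate: float) -> int:
    """Rough count of random-digit dials needed to hit a target number of
    completed interviews; each rate is a fraction of the previous stage."""
    yield_per_dial = working_rate * contact_rate * cooperation_rate
    return math.ceil(completes / yield_per_dial)

# Assumed rates: 40% of random numbers working, 25% of those answered,
# 50% of answerers agreeing to be interviewed.
print(dials_needed(1000, 0.40, 0.25, 0.50))  # 20000
```

Shave any one of those rates further and the dial count, and the budget, climbs in direct proportion.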

Response rates to surveys have declined precipitously. In the 1970s, Zukin says, an 80 percent response rate was considered acceptable. Now response rates have dipped below 10 percent. It is hard to draw a representative sample when large chunks of the population refuse to participate. Some cohorts, such as lower-income household members, are less likely to participate than others, which can skew results. And it takes more calls to achieve a representative sample, which encourages corner-cutting.

Internet polling has emerged as a strong alternative. It is cheaper than telephone surveys and, at least for the moment, people seem more willing to participate, in part because they have more choice in when and how to respond.

But Internet use has built-in biases, too, Zukin notes. While 97 percent of people between the ages of 18 and 29 use the Internet, 40 percent of adults older than age 65 don't. "Almost all online election polling is done with nonprobability samples," Zukin says, which makes it impossible to calculate a margin of error.

The most vexing polling problem is not a new one – determining who will actually vote. Public opinion polling is one thing; trying to predict the outcome of an actual election is another. Pollsters recognize that respondents will overstate their likelihood of actually voting, but have limited ability to identify who will and who won't cast ballots.

Nonvoting can occur for a mix of reasons – bad weather, lack of interest or political protest. Some registered voters simply forget to vote, especially in non-presidential elections. Less motivated voters vote in top-line races and leave the rest of their ballots blank, making it hard to predict the "turnout" for so-called down-ballot candidates and ballot measures.

Scott Keeter, who directs survey research at Pew Research, says the combination of these factors is shifting political polling "from science to art."

Political polls will continue to be magnets for media coverage, but readers should be aware that the results may not have as much validity as polling in the past.

Death of Telephone Surveys Exaggerated

Telephone surveys face challenges, but aren't dead. When done right, using new techniques and technologies, they still can produce reliable research results.

Telephone survey research isn't dead, but it is undergoing some serious surgery. This reliable research tool faces challenges that have forced work-arounds, new techniques and partnerships.

Businesses, nonprofits and political candidates continue to rely on results produced by telephone surveys to introduce products, make decisions and craft marketing messages. However, the savviest users of research recognize the problems facing telephone surveys and are pushing pollsters to solve them.

The most obvious challenge is the exploding use of cell phones, which has led many people to abandon their landline telephones. This trend is especially prevalent among younger people and African-Americans. Failing to include cell phone users for a telephone survey can lead to a skewed sample that under-represents those cohorts.

One fix is to combine random digit dialing with a random sample of listed phone numbers. This increases the potential to reach people with unlisted or newly listed numbers, as well as cell phone users. Another strategy is to team telephone interviews with web-based interviews aimed specifically at hard-to-reach target audiences.

The next challenge is to get people to answer their phone. Caller ID allows people to filter their calls and call back only the people they want to talk with. As more people adopt the technique of not answering their phone unless they know who is calling, telephone surveys now require many more calls to achieve a representative sample. More calls translate into more time and more expense.

Pollsters have learned that even when someone picks up the phone and agrees to answer survey questions, the window of opportunity is getting shorter. Once upon a time, respondents may have been willing to submit to an interview lasting a half hour. Nowadays, their patience peters out after 10 minutes. That means surveys must contain fewer questions and fewer complex or open-ended questions. Researchers need to use the window of opportunity wisely, with well-crafted, clear and revealing questions.

We live in an era when people have become reliant on visual media – whether in the form of images or text messages. It isn't as natural as it once was to answer a series of questions on the phone. Complicating the problem, some interviewers employed by pollsters have heavy accents or speak English as a second language. Clarity can suffer.

Despite the challenges, telephone surveys can still do the job – if done right. That means insisting on a representative sample, employing new technology and techniques to reach often under-represented populations and integrating with web-based surveys. For the harried person working two jobs and trying to raise a family, responding to a telephone survey call that may come at dinnertime just isn't in the cards. A web-based survey allows those and other people to answer questions on their schedule.

The argument that counts isn't whether telephone surveys are better than their online counterparts. What matters is the skill in executing a survey to achieve high-confidence results, regardless what channel is used. Hire the research professional who is committed to getting accurate findings, not wedded to a technique or tool.