Public opinion polling

As Telephone Surveys Fade, Online Research Remains an Option

Fewer Americans are willing to answer their phones to participate in telephone public opinion surveys, which poses a big problem for political operatives who use results to fashion campaign strategies. As pollsters scramble for alternatives, online research stands out as a viable and valuable option.

Telephone surveys have been the gold standard for public opinion polling for decades. That’s about to change.

“Fewer Americans than ever are willing to pick up the phone and talk to pollsters, sending costs skyrocketing to roughly double what they were four years ago,” writes Steven Shepard on Politico.

Pollster Scott Keeter told fellow pollsters recently that telephone surveys are in “wheezing condition” and efforts to find a suitable replacement are like “a great party on the deck of the Titanic.”


These sober assessments about the ill health of public opinion polling come on the eve of the 2020 presidential election and have many political operatives scrambling to find sources of reliable information on which to base campaign strategies. 

The slow fade of telephone surveys isn’t really news. CFM’s resident researcher, Tom Eiland, explains, “Challenges with phone surveys started with the use of caller ID and voice mail, then Do Not Call lists and really accelerated with the use of cell phones and smartphones.”  

“Telephone surveys have been a great tool that produced high-confidence findings when representative samples were achieved,” Eiland says. “However, telephone use has gone digital and polling has to adjust to that reality.”

Eiland noted that CFM's research sample designs have adapted as respondent behavior changed.

For general population and voter surveys, Eiland recommends using multi-modal sample designs. “This entails using a combination of telephone interviews and online web-based surveys,” he explained. Telephone numbers and email addresses are acquired from trusted third-party vendors to make the combined sample random.

“The trick,” Eiland said, “is to use sample quotas for demographic characteristics, such as age, gender and area, to ensure survey participants are representative of the community.”
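Mechanically, quota sampling amounts to keeping a counter per demographic cell and closing each cell once it fills. The sketch below illustrates the idea with invented quotas and a simulated respondent stream; these are not CFM's actual targets or screening method.

```python
import random

# Hypothetical quotas for a 400-person sample, sized to mirror the
# community's age/gender mix (illustrative numbers only).
quotas = {("18-34", "F"): 60, ("18-34", "M"): 58,
          ("35-54", "F"): 72, ("35-54", "M"): 70,
          ("55+",   "F"): 74, ("55+",   "M"): 66}
accepted = {cell: 0 for cell in quotas}

def screen(respondent):
    """Accept a respondent only while their demographic cell is still open."""
    cell = (respondent["age"], respondent["gender"])
    if accepted.get(cell, 0) < quotas.get(cell, 0):
        accepted[cell] += 1
        return True
    return False  # quota full: more completes would over-represent this cell

# Simulate a stream of mixed phone/online respondents and screen them.
random.seed(1)
stream = [{"age": random.choice(["18-34", "35-54", "55+"]),
           "gender": random.choice(["F", "M"])} for _ in range(5000)]
sample = [r for r in stream if screen(r)]
```

However the phone and online respondents arrive, the screened sample ends up matching the quota targets cell by cell, which is what makes the combined sample representative.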

 

Plunging Participation Rates Plague Telephone Surveys

Robocalls, caller ID and impatience with dinner-time calls have shrunk the number of people willing to be respondents for telephone public opinion surveys. Pew Research and others have shifted to online panel research, an alternative CFM has recommended for years.

Response rates to telephone public opinion surveys continue to decline, making them more expensive and less attractive than online panel research. We’ve been pointing to this trend for years. Now Pew Research confirms it.

The response rate on landline phones to survey calls was 36 percent in 1997. By 2018, it had fallen to 6 percent. The pool of willing telephone survey respondents has shrunk further recently because of the surge in robocalls. Caller ID also discourages people from answering calls from unfamiliar numbers and sometimes flags survey calls as spam.
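A back-of-envelope calculation shows why falling response rates translate into higher costs. The sketch below assumes, simplistically, a flat response rate per number dialed; real survey dispositions are more complicated.

```python
def numbers_needed(target_completes, response_rate):
    """Rough count of phone numbers to dial to reach a target number of
    completed interviews, assuming a flat response rate per number."""
    return round(target_completes / response_rate)

# Pew's cited landline response rates: 36% in 1997 vs. 6% in 2018
dials_1997 = numbers_needed(600, 0.36)  # about 1,700 numbers for 600 completes
dials_2018 = numbers_needed(600, 0.06)  # about 10,000 numbers for the same survey
```

On these rough numbers, the same 600-complete survey requires roughly six times as many dial attempts in 2018 as in 1997, which is the cost pressure Pew describes.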

This isn’t the end of public opinion research. Online panel research has represented an attractive and versatile alternative for some time. Participation rates tend to be higher, there is the ability to follow up with some or all respondents, and participants appear to be more candid online than on the phone. Participation is higher because respondents can answer survey questions when it is convenient for them, rather than when someone happens to call.


According to Pew Research, low response rates on telephone surveys, especially ones that include cell phones, don’t equate to lower accuracy of findings. The real impact is higher costs. “This reality,” Pew says, “often forces survey organizations to make trade-offs in their studies, such as reducing sample size, extending field periods or reducing quality in other ways in order to shift resources to the interviewing effort.” Those trade-offs can lessen confidence in results.

Lower participation rates on telephone surveys aren’t new. They have declined steadily since at least 1997, stabilized around 9 percent in 2013, then started plunging again in 2016. The decline has persuaded Pew to conduct most of its US polling online using its American Trends Panel.

CFM has recommended online panel research to skeptical clients. To ease that skepticism, we have benchmarked online findings against telephone survey results, showing the two are comparable.

As telephone survey participation rates have declined and sample sizes have been trimmed, panel research offers an affordable opportunity for larger sample sizes, often larger than even healthy telephone survey samples.

Larger sample sizes increase confidence in panel research findings by helping ensure samples are representative of the audience being polled. They have another practical value: they allow for greater segmentation of respondent results, which can be valuable in reading poll results. For example, in political polling, it is useful to have reliable results by congressional district as well as statewide.
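The payoff of larger samples can be seen in the textbook margin-of-error formula for a simple random sample. This is a simplification; real surveys also adjust for design effects and weighting, but it shows why a 200-person district slice is far noisier than a statewide sample.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a sample proportion (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# A healthy statewide sample vs. a 200-respondent congressional-district slice
statewide = margin_of_error(1600)  # roughly +/- 2.5 points
district = margin_of_error(200)    # roughly +/- 6.9 points
```

Quadrupling a subsample only halves its margin of error, which is why affordable large panels matter for reliable district-level reads.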

Maybe the greatest value of online research over telephone surveys is the ability to follow up with respondents. This can take the form of sharing findings, asking follow-up questions or seeking views on subsequent, related information.

[Infographic: CFM Panel Research]

Segmentation of panels allows segmented research. Follow-up questions can be directed at respondents based on their answers to questions. Online focus groups can be organized with respondents voicing a particular view. When we assisted Oregon officials in building a transportation funding proposal, we conducted online focus groups with respondents who expressed opposition to a gas tax increase, which produced useful information and an insightful dialogue among opponents that guided how the funding proposal was presented.

Telephone surveys have been a reliable research tool and still have utility. The ubiquity of cell phones, the surge of robocalls and the reluctance of people to interrupt dinner to answer survey questions are challenges that make telephone surveys a less effective option than before. The challenges are significant enough that panel research skeptics should put aside their doubts and talk to the firms that have spent time honing the use of online panel research.

Public Opinion Polls Stay Predictable in 2017 Election

Public opinion polling earned a black eye in the 2016 election cycle when most polls failed to predict Donald Trump’s presidential victory. Few changes in polling techniques were implemented before a handful of 2017 statewide elections, yet poll accuracy seems reconfirmed, at least for now. The X-factor of Trump wasn’t on the ballot.

Public opinion pollsters got a shiner in the 2016 election with off-base predictions about presidential and congressional elections. That may have signaled the need for major changes in technique, but that hasn’t happened, according to a story in The New York Times.

However, one unassuming change might right the ship. Pollsters are literally giving more weight in surveys to respondents’ level of education. Weighting respondents by education is far from easy: candidates don’t align perfectly along educational attainment. In 2016, because of the profile of the presidential candidates, educational levels mattered. That may not be so in future elections.
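Mechanically, weighting by education is a form of post-stratification: each respondent counts in proportion to how under- or over-represented their group is relative to the population. The shares below are invented for illustration, not actual census or survey figures.

```python
# Hypothetical education shares: the population (e.g. per census figures)
# vs. a raw sample that over-represents college graduates.
population = {"no_college": 0.60, "college_grad": 0.40}
sample_mix = {"no_college": 0.45, "college_grad": 0.55}

# Post-stratification weight = population share / sample share per group.
weights = {g: population[g] / sample_mix[g] for g in population}

def weighted_support(support_by_group):
    """Candidate support re-weighted so education groups match the population."""
    total = sum(sample_mix[g] * weights[g] for g in sample_mix)
    return sum(support_by_group[g] * sample_mix[g] * weights[g]
               for g in support_by_group) / total

# Illustrative: support splits sharply by education, as it did in 2016.
raw_support = {"no_college": 0.52, "college_grad": 0.40}
unweighted = sum(raw_support[g] * sample_mix[g] for g in raw_support)  # ~0.454
adjusted = weighted_support(raw_support)                               # ~0.472
```

In this invented example, the unadjusted sample understates the candidate’s support by nearly two points because it over-samples the group less favorable to them, which is exactly the kind of error education weighting is meant to correct.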

For pollsters who think big methodological changes are unnecessary, Virginia may prove them right. Hillary Clinton polled five or six points ahead of Donald Trump in the 2016 election; she eventually carried Virginia by 5.3 percentage points. Polling in the 2017 Virginia gubernatorial election held on Tuesday showed Democrat Ralph Northam leading his GOP counterpart Ed Gillespie by as little as 3 percentage points. With more than 80 percent of votes tallied, Northam posted a nearly 7-point lead.

Political polling is not a perfected science. Conscientious pollsters continuously look for factors that can skew results, such as the sea change from landline phones to cell phones, and adjust to account for them. A sample that excluded cell phones would under-represent young voters, minorities and people who work more than one job.

Trump’s largely unexpected victory in 2016 confounded many pollsters and led to serious questioning of polling techniques. Did pollsters conduct late surveys to capture voters who decided at the last minute? How did pollsters compensate for respondents who intended to vote for Trump, but didn’t want to say so publicly? Did surveys fully take into account more remote areas, which went strongly in Trump’s direction? And how do you accurately predict turnout, not just overall, but by key constituencies that can determine whether one candidate wins or loses?

Challenges to getting accurate polling results may be intensifying as the electorate becomes more polarized, which is a hard factor to measure. While educational level may be an obvious factor to include, figuring out how, and whether, it reliably indicates voting behavior isn’t so obvious.

Politicians and news media put more stock in public opinion polling than voters do. They are the ones who pay for it and, in varying degrees, expect polling results to reflect reality. Voters have no such expectations or fealty to polling results. If anything, polling results can incite small groups of voters to go to the polls or stay home, to vote one way or the other.

When all is said and done, polls don’t matter. Elections matter. Hillary Clinton led in the polls, but lost the election. Donald Trump sleeps in the White House. Clinton sleeps in hotels on her book tour explaining how she lost an election she thought she would win.

History may show 2016 was an aberration in polling accuracy. Pre-election polls proved out in this week’s gubernatorial elections in New Jersey and Virginia. No curveballs, even though Gillespie in Virginia did his best to imitate the political bombast of Trump.

While the gubernatorial election outcome may give pause to Republicans standing for re-election in 2018, the predictability of public opinion polls in this cycle may reassure the buyers of political polling to keep investing.