Political polling has taken another lick this election, but one poll predicted the outcome accurately, using techniques pollsters often diss.
The USC/Los Angeles Times tracking poll showed Donald Trump gaining momentum heading into the election, which contradicted the prevailing polling that showed the race tightening, but breaking in Hillary Clinton’s direction.
The tracking poll technology used in the 2016 election wasn’t experimental or new. It had been employed in 2012 on behalf of RAND Corp. and accurately projected President Obama’s re-election.
The two techniques that distinguished the USC/LA Times tracking poll were:
- Respondents were asked on a 0-to-100 scale how likely they were to vote for a particular candidate; and
- Respondents were part of a 3,200-person panel, not randomly selected each time a survey was conducted.
“Lots of people don’t know for sure how they’re going to vote,” the LA Times reported about the polling technique. “Forcing them to choose before they are ready can distort what they’re thinking. Asking people to estimate the probability of voting for one or the other captures their ambivalence more accurately. Asking people to estimate their chance of voting allows us to factor in information from everyone in the sample.”
Here is the punchline that may explain why traditional surveys missed the Trump surge that carried him to victory. “By contrast, polls that used a likely-voter screen can miss a lot of people who won’t meet the likely-voter test, but who in the end really do vote.”
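The difference between the two approaches can be sketched in a few lines. This is a hypothetical illustration, not the poll's actual methodology: the respondent data is made up, and the 60-point likely-voter cutoff is an assumption chosen to show how a hard screen can drop marginal voters that probability weighting retains.

```python
# Hypothetical respondents: each gives a 0-100 chance of voting at all
# ("turnout") and of backing each candidate. Data is invented for illustration.
respondents = [
    {"turnout": 90, "trump": 80, "clinton": 10},
    {"turnout": 40, "trump": 70, "clinton": 20},  # fails a likely-voter screen
    {"turnout": 95, "trump": 20, "clinton": 75},
    {"turnout": 55, "trump": 60, "clinton": 30},  # fails the screen, may still vote
]

def weighted_share(panel, candidate):
    """USC/LA Times-style estimate: weight each respondent's candidate
    probability by that respondent's own turnout probability."""
    num = sum(r["turnout"] / 100 * r[candidate] / 100 for r in panel)
    den = sum(r["turnout"] / 100 for r in panel)
    return num / den

def screened_share(panel, candidate, cutoff=60):
    """Traditional-style estimate: keep only respondents who clear a
    likely-voter cutoff (an assumed threshold), then average the rest."""
    likely = [r for r in panel if r["turnout"] >= cutoff]
    return sum(r[candidate] / 100 for r in likely) / len(likely)
```

With this invented panel, the screened estimate discards the two lukewarm Trump leaners entirely, while the weighted estimate counts them in proportion to their stated chance of voting.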
Using a panel rather than a freshly selected random sample for each survey is a tried-and-true market research practice, but less common in public opinion polling. Its advantage is that the survey tracks the evolving views of the same people.
The USC/LA Times tracking poll drew 450 people daily from the larger panel and surveyed them. Respondents were given up to seven days to respond. “Each day, we post results that are an average of the previous seven days of responses,” according to the LA Times. “Between those two factors – people taking up to seven days to respond and averaging seven days of results – the impact of an event might not be completely reflected in the poll for as long as two weeks.”
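The seven-day smoothing the paper describes is a plain rolling average. A minimal sketch, using invented daily numbers (the series below is not real polling data):

```python
from collections import deque

# Invented daily vote-share readings, for illustration only.
daily_results = [44.0, 44.5, 45.0, 44.8, 46.0, 47.5, 48.0, 48.2, 48.5]

def rolling_average(series, window=7):
    """Average each day's value with up to the previous `window - 1` days,
    so a sudden shift in the raw numbers surfaces only gradually."""
    out = []
    buf = deque(maxlen=window)  # holds at most the last `window` readings
    for value in series:
        buf.append(value)
        out.append(sum(buf) / len(buf))
    return out
```

The smoothing is why the LA Times warns that an event's full impact may take up to two weeks to appear: a one-day jump contributes only one-seventh of its weight to each published figure, and late-arriving responses stretch the lag further.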
“One of the problems polls face is that sometimes partisans on one side are more enthusiastic about responding to questions than those on the other side,” the LA Times explained. “Maybe their candidate has a particularly good week or the opposing candidate has had a bad one. When that happens, polls can suddenly shift simply because of who is willing to respond to a pollster’s call, which is called differential response.”
“Using a panel of the same people,” the newspaper added, “can ensure that when the poll results change, the shift reflects individuals changing their minds.”