Pollsters Struggle To Regain Credibility After Faulty 2016 Predictions


U.S. President-elect Donald Trump defied the predictions of most pollsters this year when he beat his Democratic rival Hillary Clinton to take the White House (file photo)

For the political polling industry, 2016 was a year of faulty predictions that undermined the credibility of public opinion research.

As a result, the professional associations for pollsters in both Britain and the United States have launched investigations into why so many wrongly predicted 2016 election outcomes.

The American Association for Public Opinion Research is analyzing why almost every major U.S. poll incorrectly predicted Democrat Hillary Clinton would win the presidency and underestimated support for U.S. President-elect Donald Trump.

The British Polling Council is investigating why many of its members failed to predict a majority of voters in the June 23 Brexit referendum would choose to leave the European Union.

Meanwhile, the Public Relations and Communications Association (PRCA) -- a trade association for Britain's public relations sector -- has announced its own inquiry because of what it says is a loss of public confidence in political predictions.

PRCA Director-General Francis Ingham says the industry must "proactively address a perceived problem among those who work in politics: so many called both results wrong."

So what went wrong? Although the results of the studies are not expected until May 2017, pollsters are already identifying the potential problems.

'Nonresponse Bias'

During the U.S. election campaign, Trump repeatedly criticized mainstream media as being "biased" against him and dismissed most public opinion research as "dirty polls" that just put out "phony numbers."

The nonpartisan Washington-based Pew Research Center says distrust among Trump supporters toward media and media-funded polls probably caused a "nonresponse bias" in telephone surveys and new online methodologies that are being tested.

Previous research shows that less-educated white voters are less likely than university graduates to answer the questions of pollsters over the telephone.

Crucially, less-educated white voters heavily supported Trump in 2016.

Research also suggests that those who've voted infrequently or not at all in the past are not as likely to respond to pollsters' questions.

Claudia Deane, Pew's vice president of research, told RFE/RL that Trump's campaign also appears to have attracted support from people who didn't vote in 2012. "The frustration and anti-institutional feelings that drove the Trump campaign may also have aligned with an unwillingness to respond to polls," Deane said.

"The result would be a strongly pro-Trump segment of the population that simply did not show up in the polls in proportion to their actual share of the population," she said.

The 'Shy Voter' Effect

A week before the U.S. election, when most polls predicted Clinton would win, Republican campaigners insisted a significant number of "secret Trump voters" did not want to admit to pollsters that they supported Trump.

Some critics suggest that backlash on social media may have bolstered the number of "shy voters" who were cautious about disclosing their views to a stranger over the telephone or in an online survey.

U.S. President Richard Nixon popularized a similar concept in 1969, using the phrase "the great silent majority" to describe American voters who supported him but did not express their opinions publicly.

Deane says "the hypothesis of shy Trump voters" is "the big cross-national question" that the American pollsters' association is trying to answer.

But she says she is skeptical because her own research and observations have not turned up any evidence supporting the notion "up to this point."

Like their American counterparts in the U.S. presidential election, U.K. pollsters got it completely wrong in the country's Brexit referendum, in which voters opted out of the European Union (file photo).

In Britain, polling companies use the term "shy Tory factor" to describe the same "shy voter" phenomenon, which political scientists observed in the country's general elections of 1992 and 2015.

Some claim there was a "shy Tory factor" before the 2016 Brexit vote.

But John Curtice, the president of the British Polling Council, told RFE/RL there is no consistent evidence that "shy voter bias" significantly impacted opinion surveys in the United Kingdom ahead of the Brexit referendum.

Curtice also said there is "no substantial evidence" that conservative voters, globally, are more likely to refuse to admit their political sympathies to pollsters because they don't want to appear "socially undesirable."

Identifying Likely Voters

Exit polls conducted outside polling stations on Election Day usually are more accurate than opinion surveys conducted ahead of the vote. That's because exit polls survey people who have actually cast a ballot.

Pre-election polls have an added burden of trying to determine whether randomly sampled respondents will actually vote on Election Day.
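One simple way to handle that burden, sketched below with invented respondents and turnout probabilities (this is not any particular pollster's model), is to weight each stated preference by the respondent's estimated chance of actually voting; if those chances are misjudged for one candidate's supporters, the projected margin shifts with them.

```python
# Sketch of a likely-voter adjustment (respondents and probabilities are
# invented for illustration, not drawn from any actual poll).

respondents = [
    # (preferred candidate, assumed probability of voting)
    ("A", 0.90), ("A", 0.80), ("A", 0.85),
    ("B", 0.60), ("B", 0.95), ("B", 0.70), ("B", 0.50),
]

def projected_share(sample, candidate):
    """Turnout-weighted share of the expected vote going to `candidate`."""
    expected_votes = sum(p for _, p in sample)
    expected_for_candidate = sum(p for c, p in sample if c == candidate)
    return expected_for_candidate / expected_votes

for cand in ("A", "B"):
    print(f"Candidate {cand}: {projected_share(respondents, cand):.1%}")

# The raw head count favors candidate B four to three, but the turnout
# weighting narrows the projected margin; misjudging those probabilities
# for one side would skew the forecast further.
```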

Researchers appear to have wrongly estimated turnout within some U.S. demographics. For example, the willingness of Trump supporters to go to the polls appears to have been underestimated.

Conversely, researchers appear to have overestimated how many self-proclaimed Clinton supporters would cast a ballot in key states. Exit-poll data showed that in several battleground states, Clinton did worse than pollsters had predicted among Hispanic voters, women, and African Americans.

In the closely contested state of Florida, for example, turnout for Clinton by Hispanics and voters under the age of 30 was less than predicted. Meanwhile, Trump's support among Hispanic voters in Florida was higher than expected -- enough to tilt the crucial state in his favor.

Getting Things Right

The British Polling Council is adamant that, in the end, most U.S. pollsters were on the mark about the nationwide popular vote in 2016.

Final results show Clinton did win the popular vote with about 2.8 million more votes than Trump -- a margin of more than 2 percentage points.

That is about 1 percentage point off the lead that the polling average at Real Clear Politics, a Chicago-based political news aggregator, had estimated for Clinton -- well within the standard three-percentage-point margin of error.
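In round numbers -- taking total 2016 turnout to be roughly 137 million votes, an approximation used here only for illustration -- the comparison works out as follows.

```python
# Rough check of the popular-vote figures cited above (total turnout is
# an approximate figure used only for illustration).

clinton_lead_votes = 2.8e6   # Clinton's popular-vote lead, per final results
total_votes = 137e6          # approximate total votes cast in 2016

actual_margin = clinton_lead_votes / total_votes * 100  # in percentage points
polled_margin = actual_margin + 1.0  # aggregated estimate ran about 1 point higher
margin_of_error = 3.0                # standard margin of error cited above

print(f"Actual popular-vote margin: {actual_margin:.1f} points")
print(f"Aggregated poll estimate:   about {polled_margin:.1f} points")
print(f"Within +/-{margin_of_error:.0f} points? {abs(polled_margin - actual_margin) <= margin_of_error}")
```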

Curtice notes that the error of aggregated U.S. polls in 2016 was actually lower than the average error for presidential election polls from 1968 to 2012.

But the U.S. presidency is determined by the Electoral College, not the nationwide popular vote. So what really matters is predicting the winner in each state.

Trump's Electoral College victory rested on a combined margin of about 80,000 votes across three states -- and he won each of those decisive states by less than one percentage point.

Almost every major pollster got the results wrong in those critical, closely contested battleground states of Pennsylvania, Michigan, and Wisconsin.

Quality Of Polls 'Very Uneven'

Polls in other swing states usually were within the expected three-percentage-point margin of error.

But Deane says the fact that nearly all pollsters erred in the same direction in nine key swing states -- all wrongly predicting a Clinton victory -- suggests systemic problems. "They were all off in one direction, and when you're doing a random survey you should not see correlated errors like that," Deane said.
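A quick simulation makes the point. If nine state polls suffered only independent sampling noise, they would all miss in the same direction less than 1 percent of the time; add a shared systematic bias and the one-sided pattern becomes routine. The sketch below uses invented error sizes, not actual polling data.

```python
import random

# Monte Carlo sketch: how often do nine state polls all miss in the same
# direction if their errors are purely independent sampling noise?
# (All parameters here are illustrative assumptions.)

N_TRIALS = 100_000
N_STATES = 9

def all_same_direction(errors):
    """True if every polling error has the same sign."""
    return all(e > 0 for e in errors) or all(e < 0 for e in errors)

independent_hits = 0
correlated_hits = 0

for _ in range(N_TRIALS):
    # Independent case: each state's error is unbiased sampling noise.
    independent = [random.gauss(0, 1.5) for _ in range(N_STATES)]
    # Correlated case: the same noise plus a shared systematic bias
    # (for example, one group of voters underrepresented everywhere).
    shared_bias = random.gauss(2.0, 0.5)
    correlated = [e + shared_bias for e in independent]

    independent_hits += all_same_direction(independent)
    correlated_hits += all_same_direction(correlated)

print(f"Independent errors, all nine one-sided: {independent_hits / N_TRIALS:.2%}")
print(f"Shared bias added, all nine one-sided:  {correlated_hits / N_TRIALS:.2%}")
```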

"There's not always a lot of state polling, the quality of those polls is very uneven, and there's no real market incentive for people to do those polls," Dean explained.

As a result, Deane said the American polling association's investigation is "literally going to take the country apart, break it into 50 parts, and look at how polling did in every state."

The USC/Los Angeles Times poll was the only major U.S. poll that correctly predicted Trump's victory.

It was led by Arie Kapteyn, director of the University of Southern California's (USC) Dornsife polling center. Kapteyn told the Los Angeles Times that an unusual methodology helped his team assess "likely voters" and identify a pattern change in voter turnout that more traditional surveys missed.

Rather than ask which candidate respondents would vote for, Kapteyn's pollsters asked people to rate, on a scale of 0 to 100, their probability of voting for each candidate.
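As a simplified illustration of how such answers can be aggregated -- the respondents below are invented, and this is not the actual USC/Los Angeles Times model -- each person's 0-to-100 ratings can be treated as probabilities and combined into an expected vote share rather than a simple head count.

```python
# Simplified illustration of aggregating probabilistic poll responses
# (the respondents are invented; this is not the USC/LA Times methodology).

# Each respondent gives, on a 0-100 scale, the chance of voting at all
# and the chance of voting for each candidate.
respondents = [
    # (chance of voting, chance of choosing Trump, chance of choosing Clinton)
    (90, 80, 10),
    (60, 20, 70),
    (95, 55, 40),
    (40, 10, 85),
    (75, 65, 30),
]

expected_trump = sum(v / 100 * t / 100 for v, t, _ in respondents)
expected_clinton = sum(v / 100 * c / 100 for v, _, c in respondents)
expected_voters = sum(v / 100 for v, _, _ in respondents)

print(f"Expected Trump share:   {expected_trump / expected_voters:.1%}")
print(f"Expected Clinton share: {expected_clinton / expected_voters:.1%}")

# The two shares need not sum to 100 percent: respondents can leave some
# probability for other candidates or for remaining undecided.
```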

Polling researchers appear to have incorrectly estimated turnout within some crucial U.S. demographics (file photo).

Deane told RFE/RL that a review of Kapteyn's methodology will certainly be part of the American polling association's postelection study. But she said the investigation also will look at other factors such as how researchers drew their samples, whether surveys were conducted online or on the phone, and how other pollsters made "educated guesses" about who was likely to vote.

Curtice is critical of the USC/Los Angeles Times poll, noting that it wrongly predicted Trump would win the nationwide popular vote.

The real challenge for pollsters, according to Curtice, is that people are now "significantly less willing" to answer telephone pollsters. And Internet surveys, he said, rely on people who are willing to sign up. Both scenarios make it more difficult to obtain random samples that truly represent voting populations, he said.

At the Pew Research Center, Deane concludes there is "a great deal of speculation but no clear answers" about the cause of pollsters' erroneous 2016 predictions.

Deane says pollsters are aware that their profession "faces serious challenges" and that "restoring polling's credibility" is very important.

For that reason, pollsters in both the United States and Britain will be looking closely at the findings their professional associations release in May 2017.

