Well, "humble" may be going too far. Rationalizing their faulty results is more like it.
What did analyst Nate Silver do right? Among other things, he was painstakingly clear about the methods he was using. He was also openly and endearingly nervous about his results. No wonder: his calculations were necessarily based on the results of the... pollsters.
Pollsters like Frank Newport of Gallup? Well, Gallup's old battleship sank during the 2012 election and they're out there now trying to raise it off the sea floor and get it going again. Like most conventional pollsters, Newport hasn't made much of an effort to be clear about his methodology, tossing out mystifyingly vague hints in language that suggests he thinks he knows what he's doing.
Newport doesn’t think Gallup’s problem was missing the fact that the nation has growing Hispanic and Asian populations, however.
Instead, he and other pollsters point to the screening systems that determine a poll's likely voters, suggesting they may have been too stringent.
“We used a very tight screen,” said Suffolk University pollster David Paleologos. “We screened out everyone unless they said they were very likely to vote, so a looser screen might have single-handedly been the difference of screening out those Obama voters that turned out.”
Polling likely voters is supposed to be a more accurate measurement than polling registered voters, and most firms moved to polling likely voters in the two months before Election Day.
But in 2012, Gallup’s registered voter model was more accurate. It showed Obama with a 3 percentage point lead heading into Election Day, compared to a 1-point lead for Romney in the likely voter model.
Newport says Gallup might have underestimated the Obama campaign’s get-out-the-vote efforts – a criticism Team Obama voiced on a number of occasions throughout the cycle. Voters that sounded non-committal about heading to the polls, and were therefore taken out of Gallup’s likely voter survey, may have turned out on Election Day because of the Obama campaign’s sophisticated and micro-targeted grassroots efforts. ...The Hill
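The screening mechanism Paleologos describes is easy to illustrate. Here is a minimal sketch with entirely made-up respondents and thresholds, not Gallup's or Suffolk's actual model:

```python
# Hypothetical respondents: (candidate preference, self-reported
# likelihood of voting on a 1-10 scale). Illustrative data only.
respondents = [
    ("Obama", 10), ("Romney", 10), ("Obama", 7),
    ("Obama", 6), ("Romney", 9), ("Obama", 8),
    ("Romney", 7), ("Obama", 10), ("Romney", 10),
    ("Obama", 5),
]

def poll(sample, cutoff):
    """Keep only respondents at or above the likelihood cutoff,
    then report each candidate's share of that 'likely voter' pool."""
    likely = [cand for cand, score in sample if score >= cutoff]
    return {c: likely.count(c) / len(likely) for c in set(likely)}

tight = poll(respondents, cutoff=10)  # "very likely" voters only
loose = poll(respondents, cutoff=6)   # looser screen keeps soft supporters
```

With this toy data the tight screen shows a dead heat, while the looser screen shows an Obama lead, because the softer-sounding Obama supporters survive the cut. That is exactly the failure mode being blamed: a stringent screen can drop people who end up voting anyway.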
To be fair, they haven't yet started to take into account respondents who mislead them... sometimes deliberately.
“Our estimation of who's going to vote isn’t based on demographics, it’s based on what they tell us in the interview,” Newport said. ...The Hill
Ooops.
Well, those guys are looking kind of shambly these days. They share a kind of faith-driven mentality that we're used to seeing in the Republican party. That ugly-looking word "pragmatic" seems more heroic, though for the right it's still the "rat" in Democrat.
I believe people are seriously misstating what Silver achieved. It isn’t that he predicted the election right where others botched it. It’s that he popularized a way of thinking about polling, a way to navigate through conflicting numbers and speculation, that would still have remained invaluable even if he’d predicted the outcome wrong.

I guess that leaves Nate Silver with two problems: the hubris/downfall problem that plagues all gods, and the inevitable knowledge that hundreds of pollsters and reporters out there now want him to fail -- just once, please!

Many liberals relied exclusively on Silver. But his model was only one of several polling trackers worth consulting throughout, including Real Clear Politics, TPM, and HuffPollster, all doing roughly the same thing: tracking averages of state polls.
The election results have triggered soul-searching among pollsters, particularly those who got it wrong. But the failure of some polls to get it right doesn’t tell us anything we didn’t know before the election. Silver’s approach — and that of other modelers — has always been based on the idea that individual polls will inevitably be wrong.
Silver’s accomplishment was to popularize tools enabling you to navigate the unavoidable reality that some individual polls will necessarily be off, thanks to methodology or chance. People keep saying Silver got it right because the polls did. But that’s not really true. The polling averages got it right. ...Greg Sargent, WaPo
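The "polling averages got it right" point can be shown in a few lines. This is a bare-bones sketch with invented numbers, not any tracker's actual method (real aggregators also weight by recency, sample size, and house effects):

```python
# Hypothetical final-week polls for one state: (Obama %, Romney %).
# Individual polls scatter in both directions; the average is steadier.
polls = [(49, 47), (47, 49), (50, 46), (48, 47), (51, 46)]

obama_avg = sum(o for o, _ in polls) / len(polls)
romney_avg = sum(r for _, r in polls) / len(polls)
margin = obama_avg - romney_avg  # positive means Obama leads the average

print(f"Obama {obama_avg:.1f}, Romney {romney_avg:.1f}, margin {margin:+.1f}")
```

Note that one poll in this toy set shows Romney ahead, yet the average still gives Obama a clear lead. That is Sargent's point in miniature: any single poll may be off, but the aggregate is harder to fool.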
___
Nate Silver is a new kind of political superstar. One who actually knows what he's talking about. In America, punditry has traditionally been about having the right kind of hair or teeth or foaming rightwing views. Silver has none of these. He just has numbers. Lots of them. And, on the night of the US presidential election, they were proved to be right in quite spectacular fashion.
For weeks and months, the election had been "too close to call". Pundit after pundit declared that the election could "go either way". That it was "neck and neck". Only it wasn't. In the end, it turned out not to be neck and neck at all. It was precisely what Nate Silver had been saying for months. On election day, he predicted Obama had a 90.9% chance of winning a majority of the electoral votes, and by crunching polling data he successfully predicted the correct result in 50 out of 50 states.
"You know who won the election tonight?" asked the MSNBC TV news anchor, Rachel Maddow. "Nate Silver." ...The Guardian