With Romney trailing by a clear margin, a frontal assault on the accuracy of polling has begun. Some of the skepticism is understandable, but other elements are brazenly self-serving. I'll be assessing the critiques of the polls over the next few days, but any debate about the accuracy of the polls must start by remembering a central point: The polls are usually pretty good.
Forget about LOESS trend-lines, demographic regressions, or house effects and just consider the surprising accuracy of a polling average that can be calculated by any sixth grader with a calculator and access to the Internet. In 2004, the RealClearPolitics average ended with Bush leading by 1.5 points; he ended up winning by 2.4 points. In 2008, Obama led by 7.6 points; on Election Day he won by 7.2 points. Pretty good, right?
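That "sixth grader" average is nothing more than the arithmetic mean of the final polls. A minimal sketch, using hypothetical poll margins rather than the actual RCP data:

```python
# Sketch of the simple polling average described above.
# The margins below are hypothetical illustrations, not actual RCP figures.
final_polls = [2.0, 1.0, 3.0, 0.5, 1.0]  # candidate's lead in each final poll, in points

# Sum the margins and divide by the number of polls -- that's the whole method.
average_margin = sum(final_polls) / len(final_polls)
print(round(average_margin, 1))  # prints 1.5
```

No weighting, no modeling; every final poll counts equally.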
The battleground state polls were also accurate. In 2004, the RCP average only missed Wisconsin, but Bush entered Election Day with just a .9-point lead and Kerry only won by .4 points, so it's hard to characterize that as a real failure. In 2008, the RCP average only missed Indiana and North Carolina, the two closest states won by Obama. Again, the average got the basics of a close race right: McCain entered Election Day with a .4-point lead in North Carolina and lost by .3 points; Obama trailed by 1.4 points in Indiana and eventually won by just 1 point. There are examples of more substantial errors, like when the RCP average showed Kerry within one point of Bush in Florida, even though Bush would ultimately prevail by 5 points. But on average, the state averages were off by just 1.9 points in 2004 and 2.8 points in 2008—not perfect, but more than good enough for our purposes.
Polls are typically less accurate in judging races further down the ballot, but they still do pretty well. They might also be less accurate during off-year elections, or when the contested races are in deep red or blue states. Of the 24 closely contested gubernatorial, Senate, or presidential contests where the polls exhibited a Republican bias of 3 points or more, just three were in states carried by John McCain in 2008, and two were West Virginia and Kentucky, states where Democrats hold a large advantage in party ID. Conversely, of the 14 gubernatorial, Senate, or presidential contests where the polls tilted Democratic by 3 points or more, just four were in states carried by John Kerry, and only Rhode Island was non-competitive at the presidential level over the last decade.
But the polls tend to do quite well in close contests in the battleground states, where swing voters seem to split evenly between the candidates in the stretch between the final polls and Election Day. If there's one exception, it might be the competitive western states, like Nevada, Colorado, and New Mexico. In 11 competitive contests in those three states, the polls underestimated the eventual Democratic performance in every instance, including the two upset Democratic victories. Some have suggested this is due to difficulties in polling Latino voters, but I'll shy away from explaining the error and simply observe its existence.
Despite recent accusations, there isn't much evidence suggesting that the polls are systematically biased toward Democrats. In fact, the clearest instance of bias in any direction came in 2010, when the polls systematically underestimated the strength of Democratic senatorial and gubernatorial candidates. Was this because of the unique circumstances of 2010 or because most close races were fought on heavily Democratic turf, where undecided voters in a tight race are disproportionately composed of Democratic-leaners? It's hard to say.
Are the polls getting less accurate? While the pro-GOP bias in 2010 and the possible cell phone issue might lead some to believe so, that's not yet evident. The polls did tilt GOP in 2010, but the error wasn't much greater than in prior elections. If you need an example from 2012, recall that it was just a few months ago that the public polls nailed the results of the Wisconsin recall. The RCP average found Walker leading by 6.7 points, and he ultimately won by 6.8 points.
Poll-doubters would do well to remember the reaction of Democratic pundits to Walker's growing lead in Wisconsin. Democrats railed against the likely voter screen—it was said to be too tight in a Democratic-leaning state where voters had supposedly been outraged by Walker's policies. A wave of internal Democratic polls was leaked showing a closer race. Yet on Election Day, the average of public polls was right. Something similar happened in 2004, when Democrats complained that the polls didn't show a Democratic partisan advantage, even though they had held one in 2000 and in just about every previous election. On Election Day, the polls were right, Kerry lost, and there were an equal number of Democrats and Republicans in the final exit poll.
No, the polls aren’t perfect. Polls have been wrong before and they certainly will be again. There is nothing wrong with noting that the polls are imperfect and could be wrong in this election; they might be. It might even be understandable to argue that there is a greater chance than usual that the polls are wrong, given declining response rates and the challenges of reaching voters who rely on cell phones. But asserting that the polls are wrong simply because the results don't match expectations is a recipe for disappointment. Analysts arguing that the polls are fundamentally inaccurate have rarely been vindicated by the results. If you dismiss the polls when you disagree with their findings, you'll usually end up wrong.
*This article only covered the simple polling average. Other techniques, like looking at the median poll (which has the effect of excluding outliers on either side), do just as well or even better.
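To illustrate the footnote's point, here is a minimal sketch of mean versus median, again with hypothetical poll margins: one outlier poll pulls the simple average upward, while the median ignores it.

```python
import statistics

# Hypothetical final-poll margins with one outlier on the high side;
# these are illustrations, not actual polling data.
polls = [1.0, 2.0, 1.5, 2.5, 8.0]

mean_margin = statistics.mean(polls)      # pulled upward by the 8-point outlier
median_margin = statistics.median(polls)  # the middle poll; the outlier has no pull

print(mean_margin, median_margin)  # prints 3.0 2.0
```

The median's insensitivity to a single wild poll is exactly the "excluding outliers" effect the footnote describes.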