Voters wait in line in Tuscaloosa, Alabama, on election day. Photograph: Dusty Compton/AP

The top five polling lessons from the 2012 election

Cellphones, internet polling and other vital lessons we should take with us as we close the books on last month's election

The 2012 election is in the books, and as we head into the new term filled with new elections, there are polling lessons we should take with us. Yet, as this list of the top five polling rules from 2012 demonstrates, there will always be exceptions to every rule.

1. When likely and registered voter polls disagree in high-turnout elections, you should usually go with the registered voter surveys

Likely voter (LV) models at their heart are trying to separate the LV electorate from the larger registered voter (RV) pool. That separation just creates unnecessary noise in a presidential year because, as I've noted, almost all RVs cast a ballot.

This year LV samples were more favorable to Mitt Romney throughout the entire campaign. Among pollsters who conducted a poll among both LVs and RVs in the final week, Obama was ahead in the final LV aggregate by only 1.3 points. The RV average had him by 3.9 points and was thus far more accurate, given that Obama currently leads by 3.7 points. This follows a trend from past presidential years when RV polls were more accurate than their LV counterparts.
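
For anyone who wants to check the arithmetic, here is a minimal sketch in Python using only the figures quoted above:

    # Figures from the paragraph above, in percentage points (positive = Obama lead).
    lv_aggregate = 1.3    # final likely voter (LV) aggregate
    rv_aggregate = 3.9    # final registered voter (RV) average
    actual_margin = 3.7   # Obama's national lead at the time of writing

    lv_error = abs(lv_aggregate - actual_margin)   # 2.4 points off
    rv_error = abs(rv_aggregate - actual_margin)   # 0.2 points off
    print(f"LV error: {lv_error:.1f} points, RV error: {rv_error:.1f} points")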

Exceptions to the rule:

Pew Research's final LV margin of three points for Obama was more accurate than their RV margin of seven points. That's noteworthy because Pew Research has one of the tighter LV screens in the polling business. Pew's LV success marks three consecutive presidential elections in which their LV model was more accurate than their RV model.

The RAND poll LV model, which allowed voters to assign themselves a percentage likelihood of voting, had Obama leading by 3.3 points and was only 0.4 points off the mark. The RV RAND model had Obama ahead by 6.3 points – a 2.6-point error.
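
To illustrate the general idea behind a self-assigned likelihood model – this is only a rough sketch of the approach, not RAND's actual methodology, and the respondents below are invented – each person's stated preference can be weighted by how likely they say they are to vote:

    # Invented respondents: (candidate preference, self-reported probability of voting).
    respondents = [
        ("Obama", 0.9),
        ("Romney", 0.6),
        ("Obama", 0.3),
        ("Romney", 1.0),
    ]

    # A respondent who says she is 90% likely to vote counts three times as much
    # as one who says she is 30% likely to vote.
    def weighted_share(candidate):
        total_weight = sum(p for _, p in respondents)
        return sum(p for name, p in respondents if name == candidate) / total_weight

    print(f"Obama: {weighted_share('Obama'):.1%}, Romney: {weighted_share('Romney'):.1%}")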

2. Cellphones are generally needed for an accurate telephone poll

If you're trying to poll the overall population, and part of that population no longer has a landline, then you need to call cellphones. That has become especially true as the cellphone-only population explodes, and as cellphone-only voters hold different political views from landline voters, even when controlling for demographics.

During the course of the 2012 campaign, polls without cellphones consistently painted a falsely rosy picture for the Romney campaign. Polls without cellphones in the final 21 days were more than a point less accurate than those that included cellphones. This includes some of the worst pollsters of 2012, such as Gravis and the infamous Rasmussen.

Exception to the rule:

Public Policy Polling was arguably the most accurate large-scale pollster of 2012, and they did not compensate for skipping cellphones by adding an internet subsample. Their secret? They simply managed to model the election day electorate correctly, which is what polling is all about.

3. Internet polling is the wave of the future

Once upon a time, internet polling, and specifically non-probability internet polling, was shunned. Now, with cellphones popping up everywhere and poll response rates plummeting, internet polling has gotten a second look because it is cheap and can be accurate. Almost every age group now uses the internet, and weighting can compensate for the people who don't.
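
As a rough sketch of how that weighting works – the age groups and shares below are invented for illustration, not taken from any real survey – respondents from groups that are under-represented online are simply counted more heavily:

    # Toy post-stratification: reweight an online sample so its age mix matches the population.
    population_share = {"18-29": 0.21, "30-49": 0.34, "50-64": 0.26, "65+": 0.19}
    sample_share     = {"18-29": 0.28, "30-49": 0.38, "50-64": 0.24, "65+": 0.10}

    # Under-represented groups (here, older respondents) get weights above 1.
    weights = {group: population_share[group] / sample_share[group]
               for group in population_share}

    for group, weight in weights.items():
        print(f"{group}: weight {weight:.2f}")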

Google, Ipsos, RAND and YouGov performed at least as well as telephone polls, if not better. RAND was by far the most accurate national pollster of 2012. RAND and YouGov also tended to be more stable than telephone surveys in their results after the first debate.

Exception to the rule:

Zogby (JZ Analytics) is awful at internet polling. Their poor performance in 2004 delayed mainstream acceptance of internet polling by at least five years, and in 2012 they were again one of the least accurate pollsters.

4. Internal polls published publicly generally should not be trusted

This one is fairly simple. Internal, or campaign, pollsters usually release to the public only those surveys that are most favorable to their candidate. They may also pass along to the press only polls based on turnout models that are more favorable to their candidate than is probable.

In 2012, we saw that Mitt Romney's internal polls were hilariously bad. We also witnessed polls conducted in the swing states for Republican groups such as Citizens United that were way off the mark. During the Wisconsin gubernatorial recall, Democratic internal surveys were on average six to seven points more favorable to Democrat Tom Barrett than public polls were.

Exception to the rule:

Any poll conducted by Democratic pollster Mark Mellman. He was the most accurate state pollster of 2012. Mellman is the only pollster I'd trust in Nevada, where Latinos are generally undercounted by pollsters. He also did well in North Dakota, arguing early on that Democratic Senate candidate Heidi Heitkamp was competitive, even though most of us thought she had little chance.

5. When state and national polls disagree, you should generally go with the state data

The state polling always had President Obama winning a second term, and the state polling averages were correct in 49 of 50 states. The national polling had Romney much closer and even pulling into the lead after the first debate. It was only in the final week that it became clear, judging from the national polls, that Obama was going to win the nationwide vote. Even then, 90% of the polls underestimated Obama's lead.

This was not the first time that national and state polls disagreed. In 1996 and 2000, national polling incorrectly had Bill Clinton winning by double digits and George W Bush winning the popular vote, respectively. This year also continued a streak in which no presidential candidate won a state where they were not leading in at least one non-partisan statewide poll in the final weeks.

Exceptions to the rule:

The state polling was atrocious in a number of states. During the final month, 28 of the 29 polls in Colorado, 20 of the 21 in Michigan, 16 of the 18 in Nevada and 33 of the 38 in Virginia were biased against Obama. Fortunately for pollsters, Obama was leading by a comfortable enough margin in these states that the bias did not affect the projected winner in the polling aggregates.
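
A toy example – with invented poll numbers, not the real Colorado or Virginia data – shows why a consistent bias of a point or two does not necessarily change the projected winner when the true margin is comfortable:

    # Invented state polls: Obama's margin in percentage points (positive = Obama lead).
    state_polls = [3.0, 4.0, 2.5, 3.5, 4.5]
    actual_margin = 5.4   # invented "true" result for the illustration

    average = sum(state_polls) / len(state_polls)            # 3.5-point polling average
    bias = average - actual_margin                           # about -1.9: every poll was low
    projected_winner = "Obama" if average > 0 else "Romney"  # the call is still correct

    print(f"average: {average:.1f}, bias: {bias:+.1f}, projected winner: {projected_winner}")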
