Sunday, November 9, 2008

Pollster Report Card Summary

In case you did not want to go through our entire report card methodology and the exhaustive lists of the state-by-state polls, here is our final ranking of the pollsters who waded into the 17 key swing states we identified. The score represents how far, on average, that pollster's final poll was from the two candidates' actual final percentages, combined.

(1) Opinion Research. Score: 2.4 (13 states polled)
(2) Public Policy Polling. Score: 2.74 (14 states polled)
(3) Zogby. Score: 3.09 (8 states polled)
(4) Rasmussen. Score: 3.44 (16 states polled)
(5) Marist. Score: 3.56 (5 states polled)
(6) Survey USA. Score: 3.94 (11 states polled)
(7) ARG. Score: 3.98 (13 states polled)
(8) Polimetrix. Score: 4.69 (16 states polled)
(9) Strategic Vision. Score: 5.38 (5 states polled)
(10) Insider Advantage. Score: 5.61 (7 states polled)
(11) Quinnipiac. Score: 5.77 (6 states polled)
(12) R2K. Score: 6.33 (10 states polled)
(13) Mason-Dixon. Score: 7.23 (13 states polled)
(14) Big 10 Battleground. Score: 7.48 (5 states polled)
(15) Zogby Internet. Score: 7.52 (10 states polled)
(16) Suffolk. Score: 7.56 (7 states polled)
(17) AP/GfK. Score: 7.84 (8 states polled)
(18) LA Times/Bloomberg. Score: 7.85 (2 states polled)
(19) National Journal. Score: 8.26 (7 states polled)

Before I get to my analysis, a couple of points. First, I fully recognize that this is an imperfect and incomplete analysis of our nation's political pollsters. We are focusing on a narrow class of polling work, and only a piece of our nation's political universe. Not only do I know this, but it is precisely what I intended the T2L Report Card to be.

The methodology I used to tabulate these scores is based on only one poll per pollster per state. Some pollsters did many polls in a state, but I took only their final result. I think this approach, first, gives us the freshest data right before the election, and second, is fair to the pollsters themselves: we are not building a ranking on old or stale polling, where a pollster may seem way off today because it was gauging the public back in June, when the state of the race was far, far different.
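
For the curious, here is a minimal sketch of the scoring arithmetic described above, written in Python purely as an illustration. The function names and the two-state example are my own and hypothetical; they are not the actual T2L spreadsheet or any real pollster's numbers.

def state_error(poll, actual):
    # poll and actual are (Obama %, McCain %) pairs; the error is how far
    # the poll missed each candidate's final number, added together.
    return abs(poll[0] - actual[0]) + abs(poll[1] - actual[1])

def report_card_score(final_polls, actual_results):
    # final_polls: {state: (obama, mccain)} -- one final poll per state polled
    # actual_results: {state: (obama, mccain)} -- the actual percentages
    errors = [state_error(final_polls[s], actual_results[s]) for s in final_polls]
    return sum(errors) / len(errors)

# Hypothetical two-state example: errors of 6 and 3 average out to 4.5.
polls = {"State A": (47.0, 45.0), "State B": (50.0, 46.0)}
actual = {"State A": (51.0, 47.0), "State B": (52.0, 47.0)}
print(report_card_score(polls, actual))  # 4.5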

Second, I am not looking at national polls here (which I intend to do later) or polling for all 50 states. While looking at every state would be beneficial, I do not have the time right now to go through them all. It also might not be as valuable, given that many states had very little polling done in them, and the results in these non-swing states were never in doubt. Also, as you can see with the polls in Michigan, Nevada, and New Hampshire -- states that turned out to be blow-outs -- very little polling came close to the final results. This makes me think that polling state landslides is more difficult, and that the polling in places like Alabama and Tennessee would not look good either if I spent the time going over it.

What I am trying to achieve with this report card, or ranking, or whatever you would call it, is to give us a good idea of which pollsters put out the best work in the most important states at a crucial time, with the clock about to run out in mid- or late October. I think these results give us a neat snapshot of the best and worst pollsters of this memorable election cycle, and frankly there are some surprises.

As a reminder, the 17 swing states we focused on were: Arizona, Colorado, Florida, Georgia, Indiana, Michigan, Minnesota, Missouri, Montana, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Pennsylvania, and Virginia. I chose these states because they were the key swing states for the entire election. While some of them were not that close in the end, just about all of them were polled pretty heavily. A state like Iowa was not included in my analysis because I do not think it was ever that close, even in the summer, regardless of the smattering of polling data showing it competitive; in my mind, Barack Obama was never losing Iowa.

Analyzing the best and worst pollsters of the cycle

Judging by the numbers, it is easy to see who was the best this year in the swing states: the polls done for CNN and Time Magazine by Opinion Research. By far, Opinion Research churned out the results that fell closest to the final numbers in many, if not most, states.

This is even more noteworthy when we find that CNN/Time waded into 13 swing states in October. This was not a small sample being carried by a handful of good results: this was a collective body of polling work that was sterling. CNN/Time had one of the top two polls in nine of the 13 states it polled at or very close to the finish line: Arizona (1), Colorado (1-tied), Florida (2), Georgia (1-tied), Minnesota (1), Nevada (2), New Hampshire (1), Ohio (1), and Pennsylvania (1). And while I did not include national data here, allow me to note that the final Opinion Research national poll for CNN called the popular vote 53-46 for Obama -- exactly right.

If that is not impressive for a pollster, I don't know what is. CNN/Time/Opinion Research gets the gold medal for polling this year.

That's not to say other pollsters did not do a great job. Public Policy Polling, a firm out of North Carolina, finished second to CNN/Time, and statistically was not far off. PPP was dismissed by some bloggers and observers this year because it does its work primarily for Democrats, but the polls it churned out in October often hit the mark. For fellow political junkies, I urge you to bookmark PPP's excellent blog, where PPP posts its results along with invaluable analysis and great insights and observations.

Some other assorted thoughts and observations:

*I think we all need to lay off Zogby for a while. I often thumb my nose at Zogby polling results, and I know a lot of other people do too, including Nate Silver on his blog and in some of his rankings. Well, in swing-state polling this year, Zogby did a very solid job and ended up ranking third on the T2L Pollster Report Card. Perhaps Zogby spit the bit in other state polls or national polls I have not scrutinized (I have no idea), but the firm excelled this year. Zogby's late-October results in Florida, Missouri, Nevada (where it was the closest to the actual result in a state pollsters had trouble with), North Carolina, Ohio, and Virginia deserve recognition.

*Rasmussen continues to be among the best. Ras was tops (in my book, and I know in others') in 2006, and the firm continued its good work in 2008. There were a lot of times that I scratched my head at Rasmussen polling, but the proof is in the pudding: they know what they are doing over there. Also consider: Rasmussen's average covers 16 of our 17 swing states. Its strong record is not based on a small sample.

*Decent job, Marist. Not a lot of polls, but you stood out over a lot of your colleagues this year, and at least we noticed. We will look out for your numbers in the future.

*ARG is another one that is often maligned, including by me. The firm was not in the top tier, but it did a good job in a lot of polls.

*Polimetrix: eh... so-so. Ditto Insider Advantage, but I still like your work and analysis.

*It was a very rough cycle, at least at the end, for well-respected stalwarts R2K and Mason-Dixon. They finished 12th and 13th, respectively, on the T2L Report Card, and they did not distinguish themselves in the swing states. Nate Silver often noted this year that Mason-Dixon had a GOP lean this cycle, and it may be fair to examine why both its and R2K's results were so mediocre. I still strongly believe that R2K would not weight its results to satisfy the readers at Daily Kos (who commissioned dozens of polls from R2K this year), but R2K had problems this cycle nonetheless.

*Next time, I will avoid putting much credence in Big 10 Battleground polls; then again, their results looked pretty outlandish when they came out.

*Keep working at it, Zogby Interactive, you can only go up, and you are no longer at the bottom.

*National Journal has some of the best material in politics, but its polling this cycle was not so hot, and it ranks dead-last out of all the major pollsters.

Best in state

Before I finish, let me just highlight the pollster (or in some cases, pollsters) who came closest in each of the 17 swing states, and the runner-up as well.

Arizona. First place: CNN/Time (1.6); runner-up: Rasmussen (3.6 points off the two candidates' final percentages, combined)
Colorado. First place: (tie) CNN/Time and Insider Advantage (0.6)
Florida. First place: PPP (1.3); runner-up: CNN/Time (1.5)
Georgia. First place: (tie) Rasmussen and CNN/Time (0.4)
Indiana. First place: PPP (1.9); runner-up: ARG (2.9)
Michigan. First place: (tie) Strategic Vision, Polimetrix, and PPP (3.5)
Minnesota. First place: CNN/Time (1.2); runner-up: Rasmussen (1.8)
Missouri. First place: (tie) Rasmussen, Zogby, and PPP (0.8)
Montana. First place: Rasmussen (1.5); runner-up: ARG (1.9)
Nevada. First place: Zogby (2.8); runner-up: CNN/Time (5.4)
New Hampshire. First place: CNN/Time (1.7); runner-up: UNH, Rasmussen, R2K, and S-USA (4.1)
New Mexico. First place: PPP (2.3); runner-up: (tie) Polimetrix and Rasmussen (4.7)
North Carolina. First place: PPP (0.6); runner-up: (tie) Zogby and Rasmussen (1.4)
North Dakota. First place: (tie) R2K and Polimetrix (5)
Ohio. First place: CNN/Time (0.4); runner-up: (tie) PPP, Univ. of Cincinnati, and Columbus Dispatch (2)
Pennsylvania. First place: CNN/Time (1.6); runner-up: PPP (2.4)
Virginia. First place: CNU Virginia (0.9); runner-up: PPP (1.1)

Let the record show that the second-closest pollsters in Ohio and the closest one in Virginia were in-state outfits that poll only their home states. The University of Cincinnati, the Columbus Dispatch poll, and CNU Virginia all came very close to predicting the outcomes in two of the most crucial states in the country.

7 comments:

Anonymous said...

Didn't Quinnipiac's last poll in PA show Obama winning by 11 points, and he won by 10.4 eventually?

Also, in Florida, Quinnipiac showed Obama up by 2 in their last poll, and the race was decided by 2.5?

Mark said...

Hi. Thanks for the comment. You are right that Quinnipiac's final polls in FL and PA were very close to the final margin. However, our rankings are not based on who was closest to the final margin. The rankings are based on which pollsters, on average, were closest to the final percentages for each of the candidates.

So, while Q's final FL poll was 47-45, the ultimate result was 51-47 -- 4 points off on Obama and 2 off on McCain -- so they did not do perfectly in predicting where Obama and McCain would end up in the state. On the whole, Quinnipiac had a decent year, but they rank 11th in my rankings. Please take a look at our methodology to see how we tabulated everything.

Anonymous said...

Not bad! I will come back to read more soon

Mark said...

Thank you! Please bookmark the site and feel free to share your thoughts on any future posts.

Anonymous said...

So Mark, if you use individual vote share as a criterion for accuracy, as in the case of Quinnipiac's numbers mentioned above, how does your methodology take into account the undecided voters in a poll -- who have a significant effect on the final vote shares, and whose break at the end nobody can know in advance?

Anonymous said...

And didn't the University of Cincinnati allocate its undecideds in its last poll, while most others (all others, as far as I know, though I might be wrong) did not?

Mark said...

Hi. Great question. Exit polls show that the share of voters deciding in the final days was in the low single digits in some states. Sure, this chunk may be tough for some pollsters to gauge and then figure out where it will go, but part of the pollsters' perceived (key word there) job is to predict.

In terms of the undecided vote, some pollsters push leaners and undecided voters harder. S-USA does this a lot, which is why its polls have so few undecideds, even early in a race. And look at CNN and PPP: they got the gap AND the final numbers dead-on in many cases. They may do the same, but I do not know.

But this is part of the game. It is the job of the pollster to give us the temperature of the electorate at the moment the poll is taken. So from that standpoint, we can't know which polls are dead-right or wrong at a given moment, because a poll is a snapshot in time. A pollster will tell you he can't predict the future, and that's right, but people expect a poll to be accurate, especially at the end.

If we are going to judge the accuracy of pollsters, we have to make tough calls. First, we have to find a way to judge their work product. Using old polls is useless, and really unfair, since a June poll could be 100% right on June 15 but dead-wrong on November 4. This is why I picked each pollster's final poll -- it is the best as well as the fairest way to go.

Second, we need the right methodology. As I noted earlier, just predicting the ultimate gap might not mean anything if the actual numbers are off. So I think my simple solution works both ways: it rewards accuracy in the gap and in each candidate's number.
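
As a rough illustration of that point -- with made-up numbers, not any actual poll -- here is the difference between scoring the gap alone and scoring each candidate's number:

# Hypothetical final poll vs. actual result (not a real pollster's numbers).
poll = (44.0, 40.0)    # (Obama %, McCain %) in the poll
actual = (52.0, 48.0)  # (Obama %, McCain %) in the final count

# Gap-only scoring: the poll looks perfect (a 4-point lead either way).
gap_error = abs((poll[0] - poll[1]) - (actual[0] - actual[1]))          # 0.0

# Per-candidate scoring (the approach used here): 8 + 8 = 16 points off.
candidate_error = abs(poll[0] - actual[0]) + abs(poll[1] - actual[1])   # 16.0

print(gap_error, candidate_error)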

My closing thoughts: I am absolutely no expert and no Nate Silver, and I do not pretend to be. I merely tried to craft a way to look at the final accuracy of pollsters as best I could. I agree that this is not the perfect formula, but I think it does give a good picture of the pollsters who did the best.

If you are interested, what I could do is go through all 17 swing-state exit polls, see what percentage of voters decided in the final days or week, and try to factor that into our analysis in a way that gives some grace to pollsters whose final state poll came some time before November 4. I need to ponder the best way to do this.