Some may wonder how so many of the pollsters got it wrong last week. How did all the disparate bloggers, pundits, commentators and radio personalities come up with so many different answers regarding the outcome of last Tuesday’s election? Well, the answer to that question lies in something called Ockham’s Razor, the principle that the simplest answer is most likely the right one. And, in the case of the Obama–Romney predictions, the simple answer is math. Nate Silver, statistician and blogger at FiveThirtyEight, predicted a win for Obama with almost razor-sharp precision. And he used an oft-forgotten method to do so: simple statistics. So, this leads one to ask: what did everyone else use?
Let’s start by saying, Geeks: 1, Pundits: 0. Last week’s outcome proves that partisanship is no match for mathematical rigor. So, what did Silver do exactly? He simply looked at trend data (prior election outcomes, poll returns, marginal effects, etc.) and simulated the election over and over and over again. When this type of analysis is performed, the repeated simulations develop into a pattern, a distribution of sorts. And as the simulations pile up, the outcomes on the edges, the anomalies, slowly fade in importance until there is just one answer in the middle (well, one most likely answer at least). Now, there will always be error. And there will always be anomalies. But Mr. Silver was able to predict the outcome of the election with such accuracy because he accounted for just about every variable impacting the election. In this case, math is the real success story, not Silver himself.
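Silver’s actual model is far more sophisticated, but the core idea above, simulate the race many times and see where the outcomes pile up, can be sketched in a few lines of Python. The polling margin and error figures below are made-up numbers for illustration, not real 2012 data:

```python
import random

def simulate_election(poll_margin, poll_error, n_sims=10_000, seed=42):
    """Monte Carlo sketch: treat the polling margin as a noisy estimate of
    the true margin, re-run the election n_sims times, and count how often
    the polling leader actually wins."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_sims):
        # One simulated "true" margin: the polling average plus random error.
        simulated_margin = rng.gauss(poll_margin, poll_error)
        if simulated_margin > 0:
            wins += 1
    return wins / n_sims

# Hypothetical race: a 2-point polling lead with a 3-point standard error.
win_prob = simulate_election(poll_margin=2.0, poll_error=3.0)
print(f"Estimated win probability: {win_prob:.1%}")
```

Notice the lesson buried in the numbers: a 2-point lead with 3 points of error is nowhere near a coin flip, but it isn’t a lock either, which is exactly the kind of nuance a simulation captures and a headline poll number hides.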
What about the other guys (or gals)? How did they manage to get it so wrong? Again, the simplest answer is most likely the right answer. They didn’t really use math. They may have used party leanings, desired outcomes, personal wishes, or, in the case of Karl Rove, plain old bully pulpitting. Why would they do that? Because there is a lovely phenomenon in elections wherein voters, usually low-information ones, cast their vote for whoever they think has the best chance of winning. I know, it sounds silly, but who wants to be the person saying they voted for the loser? To avoid the prospect of that embarrassment, voters will sometimes glean information from news outlets and other media sources to help inform their own vote. So, if party leaders and biased pollsters simply tell voters that the “polls” suggest the election will go one way or the other, doesn’t that guarantee a slight benefit? It might get the voters they want to turn out and the voters they don’t want to stay home. This tactic is a win-win option.
Logically, if a polling group is conservative- or liberal-leaning, it might benefit from simply telling voters their side has already got it in the bag, right? What a relief it might be for the confused or overwhelmed voter to know that the rest of the country was voting for Romney or Obama. That would make their decision so much easier. And that is exactly the goal. Political “analysts” are not new to this game. They’ve done mounds and mounds of research indicating what sways voters. And little things like telling the American people “we’ve got this” or “polls say we’re in the lead” don’t take a lot of legwork to get out there in the public eye. And most voters aren’t going to analyze the data themselves, so pollsters capitalize big-time on intentional sampling bias (skewing), question-wording issues (like leading questions), and implicit polling slant.
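How much can sampling bias move a poll? Here is a toy sketch, with entirely invented numbers, of an electorate where candidate A actually trails, but where the pollster’s method (say, a landline-heavy sample) is more likely to reach A’s supporters:

```python
import random

def poll(population, sample_size, bias_weight=1.0, seed=7):
    """Draw one poll sample and return candidate A's estimated share.
    bias_weight > 1 makes A's supporters more likely to be reached,
    mimicking a skewed sampling method."""
    rng = random.Random(seed)
    weights = [bias_weight if vote == "A" else 1.0 for vote in population]
    sample = rng.choices(population, weights=weights, k=sample_size)
    return sample.count("A") / sample_size

# Hypothetical electorate: candidate A truly has 48% support.
population = ["A"] * 480 + ["B"] * 520

fair_estimate = poll(population, sample_size=1000)                    # near 48%
skewed_estimate = poll(population, sample_size=1000, bias_weight=1.5)  # A "leads"
```

A modest 1.5x skew in who gets sampled is enough to flip a losing candidate into an apparent leader, which is exactly why the sampling method is the first thing worth checking.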
When you see poll results, do you often research the sampling method to understand if there was bias? How about the polling method, to see if the study used landlines or email surveys? And do you check to see if it was a panel survey, meaning the same group of people was studied at different points in time, to get a better understanding of the study’s reliability? I am sure you don’t. Don’t be concerned. Most other people don’t do any of that stuff either. They simply trust the pollsters to provide a real and true sampling of real voters.
Therein lies the problem. How boring would this election season have been if the entire political universe had just said from the very beginning that President Obama had it in the bag? Not very exciting, right? In this age of “infotainment,” simple artifacts of enrichment and, I don’t know, news are hard to come by. They are typically slathered with slant, spin, bias, and propaganda. And polls are no different.
Oddly enough, the average Joe or Jill uses statistics countless times each day. By guessing when to book a hotel room in Las Vegas in order to secure the best deal, or strategically planning when to leave for work to avoid traffic, one is using probability and statistics. Even all those sports lovers and Fantasy Football addicts benefit from statistical methods. So, what Nate Silver did really isn’t out of the realm of possibility for anyone who truly wants the right answer.
For your daily dose of geek, see Silver explain his prediction below. And, next time you see a poll, you’re going to vet the outcome, right? Maybe? Well, I bet you will at least wonder about how the data was collected, who the people responding were, and how the interviewers asked the questions. And, if nothing else, that is a triumph for geek-dom. So vet your data; it might surprise you how amazingly un-complex these predictions really are.