The weekend before the election, I was talking with a friend – a woman who has become a newly-minted conservative in the past two years. She’d sat out the 2008 election, and had voted for Kerry in ’04, but finally became alarmed about the state of this nation’s future – she’s got kids – and got involved with the Tea Party and started paying attention to politics. And she was going to vote conservative. Not Republican, mind you, but conservative.
And the Saturday before the election, she sounded discouraged. “Have you seen the polls?” she asked. “Emmer’s gonna get clobbered.”
I set her straight, of course – referred her to my blog posts debunking the election-eve Humphrey and Minnesota polls, and showing her the Emmer campaign internal poll that showed the race a statistical dead heat (which, as it turned out, was the most accurate poll before election day).
She left the room feeling better. She voted for Emmer. And she voted for the Republican candidates in her State House and Senate districts, duly helping flip her formerly blue district to the good guys and helping gut Dayton’s agenda, should he (heaven forfend) win the recount.
But I walked away from that meeting asking myself – what about all the thousands of newly-minted conservatives who don’t have the savvy or inclination to check the cross-tabs? The thousands who saw those polls, and didn’t have access to a fire-breathing conservative talk show host with a keen BS detector who’s learned to read the fine print?
How many votes did Tom Emmer lose because of the Hubert H. Humphrey and Minnesota polls that showed him trailing by insurmountable margins?
How many votes do conservatives and Republicans lose in every election due to these polls’ misreporting?
Why do these two polls seem so terribly error-prone? And why do those errors always seem to favor the Democrats, with the end result of discouraging Republican voters?
Public opinion polling is the alchemy of the post-renaissance age – especially “likely voter” polling. Every organization that runs a poll has its own way of taking the hundreds or thousands of responses it gets, classifying the respondents as “likely” or not to vote, and tabulating those results into a snapshot of how people are thinking about an election at a given moment.
But the Star Tribune’s Minnesota Poll has, to the casual observer, a long history of coming out with polls that seem to short Republicans – especially conservative ones – every single election. And the relative newcomer to the regional polling game, the Hubert H. Humphrey Institute’s poll done in conjunction with Minnesota Public Radio, seems – again, anecdotally (so far) – to take that same approach and supercharge it.
I’ve had this discussion in the past – David Brauer of the MinnPost and I had a bit of a back and forth on the subject, on-line and on the Northern Alliance one Saturday about a month ago.
And so it occurred to me – it’s easy to come up with anecdotes, one way or another. But how do the numbers really stack up? If you dig into the actual numbers for the Humphrey Institute and the Minnesota Poll, what do they say?
I’ll be working on that for the next couple of weeks. Here’s the plan: