I've now written more times about how to properly interpret political polling than I can easily remember, including more than a dozen pieces here at Down East.
I've also worked to set a good example in the public opinion research that I've had a hand in conducting and releasing. Last year's Down East/Maine People's Resource Center gubernatorial poll, for instance, provided a full description of methodology and even disclosed the raw data collected during the survey.
It appears, however, that my obsessive campaign will have to continue. Some of Maine's media outlets are still aiding and abetting bad public opinion research.
On Wednesday, the Maine Heritage Policy Center released a poll showing a small majority in favor of eliminating Election Day voter registration. Instead of hiring a pollster, they used a do-it-yourself Interactive Voice Response polling system from Rasmussen called Pulse.
I've discussed this service before, when 2010 unenrolled gubernatorial candidate Kevin Scott attempted to use a Pulse poll to show he was gaining in the governor's race.
Scott’s poll was ignored by the media, and this one should have been too. Unfortunately, both the MaineToday newspapers and WGME reported the poll without even a hint of critical evaluation of its methodology or the wording of its questions.
Setting aside Rasmussen’s potential conservative biases, the lack of transparency around this poll’s methodology should raise some red flags. For instance, we have no idea how Pulse selects “likely voters” to sample.
The parts of the methodology we do know about aren’t any more encouraging. UMaine professor Amy Fried (who literally wrote the book on polling) has a good analysis of some of the problems.
Public opinion research methodology is a complicated subject and there are legitimate arguments for and against different methods. Reporters can be excused for not having a deep understanding of survey methods and statistical analysis (although I’m not sure why they can’t call someone who does).
Anyone with even basic political knowledge, however, should have recognized the biggest problem with this poll: the wording of the question itself.
Here’s what MHPC asked:
The Maine legislature recently discontinued the practice of same-day voter registration, in order to give municipal clerks time to verify that registrants meet residence and legal requirements to vote. Opponents claim that this change will make it more difficult to vote, while supporters say the trade-off is necessary to protect against election fraud. Do you favor or oppose the elimination of same-day registration?
This question commits one of the greatest sins in polling: including information within the question designed to influence the responses of those surveyed.
The question begins by claiming a justification for why the legislature passed the law, then contrasts a weakly worded argument against it with the hot-button issue of “election fraud” before asking the real question.
Even if the poll used a completely legitimate and transparent methodology, it wouldn’t make a difference with such a biased survey instrument. This question was designed not to elicit public opinion but to pollute it.
Sometimes it can be difficult to construct a question that will accurately measure public opinion on a political issue, but not in this case. We actually know the exact wording of the question that will appear on the ballot in November, and that, or something very similar, is what should have been used in this poll.
Not only did Rebekah Metzler’s MaineToday article and the WGME piece both fail to point out the problems with the question’s wording, they didn’t report the actual question at all. WGME, in fact, used a graphic showing only the last sentence, implying that this was what had been asked rather than the full, biased script.
Uncritically promoting this kind of poll isn’t just damaging to the campaign to protect voting rights in Maine (a cause I support both personally and professionally), it’s damaging to our society.
Good research and statistics can have a profound effect on politics and government policy, from Ed Muskie’s postcard surveys to Florence Nightingale’s statistical analysis of soldier deaths during the Crimean War.
The questions we ask, the way we ask them, and the way we analyze and report the results are vitally important to a modern, functional democracy. By not recognizing and challenging the bad surveys, we demean and discredit the good ones.
Pollsters, even more than politicians, claim to speak with the voice of the people. That’s an extraordinary assertion and it should be paired with extraordinary skepticism.