It’s officially summer, the legislative session has (finally) ended, and there are at least a dozen serious candidates declared for president – and that’s just the Republican side of the ledger.
Do you know what that means?
POLLING SEASON HAS BEGUN!
I know it’s a little crazy but we are only (yes, “only”) about 200 days from the Iowa Caucuses. And, in case you haven’t been following political news, someone is releasing a new poll almost weekly.
Do you know what else that means?
IT’S TIME TO REFRESH THE SALTSHAKER TEST!
For those of you who aren’t familiar with the saltshaker test, we take a deep dive into the methodology of the poll in question and give it a quick thumbnail evaluation. We look at some basic things like sample size, party balance, and other facets of how it was conducted. This is by no means intended for peer-review publication, but we think it’s a good way for our readers to put these polls in context, as it offers what we consider to be a fair measure of their validity and reliability.
And here’s the scale:
- No salt needed: Solid pollster, solid methodology, and the sample appears to be nicely balanced.
- A grain of salt: The poll has one or two non-critical problems and should be taken with a grain of salt.
- A few grains: There are several concerns with how the poll was conducted, but not enough to throw it out entirely.
- A half shaker: There are enough problems with the methodology to warrant serious concerns, and the poll should not be taken seriously.
- A full shaker: The poll has so many problems it should not only be completely disregarded but pollsters receiving multiple “full shakers” will no longer have their polls covered by Florida Politics/SaintPetersBlog.
Oh, and one more thing…
While I’m a consumer of polls, have a relationship with a polling operation, and am probably more knowledgeable on the subject than your average Joe, I am not – as many of you so kindly noted last cycle – a professional pollster, nor am I academically trained as one.
But I know someone who is.
That is why, this cycle, we have enlisted the help of one of our 2014 “Top 5 Brightest Minds” winners and longtime Florida pollster, Steve Vancore of VancoreJones Communications and Clearview Research. Vancore’s polling experience in the state goes back to 1985; he has a master’s degree in marketing communications from FSU; and he is undoubtedly seen as a credible voice on the subject. Additionally, as many of you may recall, he nailed the governor’s race last year when many people were off by quite a few points. He also happens to be a stickler for the finer details of polling, which makes him perfect for the job.
With that, we asked Steve to take a stab – or should I say, “A shake?” – at the most recent poll from Quinnipiac…
Saltshaker test for Quinnipiac Poll released June 22, 2015.
Verdict: A Few Grains.
In evaluating the methodology of the Quinnipiac poll, one thing jumps off the page – these folks went to great pains to get it right. This poll has a large sample size, with over 1,100 participants; they used live callers and it appears that they took a large number of surveys via cell phone (a must-do in this era). They used both random-digit dialing and weighting of the sample to make it look more like the target population. That technique can sometimes be problematic, but with a sample this large, it isn’t an issue.
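For readers curious what "weighting the sample" actually means in practice, here is a minimal sketch. The demographic cells and all the percentages below are invented for illustration; Quinnipiac's actual weighting targets are not published in this write-up. The idea is simply that each respondent in an over-represented group counts for a little less, and each respondent in an under-represented group counts for a little more.

```python
# A minimal sketch of post-stratification weighting.
# All numbers are hypothetical, chosen only to show the mechanics.

# Share of each age group in the raw (unweighted) sample:
sample_shares = {"18-34": 0.40, "35-54": 0.35, "55+": 0.25}

# Share of each age group in the target population (the electorate):
population_shares = {"18-34": 0.28, "35-54": 0.35, "55+": 0.37}

# Each respondent's weight = population share / sample share for their cell.
weights = {cell: population_shares[cell] / sample_shares[cell]
           for cell in sample_shares}

for cell, w in sorted(weights.items()):
    print(f"{cell}: weight {w:.2f}")
```

In this made-up example, younger respondents would each count as 0.70 of a response and older respondents as 1.48, pulling the weighted sample back toward the electorate. With a sample over 1,100, no single cell has to be stretched very far, which is why the technique is less risky here.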
But, there were a few things that raised our eyebrows.
For starters, during their dialing, they depended on the respondent to “self-identify” as a voter. A better way to conduct calls is to begin with the voter file and verify that the person on the line is the actual voter. The way Q-pac did it here left too much room for error. Is the person a voter who happens to vote in another state? Does the respondent actually get out and cast a ballot? The voter file would have told them that.
Further, and on the subject of self-identification, Q-pac relied on the respondent to declare their party and not the voter file. This is fine, if you want to measure trends and sentiments related to party affiliation, but it is not a good way to balance your sample. Each sample, especially in Florida, absolutely must be balanced according to how voters actually are registered to vote and not how they wish they were registered. That is likely why the sample has 34 percent “Independent,” when the actual number in our state is closer to 27 percent registered, and we reasonably estimate they will be about 23 percent of those who cast ballots in November 2016. Additionally, this sample has a +4 Democrat advantage over Republicans and, while that is close to voter registration, history tells us the actual number will more likely be +1 Democrat. In short, the poll favors the Democratic candidates by a slight margin, and we suspect (given the large number of “Independents” and nearly 40 percent of calls taken on a cell phone) that the sample is also too young.
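To see why a +4 Democrat sample versus a +1 Democrat electorate matters, here is a back-of-the-envelope illustration. The party mix for the sample roughly follows the figures above (34 percent Independent, Democrats +4); the candidate support rates are entirely invented for the example, not taken from the poll.

```python
# Hypothetical illustration: the same support rates, applied to two
# different party mixes, produce different toplines. Support rates
# are invented; only the party-mix shapes come from the discussion above.

support_for_dem = {"D": 0.90, "R": 0.05, "I": 0.48}  # hypothetical support by party

def topline(party_mix):
    """Weighted average of support across party groups."""
    return sum(party_mix[p] * support_for_dem[p] for p in party_mix)

sample_mix    = {"D": 0.35, "R": 0.31, "I": 0.34}  # roughly the poll's +4 D, 34% I
electorate_mix = {"D": 0.39, "R": 0.38, "I": 0.23}  # estimated +1 D, 23% I turnout

print(f"Sample mix topline:     {topline(sample_mix):.1%}")
print(f"Electorate mix topline: {topline(electorate_mix):.1%}")
```

Even with identical support within each party, shifting from the sample's mix to the estimated electorate trims the Democratic candidate's number by about a point – small, but exactly the kind of slight tilt described above.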
Finally, this poll was in the field for 11 days. That’s not a big deal this far out from an election, when opinions are relatively static, but the field period should be shorter as the election approaches. However, and this is important to note, even in a non-election year, public opinion can, and often does, change quickly. (Think “Confederate flags” and how quickly and dramatically the public’s opinion recently changed on this long-lingering issue.)
In conclusion, this is a pretty good poll. Anytime a group goes to the lengths Q-pac did to get it right, we need to acknowledge that. But not verifying that they were talking to actual Florida voters (or even those with a history of voting), a party balance that is off, and reliance on respondent “self-identification” for party registration cause us to give this poll a few grains of salt.