About that Tampa Bay Times poll released last night …


Author’s note: I started looking into this poll based on the strange methodology details that I saw in the reporting on it. I’m a registered NPA (No Party Affiliation) voter and have been since I moved to Florida 13 years ago. At StPetePolls, clients from all across the political spectrum use our polling services.

I know this analysis will discount the Tampa Bay Times and its new pollster to the benefit of the Democratic Party, but I would like to mention that six months ago the Democratic side was discounting our polling of the CD-13 special election, which showed David Jolly winning.

My primary goals in this analysis are to make polling better and more accurate, and to expose the flaws in the poll the Times released last night.

After reading the analysis from the Tampa Bay Times, Bay News 9 and My News 13 (the organizations that sponsored the statewide poll released last night), I found that the numbers just didn’t add up and that the methodology sounded a bit out of the ordinary.

Ignoring the results themselves, I decided to look deeper into the organization conducting the polling, as well as its methods for calculating the results.

First, I was happy to see that the Tampa Bay Times had dropped Braun Research as its polling company. This group famously declared that Romney would win Florida by six percent and that Alex Sink would win the special congressional election by seven percent. But what about the Times’ new pollster, the University of Florida’s Bob Graham Center? What was its track record on political polling in Florida?

More on that later.

Let’s start with the information that they have released in the articles published so far.

The sample size was 814 respondents, relatively small for a major polling release from the most influential newspaper in Florida. Most other large polling companies, like Quinnipiac, use samples at least 50% larger when polling statewide in Florida.

The UF Graham Center’s website says that a typical political survey will have a sample of over 1,000. When I asked about this difference, staff at UF said that the Labor Day holiday made it more difficult to get a larger sample.
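For context, here is a quick back-of-the-envelope comparison of the margins of error involved. This is the standard simple-random-sample approximation, supplied by me, not a figure published by the Times or UF:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 814:  +/- {margin_of_error(814):.1%}")   # about 3.4 points
print(f"n = 1221: +/- {margin_of_error(1221):.1%}")  # about 2.8 points
```

Weighting tends to widen those intervals further (the design effect), so the practical uncertainty is likely larger than these raw numbers suggest.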

The next thing I noticed didn’t surprise me at all: there were no cross-tabs letting you know the full demographics of the voters surveyed. This is how the Times typically operates, and it makes the results harder to independently analyze. But the interesting thing I learned when I spoke with staff at UF was that after the Times writes two more sets of articles on this poll, UF will be able to release the full cross-tabbed report.

I very much look forward to finally seeing a full Times-sponsored polling report.

The “weighting” mechanisms were the last thing that caught my eye, mostly because of how non-standard they were for a public poll.

First, the use of “media market” as a weighting mechanism is pretty rare in public polling. The reason UF staff gave for this was that their political consultants recommended it as a metric. That’s not surprising here in Florida, because candidates need to know which media markets to spend their money in. But this metric is hard to find in public polls conducted for media organizations like the Times.

Second, the absence of gender and race as weighting mechanisms was a bit confusing, especially given all of the press discussing the “Women’s vote” or the “Hispanic vote.” Failing to weight by these metrics treats them as less important factors in this race, which is a hard contention to back up. When I asked about this, the staff at UF said it was a good question, that they don’t usually do political polling, and that this was only their second political survey.

Third, the political party weighting was done in a strange way. The results are weighted by “registered party,” but the party information is determined by asking which party the respondent “identifies with.” Self-identified independents typically outnumber registered independents, so this approach often inflates the number of Independent responses and then makes each of them count for less than those of the two major parties.

The Times article states that the poll sampled 35% Republicans, 30% Democrats and 26% Independents. But that only adds up to 91%, so what happened to the other 9%? I asked the UF staff about this, and they said those respondents were “Other” and would have been weighted down to the roughly 1% share that “Other” holds in their voter data source files.

They further explained that the data they received had political parties divided into four categories: Republican, Democrat, No Party and Independent, with about 1% left over as a fifth category, “Other.”

So the Republican and Democratic respondents, 65% of the sample, are weighted much higher than the 9% of respondents who said they belonged to an “Other” party. Separate from the “Other” respondents, Independent and No Party respondents were combined in the results and accounted for 26% of both poll respondents and registered voters.
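To make that arithmetic concrete, here is a simplified sketch of how this kind of party weighting plays out. The sample shares come from the Times article, the 1% “Other” and 26% Independent/No Party targets from UF’s explanation above, and the Republican and Democratic targets are hypothetical placeholders, since UF hasn’t released them:

```python
# Illustrative party weighting: weight = target share / sample share.
sample = {"Republican": 0.35, "Democrat": 0.30, "Ind/NPA": 0.26, "Other": 0.09}
target = {"Republican": 0.39, "Democrat": 0.34,  # hypothetical R/D targets
          "Ind/NPA": 0.26, "Other": 0.01}        # shares described above

weights = {party: target[party] / sample[party] for party in sample}
for party, w in weights.items():
    print(f"{party:<11} weight = {w:.2f}")
# Each "Other" respondent counts for about one-ninth of a full response,
# while Republicans and Democrats count for slightly more than one.
```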

This led me to ask where they got their voter data, since the Florida Division of Elections doesn’t group voters in that way. I found out that UF obtained its lists from Labels and Lists (L2Political.com), a data company based in Bellevue, Washington.

Fourth, they did weight by age, but most polls do not rely on age as the only demographic trait for weighting.
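For comparison, most public pollsters balance several demographics at once using raking (iterative proportional fitting). Here is a minimal sketch of that technique; the cell counts and the age and gender targets are entirely hypothetical and not drawn from UF’s methodology:

```python
# Minimal raking sketch: rescale weights so each margin matches its target.
cells = {("18-44", "F"): 120, ("18-44", "M"): 140,   # respondents by age x gender
         ("45+",   "F"): 260, ("45+",   "M"): 294}   # (hypothetical counts)
age_targets    = {"18-44": 0.45, "45+": 0.55}        # hypothetical electorate shares
gender_targets = {"F": 0.53, "M": 0.47}

total = sum(cells.values())
weights = {cell: 1.0 for cell in cells}

for _ in range(50):  # alternate over the two margins until they converge
    for idx, targets in ((0, age_targets), (1, gender_targets)):
        for level, share in targets.items():
            current = sum(cells[c] * weights[c] for c in cells if c[idx] == level)
            factor = share * total / current
            for c in cells:
                if c[idx] == level:
                    weights[c] *= factor

for cell, w in sorted(weights.items()):
    print(cell, round(w, 3))
```

Each pass rescales the weights so one margin matches its target, and alternating between the margins quickly converges to weights that satisfy both.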

As for the other elements of the poll, they were all on the mark. The polling was conducted in both English and Spanish by human operators; the questions appeared to be neutral and balanced (although I have not seen a full script to confirm this); and, within the parameters they used, the data analysis appears to be accurate, based on the limited data available for me to examine right now.

Looking more into the UF Bob Graham Center and its previous work, I found it has a long and positive history of mostly economic survey work. The people involved have decades of experience and are well published in the arena of economic polling.

Talking on the phone with them, I found them to be very knowledgeable and open about the work they do. They also appear to have a well-tested survey call center where most of their polling is conducted.

The issues I have detailed above mostly seem to stem from their limited experience with political polling and their reliance on two outside political consultants. I’m hoping that as they continue in this area, they will take input from more outside sources in the future.

But I would caution against putting too much faith in the accuracy of this specific poll.