Rasmussen And Corporate Toadies In Cahoots; Pollster.com Named As Co-conspirator
-by Doug Kahn
Pollster.com is hawking charts that obfuscate the actual results of most of the reputable pollsters that are supposedly contained in them. Is obfuscate the right word for it? Let’s see: obfuscate, blur, shade, cloud, confuse, conceal, mask, cover up. Maybe dilute is better. Water down, thin out, adulterate, doctor. That’s the word I want. Doctor.
They display the actual polls and their percentages in a list, and then doctor them on the way into the charts. The math they use allows an individual pollster to supply results that substantially change, and even reverse, the results reported by other pollsters. Pollster.com is giving Rasmussen control over the national conversation on politics, with the help of credulous news people and political bloggers who display these charts on their web sites. I’m not going to name names. You know who you are.
Journalists in print and online reproduce the charts as though they were news, as though they represented reality, the truth. They don’t. One pollster can use Pollster.com to change Obama favorables into unfavorables, or a Democratic advantage on the House generic ballot into a Republican advantage.
A political poll doesn’t ask every eligible voter how they’re going to vote; it asks a sample of voters. If you were to poll 500 randomly selected voters, and do that 100 times, you’d get different results every time, but the results would cluster around one number, close to reality. Imagine a curve shaped like a bell; the top of the bell is the most frequent result, the presumed truth of the matter. By chance, some of the polls will be way off; those results are the tails, the bottom left and right edges of the bell, the ‘outliers.’ It should seem obvious that when you poll more people you get more reliable results. A common standard: poll enough voters that if you ran the poll 100 times, only about 5 of the results would fall outside the stated margin of error, typically 3 or 4 percentage points.
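The bell-curve idea can be sketched with a short simulation. This is a toy model, not real polling data: the 50% "true" level of support, the sample size, and the random seed are all arbitrary assumptions for illustration.

```python
import random

random.seed(1)
TRUE_SHARE = 0.50   # hypothetical true level of support
N = 500             # respondents per simulated poll
POLLS = 100

# 95% margin of error for a proportion: 1.96 * sqrt(p*(1-p)/n)
moe = 1.96 * (TRUE_SHARE * (1 - TRUE_SHARE) / N) ** 0.5

# Run the same 500-person poll 100 times.
results = []
for _ in range(POLLS):
    hits = sum(random.random() < TRUE_SHARE for _ in range(N))
    results.append(hits / N)

# Most results cluster near the truth; roughly 5 land outside the margin.
inside = sum(abs(r - TRUE_SHARE) <= moe for r in results)
print(f"margin of error: +/-{moe:.1%}")
print(f"{inside} of {POLLS} simulated polls fell within the margin of error")
```

For 500 respondents the margin works out to about ±4.4 points, and on any given run roughly 95 of the 100 simulated polls land inside it; the stragglers are the outliers in the tails of the bell.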
Let’s look at the 2010 national Congressional generic ballot. Rasmussen is a prolific pollster: since January 1, 2010, Rasmussen has reported polling 21,000 people, versus 18,786 for every other pollster combined. The reason to aggregate all the polls is accuracy: you examine the responses of as many voters as possible. Polling is expensive, so polling numbers are hard to come by, especially in local races. But even if you simply pool all the interviews and average the results, Rasmussen ends up being the overwhelming influence on the final number.
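As a rough sketch of what pooling does, the interview counts below are the ones cited above, but the two margins are invented for illustration; they are not actual 2010 poll results:

```python
# Interview counts cited in the text.
rasmussen_interviews = 21_000
others_interviews = 18_786

# Hypothetical margins: suppose Rasmussen shows R+4 while the
# combined other pollsters show D+2 (invented numbers).
rasmussen_margin = -4.0   # negative = Republican advantage
others_margin = +2.0      # positive = Democratic advantage

total = rasmussen_interviews + others_interviews
pooled = (rasmussen_margin * rasmussen_interviews +
          others_margin * others_interviews) / total

print(f"Rasmussen's share of all interviews: {rasmussen_interviews / total:.0%}")
print(f"Pooled margin: {pooled:+.1f} (positive = Democratic advantage)")
```

Because Rasmussen supplies more than half of all interviews, the pooled margin comes out Republican-leaning even though every other pollster combined shows a Democratic lead.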
But that’s not the way these charts are created; the math Pollster.com uses gives Rasmussen an even bigger advantage. They use a statistical method called regression analysis, whose purpose is to make the trend line match reality more closely by reducing the influence of random variation.
The sheer frequency of Rasmussen’s reported results gives their figures heavy influence on the charts. Pollster.com hands Rasmussen, an extreme outlier, an extra advantage just by the way the statistics are done: a regression analysis gives the more frequent pollster more control over the curve. That makes sense, but only if the inputs deserve equal trust. To use regression analysis honestly, you have to presume that all the results are comparable (in polling, that all the pollsters ask the same question) and that they’re all honestly reported figures.
Here’s the current 2010 National Congressional generic ballot chart, with a Republican advantage of 43.6% to 43%.
You can reduce Rasmussen’s advantage by using the tools Pollster.com gives you to customize the charts, for instance by increasing the ‘sensitivity’ of the trend line to more recent polls. There have been 25 polls in 2010, and six of them are from Rasmussen. Increasing the sensitivity gives the other 19 polls more influence. Here’s the same chart giving the non-Rasmussen pollsters more weight: a Democratic advantage, 44.7% to 42.8%. That’s a swing of 2.5 points.
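A toy kernel-smoothing sketch shows the mechanics. This is not Pollster.com’s actual model (they use a form of local regression), and the polls, dates, and margins below are invented: one frequent pollster files a negative margin every ten days while four other firms file positive margins less often. The ‘sensitivity’ setting corresponds roughly to the smoothing bandwidth.

```python
import math

# Invented polls: (day, Democratic margin). One frequent pollster reports
# D-2 every ten days; the other firms report D+2 less often.
polls = [(d, -2.0) for d in range(0, 100, 10)]       # frequent outlier
polls += [(d, +2.0) for d in (5, 30, 55, 80, 95)]    # everyone else

def trend(day, bandwidth):
    """Gaussian-kernel local average: a toy stand-in for a fitted trend line."""
    wsum = vsum = 0.0
    for d, v in polls:
        w = math.exp(-((day - d) / bandwidth) ** 2)
        wsum += w
        vsum += w * v
    return vsum / wsum

# Wide bandwidth = low sensitivity; narrow bandwidth = high sensitivity.
for bw in (40, 5):
    print(f"bandwidth {bw}: trend at day 95 = {trend(95, bw):+.2f}")
```

In this toy setup the wide-bandwidth trend comes out negative, because the frequent pollster’s ten data points outvote everyone else across the whole window, while the narrow-bandwidth trend comes out positive, because only the polls nearest day 95 matter and the frequent pollster no longer dominates them.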
Here’s the chart the way I always look at Pollster.com trends, with the sensitivity on high and Rasmussen removed. Democratic advantage, 46.6% to 42%.
You can set the sensitivity on high and use the tools to remove other pollsters from the mix; I’ve done that with this chart and with the national charts measuring Obama’s job approval and favorability ratings. Removing Rasmussen produces a swing of up to 5 points. Removing any other pollster produces a swing of no more than 1 point.
All through the year after Obama’s election, Pollster.com reported Rasmussen’s ‘Obama favorability’ as wildly divergent from every other pollster’s. Mark Blumenthal, who runs the site, had inserted Rasmussen’s ‘Obama job approval’ numbers (which always run more negative than favorability ratings). After repeated complaints and objections that the Rasmussen results had to be wrong, he removed all of these numbers without telling anyone. His September 29, 2009 public explanation was buried on the site in a ‘P.S.’ to a complaint from someone else about an impossible Rasmussen result from Iowa. It made no logical sense to me:
“We discovered over the last few days that what we believed to be a weekly update of Barack Obama's favorable rating on RasmussenReports.com had been changed at some point to a weekly job approval rating, even though the labels still read favorable and unfavorable rather than approve and disapprove up until about a week ago. As such, we have removed the non-favorable ratings from our national Barack Obama favorable rating chart.”
That contradicts reality: for a year he reported numbers of respondents in the supposed Rasmussen favorability polls that were different from the numbers in the Rasmussen approval polls. How does that correspond to the explanation that it was a labeling error?
The Pollster.com charts, which understated Obama’s favorable rating advantage by almost 10 points, were viewed hundreds of thousands of times on numerous political web sites. Arguably, they affected the fate of Democratic proposals in Congress.
Here’s something that needs an explanation. If you take the Obama favorability chart and restrict the date range to the 12 months between 11/20/08 and 11/20/09, you get a list that includes no Rasmussen results and a chart that corresponds:
But when you use the customization tool to ‘remove’ the already absent Rasmussen, the chart numbers move! The ghost of Rasmussen past apparently influences the other numbers.