Conservative bloggers don't seem to understand how polling works

The thing that bothers me most about the whole “liberal media bias” argument is that it invites laziness and incuriosity. Armed with the “liberal bias” truncheon, any conservative can simply dismiss out of hand news they don't like or disagree with based on the presupposition that the media outlet that reported it is biased towards liberals and against conservatives.

Take, for example, the latest poll out from the Washington Post and ABC News, which found that Americans are wildly anti-incumbent right now, but also that the public trusts Democrats more than Republicans on major issues facing the country. Obviously, conservatives would not take kindly to that second finding, and Hot Air's Ed Morrissey chalks it up to perfidious liberal bias: “Today's Washington Post/ABC poll offers Democrats some bad news, but they've managed to artificially temper it by actually adding a point to the partisan gap in their sample since the last survey.” Picking up where Morrissey left off, NewsBuster Noel Sheppard wrote that the pollsters “cooked the books,” adding: “The lengths liberal media outlets will go to assist the politicians they support is oftentimes sick-making.”

This is insane.

Here's a quick refresher on how public opinion polling works. The pollster, in most cases, will pick up the telephone and survey a random sample of Americans, asking respondents their party ID only after they have them on the phone. This, according to their methodology, is how the Post/ABC pollsters operate: “This Washington Post-ABC News poll was conducted by telephone April 22-25, 2010, among a random national sample of 1,001 adults including users of both conventional and cellular phones.” [emphasis added] They have absolutely no control over the partisan make-up of the sample, and they give no indication that they weighted by party ID.
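For a sense of scale, a random sample of 1,001 adults carries a margin of error of roughly three points, so a one-point shift in the party ID split is well within ordinary sampling noise. Here's a quick back-of-the-envelope calculation, sketched in Python; the 26 percent Republican share is just an illustrative placeholder, not the poll's actual number:

```python
# Back-of-the-envelope margin of error for a proportion in a sample of n = 1,001.
# The 26% Republican share is purely illustrative, not the poll's actual figure.
import math

n = 1001   # sample size reported in the Post/ABC methodology
p = 0.26   # hypothetical share of the sample identifying as Republican
z = 1.96   # z-score for a 95% confidence level

margin = z * math.sqrt(p * (1 - p) / n)
print(f"95% margin of error: +/- {margin * 100:.1f} points")
# Roughly +/- 2.7 points here (about +/- 3.1 at p = 0.5) -- a one-point
# swing in party ID is well inside the noise.
```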

I should point out, though, that I'm not a polling expert. FiveThirtyEight.com's Nate Silver, however, is a polling expert, and here's what he had to say back in 2008 when a bunch of similarly misguided liberals complained about the partisan makeup of a Fox News poll:

It's true that FOX's sample included a materially higher percentage of Republicans this time around. FOX, however, does not choose its sample; its sample chooses itself. In this case, when they drew their ping-pong balls out of the jar, they came up with a slightly higher percentage of red ones. This kind of thing will happen all the time unless a pollster weights by party ID, which FOX News and many other pollsters do not. The Pew poll that came out the other day, for instance, had a big increase in the number of Democrats in its sample.

To get an idea of the lengths to which pollsters go to ensure as random a sample as possible, check out this piece from Pollster.com's Mark Blumenthal.

Put simply, the Post/ABC poll's (minor) change in party ID can be explained in one of two ways. First, the random sample of Americans the pollsters contacted this time around randomly contained slightly fewer people willing to identify themselves as Republicans, a phenomenon polling experts say happens all the time with pollsters who sample randomly. Second, the deviously liberal pollsters decided that they would act in a monstrously unethical fashion and risk professional credibility by purposefully undersampling Republicans in order to manufacture some modestly good news for the Democrats they love and adore.
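To make that first explanation concrete, here's a minimal simulation, again in Python and again with a made-up party split (I don't have the poll's crosstabs): draw repeated random samples of 1,001 adults from a population whose party ID never changes, and the Republican share still wobbles from survey to survey.

```python
# Minimal simulation: draw repeated random samples of 1,001 "adults" from a fixed
# population and watch the Republican share bounce around from draw to draw.
# The population split (33% D, 26% R, 41% other) is a made-up illustration,
# not the actual distribution of party ID.
import random

random.seed(42)
population_split = {"Democrat": 0.33, "Republican": 0.26, "Other": 0.41}
parties = list(population_split)
weights = list(population_split.values())

for survey in range(1, 6):
    sample = random.choices(parties, weights=weights, k=1001)
    rep_share = sample.count("Republican") / len(sample)
    print(f"Survey {survey}: {rep_share:.1%} Republican")

# Even with an unchanging population, the Republican share routinely shifts
# by a point or more between surveys -- no book-cooking required.
```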

Ed Morrissey and Noel Sheppard would have you believe option number two, based solely on the fact that the party ID changed by one point, and the assumption that the pollsters -- since they are part of the “liberal media” -- are disreputable Democratic Party flacks.

And the sad part is that this happens all the time in the conservative blogosphere. If a poll comes out that they don't like, their default response is to claim that the liberal pollster “cooked” the results.

It's lazy, it's stupid, and it's wrong.