Economist Dean Baker Points Out “Problems” With Fox-Hyped Paper On How Stimulus Supposedly Destroyed Jobs

As Media Matters has noted, Fox's supposedly “straight news” division is hyping a new paper suggesting that the 2009 economic stimulus resulted in the loss of approximately 550,000 jobs. Fox presented the paper's conclusions as fact, even though the non-partisan Congressional Budget Office and numerous economists say the stimulus helped curtail job losses and increase economic output.

Now, economist and Center for Economic and Policy Research co-director Dean Baker has weighed in, pointing out “a few problems with the paper.” Baker notes that the paper may suffer from “the problem of cherry picking.” From his May 17 post on the Center for Economic and Policy Research blog, headlined “The Stimulus Did Not Create Jobs: The 35,496th Try”:

With an exercise like this, you always have to worry about the problem of cherry picking. It is very easy to run 1000 regressions in an hour. Inevitably, you find 4 or 5 of these 1000 that show you almost anything. (Our standard of significance is a result that you would not get by random chance more than 10 times in a hundred. This means that if you ran 1000 regressions of things that had nothing to do with each other, you would expect 100 of them to have statistically significant results.)

[...]

Their results depend on pulling out four private sector industry groups (lumped together) and measuring the stimulus against trend job growth in these industries. Even for these four industry groups, most of the results are only marginally significant. It is clear from their tables that if they took all private sector jobs, their results would be insignificant. So, how did they decide on lumping these four industry groups together? It certainly is not a standard break out. It certainly does raise a suspicion that they ran many different regressions and then discovered that they got the results they wanted with these four industries lumped together.
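The multiple-comparisons problem Baker describes is easy to demonstrate with a quick simulation (this sketch is not from Baker's post; the sample size and seed are arbitrary assumptions). Regressing 1,000 pairs of unrelated random series against each other at a 10 percent significance threshold produces on the order of 100 spuriously “significant” results:

```python
# Illustration of Baker's point: at a 10% significance threshold,
# running many regressions on pure noise still yields "significant"
# results about 10% of the time.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)          # arbitrary seed for reproducibility
n_regressions, n_obs, alpha = 1000, 50, 0.10

false_positives = 0
for _ in range(n_regressions):
    x = rng.normal(size=n_obs)          # noise "predictor"
    y = rng.normal(size=n_obs)          # noise "outcome", unrelated to x
    _, p_value = pearsonr(x, y)         # p-value for the x-y relationship
    if p_value < alpha:
        false_positives += 1

print(false_positives)                  # on the order of 100 out of 1000
```

None of these series have anything to do with each other, yet roughly a tenth of the regressions clear the significance bar by chance alone, which is why a result obtained only after trying many specifications deserves suspicion.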

Baker also identifies other “peculiar items” in the paper and concludes: “In short, there are many unusual aspects to this analysis and very little effort to determine whether these quirks are driving the results.”

His entire post is worth a read.