Stupid Stimulus Tricks

So there’s another the-stimulus-didn’t-work paper (pdf) making the rounds, and as usual being seized on by people who have no idea what the issues are with this kind of estimation.

Basically I’m with both Dean Baker and Noah Smith here, but I thought I might add some more general discussion.

What this study claims to do is estimate the effect of the stimulus by looking at cross-state comparisons. So the first thing we should understand is just how difficult it is to do that.

Remember, the stimulus was not big compared with the economic downturn. The original Romer-Bernstein estimate was that it would, at peak, reduce unemployment by about 2 percentage points relative to what it would otherwise have been. And most of that effect was supposed to come through measures that would have been common to all states: tax cuts, transfer payments, etc. At most, the differences in predicted effects among states should have amounted to a fraction of a percentage point on the unemployment rate.

Meanwhile, there were large differences in actual unemployment changes by state. Here’s the change in the unemployment rate from 2007 to 2010:

[Figure: change in the unemployment rate by state, 2007 to 2010. Source: BLS]

Obviously there were factors other than the stimulus driving the great bulk of these differences. At the top are the “sand states” that had the biggest housing bubbles; at the bottom, cold places where nobody lives.

To tease any effect of the stimulus out of these interstate differences, if it’s possible at all, would require very careful and scrupulous statistical work — and we’d like to see some elaborate robustness checks before buying into any results thereby found.
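The signal-to-noise problem can be sketched with a toy simulation. This is not a reanalysis of any actual data; the numbers below (a true effect of -0.3 points of unemployment per point of stimulus intensity, cross-state stimulus variation of half a point, state-level shocks with a standard deviation of 2 points) are illustrative assumptions chosen to match the rough magnitudes discussed above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_states = 50

# Hypothetical per-state stimulus intensity (illustrative, not actual data):
# modest variation around a common national level.
stimulus = rng.normal(loc=2.0, scale=0.5, size=n_states)

# Assumed true effect: each extra point of stimulus shaves 0.3 points off
# unemployment, so cross-state differences in the effect are small...
true_effect = -0.3

# ...while state-level shocks (housing bust, industry mix) move unemployment
# by a couple of points either way.
shocks = rng.normal(loc=5.0, scale=2.0, size=n_states)

d_unemployment = true_effect * stimulus + shocks

# Simple OLS of the unemployment change on stimulus intensity,
# with the standard error of the slope.
x = stimulus - stimulus.mean()
y = d_unemployment - d_unemployment.mean()
slope = (x @ y) / (x @ x)
resid = y - slope * x
se = np.sqrt((resid @ resid) / (n_states - 2) / (x @ x))

print(f"estimated effect: {slope:.2f} (se {se:.2f}), true effect {true_effect}")
```

Under these assumptions the standard error of the estimate is larger than the true effect itself, so even a correctly specified regression cannot distinguish the stimulus signal from zero — which is the point: with 50 observations and this much state-level noise, a fraction-of-a-point effect is essentially undetectable.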

The latest anti-stimulus paper shows no sign of that kind of care. It makes no effort to control for the differential effects of bubble and bust. It uses odd variables on both the left and the right side of its equations. The instruments — variables used to correct for possible two-way causation — are weak and dubious. Dean Baker suspects data-mining, with reason; the best interpretation is that the authors tried something that happened to give the results they wanted, then stopped looking.
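The weak-instrument problem mentioned above can also be illustrated with simulated data. A standard diagnostic is the first-stage F-statistic (rule of thumb: you want F well above 10); below, a hypothetical instrument only weakly correlated with stimulus intensity is paired with an omitted common shock that drives both stimulus receipts and unemployment. All coefficients are made-up assumptions for illustration, not estimates from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50  # states

# Unobserved state-level shock (e.g. the housing bust) that drives BOTH
# stimulus receipts and unemployment -- the two-way-causation problem.
shock = rng.normal(size=n)

# A weak instrument: only faintly correlated with stimulus intensity.
z = rng.normal(size=n)
stimulus = 0.1 * z + 0.8 * shock + rng.normal(size=n)

# Assumed true causal effect of stimulus on the unemployment change is -0.3,
# but the common shock pushes unemployment up at the same time.
d_unemp = -0.3 * stimulus + 1.5 * shock + rng.normal(size=n)

# First stage: regress stimulus on the instrument; F-statistic for strength.
zc = z - z.mean()
sc = stimulus - stimulus.mean()
pi = (zc @ sc) / (zc @ zc)
fs_resid = sc - pi * zc
f_stat = (pi ** 2 * (zc @ zc)) / ((fs_resid @ fs_resid) / (n - 2))

# IV (Wald) estimate: cov(z, y) / cov(z, x).
yc = d_unemp - d_unemp.mean()
iv_estimate = (zc @ yc) / (zc @ sc)

print(f"first-stage F: {f_stat:.1f} (rule of thumb: want F > 10)")
print(f"IV estimate: {iv_estimate:.2f} (assumed true effect: -0.3)")
```

With an instrument this weak, the first-stage F comes out far below the conventional threshold and the IV estimate is essentially noise — exactly the situation in which elaborate robustness checks, not a single specification, would be needed before believing any result.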

Really, this isn’t the sort of thing worth wasting time over.