Beware of Bad Research on the Minimum Wage from a Great San Diego University

Get that degree in blacksmithing

Do Blacksmiths Earn Above Minimum Wage?

A new, highly flawed study on the “job-killing” impacts of the minimum wage, by a professor of economics at the University of California, San Diego (UCSD), went viral among far-right blogs. That was to be expected. But now it’s making its way, uncritically, into more professional forums, like the Cato Institute, the American Enterprise Institute, National Review, and Forbes. Soon it will be on nervous “mainstream media,” like CNN, which want to be sure, at all costs, that they are “balanced,” even if one of the sides is based on junk science (and the other is not).

That this defective work comes from a credentialed faculty member in a very fine economics department at a distinguished university of course adds to its credibility, and to its harm. Where was the peer review? The work in question, examining the impact of the 2007 mandated increase in the federal minimum wage from $5.15 to $7.25 per hour in three increments, concludes that “binding minimum wage increases had significant, negative effects on the employment and income growth of targeted workers. Lost income reflects contributions from employment declines, increased probabilities of working without pay (i.e., an “internship” effect), and lost wage growth associated with reductions in experience accumulation.”

The research design of this study is terribly flawed, on several grounds. First, the main study period begins in July 2007, just before the U.S. financial collapse and the Great Recession, and ends in July 2009, which is not really when the recovery begins; it’s more like when the free fall stopped.

Trying to evaluate the impacts of a small, gradual increase in the minimum wage (MW), affecting a fraction of the workforce, in the midst of an earthshaking set of economic events is pure (unscientific) folly, even if some effects in the study extend a year or so beyond this highly exceptional episode in U.S. economic history. (Actually, we are not yet beyond the episode.) The author’s acknowledgement of this is (amazingly) perfunctory, and his effort to adjust for it even weaker.

Under the best of study conditions — a period of economic stability with few or no seismic events — it is extremely difficult to reach “conclusions” about the impact of changing a single economic variable. It’s like trying to study the environmental effects of hydraulic fracking during a period that coincides with a magnitude 8.0 earthquake on the Richter scale.

A second and related flaw in the study is that ANY brief time frame for examining the impact of a minimum wage change is simply too short to reach valid conclusions (quite apart from the fact that this one coincided with the Great Recession). You are always going to find some (at least short-term) negative employment impacts from an increase in the MW. But raising the MW also puts more money in the pockets of the (vast majority of) workers who don’t lose their jobs. They spend the extra dollars in the local economy, creating (some) jobs elsewhere in the region. That’s why many MW studies show no or negligible net job loss, and sometimes even small job gains. You need a little time for the cascading effects to show up. (You can find a summary of that literature here.)

Most professional economists agree that large, broad-based, precipitous MW changes can be highly destabilizing, lead to significant, long-term job losses, and have the opposite of the intended effects on low-income earners. But that is not at all the nature of the MW changes mandated by the Fair Minimum Wage Act of 2007 and examined in the UCSD study.

A third, serious flaw in the study concerns the data source, the Survey of Income and Program Participation (SIPP). Self-reported income and wages from surveys are notoriously poor. Read about that, and more, here and here. Administrative data (like unemployment insurance [UI] and Social Security Administration [SSA] reports) are not perfect, but decades of research overwhelmingly conclude that SIPP (and other survey data) significantly understate wages compared with UI and SSA data. The reported differences are in the $1,000 to $3,000 range annually. That’s large.

The author may say, “Well, yes, SIPP underreports wages (which it does), but I’m not looking at absolute levels; I’m looking at ‘trends.’” That would be a nice try. But SIPP wage data not only underestimate wages; they are also notoriously unstable and volatile from year to year. SIPP may provide good trend results over a 10- or 20-year span, but using it to pinpoint the impacts of a small change in the economy (the MW) over two or three years is amazingly irresponsible.

The SIPP wage data, by the way, also include tips; i.e., respondents are asked to include tip income in their recollections. (Now, there is something the typical respondent is going to report with tremendous accuracy!) Many workers directly affected by the MW receive significant tip income. In the midst of the greatest recession since the Great Depression, the size of tips likely fell, quite apart from any effects a (small) change in the MW may have had. That is not addressed in the study, as best I can tell. I could have missed something buried in a footnote.

Even if this study were without serious flaws in design and methods, the best that can be said for it is that it confirms the belief that public policies which raise business costs during a severe recession or a halting recovery may not be a good idea. That’s elementary.

Ironically, the minimum wage isn’t even the most effective way to address wage and income inequality. Expanding the Earned Income Tax Credit (EITC) is a much better approach. Still, that doesn’t mean junk science attacking the MW ought to be published under the banner of a great university.


2 thoughts on “Beware of Bad Research on the Minimum Wage from a Great San Diego University”

    1. Irv Lefberg Post author

Thanks for the comment, Brian. This paper shows up as a National Bureau of Economic Research (NBER) “working paper,” so I guess it’s had some form of peer review. The author seems to have a relationship of sorts with NBER; this is not his first NBER working paper. I thought some of the others were pretty good. I may have been a little too harsh, attacking it in blog style, but the certitude with which he stated the conclusion really bugged me. The Abstract was written as if designed to get max PR in the right-wing blogosphere. It did. I don’t think he’s generally partisan, as far as I can tell. As you may recall, NBER (based in Cambridge, Massachusetts) was, and maybe still is, the place that “officially” (whatever that means) designates national recessions. (They’re still revising the dates of the 1929 one.) It’s not over till the fat lady at NBER sings. (I sincerely hope there really isn’t a corpulent female economist at NBER, or I’d be in real trouble. I am anyway.)


