One of the achievements of Roosevelt’s New Deal administration during the Great Depression was the introduction in 1938 of the federal minimum wage, then set at twenty-five cents an hour. At no time in its history has the minimum wage kept workers out of poverty. But it has helped to stave off the full depths of poverty that would otherwise ensue. Since the minimum wage is not indexed to inflation, its real value diminishes over the years, requiring that Congress periodically raise it. The last time this was done was in 2009, when it was increased to the present $7.25 an hour. Obama’s proposal in his State of the Union Address to raise the federal minimum wage to $9 an hour—and to index it to the cost of living so that its real value would no longer diminish with inflation—has received widespread support from the population. At the same time it has been subjected to severe criticism from key sectors of capital, which have gone into overdrive in pushing their claim that such a raise in the hourly wage of the poorest segments of society would be a devastating “job killer,” increasing unemployment.
To grasp the historical significance of this, it is important to understand that the real minimum wage (adjusted for inflation) reached its highest point of $10.50 (in 2010 dollars) in 1968, while today it is lower than it was in 1956, during Eisenhower’s first administration. If Obama’s proposal were adopted the real minimum wage would still be about $1.50 short of where it was in 1968 at the end of the Johnson administration, forty-five years ago. Yet, this paltry attempt to lift the floor of wages for the poorest workers in the United States—at a time when the annual income of a single parent receiving the minimum wage is well below the federal poverty line for a family of three—is coming under virulent attack from the vested interests. Immediately following Obama’s proposal, the Wall Street Journal, Forbes, Fortune, and FOX all issued charges that the minimum wage was a “job killer.” (See Peter Dreier and Donald Cohen, “Raising the Minimum Wage,” Huffington Post, February 23, 2013; John Light, BillMoyers.com, February 22, 2013.)
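The inflation adjustment underlying these comparisons is straightforward deflation by a consumer price index. A minimal sketch, assuming rounded CPI-U figures for illustration only (they are not the official BLS series, and different index choices produce somewhat different real values, which is why published real-wage estimates vary):

```python
# Illustrative sketch: expressing a nominal minimum wage in the dollars
# of another year by deflating with a consumer price index.
# The CPI values below are rounded approximations (1982-84 = 100),
# used here purely for illustration.

CPI = {
    1968: 34.8,
    2009: 214.5,
    2010: 218.1,
}

def real_wage(nominal, wage_year, target_year, cpi=CPI):
    """Convert a wage stated in wage_year dollars into target_year dollars."""
    return nominal * cpi[target_year] / cpi[wage_year]

# The 1968 nominal minimum of $1.60, restated in 2010 dollars:
print(round(real_wage(1.60, 1968, 2010), 2))
```

With these rounded index numbers the 1968 minimum comes out to roughly $10 in 2010 dollars, in the neighborhood of the $10.50 figure cited above (which rests on a more precise index series).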
The plutocratic interests controlling such media outlets would like to convince the population that they spend every waking moment worrying about how to promote jobs. The truth, however, is that these same interests consistently oppose all policies to promote full employment on the grounds that full employment is inflationary and thus a threat to business. Their argument that an increase in the minimum wage would be detrimental to employment as a whole is therefore disingenuous in the extreme. It is also entirely false. Low-wage workers who receive increased hourly wages turn around and spend 100 percent of those wages on consumer goods, thereby increasing effective demand in the economy as a whole. As a result, the overall effect on employment is nil. Nevertheless, the floor under wages is raised, bringing workers at the bottom closer to a living wage.
But if enacting a federal minimum wage hike would have virtually no effect on overall employment and the functioning of the economy as a whole, why then does it face such strong opposition from business circles? Part of the answer clearly lies in the fact that some employers, such as those who use minimum-wage workers to produce luxury goods and services exclusively for the well-to-do, would see their costs go up but not their revenue. Conversely, those branches of industry producing wage goods would see their profits rise. Hence, while overall employment would not be affected, some sectors of industry would experience gains while others would experience losses. A larger part of the answer, however, is that such opposition reflects the fundamental outlook of the capitalist class, whereby increases in wages are always seen as coming directly at the expense of profits. Related to this is the prevailing, but mistaken, assumption that the beginning of the economic problems of the U.S. economy in the 1970s had to do with a wage squeeze on profits, leading to the stagflation era—which has been used to justify the neoliberal strategy of forever weakening labor in relation to capital. The right turn associated with Reaganomics in the 1980s made minimum-wage legislation a major target. A master of the big lie, Reagan went so far as to declare: “The minimum wage has caused more misery and unemployment than anything since the Great Depression” (Dreier and Cohen, “Raising the Minimum Wage”). The New York Times jumped on the bandwagon, editorializing in its January 14, 1987, issue: “The Right Minimum Wage: $0.00.”
What big business is worried about today has less to do with an increase in the minimum wage itself than with the larger effect this could have in eroding its overall neoliberal political-economic strategy of continually ratcheting up the rate of exploitation in U.S. society in order to generate a larger economic surplus to fill corporate coffers. A sizeable hike in the minimum wage—even if it only brings the real minimum wage under Obama’s proposal back to where it was fifty years ago—is seen by capital as a sign of resistance from below, however faint, and hence is to be quashed; or, if that should prove impossible, to be contained within narrow limits. (For an empirical treatment of the class war over the wage share in the United States see Fred Magdoff and John Bellamy Foster, “Class War and the Declining Labor Share” in the March issue of MR. For a wider historical and theoretical treatment see John Bellamy Foster, “Marx, Kalecki, and Socialist Strategy” in this issue.)
One of the most notable left intellectuals in the United States is Gar Alperovitz, professor of political economy at the University of Maryland. As an undergraduate Alperovitz studied at the University of Wisconsin under the great revisionist historian William Appleman Williams (a frequent contributor at the time to Monthly Review). He went on to pursue a PhD in economics at Cambridge University under Joan Robinson (also an MR contributor). Alperovitz’s doctoral dissertation, which was eventually published as the book Atomic Diplomacy, constituted a revisionist historical account of the decision to drop the atomic bombs on Hiroshima and Nagasaki. He demonstrated that Truman’s decision to use the bomb on civilian populations was opposed by U.S. military leaders—and was not so much the last act in the Second World War as the first act in the Cold War. (Truman’s objective in dropping the bomb was to force an immediate unconditional surrender—Japan had offered to surrender but the security of the emperor remained a sticking point—in order to preempt the further advance of Soviet troops into Asia and the expansion of the Soviet sphere of influence.) Alperovitz’s latest initiative is represented by his brand-new book, What Then Must We Do?: Straight Talk About the Next American Revolution. He describes the systemic crisis in which the United States is trapped as one of “punctuated stagnation”: The nature of our current economic predicament, he writes, “is not [one of] collapse, but rather an odd form of painful stagnation—the hallmark of which is economic decay, with occasional significant downturns and (at best) modest and temporary upticks around a sickening, uncertain and debilitating norm” (126). (Here he footnotes John Bellamy Foster and Robert McChesney’s book, The Endless Crisis: How Monopoly-Finance Capital Produces Stagnation and Upheaval from the United States to China [Monthly Review Press, 2012].)
We strongly agree with Alperovitz that a “punctuated stagnation,” in the sense he has described it, is the most accurate reading of contemporary economic reality and sets the stage for what could be “the next American revolution.” His book can be obtained from Chelsea Green Publishing at http://chelseagreen.com.
Correction: On page 44 of the March 2013 MR, in Al Ruben’s “The Man Who Was Over the Rainbow,” line 18 should read “four volumes” instead of “three.”