[p2p-research] Is the "lump of labor fallacy" itself a fallacy?

Paul D. Fernhout pdfernhout at kurtz-fernhout.com
Sat Nov 28 17:02:11 CET 2009

Michel Bauwens wrote:
>> Michel mentioned US$40 an hour as the inflation and productivity and equity
>> adjusted figure for a minimum wage -- I'd love a reference? I'm starting to
>> think raising the minimum wage back to US$40 an hour may be what it takes?
>> :-) But I'd rather see a basic income and no minimum wage. It may be too
>> late for a raise in minimum wage to fix things at all, given how fast
>> companies can automate and redesign.
> Hi Paul,
> it was a recent study by a union in California, on the county level, I saw
> it about 6 weeks ago, trouble is, I spent half an hour in my tags and on
> google, and can't get a hold of it again ... I think it had "Sonoma county"
> in the heading ...

I found this, but it is not exactly the same:
"The Living Wage Coalition believes a living or self-sufficiency wage for 
Sonoma County in 2008 is $14.90 an hour (including benefits) and an annual 
family income of $62,940 based upon calculations for a two-parent, 
two-child family with both parents working full-time."

Another item:
"A federal minimum wage was first set in 1938. The graph shows nominal (blue 
diamonds) and real (red squares) minimum wage values. Nominal values range 
from $0.25/hr in 1938 to $6.55/hr as of July 2008. The graph adjusts these 
wages to 2007 dollars (red squares) to show the real value of the minimum 
wage. Calculated in real 2007 dollars, the 1968 minimum wage was the highest 
at $9.47. The real dollar minimum wage (red squares) falls during periods 
Congress does not raise the minimum wage to keep up with inflation. The 
period 1997-2007, is the longest period during which the minimum wage has 
not been adjusted. The minimum wage increases in three $0.70 increments--to 
$5.85 in 2007, $6.55 in mid 2008, and to $7.25 in mid 2009. The real values 
after 2007 are projected for future decline in purchasing power. "

When the minimum wage was introduced in the USA in 1938, it was about 
US$0.25 an hour. For someone working 40 hours a week, 50 weeks a year, that 
would be about US$42 a month, or US$500 a year.
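The arithmetic above can be checked in a few lines of Python:

```python
# 1938 federal minimum wage, using the figures quoted above
wage_per_hour = 0.25     # US$ per hour
hours_per_week = 40
weeks_per_year = 50

annual = wage_per_hour * hours_per_week * weeks_per_year
monthly = annual / 12

print(f"annual:  ${annual:.2f}")   # $500.00
print(f"monthly: ${monthly:.2f}")  # $41.67
```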

I put US$0.25 into the inflation calculator here: in today's dollars, that 
is about US$3.64.
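As a sketch, that result implies a cumulative price multiplier of roughly 14.6 from 1938 to now (the exact factor depends on which year and which price index the calculator uses, so this multiplier is just backed out of the US$3.64 figure):

```python
nominal_1938 = 0.25      # US$ per hour, 1938 federal minimum wage
cpi_multiplier = 14.56   # assumed: implied by the calculator's US$3.64 result

real_today = nominal_1938 * cpi_multiplier
print(f"${real_today:.2f}")   # $3.64
```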

For reference:
How much things cost in 1938:
Average cost of a new house: $3,900.00
Average wages per year: $1,730.00
A gallon of gas: 10 cents
Average house rent: $27.00 per month
A loaf of bread: 9 cents
A lb. of hamburger meat: 13 cents
Average price of a new car: $763.00
A blanket: $5.00
Lipton's Noodle Soup: 10 cents

But, the difference is presumably the claim that industrial productivity has 
increased. I don't know that figure exactly. Here it says a 2.1% increase in 
output per person-hour from 1937 to 1952:
Taking that from 1938 to now (just assuming it held constant for 70 years, 
compounded), that is about a factor of four. So, that would suggest that in 
today's dollars, a minimum wage should be 4 * US$3.64, or US$14.56.
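Compounding 2.1% over 70 years actually comes out slightly above a factor of four, which nudges the figure a bit higher than US$14.56:

```python
growth = 0.021   # 2.1% annual productivity growth, assumed constant
years = 70       # roughly 1938 to 2008

factor = (1 + growth) ** years
wage = factor * 3.64   # inflation-adjusted 1938 minimum wage from above

print(f"factor: {factor:.2f}")   # about 4.3
print(f"wage:   ${wage:.2f}")    # about $15.60
```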

But, that assumes 2.1% annual compounded productivity growth.

"U.S. productivity growth has accelerated in recent years, despite a series 
of negative economic shocks. An analysis of the sources of this growth over 
the 1995-2003 period suggests that the production and use of information 
technology account for a large share of the gains. The authors project that 
during the next decade, private sector productivity growth will continue at 
a rate of 2.6 percent per year, a significant increase from their 2002 
projection of 2.2 percent growth."

And more recently, for example:
"Productivity rises 6.4%, fastest rate in six years: Unit labor costs fall 
5.8% in second quarter, the most in nine years" 

So, I could imagine that if the growth rate was higher, one would get a 
higher result that approached US$40. I'd readily believe US$20 as an 
adjusted minimum wage from the above calculation if the last two decades saw 
2.6% annual growth.
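Repeating the compounding at a few different rates shows how sensitive the result is. This is a sketch that naively applies a single rate across all 70 years, which is an oversimplification:

```python
real_1938_wage = 3.64   # inflation-adjusted US$0.25 from above

# Try the historical 2.1% rate, the projected 2.6%, and a higher 3.1%
for rate in (0.021, 0.026, 0.031):
    factor = (1 + rate) ** 70
    adjusted = factor * real_1938_wage
    print(f"{rate:.1%} growth: factor {factor:.1f}, wage ${adjusted:.2f}")
```

At 2.6% the factor is about 6, giving roughly US$22; approaching US$40 would take a rate well above 3%.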

By the way from that last link, to see how broken the current system is:
WASHINGTON (MarketWatch) -- U.S. companies slashed their workers' hours in 
the second quarter, boosting the productivity of the workplace to an 
annualized rate of 6.4%, the Labor Department reported Tuesday. ...  Hourly 
compensation rose just 0.2% in the second quarter. After inflation, real 
hourly compensation sank 1.1%. ...

So, productivity goes up at an annualized 6.4%, but real wages go down 1.1%, 
instead of also rising 6.4% to match productivity growth. So, who gets all 
the increased profits? Obviously not the workers, who would probably spend 
them all and create new demand. If the gains go to rich people instead, they 
may be pushed into various bubble investments that have essentially no 
connection to the real physical economy -- driving up the price of 
commodities (gold), land, housing, and stocks, or otherwise increasing 
financial reserves or paying back previous debt. Alternatively, the same 
amount of stuff is produced but with fewer workers: efficiency goes up, 
workers are laid off, and demand falls -- a negative spiral. Keynes said 
this would not happen because of lags in the system (things being "sticky"), 
or, alternatively, if it did, then the government would need to step in and 
spend money or lower interest rates. Of course, there is a huge disconnect 
between government-set interest rates (about zero) and consumer interest 
rates (rising beyond 20% per year again, from fears of default or just 
rent-seeking).
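Purely to illustrate how such a wedge compounds: the 6.4% and -1.1% figures are one quarter's annualized rates, not a long-run trend, so treating them as persistent for a decade is a loud assumption, but it shows the direction:

```python
productivity = 1.0   # index, start of period
real_wage = 1.0      # index, start of period

for year in range(10):
    productivity *= 1.064   # +6.4% per year (hypothetical persistence)
    real_wage *= 0.989      # -1.1% per year (hypothetical persistence)

# The wage/productivity ratio tracks the workers' shrinking share of gains
print(f"productivity index: {productivity:.2f}")
print(f"real wage index:    {real_wage:.2f}")
print(f"wage/productivity:  {real_wage / productivity:.2f}")
```

Even over ten years, workers would end up with less than half the purchasing power per unit of output they started with, under these assumed rates.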

Of course, there is no accountability for economists, because they might be 
right eventually. There are endless things they can throw out as 
explanations (like the old epicycle explanations of the movements of the 
planets). But they mostly come down to supporting the status quo.

--Paul Fernhout
