The minimum wage was conceived as a way to bolster the earnings of wage workers and reduce class stratification. It was first introduced in the United States by the Fair Labor Standards Act of 1938 (FLSA). Passed under President Roosevelt, the act established the first national minimum wage of 25 cents an hour, creating a floor on wages in the labor market and helping to set fairer labor standards throughout the country.
Classical economic theory suggests that the minimum wage should have an equalizing effect on incomes. As economist Richard Freeman explains in his 1996 book, Uneven Tides: Rising Inequality in America, there are three main channels through which the minimum wage reduces income disparity by shifting earning power toward people at the lower end of the economic spectrum. He refers to this as the “redistribution theory.”
First, higher minimum wages raise the cost of producing goods and services, which results in higher prices. Everyone pays those higher prices, including the middle and upper classes, but only low-wage workers see their incomes rise at the same time, so their purchasing power increases on net. Second, higher wages reduce company profits while simultaneously increasing the income of the poor. Third, increased wages cost some jobs, often ones at the middle or higher end of the income spectrum. Overall, raising the minimum wage acts to decrease the wealth of the wealthier classes while increasing the wealth of lower-paid workers.
The FLSA was also found to temporarily harm regional economies. In 1942, economist John F. Moloney found that after the law was implemented, southern plants experienced adverse effects: “the value of output [was] 18 percent lower per plant and 21 percent lower per worker in the South than in the rest of the nation.”
Despite these negative regional effects, the FLSA empowered workers and decreased income inequality in America, as reflected in a declining Gini coefficient, a standard measure of inequality. Overall, the first implementation of the minimum wage benefited American workers, especially when combined with the increased labor demand that came with the onset of World War II.
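For readers unfamiliar with the measure, the Gini coefficient can be written in terms of the Lorenz curve $L(p)$, the share of total income received by the poorest fraction $p$ of households (this formula is standard background, not a calculation performed in the sources cited here):

$$G = 1 - 2\int_0^1 L(p)\,dp$$

A value of $G = 0$ indicates perfect equality (every household has the same income) and $G = 1$ indicates that one household receives all income, so a falling Gini ratio in the Census tables signals narrowing inequality.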
Despite Roosevelt’s intentions, the minimum wage did not remain a reliable economic floor for wage workers. At the time, the official poverty line was calculated by multiplying estimated food costs by three. At its high point, the federal minimum wage could support a family of three above the poverty line, but by the 1980s it could not even support a family of two. From January 1981 to April 1990, the federal minimum wage was not increased at all; the nominal rate stayed frozen at $3.35 an hour while inflation eroded its purchasing power. As a result, over those nine years the real value of the minimum wage, adjusted to 2012 dollars, fell from $8.29 to $6.66.
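The deflating arithmetic behind those 2012-dollar figures is straightforward: a nominal wage is converted to real terms by scaling it by the ratio of the price level in the base year to the price level in the year of interest. The price-level ratios below are backed out of the figures above for illustration, not taken from official index tables:

$$\text{real value}_{2012} = \text{nominal wage}_t \times \frac{\text{CPI}_{2012}}{\text{CPI}_t}$$

Plugging in the frozen $3.35 nominal rate, a price-level ratio of roughly 2.47 reproduces the January 1981 figure (3.35 × 2.47 ≈ 8.29), while by April 1990 cumulative inflation had shrunk the ratio to roughly 1.99, giving 3.35 × 1.99 ≈ 6.66.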
Surprisingly, poverty declined during this period. The number of people living below the poverty line in the 1980s peaked in 1983, when the minimum wage was worth approximately $7.59 in 2012 dollars. Why didn’t poverty increase? Because poverty reflects more than hourly wages: it depends on a variety of factors, including federal assistance such as food stamps and welfare, inflation, and the unemployment rate.
"Minimum Wage - U.S. Department of Labor - Chart1 | United States Department of Labor." Minimum Wage - U.S. Department of Labor - Chart1 | United States Department of Labor.
“Persons Below Poverty Level in the U.S., 1975–2010” Poverty and Income. Infoplease. © 2000-2016 Sandbox Networks, Inc., publishing as Infoplease. 21 Jan 2016.
U.S. Census Bureau. Historical Income Tables: Income Inequality. Table H-4. Gini Ratios for Households, by Race and Hispanic Origin of Householder. [Excel Spreadsheet]. Web. 21 Jan 2016