Workers' rights in the U.S.

As the industrial revolution progressed during the 19th century, the share of the U.S. workforce employed in manufacturing industries rose accordingly. Before this shift, labor movements and unions were generally viewed as detrimental to business and treated with disdain by authorities and business leaders; as industrialization changed the working landscape in the U.S., however, workers began to demand better wages and working conditions.
Apart from some movement at the local and state levels, it was not until the Great Depression that the federal government began to expand workers' rights nationally; these New Deal reforms included a minimum wage, federal pensions, unemployment insurance, and the right to unionize, among others. While these improvements did expand job security in some form for most workers, they did not eliminate discrimination on the basis of age, disability, gender, or race, and it would take decades for legal protections to be extended to workers in these groups. The 1970s and 1980s, however, saw governments begin to strip away workers' rights at both the state and national levels: deregulation was seen as the key to promoting enterprise, and this often involved dismantling labor unions and rolling back workers' protections.