
Terms in this set (86)

The Harlem Renaissance was the name given to the cultural, social, and artistic explosion that took place in Harlem between the end of World War I and the middle of the 1930s. During this period Harlem was a cultural center, the Mecca to which black writers, artists, musicians, photographers, poets, and scholars traveled. Many had come from the South, fleeing its oppressive caste system in order to find a place where they could freely express their talents. Among those artists whose works achieved recognition were Langston Hughes and Claude McKay, Countee Cullen and Arna Bontemps, Zora Neale Hurston and Jean Toomer, Walter White and James Weldon Johnson. W.E.B. Du Bois encouraged talented artists to leave the South. Du Bois, then the editor of THE CRISIS magazine, the journal of the NAACP, was at the height of his fame and influence in the black community. THE CRISIS published the poems, stories, and visual works of many artists of the period. The Renaissance was more than a literary movement: It involved racial pride, fueled in part by the militancy of the "New Negro" demanding civil and political rights. The Renaissance incorporated jazz and the blues, attracting whites to Harlem speakeasies, where interracial couples danced. But the Renaissance had little impact on breaking down the rigid barriers of Jim Crow that separated the races. While it may have contributed to a certain relaxation of racial attitudes among young whites, perhaps its greatest impact was to reinforce race pride among blacks.
Prohibition in the United States was a nationwide constitutional ban on the sale, production, importation, and transportation of alcoholic beverages that remained in place from 1920 to 1933. It was promoted by the "dry" crusaders, a movement led by rural Protestants and social Progressives in the Democratic and Republican parties, and was coordinated by the Anti-Saloon League and the Woman's Christian Temperance Union. Prohibition was mandated under the Eighteenth Amendment to the U.S. Constitution. Enabling legislation, known as the Volstead Act, set down the rules for enforcing the ban and defined the types of alcoholic beverages that were prohibited. For example, religious uses of wine were allowed. Private ownership and consumption of alcohol was not made illegal under federal law; however, in many areas local laws were more strict, with some states banning possession outright. Nationwide Prohibition ended with the ratification of the Twenty-first Amendment, which repealed the Eighteenth Amendment, on December 5, 1933.

In the nineteenth and early twentieth centuries, the introduction of alcohol prohibition and its subsequent enforcement in law was a hotly debated issue. Prohibition supporters, called drys, presented it as a victory for public morals and health. Anti-prohibitionists, known as wets, criticized the alcohol ban as an intrusion of mainly rural Protestant ideals on a central aspect of urban, immigrant, and Catholic life. Though popular opinion holds that Prohibition failed, it succeeded in cutting overall alcohol consumption in half during the 1920s, and consumption remained below pre-Prohibition levels until the 1940s,[1] suggesting that Prohibition did socialize a significant proportion of the population in temperate habits, at least temporarily. Some researchers contend that its political failure is attributable more to a changing historical context than to characteristics of the law itself.[2] Criticism remains that Prohibition led to unintended consequences such as the growth of urban crime organizations. As an experiment, it lost supporters every year, and governments lost tax revenue they needed when the Great Depression began in 1929.[3]
In Adkins v. Children's Hospital (1923), the Supreme Court ruled that a minimum wage law for women violated the Due Process Clause of the Fifth Amendment because it abridged a citizen's right to freely contract labor. In 1918, the District of Columbia passed a law setting a minimum wage for women and children laborers. It set up a board to investigate current wages, solicit input on ideal wage levels, and ultimately set minimum wages. The law was designed to protect women and children "from conditions detrimental to their health and morals, resulting from wages which are inadequate to maintain decent standards of living." The board eventually set minimum wages for various industries, e.g., a minimum $16.50 per week "in a place where food is served" and $15 per week "in a laundry." The Children's Hospital of the District of Columbia, which employed many women at wages below those established by the board, sued the board on the grounds that its regulations violated liberty of contract as defined in Lochner v. New York (1905). The case reached the Supreme Court on appeal in 1923.

In a 5-3 decision written by Justice George Sutherland, the Court struck down the minimum wage law as unconstitutional, arguing that it violated the Due Process Clause of the Constitution's Fifth Amendment. The Court cited Lochner v. New York (1905) in maintaining that the clause gives citizens equal rights "to obtain from each other the best terms they can as the result of private bargaining." According to Justice Sutherland, the D. C. minimum wage law, by contrast, was "an arbitrary interference with the liberty of contract which no government can legally justify in a free land." The law was especially "arbitrary," argued the Court, because it imposed uniform minimum wages on all women regardless of their individual needs or occupations.

Adkins v. Children's Hospital thus reaffirmed the basic holding of Lochner: minimum wage laws violate a citizen's right to freely contract work. Adkins was effectively overturned by West Coast Hotel v. Parrish (1937), which held that states could impose minimum wage regulations on private employers without violating the Due Process Clause. As long as they are rational and procedurally fair, minimum wage laws are a legitimate exercise of the state's police power.
Modernism is a philosophical movement that, along with cultural trends and changes, arose from wide-scale and far-reaching transformations in Western society in the late 19th and early 20th centuries. Among the factors that shaped Modernism were the development of modern industrial societies and the rapid growth of cities, followed then by the horror of World War I. Modernism also rejected the certainty of Enlightenment thinking, and many modernists rejected religious belief.[2][3]

Modernism, in general, includes the activities and creations of those who felt the traditional forms of art, architecture, literature, religious faith, philosophy, social organization, activities of daily life, and even the sciences, were becoming ill-fitted to their tasks and outdated in the new economic, social, and political environment of an emerging fully industrialized world. The poet Ezra Pound's 1934 injunction to "Make it new!" was the touchstone of the movement's approach towards what it saw as the now obsolete culture of the past. In this spirit, its innovations, like the stream-of-consciousness novel, atonal (or pantonal) and twelve-tone music, quantum physics, genetics, neural networks, set theory, analytic philosophy, the moving-picture show, divisionist painting and abstract art, all had precursors in the 19th century.

A notable characteristic of Modernism is self-consciousness, which often led to experiments with form, along with the use of techniques that drew attention to the processes and materials used in creating a painting, poem, building, etc.[4] Modernism explicitly rejected the ideology of realism[5][6][7] and makes use of the works of the past by the employment of reprise, incorporation, rewriting, recapitulation, revision and parody.[8][9][10]

Some commentators define Modernism as a mode of thinking—one or more philosophically defined characteristics, like self-consciousness or self-reference, that run across all the novelties in the arts and the disciplines.[11] More common, especially in the West, are those who see it as a socially progressive trend of thought that affirms the power of human beings to create, improve and reshape their environment with the aid of practical experimentation, scientific knowledge, or technology.[12] From this perspective, Modernism encouraged the re-examination of every aspect of existence, from commerce to philosophy, with the goal of finding that which was 'holding back' progress, and replacing it with new ways of reaching the same end. Others focus on Modernism as an aesthetic introspection. This facilitates consideration of specific reactions to the use of technology in the First World War, and anti-technological and nihilistic aspects of the works of diverse thinkers and artists spanning the period from Friedrich Nietzsche (1844-1900) to Samuel Beckett (1906-1989).[13]
William Jennings Bryan (March 19, 1860 - July 26, 1925) was a dominant force in the populist wing of the Democratic Party, standing three times as the Party's candidate for President of the United States (1896, 1900 and 1908). He served two terms as a member of the United States House of Representatives from Nebraska and was United States Secretary of State under President Woodrow Wilson (1913-1915), resigning because of his pacifist position on World War I. Bryan was a devout Presbyterian, a strong advocate of popular democracy, and an enemy of the banks and their gold standard. He demanded "Free Silver" because he believed it would reduce the power of money and put more money in the hands of the people. He was a peace advocate, a supporter of Prohibition, and an opponent of Darwinism on religious and humanitarian grounds. With his deep, commanding voice and wide travels, he was one of the best-known orators and lecturers of the era. Because of his faith in the wisdom of the common people, he was called "The Great Commoner."

In the intensely fought 1896 and 1900 elections, he was defeated by William McKinley but retained control of the Democratic Party. With over 500 speeches in 1896, Bryan invented the national stumping tour, in an era when other presidential candidates stayed home. In his three presidential bids, he promoted Free Silver in 1896, anti-imperialism in 1900, and trust-busting in 1908, calling on Democrats to fight the trusts (big corporations) and big banks, and embrace anti-elitist ideals of republicanism. President Wilson appointed him Secretary of State in 1913, but Wilson's strong demands on Germany after the Lusitania was torpedoed in 1915 caused Bryan to resign in protest. After 1920 he was a supporter of Prohibition and attacked Darwinism and evolution, most famously at the Scopes Trial in 1925. Five days after the end of the case, he died in his sleep.[2]
Nicola Sacco (April 22, 1891 - August 23, 1927) and Bartolomeo Vanzetti (June 11, 1888 - August 23, 1927) were Italian-born anarchists who were convicted of murdering a guard and a paymaster during the armed robbery of a shoe factory in Braintree, Massachusetts, United States in 1920.

Both adhered to a strain of anarchism that advocated relentless warfare against violent and oppressive governments.[1][2]

After a few hours' deliberation, the jury found Sacco and Vanzetti guilty of first-degree murder on July 14, 1921. A series of appeals followed, funded largely by a private Sacco and Vanzetti Defense Committee. The appeals were based on recanted testimony, conflicting ballistics evidence, a prejudicial pre-trial statement by the jury foreman, and a confession by an alleged participant in the robbery. All appeals were denied by the original trial judge and eventually by the Massachusetts State Supreme Court. By 1925, the case had drawn worldwide attention. As details of the trial and the men's suspected innocence became known, Sacco and Vanzetti became the center of one of the largest causes célèbres in modern history. In 1927, protests on their behalf were held in every major city in North America and Europe, as well as in Tokyo, Sydney, São Paulo, Rio de Janeiro, Buenos Aires, and Johannesburg.[3]

Celebrated writers, artists, and academics pleaded for their pardon or for a new trial. Harvard law professor and future Supreme Court justice Felix Frankfurter argued for their innocence in a widely read Atlantic Monthly article that was later published in book form. Sacco and Vanzetti were sentenced to death in April 1927, accelerating the outcry. Responding to a massive influx of telegrams urging their pardon, Massachusetts governor Alvan Fuller appointed a three-man commission to investigate the case. After weeks of secret deliberation, which included interviews with the judge, lawyers, and several witnesses, the commission upheld the verdict. Sacco and Vanzetti were executed via electric chair on August 23, 1927.[4] Subsequent riots destroyed property in Paris, London, and other cities.

Since their deaths, some critics have concluded that the two men were convicted largely because of anti-Italian prejudice and their anarchist political beliefs and were therefore unjustly executed.[5][6] Investigations of the case continued throughout the 1930s and 1940s. The publication of the men's letters, containing eloquent professions of innocence, intensified belief in their wrongful execution. Additional ballistics tests and incriminating statements by the men's acquaintances have clouded the case. In 1977, Massachusetts Governor Michael Dukakis issued a proclamation that Sacco and Vanzetti had been unfairly tried and convicted and that "any disgrace should be forever removed from their names", but did not proclaim them innocent.
John Calvin Coolidge Jr. (/ˈkuːlɪdʒ/; July 4, 1872 - January 5, 1933) was the 30th President of the United States (1923-1929). A Republican lawyer from Vermont, Coolidge worked his way up the ladder of Massachusetts state politics, eventually becoming governor of that state. His response to the Boston Police Strike of 1919 thrust him into the national spotlight and gave him a reputation as a man of decisive action. Soon after, he was elected as the 29th Vice President in 1920 and succeeded to the Presidency upon the sudden death of Warren G. Harding in 1923. Elected in his own right in 1924, he gained a reputation as a small-government conservative, and also as a man who said very little.

Coolidge restored public confidence in the White House after the scandals of his predecessor's administration, and left office with considerable popularity.[1] As a Coolidge biographer put it, "He embodied the spirit and hopes of the middle class, could interpret their longings and express their opinions. That he did represent the genius of the average is the most convincing proof of his strength."[2] Some later criticized Coolidge as part of a general disapproval of laissez-faire government.[3] His reputation underwent a renaissance during the Ronald Reagan administration, but the ultimate assessment of his presidency is still divided between those who approve of his reduction of the size of government programs and those who believe the federal government should be more involved in regulating and controlling the economy.[4]
Alfred Emanuel "Al" Smith (December 30, 1873 - October 4, 1944) was an American statesman who was elected Governor of New York four times and was the Democratic U.S. presidential candidate in 1928. He was the foremost urban leader of the efficiency-oriented Progressive Movement and was noted for achieving a wide range of reforms as governor in the 1920s. He was also linked to the notorious Tammany Hall machine that controlled New York City's politics, was a strong opponent of Prohibition, and was the first Catholic nominee for President. His candidacy mobilized Catholic votes—especially women who previously had not voted. It also mobilized the anti-Catholic vote, which was strongest in the South.

As a committed "wet" (anti-Prohibition) candidate, Smith attracted not only drinkers but also voters angered by the corruption and lawlessness brought about by prohibition.[1] However, he was feared among Protestants, including German Lutherans and Southern Baptists, who believed that the Catholic Church and the Pope would dictate his policies. Most importantly, this was a time of national prosperity under a Republican Presidency, and Smith lost in a landslide to Republican Herbert Hoover. Four years later Smith sought the 1932 nomination but was defeated by his former ally and successor as New York Governor Franklin D. Roosevelt. Smith entered business in New York City and became an increasingly vocal opponent of Roosevelt's New Deal. Roosevelt blocked Smith from having any major role in World War II.
The Young Plan was a program for settling German reparations debts after World War I, written in 1929 and formally adopted in 1930. It was presented by a committee headed (1929-30) by the American industrialist Owen D. Young, founder and first chairman of the Radio Corporation of America (RCA), who at the time also served on the board of trustees of the Rockefeller Foundation and had been one of the representatives involved in the previous reparations restructuring, the Dawes Plan of 1924. The Inter-Allied Reparations Commission had established the German reparation sum at a theoretical total of 132 billion gold marks, but a practical total of 50 billion gold marks. After the Dawes Plan was put into operation in 1924, it became apparent that Germany would not willingly meet the annual payments over an indefinite period of time. The Young Plan reduced further payments by about 20 percent. Although the theoretical total was 112 billion gold marks, equivalent to US$8 billion in 1929 (US$110 billion in 2015), to be paid over a period of 59 years ending in 1988, few expected the plan to last for much more than a decade.[1] In addition, the Young Plan divided the annual payment, set at two billion gold marks (US$473 million), into two components: an unconditional part, equal to one third of the sum, and a postponable part, equal to the remaining two-thirds, which would incur interest and be financed by a consortium of American investment banks coordinated by J.P. Morgan & Co.
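As a quick arithmetic check on the figures quoted above, here is a minimal sketch using only the numbers stated in the passage (the split amounts and the 59-year average are derived values, not figures from the plan's actual payment schedule):

```python
# Young Plan annuity split, from the figures in the passage above
annual_marks = 2_000_000_000                 # annual payment: two billion gold marks
unconditional = annual_marks // 3            # one-third, payable in any case
postponable = annual_marks - unconditional   # two-thirds, deferrable with interest

total_marks = 112_000_000_000                # theoretical total under the plan
years = 59                                   # payments scheduled through 1988
avg_per_year = total_marks / years           # roughly 1.9 billion marks per year

print(unconditional, postponable, round(avg_per_year))
```

The near-match between the 59-year average and the stated two-billion-mark annuity is only a consistency check on the passage's numbers, not a claim about how the payments were actually scheduled.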
The Emergency Quota Act, also known as the Emergency Immigration Act of 1921, the Immigration Restriction Act of 1921, the Per Centum Law, and the Johnson Quota Act (ch. 8, 42 Stat. 5 of May 19, 1921) restricted immigration into the United States. Although intended as temporary legislation, the Act "proved in the long run the most important turning-point in American immigration policy"[3] because it added two new features to American immigration law: numerical limits on immigration from Europe and the use of a quota system for establishing those limits. These limits came to be known as the National Origins Formula.

The Emergency Quota Act restricted the number of immigrants admitted from any country annually to 3% of the number of residents from that same country living in the United States as of the U.S. Census of 1910.[4] This meant that people from northern European countries had a higher quota and were more likely to be admitted to the U.S. than people from eastern Europe, southern Europe, or other, non-European countries. Professionals were to be admitted without regard to their country of origin. The Act set no limits on immigration from Latin America. The act did not apply to countries with bilateral agreements with the US, or to Asian countries listed in the Immigration Act of 1917, known as the Asiatic Barred Zone Act.[1]
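The 3% rule above can be sketched in a few lines; the census figure used in the example below is hypothetical, chosen only to illustrate the arithmetic, not an actual 1910 count for any nationality:

```python
# Sketch of the Emergency Quota Act's 3% formula
def quota_1921(residents_1910: int) -> int:
    """Annual cap for a nationality: 3% of its U.S. residents in the 1910 census."""
    return residents_1910 * 3 // 100  # integer arithmetic avoids float rounding

# hypothetical example: a nationality with 1,000,000 residents counted in 1910
print(quota_1921(1_000_000))  # 30000 admissions per year
```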

Based on that formula, the number of new immigrants admitted fell from 805,228 in 1920 to 309,556 in 1921-22.[5] The average annual inflow of immigrants prior to 1921 was 175,983 from Northern and Western Europe and 685,531 from other countries, principally in Southern and Eastern Europe. In 1921, immigration from these other countries dropped drastically.

Following the end of World War I, both Europe and the United States were suffering economic and social upheaval. In Europe, the destruction of the war, the Russian Revolution, and the dissolutions of both the Austro-Hungarian Empire and Ottoman Empire led to greater immigration to the United States, while in the United States an economic downturn following post-war demobilization increased unemployment. The combination of increased immigration from Europe at the time of higher American unemployment strengthened the anti-immigrant movement.

The act, sponsored by Rep. Albert Johnson (R-Washington),[6] was passed without a recorded vote in the U.S. House of Representatives and by a vote of 90-2-4 in the U.S. Senate.[2]

The Act was soon revised by the Immigration Act of 1924.

The use of such a National Origins Formula continued until 1965 when the Immigration and Nationality Act of 1965 established America's current immigration quota system.
The causes of the Great Depression in the early 20th century are a matter of active debate among economists, and are part of the larger debate about economic crises, although the popular belief is that the Great Depression was caused by the 1929 crash of the stock market. The specific economic events that took place during the Great Depression have been studied thoroughly: a deflation in asset and commodity prices, dramatic drops in demand and credit, and disruption of trade, ultimately resulting in widespread unemployment and hence poverty. However, historians lack consensus in determining the causal relationship between various events and government economic policy in causing or ameliorating the Depression.

Current theories may be broadly classified into two main points of view and several heterodox points of view.

First, there are demand-driven theories, advanced by Keynesian and institutional economists, who argue that the recession was caused by underconsumption and over-investment (thereby causing an economic bubble). The consensus among demand-driven theories is that a large-scale loss of confidence led to a sudden reduction in consumption and investment spending.[1] Once panic and deflation set in, many people believed they could avoid further losses by keeping clear of the markets. Holding money therefore became profitable as prices dropped lower and a given amount of money bought ever more goods, exacerbating the drop in demand.

Second, there are the monetarists, who believe that the Great Depression started as an ordinary recession, but that significant policy mistakes by monetary authorities (especially the Federal Reserve), caused a shrinking of the money supply which greatly exacerbated the economic situation, causing a recession to descend into the Great Depression. Related to this explanation are those who point to debt deflation causing those who borrow to owe ever more in real terms.

There are also various heterodox theories that reject the explanations of the Keynesians and monetarists. Some new classical macroeconomists have argued that various labor market policies imposed at the start of the Depression caused its length and severity. The Austrian school of economics focuses on the macroeconomic effects of money supply and how central banking decisions can lead to malinvestment. Marxian economists view the Great Depression, with all other economic crises, as a symptom of the classism and instability inherent in the capitalist model.
The Muscle Shoals Bill was designed to build a dam on the Tennessee River and sell government-produced electricity. Congress passed bills to harness energy from the Tennessee River, but presidents Coolidge and Hoover insisted that private enterprise should do the job, and vetoed the bills.

The chief sponsor, Senator George Norris of Nebraska, had blocked a proposal from Henry Ford to develop the dam site.

Hoover's veto message stated:

I am firmly opposed to the Government entering into any business the major purpose of which is competition with our citizens. There are national emergencies which require that the Government should temporarily enter the field of business, but they must be emergency actions and in matters where the cost of the project is secondary to much higher considerations. There are many localities where the Federal Government is justified in the construction of great dams and reservoirs, where navigation, flood control, reclamation, or stream regulation are of dominant importance, and where they are beyond capacity or purpose of private or local government capital to construct. In these cases power is often a by-product and should be disposed of by contract or lease. But for the Federal Government deliberately to go out to build up and expand such an occasion to the major purpose of a power and manufacturing business is to break down the initiative and enterprise of the American people; it is the negation of the ideals upon which our civilization has been based.[1]

Norris demanded public power because he distrusted privately owned utilities.[2] Norris said of Hoover:

Using his power of veto, he destroyed the Muscle Shoals bill--a measure designated to utilize the great government property at Muscle Shoals for the cheapening of fertilizer for American agriculture and utilization of the surplus power for the benefit of people without transmission distance of the development. The power people want no yardstick which would expose their extortionate rates so Hoover killed the bill after it had been passed by both houses of congress.[3]

In 1933, the idea behind the Muscle Shoals Bill became part of the New Deal's Tennessee Valley Authority (TVA).
The Reconstruction Finance Corporation (RFC) was a government corporation in the United States that operated between 1932 and 1957 which provided financial support to state and local governments and made loans to banks, railroads, mortgage associations and other businesses. Its aim was to boost the country's confidence and help banks return to performing daily functions after the start of the Great Depression. It continued to operate through the New Deal where it became more prominent and through World War II. It was disbanded in 1957 when the US government felt it no longer needed to stimulate lending.

It was an independent agency of the United States government, established and chartered by the US Congress in 1932, Act of January 22, 1932, c. 8, 47 Stat. 5, during the administration of President Herbert Hoover. When Eugene Meyer became Governor of the Federal Reserve Board, he had suggested creating the RFC. It was modeled after the War Finance Corporation of World War I. The agency gave $2 billion in aid to state and local governments and made a large number of loans, nearly all of which were repaid. The RFC was created to address problems that the Federal Reserve could not fix by itself because of limits on its authority: the Federal Reserve System, created in 1913 to act as a lender of last resort during financial panics, was not able to lend to every bank or firm.

The RFC continued under the New Deal and played a major role in recapitalizing banks. The Reconstruction Finance Corporation was effective at reducing the probability of bank failure and stimulating bank lending.[1] The Reconstruction Finance Corporation played a major role in handling the Great Depression in the United States and setting up the relief programs that were taken over by the New Deal in 1933.
The Bonus Army was the popular name of an assemblage of some 43,000 marchers—17,000 World War I veterans, their families, and affiliated groups—who gathered in Washington, D.C., in the spring and summer of 1932 to demand cash-payment redemption of their service certificates. Its organizers called it the Bonus Expeditionary Force to echo the name of World War I's American Expeditionary Forces, while the media called it the Bonus March. It was led by Walter W. Waters, a former army sergeant.

Many of the war veterans had been out of work since the beginning of the Great Depression. The World War Adjusted Compensation Act of 1924 had awarded them bonuses in the form of certificates they could not redeem until 1945. Each service certificate, issued to a qualified veteran soldier, bore a face value equal to the soldier's promised payment plus compound interest. The principal demand of the Bonus Army was the immediate cash payment of their certificates.

Retired Marine Corps Major General Smedley Butler, one of the most popular military figures of the time, visited their camp to back the effort and encourage them.[1] On July 28, U.S. Attorney General William D. Mitchell ordered the veterans removed from all government property. Washington police met with resistance, shots were fired and two veterans were wounded and later died. Veterans were also shot dead at other locations during the demonstration. President Herbert Hoover then ordered the army to clear the veterans' campsite. Army Chief of Staff General Douglas MacArthur commanded the infantry and cavalry supported by six tanks. The Bonus Army marchers with their wives and children were driven out, and their shelters and belongings burned.

A second, smaller Bonus March in 1933 at the start of the Roosevelt Administration was defused in May with an offer of jobs for the Civilian Conservation Corps at Fort Hunt, Virginia, which most of the group accepted. Those who chose not to work for the CCC by the May 22 deadline were given transportation home.[2] In 1936, Congress overrode President Franklin D. Roosevelt's veto and paid the veterans their bonus nine years early.
The Dust Bowl, also known as the Dirty Thirties, was a period of severe dust storms that greatly damaged the ecology and agriculture of the US and Canadian prairies during the 1930s; severe drought and a failure to apply dryland farming methods to prevent wind erosion (the Aeolian processes) caused the phenomenon. The drought came in three waves, 1934, 1936, and 1939-40, but some regions of the High Plains experienced drought conditions for as many as eight years.[1] With insufficient understanding of the ecology of the Plains, farmers had conducted extensive deep plowing of the virgin topsoil of the Great Plains during the previous decade; this had displaced the native, deep-rooted grasses that normally trapped soil and moisture even during periods of drought and high winds. The rapid mechanization of farm equipment, especially small gasoline tractors, and widespread use of the combine harvester contributed to farmers' decisions to convert arid grassland (much of which received no more than 10 inches (250 mm) of precipitation per year) to cultivated cropland.[citation needed]

During the drought of the 1930s, the unanchored soil turned to dust, which the prevailing winds blew away in huge clouds that sometimes blackened the sky. These choking billows of dust - named "black blizzards" or "black rollers" - traveled cross country, reaching as far as such East Coast cities as New York City and Washington, D.C. On the Plains, they often reduced visibility to 1 metre (3.3 ft) or less. Associated Press reporter Robert E. Geiger happened to be in Boise City, Oklahoma, to witness the "Black Sunday" black blizzards of April 14, 1935; Edward Stanley, Kansas City news editor of the Associated Press, coined the term "Dust Bowl" while rewriting Geiger's news story.[2][3] While the term "the Dust Bowl" was originally a reference to the geographical area affected by the dust, today it is usually used to refer to the event, as in "It was during the Dust Bowl". The origin of the term "bowl" - a hollow container - in this context is, however, still not entirely clear.

The drought and erosion of the Dust Bowl affected 100,000,000 acres (400,000 km2) that centered on the panhandles of Texas and Oklahoma and touched adjacent sections of New Mexico, Colorado, and Kansas.[4]

The Dust Bowl forced tens of thousands of families to abandon their farms. Many of these families, who were often known as "Okies" because so many of them came from Oklahoma, migrated to California and other states to find that the Great Depression had rendered economic conditions there little better than those they had left. Author John Steinbeck wrote Of Mice and Men (1937) and The Grapes of Wrath (1939) about migrant workers and farm families displaced by the Dust Bowl.
Eleanor Roosevelt told reporters not to expect the new first lady to be a symbol of elegance, but rather "plain, ordinary Mrs. Roosevelt." Despite this disclaimer, she showed herself to be an extraordinary First Lady.

In 1933, Mrs. Roosevelt became the first First Lady to hold her own press conference. In an attempt to afford equal time to women--who were traditionally barred from presidential press conferences--she allowed only female reporters to attend. In 1939, the Daughters of the American Revolution (DAR) refused to allow Marian Anderson, an African American singer, to perform in their auditorium. In protest, Mrs. Roosevelt resigned her membership in the DAR.

Throughout Franklin D. Roosevelt's presidency, Eleanor traveled extensively around the nation, visiting relief projects, surveying working and living conditions, and then reporting her observations to the President. She was called "the President's eyes, ears and legs" and provided objective information to her husband. When the Japanese attacked Pearl Harbor and the United States entered WWII, Mrs. Roosevelt made certain that the President did not abandon the goals he had put forth in the New Deal. She also exercised her own political and social influence, becoming an advocate of the rights and needs of the poor, of minorities, and of the disadvantaged. The public was drawn in by the First Lady's exploits and adventures, which she recounted in her daily syndicated column, "My Day". She began writing the column in 1935 and continued until her death in 1962.

During the war, she served as Assistant Director of Civilian Defense from 1941 to 1942 and she visited England and the South Pacific to foster good will among the Allies and to boost the morale of U.S. servicemen overseas.
Once in office, FDR set to work immediately. His "New Deal," it turned out, involved regulation and reform of the banking system, massive government spending to "prime the pump" by restarting the economy and putting people back to work, and the creation of a social services network to support those who had fallen on hard times.

Between 8 March and 16 June, in what later became known as the "First Hundred Days," Congress followed Roosevelt's lead by passing an incredible fifteen separate bills which, together, formed the basis of the New Deal. Several of the programs created during those three and a half months are still around in the federal government today. Some of Roosevelt's most notable actions during the Hundred Days were:

A national bank holiday: The day after his inauguration, FDR declared a "bank holiday," closing all banks in the country to prevent a collapse of the banking system. With the banks closed, Roosevelt took measures to restore the public's confidence in the financial systems; when the banks reopened a week later, the panic was over.
Ending the gold standard: To avoid deflation, FDR quickly suspended the gold standard. This meant that U.S. dollars no longer had to be backed up by gold reserves, which also meant that the government could print—and spend—more money to "prime the pump" of the economy.
Glass-Steagall Act: The Glass-Steagall Act imposed regulations on the banking industry that guided it for over fifty years, until it was repealed in 1999. The law separated commercial from investment banking, forced banks to get out of the business of financial investment, and banned the use of bank deposits in speculation. It also created the FDIC. The effect of the law was to give greater stability to the banking system.
FDIC: The Federal Deposit Insurance Corporation backed all bank deposits up to $2,500, meaning that most bank customers no longer had to worry that a bank failure would wipe out their life savings. The agency continues to insure American deposits today.
Federal Securities Act: This act regulated the stock markets and preceded the creation of the Securities and Exchange Commission in 1934, which continues to regulate U.S. stock markets to this day.
Agricultural Adjustment Act: The AAA provided relief to farmers by paying them to reduce production; this also helped to reduce crop surpluses and increase prices for crops.
Civilian Conservation Corps: To reduce unemployment, the CCC put 250,000 young men to work in rural conservation projects, mostly in national parks and forests.
Tennessee Valley Authority: The TVA provided electrification and other basic improvements to the impoverished interior of the South.
National Industrial Recovery Act: One of FDR's more controversial measures, it created new agencies and regulations that tightened the relationship between government and business. It was declared unconstitutional by the Supreme Court in 1935.
Public Works Administration: Funded the construction of public works projects across the country, including schools, hospitals, airports, dams, and ports, as well as ships for the Navy and airports for the Army Air Corps.
Federal Emergency Relief Act: Provided direct relief, training, and work for unemployed Americans. It was abolished in 1935 and its programs folded into other agencies.
The term Glass-Steagall Act usually refers to four provisions of the U.S. Banking Act of 1933 that limited commercial bank securities activities and affiliations within commercial banks and securities firms.[1] Congressional efforts to "repeal the Glass-Steagall Act" referred to those four provisions (and then usually to only the two provisions that restricted affiliations between commercial banks and securities firms).[2] Those efforts culminated in the 1999 Gramm-Leach-Bliley Act (GLBA), which repealed the two provisions restricting affiliations between banks and securities firms.[3]

The term Glass-Steagall Act is also often used to refer to the entire Banking Act of 1933, after its Congressional sponsors, Senator Carter Glass (D) of Virginia, and Representative Henry B. Steagall (D) of Alabama.[4] This article deals with only the four provisions separating commercial and investment banking. The article 1933 Banking Act describes the entire law, including the legislative history of the Glass-Steagall provisions separating commercial and investment banking. A separate 1932 law also known as the Glass-Steagall Act is described in the article Glass-Steagall Act of 1932.

Starting in the early 1960s federal banking regulators interpreted provisions of the Glass-Steagall Act to permit commercial banks and especially commercial bank affiliates to engage in an expanding list and volume of securities activities.[5] By the time the affiliation restrictions in the Glass-Steagall Act were repealed through the GLBA, many commentators argued Glass-Steagall was already "dead."[6] Most notably, Citibank's 1998 affiliation with Salomon Smith Barney, one of the largest US securities firms, was permitted under the Federal Reserve Board's then existing interpretation of the Glass-Steagall Act.[7] President Bill Clinton publicly declared "the Glass-Steagall law is no longer appropriate."[8]

Many commentators have stated that the GLBA's repeal of the affiliation restrictions of the Glass-Steagall Act was an important cause of the late-2000s financial crisis.[9][10][11] Some critics of that repeal argue it permitted Wall Street investment banking firms to gamble with their depositors' money that was held in affiliated commercial banks.[12] Others have argued that the activities linked to the financial crisis were not prohibited (or, in most cases, even regulated) by the Glass-Steagall Act.[13] Commentators, including former President Clinton in 2008 and the American Bankers Association in January 2010, have also argued that the ability of commercial banking firms to acquire securities firms (and of securities firms to convert into bank holding companies) helped mitigate the financial crisis.[14]
The government took action to protect bank depositors by creating the Banking Act of 1933, which also formed the FDIC. The FDIC's purpose was to provide stability to the economy and the failing banking system. Officially created in the Glass-Steagall Act of 1933 and modeled after the deposit insurance program initially enacted in Massachusetts, the FDIC guaranteed a specific amount of checking and savings deposits for its member banks.

Originally denounced by the American Bankers Association as too expensive and an artificial support of bad business activity, the FDIC was declared a success when only nine additional banks closed in 1934. Due to the conservative behavior of banking institutions and the zeal of bank regulators through World War II and the subsequent period, deposit insurance was regarded as less important. Some financial experts concluded that the system had become too guarded, and was therefore impeding the natural effects of a free market economy. Some notable items and milestones for the FDIC through 1983 are as follows:

1933: Congress creates the FDIC.
1934: Deposit insurance coverage is initially set at $2,500, and is then raised midyear to $5,000.
1950: Deposit insurance increased to $10,000; refunds are established for banks to receive a credit for excess assessments above operating and insurance losses.
1960: FDIC's insurance fund passes $2 billion.
1966: Deposit insurance is increased to $15,000.
1969: Deposit insurance is increased to $20,000.
1974: Deposit insurance is increased to $40,000.
1980: Deposit insurance is increased to $100,000; FDIC insurance fund is $11 billion.

The period from 1933 to 1983 was characterized by increased lending without a proportionate increase in loan losses, resulting in a significant increase in bank assets. In 1947 alone, lending increased from 16% to 25% of industry assets; the rate rose to 40% by the 1950s and to 50% by the early 1960s.

In the '60s, banking operations started to change. Banks began taking nontraditional risks and expanding their branch networks into new territory with the relaxation of branching laws. This expansion and risk-taking favored the banking industry throughout the 1970s, as generally favorable economic development allowed even marginal borrowers to meet their financial obligations. However, this trend would finally catch up with the banking industry, and deposit insurance would be sorely needed during the 1980s.
Executive Order 6102 is a United States presidential executive order signed on April 5, 1933, by President Franklin D. Roosevelt "forbidding the Hoarding of gold coin, gold bullion, and gold certificates within the continental United States". The order criminalized the possession of monetary gold by any individual, partnership, association or corporation.
The stated reason for the order was that hard times had caused "hoarding" of gold, stalling economic growth and making the depression worse.[1] The New York Times, on April 6, 1933, p. 16, wrote under the headline "Hoarding of Gold", "The Executive Order issued by the President yesterday amplifies and particularizes his earlier warnings against hoarding. On March 6, taking advantage of a wartime statute that had not been repealed, he issued Presidential Proclamation 2039 that forbade the hoarding 'of gold or silver coin or bullion or currency,' under penalty of $10,000 and/or up to five to ten years imprisonment."[2]

The main rationale behind the order was to remove the constraint on the Federal Reserve which prevented it from increasing the money supply during the depression; the Federal Reserve Act required 40% gold backing of Federal Reserve Notes issued. By the late 1920s, the Federal Reserve had almost hit the limit of allowable credit (in the form of Federal Reserve demand notes) that could be backed by the gold in its possession (see Great Depression). If gold can't be legally owned, then it can't be legally redeemed. If it can't be legally redeemed, then it can't constrain the central bank.
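The arithmetic of that constraint can be sketched in a few lines. The 40% ratio comes from the Federal Reserve Act as described above; the dollar figures in the example are invented purely for illustration, not historical balances:

```python
# Illustrative sketch of the Federal Reserve Act's 40% gold-backing rule.
# The gold-reserve figure below is hypothetical, chosen only to show the math.

GOLD_BACKING_RATIO = 0.40  # 40% of the value of notes had to be backed by gold

def max_notes_issuable(gold_reserves_dollars: float) -> float:
    """Maximum value of Federal Reserve Notes a given gold stock can back."""
    return gold_reserves_dollars / GOLD_BACKING_RATIO

# With a hypothetical $4 billion in gold, at most $10 billion in notes
# could circulate; issuing more would require acquiring more gold --
# or, as in 1933, removing gold convertibility from the equation.
print(max_notes_issuable(4_000_000_000))  # 10000000000.0
```

The sketch shows why the order mattered: once the public's gold was no longer redeemable, the reserve ratio ceased to cap how much currency could circulate.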

Executive Order 6102 required all persons to deliver on or before May 1, 1933, all but a small amount of gold coin, gold bullion, and gold certificates owned by them to the Federal Reserve, in exchange for $20.67 (equivalent to $376.58 today[4]) per troy ounce. Under the Trading With the Enemy Act of 1917, as amended by the recently passed Emergency Banking Act of March 9, 1933, violation of the order was punishable by fine up to $10,000 (equivalent to $182,185 today[4]) or up to ten years in prison, or both.
The Civilian Conservation Corps (CCC) was a public work relief program that operated from 1933 to 1942 in the United States for unemployed, unmarried men from relief families as part of the New Deal. Originally for young men ages 18-23, it was eventually expanded to young men ages 17-28.[1] Robert Fechner was the head of the agency. It was a major part of President Franklin D. Roosevelt's New Deal that provided unskilled manual labor jobs related to the conservation and development of natural resources in rural lands owned by federal, state and local governments. The CCC was designed to provide jobs for young men, to relieve families who had difficulty finding jobs during the Great Depression in the United States while at the same time implementing a general natural resource conservation program in every state and territory. Maximum enrollment at any one time was 300,000; in nine years 3 million young men participated in the CCC, which provided them with shelter, clothing, and food, together with a small wage of $30 a month ($25 of which had to be sent home to their families).[2]

The American public made the CCC the most popular of all the New Deal programs.[3] Principal benefits of an individual's enrollment in the CCC included improved physical condition, heightened morale, and increased employability.[4] Implicitly, the CCC also led to a greater public awareness and appreciation of the outdoors and the nation's natural resources; and the continued need for a carefully planned, comprehensive national program for the protection and development of natural resources.[5]

During the time of the CCC, enrollees planted nearly 3 billion trees to help reforest America, constructed more than 800 parks nationwide and upgraded most state parks, updated forest fire fighting methods, and built a network of service buildings and public roadways in remote areas.

The CCC operated separate programs for veterans and Native Americans.

Despite its popular support, the CCC was never a permanent agency. It depended on emergency and temporary Congressional legislation for its existence. By 1942, with World War II and the draft in operation, need for work relief declined and Congress voted to close the program.[8]
Federal Emergency Relief Administration (FERA) was the new name given by the Roosevelt Administration to the Emergency Relief Administration (ERA) which President Herbert Hoover had created in 1932. FERA was established as a result of the Federal Emergency Relief Act and was replaced in 1935 by the Works Progress Administration (WPA).

The ERA under Hoover gave loans to the states to operate relief programs. One of these, the New York state program TERA (Temporary Emergency Relief Administration), was set up in 1931 and headed by Harry Hopkins, a close adviser to Governor Franklin D. Roosevelt. Roosevelt asked Congress to set up FERA—which gave grants to the states for the same purpose—in May 1933, and appointed Hopkins to head it. Along with the Civilian Conservation Corps (CCC), it was the first relief operation under the New Deal.

FERA's main goal was alleviating household unemployment by creating new unskilled jobs in local and state government. Jobs were more expensive than direct cash payments (called "the dole"), but were psychologically more beneficial to the unemployed, who wanted work of any sort for the sake of self-esteem and their role as breadwinners. From May 1933 until it closed in December 1935, FERA gave states and localities $3.1 billion. FERA provided work for over 20 million people and developed facilities on public lands across the country.

Faced with continued high unemployment and concerns for public welfare during the coming winter of 1933-34, FERA instituted the Civil Works Administration (CWA) as a $400 million short-term measure to get people to work. The Federal Emergency Relief Administration was terminated in 1935 and its work taken over by two entirely new federal agencies, the Works Progress Administration and the Social Security Administration.
The National Industrial Recovery Act (NIRA) was a law passed by the United States Congress in 1933 to authorize the President to regulate industry in an attempt to raise prices after severe deflation and stimulate economic recovery.[1] It also established a national public works program known as the Public Works Administration (PWA, not to be confused with the WPA of 1935).[2] The National Recovery Administration (NRA) portion was widely hailed in 1933, but by 1934 business' opinion of the act had soured.[3] By March 1934 the "NRA was engaged chiefly in drawing up these industrial codes for all industries to adopt."[4] However, the NIRA was declared unconstitutional by the Supreme Court in 1935 and not replaced.[3][5][6]

The legislation was enacted in June 1933 during the Great Depression in the United States as part of President Franklin D. Roosevelt's New Deal legislative program. Section 7(a) of the bill, which protected collective bargaining rights for unions, proved contentious (especially in the Senate),[3][7] but both chambers eventually passed the legislation. President Roosevelt signed the bill into law on June 16, 1933.[3][8] The Act had two main sections (or "titles"). Title I was devoted to industrial recovery: it authorized the promulgation of industrial codes of fair competition, guaranteed trade union rights, permitted the regulation of working standards, and regulated the price of certain refined petroleum products and their transportation. Title II established the Public Works Administration and outlined the projects and funding opportunities it could engage in; it also provided funding for the Act.

The Act was implemented by the NRA and the Public Works Administration (PWA).[6][9] Very large numbers of regulations were generated under the authority granted to the NRA by the Act,[10][11] which led to a significant loss of political support for Roosevelt and the New Deal.[6] The NIRA was set to expire in June 1935, but in a major constitutional ruling the U.S. Supreme Court held Title I of the Act unconstitutional on May 27, 1935, in Schechter Poultry Corp. v. United States, 295 U.S. 495 (1935).[6] The National Industrial Recovery Act is widely considered a policy failure, both in the 1930s and by historians today.[3][12][13] Disputes over the reasons for this failure continue. Among the suggested causes are that the Act promoted economically harmful monopolies,[10] that the Act lacked critical support from the business community,[14] and that it was poorly administered.[14][15] The Act encouraged union organizing, which led to significant labor unrest.[16] The NIRA had no mechanisms for handling these problems, which led Congress to pass the National Labor Relations Act in 1935.[17] The Act was also a major force behind a significant modification of the law criminalizing the making of false statements.[18]
Huey Pierce Long, Jr. (August 30, 1893 - September 10, 1935), nicknamed The Kingfish, was an American politician who served as the 40th Governor of Louisiana from 1928 to 1932 and as a member of the United States Senate from 1932 until his assassination in 1935. A Democrat, he was an outspoken populist who denounced the rich and the banks and called for "Share the Wealth." As the political boss of the state he commanded wide networks of supporters and was willing to take forceful action. He established the political prominence of the Long political family.

Long is best known for his Share Our Wealth program, created in 1934 under the motto "Every Man a King." It proposed new wealth redistribution measures in the form of a net asset tax on corporations and individuals to curb the poverty and homelessness endemic nationwide during the Great Depression. To stimulate the economy, Long advocated federal spending on public works, schools and colleges, and old age pensions. He was an ardent critic of the policies of the Federal Reserve System.

A supporter of Franklin D. Roosevelt in the 1932 presidential election, Long split with Roosevelt in June 1933 to plan his own presidential bid for 1936 in alliance with the influential Catholic priest and radio commentator Charles Coughlin. Long was assassinated in 1935 and his national movement soon faded, but his legacy continued in Louisiana through his wife, Senator Rose McConnell Long, and his son, Senator Russell B. Long.[1]

Under Long's leadership, hospitals and educational institutions were expanded, a system of charity hospitals was set up that provided health care for the poor, massive highway construction and free bridges brought an end to rural isolation, and schoolchildren received free textbooks. He remains a controversial figure in Louisiana history, with critics and supporters debating whether he was a dictator, demagogue, or populist.[2]
The Securities and Exchange Commission was established in 1934 to regulate commerce in stocks, bonds, and other securities. After the October 29, 1929, stock market crash, reflection on its causes prompted calls for reform. Controls on the issuing and trading of securities were virtually nonexistent, allowing for any number of frauds and other schemes. Further, the unreported concentration of controlling stock interests in a very few hands led to the abuses of power that the free exchange of stock supposedly eliminated.

To bring order out of chaos, Congress passed three major acts creating the Securities and Exchange Commission (SEC) and defining its responsibilities. The Securities Act of 1933 required public corporations to register their stock sales and distribution and make regular financial disclosures. The Securities Exchange Act of 1934 created the SEC to regulate exchanges, brokers, and over-the-counter markets, as well as to monitor the required financial disclosures. The 1935 Public Utility Holding Company Act did away with holding companies more than twice removed from the utilities whose stocks they held. This "death sentence" ended the practice of using holding companies to obscure the intertwined ownership of public utility companies. Further, the act authorized the SEC to break up any unnecessarily large utility combinations into smaller, geographically based companies and to set up federal commissions to regulate utility rates and financial practices.

The business community, wary of New Deal reforms, was mollified by the efficient chairmanships of Joseph P. Kennedy and William O. Douglas.
The Works Progress Administration (renamed in 1939 as the Work Projects Administration; WPA) was the largest and most ambitious American New Deal agency, employing millions of unemployed people (mostly unskilled men) to carry out public works projects,[1] including the construction of public buildings and roads. In a much smaller but more famous project, the Federal Project Number One, the WPA employed musicians, artists, writers, actors and directors in large arts, drama, media, and literacy projects.[1]

Almost every community in the United States had a new park, bridge or school constructed by the agency. The WPA's initial appropriation in 1935 was for $4.9 billion (about 6.7 percent of the 1935 GDP), and in total it spent $13.4 billion.[2]

At its peak in 1938, it provided paid jobs for three million unemployed men and women, as well as youth in a separate division, the National Youth Administration. Headed by Harry Hopkins, the WPA provided jobs and income to the unemployed during the Great Depression in the United States. Between 1935 and 1943, the WPA provided almost eight million jobs.[3] Full employment, which was reached in 1942 and emerged as a long-term national goal around 1944, was not the WPA goal. It tried to provide one paid job for all families in which the breadwinner suffered long-term unemployment.[4] Robert D. Leighninger asserts that "The stated goal of public building programs was to end the depression or, at least, alleviate its worst effects. Millions of people needed subsistence incomes. Work relief was preferred over public assistance (the dole) because it maintained self-respect, reinforced the work ethic, and kept skills sharp."[5]

The WPA was a national program that operated its own projects in cooperation with state and local governments, which provided 10-30% of the costs. Usually the local sponsor provided land and often trucks and supplies, with the WPA responsible for wages (and for the salaries of supervisors, who were not on relief). WPA sometimes took over state and local relief programs that had originated in the Reconstruction Finance Corporation (RFC) or Federal Emergency Relief Administration (FERA) programs.[6]

It was liquidated on June 30, 1943, as a result of low unemployment due to the worker shortage of World War II. The WPA had provided millions of Americans with jobs for 8 years.[7] Most people who needed a job were eligible for at least some of its positions.[8] Hourly wages were typically set to the prevailing wages in each area.
John Llewellyn Lewis (February 12, 1880 - June 11, 1969) was an American leader of organized labor who served as president of the United Mine Workers of America (UMW) from 1920 to 1960. A major player in the history of coal mining, he was the driving force behind the founding of the Congress of Industrial Organizations (CIO), which established the United Steel Workers of America and helped organize millions of other industrial workers in the 1930s. After resigning as head of the CIO in 1941, he took the Mine Workers out of the CIO in 1942 and in 1944 took the union into the American Federation of Labor (AFL).

A leading liberal, he played a major role in helping Franklin D. Roosevelt win a landslide in 1936, but as an isolationist broke with Roosevelt in 1940 on FDR's anti-Nazi foreign policy. Lewis was a brutally effective and aggressive fighter and strike leader who gained high wages for his membership while steamrolling over his opponents, including the United States government. Lewis was one of the most controversial and innovative leaders in the history of labor, gaining credit for building the industrial unions of the CIO into a political and economic powerhouse to rival the AFL, yet he was widely hated for calling nationwide coal strikes that critics believed damaged the American economy and war effort. His massive leonine head, forest-like eyebrows, firmly set jaw, powerful voice and ever-present scowl thrilled his supporters, angered his enemies, and delighted cartoonists. Coal miners for 40 years hailed him as their leader, whom they credited with bringing high wages, pensions and medical benefits.[1]
The Judicial Procedures Reform Bill of 1937[1] (frequently called the "court-packing plan")[2] was a legislative initiative proposed by U.S. President Franklin D. Roosevelt to add more justices to the U.S. Supreme Court. Roosevelt's purpose was to obtain favorable rulings regarding New Deal legislation that the court had ruled unconstitutional.[3] The central provision of the bill would have granted the President power to appoint an additional Justice to the U.S. Supreme Court, up to a maximum of six, for every member of the court over the age of 70 years and 6 months.

In the Judiciary Act of 1869 Congress had established that the United States Supreme Court would consist of the Chief Justice and eight associate justices. During Roosevelt's first term the Supreme Court struck down several New Deal measures as being unconstitutional. Roosevelt sought to reverse this by changing the makeup of the court through the appointment of new additional justices who he hoped would rule his legislative initiatives did not exceed the constitutional authority of the government. Since the U.S. Constitution does not define the size of the Supreme Court, Roosevelt pointed out that it was within the power of the Congress to change it. The legislation was viewed by members of both parties as an attempt to stack the court, and was opposed by many Democrats, including Vice President John Nance Garner.[4][5] The bill came to be known as Roosevelt's "court-packing plan".[2]

The legislation was unveiled on February 5, 1937, and was the subject of Roosevelt's 9th Fireside chat of March 9, 1937.[6][7] Three weeks after the radio address, the Supreme Court published an opinion upholding a Washington state minimum wage law in West Coast Hotel Co. v. Parrish.[8] The 5-4 ruling was the result of a sudden jurisprudential shift by Associate Justice Owen Roberts, who joined the wing of the bench supportive of the New Deal legislation. Since Roberts had previously ruled against most New Deal legislation, his support here was seen as a result of the political pressure the president was exerting on the court. Some interpreted his reversal as an effort to maintain the Court's judicial independence by alleviating the political pressure to create a court more friendly to the New Deal. This reversal came to be known as "the switch in time that saved nine"; however, recent legal-historical scholarship has called that narrative into question,[9] as Roberts's decision and vote in the Parrish case predated the actual introduction of the 1937 bill.[10]

Roosevelt's legislative initiative ultimately failed. The bill was held up in the Senate Judiciary Committee by Democratic committee chair Henry F. Ashurst, who delayed hearings, saying "No haste, no hurry, no waste, no worry—that is the motto of this committee."[11] As a result of his delaying efforts, the bill was held in committee for 165 days, and opponents of the bill credited Ashurst as instrumental in its defeat.[5] The bill was further undermined by the untimely death of its chief advocate in the U.S. Senate, Senate Majority Leader Joseph T. Robinson. Contemporary observers broadly viewed Roosevelt's initiative as political maneuvering. Its failure exposed the limits of Roosevelt's ability to push forward legislation through direct public appeal. The public perception of his efforts here was in stark contrast to the reception of his legislative efforts during his first term.[12][13] Roosevelt ultimately prevailed in establishing a majority on the court friendly to his New Deal legislation, though some scholars view Roosevelt's victory as pyrrhic.[13]
John Maynard Keynes, 1st Baron Keynes,[1] CB, FBA (/ˈkeɪnz/ KAYNZ; 5 June 1883 - 21 April 1946), was a British economist whose ideas have fundamentally affected the theory and practice of modern macroeconomics and informed the economic policies of governments. He built on and greatly refined earlier work on the causes of business cycles, and he is widely considered to be one of the founders of modern macroeconomics and the most influential economist of the 20th century.[2][3][4][5] His ideas are the basis for the school of thought known as Keynesian economics and its various offshoots.

In the 1930s, Keynes spearheaded a revolution in economic thinking, overturning the older ideas of neoclassical economics that held that free markets would, in the short to medium term, automatically provide full employment, as long as workers were flexible in their wage demands. Keynes instead argued that aggregate demand determined the overall level of economic activity and that inadequate aggregate demand could lead to prolonged periods of high unemployment. According to Keynesian economics, state intervention was necessary to moderate "boom and bust" cycles of economic activity.[6] He advocated the use of fiscal and monetary measures to mitigate the adverse effects of economic recessions and depressions. Following the outbreak of World War II, Keynes's ideas concerning economic policy were adopted by leading Western economies. In 1942, Keynes was awarded a hereditary peerage as Baron Keynes of Tilton in the County of Sussex.[7] Keynes died in 1946; but, during the 1950s and 1960s, the success of Keynesian economics resulted in almost all capitalist governments adopting its policy recommendations.

Keynes's influence waned in the 1970s, partly as a result of problems that began to afflict the Anglo-American economies from the start of the decade and partly because of critiques from Milton Friedman and other economists who were pessimistic about the ability of governments to regulate the business cycle with fiscal policy.[8] However, the advent of the global financial crisis of 2007-08 caused a resurgence in Keynesian thought. Keynesian economics provided the theoretical underpinning for economic policies undertaken in response to the crisis by President Barack Obama of the United States, Prime Minister Gordon Brown of the United Kingdom, and other heads of governments.[9]

In 1999, Time magazine included Keynes in its list of the 100 most important and influential people of the 20th century, commenting: "His radical idea that governments should spend money they don't have may have saved capitalism."[10] The Economist has described him as "Britain's most famous 20th-century economist."[11] In addition to being an economist, Keynes was a civil servant, a director of the Bank of England, a member of the Bloomsbury Group of intellectuals,[12] a patron of the arts and an art collector, a director of the British Eugenics Society, an advisor to several charitable trusts, a successful private investor, a writer, a philosopher, and a farmer.
The New Deal itself created millions of jobs and sponsored public works projects that reached almost every county in the nation. Federal protection of bank deposits ended the dangerous trend of bank runs. Abuses of the stock market were more clearly defined and monitored to prevent future collapses. The Social Security system, created under the New Deal and later expanded, would remain one of the most popular government programs for the rest of the century. For the first time in peacetime history, the federal government assumed responsibility for managing the economy. The legacy of social welfare programs for the destitute and underprivileged would endure through the remainder of the 1900s.

Laborers benefited from new protections, as witnessed by the emergence of a powerful new union, the Congress of Industrial Organizations. African Americans and women received only limited advances from the legislative programs, as FDR was not fully committed to either civil or women's rights. All over Europe, fascist governments were on the rise, but Roosevelt steered America along a safe path when economic spirits were at an all-time low.

However comprehensive the New Deal seemed, it failed to achieve its main goal: ending the Depression. In 1939, the unemployment rate was still 19 percent, and it did not return to pre-Depression levels until 1943. The massive spending brought by America's entry into the Second World War ultimately cured the nation's economic woes.

Conservatives bemoaned a bloated bureaucracy that had grown to nearly a million workers, up from just over 600,000 in 1932. They complained that Roosevelt more than doubled the national debt in two short terms, a good deal of it lost through waste. Liberals pointed out that the gap between rich and poor had barely been dented by the end of the decade. Regardless of these shortcomings, Franklin Roosevelt and the New Deal helped America muddle through the dark times, emerging strong enough to tackle the even greater task that lay ahead.