us history 2 final
88 terms
Turner thesis
The frontier thesis, advanced by historian Frederick Jackson Turner in 1893, argued that the existence of a moving western frontier shaped American democracy, individualism, and national character, and that the closing of the frontier (noted in the 1890 census) marked the end of the first period of American history.
reservation policy
was the federal government's policy, pursued especially after the Civil War, of confining Native American tribes to defined tracts of land (reservations), opening the remaining territory to white settlement and pressuring tribal peoples to take up farming.
George Armstrong Custer
was a United States Army officer and cavalry commander in the American Civil War and the Indian Wars. Raised in Michigan and Ohio, Custer was admitted to West Point in 1858, where he graduated last in his class. However, with the outbreak of the Civil War, all potential officers were needed, and Custer was called to serve with the Union Army.

Custer developed a strong reputation during the Civil War. He fought in the first major engagement, the First Battle of Bull Run. His association with several important officers helped his career, as did his success as a highly effective cavalry commander. Custer was eventually promoted to the temporary rank (brevet) of major general. (At war's end, he reverted to his permanent rank of captain.) At the conclusion of the Appomattox Campaign, in which he and his troops played a decisive role, Custer was on hand at General Robert E. Lee's surrender.

After the Civil War, Custer was dispatched to the west to fight in the Indian Wars. His disastrous final battle overshadowed his prior achievements. Custer and all the men with him were killed at the Battle of the Little Bighorn in 1876, fighting against a coalition of Native American tribes in a battle that has come to be popularly known in American history as "Custer's Last Stand".
Comstock Lode
was the first major U.S. discovery of silver ore, located under what is now Virginia City, Nevada, on the eastern slope of Mount Davidson, a peak in the Virginia Range. After the discovery was made public in 1859, prospectors rushed to the area and scrambled to stake their claims. Mining camps soon thrived in the vicinity, which became bustling centers of fabulous wealth.

It is notable not just for the immense fortunes it generated and the large role those fortunes had in the growth of Nevada and San Francisco, but also for the advances in mining technology that it spurred. The mines declined after 1874.
Homestead Act of 1862
was one of three United States federal laws that gave an applicant ownership, at no cost, of farmland called a "homestead" - typically 160 acres (65 hectares or one-fourth section) of undeveloped federal land west of the Mississippi River. It was an expression of the "Free Soil" policy of Northerners who wanted individual farmers to own and operate their own farms, as opposed to slaveowners who would farm large tracts with gangs of slaves.

The first act, the Homestead Act of 1862, had been blocked in Congress by Southern Democrats who wanted the lands for slaveowners. Once the Southern states seceded and their representatives left Congress, the Republican majority passed the bill; it was signed into law by President Abraham Lincoln on May 20, 1862.[1] The law required three steps: file an application, improve the land, and file for deed of title. Anyone who had never taken up arms against the U.S. government, including freed slaves, could file an application to claim a federal land grant. The occupant had to be 21 or older or the head of a family, live on the land for five years, and show evidence of having made improvements.

Because much of the prime low-lying alluvial land along rivers had been homesteaded by 1900, an update called the Enlarged Homestead Act was passed in 1909. It targeted land suitable for dryland farming, increasing the number of acres to 320.[2] In 1916, the Stock-Raising Homestead Act targeted settlers seeking 640 acres (260 ha) of public land for ranching purposes.[2]

Only about 40 percent of the applicants who started the process were able to complete it and obtain title to their homestead land.[3] Eventually 1.6 million homesteads were granted and 270,000,000 acres (420,000 sq mi) of federal land were privatized between 1862 and 1934, a total of 10% of all lands in the United States.[4] Homesteading was discontinued in 1976, except in Alaska, where it continued until 1986.
"rum, Romanism, rebellion"
Days before the election, James Blaine visited the crucial swing state of New York, attending a morning meeting in a New York City hotel. During a speech made by Presbyterian minister Samuel Burchard, the Democratic Party was assailed as the party of "Rum, Romanism, and Rebellion." In many ways, this phrase singled out Irish Catholics, many of whom lived in large urban centers like New York and Boston. The phrase catered to the stereotype of the drunken Irishman and demeaned the Catholic faith; nearly all Irish were Roman Catholic.

The term "Rebellion" was a typical post-Civil War Republican ploy of "waving the bloody shirt," reminding voters that it had been the Democrats who were responsible for the great bloodshed and connecting them to the assassination of Abraham Lincoln. By 1884, Democrats saw themselves as fully rehabilitated politically. During his term in office, Cleveland would return Confederate battle flags to the South, an action that hurt his reelection bid in 1888.

Granger laws
were a series of laws passed in Midwestern states of the United States after the American Civil War to regulate grain elevator and railroad freight rates and to address long- and short-haul discrimination. They were passed through political agitation both by merchants' associations and by so-called Granger parties, which were third parties formed most often by members of the Patrons of Husbandry, an organization for farmers commonly called the Grange. The Granger Laws were an issue in two important court cases in the late 19th century, Munn v. Illinois and Wabash v. Illinois.
Interstate Commerce Commission
was a regulatory body in the United States created by the Interstate Commerce Act of 1887. The agency's original purpose was to regulate railroads (and later trucking) to ensure fair rates, to eliminate rate discrimination, and to regulate other aspects of common carriers, including interstate bus lines and telephone companies. The agency was abolished in 1995, and its remaining functions were transferred to the Surface Transportation Board.

The Commission's five members were appointed by the President with the consent of the United States Senate. This was the first independent agency (or so-called Fourth Branch).
populism
has been viewed as a political ideology, political philosophy, or as a type of discourse. Generally, populists tend to claim that they side with "the people" against "the elites". While for much of the twentieth century, populism was considered to be a political phenomenon mostly affecting Latin America, since the 1980s populist movements and parties have enjoyed degrees of success in established democracies such as the USA, Canada, Italy, the Netherlands and Scandinavian countries.
Bessemer process
was the first inexpensive industrial process for the mass-production of steel from molten pig iron. The process is named after its inventor, Henry Bessemer, who took out a patent on the process in 1855. The process was independently discovered in 1851 by William Kelly.[1][2] The process had also been used outside of Europe for hundreds of years, but not on an industrial scale.[3] The key principle is removal of impurities from the iron by oxidation with air being blown through the molten iron. The oxidation also raises the temperature of the iron mass and keeps it molten.

The process using a basic refractory lining is known as the basic Bessemer process, or Gilchrist-Thomas process, after its discoverer, Sidney Gilchrist Thomas.
William Kelly
(August 21, 1811 - February 11, 1888), born in Pittsburgh, Pennsylvania, was an American inventor. Kelly studied metallurgy at the Western University of Pennsylvania. Instead of getting a job as a scientist, Kelly, his brother, and his brother-in-law started a dry goods and commission business, which they called McShane & Kelly. After a fire destroyed their warehouse, William and his brother John decided to move to Eddyville, Kentucky in 1847 to enter the iron industry.
The Gospel of Wealth
is an article written by Andrew Carnegie in 1889[4] that describes the responsibility of philanthropy by the new upper class of self-made rich. The central thesis of Carnegie's essay was the peril of allowing large sums of money to be passed into the hands of persons or organizations ill-equipped mentally or emotionally to cope with them.

As a result, the wealthy entrepreneur must assume the responsibility of distributing his fortune in a way that it will be put to good use, and not wasted on frivolous expenditure. In this Carnegie exemplified the self-made captain of industry, one who had risen to power by his own hand and who regarded great wealth as a trust to be administered for the public good rather than an end to be worshipped.
United States Steel
Leading U.S. producer of steel and related products. It was founded in 1901 by Charles M. Schwab, Elbert H. Gary, and J.P. Morgan to consolidate Andrew Carnegie's Carnegie Steel Co., Gary's Federal Steel Co., and other steel and metal companies. As chairman of the board, Gary dominated the corporation in its early years, organizing price agreements among steel producers and opposing unions. An antitrust suit against U.S. Steel went as far as the Supreme Court, which ruled in 1920 that it was not a monopoly in restraint of trade. The corporation recognized the United Steelworkers of America in 1936. The largest U.S. steel producer, U.S. Steel diversified into oil and gas in the later 20th century as well as into chemicals, mining, construction, and transportation. In 1986 the holding company USX Corp. was established to oversee it and other operating units. U.S. Steel Group was spun off from USX in 2002 and again became an independent, publicly traded corporation.

Sherman Antitrust Act
is a landmark federal statute on competition law passed by Congress in 1890. It prohibits certain business activities that reduce competition in the marketplace, and requires the United States federal government to investigate and pursue trusts, companies, and organizations suspected of being in violation. It was the first Federal statute to limit cartels and monopolies, and today still forms the basis for most antitrust litigation by the United States federal government. However, for the most part, politicians were unwilling to refer to the law until Theodore Roosevelt's presidency (1901-1909).

The Sherman Antitrust Act was named after its author, Senator John Sherman, an Ohio Republican and chairman of the Senate Finance Committee.[2] After passing the Senate on April 8, 1890, by a vote of 51-1, the Sherman Act passed unanimously (242-0) in the House of Representatives on June 20, 1890, and was then signed into law by President Benjamin Harrison on July 2, 1890.[2]

Despite its name the Act has fairly little to do with "trusts". Around the world, what U.S. lawmakers and attorneys call "antitrust" is more commonly known as "competition law." The purpose of the Act was, to quote Sherman:

"To protect the consumers by preventing arrangements designed, or which tend, to advance the cost of goods to the consumer".[3]

It has since, more broadly, been used to oppose the combination of entities that could potentially harm competition, such as monopolies or cartels.

According to its authors, it was not intended to impact market gains obtained by honest means, by benefiting the consumer more than the competitors. Senator George Hoar of Massachusetts, another author of the Sherman act, said a person:

"...who merely by superior skill and intelligence...got the whole business because nobody could do it as well as he could was not a monopolist...(but was if) it involved something like the use of means which made it impossible for other persons to engage in fair competition."[4]

Its reference to trusts today is anachronistic. At the time of its passage, the trust was synonymous with monopolistic practice, because the trust was a popular way for monopolists to hold their businesses, and a way for cartel participants to create enforceable agreements.[5]

In 1879, C. T. Dodd, an attorney for the Standard Oil Company of Ohio, devised a new type of trust agreement to overcome prohibitions in Ohio against corporations owning stock in other corporations. A trust is a centuries-old form of a contract whereby one party entrusts its property to a second party. The property is then used to benefit the first party.

The law attempts to prevent the artificial raising of prices by restriction of trade or supply.[6] In other words, innocent monopoly, or monopoly achieved solely by merit, is perfectly legal, but acts by a monopolist to artificially preserve his status, or nefarious dealings to create a monopoly, are not. Put another way, it has sometimes been said that the purpose of the Sherman Act is not to protect competitors, but rather to protect competition and the competitive landscape. As explained by the U.S. Supreme Court in Spectrum Sports, Inc. v. McQuillan 506 U.S. 447 (1993),

"The purpose of the [Sherman] Act is not to protect businesses from the working of the market; it is to protect the public from the failure of the market. The law directs itself not against conduct which is competitive, even severely so, but against conduct which unfairly tends to destroy competition itself."[7]

This focus of U.S. competition law on the protection of competition rather than competitors is not necessarily the only possible focus or purpose of competition law. For example, it has also been said that competition law in the European Union (EU) tends to protect the competitors in the marketplace, even at the expense of market efficiencies and consumers.
John D. Rockefeller
was an American industrialist and philanthropist. He was the founder of the Standard Oil Company, which dominated the oil industry and was the first great U.S. business trust. Rockefeller revolutionized the petroleum industry and defined the structure of modern philanthropy. In 1870, he founded the Standard Oil Company and aggressively ran it until he officially retired in 1897.[1] Standard Oil began as an Ohio partnership formed by John D. Rockefeller, his brother William Rockefeller, Henry Flagler, Jabez Bostwick, chemist Samuel Andrews, and a silent partner, Stephen V. Harkness. As kerosene and gasoline grew in importance, Rockefeller's wealth soared, and he became the world's richest man and first American worth more than a billion dollars.[2] Adjusting for inflation, he is often regarded as the richest person in history.[3][4][5][6]

Rockefeller spent the last 40 years of his life in retirement. His fortune was mainly used to create the modern systematic approach of targeted philanthropy. He was able to do this through the creation of foundations that had a major effect on medicine, education, and scientific research.[7]

His foundations pioneered the development of medical research, and were instrumental in the eradication of hookworm and yellow fever. He is also the founder of both the University of Chicago and Rockefeller University. He was a devoted Northern Baptist and supported many church-based institutions throughout his life. Rockefeller adhered to total abstinence from alcohol and tobacco throughout his life.[8]

He had four daughters and one son; John D. Rockefeller, Jr. "Junior" was largely entrusted with the supervision of the foundations.
Andrew Carnegie
was a Scottish-American industrialist who led the enormous expansion of the American steel industry in the late 19th century. He was also one of the most important philanthropists of his era.

Carnegie was born in Dunfermline, Scotland, and emigrated to the United States with his parents in 1848. His first job in the United States was as a factory worker in a bobbin factory. He later became a bill logger for the owner of the company, then a messenger boy, and eventually progressed up the ranks of a telegraph company. He built Pittsburgh's Carnegie Steel Company, which was later merged with Elbert H. Gary's Federal Steel Company and several smaller companies to create U.S. Steel. With the fortune he made from business he built Carnegie Hall, among other projects; later he turned to philanthropy and interests in education, founding the Carnegie Corporation of New York, the Carnegie Endowment for International Peace, the Carnegie Institution of Washington, Carnegie Mellon University, and the Carnegie Museums of Pittsburgh.

Carnegie gave most of his money to establish many libraries, schools, and universities in the United States, the United Kingdom, Canada and other countries, as well as a pension fund for former employees. He is often regarded as the second-richest man in history after John D. Rockefeller. Carnegie started as a telegrapher and by the 1860s had investments in railroads, railroad sleeping cars, bridges and oil derricks. He built further wealth as a bond salesman raising money for American enterprise in Europe. Carnegie once gave $25,000 to Speaker of the House David B. Henderson to erect a library on the campus of Upper Iowa University in his name.[2]

He earned most of his fortune in the steel industry. In the 1870s, he founded the Carnegie Steel Company, a step which cemented his name as one of the "Captains of Industry". By the 1890s, the company was the largest and most profitable industrial enterprise in the world. Carnegie sold it in 1901 for $480 million to J.P. Morgan, who created U.S. Steel. Carnegie devoted the remainder of his life to large-scale philanthropy, with special emphasis on local libraries, world peace, education and scientific research. His life has often been referred to as a true "rags to riches" story.
Social Darwinism
is an ideology of society that seeks to apply biological concepts of Darwinism or of evolutionary theory to sociology and politics, often with the assumption that conflict between groups in society leads to social progress as superior groups outcompete inferior ones.

The name social Darwinism is a modern name given to the various theories of society that emerged in England and the United States in the 1870s, which, it is alleged, sought to apply biological concepts to sociology and politics.[1][2] The term social Darwinism gained widespread currency when used in 1944 to oppose these earlier concepts. Today, because of the negative connotations of the theory of social Darwinism, especially after the atrocities of the Second World War (including the Holocaust), few people would describe themselves as Social Darwinists and the term is generally seen as pejorative.[3]

Social Darwinism is generally understood to use the concepts of struggle for existence and survival of the fittest to justify social policies which make no distinction between those able to support themselves and those unable to support themselves. Many such views stress competition between individuals in laissez-faire capitalism; but the ideology has also motivated ideas of eugenics, scientific racism, imperialism,[4] fascism, Nazism and struggle between national or racial groups.[5][6]

Opponents of evolution theory have often maintained that social Darwinism is a logical entailment of a belief in evolutionary theory, while biologists and historians maintain that it is rather a perversion of Charles Darwin's ideas.[7] While most scholars recognize historical links between Darwin's theory and forms of social Darwinism, they also maintain that social Darwinism is not a necessary consequence of the principles of biological evolution[8] and that using biological evolution as a justification for policies of inequality amounts to committing the naturalistic fallacy.
Lester Frank Ward
was a critic and reformer who wrote Dynamic Sociology (1883).
Horatio Alger
was a prolific 19th-century American author, best known for his many formulaic juvenile novels about impoverished boys and their rise from humble backgrounds to lives of middle-class security and comfort through hard work, determination, courage, and honesty. His writings were characterized by the "rags-to-riches" narrative, which had a formative effect on America during the Gilded Age.

He secured his literary niche in 1868 with the publication of his fourth book, Ragged Dick, the story of a poor bootblack's rise to middle-class respectability, which was a huge success. His many books that followed were essentially variations on Ragged Dick and featured a cast of stock characters: the valiant youth, the noble, mysterious stranger, the snobbish youth, and the evil squire.

In the 1870s, Alger took a trip to California to gather material for future books, but the trip had little influence on his writing. In the last decades of the 19th century, boys' tastes changed, and Alger's moral tone coarsened accordingly. The Puritan ethic had loosened its grip on America, and violence, murder, and other sensational themes entered Alger's works. Public librarians questioned whether his books should be made available to the young. By the time he died in 1899, he had published around a hundred volumes.
"single tax"
system is a system of taxation based primarily or exclusively on one tax, typically chosen for its special properties.

The original proposal for a single tax, and consequently the one most commonly known and referred to as a "single tax", is the Georgist proposal for a tax system based exclusively on land value taxes. More recently others have made proposals for a single tax based on other revenue models such as the Fair Tax proposal which is based on a consumption tax. Flat tax proposals have also been referred to as single taxes.[1]
Looking Backward
is a utopian science fiction novel by Edward Bellamy, a lawyer and writer from Chicopee Falls, Massachusetts; it was first published in 1888. According to Erich Fromm, Looking Backward is "one of the most remarkable books ever published in America".[1]

It was the third-largest bestseller of its time, after Uncle Tom's Cabin and Ben-Hur: A Tale of the Christ.[1] It influenced a large number of intellectuals, and appears by title in many of the major Marxist writings of the day. "It is one of the few books ever published that created almost immediately on its appearance a political mass movement".[2] In the United States alone, over 162 "Bellamy Clubs" sprang up to discuss and propagate the book's ideas.[3] Owing to its commitment to the nationalization of private property, this political movement came to be known as Nationalism, not to be confused with the political concept of nationalism.[4] The novel also inspired several utopian communities.
"Crime of '73"
was the silverites' name for the Coinage Act of 1873, which ended the coinage of the standard silver dollar and effectively put the United States on the gold standard; advocates of free silver later denounced the demonetization of silver as a conspiracy against farmers and debtors.
Election of 1896
climaxed an intensely heated contest in which Republican candidate William McKinley defeated Democrat William Jennings Bryan in one of the most dramatic and complex races in American history.

The 1896 campaign is often considered to be a realigning election that ended the old Third Party System and began the Fourth Party System.[1] McKinley forged a coalition in which businessmen, professionals, skilled factory workers, and prosperous farmers were heavily represented. He was strongest in cities and in the Northeast, Upper Midwest, and Pacific Coast. Bryan was the nominee of the Democrats, the Populist Party, and the Silver Republicans. He was strongest in the South, rural Midwest, and Rocky Mountain states. Bryan's moralistic rhetoric and crusading for inflation (based on free silver) alienated German American voters. Turnout was very high, passing 90% of the eligible voters in many places.

For three years the nation had been mired in a deep economic depression, marked by low prices, low profits, high unemployment, and violent strikes. Economic issues, especially silver or gold for the money supply, and tariffs, were central issues. Republican campaign manager Mark Hanna pioneered many modern campaign techniques, facilitated by a $3.5 million budget. He outspent Bryan by a factor of five. The Democratic Party's repudiation of the Bourbon Democrats (their pro-business wing, represented by incumbent President Grover Cleveland), set the stage for 36 years of Republican control of the White House, interrupted only by the two terms of Democrat Woodrow Wilson.
morgan system
labor conditions: late nineteenth century
William Jennings Bryan
was a leading American politician from the 1890s until his death. He was a dominant force in the liberal wing of the Democratic Party, standing three times as its candidate for President of the United States (1896, 1900 and 1908). He served in Congress briefly as a Representative from Nebraska and was the 41st United States Secretary of State under President Woodrow Wilson (1913-1915), taking a pacifist position on the World War. Bryan was a devout Presbyterian, a supporter of popular democracy, and an enemy of the Gold Standard as well as banks and railroads. He was a leader of the silverite movement in the 1890s, a peace advocate, a prohibitionist, and an opponent of Darwinism on religious and humanitarian grounds. With his deep, commanding voice and wide travels, he was one of the best known orators and lecturers of the era. Because of his faith in the wisdom of the common people, he was called "The Great Commoner."

In the intensely fought 1896 and 1900 elections, he was defeated by William McKinley but retained control of the Democratic Party. With over 500 speeches in 1896, Bryan invented the national stumping tour, in an era when other presidential candidates stayed home. In his three presidential bids, he promoted Free Silver in 1896, anti-imperialism in 1900, and trust-busting in 1908, calling on Democrats to fight the trusts (big corporations) and big banks, and embrace anti-elitist ideals of republicanism. President Wilson appointed him Secretary of State in 1913, but Wilson's strong demands on Germany after the Lusitania was torpedoed in 1915 caused Bryan to resign in protest. After 1920 he was a strong supporter of Prohibition and energetically attacked Darwinism and evolution, most famously at the Scopes Trial in 1925. Five days after the end of the case, he died in his sleep.[2]
Eugene V. Debs
was an American union leader, one of the founding members of the Industrial Workers of the World (IWW or the Wobblies), and several times the candidate of the Socialist Party of America for President of the United States.[1] Through his presidential candidacies, as well as his work with labor movements, Debs eventually became one of the best-known socialists living in the United States.

In the early part of his political career, Debs was a member of the Democratic Party. He was elected as a Democrat to the Indiana General Assembly in 1884. After working with several smaller unions, including the Brotherhood of Locomotive Firemen, Debs was instrumental in the founding of the American Railway Union (ARU), the nation's first industrial union. When the ARU struck the Pullman Palace Car Company over pay cuts, President Grover Cleveland used the United States Army to break the strike. As a leader of the ARU, Debs was later imprisoned for failing to obey an injunction against the strike.

Debs educated himself about socialism in prison and emerged to launch his career as the nation's most prominent socialist in the first decades of the 20th century. He ran as the Socialist Party's candidate for the presidency in 1900, 1904, 1908, 1912, and 1920, the last time from his prison cell.

Debs was noted for his oratory, and a speech denouncing American participation in World War I led to his second arrest in 1918. He was convicted under the Espionage Act of 1917 and sentenced to a term of 10 years. President Warren G. Harding commuted his sentence in December 1921. Debs died in 1926 not long after being admitted to a sanatorium.
"new immigration"
Italians, Slavs, Greeks, and Jews from southern and eastern Europe
settlement house movement
was an influential Progressive-era response to the massive urban social problems of the day. The United States was in a period of rapid growth marked by economic distress, labor unrest, unemployment, low wages, unfair labor practices, and squalid living conditions. Large numbers of immigrants arrived daily to work in this newly established industrialized society. Ethnic enclaves sheltered immigrants who were experiencing isolation, new customs, and a strange language.

Established in large cities, settlement houses were privately supported institutions that focused on helping the poor and disadvantaged by addressing the environmental factors involved in poverty. The basic settlement-house ideal was to have wealthy people move into poor neighborhoods so that both groups could learn from one another. Canon Samuel Barnett, pastor of the poorest parish in London's notorious East End, established the first settlement house in 1884. In the midst of this neighborhood (settlement), Toynbee Hall housed educated and wealthy people who served as examples, teachers, and providers of basic human services to the poor residents of the settlement. Toynbee Hall was based on the social gospel movement and attracted young theologians and other middle-class people to emulate Jesus in living among the poor.

Inspired by Barnett's efforts, Dr. Stanton Coit and Charles B. Stover founded the first American settlement house, the Neighborhood Guild of New York City (1886). Other settlements quickly followed: Hull-House, Chicago, 1889 (Jane Addams and Ellen Gates Starr); College Settlement, a club for girls in New York City, 1889 (Vida Dutton Scudder and Jean G. Fine); East Side House, New York, 1891; Northwestern University Settlement, 1891 (Harriet Vittum); South End House, Boston, 1892 (Robert Archey Woods); and Henry Street Settlement, New York, 1893 (Lillian D. Wald). New settlements were established almost every year: University of Chicago Settlement, 1894 (Mary McDowell); Chicago Commons, 1894 (Graham Taylor); Hudson Guild, New York, 1897 (John Lovejoy Elliot); Hiram House, Cleveland, 1896 (George A. Bellamy); and Greenwich House, New York, 1902 (Mary Kingsbury Simkhovitch).

Although settlement houses have often been characterized as largely secular in nature, many of them grew from religious roots. Some settlement house workers who came from a faith perspective included moral teachings, at a minimum, in their work with community residents. Probably the best-known example is Chicago Commons, founded in 1894 by the Reverend Graham Taylor, who was the first professor of Christian sociology at the Chicago Theological Seminary. He founded Chicago Commons partially as a social laboratory for his students. As Allen F. Davis has pointed out, of the more than 400 settlements established by 1910, 167 (more than 40 percent) were identified as religious: 31 Methodist, 29 Episcopal, 24 Jewish, 22 Roman Catholic, 20 Presbyterian, 10 Congregational, and 31 unspecified. In 1930, there were approximately 460 settlement houses, and most of these were church supported.

Settlement houses were run in part by client groups. They emphasized social reform rather than relief or assistance. (Residence, research, and reform were the three Rs of the movement.) Early sources of funding were wealthy individuals or clubs such as the Junior League. Settlement house workers educated poor persons, both children and adults, and often engaged in social action on behalf of the community. In attaining their goals, the settlement house reformers had an enviable record. They had a realistic understanding of the social forces and the political structures of the city and nation. They battled in legislative halls as well as in urban slums, and they became successful initiators and organizers of reform.

Settlement workers tried to improve housing conditions, organized protests, offered job-training and labor searches, supported organized labor, worked against child labor, and fought against corrupt politicians. They provided classes in art and music and offered lectures on topics of interest. They established playgrounds, day care, kindergartens, and classes in English literacy. Settlement workers were also heavily involved in research to identify the factors causing need and in activities intended to eliminate the factors that caused the need.

Settlement houses assumed as their operational base the adequate functioning of the families they served, many of whom were migrants and immigrants whose problems were associated with making the transition from rural to urban living and from a known to an unknown culture. Whatever their problems, clients of settlement houses were viewed as able, normal, working-class families with whom the wealthier classes were joined in mutual dependence. When such families could not cope, settlement leaders assumed that society itself was at fault, and this assumption led quite naturally to a drive for societal reform.

The most famous settlement house in America was Hull-House of Chicago. Although it was not the first American settlement, Hull-House came to exemplify the particular brand of research, service, and reform that was to characterize much of the American settlement house movement. Jane Addams and her friend, Ellen Gates Starr, moved into a poor immigrant neighborhood in Chicago. They had vague notions of being "good neighbors" to the poor around them and studying the conditions in which they lived. As they observed the structural elements of poverty, however, the two began to create a specific agenda of services and reform. Exploitation of immigrants from southern and eastern Europe, poor employment conditions and inadequate wages, lack of educational opportunities, substandard housing, and inefficient city government were the factors that contributed greatly to the poverty of the area and called for specific responses. Hull-House soon offered a day nursery for children, a club for working girls, lectures and cultural programs, and meeting space for neighborhood political groups.

Along with a remarkable group of reformers who came to live at the settlement, Addams supported labor union activity, lobbied city officials for sanitary and housing reforms, and established the Immigrants' Protective League to fight discrimination in employment and other exploitation of newcomers. In addition, Hull-House members carried on an active program of research. Residents surveyed conditions in tenements and workplaces. They publicized their results widely, attempting to create an atmosphere conducive to governmental and legislative reform.

Under Addams's leadership a powerful network of women social reformers emerged from the Hull-House setting that was influential throughout the United States. Three-fourths of settlement workers in America were women; most were well educated and dedicated to working on problems of urban poverty. These included Julia Lathrop and Grace Abbott, prominent figures in the U.S. Children's Bureau; Florence Kelley, labor and consumer advocate; Alice Hamilton, physician and social activist; and Edith Abbott and Sophonisba Breckinridge, social researchers and key leaders in the development of social work education. In addition to these women, Mary O'Sullivan, a labor leader and reformer, organized the Chicago Women's Bindery Workers' Union in 1889. In 1892, she became the American Federation of Labor's first woman organizer. Additionally, Lucy Flower helped found the Illinois Training School for Nurses, the Chicago Bureau of Charities, the Cook County Juvenile Court, the Protective Agency for Women and Children, and the Lake Geneva Fresh Air Association for poor urban children.

World War I had an adverse effect on the settlement house movement. The settlement houses declined in importance and there seemed to be less need of them. Gradually organizations such as the Young Men's Christian Association, summer camps, neighborhood youth centers, and other local and national agencies were established to carry on similar work. The settlement house movement gradually broadened into a national federation of neighborhood centers. By the early twentieth century, settlement houses were beginning to cooperate with, and merge into, "social work." The settlement house movement led the way to community organization and group work practice within the newly proclaimed profession of social work.

Sherman Silver Purchase Act
was enacted on July 14, 1890[1] as a United States federal law. It was named after its author, Senator John Sherman, an Ohio Republican and chairman of the Senate Finance Committee. While not authorizing the free and unlimited coinage of silver that the Free Silver supporters wanted, it increased the amount of silver the government was required to purchase every month. The act was passed in response to the growing complaints of farming and mining interests. Farmers had immense debts that could not be paid off due to deflation caused by overproduction, and they urged the government to pass the act in order to boost the economy and cause inflation, allowing them to pay their debts with cheaper dollars.[2] Mining companies, meanwhile, had extracted vast quantities of silver from western mines; the resulting oversupply drove down the price of their product, often to below the point at which the silver could be profitably extracted. They hoped to enlist the government to artificially increase the demand for silver.

The act was enacted in tandem with the McKinley Tariff of 1890. William McKinley, an Ohio Republican and chairman of the House Ways and Means Committee, worked with John Sherman, the senior Republican senator from Ohio, to create a package that could both pass the Senate and receive the President's approval.

Under the Act, the federal government purchased millions of ounces of silver, with issues of paper currency; it became the second-largest buyer in the world, after the British Crown in India. In addition to the $2 million to $4 million that had been required by the Bland-Allison Act of 1878, the U.S. government was now required to purchase an additional 4.5 million ounces of silver bullion every month. The law required the Treasury to buy the silver with a special issue of Treasury (Coin) Notes that could be redeemed for either silver or gold. That plan backfired, as people (mostly investors) turned in the new coin notes for gold dollars, thus depleting the government's gold reserves. After the Panic of 1893 broke, President Grover Cleveland oversaw the repeal of the Act in 1893 to prevent further depletion of the country's gold reserves. Banker J. P. Morgan then stepped in to form a syndicate that replenished the reserves with a massive gold loan, for which he received a commission, and is credited with saving the nation from bankruptcy. While the repeal of the Act is sometimes blamed for the Panic, the Panic was already well underway.[2]
Civil Service Commission
is a government agency that is constituted by legislature to regulate the employment and working conditions of civil servants, oversee hiring and promotions, and promote the values of the public service. Its role is roughly analogous to that of the human resources department in corporations. Civil service commissions are often independent from elected politicians.

In Fiji, for example, the PSC reviews government statutory powers to ensure efficiency and effectiveness in meeting public sector management objectives. It also acts as the human relations department, or central personnel authority, for citizens' interaction with the government.

The origin of the Public Service Commission in many jurisdictions was the White Paper Colonial 197 issued in 1946, which set out measures proposed to improve the quality and efficiency of the Colonial Service of the British Administration. The setting up of Public Service Commissions was proposed in paragraph 21(xi), which stated:[1]

" Public Service Commissions should be established in the Colonies. Subject to the general overriding powers of the Secretary of State, the selection and appointment of candidates in the Colonies to posts in the local service will lie with the Governor of the Colony. It is desirable that the Governor should be advised in these matters by a Public Service Commission appointed by him and so composed as to command the confidence of the Service and the public; "

and that

" such Commissions should be established in the Colonies to advise the Governor on the selection and appointment of candidates to posts in the local service, and should be so composed as to command the confidence of the Service and the public.
Jacob S. Coxey
of Massillon, Ohio, was an American politician, who ran for elective office several times in Ohio. Twice, in 1894 and 1914, he led "Coxey's Army", a group of unemployed men who marched to Washington, D.C. to present a "Petition in Boots" demanding that the United States Congress allocate funds to create jobs for the unemployed. Although the marches failed, Coxey's Army was an early attempt to arouse political interest in an issue that grew in importance until the Social Security Act of 1935 encouraged the establishment of state unemployment insurance programs.
"Cross of Gold" speech
was delivered by William Jennings Bryan, a former United States congressman from Nebraska, at the Democratic National Convention in Chicago on July 9, 1896. In the address, Bryan supported bimetallism or "free silver", which he believed would bring the nation prosperity. He decried the gold standard, concluding the speech, "you shall not crucify mankind upon a cross of gold".[1] Bryan's address helped catapult him to the Democratic Party's presidential nomination; it is considered one of the greatest political speeches in American history.

For twenty years, Americans had been bitterly divided over the nation's monetary standard. The gold standard, which the United States had effectively been on since 1873, limited the money supply but eased trade with other nations, such as the United Kingdom, whose currency was also based on gold. However, many Americans believed bimetallism (making both gold and silver legal tender) was necessary to the nation's economic health. The financial Panic of 1893 intensified the debates, and when Democratic President Grover Cleveland continued to support the gold standard against the will of much of his party, activists became determined to take over the Democratic Party organization and nominate a silver-supporting candidate in 1896.

Bryan had been a dark horse candidate with little support in the convention. His speech, delivered at the close of the debate on the party platform, electrified the convention and is generally credited with getting him the nomination for president. However, he lost the general election to William McKinley and the United States formally adopted the gold standard in 1900.
Thomas Nast
was a German-born American caricaturist and editorial cartoonist who is considered to be the "Father of the American Cartoon".[1] He was the scourge of Boss Tweed and the Tammany Hall political machine. Among his notable works were the creation of the modern version of Santa Claus and the political symbol of the elephant for the Republican Party. Contrary to popular belief, Nast did not create Uncle Sam (the male personification of the American people), Columbia, the female personification of American values, or the Democratic donkey,[2] though he did popularize these symbols through his art.

William Marcy Tweed
widely known as "Boss" Tweed, was an American politician most notable for being the "boss" of Tammany Hall, the Democratic Party political machine that played a major role in the politics of 19th century New York City and State. At the height of his influence, Tweed was the third-largest landowner in New York City, a director of the Erie Railroad, the Tenth National Bank, and the New-York Printing Company, as well as proprietor of the Metropolitan Hotel.[2]

Tweed was elected to the United States House of Representatives in 1852, and the New York County Board of Supervisors in 1858, the year he became the "Grand Sachem" of Tammany Hall. He was also elected to the New York State Senate in 1867, but Tweed's greatest influence came from being an appointed member of a number of boards and commissions, his control over political patronage in New York City through Tammany, and his ability to ensure the loyalty of voters through jobs he could create and dispense on city-related projects.

Tweed was convicted for stealing an amount estimated by an aldermen's committee in 1877 at between $25 million and $45 million from New York City taxpayers through political corruption, although later estimates ranged as high as $200 million.[3] Assuming an average annual inflation rate of 2.7% since 1870, $25 million to $200 million is equivalent to roughly $1 billion to $8 billion in 2010 dollars. He died in the Ludlow Street Jail.
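The dollar conversion above is simple compound-growth arithmetic. A minimal sketch (the 2.7% rate and the 1870 to 2010 span come from the passage; the function name and everything else are illustrative):

```python
# Adjust an 1870 dollar amount to 2010 dollars by compounding
# the cited 2.7% average annual inflation rate over 140 years.
def adjust_to_2010(value_1870, rate=0.027, years=2010 - 1870):
    return value_1870 * (1 + rate) ** years

low = adjust_to_2010(25_000_000)    # lower estimate of the theft
high = adjust_to_2010(200_000_000)  # upper estimate

print(f"${low / 1e9:.1f} billion to ${high / 1e9:.1f} billion")
```

Running this reproduces the passage's range, roughly $1 billion to $8 billion in 2010 dollars.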
Thomas A. Edison
was an American inventor and businessman. He developed many devices that greatly influenced life around the world, including the phonograph, the motion picture camera, and a long-lasting, practical electric light bulb. Dubbed "The Wizard of Menlo Park" (now Edison, New Jersey) by a newspaper reporter, he was one of the first inventors to apply the principles of mass production and large-scale teamwork to the process of invention, and because of that, he is often credited with the creation of the first industrial research laboratory.[1]

Edison is the fourth most prolific inventor in history, holding 1,093 US patents in his name, as well as many patents in the United Kingdom, France, and Germany. He is credited with numerous inventions that contributed to mass communication and, in particular, telecommunications, including a stock ticker, a mechanical vote recorder, a battery for an electric car, electrical power generation, recorded music, and motion pictures.

His advanced work in these fields was an outgrowth of his early career as a telegraph operator. Edison originated the concept and implementation of electric-power generation and distribution to homes, businesses, and factories - a crucial development in the modern industrialized world. His first power station was on Manhattan Island, New York.
Roger Babson
From "How the 1929 Wall Street Crash unfolded", The Guardian, 3 October 2008:

The bull market on Wall Street began in 1923 and led to an unprecedented period of share trading. However, by 1929 there were signs of instability. On September 3 the Dow Jones Industrial Average reached its peak, closing at 381.7.

On September 5 the economist Roger Babson gave a speech saying, "Sooner or later, a crash is coming, and it may be terrific." He had predicted a crash for years, but this time the market fell.
Carter Glass
was a newspaper publisher and politician from Lynchburg, Virginia. He served many years in Congress as a member of the Democratic Party. As House co-sponsor, he played a central role in the development of the 1913 Glass-Owen Act that created the Federal Reserve System. Glass subsequently served as the U.S. Secretary of the Treasury under President Woodrow Wilson. Later elected to the Senate, he became widely known as co-sponsor of the Glass-Steagall Act of 1933, which enforced the separation of investment banking and commercial banking, and established the Federal Deposit Insurance Corporation (FDIC).
"dust bowl"
or the Dirty Thirties, was a period of severe dust storms causing major ecological and agricultural damage to American and Canadian prairie lands in the 1930s, particularly in 1934 and 1936. The phenomenon was caused by severe drought coupled with decades of extensive farming without crop rotation, fallow fields, cover crops or other techniques to prevent wind erosion.[1] Deep plowing of the virgin topsoil of the Great Plains had displaced the natural deep-rooted grasses that normally kept the soil in place and trapped moisture even during periods of drought and high winds.

During the drought of the 1930s, without natural anchors to keep the soil in place, it dried, turned to dust, and blew away eastward and southward in large dark clouds. At times, the clouds blackened the sky, reaching all the way to East Coast cities such as New York and Washington, D.C. Much of the soil ended up deposited in the Atlantic Ocean, carried by prevailing winds, which were in part created by the dry and bare soil conditions. These immense dust storms—given names such as "black blizzards" and "black rollers"—often reduced visibility to a few feet (a meter or less). The Dust Bowl affected 100,000,000 acres (400,000 km2), centered on the panhandles of Texas and Oklahoma, and adjacent parts of New Mexico, Colorado, and Kansas.[2]

Millions of acres of farmland were damaged, and hundreds of thousands of people were forced to leave their homes; many of these families (often known as "Okies", since so many came from Oklahoma) migrated to California and other states, where they found economic conditions little better during the Great Depression than those they had left. Owning no land, many became migrant workers who traveled from farm to farm to pick fruit and other crops at starvation wages. John Steinbeck later wrote about such people in The Grapes of Wrath, which won the Pulitzer Prize, and in Of Mice and Men.
Benny Goodman
was an American jazz clarinetist and bandleader known as the "King of Swing". His January 1938 concert at Carnegie Hall is regarded as one of the most significant concerts in jazz history, and his racially integrated small groups helped break down the color barrier in American popular music.
Marshall Plan
(officially the European Recovery Program, ERP) was the large-scale American program to aid Europe where the United States gave monetary support to help rebuild European economies after the end of World War II in order to prevent the spread of Soviet communism.[1] The plan was in operation for four years beginning in April 1948. The goals of the United States were to rebuild a war-devastated region, remove trade barriers, modernize industry, and make Europe prosperous again.[2]
The initiative was named after Secretary of State George Marshall. The plan had bipartisan support in Washington, where the Republicans controlled Congress and the Democrats controlled the White House. The Plan was largely the creation of State Department officials, especially William L. Clayton and George F. Kennan. Marshall spoke of the urgent need to help the European recovery in his address at Harvard University in June 1947.[2]
The reconstruction plan, developed at a meeting of the participating European states, was established on June 5, 1947. It offered the same aid to the Soviet Union and its allies but they did not accept it,[3][4] as to do so would be to allow a degree of US control over the Communist economies.[5] During the four years that the plan was operational, US $13 billion in economic and technical assistance was given to help the recovery of the European countries that had joined in the Organization for European Economic Co-operation. This $13 billion was in the context of a U.S. GDP of $258 billion in 1948, and was on top of $13 billion in American aid to Europe between the end of the war and the start of the Plan that is counted separately from the Marshall Plan.[6] The Marshall Plan was replaced by the Mutual Security Plan at the end of 1951.[7]
The ERP addressed each of the obstacles to postwar recovery. The plan looked to the future, and did not focus on the destruction caused by the war. Much more important were efforts to modernize European industrial and business practices using high-efficiency American models, reduce artificial trade barriers, and instill a sense of hope and self-reliance.[8]
By 1952 as the funding ended, the economy of every participant state had surpassed pre-war levels; for all Marshall Plan recipients, output in 1951 was at least 35% higher than in 1938.[9] Over the next two decades, Western Europe enjoyed unprecedented growth and prosperity, but economists are not sure what proportion was due directly to the ERP, what proportion indirectly, and how much would have happened without it. The Marshall Plan was one of the first elements of European integration, as it erased trade barriers and set up institutions to coordinate the economy on a continental level—that is, it stimulated the total political reconstruction of western Europe.[10]
Belgian economic historian Herman Van der Wee concludes the Marshall Plan was a "great success":
"It gave a new impetus to reconstruction in Western Europe and made a decisive contribution to the renewal of the transport system, the modernization of industrial and agricultural equipment, the resumption of normal production, the raising of productivity, and the facilitating of intra-European trade."[11]
Fair Deal
was the term given to an ambitious set of proposals put forward by United States President Harry S. Truman to the United States Congress in his January 1949 State of the Union address. The term, however, has also been used to describe the domestic reform agenda of the Truman Administration,[1] which governed the United States from 1945 to 1953. It marked a new stage in the history of modern liberalism in the United States, but with the Conservative Coalition dominant in Congress, the major initiatives did not become law unless they had GOP support. As Neustadt concludes, the most important proposals were aid to education, universal health insurance, the Fair Employment Practices Committee (FEPC), and repeal of the Taft-Hartley Act. They were all debated at length, then voted down. Nevertheless, enough smaller and less controversial (but still important) items passed that liberals could claim some success.[2]
"military-industrial complex"
is a concept commonly used to refer to policy and monetary relationships between legislators, national armed forces, and the defense industrial base that supports them. These relationships include political contributions, political approval for defense spending, lobbying to support bureaucracies, and oversight of the industry. It is a type of iron triangle. The term is most often used in reference to the system behind the military of the United States, where it gained popularity after its use in the farewell address of President Dwight D. Eisenhower on January 17, 1961, though the term is applicable to any country with a similarly developed infrastructure.
The term is sometimes used more broadly to include the entire network of contracts and flows of money and resources among individuals as well as corporations and institutions of the defense contractors, the Pentagon, Congress, and the executive branch. This sector is intrinsically prone to principal-agent problems, moral hazard, and rent seeking. Cases of political corruption have also surfaced with regularity. A parallel system is that of the military-industrial-media complex, along with the more distant politico-media complex and prison-industrial complex.
A similar thesis was originally expressed by Daniel Guérin, in his 1936 book Fascism and Big Business, about the fascist government support to heavy industry. It can be defined as, "an informal and changing coalition of groups with vested psychological, moral, and material interests in the continuous development and maintenance of high levels of weaponry, in preservation of colonial markets and in military-strategic conceptions of internal affairs."[2]
A. Mitchell Palmer
best known as A. Mitchell Palmer, was Attorney General of the United States from 1919 to 1921. He directed the controversial Palmer Raids.
Boston police strike
began when Boston police officers went out on strike on September 9, 1919, in order to achieve recognition for their trade union and improvements in wages and working conditions. They faced an implacable opponent in Police Commissioner Edwin Upton Curtis, who denied that police officers had any right to form a union, much less one affiliated with a larger organization like the American Federation of Labor (AFL). Attempts at reconciliation between the Commissioner and the police officers, particularly on the part of Boston's Mayor Andrew James Peters, failed.
During the strike, Boston experienced several nights of lawlessness, although property damage was not extensive. Several thousand members of the State Guard, supported by volunteers, restored order. Press reaction both locally and nationally described the strike as Bolshevik-inspired and directed at the destruction of civil society. The strikers were called "deserters" and "agents of Lenin."[1]
Samuel Gompers of the AFL recognized that the strike was damaging the cause of labor in the public mind and advised the strikers to return to work. The Police Commissioner remained adamant and refused to re-hire the striking policemen. He was supported by Massachusetts Governor Calvin Coolidge, whose rebuke of Gompers earned him a national reputation. The strike proved a setback for labor, and the AFL reversed its attempts to organize police officers for another two decades. Coolidge won the Republican nomination for vice-president of the US in the 1920 presidential election.
Eighteenth Amendment
established the nationwide prohibition of the manufacture, sale, and transportation of alcoholic beverages in the United States. Ratified in 1919 and effective in 1920, it was enforced through the Volstead Act and was repealed by the Twenty-first Amendment in 1933.
John T. Scopes
was a high school teacher in Dayton, Tennessee, charged in 1925 with violating the state's Butler Act by teaching evolution. His trial, popularly known as the Scopes "Monkey Trial", drew national attention, pitting prosecutor William Jennings Bryan against defense attorney Clarence Darrow. Scopes was found guilty and fined $100 (he was not jailed), and the verdict was later overturned on a technicality.
National Origins Act
also known as the Asian Exclusion Act (Pub.L. 68-139, 43 Stat. 153, enacted May 26, 1924), was a United States federal law that limited the annual number of immigrants who could be admitted from any country to 2% of the number of people from that country who were already living in the United States according to the Census of 1890, down from the 3% cap set by the Emergency Quota Act of 1921, which it superseded. The law was aimed at further restricting Southern and Eastern Europeans, mainly Jews fleeing persecution in Poland and Russia, who had been immigrating in large numbers since the 1890s, as well as prohibiting the immigration of Middle Easterners, East Asians, and Asian Indians. According to the U.S. Department of State Office of the Historian, "In all its parts, the most basic purpose of the 1924 Immigration Act was to preserve the ideal of American homogeneity."[1] Congressional opposition was minimal.
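The quota rule itself is simple arithmetic: 2% of a nationality's 1890 census count. A minimal sketch of the computation (the census figures below are hypothetical placeholders for illustration, not actual 1890 counts):

```python
# 1924 national-origins quota: 2% of the number of a country's
# nationals resident in the US according to the 1890 census.
def quota_1924(census_1890_count, rate=0.02):
    return int(census_1890_count * rate)

# Hypothetical 1890 counts, for illustration only.
for country, count in {"Country A": 180_000, "Country B": 2_500_000}.items():
    print(country, quota_1924(count))
```

Moving the reference census back from 1910 (used in 1921) to 1890 was itself the restrictive lever, since far fewer Southern and Eastern Europeans were resident in the United States in 1890.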
"lost generation"
is a term used to refer to the generation, actually a cohort, that came of age during World War I. The term was popularized by Ernest Hemingway who used it as one of two contrasting epigraphs for his novel, The Sun Also Rises. In that volume Hemingway credits the phrase to Gertrude Stein, who was then his mentor and patron.
In A Moveable Feast, which was published after both Hemingway and Stein were dead and after a literary feud that lasted much of their lives, Hemingway reveals that the phrase actually originated with the garage owner who serviced Stein's car. When a young mechanic failed to repair the car in a way satisfactory to Stein, the garage owner shouted at the boy, "You are all a génération perdue."[1]:29 Stein, in telling Hemingway the story, added, "That is what you are. That's what you all are ... all of you young people who served in the war. You are a lost generation."[1]:29 This generation included distinguished artists such as F. Scott Fitzgerald, John Steinbeck, T. S. Eliot, John Dos Passos, Waldo Peirce, Isadora Duncan, Abraham Walkowitz, Alan Seeger, and Erich Maria Remarque.
Charles A. Lindbergh
was an American aviator who made the first solo nonstop transatlantic flight, flying the Spirit of St. Louis from New York to Paris in May 1927. The feat made him an international celebrity and a symbol of the decade's enthusiasm for aviation and individual achievement.
"best minds"
"Ohio gang"
was a good ol' boy network of politicians and industry leaders closely surrounding Warren G. Harding, the 29th President of the United States of America. Many of these individuals came into Harding's personal orbit during his tenure as a state-level politician in Ohio, thus the name.
During the Harding administration several of the so-called Ohio Gang became involved in financial scandals, including the Teapot Dome scandal and apparent malfeasance at the U.S. Department of Justice, ending in prison terms and suicides. Following Harding's sudden death from a heart attack in 1923, the Ohio Gang were effectively removed from the corridors of power by Harding's Vice President and successor, Calvin Coolidge.
Albert B. Fall
was a United States Senator from New Mexico and the Secretary of the Interior under President Warren G. Harding, infamous for his involvement in the Teapot Dome scandal.
Warren G. Harding
was the 29th President of the United States (1921-1923). A Republican from Ohio, Harding was an influential self-made newspaper publisher. He served in the Ohio Senate (1899-1903), as the 28th Lieutenant Governor of Ohio (1904-1906) and as a U.S. Senator (1915-1921). He was also the first incumbent United States Senator and the first newspaper publisher to be elected President.[1][2]
His conservatism, affable manner, and make-no-enemies campaign strategy made Harding the compromise choice at the 1920 Republican National Convention. During his presidential campaign, in the aftermath of World War I, he promised a return of the nation to "normalcy". This "America first" campaign encouraged industrialization and a strong economy independent of foreign influence. Harding departed from the progressive movement that had dominated Congress since President Theodore Roosevelt. In the 1920 election, he and his running mate, Calvin Coolidge, defeated Democrat and fellow Ohioan James M. Cox in the largest presidential popular vote landslide (60.36% to 34.19%) since popular vote totals were first recorded in 1824.[3]
President Harding rewarded friends and political contributors, referred to as the Ohio Gang, with financially powerful positions. Scandals and corruption, including the notorious Teapot Dome scandal, eventually pervaded his administration; one of his own cabinet and several of his appointees were eventually tried, convicted, and sent to prison for bribery or defrauding the federal government.[4] Harding did however make some notably positive appointments to his cabinet.[5]
In foreign affairs, Harding spurned the League of Nations and signed a separate peace treaty with Germany and Austria, formally ending World War I. He also strongly promoted naval disarmament at the 1921-1922 Washington Naval Conference, and urged U.S. participation in a proposed International Court. Domestically, Harding signed the first child welfare program in the United States and dealt with striking workers in the mining and railroad industries. He also cleaned up the Veterans Bureau in March 1923.[6] The nation's unemployment rate dropped by half during Harding's administration.[7] In August 1923, President Harding suddenly collapsed and died during a stop in California on a return trip from Alaska.[8] Vice President Calvin Coolidge succeeded him.
Historians have traditionally been resistant to giving Harding good presidential reviews due to the multiple federal department scandals during his administration; as a result, Harding has received low rankings as President.[9] His reputation, however, has increased among some historians for his conservative financial policies, fiscal responsibility, and his endorsement of African American civil rights.[10] Harding's creation of the Budget Bureau was a major economic accomplishment that reformed and streamlined wasteful federal spending.[10] In 1998, journalist Carl S. Anthony stated Harding was a "modern figure" who embraced technology and culture and who was sensitive to the plights of minorities, women, and labor.[11] President Harding contended with racial problems on a national level, rather than sectional, and openly advocated African American political, educational, and economic equality inside the Solid South.[12]
Reconstruction Finance Corporation
was an independent agency of the United States government, established and chartered by the US Congress (Act of January 22, 1932, c. 8, 47 Stat. 5) during the administration of President Herbert Hoover. It was modeled after the War Finance Corporation of World War I. The agency gave $2 billion in aid to state and local governments and made loans to banks, railroads, mortgage associations, and other businesses. The loans were nearly all repaid. It was continued by the New Deal and played a major role in handling the Great Depression in the United States and setting up the relief programs that were taken over by the New Deal in 1933.[1]
Election of 1932
took place in the midst of the Great Depression that had ruined the promises of incumbent President Herbert Hoover to bring about a new era of prosperity. Economics was dominant, and the sort of cultural issues that had dominated previous elections, including Catholicism and the Ku Klux Klan (KKK), were dormant. Prohibition was a favorite Democratic target, as few Republicans tried to defend it. There was a mounting demand to end prohibition and bring back beer, liquor, and the resulting tax revenues.[1] The Democratic nomination went to the well-known governor of the largest state, New York's Franklin D. Roosevelt (FDR). He had been reelected governor in a landslide in 1930. People still remembered his cousin, President Theodore Roosevelt, and FDR had been the losing vice presidential nominee in 1920. This time, Roosevelt united all wings of the party, avoided divisive cultural issues such as religion and the KKK, and brought in a leading southern conservative as his running mate, House Speaker John Nance Garner.
The theme of the campaign was an all-out attack on Hoover's economic failures, with the incumbent hard pressed to defend himself. Roosevelt blamed the Great Depression on Hoover and his protectionist policies, lashing out: "I accuse the present Administration of being the greatest spending Administration in peacetime in all our history."[2] Garner accused Hoover of "leading the country down the path of socialism."[3] Roosevelt himself did not have a clear idea of the New Deal at this point, so he promised no specific programs and tried to appeal to practically all groups of voters, even Republicans. Roosevelt won by a landslide, and this critical election marked the collapse of the Fourth Party System or Progressive Era. With another landslide in the 1934 off-year elections, the electorate was realigned into the Fifth Party System, dominated by Roosevelt's New Deal Coalition.
National Industrial Recovery Act (NIRA)
was an American statute that authorized the President of the United States to regulate industry and permit cartels and monopolies in an attempt to stimulate economic recovery, and that established a national public works program.[1][2]
The legislation was enacted in June 1933 during the Great Depression as part of President Franklin D. Roosevelt's New Deal legislative program. Section 7(a) of the bill, which protected collective bargaining rights for unions, proved contentious (especially in the Senate),[1][3] but both chambers eventually passed the legislation and President Roosevelt signed the bill into law on June 16, 1933.[1][4] The Act had two main sections (or "titles"). Title I was devoted to industrial recovery, and authorized the promulgation of industrial codes of fair competition, guaranteed trade union rights, permitted the regulation of working standards, and regulated the price of certain refined petroleum products and their transportation. Title II established the Public Works Administration, outlined the projects and funding opportunities it could engage in, and funded the Act.
The Act was implemented by the National Recovery Administration (NRA) and the Public Works Administration (PWA).[2][5] Very large numbers of regulations were generated under the authority granted to the NRA by the Act,[6][7] which led to a significant loss of political support for Roosevelt and the New Deal.[2] The NIRA was set to expire in June 1935, but in a major constitutional ruling the U.S. Supreme Court held Title I of the Act unconstitutional on May 27, 1935, in Schechter Poultry Corp. v. United States, 295 U.S. 495 (1935).[2] The National Industrial Recovery Act is widely considered a policy failure, both in the 1930s and by historians today.[1][8][9] Disputes over the reasons for this failure continue, however. Among the suggested causes are that the Act promoted economically harmful monopolies,[6] that the Act lacked critical support from the business community,[10] and that the Act was poorly administered.[10][11] The Act encouraged union organizing, which led to significant labor unrest.[12] The Act had no mechanisms for handling these problems, which led Congress to pass the National Labor Relations Act in 1935.[13] The Act was also a major force behind a major modification of the law criminalizing making false statements.[14]
Charles Mitchell
Glass-Steagall Act of 1933
was a law that established the Federal Deposit Insurance Corporation (FDIC) in the United States and imposed banking reforms, several of which were intended to control speculation.[1] It is often referred to as the Glass-Steagall Act, after its Congressional sponsors, Senator Carter Glass (D) of Virginia, and Representative Henry B. Steagall (D) of Alabama.
The term Glass-Steagall Act, however, is most often used to refer to four provisions of the Banking Act of 1933 that limited commercial bank securities activities and affiliations between commercial banks and securities firms.[2] Starting in the early 1960s, federal banking regulators interpreted these provisions to permit commercial banks, and especially commercial bank affiliates, to engage in an expanding list and volume of securities activities.[3] By the time the affiliation restrictions of the Glass-Steagall Act were repealed through the Gramm-Leach-Bliley Act, signed by President Bill Clinton in 1999, many commentators argued Glass-Steagall was already "dead."[4] Most notably, Citibank's 1998 affiliation with Salomon Smith Barney, one of the largest US securities firms, was permitted under the Federal Reserve Board's then-existing interpretation of the Glass-Steagall Act.[5] Clinton publicly declared, "The Glass-Steagall Act is no longer relevant."[6]
Many commentators have stated that the Gramm-Leach-Bliley Act's repeal of the affiliation restrictions of the Glass-Steagall Act was an important cause of the late-2000s financial crisis.[7][8][9] Some critics of that repeal argue it permitted Wall Street investment banking firms to gamble with their depositors' money that was held in affiliated commercial banks.[10][11][12][13][14][15] Others have argued that the activities linked to the financial crisis were not prohibited (or, in most cases, even regulated) by the Glass-Steagall Act.[16] Commentators, including the American Bankers Association in January 2010, have also argued that the ability of commercial banking firms to acquire securities firms (and of securities firms to convert into bank holding companies) helped mitigate the financial crisis.[17]
Share-the-Wealth Society
was a movement begun during the Great Depression by Huey Long, a governor and later United States Senator from Louisiana.
World War II and social change
"sick chicken case"
A.L.A. Schechter Poultry Corp. v. United States, 295 U.S. 495 (1935), was a decision by the Supreme Court of the United States that invalidated regulations of the poultry industry according to the nondelegation doctrine and as an invalid use of Congress's power under the commerce clause. This was a unanimous decision that rendered the National Industrial Recovery Act, a main component of President Roosevelt's New Deal, unconstitutional.
Stimson Doctrine
is a policy of the United States federal government, enunciated in a note of January 7, 1932, to Japan and China, of non-recognition of international territorial changes that were executed by force. The doctrine was an application of the principle of ex injuria jus non oritur.[1] While some analysts have applied the doctrine in opposition to governments established by revolution, this usage is not widespread, and its invocation usually involves treaty violations.[1]
Named after Henry L. Stimson, United States Secretary of State in the Hoover Administration (1929-1933), the policy followed Japan's unilateral seizure of Manchuria in northeastern China following action by Japanese soldiers at Mukden (now Shenyang), on September 18, 1931.[2] The doctrine was also invoked by U.S. Under-Secretary of State Sumner Welles in a declaration of July 23, 1940 that announced non-recognition of the Soviet annexation and incorporation of the three Baltic states—Estonia, Latvia, and Lithuania[3]—and remained the official U.S. position until the Baltic states gained formal international recognition as independent states in 1991.
It was not the first time that the U.S. had used non-recognition as a political tool or symbolic statement. President Woodrow Wilson had refused to recognize the Mexican revolutionary governments in 1913 and Japan's Twenty-One Demands upon China in 1915.
The Japanese invasion of Manchuria in late 1931 placed U.S. Secretary of State Henry L. Stimson in a difficult position. It was evident that appeals to the spirit of the Kellogg-Briand Pact had no impact on either the Chinese or the Japanese, and the secretary was further hampered by President Herbert Hoover's clear indication that he would not support economic sanctions as a means to bring peace in the Far East.[4]
On January 7, 1932, Secretary Stimson sent identical notes to China and Japan that incorporated a diplomatic approach used by earlier secretaries facing crises in the Far East. Later known as the Stimson Doctrine, or sometimes the Hoover-Stimson Doctrine, the notes read in part as follows:
...the American Government deems it to be its duty to notify both the Imperial Japanese Government and the Government of the Chinese Republic that it cannot admit the legality of any situation de facto nor does it intend to recognize any treaty or agreement entered into between those Governments, or agents thereof, which may impair the treaty rights of the United States or its citizens in China, including those that relate to the sovereignty, the independence, or the territorial and administrative integrity of the Republic of China, or to the international policy relative to China, commonly known as the open door policy...[5]
Stimson had stated that the United States would not recognize any changes made in China that would curtail American treaty rights in the area and that the "open door" must be maintained. The declaration had few material effects on the Western world, which was burdened by the Great Depression, and Japan went on to bomb Shanghai.[4]
The doctrine was criticized on the grounds that it did no more than alienate the Japanese.[6]
"Good Neighbor Policy"
was the foreign policy of the administration of United States President Franklin Roosevelt toward the countries of Latin America. Its main principle was non-intervention and non-interference in the domestic affairs of Latin America. It also reinforced the idea that the United States would be a "good neighbor" and engage in reciprocal exchanges with Latin American countries.[1] Overall, the Roosevelt administration expected that this new policy would create new economic opportunities in the form of reciprocal trade agreements and reassert the influence of the United States in Latin America; however, many Latin American governments were not convinced.[2]
"quarantine speech"
was given by U.S. President Franklin D. Roosevelt on October 5, 1937 in Chicago, calling for an international "quarantine of the aggressor nations" as an alternative to the political climate of American neutrality and non-intervention that was prevalent at the time. The speech intensified America's isolationist mood, causing protest by non-interventionists and foes of intervention. No countries were directly mentioned in the speech, although it was interpreted as referring to Japan, Italy, and Germany.[1] Roosevelt suggested the use of economic pressure, a forceful response, but one less direct than outright aggression.
Public response to the speech was mixed. Famed cartoonist Percy Crosby, creator of the comic strip Skippy and an outspoken critic of Roosevelt, bought a two-page advertisement in the New York Sun to attack it.[2] In addition, it was heavily criticized by Hearst-owned newspapers and by Robert R. McCormick of the Chicago Tribune, but several subsequent compendia of editorials showed overall approval in US media.[3]
December 7, 1941
was a surprise military strike conducted by the Imperial Japanese Navy against the United States naval base at Pearl Harbor, Hawaii, on the morning of December 7, 1941 (December 8 in Japan). The attack was intended as a preventive action in order to keep the U.S. Pacific Fleet from interfering with military actions the Empire of Japan was planning in Southeast Asia against overseas territories of the United Kingdom, the Netherlands, and the United States.
The base was attacked by 353[11] Japanese fighters, bombers and torpedo planes in two waves, launched from six aircraft carriers.[11] All eight U.S. Navy battleships were damaged, with four being sunk. Of these eight, two of the sunken ships were raised and, with four others repaired, six battleships returned to service later in the war. The Japanese also sank or damaged three cruisers, three destroyers, an anti-aircraft training ship,[nb 4] and one minelayer. 188 U.S. aircraft were destroyed; 2,402 Americans were killed[13] and 1,282 wounded. The power station, shipyard, maintenance, and fuel and torpedo storage facilities, as well as the submarine piers and headquarters building (also home of the intelligence section) were not attacked. Japanese losses were light: 29 aircraft and five midget submarines lost, and 65 servicemen killed or wounded. One Japanese sailor was captured.
The attack came as a profound shock to the American people and led directly to the American entry into World War II in both the Pacific and European theaters. The following day (December 8), the United States declared war on Japan. Domestic support for non-interventionism, which had been strong,[14] disappeared. Clandestine support of Britain (for example the Neutrality Patrol) was replaced by active alliance. Subsequent operations by the U.S. prompted Germany and Italy to declare war on the U.S. on December 11, which was reciprocated by the U.S. the same day.
There were numerous historical precedents for unannounced military action by Japan. However, the lack of any formal warning, particularly while negotiations were still apparently ongoing, led President Franklin D. Roosevelt to proclaim December 7, 1941, "a date which will live in infamy".
James F. Byrnes
was an American politician from the state of South Carolina. During his career, Byrnes served as a US Representative (1911-1925), a US Senator (1931-1941), a Justice of the Supreme Court (1941-1942), Secretary of State (1945-1947), and 104th governor of South Carolina (1951-1955). He is one of very few politicians to serve in all three branches of the American national government while also being active in state government. He was a confidant of U.S. President Franklin D. Roosevelt, and was one of the most powerful men in American domestic and foreign policy in the mid-1940s.
"appeasement policy"
is a diplomatic policy aimed at avoiding war by making concessions to an aggressor. Historian Paul Kennedy defines it as "the policy of settling international quarrels by admitting and satisfying grievances through rational negotiation and compromise, thereby avoiding the resort to an armed conflict which would be expensive, bloody, and possibly dangerous."[1] Kennedy's definition has been widely cited by scholars.[2] Appeasement was used by European democracies in the 1930s that wished to avoid war with the dictatorships of Germany and Italy, bearing in mind the horrors of World War I.
The term is most often applied to the foreign policy of the British Prime Minister Neville Chamberlain towards Nazi Germany between 1937 and 1939. His policies of avoiding war with Germany have been the subject of intense debate for seventy years among academics, politicians and diplomats. The historians' assessments have ranged from condemnation for allowing Hitler to grow too strong, to the judgement that he had no alternative and acted in Britain's best interests. At the time, these concessions were widely seen as positive, and the Munich Pact among Germany, Britain, France and Italy prompted Chamberlain to announce that he had secured "peace for our time".[3]
The word "appeasement" has been used as a synonym for weakness and even cowardice since the 1930s, and it is still used in that sense to denounce policies and behaviors that conflict with firm, often armed, action in international relations.
September 1, 1939
is a poem by W. H. Auden written on the occasion of the outbreak of World War II. It was first published in The New Republic issue of 18 October 1939, and was first published in book form in Auden's collection Another Time (1940).
Lend-Lease Act, 1941
was the program under which the United States of America supplied the United Kingdom, the Soviet Union, China, Free France, and other Allied nations with materiel between 1941 and 1945. It was signed into law on March 11, 1941, a year and a half after the outbreak of World War II in Europe in September 1939 but nine months before the U.S. entered the war in December 1941. Formally titled An Act to Promote the Defense of the United States, the Act effectively ended the United States' pretense of neutrality.
A total of $50.1 billion (equivalent to $647 billion today) worth of supplies were shipped: $31.4 billion to Britain, $11.3 billion to the Soviet Union, $3.2 billion to France, and $1.6 billion to China. Reverse Lend-Lease comprised services such as rent on air bases that went to the U.S., and totaled $7.8 billion; of this, $6.8 billion came from the British and the Commonwealth. The terms of the agreement provided that the materiel was to be used until time for its return or destruction. Supplies delivered after the termination date were sold to Britain at a discount for £1.075 billion using long-term loans from the United States. Canada operated a similar program that sent $4.7 billion in supplies to the United Kingdom and the Soviet Union.[2] The United States did not charge for aid supplied under this legislation.
This program was a decisive step away from non-interventionist policy, which had dominated United States foreign relations since the end of World War I, towards international involvement.
Manhattan Project
Battle of Stalingrad
was a major and decisive battle of World War II in which Nazi Germany and its allies fought the Soviet Union for control of the city of Stalingrad (now Volgograd) in southwestern Russia. The battle took place between 23 August 1942 to 2 February 1943[6][7][8][9] and was marked by brutality and disregard for military and civilian casualties. It is among the bloodiest battles in the history of warfare, with the higher estimates of combined casualties amounting to nearly two million. The heavy losses inflicted on the German army made it a significant turning point in the whole war.[10] After the Battle of Stalingrad, German forces never recovered their earlier strength, and attained no further strategic victories in the East.[11]
The German offensive to capture Stalingrad commenced in late summer 1942, and was supported by intensive Luftwaffe bombing that reduced much of the city to rubble. The German offensive eventually became mired in building-to-building fighting; and despite controlling nearly all of the city at times, the Wehrmacht was unable to dislodge the last Soviet defenders clinging tenaciously to the west bank of the Volga River.
On 19 November 1942, the Red Army launched Operation Uranus, a two-pronged attack targeting the weak Romanian and Hungarian forces protecting the 6th Army's flanks.[12] After heavy fighting, the weakly held Axis flanks collapsed and the 6th Army was cut off and surrounded inside Stalingrad. As the Russian winter set in, the 6th Army weakened rapidly from cold, starvation and ongoing Soviet attacks. Command ambiguity coupled with Adolf Hitler's resolute belief in their will to fight further exacerbated the German predicament. Eventually, the failure of outside German forces to break the encirclement, coupled with the failure of resupplying by air, led to the final collapse. By the beginning of February 1943, Axis resistance in Stalingrad had ceased and the remaining elements of the 6th Army had either surrendered or been destroyed.[13]
American Federation of Labor (AFL)
was one of the first federations of labor unions in the United States. It was founded in Columbus, Ohio in December 1886 by an alliance of craft unions disaffected from the Knights of Labor, a national labor association. Samuel Gompers of the Cigar Makers' International Union was elected president of the Federation at its founding convention and was reelected every year except one until his death in 1924.
The AFL was the largest union grouping in the United States for the first half of the 20th century, even after the creation of the Congress of Industrial Organizations (CIO) by unions that were expelled by the AFL in 1935 over its opposition to industrial unionism. While the Federation was founded and dominated by craft unions throughout the first fifty years of its existence, many of its craft union affiliates turned to organizing on an industrial union basis to meet the challenge from the CIO in the 1940s.
In 1955, the AFL merged with its longtime rival, the Congress of Industrial Organizations, to form the AFL-CIO, a federation which remains in place to this day. Together with its offspring, the AFL has comprised the longest lasting and most influential labor federation in the United States.
Congress of Industrial Organizations (CIO)
proposed by John L. Lewis in 1938, was a federation of unions that organized workers in industrial unions in the United States and Canada from 1935 to 1955. The Taft-Hartley Act of 1947 required union leaders to swear that they were not Communists; many CIO leaders refused to obey that requirement, which was later found unconstitutional. The CIO merged with the American Federation of Labor to form the AFL-CIO in 1955.
The CIO supported Franklin D. Roosevelt and the New Deal Coalition, and was open to African Americans. Both federations grew rapidly during the Great Depression. The rivalry for dominance was bitter and sometimes violent. The CIO (Committee for Industrial Organization) was founded on November 9, 1935, by eight international unions belonging to the American Federation of Labor.
In its statement of purpose, the CIO said it had formed to encourage the AFL to organize workers in mass production industries along industrial union lines. The CIO failed to change AFL policy from within. On September 10, 1936, the AFL suspended all 10 CIO unions (two more had joined in the previous year). In 1938, these unions formed the Congress of Industrial Organizations as a rival labor federation. In 1955, the CIO rejoined the AFL, forming the new entity known as the American Federation of Labor-Congress of Industrial Organizations (AFL-CIO).
Cold War
Jacob Riis
was a Danish American social reformer, "muckraking" journalist and social documentary photographer. He is known for using his photographic and journalistic talents to help the impoverished in New York City; those impoverished New Yorkers were the subject of most of his prolific writings and photography. He endorsed the implementation of "model tenements" in New York with the help of humanitarian Lawrence Veiller. Additionally, as one of the most famous proponents of the newly practicable casual photography, he is considered one of the fathers of photography due to his early adoption of flash in photography. While living in New York, Riis experienced poverty and became a police reporter writing about the quality of life in the slums. He attempted to alleviate the bad living conditions of the poor by exposing their plight to the middle and upper classes.
Battle of Midway
is widely regarded as the most important naval battle of the Pacific Campaign of World War II.[5][6][7] Between 4 and 7 June 1942, only six months after Japan's attack on Pearl Harbor, and one month after the Battle of the Coral Sea, the United States Navy decisively defeated an Imperial Japanese Navy (IJN) attack against Midway Atoll, inflicting irreparable damage on the Japanese fleet.[8] Military historian John Keegan has called it "the most stunning and decisive blow in the history of naval warfare."[9]
The Japanese operation, like the earlier attack on Pearl Harbor, sought to eliminate the United States as a strategic power in the Pacific, thereby giving Japan a free hand in establishing its Greater East Asia Co-Prosperity Sphere. The Japanese hoped that another demoralizing defeat would force the U.S. to capitulate in the Pacific War.[10]
The Japanese plan was to lure the United States' aircraft carriers into a trap.[11] The Japanese also intended to occupy Midway as part of an overall plan to extend their defensive perimeter in response to the Doolittle Raid. This operation was also considered preparatory for further attacks against Fiji and Samoa.
The plan was handicapped by faulty Japanese assumptions of the American reaction and poor initial dispositions.[12] Most significantly, American codebreakers were able to determine the date and location of the attack, enabling the forewarned U.S. Navy to set up an ambush of its own. Four Japanese aircraft carriers and a heavy cruiser were sunk for a cost of one American aircraft carrier and a destroyer. After Midway, and the exhausting attrition of the Solomon Islands campaign, Japan's shipbuilding and pilot training programs were unable to keep pace in replacing their losses while the U.S. steadily increased its output in both areas.[13]
internment camps
was the relocation and internment by the United States government in 1942 of about 110,000 Japanese Americans and Japanese who lived along the Pacific coast of the United States to camps called "War Relocation Camps," in the wake of Imperial Japan's attack on Pearl Harbor.[1][2] The internment of Japanese Americans was applied unequally throughout the United States. All who lived on the West Coast of the United States were interned, while in Hawaii, where the 150,000-plus Japanese Americans composed over one-third of the population, an estimated 1,200[3] to 1,800 were interned.[4] Of those interned, 62% were American citizens.[5][6]
President Franklin D. Roosevelt authorized the internment with Executive Order 9066, issued February 19, 1942, which allowed local military commanders to designate "military areas" as "exclusion zones," from which "any or all persons may be excluded." This power was used to declare that all people of Japanese ancestry were excluded from the entire Pacific coast, including all of California and much of Oregon, Washington and Arizona, except for those in internment camps.[7] In 1944, the Supreme Court upheld the constitutionality of the exclusion orders,[8] while noting that the provisions that singled out people of Japanese ancestry were a separate issue outside the scope of the proceedings.[9] The United States Census Bureau assisted the internment efforts by providing confidential neighborhood information on Japanese Americans. The Bureau's role was denied for decades, but was finally proven in 2007.[10][11]
In 1988, Congress passed and President Ronald Reagan signed legislation which apologized for the internment on behalf of the U.S. government. The legislation said that government actions were based on "race prejudice, war hysteria, and a failure of political leadership".[12] The U.S. government eventually disbursed more than $1.6 billion in reparations to Japanese Americans who had been interned and their heirs.[13]
Charles de Gaulle
was a French general and statesman who led the Free French Forces during World War II. He later founded the French Fifth Republic in 1958 and served as its first President from 1959 to 1969.[1] A veteran of World War I, in the 1920s and 1930s de Gaulle came to the fore as a proponent of mobile armoured divisions, which he considered would become central in modern warfare. During World War II, he earned the rank of brigadier general (retained throughout his life),[2] leading one of the few successful armoured counter-attacks during the 1940 Battle of France in May at Montcornet, and then briefly served in the French government as France was falling. De Gaulle was the most senior French military officer to reject the June 1940 armistice with Nazi Germany right from the outset.[3]
He escaped to Britain and gave a famous radio address, broadcast by the BBC on 18 June 1940, exhorting the French people to resist Nazi Germany,[4] and organized the Free French Forces with exiled French officers in Britain.[5] As the war progressed, de Gaulle gradually gained control of all French colonies except Indochina, most of which had at first been controlled by the pro-German Vichy regime. Despite earning a reputation for being a difficult man to do business with, by the time of the Allied invasion of France in 1944 he was heading what amounted to a French government in exile. Although he insisted that France be treated as a great independent power by the other Allies, the Americans in particular remained deeply suspicious of his motives. De Gaulle became prime minister in the French Provisional Government, resigning in 1946 because of political conflicts.[6]
After the war he founded his own political party, the Rally of the French People (RPF) on 14 April 1947. Although he retired from politics in the early 1950s after the RPF's failure to win power, and was banned from the government-controlled TV and radio, he was voted back to power as prime minister by the French Assembly during the May 1958 crisis. De Gaulle led the writing of a new constitution founding the Fifth Republic,[7] and was elected President of France, an office which now held much greater power than in the Third and Fourth Republics.[8]
As President, Charles de Gaulle ended the political chaos that preceded his return to power. A new French currency was issued in January 1960 to control inflation and industrial growth was promoted. Although he initially supported French rule over Algeria, he controversially decided to grant independence to that country, ending an expensive and unpopular war but leaving France divided and having to face down opposition from the European settlers and French military who had originally supported his return to power.
Immensely patriotic, de Gaulle and his supporters held the view, known as Gaullism, that France should continue to see itself as a major power and should not rely on other nations, such as the US, for its national security and prosperity. Often criticized for his Politics of Grandeur, de Gaulle oversaw the development of French atomic weapons and promoted a foreign policy independent of U.S. and British influence. He withdrew France from NATO military command (although remaining a member of the Western alliance) and twice vetoed Britain's entry into the European Community. He travelled widely in Eastern Europe and other parts of the world and recognized Communist China. On a visit to Canada in 1967 he gave encouragement to Quebec separatism with his historic declaration "Vive le Québec Libre".
During his term, de Gaulle also faced controversy and political opposition from Communists and Socialists, as well as from the far right. Despite having been re-elected as President, this time by direct popular ballot, in 1965, in May 1968 he appeared likely to lose power amidst widespread protests by students and workers, but survived the crisis with an increased majority in the Assembly. However, de Gaulle resigned in 1969 after losing a referendum in which he proposed more decentralization. He is considered by many to be the most influential leader in modern French history.
Casablanca Conference
was held at the Anfa Hotel in Casablanca, Morocco, then a French protectorate, from January 14 to 24, 1943, to plan the European strategy of the Allies during World War II. Present were Franklin D. Roosevelt, Winston Churchill, and representatives of the Free French. Soviet leader Joseph Stalin had also been invited but declined to attend in light of the ongoing battle at Stalingrad. General Charles de Gaulle had initially refused to come but changed his mind when Churchill threatened to recognize Henri Giraud as head of the Free French Forces in his place. Giraud was also present at Casablanca, and there was notable tension between the two men during the talks, leading Roosevelt at one point to force them to shake hands; they did so reluctantly and so quickly that photographers had to ask them to repeat the handshake for the camera. Each man limited their exchange to offering to let the other serve under him.[1]
Burke-Wadsworth Act of 1940
created the first peace-time draft in United States history. Both the Congress and the president were concerned with the military expansion of Germany, Japan, and Italy. By implementing a draft, the United States government would be better prepared if the nation became involved in the military conflicts raging in other parts of the world.
Under the Burke-Wadsworth Act, all American males between twenty-one and thirty-five years of age registered for the draft. The government selected men through a lottery system. If drafted, a man served for twelve months. According to the Burke-Wadsworth Act's provisions, drafted soldiers had to remain in the Western Hemisphere or in United States possessions or territories located in other parts of the world.
The draft began in October 1940. By the early summer of 1941, President Roosevelt asked the U.S. Congress to extend the term of duty for the draftees beyond twelve months. The United States House of Representatives approved the extension by a single vote. The Senate approved it by a wider margin, and Roosevelt signed the bill into law. Many of the soldiers drafted in October 1940 threatened to desert once the original twelve months of their service were up. Many of these men painted the letters "O," "H," "I," and "O" (OHIO) on the walls of their barracks in protest. The letters stood for "Over the Hill in October," meaning that the men intended to desert at the end of their twelve months of duty. Desertions did occur, but they were not widespread. Following the Japanese attack on Pearl Harbor, Hawaii, on December 7, 1941, thousands of American men and women swelled the ranks of the United States military by volunteering for service.
Yalta Conference
New Deal: effects
FDR created this program between 1933 and 1938, aimed at short-term recovery from the Great Depression. It produced a wide variety of programs to help the needy, including the following.
WPA - the largest agency; it employed millions of people, especially among rural and western mountain populations.
TVA - provided construction jobs and electricity to many of the unemployed and built dams on the Tennessee River to control flooding.
AAA - helped farming and agriculture recover from the destruction of the Dust Bowl and offered many jobs to the unemployed.
FDIC and SEC - regulated banking and investment activities.
Both the FDIC and SEC helped to stabilize the economy and are still around today.
Wagner Act - protected workers' right to organize and bargain collectively and created the National Labor Relations Board to resolve disputes between workers and employers. (Wage and hour standards and the ban on child labor came later, under the Fair Labor Standards Act of 1938.)

It did little for African Americans.

Those opposing the New Deal shut down its expansion by 1937. They complained of its cost and the amount of power it gave to the government.

Gave government a great amount of power over the economy.

It didn't get the U.S. out of the Great Depression, but it brought hope to many people who felt they had lost everything.
Jonas Salk
developed the first successful polio vaccine, licensed in 1955.