The Post-War Boom
The period following World War II saw increased prosperity for many Americans.
Analyze the era of U.S. prosperity following World War II
- The years immediately following World War II witnessed stability and prosperity for many Americans. The U.S. economy grew dramatically, expanding at a rate of 3.5% per annum between 1945 and 1970.
- Between 1946 and 1960, the United States witnessed a significant expansion in the consumption of goods and services. Gross national product rose by 36% and personal consumption expenditures by 42%.
- Many socioeconomic changes, including higher and more secure wages, access to paid vacation, Social Security and private pension plans, and expanded educational opportunities, shaped the lives of many working-class families as they transitioned to a middle-class standard of living.
- Many city dwellers chose a suburban lifestyle centered on children and housewives, with the male breadwinner commuting to work. Suburbia housed a third of the nation’s population by 1960.
- The rapid social and technological changes brought a growing corporatization of America and the decline of smaller businesses.
- Despite the fast post-WWII economic growth, a significant proportion of Americans continued to live in poverty, including a large number of African-American families.
- baby boom: Any period marked by a greatly increased fertility rate. This demographic phenomenon is usually described within certain geographical bounds. In the United States, the post-WWII period was marked by this phenomenon.
- suburbia: Residential or mixed-use areas, either existing as part of a city or urban area or as a separate residential community within commuting distance of a city. In most English-speaking regions, these areas are defined in contrast to central or inner city areas. Their fast growth was an important component of the post-WWII U.S. economic boom.
The years immediately following World War II witnessed stability and prosperity for many Americans. Increasing numbers of workers enjoyed high wages, larger houses, better schools, more automobiles, and home comforts like vacuum cleaners and washing machines, labor-saving devices designed to make housework easier. Inventions familiar in the early 21st century made their first appearance during this era.
The U.S. economy grew dramatically in the post-war period, expanding at a rate of 3.5% per annum between 1945 and 1970. During this period, many incomes doubled in a generation, a phenomenon that economist Frank Levy described as “upward mobility on a rocket ship.” The substantial increase in average family income within a generation resulted in millions of office and factory workers being lifted into a growing middle class, enabling them to sustain a standard of living once considered reserved for the wealthy. As noted by Deone Zell, assembly line work paid well, while unionized factory jobs served as “stepping-stones to the middle class.” By the end of the 1950s, 87% of all U.S. families owned at least one television, 75% owned automobiles, and 60% owned homes. By 1960, blue-collar workers had become the most prolific buyers of many luxury goods and services. Additionally, by the early 1970s, post-World War II U.S. consumers enjoyed higher levels of disposable income than those in any other country.
The post-war years were also noted for the rise of the automotive and aviation industries. Many wartime industries continued to conduct business following World War II, driving innovation in newer industries such as aerospace and manufacturing. As companies grew in size, jobs, factory production, and consumer spending rose with them. Between 1946 and 1960, the United States saw greatly increased consumption of goods and services. Gross national product rose by 36% and personal consumption expenditures by 42%, with cumulative gains reflected in the incomes of families and unrelated individuals. While the number of these family units rose sharply from 43.3 million to 56.1 million in 1960, a rise of almost 30%, their average incomes grew even faster, from $3,940 in 1946 to $6,900 in 1960, a 75% increase. After taking inflation into account, the real advance was 16%.
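A quick back-of-the-envelope check shows how these figures fit together: the nominal income growth follows from the two dollar endpoints, and combining it with the stated 16% real advance implies how much the price level rose over the same years. The implied inflation figure below is derived from the text’s numbers, not quoted from it.

```python
# Back-of-the-envelope check of the post-war family income figures.
# The endpoints ($3,940 in 1946, $6,900 in 1960) and the 16% real
# advance come from the text; the price-level rise is inferred.

def growth(start, end):
    """Fractional growth from start to end."""
    return end / start - 1

nominal = growth(3940, 6900)                        # nominal income growth
real = 0.16                                         # real advance (from text)
implied_inflation = (1 + nominal) / (1 + real) - 1  # implied price-level rise

print(f"nominal growth:    {nominal:.0%}")           # ~75%
print(f"implied inflation: {implied_inflation:.0%}")  # ~51%
```

The implied price-level rise of roughly half is broadly consistent with U.S. consumer prices between 1946 and 1960.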
More than 21 million housing units were constructed between 1946 and 1960, and in the latter year, 52% of consumer units in metropolitan areas were homeowners. In 1957, out of all the wired homes throughout the country, 96% had a refrigerator, 87% an electric clothes washer, 81% a television, 67% a vacuum cleaner, 18% a freezer, 12% an electric or gas clothes dryer, and 8% air conditioning. Automobile ownership also soared, with 72% of consumer units owning an automobile by 1960.
The period from 1946 to 1960 also saw a substantial increase in the paid leisure time of working people. The 40-hour workweek established by the Fair Labor Standards Act in covered industries became the actual schedule in most workplaces by 1960. The majority of workers also enjoyed paid vacations, and industries catering to leisure activities blossomed.
Educational outlays were also greater than in other countries, and a higher proportion of young people graduated from high schools and universities than elsewhere in the world, as hundreds of new colleges and universities opened. At the advanced level, U.S. science, engineering, and medicine were world-renowned.
In regard to social welfare, the post-war era saw a considerable improvement in insurance for workers and their dependents against the risks of illness, as private insurance programs like Blue Cross and Blue Shield expanded. With the notable exception of farm and domestic workers, Social Security covered virtually all members of the labor force. In 1959, about two-thirds of factory workers and three-fourths of office workers were provided with supplemental private pension plans.
Many city dwellers gave up cramped apartments for a suburban lifestyle centered on children and housewives, with the male breadwinner commuting to work. Suburbia encompassed a third of the nation’s population by 1960. Suburban growth was a result not only of post-war prosperity but also of innovations in the single-family housing market: low interest rates on 20- and 30-year mortgages and low down payments, especially for veterans. William Levitt began a national trend with his use of mass-production techniques to construct a large Levittown housing development on Long Island, NY. Meanwhile, the suburban population swelled because of the baby boom, a dramatic increase in fertility between 1942 and 1957.
The rapid social and technological changes brought a growing corporatization of the United States and the decline of smaller businesses, which often suffered from high post-war inflation and mounting operating costs. Newspapers declined in numbers and consolidated. The railroad industry, once one of the cornerstones of the U.S. economy and an immense and often scorned influence on national politics, also suffered from explosive automobile sales and the construction of the interstate system. By the end of the 1950s, it was well into decline, and by the 1970s several major railroads had gone bankrupt, necessitating a federal government takeover. Smaller automobile manufacturers such as Nash, Studebaker, and Packard were unable to compete with the so-called Big Three (General Motors, Ford, and Chrysler) in the new post-war world and gradually declined into oblivion. Suburbanization caused the gradual movement of working-class people and jobs out of the inner cities as shopping centers displaced the traditional downtown stores. In time, this would have disastrous effects on urban areas.
The new prosperity did not extend to everyone. Many Americans continued to live in poverty throughout the 1950s, especially older people and African Americans, the latter of whom continued to earn far less than their white counterparts on average in the two decades following the end of World War II. Immediately after the war, 12 million returning veterans were in need of work and in many cases could not find it. In addition, labor strikes rocked the nation, in some cases exacerbated by racial tensions: African Americans had taken jobs during the war and now faced irate returning veterans who demanded that they step aside. The huge number of women employed in the wartime workforce was also rapidly displaced to make room for men.
Between one-fifth and one-fourth of the population could not survive on the income they earned. The older generation of Americans did not benefit as much from the post-war economic boom, especially as many had never recovered financially from the loss of their savings during the Great Depression. Many blue-collar workers continued to live in poverty, including 30% of those employed in industry. Racial differences were staggering. In 1947, 60% of black families lived below the poverty level (defined in one study as below $3,000 in 1968), compared with 23% of white families. In 1968, 23% of black families lived below the poverty level, compared with 9% of white families.
The G.I. Bill of Rights
The G.I. Bill offered returning World War II veterans important benefits that had a great impact on socioeconomic changes in the post-war era.
Compare and contrast the benefits awarded through the G.I. Bill to veterans of World War II and of the Korean War
- The G.I. Bill was a law that provided a range of benefits for returning World War II veterans. Benefits included low-cost mortgages, low-interest loans to start a business, cash payments of tuition and living expenses to attend university, high school or vocational education, as well as 1 year of unemployment compensation.
- By the end of the program in 1956, roughly 2.2 million veterans had used the G.I. Bill education benefits to attend colleges or universities. An additional 5.6 million used these benefits for vocational training programs.
- Although the G.I. Bill did not specifically advocate discrimination, it was interpreted differently for African Americans. Because the programs were directed by local, white officials, many veterans of color did not benefit.
- The success of the G.I. Bill prompted the government to offer similar measures to later generations of veterans. The Veterans’ Adjustment Act of 1952 offered benefits to veterans of the Korean War who served for more than 90 days and had received an “other than dishonorable discharge.”
- Despite the discriminatory manner in which it was implemented, the G.I. Bill proved extremely effective for white veterans, enabling many to transition into the middle class.
- 52–20 Club: A provision of the 1944 G.I. Bill that enabled all former servicemen to receive $20 per week in unemployment benefits for up to 52 weeks while they were looking for work.
- The Veterans’ Adjustment Act of 1952: A law (signed July 16, 1952) that offered benefits to veterans of the Korean War who served for more than 90 days and had received an “other than dishonorable discharge.”
- G.I. Bill: A law that provided a range of benefits for returning World War II veterans. It was available to every veteran who had been on active duty during the war years for at least 90 days and had not been dishonorably discharged. Combat was not required.
World War II Veterans
The Servicemen’s Readjustment Act of 1944, known informally as the G.I. Bill, was a law that provided a range of benefits for returning World War II veterans (commonly referred to as G.I.s). Benefits included low-cost mortgages, low-interest loans to start a business, cash payments of tuition and living expenses to attend university, high school, or vocational education, and 1 year of unemployment compensation. It was available to every veteran who had been on active duty during the war years for at least 90 days and had not been dishonorably discharged. Combat was not required. By the end of the program in 1956, roughly 2.2 million veterans had used the G.I. Bill education benefits so they could enroll in colleges or universities. An additional 5.6 million used the benefits for vocational training programs.
On June 22, 1944, President Franklin Roosevelt signed the Servicemen’s Readjustment Act of 1944 into law. Roosevelt wanted a postwar assistance program to ease the transition from wartime, but he wanted it to be need-based, aiding poor Americans rather than veterans alone. Veterans’ organizations mobilized support in Congress, which rejected Roosevelt’s approach and provided benefits only to veterans of military service, both men and women. The bill was introduced in the House on January 10, 1944, and in the Senate the following day; both chambers approved their own versions.
An important provision of the G.I. Bill was low-interest, zero-down-payment home loans for servicemen, with more favorable terms for new construction than for existing housing. This encouraged millions of American families to move out of urban apartments and into suburban homes. Another provision was known as the 52–20 clause, which enabled all former servicemen to receive $20 a week for up to 52 weeks while they were looking for work. Less than 20% of the money set aside for the 52–20 Club was distributed; rather, most returning servicemen quickly found jobs or pursued higher education.
Although the G.I. Bill did not specifically advocate discrimination, it was interpreted differently for African Americans. Historian Ira Katznelson argued that “the law was deliberately designed to accommodate Jim Crow.” Because the programs were directed by local, white officials, many black veterans did not benefit. Of the first 67,000 mortgages insured by the G.I. Bill, fewer than 100 were taken out by Americans of color. By 1946, only one-fifth of the 100,000 African Americans who had applied for educational benefits had registered in college. Furthermore, historically black colleges and universities (HBCUs) came under increased pressure as rising enrollments and strained resources forced them to turn away an estimated 20,000 veterans. HBCUs were already the poorest colleges, and their resources were stretched even thinner when veterans’ demands necessitated a shift in the curriculum away from the traditional “preach and teach” course of study HBCUs offered.
The United States Department of Veterans Affairs, because of its strong affiliation to the all-white American Legion and Veterans of Foreign Wars, also became a formidable foe to many African Americans in search of an education, because it had the power to deny or grant the claims of black G.I.s. Additionally, banks and mortgage agencies refused loans to African Americans, making the G.I. Bill even less effective for Americans of color.
Korean War Veterans
The success of the 1944 G.I. Bill prompted the government to offer similar measures to later generations of veterans. The Veterans’ Adjustment Act of 1952, signed into law on July 16, 1952, offered benefits to veterans of the Korean War who served for more than 90 days and had received an “other than dishonorable discharge.” Korean War veterans did not receive unemployment compensation immediately; rather, they became entitled to it at the end of a waiting period determined by the amount and disbursement dates of their mustering-out pay. They were entitled to 26 weeks at $26/week, paid for by the federal government but administered by the states. One improvement in the unemployment compensation for Korean War veterans was that they could receive both state and federal benefits, with the federal benefits beginning once state benefits were exhausted.
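The unemployment provisions of the two laws can be compared directly; multiplying the weekly rates by the number of covered weeks given above yields each law’s maximum federal benefit.

```python
# Maximum federal unemployment benefit under each law, computed from the
# weekly rates and durations given in the text.
wwii_max = 20 * 52    # 1944 G.I. Bill (52-20 Club): $20/week for up to 52 weeks
korea_max = 26 * 26   # 1952 Act: $26/week for 26 weeks

print(f"WWII veterans:       ${wwii_max:,}")   # $1,040
print(f"Korean War veterans: ${korea_max:,}")  # $676
```

Note that this comparison covers only the federal benefit; as described above, Korean War veterans could also draw state benefits before the federal payments began.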
One significant difference between the 1944 G.I. Bill and the 1952 Act was that tuition was no longer paid directly to the chosen institution of higher education. Instead, veterans received a fixed monthly sum of $110, from which they had to pay for their tuition, fees, books, and living expenses. The decision to end direct tuition payments to schools came after a 1950 House select committee uncovered incidents in which some institutions had overcharged tuition under the original G.I. Bill in an attempt to defraud the government.
Congress did not include merchant marine veterans in the original G.I. Bill, even though merchant mariners were considered military personnel in times of war under the Merchant Marine Act of 1936.
Despite the discriminatory manner in which it was implemented, the G.I. Bill proved highly effective for white veterans. Over half of all World War II veterans used its educational benefits, and by 1947, nearly half of college enrollments were veterans. Nearly a third of all veterans accessed its low-interest loans. With the post-war economic boom and very low unemployment rates, relatively few depended on unemployment benefits. These opportunities allowed many veterans to transition into the middle class and secure economic prosperity.
Veterans also fought for higher education programs more focused on practical needs, which led to an increased valuing of pragmatic programs such as engineering. A college education, and the resultant higher salary, was no longer limited to the U.S. economic elite. Federal tax revenue rose along with average U.S. taxpayer income. Colleges also benefited from the influx of veterans: increased enrollments meant more money for institutions to operate.
A large demand for housing followed from the G.I. Bill’s mortgage subsidies, leading to the expansion of suburbs and the new U.S. middle class. Historians have argued that the bill had a tremendous impact on the dramatic pace of the post-war growth of housing and suburbia.
The Revival of Domesticity and Religion
U.S. post-war economic prosperity drove much higher birth rates and pushed many women back into the domestic sphere; this coincided with an increase in organized religion.
Examine the factors that contributed to the revival of domesticity and religion in the years following World War II
- The decade following World War II was characterized by increasing wealth throughout much of U.S. society. As economic prosperity empowered couples who had postponed marriage and parenthood, the birth rate began to shoot up in 1941, peaking in the late 1950s, a phenomenon known as the post-war baby boom.
- As men’s return from military service had forced many women out of the labor market, many chafed at the social expectations of being relegated to a stay-at-home housewife who cooked, cleaned, shopped, and tended to the children.
- In the 1950s, membership in churches increased significantly, and the growing popularity of organized religion shaped the daily life of Americans and influenced U.S. politics.
- As the resurgence of organized religion continued to grow in the United States, a number of landmark Supreme Court cases addressed the issue of separation of church and state.
- baby boom: Any period marked by a greatly increased fertility rate. This demographic phenomenon is usually set within certain geographical bounds. The phenomenon marked the post-World War II period in the United States.
- fundamentalist: One who reduces religion to strict interpretation of core or original texts.
- Everson v. Board of Education: This 1947 Supreme Court case dealt with a New Jersey law that allowed government funds to be used for transportation to religious-oriented schools. Though the law was upheld, this was the first case in which the court applied the Establishment Clause to state law, having interpreted the due process clause of the Fourteenth Amendment as applying the Bill of Rights to the states as well as the federal legislature.
- Engel v. Vitale: This 1962 Supreme Court case determined it was unconstitutional for state officials to compose an official school prayer and require its recitation in public schools, even when the prayer was non-denominational and students could excuse themselves from participation.
The Baby Boom and the Role of Women
The decade following World War II was characterized by growing wealth throughout much of U.S. society. The U.S. economy grew dramatically, expanding at an annual rate of 3.5% between 1945 and 1970. As economic prosperity empowered couples who had postponed marriage and parenthood, the birth rate began to shoot up in 1941, paused in 1944–’45 (with 12 million men in service), and then continued to soar until peaking in the late 1950s, a phenomenon known as the post-war baby boom.
In 1946, live births in the United States surged from 222,721 in January to 339,499 in October. By the end of the 1940s, about 32 million babies had been born, compared with 24 million in the 1930s. In May 1951, Sylvia Porter, a New York Post columnist, first used the term “boom” to refer to the phenomenon of increased births in the post-war United States. Annual births first topped four million in 1954, and did not drop below that figure until 1965, by which time four in 10 Americans were under age 20.
Many factors contributed to the baby boom. In the post-war years, couples who could not afford to raise a family during the Great Depression made up for lost time; the mood was now optimistic. During the war, unemployment ended and the economy greatly expanded. Millions of veterans returned home and were forced to reintegrate into society. To facilitate this process, Congress passed the G.I. Bill, which, through the distribution of loans to veterans at low or no interest rates, encouraged home ownership and investment in higher education. The G.I. Bill enabled record numbers of people to finish high school and attend college. This led to increasingly skilled workers and yielded higher incomes for families.
Returning veterans married, started families, pursued higher education, and bought their first homes. With veterans’ benefits, these 20-somethings found new homes in planned communities on the outskirts of U.S. cities. Marriage rates rose sharply in the 1940s and reached all-time highs for the country. Americans began to marry at a younger age: the average age at first marriage dropped to 22.5 years for males and 20.1 for females, down from 24.3 for males and 21.5 for females in 1940. Getting married immediately after high school was becoming commonplace, and women were increasingly under tremendous pressure to marry by the age of 20. The stereotype developed that women were going to college to earn their M.R.S. (Mrs.) degree.
The role of women in U.S. society became an issue of particular interest in the post-war years, with marriage and feminine domesticity depicted as the primary goal for the country’s women. As men’s return from military service had forced many women out of the labor market, many chafed at the social expectations of being relegated to a stay-at-home housewife who cooked, cleaned, shopped, and tended to the children. In 1963, Betty Friedan published her book, The Feminine Mystique, which strongly criticized the role of women during the postwar years and was a best-seller and a major catalyst of the women’s liberation movement.
As the birth rate soared, families grew, and more people moved to the suburbs, the United States witnessed a subsequent boom in affiliation with organized religion, especially involving various Protestant churches. Between 1950 and 1960, church membership among Americans increased from 49% to 69%. Religious messages began to infiltrate popular culture as religious leaders became famous faces and numerous religious organizations were formed. Institutionalized religion became such a critical aspect of U.S. life that it came to shape major political decisions. In 1948, Dwight Eisenhower referred to himself as “one of the most deeply religious men [he knew],” though unattached to any “sect or organization”; nevertheless, he was baptized in the Presbyterian Church in 1953, the first year of his presidency. A year later, Congress added the words “under God” to the Pledge of Allegiance.
The 1950s saw a boom in the Evangelical Church in the United States. The post-World War II prosperity experienced in the country also affected the church. Church buildings were erected in large numbers, and the Evangelical Church’s activities grew along with this sweeping physical growth. In the southern United States, the Evangelicals, represented by leaders such as Billy Graham, experienced a notable surge, gradually displacing the caricature of the pulpit-pounding country preacher of fundamentalism. Graham began the trend of national celebrity ministers who broadcast to megachurches via radio and television. He is also notable for having been a spiritual adviser to several U.S. presidents, including Eisenhower and Richard Nixon.
In the post-World War II period, a split developed among Evangelicals. Many began to express reservations about being known to the world as fundamentalists. The term neo-evangelicalism was coined in 1947 to identify a distinct movement within self-identified fundamentalist Christianity. The new generation of Evangelicals set as their goal abandoning a militant Bible stance; instead, they would pursue dialogue, intellectualism, nonjudgmentalism, and appeasement. They further called for increased application of the gospel to sociological, political, and economic issues. Additionally, Christianity Today was first published in 1956, a year that also marked the beginning of the Bethany Fellowship, a small press that grew into a leading evangelical press. The self-identified fundamentalists, for their part, increasingly sought to distinguish themselves from this more open group, whom they often characterized derogatorily as neo-Evangelical or simply Evangelical.
The Conservative Baptist Association also emerged in 1947 as part of the continuing Fundamentalist–Modernist Controversy within the Northern Baptist Convention. The forming churches were fundamentalist/conservative churches that had remained in cooperation with the Northern Baptist Convention after other churches had left, such as those that formed the General Association of Regular Baptist Churches.
Separation of Church and State
As the resurgence of religion continued in the United States, a number of landmark Supreme Court cases addressed the issue of separation of church and state. The centrality of the separation concept to the Religion Clauses of the Constitution was made explicit in Everson v. Board of Education (1947), a case that dealt with a New Jersey law that allowed government funds to be used for transportation to religious-oriented schools. Though the law was upheld, this was the first case in which the Court applied the Establishment Clause to state law, having interpreted the due process clause of the Fourteenth Amendment as applying the Bill of Rights to the states as well as the federal legislature. Citing Thomas Jefferson, the Court concluded that, “The First Amendment has erected a wall between church and state. That wall must be kept high and impregnable. We could not approve the slightest breach.”
In 1962, the Supreme Court addressed the issue of officially sponsored prayer or religious recitations in public schools. In Engel v. Vitale (1962), the Court deemed it unconstitutional for state officials to compose an official school prayer and require its recitation in public schools, even when the prayer was non-denominational and students could excuse themselves from participation.
Advances in Technology
After 1945, new technologies resulted in revolutionary changes in agriculture, the space industry, and the medical sciences in the United States.
Evaluate the advances in technology following World War II, and how these influenced the farming, space, and medical industries
- In the aftermath of World War II, technological developments greatly influenced changes in agriculture. Agriculture began to move from small, family-owned farms to large, corporate-owned farms.
- In the 1950s, 77% of households purchased their first television set, and the television industry experienced dramatic growth, with many classic shows and formats developed by legendary personalities.
- Although the Space Race can trace its origins to Germany in the 1930s, it was a critical component of the Cold War. The Soviet launch of Sputnik I led to a huge spike in U.S. technological and industrial productivity.
- In medical sciences, the discovery of the polio vaccine and mass production of penicillin revolutionized the notion of public health.
- The first successful open heart procedure on a human using a heart–lung machine and the world’s first successful renal transplant took place less than 10 years after the end of World War II.
- Space Race: A 20th-century competition between the two Cold War rivals – the Soviet Union and the United States – for supremacy in spaceflight capability. It had its origins in the missile-based nuclear arms race between the two nations that occurred following World War II, enabled by captured German rocket technology and personnel. The technological superiority required for such supremacy was seen as necessary for national security, and symbolic of ideological superiority. It spawned pioneering efforts to launch artificial satellites, unmanned space probes of the Moon, Venus, and Mars, and human spaceflight in low Earth orbit and to the Moon.
- Sputnik I: The Soviet Union launched this first artificial earth satellite into an elliptical low Earth orbit on October 4, 1957. The surprise success precipitated the U.S. Sputnik crisis, began the Space Age, and triggered the Space Race, a part of the larger Cold War. The launch ushered in new political, military, technological, and scientific developments.
- Apollo Program: A U.S. human spaceflight program carried out by the National Aeronautics and Space Administration (NASA) that landed the first humans on Earth’s Moon in 1969–1972. Conceived during the presidency of Dwight Eisenhower, it began in earnest after President John Kennedy in a May 25, 1961 address to Congress proposed the national goal of “landing a man on the Moon and returning him safely to the Earth” by the end of the 1960s.
- NASA: This agency of the U.S. government is responsible for the nation’s civilian space program and for aeronautics and aerospace research.
- Green revolution: Research, development, and technology-transfer initiatives occurring between the 1930s and the late 1960s (with prequels in the work of the agrarian geneticist Nazareno Strampelli in the 1920s and 1930s) that increased agricultural production worldwide.
Advancement in Agriculture
In the aftermath of World War II, technological developments greatly influenced changes in agriculture. Ammonia from plants built during World War II to make explosives became available for making fertilizers, leading to a permanent decline in fertilizer prices. The early 1950s was the peak period for tractor sales in the United States, as the few remaining horses and mules were phased out. The horsepower of farm machinery greatly increased. An effective cotton-picking machine was introduced in 1949. Research on plant breeding produced varieties of grain crops that could produce high yields with heavy fertilizer input. These advancements resulted in the Green revolution that began in the 1940s.
A continued increase in productivity led to further increases in farm size and corresponding reductions in the number of farms. Many farmers sold their land and moved to nearby towns and cities. Others transitioned to part-time operation, supported by off-farm employment.
The Rise of Television
By 1947, when there were 40 million radios in the United States, there were about 44,000 television sets (with probably 30,000 in the New York area). Regular network television broadcasts began on NBC on a three-station network linking New York with the Capital District and Philadelphia in 1944, on the DuMont Television Network in 1946, and on CBS and ABC in 1948. Following the rapid rise of television after the war, the Federal Communications Commission (FCC) was flooded with applications for television station licenses. With more applications than available television channels, the FCC ordered a freeze on processing station applications in 1948 that remained in effect until April 1952.
By 1949, the networks stretched from New York to the Mississippi River, and by 1951 to the West Coast. Commercial color television broadcasts began on CBS in 1951 with a field-sequential color system that was suspended 4 months later for technical and economic reasons. The television industry’s National Television System Committee (NTSC) developed a color television system based on RCA technology that was compatible with existing black-and-white receivers, and commercial color broadcasts reappeared in 1953.
Seventy-seven percent of households purchased their first television set during the 1950s. The use of television was fueled by the drop in television prices that resulted from mass production, increased leisure time, and additional disposable income. Sitcoms offered a romanticized view of middle-class American life. The Emmy-winning comedy I Love Lucy (1951–1960) starred husband and wife Desi Arnaz and Lucille Ball and enjoyed such popularity that some businesses closed early on Monday nights to allow employees to hurry home and watch it. Music programs, comedy and variety shows, and westerns quickly became staples of 1950s television entertainment. Popular quiz and panel shows resulted in quiz show scandals that rocked the nation after it was revealed that producers secretly gave contestants assistance and fixed the outcome of supposedly fair competitions. Talk shows also had their genesis in the 1950s with NBC’s Today, hosted by Dave Garroway, creating the much-copied genre format. The Tonight Show debuted in 1954 with Steve Allen as host. In 1953, CBS anchor Walter Cronkite became the host of the historical news show You Are There.
The Space Race can trace its origins to Germany, beginning in the 1930s and continuing during World War II when Nazi Germany researched and built operational ballistic missiles. At the close of World War II, both the U.S. and Soviet forces recruited or smuggled top German scientists such as Wernher von Braun to their respective countries to continue defense-related work. Von Braun and his team were stationed at Fort Bliss, Texas, in 1945 and conducted launches at the U.S. Army’s nearby White Sands Proving Ground in New Mexico. They set about assembling the captured V2s and began a program of launching them and instructing U.S. engineers in their operation. These tests led to the first rocket to take photos from outer space, and to the first two-stage rocket, the WAC Corporal-V2 combination, in 1949. The German rocket team was moved from Fort Bliss to the Army’s new Redstone Arsenal, in Huntsville, Alabama, in 1950. There, von Braun and his team developed the Army’s first operational medium-range ballistic missile, the Redstone rocket, which in slightly modified versions launched both the United States’ first satellite and the first piloted Mercury space missions. It became the basis for both the Jupiter and Saturn families of rockets.
Competition began in earnest on August 2, 1955, when the Soviet Union responded to the U.S. announcement, made four days earlier, of intent to launch artificial satellites for the International Geophysical Year. The Soviet Union declared it would also launch a satellite “in the near future.” It ultimately beat the United States, with the October 4, 1957, orbiting of Sputnik 1, and later again beat the United States by sending the first human into space, Yuri Gagarin, on April 12, 1961. The race peaked on July 20, 1969, when the United States successfully landed the first humans on the Moon with Apollo 11. The Soviet Union attempted but failed to mount crewed lunar missions, and eventually cancelled them to concentrate on Earth-orbital space stations.
In 1948, Jonas Salk undertook a project funded by the National Foundation for Infantile Paralysis to determine the number of different types of polio virus. Salk saw an opportunity to extend this project toward developing a polio vaccine and, together with the skilled research team he assembled, devoted himself to this work for the next 7 years. Over 1.8 million schoolchildren took part in the trial. When news of the vaccine’s success was made public on April 12, 1955, Salk was hailed as a miracle worker and the day nearly became a national holiday. Around the world, an immediate rush to vaccinate began.
New technologies also revolutionized surgical procedures. The first successful mechanical support of left ventricular function was performed on July 3, 1952, by Dr. Forest Dewey Dodrill using a machine called the Dodrill-GMR, which was co-developed with General Motors. The machine was later used to support right ventricular function. The first successful open heart procedure on a human, utilizing the heart–lung machine, was performed by Dr. John Gibbon on May 6, 1953, at Thomas Jefferson University Hospital in Philadelphia. Gibbon repaired an atrial septal defect in an 18-year-old woman. Gibbon’s machine was further developed into a reliable instrument by a surgical team led by Dr. John W. Kirklin at the Mayo Clinic in Rochester, Minnesota, in the mid-1950s.
On December 23, 1954, Dr. Joseph Murray performed the world’s first successful renal transplant between the identical Herrick twins at the Peter Bent Brigham Hospital in Boston. The operation lasted 5 1/2 hours. He was assisted by Dr. J. Hartwell Harrison and other noted physicians. Murray transplanted a healthy kidney, donated by Ronald Herrick, into Ronald’s identical twin brother Richard, who was dying of chronic nephritis. Richard lived for 8 more years following the operation.
Biotechnology also underwent rapid development. The belief that the needs of an industrial society could be met by fermenting agricultural waste was an important ingredient of the “chemurgic movement.” Fermentation-based processes generated products of continually increasing utility; in the 1940s, penicillin was the most impactful of these. While it was discovered in England, it was produced industrially in the United States using a deep fermentation process originally developed in Peoria, Illinois. The enormous profits and the public expectations penicillin engendered caused a radical shift in the standing of the pharmaceutical industry. Beginning in the 1950s, fermentation technology also became advanced enough to produce steroids on industrially significant scales. Of particular importance was the improved semisynthesis of cortisone, which simplified the old 31-step synthesis to 11 steps. This advance was estimated to reduce the cost of the drug by 70%, making the medicine inexpensive and available.
The Growth of Suburbs
The post-World War II growth of the U.S. suburbs was facilitated by the development of zoning laws, redlining, and numerous innovations in transport, and contributed to major segregation trends and the decline of inner-city neighborhoods.
Examine the significance of the development of suburban communities
- Suburbs first emerged on a large scale in the 19th and 20th centuries due to improved rail and road transport, which led to increased commuting.
- Suburban growth was facilitated by development of zoning laws, redlining, and numerous innovations in transport; 1950 was the first year that more Americans lived in suburbs than any other type of region.
- Economic growth in the United States encouraged the suburbanization of cities, which required massive investments for new infrastructure and homes, while destroying old inner-city neighborhoods.
- With the growth of the suburbs in the early and mid-20th century, a pattern of hypersegregation – a form of racial segregation characterized by geographical grouping of racial groups – emerged.
- The influx of new black residents caused many white Americans to move to the suburbs (“white flight”); during the 1940s, for the first time, a powerful interaction between segregation laws and race differences in terms of socioeconomic status enabled white families to abandon inner cities in favor of suburban living.
- hypersegregation: A form of extreme racial segregation characterized by geographical grouping of racial groups.
- Zoning: Land use planning used by local governments, derived from the practice of designating permitted uses of land based on mapped zones that separate one set of land uses from another. Zoning may be use-based (regulating acceptable uses of land), or it may regulate building height, lot coverage, or similar characteristics, or some combination of them.
- Levittown: The name of seven suburban developments William Levitt created in the United States. Built in the post-war era for returning veterans and their new families, the communities offered attractive alternatives to cramped, central city locations and apartments. The developments are widely considered to be the archetype of post-war suburbia.
- white flight: A term that originated in the United States, starting in the mid-20th century, and applied to the large-scale migration of people of various European ancestries from racially mixed urban regions to more racially homogeneous suburban or exurban regions.
- Redlining: The practice of denying access or increasing the cost of services such as banking, insurance, denying access to jobs, access to health care, or even supermarkets to residents in particular areas. It describes the practice of marking a red line on a map to delineate the area where banks would not invest. Later the term was applied to discrimination against a particular group of people (usually by race or sex) irrespective of geography.
Suburbs first emerged on a large scale in the 19th and 20th centuries due to improved rail and road transport, which led to increased commuting. In the United States, Boston and New York spawned the first suburbs. The streetcar lines in Boston and the rail lines into Manhattan made daily commutes possible. No metropolitan area in the world was as well served by railroad commuter lines at the turn of the 20th century as New York, and it was the rail lines to Westchester from the Grand Central Terminal commuter hub that enabled its development. Westchester’s true importance in the history of U.S. suburbanization derives from the upper-middle-class development of villages including Scarsdale, New Rochelle, and Rye serving thousands of businessmen and executives working in Manhattan.
The suburban population in North America exploded during the post-World War II economic expansion. Masses of returning veterans wishing to start a settled life moved to the suburbs. Levittown developed as a major prototype of mass-produced housing. At the same time, African Americans were rapidly moving north for better jobs and educational opportunities than were available to them in the segregated South. Their arrival in northern U.S. cities and the hostility of many white Americans further stimulated white suburban migration; 1950 was the first year that more Americans lived in suburbs than any other type of region.
Suburban growth was facilitated by development of zoning laws, redlining, and numerous innovations in transport. After World War II, availability of Federal Housing Administration mortgage loans stimulated a housing boom in U.S. suburbs. In the older cities of the northeast, streetcar suburbs originally developed along train or trolley lines that could shuttle workers into and out of city centers where jobs were located. This practice gave rise to the term “bedroom community,” meaning that most daytime business activity took place in the city, with the working population leaving the city at night for the purpose of going home to sleep in the suburbs.
Economic growth in the United States encouraged the suburbanization of cities, which required massive investments for new infrastructure and homes. Consumer patterns were also shifting at this time, as purchasing power was becoming stronger and more accessible to a wider range of families. Suburban houses also brought about needs for products that were not needed in urban neighborhoods, such as lawnmowers and automobiles. During this time, commercial shopping malls were being developed near suburbs to satisfy consumers’ needs and their automobile-dependent lifestyles.
Zoning laws also contributed to the location of residential areas outside of the city center by creating wide areas or “zones” in which only residential buildings were permitted. These suburban residences were built on larger lots of land than in the central city. For example, the lot size for a residence in Chicago is usually 125 feet (38 m) deep, while the width can vary from 14 feet (4.3 m) for a row house to 45 feet (14 m) for a large stand-alone house. In the suburbs, where stand-alone houses are the rule, lots may be 85 feet (26 m) wide by 115 feet (35 m) deep, such as in the Chicago suburb of Naperville. Manufacturing and commercial buildings were segregated in other areas of the city.
White Flight and Hypersegregation
With the growth of the suburbs in the early and mid-20th century, a pattern of hypersegregation – a form of racial segregation characterized by geographical grouping of racial groups – emerged. In the early-20th century, African Americans who moved to large U.S. cities typically moved into the inner city to work industrial jobs. The influx of new black residents caused many white Americans to move to the suburbs. This came to be known as “white flight”. During the 1940s, for the first time, a powerful interaction between segregation laws and race differences in terms of socioeconomic status enabled white families to abandon inner cities in favor of suburban living. The eventual result was severe levels of urban decay that, by the 1960s, resulted in the crumbling urban “ghettos.” Prior to national data obtained by the 1950 U.S. Census, the migration pattern of disproportionate numbers of whites moving from cities to suburban communities was merely anecdotal. The first data set that potentially could prove white flight came from that census. But the original processing of this data, on older-style tabulation machines by the U.S. Census Bureau, failed to attain any such level of statistical proof. It was a rigorous reprocessing of the same mass of raw data, on a UNIVAC I by Donald J. Bogue of the Scripps Foundation, that scientifically proved the reality of white flight.
As industry began to move out of the inner city, African American residents lost the stable industrial jobs that had initially brought them there. Trapped in areas of declining opportunity, they remained in what became the inner-city ghettos that form the core of hypersegregation.
New municipalities were established beyond the abandoned city’s jurisdiction to avoid the legacy costs of maintaining city infrastructure. Instead, new governments spent taxes to establish suburban infrastructure. The federal government contributed to white flight and the early decay of non-white city neighborhoods by withholding the mortgage capital needed to maintain them, thus making it difficult for these communities to either retain or attract middle-class residents. In addition to providing loans that encouraged white families to move to suburbs, the government uprooted many established African American communities by building elevated highways through their neighborhoods. To build a highway, tens of thousands of single-family homes were destroyed. Because these properties were summarily declared to be “in decline,” families were given extremely low compensation for their properties and were forced into federal housing called “projects.” To build projects, still more single-family homes were demolished.
In some areas, the post-World War II racial desegregation of the public schools catalyzed white flight. In 1954, the U.S. Supreme Court case Brown v. Board of Education ordered the legal termination of the “separate but equal” doctrine established by Plessy v. Ferguson (1896), declaring that segregation of public schools was unconstitutional. Many southern jurisdictions mounted massive resistance to the policy. In some cases, white parents withdrew their children from public schools and established private religious schools instead. Upon desegregation in 1957 in Baltimore, Maryland, the Clifton Park Junior High School had 2,023 white students and 34 black students; 10 years later, it had 12 white students and 2,037 black students. In northwest Baltimore, Garrison Junior High School’s student body shifted from 2,504 whites and 12 blacks to 297 whites and 1,263 blacks in that period. At the same time, the city’s working-class population declined because of the loss of industrial jobs as heavy industry restructured.