

To Bear Fruit For Our Race

Advances in Medicine (1955-1980, Section 15)


Dr. Levi Perry, 2000. (Courtesy of Drs. Levi and Eula Perry)

In contemplating a career in medicine in the 1950s, Dr. Levi Perry found motivation in many sources; he remembers that a regular feature in Jet magazine aroused his intellectual curiosity and gave him inspiration. Jet shared stories about the accomplishments of black doctors and discussed the latest innovations in medicine. Indeed, following World War II, a series of revolutionary developments transformed the field of medicine, particularly the treatment of diseases that had long plagued and terrified Americans.

Hear Dr. Perry tell about the motivation he received from Jet magazine.

In the twentieth century, for example, the world saw a dramatic rise in the number of polio cases, leading to epidemics. In the United States, outbreaks of polio occurred each summer, leaving parents fearful for the health of their children. Polio, short for poliomyelitis, is an acute disease caused by a virus. While many infections were minor, severe cases resulted in death or left patients paralyzed or confined to an “iron lung,” a mechanical ventilator that breathed for them.

Texas experienced several large polio epidemics in the 1940s. In 1943, for example, local officials reported almost 1,300 cases of polio, the highest number ever recorded in Texas. Five years later, Houston reported 313 new cases of polio. The regular recurrence of these epidemics prompted intensive research. Although a cure for polio has yet to be found, Dr. Jonas Salk, a New York University-trained physician and researcher at the University of Pittsburgh, developed a killed-virus vaccine that prevented contraction of the disease. After a series of tests to prove its safety and effectiveness, Dr. Salk’s vaccine, administered by injection, was made available to the nation’s children in 1955. Within two years, polio cases in the United States declined more than 85 percent. Dr. Salk’s rival, Dr. Albert Sabin, developed a live-virus vaccine in the early 1960s; because it was taken orally, it became the preferred vaccine around the world. The last case of polio in the United States was reported in 1991.

Other vaccines emerged to fight childhood diseases. A measles vaccine and a mumps vaccine became available in 1964 and 1967, respectively. Vaccines for rubella (1970) and for meningitis (1978) soon followed. These vaccines, still given to children today, allowed doctors to focus on preventing diseases rather than merely treating them. Posters and advertisements urged parents to immunize their children and helped to erase the fear of diseases that had often proved debilitating or fatal in the past. Vaccines contributed to the decline in deaths from contagious diseases, but the decrease in overall mortality, and the commensurate increase in life expectancy, is also attributable to broader societal improvements, such as better education, nutrition, sanitation, and public health services.

Sickle Cell Anemia

It was not until the 1970s that a problem of particular concern to the African-American community received national attention. The successes of various civil rights campaigns helped prompt President Richard Nixon to ask Congress for funds for research on sickle cell disease, or sickle cell anemia. Sickle cell anemia is a recessive hereditary disease that primarily affects African-Americans. It is marked by the distinctive sickle shape of the patient’s damaged red blood cells, the cells that deliver oxygen to the body’s tissues. Living with a chronic, lifelong illness, sufferers experience periodic attacks when sickle-shaped cells obstruct capillaries and restrict blood flow to organs, causing swelling, organ damage, and severe pain. Keith Wailoo, a noted medical historian, argues that “sickle cell anemia would emerge to exemplify the African-American condition” in the twentieth century. 1 Initially, public health officials had viewed African-Americans as ignorant carriers of disease. At other times, sickle cell anemia, like many other blood diseases, remained unknown, misunderstood, and frequently misdiagnosed.

Hear Dr. Kendall discuss sickle cell anemia.

In World War II, the United States fought fascism across the globe. Medical advancements in the treatment of trauma patients during the war revealed that there were no differences in the blood of different races. Nonetheless, the American Red Cross continued to segregate blood according to race until 1950, a policy that “revealed the illogic of Jim Crow medicine.” 2 As postwar civil rights activities gained momentum, however, the social meaning of disease changed, bringing new attention to issues previously ignored by the medical establishment. The pain associated with sickle cell anemia gave dramatic insight into other difficulties experienced by young African-Americans. “Not surprising, given the tenor of the 1960s politics, pressure was building for more immediate remedies. Pain management was quickly becoming one of the most important issues.” 3 Physicians began to treat sickle-cell crises with analgesics, or drugs designed to relieve pain.

In the 1970s, Americans learned more about sickle cell disease through media coverage. With growing awareness, some African-Americans sought tests to screen for the sickle cell gene before having children. At Rensselaer Polytechnic Institute in Troy, New York, a professor recruited African-American students to help with genetic screening in the community. Dr. Seymour Weaver, a native Houstonian and an undergraduate participant at the time, remembers that despite a winter storm the night before, more than 100 people came to be tested on the first day. Dr. Weaver later entered Baylor College of Medicine in 1974. He completed his residency in dermatology at the Martin Luther King County Hospital in Los Angeles before returning to Houston.

Hear Dr. Weaver discuss his decision to specialize in dermatology.

In 1971, President Nixon and Congress also focused on cancer. Cancer is a group of diseases characterized by the uncontrolled growth and spread of abnormal cells. When the spread is not controlled, it can result in a painful death. In the first half of the twentieth century, with only limited surgical treatments available for most cancers, doctors could do little more than try to bring comfort to the patient and his or her family.


Aerial shot of the Texas Medical Center, c. 1950 (Courtesy of the Houston Metropolitan Research Center, Houston Public Library)

In the 1930s and 1940s, important new research and treatment facilities appeared. The Memorial Hospital for Cancer and Allied Diseases opened in New York City in 1939, followed seven years later by its research arm, the Sloan-Kettering Cancer Center. In 1941, the Texas state legislature granted the University of Texas $500,000 to build a cancer research center in Houston. The M. D. Anderson Foundation, established by Monroe Dunaway Anderson to provide philanthropic funds to the Houston community, agreed to match those funds. The new M. D. Anderson Cancer Research Center, along with Hermann Hospital, marked the start of the Texas Medical Center. Since its inception, the M. D. Anderson Cancer Center has been a leader in its field, frequently ranked the best in the nation.

In 1971, Congress passed the National Cancer Act, which provided additional funding to strengthen the National Cancer Institute (NCI). Created in 1937 as an independent research agency, the NCI is now part of the federal government’s National Institutes of Health. Despite the increased expenditures and breakthroughs in cancer research and treatment at these various facilities, the cancer mortality rate in the United States was higher in 1977 than it had been in 1950.

Other new developments in medical technology following World War II played to Americans’ long-held faith in scientific solutions. Surgeons, for example, made enormous strides. With the growth of computers, doctors used smaller tools to perform delicate, precise, and less invasive operations. These new procedures, known as microsurgery, aided the development of transplant surgery, among other advances. “In 1967, Godfrey Hounsfield, an engineer and computer expert . . . had the idea of developing a system to build up a three-dimensional body image.” 4 The technology he developed became known as the computerized axial tomography scan (CAT or CT scan). First introduced for human use in 1971, a CT scan involves a series of x-ray images that a computer processes into a complete three-dimensional image. The CT scan became a particularly useful technology for the diagnosis of tumors and soft tissue injuries, among other conditions, and earned Hounsfield the Nobel Prize in Physiology or Medicine in 1979.

Hear Dr. Kendall explain the differences among x-rays, MRIs, and CT scans.

In 1943, the two-year-old Texas Medical Center consisted of four facilities. By 1980, it included more than twenty institutions providing education, research, and medical services to Houstonians and to visitors from around the world. New institutions, such as the Institute for Rehabilitation and Research, arrived in the late 1950s. The Life Flight program, begun in 1976, provided rapid air transport for patients outside of Houston. New institutions attracted talented new physicians. Baylor, which had joined the Texas Medical Center in 1943, attracted Dr. Michael DeBakey and Dr. Denton Cooley, famous pioneers in cardiovascular surgery. DeBakey, for example, was one of the first physicians to complete a coronary bypass procedure; Cooley performed the first successful heart transplant in the United States in 1968.

Building on the movement that called for stricter licensing laws, which in turn limited competition from other health care providers, physicians had consolidated their authority in the early decades of the twentieth century and now dominated the U.S. medical system. By 1980, the medical profession had emerged as a central institution in American society. The rise of private insurance companies in the preceding decades had expanded the medical market. The federal government supported, but did not control, the medical profession through programs that offered construction loans, Medicare and Medicaid, funds for medical research, and Veterans Administration hospitals as training grounds. Because these programs placed few controls on physicians, they further increased the profession’s hegemony. In the following decades, the rise of managed care would challenge this authority.

Citations

  1. Keith Wailoo, Dying in the City of the Blues: Sickle Cell Anemia and the Politics of Race and Health (Chapel Hill: University of North Carolina Press, 2001).
  2. Ibid., 89.
  3. Ibid., 161.
  4. Roy Porter, The Greatest Benefit to Mankind (New York and London: W. W. Norton & Company, 1997).
