August 26, 1998
Counter to Conventional Wisdom: In Defense of DDT and Against Chemophobia
By Thomas R. DeGregori Department of Economics University of Houston
[The following provides documentation and greater detail for the August 26th Rockwell Lecture. It is not intended as a coherent, formal paper. The material will be covered in more outline form, as fitting a coherent half-hour presentation, with some sections below - such as that on smallpox eradication - being omitted entirely. TRD]
How can we speak to those who live in villages and slums about keeping the oceans, the rivers and air clean when their own lives are contaminated at the source? The environment cannot be improved in conditions of poverty. Nor can poverty be eradicated without the use of science and technology.
(Indira Gandhi, quoted in Jukes, 1974, 15)
The media simultaneously report on how "chemicals" and other technologies are killing us while also giving us stories about the aging of America. If technology is killing us, why are we living so long?
In recent years, in the United States and in other advanced industrial countries, there have been scares about technological dangers that have bordered on hysteria. In 1989 in the United States, there was a massive public reaction to apples treated with a chemical called alar. (Rosen, 1990; see also Sirkin, 1991 and Mueller, 1990) Apples were removed from school lunch programs and sales plummeted.
One source that wished to have alar banned immediately estimated "that a child who drinks an average of 10 ounces of apple juice every day from the first to fifth birthdays would face a cancer risk of from 5 to 50 in 1 million." (Consumer Reports, 1989, 291) Looking at the possible total impact of alar-treated apples upon school children, an editorial in Science argued that "even in a worst case scenario the probability of cancer among the affected group would change from 25% to 25.05%." (Koshland, 1989, 9)
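A minimal arithmetic sketch may help put the per-million figures in perspective. The 25% baseline lifetime cancer risk is the figure used in the Science editorial; Koshland's 25.05% worst case rests on assumptions beyond this simple conversion, so the sketch below only translates the Consumer Reports range into percentage points.

```python
# Convert the Consumer Reports estimate - 5 to 50 cancers per 1 million
# children - into percentage points of added lifetime risk.
low_per_million, high_per_million = 5, 50
added_low = low_per_million / 1_000_000 * 100    # 0.0005 percentage points
added_high = high_per_million / 1_000_000 * 100  # 0.005 percentage points

# Against a baseline lifetime cancer risk of roughly 25%, even the upper
# estimate shifts the total by only a few thousandths of a point.
baseline_pct = 25.0
worst_case_pct = baseline_pct + added_high
print(f"{baseline_pct:.0f}% -> {worst_case_pct:.3f}%")
```

Either way the arithmetic is done, the added risk is orders of magnitude smaller than the year-to-year noise in cancer statistics.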
One 60 Minutes broadcast with Meryl Streep outweighed the counter claims of the National Research Council and a United Nations panel of scientists from the World Health Organization and the Food and Agriculture Organization. 60 Minutes was offered a list of qualified scientists to appear but was contractually obligated to present one side only. A year later, a distinguished toxicologist writing in a policy journal published by the National Academy of Sciences referred to the entire episode as a "hoax." (Rosen, 1991)
The hysteria that followed included one mother sending the Nebraska state police to catch her child's school bus to retrieve the killer apple in her lunch. Another concerned parent called the International Apple Institute to learn whether it was "safe" to pour apple juice down the drain. The only children to get sick or die from drinking apple juice were those who were drinking unpasteurized juice. Tragically, the parents were paying a premium price under the mistaken belief that the "all natural" juice was healthier for the children. The power of collective, organized ignorance can be lethal.
The hysteria would have been hilariously funny except that apple growers and processors lost $375 million, with some literally losing the farm. Many local produce firms also lost on this and subsequent scares. I submit that when Meryl Streep and Oprah Winfrey become our leading toxicologists, we are in trouble.
Doomsday prophets - the bad news bandits:
Thirty years ago, a best-selling book on population began with "the battle to feed all of humanity is over." It was followed with the assertion that even if we mended our profligate ways with "crash programs," mass famine - "hundreds of millions of people will starve to death" - was inevitable. (Ehrlich, 1971 - rev. ed.) Paul and Anne Ehrlich have poured forth ever since with a torrent of prophesies that have been consistently and egregiously wrong. The more wrong the prediction, the more honored the prophets as they climb the ladder of academic success, wrong by wrong. (The line belongs to Texas Guinan, who ran a speakeasy in New York in the 1920s, opened her show with "hello suckers," and bragged that she had climbed the ladder of success wrong by wrong.)
Had they been just a little wrong, we would probably never have heard of them and been all the better for it.
In 1969, Ehrlich predicted in a "scenario" that "all important animal life in the ocean would be extinct" by September 1979, shortly before he won a $10,000 prize at the Woodlands for the best essay on the future. (Ehrlich, 1969) Three years later that same prize (now $30,000) was won by a pair of authors who claimed that 500 million people had died of famine in 1973-1974 - ten times the number who actually died of famine in each of those years and five times as many people as died of all causes in those two years combined. The winning essay was published at the same time in a special issue for the conference of a (refereed?) journal on forecasting. (Freeman and Karen, 1982) There is more than a touch of irony in a journal devoted to forecasting leading with an article that is so grossly in error about the past.
In the industrial world, almost 30 years of life expectancy have been gained in the 20th century, close to twice the gain in all previous human history. (Herman, 1998)
From circa 1900 to the present in the U.S.:
Infant mortality fell from 100 per 1,000 births to between 5 and 6.
A child born then had less than a 1% chance of reaching age 15 without losing a parent, a sibling, or his or her own life; today, the chance of doing so is better than 99%.
The average male died one year before his youngest child left home; the average female, one year after. Today, half of adult life still lies ahead - the empty nest years - when the last child leaves home.
One of innumerable examples of the way that technology has changed our lives in ways that are not fully appreciated can be found in a recent study by two demographers. Their research raises and answers the question of "how many Americans are alive because of twentieth-century technology?" (White and Preston, 1996, title of article) They found that if the mortality rates of 1900 had prevailed throughout this century, half of all Americans alive would not be here today. Half of that half would have been born and already died, which means that many older Americans would not be alive. But the other half of that half, or "another quarter of the population would not have been born because their parents, grandparents, or earlier ancestors would have died before giving birth to them or their progenitors." (White and Preston, 1996, 428) This factor would disproportionately affect younger Americans, who would never have been born.
Since 67% of the effect on population was from a decline in mortality for the 0 to 14 age group, many of today's young people would have been born but died in infancy or childhood. (White and Preston, 1996, 426) Overall, who is in what "category is impossible to say" as the 50% figure is an average that "cannot be assigned to individuals. Because this 50 percent estimate shows little variation by age or sex, any individual could sensibly estimate whether he or she is alive due to survival improvements by merely flipping a coin." (White and Preston, 1996, 428)
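The decomposition above, and White and Preston's coin-flip image, can be sketched in a few lines (the 100,000-flip simulation is purely illustrative, not part of their study):

```python
import random

# Decompose the White and Preston (1996) estimate: under 1900 mortality
# rates, half of all Americans alive today would not be here.
missing_share = 0.50
born_then_died = missing_share / 2  # a quarter: born, but already dead
never_born = missing_share / 2      # a quarter: ancestors died too soon
assert born_then_died + never_born == missing_share

# Because the 50% figure varies little by age or sex, each individual's
# odds of being "alive due to survival improvements" are about even -
# a coin flip, as the authors put it.
random.seed(1998)
flips = sum(random.random() < missing_share for _ in range(100_000))
print(f"{flips / 100_000:.2f}")  # close to 0.50
```

The point of the simulation is only that the average cannot be assigned to individuals: every flip is fifty-fifty, even though the population-level share is fixed.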
One rather ingenious author has argued that since we live longer we "have more time at risk to ill health." (Riley, 1989, 215) In fact, morbidity has fallen along with mortality; we live longer, healthier lives. Among the older Americans whom Riley deems to "have more time at risk to ill health," there has been a "dramatic" decline in chronic disability according to the National Long Term Care Survey, as reported in a 1997 study by the National Academy of Sciences. (PFB, 1997, 3; London, 1997, 10 & 11; ACSH, 1997, 11; see also Crimins et al., 1997 and Fogel and Costa, 1997, 62) "In the last decade alone, senior citizens have experienced a 25 percent decline in the number of days of restricted activity due to illness." (Graham and Wiener, 1997, 9) Even in the workplace, where safety has been largely ignored by environmentalists, there was a 50% decline in the "risk of a fatal or disabling accident" from 1970 to 1990. (Graham and Wiener, 1997, 8)
Contrary to Riley, one can argue that the factors that have led to longer life are part of the same cluster of factors that reduce morbidity, including chronic illness. Fogel and Costa argue cogently and with massive data that the epidemiology of chronic disease is not separate from that of contagious disease. (Fogel and Costa, 1997, 56) Beginning "in utero or infancy," inadequate nutrition can lead to a vast array (far too many to list here) of deleterious conditions that make the organism more susceptible to contagious diseases, to chronic illness later in life and to shorter life expectancy. (Fogel and Costa, 1997, 56-57) Fogel and Costa refer to what they call "technophysio evolution."
The global rate of change in mortality and morbidity, coming later and moving more rapidly, is in many ways more astounding than that for the United States and other developed countries. Over the last four to five decades, life expectancy has increased about 20 years. (USAID, 1998, 1) From 1955 to 1995, life expectancy increased 10 years, from 67 to 77 years, in developed countries, while it went up 24 years, from 40 to 64 years, in developing countries, a 60% increase. (WRI, 1998, 8) In 1950, 287 out of every 1,000 children in developing countries died before their fifth birthday. By 1995, the under-five mortality rate for developing countries had fallen to 90, a decline of more than two thirds. (WRI, 1998, 8) For roughly the same period, 1955 to 1997, the absolute decline was from 21 million to 10 million in the number of children in the world dying before reaching age five. (USAID, 1998, 1) Since world population more than doubled during this period, somewhere between 42 and 45 million children under five would have died in 1997 had 1955 death rates prevailed (ceteris paribus), better than four times the actual number.
In fact, since 1950 the total number of people dying in the world each year has hovered around 50 million, though population has much more than doubled. There were several years between 1950 and 1980 when the number of people dying was actually fewer than in 1950. Currently, estimates of the number of people dying in the world each year are around 52 million. One can therefore say that if 1950 death rates prevailed today (ceteris paribus), well over 50 million more people would die each year, over thirty million of that increase being children, and total deaths would be considerably over 100 million each year. (The ceteris paribus is of course necessary because, had 1950 death rates continued, we would very likely have a very different global population.) An important component in the decline in death rates has been the decrease in deaths from infectious diseases.
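The child-mortality arithmetic above can be checked on the back of an envelope. The only assumption beyond the USAID figures is the population multiplier itself (here taken as 2.0 to roughly 2.15, since the text says only "more than doubled"):

```python
# Rough check of the text's child-mortality arithmetic (USAID, 1998):
# deaths of children under five worldwide.
deaths_1955 = 21_000_000
deaths_1997 = 10_000_000

# World population more than doubled between 1955 and 1997; holding the
# 1955 death rates fixed scales the 1955 toll by a doubling-or-slightly-
# more factor, reproducing the text's range of 42 to 45 million.
hypothetical_low = deaths_1955 * 2.0    # 42 million
hypothetical_high = deaths_1955 * 2.15  # about 45 million

print(f"{hypothetical_low / 1e6:.0f} to {hypothetical_high / 1e6:.0f} million")
print(f"better than {hypothetical_low / deaths_1997:.1f} times the actual figure")
```

Even the conservative end of the range is more than four times the 10 million children who actually died in 1997.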
Currently, about 6% of all deaths in developed countries and 34% of all deaths worldwide (or 17 million) are from infectious diseases. (Murray and Lopez, eds., 1996, 176; WRI, 1998, 11; USAID, 1998, 1) The recent decades in which this happened are likely the first time in the entire history of our species, and certainly the first time since the agricultural revolution, that infectious diseases have not been the cause of the preponderance of human mortality.
So many good things have worked together this century to bring about this change that it is almost impossible to isolate the impact of any one. All the major causes involved advances in science and technology. They are:
Increased and regularized food supply and other forms of nutrition intervention - vitamins, food fortification, enrichment, restoration etc.
Use of pesticides and other means to control or eliminate disease vectors
Knowledge of germ theory - personal hygiene.
For developing countries, specific mention must be made of the entire package of Green Revolution technologies and the use of immunization and vector control with pesticides to eliminate diseases such as smallpox or dramatically reduce the incidence of others.
Chlorine and/or DDT have played an essential role in most of these.
Given our phobias about "chemicals," it should be recognized that humankind did not regularly drink hygienically clean water until the advent of purification processes in this century, which included adding chemicals. The late 18th and early 19th centuries brought developments in basic science, which allowed us to understand respiration (Joseph Priestley and Antoine Lavoisier), and in statistics, which laid the foundations for quantitative medicine. Further, Lavoisier's work led to the "secularization and demystification of water by analyzing it and showing that it could be broken down into hydrogen and oxygen." (Goubert, 1989, 2) In the middle of the 19th century, the physician John Snow identified the water from one well as the source of an outbreak of cholera. This was followed later by Louis Pasteur's recognition of the microbial origin of many diseases, some of which were water-borne.
Previously, water could be visually clean or ritually clean; now, with this new knowledge and chemical intervention, we could have hygienically clean water. In short, "water became an industrial product." (Goubert, 1989, Chapter 7; see also Hamlin, 1990, 301)
Chlorination of water in the United States began in the early part of this century and very quickly "produced dramatic reductions in morbidity and mortality associated with waterborne disease such as typhoid, cholera, amoebic dysentery, bacterial gastroenteritis, and giardiasis." A potential typhoid epidemic in Chicago in 1908 was stopped by chlorinating contaminated drinking water. "The introduction of drinking water disinfection in the United States ... is credited with reducing the incidence of cholera by 90%, typhoid and leptospirosis by 80% and amoebic dysentery by 50%." (Farland and Gibb, 1993, 3)
Despite the fact that "millions of lives have been saved by the use of chlorine for disinfection of water," there are some who would ban its further use even though the evidence for its dangers is meager. (Abelson, 1994, 183; for the human benefits from chlorine and its compounds, see Emsley, 1994, 173-203; see also Amato, 1993 and Putnam and Wiener, 1997, 124-148) The way some critics speak of chlorine leads one to believe that it is a chemical compound manufactured by industry rather than an element with the atomic number 17, found in any periodic table of the elements. We can ban certain uses of chlorine, to our great loss, but we can no more ban chlorine than we can ban that very active carcinogen, the free oxygen radical.
Clean water has been important not only for drinking but also for personal hygiene, for washing produce and other foodstuffs and, used with disinfectants (often containing chlorine), for household cleaning. Chlorine is also contained in medicines and in many pesticides - the organochlorines.
The presumption of many that our food was grown "naturally" before modern chemical pesticides is at best delusional. Innumerable substances have been used to protect crops. The more successful agriculture is, the more it concentrates high quality nutrients. With the exception of hydroponics, greenhouses or other high-tech, capital-intensive methods, virtually all agriculture takes place in the unprotected outdoors.
What is a nutrient for humans is also a nutrient supporting the life processes of insects, birds, rodents and other animals, bacteria, fungi and viruses. And there are plants that will seek to recolonize the land, the seeds of which are frequently in the fields before planting or, prior to modern agriculture, were inadvertently planted along with the crop. Any or all of these could destroy a farmer's crop and often did. We must not forget that, for good reason, Famine was one of the Four Horsemen of the Apocalypse and Death in the form of Pestilence was another. Prior to modern times, famine was a regular feature of human life throughout the entire world.
The improvements made in the land for the crop also make it more "attractive" for many other plants. For example, virtually everywhere that maize (corn) is planted, it has to compete with what American farmers call pigweed and what health food enthusiasts call grain amaranth, the food of the Aztec gods. These competitors do not respect the "property rights" of humans any more than humans respected the rights of the prior plant and animal occupants of the land.
A major impetus for the rise of the modern consumer movement in the United States in the 1920s and 30s was concern over the use of lead arsenate as a pesticide. (Paehlke, 1989, 24) Other poisons used for crop and livestock protection before modern chemical pesticides include the alkaloid nicotine, copper acetoarsenite, potassium 4,6-dinitro-o-cresylate, lime sulfur spray, hydrogen cyanide, sodium arsenite and potassium antimonyl tartrate. (Metcalf, 1980, 220 and Metcalf, 1986, 253 & 259) It is interesting to note that when DDT and other organochlorines were first used, the earlier pesticides were often referred to as the "nonorganic pesticides." Whatever may or may not be the problems of modern chemical pesticides, they are definitely benign to the worker and consumer compared to the many substances used throughout human history.
DDT was banned in the United States because of its alleged damage to wildlife and only a slight suspicion of harm to humans. Nevertheless, one continuously sees references to DDT as a "known carcinogen" without any indication of who knows it and what evidence they have. At the time it was banned for use in the United States in 1972, "numerous scientists protested ... that DDT had been widely used during the preceding 25 years with no increase in liver cancer in any of the populations among whom it had been sprayed." (Lieberman, 1997, 3 & 1998, 8)
The liver cancer death rate in 1944, when DDT was first introduced, was 8.4 per 100,000 population; the figure fell to 5.6 by 1972. (Mellanby, 1992, 80)
Mellanby found this decrease "particularly significant" because increasing life spans were putting more people at risk of cancer, and he found similar conditions "in other countries where DDT has been extensively used." (Mellanby, 1992, 80) What was not widely told in 1972, when DDT was banned in the U.S., and is not widely told now, is that millions of human lives were saved by using DDT. Estimates made by reputable scientists and scientific organizations ran as high as 500 million lives saved. (Lieberman, 1997, 3 & 1998, 8) The World Health Organization, which monitored the use of DDT "over the years," reported in 1979 that it had failed to find "any possible adverse effects of DDT" and deemed it the "safest pesticide used for residual spraying in vector control programs." (quoted in Mellanby, 1992, 82; see WHO, 1979) "The excellent safety record of DDT, never matched by other insecticides used in antimalarial campaigns, other vector control programmes, and agriculture, is based mainly on its poor absorption through the skin." (WHO, 1979, 145)
A 1971 National Academy of Sciences report expressed concerns about DDT throughout the study. (NAS, 1971, 182, 213, 215, 431) Nevertheless, the study praised DDT for its contribution to "the great increase in agricultural productivity, while sparing countless humanity from a host of diseases, most notably, perhaps, scrub typhus and malaria." (NAS, 1971, 432) The report called for a "rule of reason" since, "it is estimated, in little more than two decades, DDT has prevented 500 million deaths due to malaria that would have otherwise been inevitable."
Abandonment of this valuable insecticide should be undertaken only at such time and in such places as it is evident that the prospective gain to humanity exceeds the consequent losses.
(NAS, 1971, 432)
The report further states:
At this writing, all available substitutes for DDT are both more expensive per crop-year and decidedly more hazardous to those who manufacture and use them in crop treatment or for other, more general purposes.
The estimate of 500 million deaths from malaria prevented may be a typographical error. But even if it was "only" 50 million lives saved "in little more than two decades," it is still an extraordinary achievement, considering also the millions of lives saved from other diseases or from hunger and malnutrition. Taking the very rough ratio of 50 to 100 cases of malaria for every death, the 50 million deaths prevented in 20+ years would translate into 2 1/2 million lives saved and 125 to 250 million cases of malaria prevented each year.
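The back-of-the-envelope conversion above can be made explicit; the only inputs are the conservative 50 million figure, the "little more than two decades" span (taken here as 20 years), and the rough 50-to-100 cases-per-death ratio:

```python
# The text's malaria arithmetic, using the conservative figure of
# 50 million deaths prevented over roughly 20 years.
deaths_prevented = 50_000_000
years = 20
lives_per_year = deaths_prevented / years  # 2.5 million lives per year

# At the rough ratio of 50 to 100 malaria cases for every death:
cases_low = lives_per_year * 50    # 125 million cases prevented per year
cases_high = lives_per_year * 100  # 250 million cases prevented per year

print(f"{lives_per_year / 1e6:.1f} million lives, "
      f"{cases_low / 1e6:.0f} to {cases_high / 1e6:.0f} million cases per year")
```

Note that the cases-per-death ratio is the loosest of these numbers; the conclusion that DDT prevented malaria on a vast scale does not depend on where in the 50-to-100 range it falls.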
Elizabeth Whelan's statement on the life-saving benefit of DDT is more than amply supported by the evidence: "DDT prevented more human death and disease than any other man-made chemical in all of human history." (Whelan, 1993, 67)
The movement in the United States to ban the use of DDT had a devastating effect on the well-being of poorer, more vulnerable people in the world. According to a more recent study sponsored by the Institute of Medicine, the National Research Council and the National Academy of Sciences:
In 1972, under pressure from environmental and conservation groups, the United States banned use of the pesticide for all non-public health uses. By 1982, production of DDT in the United States had ceased altogether. The removal of this cheap and effective antimalaria weapon from the U.S. marketplace had a negative impact on malaria control efforts worldwide. Subsequent pesticides (e.g. malathion) have proved to be more expensive and more toxic.
(Oaks et al., 1991, 44-45)
Currently, there are many alternatives to DDT, or chemicals to be used in conjunction with it, but the fact remains that for many areas and many uses, DDT is the safest, cheapest and most effective means of protecting humans from malaria and other diseases or causes of death. This is particularly true of indoor spraying for malaria control. (Oaks et al., 1991, 35, 132-133)
Although mosquitoes in some areas have developed resistance to DDT or similar chlorinated hydrocarbons, in many areas these chemicals are still the best weapon, or an important weapon in an arsenal of weapons, against malaria. (Boyce, 1998) Even when mosquito nets are used to cover sleepers as an alternative to spraying or as an added defense, it has been clearly demonstrated that the nets are most effective when impregnated with pesticide. If the aim of those who would ban the production of DDT and similar chemicals is eventually to prevent or reduce their use worldwide, then the cost of preserving populations of eagles and peregrine falcons (assuming DDT was responsible for their demise, which is questionable) would be paid in an increased incidence of malaria.
There are recent examples of countries where malaria cases increased after the use of DDT was stopped, and at least one country, Ecuador, where the number of malaria cases fell when spraying was increased. (Boyce, 1998, 19) Of the roughly two and one half million people who die each year of malaria, most (possibly as high as 90%) are in Africa, most (again, possibly as high as 90%) are children, and most are poor or very poor.
In Africa, malaria has been on the increase for several decades. (Connor, 1998) Yet in a news article that referred to DDT as the "notorious pesticide," the World Wide Fund for Nature called for a ban on the use of DDT in Africa as part of a global ban. (WWF, 1998) In the article, the WWF makes a number of wild charges about DDT, including the standard claim that it causes cancer. Currently, "one hundred governments, with United Nations officials and non-governmental organizations, hope to reach a legally binding treaty by 2000" banning the use of DDT. (WWF, 1998) The global ban may or may not be good for wildlife, but it will be devastating for humans, particularly for children in Africa.
One representative of a leading environmental group, when told that banning DDT would lead to millions of deaths worldwide, is alleged to have responded: "So what? People are the cause of all the problems; we have too many of them; we need to get rid of some of them; and this is as good a way as any." (Sanford, 1992, 20) Similar sentiments have been expressed about the tsetse fly killing Africans or at least keeping them from inhabiting certain areas. The tsetse fly has been called "the best game warden in Africa," which shows the same callous disregard for the well-being of Africans. (Adams and McShane, 1996, 49)
A British scientist, Dr. Norman Moore, was the first, in the 1950s, to advance the claim that DDT was causing the decline in the population of eagles. One author who has written on him credits Moore as the person who "both initiated the research that showed how pesticides damage wildlife and framed the laws that brought these chemicals under control." (Wakeford, 1991, 34) Moore himself was "sprayed with DDT when liberated from a prisoner of war camp in 1945." No adverse effect on Moore (who was still alive in 1991 when the article was written and in 1998 when I checked) is mentioned in the article. (Author's note: I have been pleasantly surprised, when discussing this issue in public, by how many people later volunteer that they too were sprayed with DDT or an equivalent, as servicemen, concentration camp or prisoner of war camp survivors, or Vietnamese and other refugees. None indicated any known adverse effects.) Moore "refuses to condemn all uses of pesticides, pointing out that they have increased food production and saved millions of people from insect-borne diseases." He adds poignantly that "if I were living in a hut in Africa, I would rather have a trace of DDT in my body than die of malaria." (Wakeford, 1991, 37)
Alleged dangers of DDT and other "chemicals" and technologies
A study on DDT, PCBs and breast cancer that was "larger and better designed than any before it" found "no evidence that exposure to the chemicals DDT and PCB's increases the risk of breast cancer." (Kolata, 1997; for the study see Hunter et al., 1997 & BMJ, 1997; see also Krieger et al., 1994) The study "came as a shock to some advocates for patients," while a spokesperson for an advocacy group maintained that the study was "definitely not the last chapter." (Kolata, 1997) It should not, in fact, have come as a shock to anyone who had followed the debate on the safety of DDT over the last 30 years. Numerous studies of those who have been most exposed to DDT - those who were dusted with DDT during WW II, those who worked over 10 years manufacturing DDT and those heavily exposed to DDT in the anti-malaria spraying campaigns - all failed to find any adverse health effects on humans. (Mellanby, 1992, pp. 73-82) And over the years, there have been articles in scientific journals that collated these results. (see for example, Spindler, 1983 and Coulston, 1985, cited in Mellanby, 1992, 75) Further, in the U.S., liver cancer in humans declined significantly during the period of DDT's use, contrary to animal studies that found malignant tumors in mice in very heavy dose experiments. (Mellanby, 1992, 80)
Scientific and technological inquiry are ongoing. The "last chapter" for any inquiry has not been written and never will be as long as humans are doing science; we have to act on the best available evidence - not the last chapter that will ever be written but the last chapter written thus far - which in this case is that DDT and PCBs have not been shown to cause breast cancer. No study closes out inquiry with absolute finality, but some are sufficiently conclusive to allow us to close inquiry provisionally and to proceed with policy and action until new evidence demands that we reopen a particular issue.
Is there any doubt that, had the DDT/PCB study reached the opposite conclusion, it would have been deemed final and definitive by those who now question its validity? But for the true believer there is always an escape hatch, since this was a "study of two chemicals out of 80,000" in the environment. Those 80,000 chemicals are guilty until proved innocent, and since no proof of innocence will ever be accepted by the believers, "chemicals" are guilty a priori. In this misunderstanding of scientific inquiry, a guilty verdict completely closes out inquiry while an innocent verdict leaves inquiry open, along with the continuing suspicion of guilt. The passive voice in English is useful to the believers, as they can simply refer to "known carcinogens" without specifying who knows that they are carcinogens and how they know it.
Elizabeth Whelan cites an author who found only a "relatively light scattering of ill-effects so far manifested" not to be a lack of evidence for the toxicity of a chemical but proof of it, because of the "mysterious - one might say devilishly capricious - manner in which it can strike and how little is yet understood about this substance." (Whiteside, 1979, 133, quoted in Whelan, 1993, 293) A similar article of faith masquerading as science concerned the alleged synergistic effect of a combination of chemicals on cancer. When the original study could not be replicated (see below), one believer was quoted as saying that, "given the context of the study," it does not mean that "there is not synergy going on out there, it just means we have not found it yet." (Inside EPA, 1997)
Any study that purports to find that a manufactured chemical, particularly a pesticide, may cause cancer or other human disorders gets massive media attention. When "certain chemicals in the environment," acting synergistically, were found to be "estrogenic," possibly leading to an increase in breast cancer, it was widely publicized. (Arnold et al., 1996, see also Stone, 1994, 308-310) But when this report was withdrawn because neither the original researchers nor anyone else was able to replicate the reported results, I could find no media coverage. (McLachlan, 1997) The letter withdrawing the report states:
People in many walks of life have, on their own, put great weight on this report as the basis for much discussion, thought and public policy.
Whatever merit this publication contained, and despite the enthusiasm it generated, it is clear that any conclusions drawn from this paper must be suspended until such time, if ever, the data can be substantiated.
(McLachlan, 1997, 463)
Cost of not using technology:
Because of U.S. risk assessments labeling chlorine a carcinogen, Peru stopped chlorinating its drinking water. This was followed in 1991 by the "largest outbreak of cholera in recent history, which killed nearly 7,000 people and afflicted over 800,000 more." (Graham and Gray, 1997, 15; see also Putnam and Wiener, 1997, 132, 145 and Salazar-Lindo et al., 1993, 403-404) Unfortunately, "regulators ... are sometimes influenced by the public's present tendency toward chemophobia" and fail to weigh risks and benefits adequately. (Malaspina, 1992, xvi)
An outbreak of E. coli O157:H7 occurred in one community in the United States where the spring water used in the public water system was not chlorinated. (Kluger, 1998, 60) Earlier, E. coli O157:H7 was spread in a swimming pool in Atlanta, Georgia that was insufficiently chlorinated. In a study in Britain covering the years 1937 to 1986, "defective chlorination was blamed in 8 out of 10 outbreaks of disease from public water supplies and in all 13 outbreaks in private supplies." (Boyce, 1998b, 19) The costs of not using a technology are too often not considered; the cure is to use more technology and to use it more intelligently. In response to the recent outbreak of E. coli O157:H7, the community cited above began chlorinating its water supply.
B) DDT and other pesticides:
Through time, plants evolved for their own survival, not to serve human needs. (Boyer, 1982, 443; Ames, 1983, 1256) "Plants in nature synthesize toxic chemicals in large amounts, apparently as a primary defense against the hordes of bacterial, fungal, and insects and other animal predators," Bruce Ames says; "plants in the human diet are no exception." Even if fungicides are toxic, as many argue, not using them may be even more toxic to humans.
Ames argues that not only is fungus infestation of plants dangerous in and of itself but that such infestation causes plants to "produce very much larger amounts of their natural toxins" many of which are likely carcinogens. (Ames, 1990, 78, 80, see also Abelson, 1994b, and Graham and Wiener, 1997, 13-14) According to Ames, "we are ingesting in our diet at least 10,000 times more by weight of natural pesticides than of man-made residues." (Ames, 1990, 78, see also Ames, Profet and Gold, 1992a & b) He estimates that the "light-activated carcinogens" in celery can "increase 100-fold when plants are damaged by mold and in fact, can cause an occupational disease in celery-pickers and in produce checkers at supermarkets." (Ames, 1990, 80; see also, French, 1990, 15-16) Ames also argues that very low doses of some chemicals may actually be beneficial in helping our immune system be able to later withstand larger doses. (Ames, 1992 and Calabrese, 1994, cited by Gray and Graham, 1997, 183)
I am not saying that Ames is right, though I believe that he is. What I am arguing is that his argument should be part of the public discourse and policy process and not drowned out by the anti-technology cacophony.
Costs imposed on others:
Domestic laws that prevent the importation of produce carrying residues of chemicals banned in the United States force growers to use pesticides that are more toxic to farm workers and more expensive. (Perfecto, 1992, 189. Perfecto is a critic of most uses of pesticides)
To ensure the entry of vegetables into the U.S., Mexican farmers switched from the highly persistent organochlorines, which were banned or restricted in the U.S., to the less persistent organophosphates.
(Perfecto, 1992, 189)
Unfortunately this transformation, instead of relieving the health threat to those who lived and worked in the Culiacan Valley, increased their immediate public health risk. Organophosphate insecticides are more acutely toxic than the compounds that they have replaced. They were recommended by the EPA because they degraded faster and wouldn't persist in the environment or food.
(Perfecto, 1992, 189)
The same attempt to lower pesticide residue on domestically produced food in the U.S. means that "moderately toxic but persistent" pesticides have been replaced by organophosphate pesticides that are "less persistent but acutely toxic to workers." (Gray and Graham, 1997, 189) The consumers' presumed gain has been at the expense of agricultural workers.
In the United States, "the Food Quality Protection Act of 1996 requires the Environmental Protection Agency to consider children, who are most vulnerable to such health effects, when it sets limits for pesticides in food." (Crenson, 1997)
The law is so strict that even the environmentalists who lobbied for it can hardly believe it passed the 104th Congress unanimously. But the Food Quality Protection Act specifically forbids the EPA from considering occupational exposures, which are known to be much higher than those generally found in food and water. The act goes out of its way to exclude the children and adults who pick America's produce.
Food in the U.S. and developed countries:
One counter-culture advocate, Warren J. Belasco, brushes off any suggestion that our modern food production and processing, including fortification, played any role in the decline in nutritional deficiencies (Belasco, 1989, 126 and elsewhere. In a more recent article, Belasco now recognizes the gains of the Green Revolution, but remains skeptical about the future of modern agriculture. Belasco, 1997). Down to World War II, anemia, beriberi, pellagra and ariboflavinosis were common in the United States, causing illness and death. According to one report, "it is difficult to find a case for study" of the vitamin deficiency diseases that numbered in the hundreds of thousands of cases in the 1930s. "The number of deaths due to pellagra in 1966 was 1.1 percent of the figure for 1941. In 1921 three-fourths of the children in New York City showed signs of rickets. Now, due in great part to the widespread vitamin D fortification of milk, infantile rickets in the United States is extremely rare." (Berg, 1973, 109-110) Berg, as with most specialists on the subject, recognizes that much else was going on, contributing to these improvements, but these other concurrent events in expanded and improved food supply are equally abhorred by Belasco.
It is interesting to note that vitamin D was first manufactured in the 1920s (and may still be) the same way that the sun and our bodies make it: by irradiating fats (cholesterol) with ultraviolet light. It was proudly advertised in national magazines as "Sunshine Vitamin D by Irradiation" in ads which touted irradiated products including "Irradiated Evaporated Milk." (Apple, 1989, 374) Given the climate of fear created by the counter-cuisine enthusiasts, had this technique been developed in recent years, it would never have been approved for use. These phobias in large measure account for the spirited opposition to the use of irradiation of food to destroy micro-organisms and thereby make food safer. (Leslie, 1990, Brody, 1994 and The New York Times, 1994) Dr. David Kessler, a scientist, one of the leading health and consumer advocates and former head of the U.S. Food and Drug Administration, has strongly argued that fears of food irradiation are unfounded and that the potential benefits should not be denied to consumers. (Anderson, 1998)
Not to do something at this point is to condemn us as a country to suffer the consequences of these food borne illnesses again and again. ... Food irradiation is a food safety tool that we as consumers should not ignore.
Jane E. Brody presents an extensive list of potential benefits of food irradiation and compares it to opposition to a turn of the century (19th to 20th century) innovation in food processing. (Brody, 1994)
The innovation under attack almost a century ago was the pasteurization of milk, a health- and life-saving process that myth-mongering opponents kept from commercial use for 50 years.
The Green Revolution:
The anti-science, anti-technology dogma reaches a level of virtual absolute certainty on questions surrounding agricultural research and its implementation in what has popularly become known as the "green revolution."
It is hard to imagine anyone opposing the Green Revolution. The Green Revolution was in many ways a "grain revolution" - wheat, rice and maize - but it also was a "research revolution" that facilitated an extraordinary and sustained general expansion in food production. It must be framed in the context of a revolution in food production that has allowed world population to double while dramatically increasing per capita food production and providing a level of nutrition for more people unprecedented in human history. By 1987, the increase in yields in rice alone had produced enough additional food to feed one billion people. The Green Revolution is irreversible without a catastrophic loss of life.
"Between 1961 and 1994, the number of daily food calories per capita rose from about 1,900 to 2,600 in developing countries, while their populations nearly doubled from 2.2 billion to more than 4.3 billion." Globally in the same period, "average daily per capita food supplies increased more than 20 percent." (Bender and Smith, 1997, 18, for similar data, see Mitchell, Ingco and Duncan, 1998) Again, contrary to widespread dogma, the poor benefited proportionately more than those better off.
Contrary to a persistent, widely believed anti-technology mythology that claims Green Revolution crops are less disease resistant, they were in fact bred to be disease resistant and therefore required less pesticide to achieve the same crop protection. They are also more efficient, requiring less nutrient (fertilizer) per unit of output.
Immunization and the eradication of smallpox:
Surely no one could oppose immunization in the 20th century. Of course, belief in a harmonious nature and a natural evolution of micro-organisms toward being relatively benign can contribute to such opposition but not fully justify it. Prior to immunization, the main prophylaxis against diseases such as smallpox was inoculation. In any human action there are risks, even in those activities that are otherwise lifesaving. For smallpox immunization, the risk in the United States was 0.00011% from all causes (both direct death and complications). This was 20,000 times less risky than inoculation, which was itself considerably less risky than no protection at all. (Copp and Zanella, 1993, 259-260) Even for the controversial pertussis vaccine, there were and remain clear benefits.
Up to 250,000 people per year, mostly children, suffered from pertussis in the 1930s, and approximately 7,000 people died each year in the United States. After 1947, when the DTP combined vaccine was standardized and put into routine use in the United States, the case rate soon fell by a factor of 100 to approximately 2,000 with fewer than a dozen deaths each year.
(Copp and Zanella, 1993, 276-277)
The increased death rate from pertussis that followed the decline in immunization in England and Wales is further evidence of the life-saving benefit of immunization. The alleged complications from the vaccine that led many parents not to have their children immunized against pertussis have been addressed in a vaccine that was recently tested and is now being used. Given the complexity of the epidemiological issues involved, it is also possible that the observed complications were not caused by the pertussis vaccine.
What is clear is that the warnings about the dangers of the pertussis vaccine did cost the lives of children whose parents were frightened into not having them immunized. The lesson is that warnings of danger are not risk free; even when scientifically based, warnings need to be very carefully phrased, and the media should exercise additional care lest they help perpetuate a false alarm. (see Allen, 1998) In early 1998, there was another vaccine scare in the United Kingdom, this time over the vaccine for measles, mumps and rubella. The scientists for the Medical Research Council investigating the issue could find no verifiable links between immunization and autism and other maladies. But they did note the verifiable dangers of not being vaccinated. (BBC, 1998a) Similarly, "Finnish scientists have given the controversial triple vaccination a clean bill of health." (BBC, 1998, see also Reuters, 1998)
Even if someone distrusted immunization despite its enormous successes, one would think that there would be no objection to the total elimination in ten years and nine months (January 1967 to October 1977, with "victory" being declared in May 1980) of a disease, smallpox, that was infecting 10 to 15 million people and killing 1.5 to 2 million people each year while blinding and maiming countless others. During the first eight decades of this century, nearly 300 million people died from smallpox - "three times more than all the wars in this century." (Henderson, 1996, Fenner and Henderson, 1988, Oldstone, 1998, 3 & 27, see also Armelagos, 1998, 24)
Within five years (1972) of the start of the eradication campaign, smallpox no longer existed in the Western Hemisphere, and by the end of another year (1973), smallpox "was restricted to the Indian subcontinent and the horn of Africa, Ethiopia, and Somalia." (Crosby, 1993, 1012, see also Oldstone, 1998, 43-44) Smallpox sufferers "developed spots like flea-bites, which grew into pustules containing a transparent fluid which turned into a thick pus. The eyelids would swell and become glued together. Sufferers had to be prevented from tearing their flesh to shreds." (Kedzierski, 1992, 1)
Never underestimate what Veblen called the "trained incapacity" of trendy intellectuals and presumptive Ivy League scholars. Believe it or not, there are supposedly serious articles in a book published by one of the world's oldest and most prestigious publishers that argue that "Smallpox need not have been eradicated; it could have been contained." (F. A. Marglin, 1990, 140, see also S. A. Marglin, 1990, 20) This is an assertion so preposterous that only an intellectual could make it.
Smallpox now exists only in two laboratories and may end up existing simply as a genetic code in a computer file. (Rensberger, 1992; for the debate as to whether or not to destroy the remaining stocks of smallpox, see Mahey et al, 1993 and Joklik, 1993) There will be a meeting of a special committee of the World Health Organization in 1999 to see if it can resolve the issue of preserving or destroying the remaining stocks of smallpox. (Oldstone, 1998, 188)
In 1967, the year when the program began, somewhere between 1.5 million and 2 million people died of smallpox. Perhaps half a million more were blinded, and more than 10 million were seriously and permanently disfigured. In the early 1950s the toll had been three or four times greater.
(World Bank, 1993, 17)
In the 1950s, a number of countries had begun their own eradication campaigns so that by the time the global program began, smallpox "had been virtually eradicated in 125 countries." Vaccination had begun in some countries in the early 1800s so that smallpox had already been eradicated in North America and Europe. (Crosby, 1993, 1012)
Even so, the cost of smallpox vaccination, quarantine programs and treatment totaled more than $300 million in 1968 alone. The eradication program, by contrast, cost $300 million over the whole of its twelve-year life and has therefore saved hundreds of millions of dollars a year in direct, measurable costs, as well as unquantifiable amounts of human suffering.
(World Bank, 1993, 17, see also Copp and Zanella, 1993, 262)
F. A. Marglin deconstructs the science from which immunization is derived and finds it guilty of "logocentricism" and "phallologocentrism". (F. A. Marglin, 1990, 102) In India (and elsewhere), smallpox was historically contained by a method called variolation and "only about 1 in 100 persons died even during epidemic phases of the disease." (F. A. Marglin, 1990, 110, editor's underlining) Variolation was intertwined with the worship of Sitala. Marglin argues that variolators with the proper education could have been used as vaccinators. "The aim of vaccinating 80 percent of the population could have been attained if the indigenous system of variolation had not been destroyed." (F. A. Marglin, 1990, 140) This "would have been sufficient to prevent devastating epidemics from occurring." (F. A. Marglin, 1990, 140) However, the disease would have remained and would have continued to kill people but presumably only an acceptable few.
Marglin is clearly wrong in her criticism that the smallpox eradication program failed to use variolators as vaccinators. Everywhere, the program worked closely with governments at all levels, using whatever local resources were available. This was definitely the case in India. Further, in India a smallpox eradication campaign had long been in place prior to the commencement of the global effort and provided an essential foundation for the global eradication program there.
There are more than a few inconvenient problems with F. A. Marglin. To start with, containment means people continue to die from smallpox. There would be no big epidemics but plenty of death and blindness. After all, 20 percent of India's current population would mean 180 to 200 million people at risk from smallpox; if they were a separate country, they would be close to tying Indonesia as the fourth largest in the world. And who would decide who was to be vaccinated and who was not? (For the terror of smallpox in India before its eradication, see Kolenda, 1982)
During the smallpox eradication campaign, many women in south Asia tried to avoid having their children vaccinated. But today, the vast majority of the world's population know the benefits of immunization, and parents throughout the world, even in the most remote and inaccessible areas, expend great effort and endure considerable personal hardship to get their children immunized. Simply stated, mothers who once feared immunization for their children could not help but observe that immunized children did not get sick and die from the diseases for which they were immunized.
What if more than 80 percent of the population wished to be immunized? How do we deconstruct that reality? Or how do we show respect for the wishes of that culture? And epidemics would be "contained" only for populations that were largely immunized. The world's other 5 billion plus people (and growing) would forever after have to be immunized (with a cost in lives lost directly and those lost indirectly because of health resources diverted from other uses) until a variety of smallpox emerged that was resistant to the vaccine. Since the eradication campaign did not use universal immunization but devised an effective strategy of "identification and containment," a residual pool of smallpox in India would have immediately put many hundreds of millions of people around the world at risk of this disease. Smallpox immunity begins to decline after about twenty years, so that the vast majority of the world's population today, including those previously immunized, would theoretically be vulnerable to smallpox infection. All this advocated by our academic Torquemadas in the name of respect for the beliefs of another culture. Fortunately, the Government of India had a more profound understanding of its own people's beliefs and rights, and of the rights of the people of the rest of the world, than Western academics who presume to speak for the Third World population. And as we have noted, India had a smallpox eradication program in place before the global program. But then again, the democratically elected government of India has never experienced "decolonization" of its mind. (F. A. Marglin, 1990, 140)
Paradoxically, the Marglins and their deconstructionist kith and kin are guilty of the same sin of which they accuse modern science: namely, they are convinced of the absolute superiority of their system of belief and analysis. India has long prided itself on having the third largest cadre of scientists and technologists (after the United States and what was then the Soviet Union). One wonders what distinguished Indian scientists and technologists in India and around the world would think of those who view their achievements as being the product of colonized minds. Most would vehemently deny the implicit thesis that modern science and technology are a unique product of Western culture.
Quite possibly our new academic guardians of post-modernism are guilty of what is called "orientalism" (the presumption of outsiders that they can tell a non-Western people what their true values are) and are more ethnocentric than those whom they criticize. Simply stated, they presume to be more authentic spokespersons for India's cultural traditions than hundreds of thousands of its people and its government. In another context, Tomlinson argues that the "temptation is strong for intellectuals who do feel cultural imperialism as a threat to `speak for' the culture by attributing a form of `false consciousness' to the masses who don't." (Tomlinson, 1991, 94) Deconstructionist physicians, heal thyselves! Their sacred academic intellectual constructs are more important than people's lives. We need to create a new category of intellectual: the criminally misinformed.
In addition to the eradication of smallpox, polio has been eliminated in our hemisphere and will join smallpox in total eradication early next century. Other diseases like Guinea worm and Leishmaniasis also have programs for their total eradication. The attempt to eradicate Guinea worm (which infects 40 million each year, primarily in West Africa) is being conducted with a combination of pesticides and water filtration. Ban the production of DDT and other organochlorines and this and other programs to control a number of tropical diseases would be greatly impeded, to the detriment of the peoples of the tropics. Containment of Dengue and Yellow fever would also be adversely affected.
Some have tried to delay the introduction of medication that fights a parasitic nematode (Onchocerca volvulus) that causes excruciating pain and permanent blindness in humans (river blindness) in West Africa because the resulting human excrement may be toxic to dung beetles. (On the cure, see Massey, 1989, 4-7; on the opposition to its use, Crump, 1989, 126-127; on a similar problem and its solution by the introduction of new varieties of dung beetles, see Wall and Strong, 1987; on Ivermectin and medication, see Coe, 1987 and Taylor et al., 1990) Thankfully, the critics were unable to stop the use of Ivermectin. The Onchocerciasis Control Programme run by the World Health Organization, using a combination of pesticides for control and Ivermectin for treatment, has dramatically reduced the scourge of this disease. (Gilbert, 1998) A foundation created by a Houston philanthropist has been actively involved in programs using Ivermectin to control Onchocerciasis. If the other agencies were restricted in their pesticide use, the worthy Ivermectin programs would simply be overwhelmed.
Conclusions: Neither Eden nor Utopia
Technology and science are ubiquitous in modern life. Paradoxically, they are also invisible in many of their most important dimensions, implications and benefits. Half of us are here because of 20th century technology and science, but we don't know which half, and most of us don't know why. The simple glass of water served to you today embodies the history of 19th century chemistry and medicine and all other prior and subsequent knowledge that allows it to become so readily available to us that we take it for granted.
Unfortunately, the consequences of not using technology and science may also be invisible: our actions may bring great harm to others while remaining completely unknown to ourselves.
1) There is no penalty for being a false prophet if the prophecy is one of doomsday; for the doomsayer, being wrong is better than being right.
2) Ideas have consequences. What may seem like a benign idea may cause great harm and suffering to others. If we are to be moral agents as well as advocates, it is our responsibility to be as fully informed as possible about the consequences of our beliefs and actions. By all means be a do-gooder, but be informed about what it is you're doing.
3) If we had followed the policies of the prophets of doom, we would have had the doomsday that they prophesied. Paul Ehrlich deemed the Green Revolution a failure and opposed the export of "death control," the very factors that have sustained the increase in population and have led to the most rapid rate of decline in human fertility in history, a decline which will allow population to stabilize at a sustainable level. The limits-to-growth and appropriate-technology movements would have had us switch from non-renewable resources to renewable ones - in other words, switch from those that are now cheap and abundant to those which are in more critical supply, more polluting and less likely to provide a higher standard of living.
4) Being anti-technology prevents the doomsdayers from understanding the way in which technology and science create resources and how, if our policies are correct - pro technology and science, pro research - resources will be (and have been) created faster than they are being used. The real price of raw material resources has been falling almost continuously for over two hundred years, while the real price of food has been falling for over 100 years. For example, the real price of rice has fallen by half since the beginning of the Green Revolution and within the time frame of the worst doomsday predictions.
5) In order to sustain anti-technology and anti-science beliefs, falsehoods have to be defended against massive, mounting evidence against them. This has given rise to schools of thought in which truth no longer matters. In post-modernism, deconstructionism and the like, language has become a tool to obfuscate (to the user as well as the listener) and not to clarify, fulfilling some of the worst nightmares of George Orwell. How else to explain what might otherwise be a decent, caring individual putting forth the utterly grotesque idea that smallpox should not have been eradicated?
6) Many who have not fallen into the post-modernist morass have succumbed to various romantic illusions about the healthier, ecologically sane, pure lifestyles of other, poorer, less developed peoples. As with other anti-technology belief systems, these can only be sustained by an accumulation of misinformation. In this case, it consists of romantic delusions which are a luxury affordable only to the affluent.
The model of technological and scientific change that is proffered here is not a romantic view with an Edenic past or Utopian future but one in which each new change brings new problems to be solved. Progress occurs simply when the problems created are less severe than the problems solved by the technological and scientific change, giving us greater means to address them. Simply saying that "there are problems" with a new technology is not enough to condemn it without an understanding of the costs and benefits of using the technology and of not using it.
The progress of the process of problem solving requires continuous and vigorous criticism both from within and without. In economics, we have Gresham's law that bad money drives out good. Technophobia and chemophobia act as a kind of Gresham's law in which hysterical criticism has driven more reasoned critics from the public arena and the policy process. One of the greatest sins of the bad news bandits is that they have degraded that fructifying power of difference by which intelligent human beings disagree with one another and in so doing advance knowledge and make our world better and safer for all.
If the record of this century is any guide to the next, then humankind will not only rise to the new challenges that it faces, it will continue to rise above them. The future envisioned here is best described in the words of T. S. Eliot (no friend of modernity):
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.
Through the unknown, remembered gate
When the last of earth left to discover
Is that which was the beginning.
(Eliot, 1952, 145)