Dr. Robert Butler ’49 Keeps Going
In a state of nature . . . no Arts, no Letters,
no Society; and which is worst of all,
continual fear, and danger of violent death;
and the life of man, solitary, poor, nasty,
brutish, and short.
Thomas Hobbes (1588–1679)
Leviathan, Part I, Chap. XIII (1651)
Life through much of human history was indeed brutish and short; humans lived barely long enough to reproduce themselves. For the mere survival of the human race, a proportion of individuals had to live long enough to give birth and rear their young (Figure 1.1). Yet it is sobering to note that, according to archeological estimates, half of all Neanderthals (the archetypal caveman, living one hundred thousand to thirty-five thousand years ago) and Upper Paleolithic Homo sapiens (beginning forty thousand years ago and including Cro-Magnon man) died by the time they were twenty, with only a few living beyond age fifty.
The Cro-Magnon era brought longer life expectancy for some, with the rare individual living beyond age sixty. This fledgling longevity, sporadic as it was, became possible as humans began to work together to create a better standard of living. Although they were less muscular, Cro-Magnons eventually replaced the intellectually outpaced Neanderthals. Like their predecessors, however, the vast majority of Cro-Magnons continued to die at an average age of eighteen to twenty.

Before I continue, it is important to define the terms life expectancy and life span and to distinguish between them. Life expectancy is the average number of years that a person of a given sex can expect to live under specific conditions. Life span is the genetically determined length of life of a particular animal species under the best of environmental circumstances. It probably increased during early hominid development but has not, in all likelihood, increased since that time. Life expectancy is more malleable and depends upon a variety of factors that can change quickly, such as the conquest of diseases. For example, between 1900 and 2000, life expectancy in the United States increased by more than thirty years.
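To make the distinction concrete, here is a minimal sketch, not from the book and using purely illustrative numbers, of how a period life expectancy can be computed as the average number of years lived under hypothetical age-specific death rates; the function name and the flat 2 percent rate are assumptions for illustration only.

```python
# Minimal illustrative sketch (not from the book): period life expectancy
# computed from hypothetical, made-up age-specific death rates.

def life_expectancy(death_rates):
    """Average years lived per person, given death_rates[i] = probability
    of dying during year of age i. Ages beyond the list are ignored."""
    alive = 1.0          # fraction of the cohort still alive
    total_years = 0.0    # person-years lived per original cohort member
    for q in death_rates:
        deaths = alive * q
        # Assume those who die during a year live half of it on average.
        total_years += (alive - deaths) + 0.5 * deaths
        alive -= deaths
    return total_years

# Toy example: a flat 2 percent annual death rate, truncated at age 120.
print(round(life_expectancy([0.02] * 120), 1))  # roughly 45 years
```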
Although early humans had bodily defenses, their immunities were probably specific to local pathogens, so when our forebears traveled they were exposed to new infectious diseases to which they lacked immunity. Illness was largely a mystery to be accounted for by spirits, gods, evil, and retribution, or, as Euripides wrote in Medea, “A throw of chance—and there goes Death bearing off your child into the unknown.” Nature or the gods were blamed for accidents, plagues, pestilence, famine, and even wars.
The prospects for health and survival brightened a bit as hunting, fishing, and food gathering progressed to include the cultivation of plants and the domestication of animals during the Neolithic era. The Fertile Crescent was the site of the first technology essential to public health. Reliable fire-making techniques made cooked food and heated water possible. The development of pottery in the Neolithic period advanced more hygienic and convenient storage of food and water and the disposal of waste and garbage. Permanent population sites became common, and the overall number of human beings greatly increased. Nonetheless, the length of life that any individual could expect changed little, even though many more people were alive. The Bronze Age opened the door to the first real improvements in human longevity through developments such as the manufacture of bronze tools and weapons, the rise of urbanization, the specialization of labor, the exploration and colonization of new territories in search of raw materials, and, undoubtedly, the production of surplus food. These developments provided the social and environmental supports for increases in life expectancy. The Iron Age (around 1200 BC) continued the trend toward permanent settlements, laying the foundation for social organization and advancing agriculture through the use of iron implements.
Imperial Rome at its height brought a relatively high standard of living and health to its more than one million inhabitants. Life expectancy was twenty-five years; however, it is important to note that this figure reflects a very high infant mortality rate. Those who survived beyond childhood had an average life expectancy of forty.2 But by AD 180 Roman culture began its decline, and nothing remotely like it appeared again until the eighteenth and nineteenth centuries in Europe. Scholars believe there was a dip in life expectancy after the decline of the empire.
As humankind moved from prehistory to the early modern era of Western civilization,3 life continued to be fragile. Infant and childhood mortality and the random and frequent deaths of adults—especially of women during childbirth—from infections, disease, and accidents were the norm. These commonplace and expectable deaths were punctuated by the devastations of plagues, failed harvests, famines, and scourges like scurvy and beriberi that seemingly sprang from nowhere. Typically, epidemics broke out as people became so weakened by hunger and malnutrition that they were predisposed to disease.
Although plague was first recorded in Athens in 430 BC, and five thousand died daily in Rome in an epidemic in the third century AD, the most infamous plague occurred in the late Middle Ages. The Black Death, as the bubonic form of the plague was known, began in Constantinople in 1334 and spread throughout Europe from 1348 until 1351. In fewer than twenty years, it is estimated to have killed perhaps three-quarters of the population of Europe and Asia. It was not until Shibasaburo Kitasato and Alexandre Yersin discovered the plague bacterium in 1894 that it became clear that the disease was carried by fleas from rats to humans and proliferated in dirty, garbage-strewn living conditions. Plague continued into the nineteenth century in Europe; the last major epidemic struck India in the early twentieth century, resulting in ten million deaths.
Epidemics continued into the twentieth century. Tuberculosis, known variously as consumption or the white plague, was particularly rampant and deadly. As late as 1930, eighty-eight thousand people in the United States died of tuberculosis. Although Robert Koch, a German physician and bacteriologist, discovered the tubercle bacillus in 1882, the disease was not controlled until the 1940s when streptomycin and other drugs became available.
Many scholars consider smallpox to have been more significant in its effect on populations and political developments than even the Black Death, because it struck all classes of society. Edward Jenner, an English physician, demonstrated how it could be prevented by a vaccination with cowpox virus. His discovery laid the foundation for the sciences of modern immunology and virology as well as the eventual elimination of smallpox. The last outbreak occurred in the 1970s in Somalia, where it was quickly suppressed. In May 1980 the World Health Organization officially declared its global eradication, marking perhaps the world’s greatest public health achievement. Smallpox, a scourge throughout history, is now extinct in nature.
Polio could be the second disease of epidemic proportions to become extinct worldwide. As with smallpox, humans are its only known natural host. During its peak, from 1943 to 1956, polio infected some four hundred thousand Americans and killed about twenty-two thousand. The disease began to decline rapidly in the United States in 1955, after a mass immunization program with the Salk vaccine, followed by the Sabin vaccine in 1961.
Influenza, an old enemy, made its most spectacular showing between 1917 and 1919, killing at least twenty-one million worldwide and infecting half the world’s population. Twice as many people perished in a few months’ time as had been killed in World War I, with people dying faster than from any other disease. Half a million died in the United States. Eventually, the pandemic ended, probably because most survivors developed antibodies, producing what might be called a herd immunity and leaving few people for the virus to attack. In assessing the impact of disease over the centuries, or even over the first half of the twentieth century, it is important to remember that death was not the only consequence. Permanent disability was widespread in children who survived the so-called children’s diseases, such as whooping cough and German measles, and disability hastened death.
In addition to the physical trauma, one can only imagine the emotional anxiety and fear that families felt, especially for their children, when diseases of mysterious origin, for which no treatment was known, struck at random or with chilling predictability during epidemics. The life of a child was precarious; that of an adult only somewhat less so. Nonetheless, the Industrial Revolution and the wealth it generated brought significant increases in longevity.4 Beginning in the middle of the eighteenth century in England, Europe was transformed from a rural, agricultural, and handicraft economy to one dominated by mass production of manufactured goods, improved agriculture, and wider distribution. It became possible to feed a much larger population, many of whom were now working in urban areas. The significance of this transformation cannot be overstated. Robert Fogel, economist and Nobel Prize winner, estimates that prior to the Industrial Revolution in France and England, about one-fifth of working-class people had a calorie intake that was inadequate to sustain them and that during the eighteenth and even the nineteenth century there was widespread and chronic malnutrition.5 Fogel notes that the increase in longevity and stature6 over the past two hundred years was due to the availability of more food, and he introduced the concept of "technophysio evolution" to describe these changes.7
In response to epidemics of yellow fever, cholera, smallpox, typhoid, and typhus, communities began to recognize the benefits of organized efforts to address health issues. Predating the germ theory of disease, social reformers, motivated by moral concern, contributed critically to public health measures.8 In 1866, New York created the first state health department, with local boards in each town mandated to monitor serious health problems and attend to unsanitary living conditions. Other states followed, and organized public health efforts began. Another element that contributed to the Longevity Revolution really began in the bedrooms of Europe (first, notably, in France) in the nineteenth century, when couples started to limit the number of children they conceived by purposefully abstaining from sexual intercourse in order to save women from dying in childbirth. The resulting decline in birthrates produced two very important changes: the proportion of older persons relative to other age groups in the population increased, and the longer intervals between births, along with fewer pregnancies per woman, contributed to the improved health and survival of both infants and mothers.
Meanwhile, during the nineteenth century, another major element of the Longevity Revolution was taking shape, namely a revolution in medical science. In 1846, when the general structure of the human body was almost fully known, the American dentist William Morton opened the way to the field of modern surgery by using ether as a general anesthetic. At the same time, it was becoming clear that specific organisms caused infections. John Snow, a physician, unraveled the basis of contagion when he demonstrated that contaminated water from the Broad Street pump was the cause of the 1854 London cholera outbreak. Subsequent improvements in water supply and sewage systems reduced both water-borne and food-borne diseases. The real breakthrough came in the late nineteenth century when Louis Pasteur and Robert Koch developed and demonstrated the germ theory of disease, beginning a dramatic decline in death rates. With this discovery, health professionals and the general public finally understood how some diseases and infections were communicated. The work of Pasteur, a French chemist, led directly to pasteurization and the protection of millions of children and adults from disease transmitted through milk. At about the same time, Koch developed ingenious techniques for the study of bacteria that are still in use today. He established criteria, referred to as Koch's postulates, for proving the bacterial cause of a disease. In the process he discovered the microorganism causing tuberculosis as well as those causing wound infections and Asiatic cholera.
By the end of the nineteenth century, proof of the germ theory of disease began to transform medical care and hospital practices. Decades earlier, the Hungarian physician Ignaz Semmelweis was driven to insanity and suicide after his peers ridiculed his pioneering belief that midwives and other medical personnel delivering babies should thoroughly wash their hands and wear clean clothes to prevent "childbirth fever" (puerperal fever),9 a major cause of death among women giving birth. Skeptics were unconvinced even by the evidence of greatly reduced deaths from infection in the Viennese hospital where Semmelweis worked.
In 1865, the very year of Semmelweis's death, the English surgeon Joseph Lister demonstrated that heat sterilization of surgical instruments and the use of antiseptic agents on wounds could dramatically reduce infection. Cleanliness during childbirth and in medical care in general was adopted by the 1890s, unfortunately too late for Semmelweis to know that he had been vindicated.
The developing science of endocrinology, the study of the body’s hormonal system, also brought dramatic changes, exemplified by the important discovery of insulin by Frederick Banting and Charles Best, both Canadians, in 1921. Practically overnight, diabetics were saved from almost certain death and given the prospect of reasonably long and healthy lives.
Other drug discoveries led to seemingly miraculous cures, and Alexander Fleming’s discovery of penicillin in 1928 was perhaps the most dramatic of all. Prior to penicillin, even minor cuts and bruises could have dire consequences. Many Americans past the age of sixty still clearly recall the days of childhood ear infections and other miseries that quickly became curable when penicillin came into general use in the 1940s. Young military personnel in World War II were among the first to benefit from its lifesaving effects.
The Industrial Revolution changed the world and touched nearly every aspect of life, from the social and cultural to the political, economic, and ecological, much of it without precedent. Wealth became more widely distributed and contributed to the growth of the middle classes, the famous bourgeoisie.11 Political power reflected the shift in wealth and the needs of an industrializing society. Cities grew and workers organized, spawning labor movements and universal public education. The economic foundation for the modern Western welfare state was laid, and it, in turn, contributed to longevity. It is sheer foolishness to imagine that we can extend life or sustain complex modern societies without substantial governmental participation. Many European countries realized the necessity of developing protections as well as social programs and services for the workers who were crowding urban areas. Laissez-faire capitalism slowly gave way to welfare capitalism. Workers' compensation and safety measures were promoted in Germany, Austria, and Great Britain in the late 1880s, and by 1920 most American states had passed some form of relevant legislation. Unemployment benefits were made available in Europe in the latter part of the nineteenth and the first part of the twentieth century but were not legislated in the United States until the Social Security Act of 1935.
THE MODERN WELFARE STATE
In many countries the underlying purpose of social programs is not to be humane or altruistic. It is understood that death, accidents, and disease and their timing are unpredictable. Moreover, it is implied, if not always directly stated, that social supports and a floor of security for all are necessary to produce a healthy, educated, and productive workforce capable of maintaining economic productivity and buying power, and of making any necessary adjustments to changing economic conditions and technology. And as Otto von Bismarck so shrewdly recognized in the late nineteenth century, they protect against civil unrest and are potent vote-getters in democratic societies. Eventually, this long-term and essentially enlightened self-interest may penetrate the general American consciousness.
The success of the post-1945 European welfare states was due to social-democratic principles and policies and, interestingly, to the success of capitalism. Although socialism and its electoral successes helped regulate capitalism humanely,12 it was the prosperity that followed World War II that made the implementation of social protections possible. Thus, although it was the strength of capitalism, not its failure, that brought us the welfare state, its failure could bring down the infrastructure of social support, as unemployment and dwindling prosperity put the taxation base of social protections at risk, ultimately jeopardizing our increased longevity.
At present, the industrialized world appears to be moving toward a five-legged stool of support for social protections. First, much more is expected of individuals, who are being called upon to measurably alter their health habits and to save prudently for the future. Second, the family is expected to take on more caregiving responsibilities, at the same time as some nations have instituted pro-family policies that provide family assistance and respite programs. Third, in the United States civil society continues to play its important role and to grow, with philanthropy and volunteerism gaining more support from the community at large. Fourth, despite its continuing resistance, business, along with labor, continues to provide some social protections. Finally, the government itself is endeavoring to reduce its vulnerability in times of social and individual financial crises, while retaining at least some measure of responsibility for the health and well-being of the people.
The United States was the twenty-eighth country to adopt a social security system,14 specifically the beginnings of the guarantee of income maintenance in old age. But it was not until 1939 that the United States moved toward a family-oriented, life-course social protection system that eventually included disabled workers and survivors of deceased workers.
THE TWENTIETH AND TWENTY-FIRST CENTURIES
By the mid-twentieth century, the Longevity Revolution spawned what could be called the new longevity (Tables 1.1 and 1.2). The likelihood of living from age sixty-five to age ninety tripled between 1940 and 1980: in 1940, only 7 percent of sixty-five-year-old Americans could expect to reach ninety; by 1980 that percentage had risen to 24 percent. People over eighty are usually referred to as the fastest-growing age group of significant size, growing at 3.8 percent annually. In fact, centenarians are the most rapidly increasing age group. There are some seventy-two thousand centenarians today, and one Census Bureau projection predicts nearly one million by the middle of the twenty-first century! And we're not only living longer, we're living better. There has been a 60 percent drop in deaths from cardiovascular disease and stroke since 1950, as well as significant decreases in disability rates.15 From 2000 to 2001 alone there was an overall decline in deaths of 1.7 percent, with a decline of 4.9 percent in deaths from stroke and 3.8 percent in deaths from heart disease.
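As a rough check on these figures, here is a back-of-the-envelope sketch, not a calculation from the book, which assumes roughly forty-five years from today's seventy-two thousand centenarians to the mid-century projection; it shows the compound annual growth rate such a projection implies and the doubling time of a group growing at 3.8 percent per year.

```python
# Back-of-the-envelope sketch (illustrative assumptions, not from the book):
# what annual growth rate takes ~72,000 centenarians to ~1,000,000 by
# mid-century, and how fast does a group growing 3.8% per year double?
import math

start, target, years = 72_000, 1_000_000, 45  # assumed ~45 years to mid-century
implied_rate = (target / start) ** (1 / years) - 1
print(f"Implied annual growth: {implied_rate:.1%}")            # about 6%

annual_growth = 0.038
doubling_time = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time at 3.8%/yr: {doubling_time:.0f} years")  # about 19 years
```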
In the 1990s, the United States was in a “population lull,” with a relatively small number of persons in their fifties and early sixties waiting to move into old age. This was because older women had relatively few children.16 Influenced by the difficult years of the Great Depression in the 1930s, these older persons limited the number of children they produced—20 percent of women now over seventy-five had had no babies, and 20 percent had only one. But then came the baby boom generation, born after World War II. Now middle-aged, this generation will soon begin reaching old age, conventionally defined as beginning at sixty-five. (There is, however, nothing magical or scientific about this or any other number in defining old age.)
In the twentieth century we were offered realistic opportunities for health promotion and disease prevention through public health measures, healthy lifestyles, education, rising wealth, and workplace regulation, in addition to the application of new knowledge, such as the understanding of hypertension and atherosclerosis. The advent of possible means to delay aging and extend longevity, together with the growing encouragement of health promotion and disease prevention, offers a strategy that could be adopted by individuals and by society in the twenty-first century. Progress made thus far creates rising expectations of still more profound advances in medical technology and in life expectancy. Coronary bypass grafts, balloon angioplasty with stents, drugs that lower blood pressure and cholesterol, knee and hip replacements, and cataract extractions with lens implants simply whet the appetite. Indeed, some believe that humans can master their evolution. Among them is Aubrey de Grey of Cambridge University, who suggests a life expectancy of five thousand years by 2100.17 The philosopher John Harris of Oxford views extraordinary longevity from another perspective when he considers the possibility of immortality and its consequences for humankind.18
A NEW SENSE OF THE LIFE CYCLE
For the first time in recorded history we are beginning to see the entire life cycle unfolding for a majority of the population in developed nations. Infancy, childhood, adolescence, early adulthood, middle age, and old age have become expectable stages in the lives of nearly all. The special characteristics, responsibilities, and needs of each stage, as well as the transitions that carry individuals from one stage to the next, are revealing themselves. Names for these stages have evolved. The term "adolescence" was introduced in 1904 by G. Stanley Hall, "teenager" appeared in the United States in the 1920s, and "youth" was applied as a special appellation in the 1960s by Kenneth Keniston. "Empty nest" was coined in the 1960s to refer to the period somewhere in parents' midlife when children grow up and leave home. Also from the 1960s, the term "prime of life generation" was applied to those fifty to sixty-four. "Young-old" and "old-old," terms originally intended by gerontologist Bernice Neugarten to differentiate between healthy and less well-functioning older persons, gradually evolved in common usage to mean persons sixty-five to seventy-five (the young-old) and seventy-five and above (the old-old).
We are beginning to get a grasp of the nature and complexities of old age itself. It is already clear that it is not a fixed and unchangeable condition. Nor are older people a homogeneous group. In fact, there is increasing variability among people as they grow older. Children are much more like one another than are “the elderly.” This tremendous variability among the old is a function of the combination of genetics, health habits, health care, life styles, personality, personal history, occupation, chance, and, in some measure, luck or the lack thereof.
The concept of old age itself is undergoing constant redefinition; it is not the same as it was, nor will it remain what it is now. As an example, the last third of life has typically been equated with decline and illness. But over the past few decades the aging population has increasingly been represented by vigorous, robust older people. There has been growth in "active" life expectancy, accompanied by a reduction in disability. To be sure, this is not yet universal, but it is a portent of the future as we now see it. Already, over half of the "oldest old," the eighty-five-plus group, report no significant physical disability whatsoever. They can go about their everyday activities without any personal assistance.19 Illness and disability rates among older people declined by 5 percent between 1982 and 1989, according to Manton et al. (1993), who used data from the National Long Term Care Survey.20
There has been a modest decline in the life expectancy differential between men and women. Two new factors have become influential: men's survival rates are increasing somewhat more rapidly than women's, probably reflecting a decrease in deaths from heart disease, and lung cancer rates are increasing faster for women, reflecting the greater numbers of women who began to smoke some thirty to forty years earlier. However, around the globe, women still live longer than men except in ten countries, including Pakistan and Bangladesh, owing in part to female infanticide. (See Table 1.3.)
Gender differences in life expectancy disappear at age 105, at which point both men and women have an equal chance of living on. The apparent reason is that those sturdy men who managed to reach 105 represent the fittest (and perhaps the luckiest) of their sex and thereby have outlived the disadvantages of being male. Eighty-five percent of centenarians are women, but the men in that group are in better shape physically and cognitively.
THE IMPACT OF RACE AND CLASS ON LONGEVITY
Social class (measured by income and education) is a major factor in how long we can expect to live. Urban, middle-class, salaried individuals have the longest life expectancy. Many more whites fall into this category than do persons of minority ethnic and racial backgrounds. Differences in life expectancy between blacks and whites narrowed from 7.6 years in 1970 to 5.6 years in 1983 and 1984. Since that time the difference has begun to widen again. In 1984, black life expectancy began declining steadily for the first time in eighty years. The gap in life expectancy remains significant across racial lines. In 2003, life expectancy at birth was 72.2 years for blacks (males and females combined), compared with 77.7 years for whites.
Black men, on average, will not live to retirement. The mortality rate of a black man in Harlem is currently higher than that of a man in Bangladesh, one of the world’s poorest nations. In Bangladesh, 55 percent of men live to age sixty-five, while in Harlem, only 40 percent do. The main causes of such “excess mortality” (a statistical term that hardly conveys the human dimension of the situation) are cardiovascular disease, cirrhosis of the liver, homicide, tumors, and drug dependency. Violence (a homicide rate six times greater among blacks than whites), substance abuse, AIDS, and poor health care underlie this shocking state of affairs.
Overall, blacks are disproportionately represented among the 47 million Americans who do not have health insurance coverage and who therefore often do not receive health care until they develop some acute problem that brings them to an emergency room. This means that much injury and disease that could be treated easily in early stages becomes full blown before treatment is given. One study reported in the International Journal of Epidemiology found that blacks, who make up only 13 percent of the population, account for nearly 80 percent of what are considered "premature deaths" in the United States. In that study, premature deaths were defined as deaths occurring between the ages of fifteen and forty-four from disorders that are normally not fatal if treated early, such as appendicitis, asthma, bladder infections, and pneumonia.
THE LONGEVITY REVOLUTION IS WORLDWIDE
Growing longevity is a global geopolitical force. Japan took only twenty-four years to become an "aged society" (defined by the UN as one in which more than 14 percent of the population is over sixty), a faster pace than in either Europe or the United States. For instance, it took Germany forty-five years and France 130 years to accomplish what Japan achieved.
As developing countries evolve, they will experience population shifts unique in the history of national development. In the past, countries have typically seen their first great gains in life expectancy come from reductions in infant, child, and maternal mortality, as well as from the control of infection and disease in the general population. For example, water purification and vaccination of children have been inexpensive and effective. And with the recent gains in prevention, treatment, and control of the illnesses of older people, the developing world will witness the increased survival of at least a proportion of the old who have access to such care at the same time that survival among the young is improving. We anticipate seeing simultaneous survival booms among the young and the old. By 2025, an estimated 80 percent of all persons over sixty-five will be living in the developing world.
With the Longevity Revolution, the world enters a new and unprecedented stage of human development—the impact of which has been made greater because of its rapidity. We are no longer limited to a life view that must accommodate itself to the historic brevity of life, to random and premature illness and death, as Thomas Hobbes described it. The Longevity Revolution is a great intellectual and social as well as medical achievement and an opportunity that demands changes in outmoded mind-sets, attitudes, and socioeconomic arrangements.
Many of our economic, political, ethical, health, and other institutions, such as education and work life, have been rendered obsolete by the added years of life for so many citizens. The social construct of old age, even the inner life and the activities of older persons, is now subject to a positive revision.
As Thomas Jefferson said, “Our laws and institutions must move forward hand in hand with the progress of the human mind.” Just as societies have not yet adapted fully to the Industrial Revolution, with its pollution, unemployment, displacement, and other unsolved social and environmental issues, so too we have not yet fully adapted to the Longevity Revolution. But the revolution is fully launched. Adjustments are well under way. What will the twenty-first century bring?
From the book The Longevity Revolution by Robert Butler M.D. Excerpted by arrangement with PublicAffairs (www.publicaffairs.com), a member of the Perseus Books Group. Copyright © 2008.