Originally published in Hospitals & Health Networks OnLine, June 2, 2011
The debate over providing health care for an aging population is at a tipping point. Where do we go from here?
|"Do not go gentle into that good night,
Old age should burn and rage at close of day;
Rage, rage against the dying of the light."
|— Dylan Thomas, on his father's dying|
In the past couple of years, one could almost conclude that it's open season on older Americans. During the health care reform debate, many opponents of the legislation seemed to delight in terrorizing Medicare beneficiaries by warning that if the bill passed, they would lose their coverage (this came largely from Medicare Advantage plans), or their care would be "rationed" by the panel established by the law to rein in future Medicare spending, or they would simply be dispatched by some mythical "death panel." This last accusation stemmed from a simple (and, to my mind, completely appropriate) proposal that physicians be paid the princely sum of $50 to discuss advance directives with patients, which most physicians do anyway for free. How that got twisted into a purported grand plan to kill Granny is beyond me. It was removed from the bill, in any case.
At about the same time, in 2009, Ezekiel Emanuel, M.D., who until recently was advising the Obama administration on ethics issues, published, along with two colleagues, a proposal that scarce health care resources be allocated on the basis of a "complete lives" system, which would give priority to younger patients, because older people had lived longer (Persad, G., Wertheimer, A., and Emanuel, E. "Principles for allocation of scarce medical interventions." The Lancet, Jan. 31, 2009, pp. 423-431).
This followed Emanuel's and Wertheimer's controversial 2006 proposal that, in the event of an influenza pandemic and an insufficient supply of vaccine, priority should be given to people aged 13 to 40 (Emanuel, E., and Wertheimer, A. "Who should get influenza vaccine when not all can?" Science, May 12, 2006, pp. 854-855). Not surprisingly, critics dubbed Emanuel "Obama's rationer-in-chief."
And earlier this year, it was reported that the United Network for Organ Sharing, the nation's coordinator of distribution for most donated organs, was considering giving priority for kidney transplants to younger, healthier patients, in order to maximize the number of years each transplanted kidney would last. This would be a switch from the current practice of allocating kidneys largely on the basis of who had waited longest for a transplant. The proposal divided the bioethics community, with some ethicists endorsing the idea and others calling it age discrimination. A decision is likely forthcoming next year.
We seem to have come a long way from the wave of reverence for "the greatest generation." Now we instead hear rumblings about old folks living longer than they should and spending the boomers' expected inheritances on themselves.
None of this is new, of course. In 1987, renowned ethicist Daniel Callahan, Ph.D., published a book entitled Setting Limits: Medical Goals in an Aging Society (Simon & Schuster), in which he proposed that beyond a given age, people should not receive curative treatment, but only palliative care. He reasoned that by a certain age — at the time, he referred to patients in their 70s — most people have lived out their "biographical" lives, and extending their "biological" lives through massive expenditures was not justifiable.
His idea provoked a firestorm. He ended up defending it on television talk shows and at conferences, and over the course of time I noticed two things: He began to raise the age at which the proposal should kick in, and he said he was only talking about public expenditures (Medicare and Medicaid). People with means of their own, he said, should be able to purchase whatever they wanted. That didn't sit too well, either.
In 1990, after acknowledging that his age-based idea was wildly unpopular, Callahan issued a more elegant proposal in his book What Kind of Life: The Limits of Medical Progress (Simon & Schuster). He envisioned a health care system that saw itself as part of a larger, "good" society, a system that "would be less prone to think only of its own needs, or to forget that health is only a means to the living of a life, not its goal." He asked for "a system that guaranteed a minimally decent level of health care for all, in turn asking each of us to rein in our private demands." If both the health care system and individuals could be a little less greedy, he argued, there would be enough for everyone.
(Interestingly, in The New Republic on May 19, 2011, Callahan and Sherwin Nuland published an article entitled "The Quagmire," in which they proposed that people over the age of 80 should have the lowest priority for curative medical treatment. Both Callahan and Nuland are 80 years old.)
So this is not a new debate. But both politics and demographics have given it something of a new urgency. Although data from the 2010 U.S. Census have not all been released, the trends are clear: By 2050, one in five Americans will be over 65. In 2009, 12.5 percent of us were in that category, and 5.8 percent of us were over 75. Those older than 85, of whom there were perhaps 100,000 in 1900, will number 18.9 million by 2050. The fastest-growing group of Americans, by age, is people over 100, of whom there could be nearly 1 million by 2050. Meanwhile, the ratio of working-age Americans to older, non-working ones continues to shrink, from 14:1 in 1900 to a projected 3:1 in 2050, raising concerns about how public services for seniors will be funded.
There are many things to worry about here, including the fact that 10 percent of all people who are 65 or older live in poverty (the figures are much higher for minorities, women and those living alone); the disproportionate number of vulnerable elderly women who are living alone due to divorce, widowhood or never having married; and the effect of the financial crash on fragile retirement funds.
But the chief issue for those involved in the debate over what to do about Granny is the cost of her health care. It will surprise no one that the average health care expenditure for a person over 64 is nearly six times that for a child under the age of 5. It is also now axiomatic that 5 percent of the population accounts for 49 percent of all health care expenses — and of that group, 61 percent are over the age of 55. Medicare is predicted to go broke in 13 years, and the average 65-year-old can expect to spend $197,000 out of his or her own pocket on health care for the remainder of his or her life (that's excluding nursing home care). A recent Dartmouth Atlas Project research report found that although chronically ill Medicare beneficiaries spend less time in the hospital than previously, they receive more extensive and aggressive treatment when they are hospitalized.
Many, many years ago, when I was a book editor at the American Hospital Association, one of my projects was Estimated Useful Lives of Depreciable Hospital Assets (available in a recently revised edition from Health Forum). Moby Dick it isn't — not that compelling a read — but it is a classic and necessary resource for health care financial managers. Part of the publisher's description of the current edition reads, "Every capital asset has a limited useful service life, which must be determined and recorded to practically manage the capital planning process as well as meet government regulations. This updated 2008 edition defines the productive period of time [that] typical health care capital assets have before they become technically or commercially obsolete."
As I watched the debate over the Patient Protection and Affordable Care Act get increasingly ugly, and as I watch the current Congressional battle over what to do about Medicare, I have thought often about that book and its thesis. Should we consider human beings depreciable? If people are unproductive, or not productive enough, or are not contributing to the GDP, should we just cut them off? Can't human beings become "technically or commercially obsolete"? A diabetic 85-year-old woman with congestive heart failure is a drain on the system; why should we keep expending resources on her? Put her in hospice, and that's that. She's taking money away from our grandchildren!
There is precedent, after all. Although Germany, at the beginning of the 20th century, had some of the best psychiatric hospitals in the world, many patients with mental disorders were placed at the bottom of World War I rationing priority lists for food and medical care, and as a result starved to death or died of treatable disease.
The Nazis continued this pattern; beginning in 1939, there was widespread murder of psychiatric patients. By the end of the war, the number of surviving patients in psychiatric hospitals in Berlin alone was 25 percent of the total a few years before. The term "useless eaters" emerged, used to condemn persons with disabilities, the mentally ill, children with developmental disabilities or congenital deformities, and amputee veterans of World War I. Thousands were killed directly or starved to death.
Theirs were not considered to be "useful lives." Indeed, in 1920, German intellectuals Karl Binding and Alfred Hoche published a book in which they promoted the idea of "lives not worthy of life," a notion eagerly embraced by the Third Reich little more than a decade later.
"We would never do that in this country!" we insist. Yet we allow many low-income, Medicaid-dependent, very ill seniors to languish in substandard nursing homes or to receive home care from questionable agencies. A physician friend of mine, whose practice encompassed large numbers of older, chronically ill African-American women, was told to get lost by the medical center where he worked, because his patient population wasn't lucrative enough.
Another friend of mine, watching her mother die slowly and horribly from cancer, confided one day that putting her sick dog down had seemed so humane, and she wondered why we couldn't do it with people. Yet another friend, who is the primary caregiver for a chronically ill, profoundly abusive father who returns her kindness with insults and taunts, admitted to me that sometimes she hopes he would just die. A man I know, who was in his 60s at the time, had to put his demented father into a nursing home at the same time that he was nursing his wife, who had Alzheimer's. Exhausted and depressed, he looked at me one day and said, "People live too long."
Maybe some people do, and quality of life may not be exactly wonderful for more than a few elderly people. There's poverty, loneliness, chronic and acute disease, the endless string of medical appointments, and the fear of disability and dementia.
But, as was argued during the great debate over the state of Oregon's Medicaid rationing program in the early 1990s (which originally ranked disabled people as a low priority), many people with disabilities, and even some with dementia, don't think their quality of life is all that terrible. When my friend whose wife had Alzheimer's finally had to institutionalize her, she had a blast — dancing with fellow patients and attending social events at the nursing home. She even came to one of my birthday parties. I looked at her husband and said, "She hates coming to my parties." He smiled and said, "She forgot that she does."
And the wife of a colleague of mine, struck by an extremely rare and difficult-to-treat cancer, and suffering mightily from the results of radiation and chemotherapy, finally said, "The heck with it," and opted for palliation and living her life as fully as she could. She is amazing everyone — playing 18 holes of golf, spending time with her grandchildren and living actively. She's in pain, she tires easily, and she has a host of frustrating problems, but I doubt she believes that her life is useless.
The Pulitzer Prize-winning microbiologist René Dubos wrote in The Mirage of Health in 1959, "Complete and lasting freedom from disease is but a dream remembered from imaginings of a Garden of Eden designed for the welfare of man." He went on to warn us that "It is legitimate to hope that with enough regulations, restrictions, and injections, man could achieve effective control over most of his fatal diseases. But too often the goal of the planners is a universal gray state of health corresponding to absence of disease rather than to a positive attribute conducive to joyful and creative living."
Dubos, Callahan and others are making the same point: Quality of life is what matters — and the quality of a life is usually best judged by the person whose life it is. And sometimes the health care system, far from helping, makes things worse with its laser-like focus on cure, and its all-too-common abandonment of patients who aren't going to get better, especially if they are older. Unfortunately, that abandonment sometimes happens only after every last possible personal and public dollar has been extracted. Providers sometimes lose sight of the fact that being elderly is not a medical condition that can be addressed only by clinical means.
So what is at the root of all this fighting over the social value of older people? Is it really "saving" Medicare or protecting our grandchildren from those greedy elders? I don't think so. I think it's common, everyday fear. Fear of being old in the country of the young, where billions of dollars are spent by aging people seeking to look younger than they are. Fear of being that demented old lady who hounds you on the street or the subway. Fear of being in a wheelchair or forced to use a walker or becoming homebound. Fear of being alone and vulnerable. Fear of being a "senior," despite all the nice discounts.
And yes, there is a fear of being useless, of not being needed. I remember visiting a religious congregation in the Midwest where every retired sister, no matter how old or frail, had a job. Among those who had interesting things to do were the quilting sisters, who made beautiful quilts that were sold to help support the congregation, and the praying sisters, who would pray for those who had requested help. Did that make them "useful" enough? Were they contributing enough to justify their existence?
"Take away a person's job and you take away his identity," my friend John says.
Most of us will have to live with a chronic condition at some time in our lives. And most of us, if we're lucky, will grow old. And all of the Touch of Gray and cosmetic surgery and workouts at the gym won't change that basic equation. They might make us feel better, and taking care of our bodies and minds will probably keep us healthier longer, but they won't make us young again.
I also think it should be pointed out that most elderly people live productive, contributing lives; they just might not be making anybody rich. That doesn't hit me as a very appealing standard for a meaningful life. My friend Edie, who just turned 92, is still a popular, happy member of the folk music scene in the town where she lives. B.B. King, the Energizer bunny of American blues, is still active and performing at 86, despite being diabetic. As Suzy Farren just described in her excellent article in the May-June issue of Health Progress, Mother Marianne Cope, the heroine of Molokai who worked with Saint Damien to care for exiled Hawaiians with Hansen's disease, carried on Damien's work for 29 years after his death, and was still at it when she died at the age of 80 (Farren, S. "The sisters knew — a child needs a home." Health Progress, May-June 2011, pp. 15-19).
Duke Ellington was still active when he died at 75. Albert Einstein was still pursuing his unified field theory at 71. Mother Teresa was pursuing her mission with the poor until her death at 87. My father, a pathologist, was working until a few days before his death at 90. The same was true of the great labor and health care economist Eli Ginzberg, who died at 91. And Sally Gordon, at 101, is still assistant sergeant at arms for the Nebraska legislature. "I used to be a model," she told a reporter. "Now I feel like a Model T." She says she has no plans to retire.
Most of us may not be so lucky; we may not spend our lives with our bodies and faculties intact, and we may not be able to do what we were once able to accomplish. But that doesn't mean that those who have lived long have nothing to give, or that their lives are not important, even if they have significant health problems.
Albert Sabin, who developed the oral polio vaccine, was, ironically, struck by paralysis in his 70s. Although he admitted that he had prayed for death because of the terrible pain, he recovered and, at a birthday party, offered a simple and heartfelt toast: "May we all be healthy when we die."
Let's hope we are. But if we aren't, I don't think we should be serving as political footballs for people who are unwilling to do what is really needed: promulgation of proper standards for care of older, sicker patients; affordable hospice and palliative care available to all; wider use of advance directives ("death panels" or not); the end of unnecessary and/or overly aggressive care of people at the end of their lives (and that includes defrauding of public and private insurers); treatment of depression, which is almost epidemic among older people; a true continuum of care that is appropriate, sensitive, and includes both clinical and social needs; far more support for the family and friend caregivers who are saving this society's bacon through their love and sacrifice; promotion of health-enhancing lifestyles at all ages; and, as Daniel Callahan has also proposed, a refusal to abandon those who cannot be cured.
I would add that respect for wisdom and experience, true reaching-out between generations and an understanding that maybe the old farts have learned a thing or two along the way would also work wonders. We spend so much time reinventing the wheel — in policymaking, in economic thought, in designing all forms of infrastructure — and it's such a waste of time and money. One of the reasons I don't like term limits for elected officials — and please remember that I live in one of the most politically corrupt cities and states in the country — is that term limits force us to lose the institutional memory of veteran policymakers, who can inform the political process. Maybe asking people who have been down the road before could be — pardon the term — more productive.
As genetic and genomic research proceeds, it is quite likely that within a relatively short time, we will be able to predict, at birth, what a person's health will be like, whether he or she will have chronic disease, and what he or she is likely to die from. Just last month, it was announced that with a relatively simple set of screening questions, parents can help identify early signs of autism in their infant. We are on the verge of being able to predict accurately who will come down with Alzheimer's.
So shall we just kill those who aren't likely to measure up at birth, or shall we try to create a society and a health care system that welcome all and do their best to ensure that all lives are meaningful, especially as judged by the people who are living them?
Copyright © 2011 by Emily Friedman. All rights reserved.
Emily Friedman is an independent writer, speaker and health policy and ethics analyst based in Chicago. She is also a regular contributor to H&HN Daily and a member of the Center for Healthcare Governance's Speakers Express service.
The opinions expressed by authors do not necessarily reflect the policy of Health Forum Inc. or the American Hospital Association.