Friday, September 19, 2008

What's Race Got to Do With It?

A new PBS documentary series explores the complicated and controversial connection between social inequality and health.

Raina Kelley

Newsweek Web Exclusive

Updated: 5:02 PM ET Apr 11, 2008

We're all the products of our environment and our genes. But when it comes to health, which factor is the trump card? Would a woman with a family propensity for ovarian cancer avoid coming down with the disease if she were raised on a macrobiotic diet in pollution-free rural North Dakota? Or on the flip side, could a white woman adopted from a middle-class family in Idaho into a poor Hispanic family in New York suddenly become vulnerable to diabetes or asthma?

Figuring out how the interplay of race, socioeconomic status, schooling and other environmental factors influences our health is a complicated challenge. But that's what a new four-hour PBS series, "Unnatural Causes: Is Inequality Making Us Sick?" aims to do. The series premiered March 27 and ends April 17 and will soon be released on DVD. Producers Larry Edelman and Llew Smith say it was inspired by a medical mystery they discovered in their earlier documentary, "Race—The Power of an Illusion," a series that investigated some common myths and misconceptions about race. They found, for example, that African-Americans have some of the highest rates of hypertension in the world, a condition that has been linked to heredity. But West Africans, who share many of the same genes as African-Americans, have some of the lowest blood-pressure rates.

In "Unnatural Causes," the filmmakers raise other disturbing conundrums. For example, the United States spends more than twice the average of other industrialized countries on health care and yet does worse than 28 other countries in life expectancy and 29 other countries in infant mortality. Why are children living in poverty about seven times as likely to be in poor or fair health as children living in high-income households? And why has HIV infection doubled in the African-American population but remained stable among whites? According to the film, the answers are at least in part related to economic conditions: "a continuous health gradient tied to wealth. At each step down the socioeconomic ladder—from the rich to the middle class to the poor—people tend to be sicker and die sooner," the narrator says.

But that's not the whole story. African-Americans, Native Americans and Pacific Islanders at all income levels fare worse than their white counterparts. To the filmmakers, that seemed to indicate that institutional racism as well as poverty has an impact on an individual's health—an interesting idea in itself, and a refreshing relief from the patronizing assumption that "some people" don't know how to take care of themselves or don't make the effort. Smith says the film reveals a more complicated picture: "When we looked at populations and entire communities, you begin to see that there are larger forces at work beyond what an individual can control. That led us to the growing research focused on social determinants of health and health equity." The filmmakers offer plenty of background from experts like Angela Glover Blackwell, founder and CEO of PolicyLink, an advocacy organization for economic and social equity, and Ana Diez-Roux, M.D., a University of Michigan epidemiologist specializing in neighborhoods. But the film's power comes not from experts or statistics but stories of real people.

The first segment of the series, "In Sickness and in Wealth," looks at the lives of a CEO, a lab supervisor, a janitor and an unemployed mother to find out how class determines our access to health care. In "When the Bough Breaks," the filmmakers tell the story of Kim Anderson, a successful Atlanta attorney who, despite being healthy and well above the poverty level, delivers a dangerously premature daughter—a disproportionately common outcome for black women no matter their socioeconomic level. In "Becoming American," we meet Amador Bernal, an immigrant from Mexico who earns $9 an hour at a mushroom farm in Kennett Square, Pa. His health is beginning to suffer—but Bernal has never been to a doctor.

One installment focuses on the Pima and the Tohono O'odham, two Native American tribes in Arizona. A century ago, type 2 diabetes was so rare as to be unheard of in this population. Today, these tribes have the highest rates of the disease in the world: filmmaker James Fortier draws a direct parallel between this fact and the loss of their water rights and farms.

Director Ellie Lee looks at two urban neighborhoods—Richmond, Calif., and Seattle—for a segment called "Place Matters." The Richmond area bears the hallmark lack of access to fresh food and safe streets that defines urban blight. In the Seattle community, leaders and government are working to create an area that promotes the health of its inhabitants. The differences in the residents' futures are stark, says the film. "If you lived in Richmond, you'd be 30 percent less likely to live into old age than if you lived in Seattle. In Richmond, your child would be six times more likely to be hospitalized for asthma than if you lived in Seattle."

The most damning indictment of the U.S. health-care system comes in the last two segments. "Collateral Damage" explores the effect on the lives and health of Marshall Islanders in the central Pacific since the Ronald Reagan Ballistic Missile Defense Test Site was located there—dislocating thousands of people, destroying their traditional way of life and resulting in a rise in tuberculosis and other diseases encouraged by squalid living conditions. And "Not Just a Paycheck" compares the socio-economic and health repercussions of an Electrolux factory closing in Greenville, Mich., with those in a Swedish community that had endured a similar factory shutdown. Hospital visits in Greenville tripled due to depression, alcoholism and heart disease. In Sweden, there was barely an increase in head colds: citizens there are protected by their country's generous social-welfare programs.

Some of the stories in "Unnatural Causes" are not entirely surprising (especially after Michael Moore's documentary "Sicko"), but they powerfully reinforce the fact that where you live can predict not just how well you live but also how long. According to the producers, more than 120 organizations from The Joint Center for Political and Economic Studies' Health Policy Institute to the Chi Eta Phi nursing sorority have begun to use this film as a teaching curriculum. Once you check out the series, you'll see why.

URL: http://www.newsweek.com/id/131597

© 2008

Virtual Mentor, Sep #2 - Weighing the Duty to Inform a Patient of Possible Future Illness

American Medical Association Journal of Ethics
September 2008, Volume 10, Number 9: 553-555.
CLINICAL CASE

Commentary by Shannon Sullivan, MD

Mr. Watts went to see Dr. Pass, a specialist in sleep disorders, because of his history of violent behavior during sleep. Although Mr. Watts didn’t have any complaints, his wife was extremely frightened by the episodes of shouting, kicking, and punching that would occur while her husband was sleeping, usually in the early morning hours. After conducting a thorough history and physical exam, Dr. Pass was certain that the diagnosis was idiopathic REM sleep behavior disorder. Although Mr. Watts had always been healthy, Dr. Pass had seen in the literature that, with this diagnosis, Mr. Watts—now 58 years old—had a significant chance of developing a neurodegenerative disease within the next 10-15 years. He wondered whether he should tell Mr. Watts about his risk, given that there was some chance that he would not develop neurodegenerative disease. Whether or not he developed more serious disease later, Mr. Watts, currently an active, working attorney, might experience depression and grief if he were informed of this possibility. To complicate matters even more, there was nothing Mr. Watts or Dr. Pass could do now to prevent or delay onset of the disease.



Commentary

REM sleep behavior disorder (RBD) is a parasomnia that occurs during REM sleep and involves loss of normal REM-related skeletal muscle atonia. It is often associated with motor activity and the acting out of one’s dreams [1]. The condition is more common in men than in women and is often characterized by violent behaviors. Those with RBD have more aggressive dreams than those without the disorder, but this tendency does not carry over into the waking hours.

It has been proposed that many patients with “idiopathic” RBD are actually exhibiting early clinical signs of an evolving neurodegenerative disorder [1]. Current data indicate that approximately two-thirds of men aged 50 and older who are diagnosed with this disorder go on to develop Parkinson’s disease or a related condition, dementia with Lewy bodies [2]. The average interval between the onset of RBD and the onset of classic Parkinson’s disease is about 13 years, but the time span can vary greatly. Interventions are available to treat the symptomatic manifestations of RBD, but there is no reliable neuroprotective treatment to slow onset or to reduce the risk of Parkinson’s disease. Ongoing research in this area could produce such protection within a decade.

Dr. Pass has a duty to make relevant information available to his patient [3]. This is one application of the principle of beneficence—the notion that the practitioner must act in the best interests of the patient. But this same duty to act in the patient’s best interest also invokes the principle of autonomy, which recognizes that competent individuals have a right to make their own health care decisions. Mr. Watts should be informed that his RBD places him at increased risk for being diagnosed with a neurodegenerative disease in the future because it may be important for planning and making decisions about relationships, retirement, finances, and travel. Although there is no neuroprotective treatment available now, Mr. Watts might benefit from discoveries made between the present and the onset of disease, if indeed he falls ill. Without knowledge of his increased risk, he will not be able to take advantage of new interventions should they become available. In these ways, information about his condition is likely to be relevant to Mr. Watts and should be communicated to him. Dr. Pass may also feel that withholding information about a patient’s health status is dishonest, insofar as omission of important details about his risk for disease is akin to avoiding part of the truth, as it is understood among experts in the medical community.

On the other hand, Dr. Pass must balance the duty to inform his patient with the principle of nonmaleficence, the ethics term for “first, do no harm.” One can argue that imposing on Mr. Watts the psychological burden of knowing that he is at increased risk for a neurodegenerative disease associated with dementia is not justifiable. This may be especially true if Dr. Pass has reasonable clinical suspicion that such knowledge will prompt Mr. Watts to develop depression or exacerbate other conditions, such as anxiety, that will impact his overall well-being. There is, after all, a reasonable chance (about 33 percent) that Mr. Watts will not develop neurodegenerative disease. Even if he is diagnosed in the future, the time until onset may be so long as to minimize the beneficial value of being informed now—a type of “future discounting.” If knowledge of disease risk is disclosed now, Mr. Watts will be burdened by the possibility of neurological deterioration at any time. Dr. Pass may deem these considerations, in combination with the lack of available treatment options, as insufficient justification for imposing potential emotional and psychological distress.

Dr. Pass may harm Mr. Watts to some degree either by informing him of the risk or by withholding the knowledge of future disease. Dr. Pass must weigh his duty to avoid psychologically burdening his patient unnecessarily against his duty to be truthful and forthcoming with important information and Mr. Watts’s right to know relevant medical information that may affect his future.
References
1. Boeve BF, Silber MH, Saper CB, et al. Pathophysiology of REM sleep behaviour disorder and relevance to neurodegenerative disease. Brain. 2007;130(Pt 11):2770-2788.
2. Schenck C. What do parasomnias tell us about the brain? Presented at: Sleep Research Society Trainee Symposia; June 8, 2008; Baltimore MD.
3. American Medical Association. Principles of medical ethics. Code of Medical Ethics. Chicago, IL: American Medical Association; 2008. http://www.ama-assn.org/ama/pub/category/2512.html. Accessed July 30, 2008.
Shannon Sullivan, MD, is a sleep physician at Stanford University Sleep Disorders Clinic in Palo Alto, California. She is also a pediatric pulmonologist and received her medical training at Stanford Medical Center; University of California, San Francisco; and University of Michigan Medical School. Her interests include respiratory physiology, cystic fibrosis, and sleep disorders associated with neurological disease.



The people and events in this case are fictional. Resemblance to real events or to names of people, living or dead, is entirely coincidental.

The viewpoints expressed on this site are those of the authors and do not necessarily reflect the views and policies of the AMA.

Copyright 2008 American Medical Association. All rights reserved.

Saturday, September 13, 2008

We Fought Cancer…And Cancer Won.

After billions spent on research and decades of hit-or-miss treatments, it's time to rethink the war on cancer.
Sharon Begley
NEWSWEEK
Updated: 1:55 PM ET Sep 6, 2008

There is a blueprint for writing about cancer, one that calls for an uplifting account of, say, a woman whose breast tumor was detected early by one of the mammograms she faithfully had and who remains alive and cancer-free decades later, or the story of a man whose cancer was eradicated by one of the new rock-star therapies that precisely target a molecule that spurs the growth of malignant cells. It invokes Lance Armstrong, who was diagnosed with testicular cancer in 1996 and, after surgery and chemotherapy beat it back, went on to seven straight victories in the Tour de France. It describes how scientists wrestled childhood leukemia into near submission, turning it from a disease that killed 75 percent of the children it struck in the 1970s to one that 73 percent survive today.

But we are going to tell you instead about Robert Mayberry. In 2002 a routine physical found a lesion on his lung, which turned out to be cancer. Surgeons removed the malignancy, which had not spread, and told Mayberry he was cured. "That's how it works with lung cancer," says oncologist Edward Kim of the University of Texas M. D. Anderson Cancer Center in Houston, who treated Mayberry. "We take it out and say, 'You're all set, enjoy the rest of your life,' because really, what else can we do until it comes back?" Two years later it did. The cancerous cells in Mayberry's lung had metastasized to his brain—either after the surgery, since such operations rarely excise every single microscopic cancer cell, or long before, since in some cancers rogue cells break away from the primary tumor as soon as it forms and make their insidious way to distant organs. It's impossible to know. Radiation therapy shrank but did not eliminate the brain tumors. "With that level of metastasis," says Kim, "it's not about cure. It's about just controlling the disease." When new tumors showed up in Mayberry's bones, Kim prescribed Tarceva, one of the new targeted therapies that block a molecule called epidermal growth factor receptor (EGFR) that acts like the antenna from hell: it grabs growth-promoting signals out of the goop surrounding a cancer cell and uses them to stimulate proliferation. Within six months—it was now the autumn of 2005—the tumors receded, and Mayberry, who had been unable to walk when the cancer infiltrated his brainstem and bones, was playing golf again. "I have no idea why Tarceva worked on him," says Kim. "We've given the same drug to patients in the same boat, and had no luck." But the luck ran out. The cancer came back, spreading to Mayberry's bones and liver. He lost his battle last summer.

We tell you about Mayberry because his case sheds light on why cancer is on track to kill 565,650 people in the United States this year—more than 1,500 a day, equivalent to three jumbo jets crashing and killing everyone aboard 365 days a year. First, it shows the disconnect between the bench and the bedside, between what science has discovered about cancer and how doctors treat it. Biologists have known for at least two decades that it is the rare cancer that can be completely cured through surgery. Nevertheless, countless proud surgeons keep assuring countless anxious patients that they "got it all." In Mayberry's case, says Kim, "my gut feeling is that [cells from the original lung tumor] were smoldering in other places the whole time, at levels so low not even a whole-body scan would have revealed them." Yet after surgery and, for some cancers, radiation or chemotherapy, patients are still sent back into the world with no regimen to keep those smoldering cells from igniting into a full-blown metastatic cancer or recurrence of the original cancer. Mayberry's story also shows the limits of "targeted" cancer drugs such as Tarceva, products of the golden age of cancer genetics and molecular biology. As scientists have learned in just the few years since the drugs' introduction, cancer cells are like brilliant military tacticians: when their original route to proliferation and invasion is blocked, they switch to an alternate, marching cruelly through the body without resistance.

We also tell you about Mayberry because of something Boston oncologist (and cancer survivor) Therese Mulvey told us. She has seen real progress in her 19 years in practice, but the upbeat focus on cancer survivors, cancer breakthroughs and miracle drugs bothers her. "The metaphor of fighting cancer implies the possibility of winning," she said after seeing the last of that day's patients one afternoon. "But some people are just not going to be cured. We've made tremendous strides against some cancers, but on others we're stuck, and even our successes buy some people only a little more time before they die of cancer anyway." She paused, musing on how the uplifting stories and statistics—death rates from female breast cancer have fallen steadily since 1990; fecal occult blood testing and colonoscopy have helped avert some 80,000 deaths from colorectal cancer since 1990—can send the wrong message. "With cancer," said Mulvey, "sometimes death is not optional."

Yet it was supposed to be. In 1971 President Richard Nixon declared war on cancer (though he never used that phrase) in his State of the Union speech, and signed the National Cancer Act to make the "conquest of cancer a national crusade." It was a bold goal, and without it we would have made even less progress. But the scientists and physicians whom Nixon sent into battle have come up short. Rather than being cured, cancer is poised to surpass cardiovascular disease and become America's leading killer. With a new administration taking office in January, and with the new group Stand Up to Cancer raising $100 million (and counting) through its telethon on ABC, CBS and NBC on Sept. 5, there is no better time to rethink the nation's war on cancer.

In 2008, cancer will take the lives of about 230,000 more Americans—69 percent more—than it did in 1971. Of course, since the population is older and 50 percent larger, that raw number is misleading. A fairer way to examine progress is to look at age-adjusted rates. Those statistics are hardly more encouraging. In 1975, the first year for which the National Cancer Institute has solid age-adjusted data, 199 of every 100,000 Americans died of cancer. That rate, mercifully, topped out at 215 in 1991. By 2005 the mortality rate had fallen to 184 per 100,000, seemingly a real improvement over 1975. But history provides some perspective. Between 1950 and 1967, age-adjusted death rates from cancer in women also fell, from 120 to 109 per 100,000, according to an analysis by the American Cancer Society published just after Nixon's speech. In percentage terms, the nation made more progress in keeping women, at least, from dying of cancer in those 17 years, when cancer research was little more than a cottage industry propelled by hunches and trial-and-error treatment, than it did in the 30 years starting in 1975, an era of phenomenal advances in molecular biology and genetics. Nearly four decades into the war on cancer, conquest is not on the horizon. As a somber statement on the NCI Web site says, "the biology of the more than 100 types of cancers has proven far more complex than imagined at that time." Oncologists resort to a gallows-humor explanation: "One tumor," says Otis Brawley of the ACS, "is smarter than 100 brilliant cancer scientists."
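The comparison above is simple percentage arithmetic on the quoted rates. As a quick sketch (the `pct_decline` helper is our own naming, not the National Cancer Institute's), the two declines work out like this:

```python
# Recomputing the two age-adjusted mortality declines cited in the text,
# from the per-100,000 rates quoted there.

def pct_decline(start_rate, end_rate):
    """Percentage drop from a starting rate to an ending rate."""
    return (start_rate - end_rate) / start_rate * 100

# Women, 1950-1967 (American Cancer Society analysis): 120 -> 109
early_era = pct_decline(120, 109)

# All Americans, 1975-2005 (National Cancer Institute): 199 -> 184
modern_era = pct_decline(199, 184)

print(f"1950-1967: {early_era:.1f}% decline")   # about 9.2%
print(f"1975-2005: {modern_era:.1f}% decline")  # about 7.5%
```

The second figure matches the 7.5 percent overall decline cited later in the article; the pre-1971 era's roughly 9 percent drop for women is what makes the historical comparison so striking.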

The meager progress has not been for lack of trying. Since 1971, the federal government, private foundations and companies have spent roughly $200 billion on the quest for cures. That money has bought us an estimated 1.5 million scientific papers, containing an extraordinary amount of knowledge about the basic biology of cancer. It has also brought real progress on a number of fronts, not least the invention of drugs for nausea, bowel problems and other side effects of the disease or treatment. "These have reduced suffering and changed people's ability to live with cancer," says Mulvey. In fact, just a few months after Nixon's call to arms, Bernard Fisher of the University of Pittsburgh began studies that would show that a woman with breast cancer has just as good a chance of survival with a simple mastectomy as with the standard treatment of the time, a radical procedure that removed her breast, chest-wall muscles and underarm tissue. The new approach spared millions of women pain and disfigurement. In 1985, treatment improved again when Fisher showed that lumpectomy followed by radiation to kill lingering cells was just as effective for many women as mastectomy. It wasn't a cure, but it mattered. "One can wait for the home run," says Fisher, now 90, "but sometimes you get runs by hitting singles and doubles. We haven't hit a home run yet; we can't completely prevent or completely cure breast cancer."

Nixon didn't issue his call to arms in order to reduce disfigurement, however. The goal was "to find a cure for cancer." And on that score, there are some bright spots. From 1975 to 2005, death rates from breast cancer fell from 31 to 24 per 100,000 people, due to earlier detection as well as more-effective treatment. Mortality from colorectal cancer fell from 28 to 17 per 100,000 people, due to better chemotherapy and, even more, to screening: when colonoscopy finds precancerous polyps, they can be snipped out before they become malignant.

But progress has been wildly uneven. The death rate from lung cancer rose from 43 to 53 per 100,000 people from 1975 to 2005. The death rate from melanoma rose nearly 30 percent. Liver and bile-duct cancer? The death rate has almost doubled, from 2.8 to 5.3 per 100,000. Pancreatic cancer? Up from 10.7 to 10.8. Perhaps the most sobering statistic has nothing to do with cancer, but with the nation's leading killer, cardiovascular disease. Thanks to a decline in smoking, better ways to control hypertension and cholesterol and better acute care, its age-adjusted mortality has fallen 70 percent in the same period when the overall mortality rate from cancer has fallen 7.5 percent. No wonder cancer "is commonly viewed as, at best, minimally controlled by modern medicine, especially when compared with other major diseases," wrote Harold Varmus, former director of NCI and now president of Memorial Sloan-Kettering Cancer Center in New York, in 2006.

About all scientists knew about cancer 50 years ago was that cancer cells make copies of their DNA and then of themselves more rapidly than most normal cells do. In the 1940s, Sidney Farber, a Boston oncologist, intuited that since cells need a biochemical called folate to synthesize new DNA, an anti-folate might impede this synthesis. After a friend at a chemical company synthesized an anti-folate—it was named methotrexate—Farber gave it to cancer patients, sending some into short-term remission, he reported in 1948. (Two years earlier, scientists had serendipitously discovered that mustard gas, a chemical weapon, could reduce tumors in patients with non-Hodgkin's lymphoma, but no one had any idea how it worked.) Thus was born the era of chemotherapy, one that continues today. It is still based on the simple notion that disrupting DNA replication and cell division will halt cancer. Soon there would be dozens of chemo drugs that target one or more of the steps leading to cell proliferation. Almost all of those approved in the 1970s, 1980s and 1990s were the intellectual descendants of Farber's strategy of stopping cancer cells from making copies of their DNA, and then themselves, by throwing a biochemical wrench into any of the steps involved in those processes. And none of it had anything to do with understanding why cancer cells were demons of proliferation. "The clinical-research community was expending enormous effort mixing and matching chemotherapy drugs," recalls Dennis Slamon, who began a fellowship in oncology at UCLA in 1979 and is now director of clinical/translational research at the Jonsson Cancer Center there. "There was nothing coming out of the basic science that could help" patients.

In the high-powered labs funded by the war on cancer, molecular biologists thought they could change that. By discovering how genetic and other changes let cancer cells multiply like frisky rabbits, they reasoned, they could find ways to stop the revved-up replication at its source. That promised to be more effective, and easier on healthy cells than chemotherapy drugs, which also kill normal dividing cells, notably in the gut, bone marrow, mouth and hair follicles. In the 1970s, cancer scientists discovered cancer viruses that alter DNA in animals, and for a while the idea that viruses cause cancer in people, too, was all the rage. (The human papilloma virus causes cervical cancer, but other human cancers have nothing to do with viruses, it would turn out.) In the 1970s and 1980s they discovered human genes that, when mutated, trigger or promote cancer, as well as tumor suppressor genes that, when healthy, do as their name implies but when damaged release the brakes on pathways leading to cancer.

It made for a lot of elegant science and important research papers. But it "all seemed to have little or no impact on the methods used by clinicians to diagnose and treat cancers," wrote Varmus. Basic-science studies of the mechanisms leading to cancer and efforts to control cancer, he observed, "often seemed to inhabit separate worlds." Indeed, it is possible (and common) for cancer researchers to achieve extraordinary acclaim and success, measured by grants, awards, professorships and papers in leading journals, without ever helping a single patient gain a single extra day of life. There is no pressure within science to make that happen. It is no coincidence that the ratio of useful therapies to basic discoveries is abysmal. For other diseases, about 20 percent of new compounds arising from basic biological discoveries are eventually approved as new drugs by the FDA. For cancer, only 8 percent are.

A widely discussed 2004 article in Fortune magazine ("Why We're Losing the War on Cancer") laid the blame for this at the little pawed feet of lab mice and rats, and indeed there is a lot to criticize about animal studies. The basic approach, beginning in the 1970s, was to grow human cancer cells in a lab dish, transplant them into a mouse whose immune system had been tweaked to not reject them, throw experimental drugs at them and see what happened. Unfortunately, few of the successes in mice are relevant to people. "Animals don't reflect the reality of cancer in humans," says Fran Visco, who was diagnosed with breast cancer in 1987 and four years later founded the National Breast Cancer Coalition, an advocacy group. "We cure cancer in animals all the time, but not in people." Even scientists who have used animal models to make signal contributions to cancer treatment agree. "Far more than anything else," says Robert Weinberg of MIT, the lack of good animal models "has become the rate-limiting step in cancer research."

For this story, NEWSWEEK combed through three decades of high-profile successes in mice for clues to why the mice lived and the people died. Two examples make the point. Scientists were tremendously excited when Weinberg and colleagues discovered the first cancer-causing gene (called ras) in humans, in 1982. It seemed obvious that preventing ras from functioning should roll back cancer. In this decade, scientists therefore began testing drugs, called FTIs (farnesyltransferase inhibitors), that do exactly that. When FTIs were tested on human cancers that had been implanted into mice, they beat back the cancer. But in people, the drugs failed. One reason, scientists suspect, is that the transplanted cancers came from tumors that had been growing in lab dishes for years, long enough to accumulate countless malignant genes in addition to ras. Disabling ras but leaving those other mutations free to stoke proliferation was like using a sniper to pick off one soldier in an invading platoon: the rest of the platoon marches on. That general principle—that even a single cancer's malignancy has more than one cause—would haunt cancer research and treatment for years. A compound called TNF, for tumor necrosis factor, raised hopes in the 1980s that it would live up to its name. When it was injected into mice carrying human tumors, it seemed to melt them away. But in clinical trials, it had little effect on the cancer. "Animal models have not been very predictive of how well drugs would do in people," says oncologist Paul Bunn, who leads the International Society for the Study of Lung Cancer. "We put a human tumor under the mouse's skin, and that microenvironment doesn't reflect a person's—the blood vessels, inflammatory cells or cells of the immune system," all of which affect prognosis and survival.

If mouse models have a single Achilles' heel, it is that the human tumors that scientists transplant into them, and then attack with their weapon du jour, almost never metastasize. Even in the 1970s there was clear evidence—in people—of the deadly role played by cells that break off from the original tumor: women given chemo to mop up any invisible malignant cells left behind after breast surgery survived longer without the cancer's showing up in their bones or other organs, and longer, period, than women who did not receive such "adjuvant" therapy, scientists reported in 1975. "Every study of adjuvant therapy shows it works because it kills metastatic cells even when it appears the tumor is only in the breast or in the first level of lymph nodes," says the ACS's Brawley. By the mid-1990s studies had shown similar results for colon cancer: even when surgeons said they'd "got it all," patients who received chemo lived longer, and more years passed before their cancer returned.

Yet for years, despite the clear threat posed by metastatic cells, which we now know are responsible for 90 percent of all cancer deaths, the war on cancer ignored them. Scientists continued to rely on animal models where metastasis didn't even occur. Throughout the 1980s and 1990s, says Visco, "researchers drilled down deeper and deeper into the disease," looking for ever-more-detailed molecular mechanisms behind the initiation of cancer, "instead of looking up and asking really big questions, like why cancer metastasizes, which might help patients sooner."

There was another way. At the same time that molecular biologists were taking the glamorous, "look for the cool molecular pathway," cojones-fueled approach to seeking a cure, pediatric oncologists took a different path. Pediatric cancer had long been a death sentence: in Farber's day, children with leukemia rarely survived more than three months. (President Bush's sister Robin died of the disease in 1953; she was 3.) Fast-forward to 2008: 80 percent of children with cancer survive well into adulthood.

To achieve that success, pediatric oncologists collaborated to such a degree that at times 80 percent of the children with a particular cancer were enrolled in a clinical trial testing a new therapy. In adults, the figure has long been less than 1 percent. The researchers focused hardly at all on discovering new molecular pathways and new drugs. Instead, they threw everything in the existing medicine chest at the problem, tinkering with drug doses, combinations, sequencing and timing. "We were learning how to better use the drugs we had," says pediatric oncologist Lisa Diller of Dana-Farber Cancer Institute and Children's Hospital Boston. By 1994, combinations of four drugs kept 75 percent of childhood leukemia patients—and 95 percent of those enrolled in a study—cancer-free. Childhood brain cancer has been harder to tame, but while 10 percent of kids survived it in the 1970s, today 45 percent do—a greater improvement than in most adult cancers. (To be sure, some scientists who work on adult cancers are sick of hearing about the noble cooperation of their pediatric colleagues. Childhood cancers, especially leukemias, are simpler cancers, they say, often characterized by a single mutation, and that's why the cure rate has soared. Neutral observers say it's a little of both: pediatric-cancer scientists really did approach the problem in a novel, practical way, but their enemy is less wily than most adult cancers.)

Biologists who never met a signaling pathway they didn't love tend to dismiss the success in pediatric oncology. It involved no discoveries of elegant cell biology, just plodding work. Ironically, however, it is these "singles," not the grand slams of molecular biology, that have made the greatest difference in whether people develop cancer and die of it. Fewer smokers (54 percent of men smoked in 1971; 21 percent do today), more women having mammograms and fewer taking hormone-replacement therapy (the incidence of breast cancer fell an unheard-of 7 percent from 2002 to 2003, after a 2002 study found that HRT can stimulate the growth of tiny breast tumors) have had at least as great an impact on cancer as the achievements of basic-science labs that received the bulk of the funding in the war on cancer. Similarly, the widespread use of Pap smears to detect precancerous changes in cells of the cervix is almost entirely responsible for the drop in both incidence of and deaths from cervical cancer. Incidence has fallen some 65 percent since 1975, and mortality at least 60 percent. Little wonder, then, that by the 1980s critics were asking why the war on cancer was spending the vast majority of taxpayers' money on elegant biology that cured millions of mice rather than on the search for more practical advances like these.

By "critics," we don't mean disgruntled laypeople. At UCLA, Denny Slamon had been inspired by Robert Weinberg's discovery of the first human oncogene, ras, in 1982. Although drugs to squelch the gene directly did not pan out, the discovery did lead to the first real success of the reductionist, "let's get in there and study the genetics and molecules of cancer" approach. Slamon was at first following the crowd, examining animal cancers for signs of DNA changes. But in 1982 he had an idea: look for unusual genes in tissue samples taken from human tumors. He applied to NCI for funding and, he recalls, "they basically sent it back with a laugh track. They said it was just a fishing expedition, that it wasn't hypothesis-driven. We tried to explain the logic—that if cancer reflects a problem of genetic control, then finding mutated genes should be important—but still didn't get funded." The same year that NCI laughed at Slamon's idea, MIT's Weinberg and colleagues discovered another gene involved in cancer. Called HER2, it makes a molecule that sits on the outside of cells and acts like an antenna, picking up growth signals that are then carried to the cell nucleus, where they deliver a simple if insidious message: go forth and multiply, really really fast. That made Slamon wonder whether HER2 might play a role in major human cancers.

In 1984, backed by private funding, Slamon found that 27 percent of breast cancers contain extra copies of HER2. Over the next decade he and other scientists showed that HER2 caused the cancer, rather than being an innocent bystander (or "marker," as scientists say). They also found an antibody that attaches to HER2 like a squirrel's nest on a TV antenna, preventing it from picking up signals. In 1998 the FDA approved that antibody, called Herceptin, for use in breast cancers fueled by HER2. It was stunning proof of the principle that drugs could be precisely crafted to cripple molecules that lie upstream of cell replication and stoke the growth of cancer cells, and only cancer cells, not healthy ones. Herceptin has cured thousands of women. After the 1984 discovery, NCI was happy to fund Slamon. "It was only because we had already shown that the research would work," he says wistfully. "It is, shall we say, a conservative way to spend your money."

Slamon was not the only scientist who noticed NCI's preference for elegant molecular studies over research that offered the possibility of new treatments. (We should note that funding decisions are made not by NCI bureaucrats but by panels of scientists from, mostly, universities and medical institutions.) In the mid-1990s Brian Druker of the Oregon Health and Science University Cancer Institute wanted to study a molecule involved in chronic myelogenous leukemia. Targeting that molecule, he thought, might cure CML. "People rolled their eyes and asked, 'What's new and different about this?' " By "new and different," they meant scientifically novel, elegant, offering new insight into a basic cellular process. He didn't even apply for an NCI grant. "I knew I'd just be wasting my time," he says. "NCI would have looked at what I wanted to do and said it was too high-risk. Instead I took the tried-and-true approach of getting funded for basic research, seeing how cell growth is regulated" by molecules that are grabbed by receptors on a leukemia cell and that send proliferation orders to the cell nucleus. This work led to a useful clinical test, but the work NCI did not fund (a private foundation did) eventually led to Gleevec, the blockbuster CML drug.

Indeed, there is no more common refrain among critics of how the war on cancer has been waged: that innovative ideas, ideas that might be grand slams but carry the risk of striking out, are rejected by NCI in favor of projects that promise singles. "We ask the scientists all the time why we aren't further along," says Visco. "Part of the answer is that the infrastructure of cancer is to keep things moving along as they have been and to reward people doing safe research. Exciting new ideas haven't fared well." As coincidence would have it, in the very year that Nixon launched the war on cancer, an unknown biologist named Judah Folkman published a paper proposing that metastatic cells survive, and become deadly, only if they grow blood vessels to keep themselves supplied with nutrients. That process is called angiogenesis, and it had nothing to do with the genes and proteins that the soldiers in the war on cancer were fixated on. Throughout the 1970s "the reaction was mainly hostility and ridicule," Folkman (who died earlier this year) recalled to NEWSWEEK in 1998. "People would ask me [at scientific meetings], 'You really don't believe that, do you?' " NCI turned down his request for funds to continue his work, calling his ideas about the importance of angiogenesis in metastasis "just your imagination," Folkman said. He persisted, of course, laying the groundwork for what would become anti-angiogenesis drugs. Avastin was approved for colorectal cancer in 2004.

If the 1990s were the era of identifying cellular processes and molecules unique to cancer cells—not the blunderbuss approach of wrecking DNA and stopping replication, which brings friendly fire down on healthy cells—the focus of the 2000s is to personalize treatment. The reason is that, just as cancer cells develop resistance to standard chemo drugs, so they are finding ways to elude the new targeted drugs such as Avastin, Gleevec and Herceptin. In the studies that led the FDA to approve Avastin, for instance, the drug prolonged life in patients with advanced colorectal cancer by a median of four months. In later studies, it increased survival in advanced lung-cancer patients by a couple of months, says Roy Herbst, a lung oncologist at M. D. Anderson. Why so little? "Angiogenesis is a redundant process," Herbst explains. "Most cells use the VEGF pathway [that Avastin blocks], but there are at least 12 other pathways, and Avastin doesn't block any of them." With VEGF out of commission, malignant cells turn to these alternatives. Or consider Tarceva, given to lung-cancer patients, which turns off a molecule called EGFR that fuels the proliferation of some lung and other cancer cells. "It shrinks the tumor 60 to 80 percent of the time, and the effect lasts about a year," says David Johnson, a thoracic oncologist at Vanderbilt-Ingram Cancer Center. But if even a tiny fraction of malignant cells in the tumor or at metastatic sites use a proliferation pathway other than EGFR, they laugh off Tarceva and proliferate unchecked; most patients are dead within three years. Of the first patients with a rare gastric cancer whom George Demetri of Dana-Farber treated with Gleevec in 2000, 85 percent became resistant to it after five years. (Before Gleevec, though, patients with this cancer died within six weeks.) The malignant cells, it turns out, change the shape of the molecule that Gleevec blocks. It's as if a teenager, knowing Mom has a key to his room and wanting his privacy, changed the lock before she arrived.

In response to the limits of targeted therapies, scientists are pursuing the next big idea: that there is no such thing as cancer. There are only cancers, plural, each one characterized by a different set of mutations, a different arsenal it uses to fight off drugs and proliferate. "By the time there are 10 cancer cells, you probably have eight different cancers," says Demetri. "There are different pathways in each of the cells." And that's why cancer patients keep dying. One woman found a lump in her breast in 2002, nine months after a mammogram had shown nothing amiss. She had the breast tumor removed, says oncologist Julie Gralow, who treated her at the Fred Hutchinson Cancer Research Center, and chemotherapy to kill any remaining malignant cells. The woman did well for three years, but in 2005 an exam found cancer in her bones. She underwent half a dozen different chemotherapies over the next three years, until last March, when the cancer was detected in her brain. She received radiation—because chemo drugs generally do not cross the blood-brain barrier, radiation rather than chemo is the treatment of choice for brain cancer—but by July tumors had riddled her body. She died that month.

To beat down cancer mortality, oncologists need to target all the many cancers that make up a cancer—the dozens of different pathways that cells use to proliferate and spread. That is the leading edge of research today, determining how this patient's tumor cells work and hitting those pathways with multiple drugs, simultaneously or sequentially, each chosen because it targets one of those growth, replication and angiogenesis pathways. "The hope is to match tumor type to drug," says Roy Herbst. "We need to make the next leap, getting the right drug to the right patient."

Both presidential candidates have vowed to support cancer research, which makes this a propitious time to consider the missed opportunities of the first 37 years of the war on cancer. Surely the greatest is prevention. Nixon never used the word; he exhorted scientists only to find a cure. Partly as a result, the huge majority of funding for cancer has gone into the search for ways to eradicate malignant cells rather than to keep normal cells from becoming malignant in the first place. "The funding people are interested in the magic-bullet research because that's what brings the dollars in," says oncologist Anthony Back, of the Hutch. "It's not as sexy to look at whether broccoli sprouts prevent colon cancer. A reviewer looks at that and asks, 'How would you ever get that to work?' " And besides, broccoli can't be patented, so without the potential payoff of a billion-dollar drug there is less incentive to discover how cancer can be prevented.

Another missed opportunity involves the environment around a tumor cell. "We used to focus on cancer cells with the idea that they were master of their own destiny," says MIT's Weinberg. "By studying genes inside the cell we thought we could understand what was going on. But now [we know] that many tumors are governed by the signals they receive from outside"—from inflammatory cells, cells of the immune system and others. "It's the interaction of signals inside and outside the tumor that creates aggressiveness and metastasis."

Which leads to the third big missed opportunity, the use of natural compounds and nondrug interventions such as stress reduction to keep the microenvironment inhospitable to cancer. (Cancer cells have receptors that grab stress hormones out of the bloodstream and use them to increase angiogenesis.) "Funding has gone to easier areas to research, like whether a drug can prevent cancer recurrence," says Lorenzo Cohen, who runs the integrative care center at M. D. Anderson. That's simpler to study, he points out, than whether a complicated mix of diet, exercise and stress-reduction techniques can keep the microenvironment hostile to cancer. And while we're on the subject of how to reduce mortality from cancer, consider these numbers: 7 percent of black women with breast cancer get no treatment, and 35 percent of black women do not receive radiation after mastectomy (the standard of care), compared with 26 percent of white women. As long as scientists are discovering how to thwart cancer, it might make sense to get the advances into the real world.

Breakthroughs continue to pour out of labs, of course. Cutting-edge techniques are allowing scientists to identify promising experimental drugs more quickly than ever before. And just last week separate groups of scientists announced that they had identified dozens of genes involved in glioblastoma, the most common brain cancer, as well as pancreatic cancer. That raises the possibility that the mutations cause the cancer, and that if the pathways they control can be blocked the cancer can be beaten back. Stop us if you've heard that before. Hope springs eternal that such findings will not join the long list of those that are interesting but irrelevant to patients.

Tuesday, September 9, 2008

Fewer med students are choosing primary care

Only 2 percent plan to enter family medicine, new study says
The Associated Press
updated 4:09 p.m. ET, Tues., Sept. 9, 2008

CHICAGO - Only 2 percent of graduating medical students say they plan to work in primary care internal medicine, raising worries about a looming shortage of the first-stop doctors who used to be the backbone of the American medical system.

The results of a new survey being published Wednesday suggest more medical students, many of them saddled with debt, are opting for more lucrative specialties.

The survey of nearly 1,200 fourth-year students found just 2 percent planned to work in primary care internal medicine. In a similar survey in 1990, the figure was 9 percent.

Paperwork, the demands of the chronically sick and the need to bring work home are among the factors pushing young doctors away from careers in primary care, the survey found.

"I didn't want to fight the insurance companies," said Dr. Jason Shipman, 36, a radiology resident at Vanderbilt University Medical Center in Nashville, Tenn., who is carrying $150,000 in student debt.

Primary care doctors he met as a student had to "speed to see enough patients to make a reasonable living," Shipman said.

Dr. Karen Hauer of the University of California, San Francisco, the study's lead author, said it's hard work taking care of the chronically ill, the elderly and people with complex diseases — "especially when you're doing it with time pressures and inadequate resources."

Family medicine offers lowest salary
The salary gap may be another reason. More pay in a particular specialty tends to mean more U.S. medical school graduates fill residencies in those fields at teaching hospitals, Dr. Mark Ebell of the University of Georgia found in a separate study.

Family medicine had the lowest average salary last year, $186,000, and the lowest share of residency slots filled by U.S. students, 42 percent. Orthopedic surgery paid $436,000, and 94 percent of residency slots were filled by U.S. students.

Meanwhile, medical school is getting more expensive. The average graduate last year had $140,000 in student debt, up nearly 8 percent from the previous year, according to the Association of American Medical Colleges.

Another likely factor: Medicare's fee schedule pays less for office visits than for simple procedures, according to the American College of Physicians, which reported in 2006 that the nation's primary care system is "at grave risk of collapse."

Lower salaries in primary care did not deter Dr. Alexis Dunne of Chicago, who is 31 and carrying $250,000 in student debt.

Last year, a parade of specialists couldn't solve the mystery of her mother's weight loss, fevers and severe anemia. Finally, an internist diagnosed a rare kidney infection. The kidney was removed, and Dunne's mother has felt fine since.

Watching her mother go through the health crisis affirmed her decision to go into primary care. She also enjoys being "the point person" for her patients.

"You become so close to them you're almost like a family friend," said Dunne, who completed her residency at Chicago's Northwestern Memorial Hospital in July.

She also found inspiration from the doctors she met during training: "They were the ones who would sit at a patient's bedside and spend more time with them rather than running off to surgery."

International students fill gap
A separate study in JAMA suggests graduates from international medical schools are filling the primary care gap.

About 2,600 fewer U.S. doctors were training in primary care specialties — including pediatrics, family medicine and internal medicine — in 2007 compared with 2002. In the same span, the number of foreign graduates pursuing those careers rose by nearly 3,300.

"Primary care is holding steady but only because of international medical school graduates," said Edward Salsberg of the Association of American Medical Colleges, a co-author of the study. "And holding steady in numbers is probably not sufficient when the population is growing and aging."

And as American students lose interest, teaching hospitals will probably become less interested in offering primary care programs, said Dr. David Goodman, associate director of the Center for Health Policy Research at Dartmouth Medical School.

In a JAMA editorial, Goodman called on Congress to create a permanent regulatory commission to encourage training for needed specialties. U.S. teaching hospitals now receive $10 billion a year from the government to train doctors "with virtually no accountability," he said.

The coordinated care provided by primary care doctors can keep costs down by preventing harmful drug interactions, unneeded medical procedures and fragmented specialty care, Goodman said.

The Web-based survey was done at 11 medical schools with demographics and training choices similar to all U.S. medical students.

Monday, September 8, 2008

Beshear Announces Plan to Enroll More Children in Health Insurance Program

Governor Reduces Barriers, Increases Retention and Outreach Efforts

FRANKFORT, Ky. (Sept. 3, 2008) - Gov. Steve Beshear today outlined an ambitious plan to get as many children as possible enrolled in the Kentucky Children’s Health Insurance Program (KCHIP) by 2010. The Beshear Plan, unveiled in several communities across Kentucky today with an implementation date of November 1, intends to dramatically cut the number of children without health coverage by removing barriers to enrollment, retaining more children once they are enrolled and significantly increasing education and outreach regarding the program. The plan aims to enroll more than 35,000 children by FY2010. KCHIP provides health insurance to children whose family income is below 200 percent of the federal poverty level, about $42,400 a year for a family of four.

“It is shameful and shortsighted that we are not providing children with the health care they need and deserve,” Gov. Beshear said. “The steps we are taking today to get more eligible children enrolled in the Kentucky Children’s Health Insurance Program are fiscally responsible, economically smart, and, quite simply, the right thing to do.”

Beshear unveiled plans to enroll eligible children in KCHIP, including:


SIMPLIFYING THE ENROLLMENT PROCESS: 

• Eliminate the face-to-face interview requirement and allow: 

o Applicants to mail in the application

o Applicants to download, print and fill out the application from the Governor’s website

o Applicants to apply in person at the local DCBS office, though this is no longer required.

• Short application:

o An easy-to-use application has been created.

• Amend denial process:

o Provide a new 30-day grace period to provide additional information for denied applications

o Follow up by phone or mail with applicants who fail to supply requested information.


INCREASING RETENTION EFFORTS:

Currently, KCHIP families must recertify at the end of each year that they want to continue in the program, and are given 30 days to return a renewal form by mail. The Beshear Plan will:

• Allow a new 30-day grace period to complete the renewal process 

• Contact by telephone and mail families who fail to return the renewal form.

INCREASING OUTREACH:

• Identify families participating in the Food Stamp Program, the Free and Reduced Meal Program and other agency programs who may qualify for KCHIP, and provide outreach to them.

• Create a system prompt for Food Stamp workers to inform applicants of potential eligibility for KCHIP.

• Provide training to employees at Federally Qualified Health Care Centers, Free Clinics and Local Health Departments to assist customers with application completion.


• Contact by mail families that disenrolled from the program within the past year and encourage them to reapply for benefits.

• Contact the parents of every newborn in Kentucky with a postcard providing information about KCHIP enrollment.

• Provide training to and encourage advocacy groups and other interested parties to assist applicants with the application and ensure documentation is attached.

• Distribute applications during back-to-school campaigns.

• Create a listserv for all school districts and send a message each school year from the Cabinet Secretary encouraging school staff to assist in identifying and enrolling children in KCHIP.

In addition, outreach to new members will include:

• Information about the benefits available to their children through KCHIP, such as well child visits, vision, dental, pharmacy, etc.


IMPLEMENTATION

The Beshear Plan, which requires an eight-week implementation, will be accomplished by:

• Hiring two additional KCHIP Central Office Staff;

• Hiring 26 specialized KCHIP workers; and

• It is estimated this plan will cost $6.1 million in state funding and $16.7 million in federal funding in FY09, and $25 million in state funding and $64.6 million in federal funding in FY10.

Governor Beshear made note of several critical reasons the state is taking these steps, including:

• Supporting children’s health is vital to supporting Kentucky’s future.

• Children’s health status impacts academic achievement.

• Treating Kentucky children will save money on future chronic conditions.

• Uninsured Kentuckians cost insured Kentuckians.

• The Beshear Plan may reduce use of emergency rooms by uninsured families.


Saturday, September 6, 2008

The Curious Lives of Surrogates

Thousands of largely invisible American women have given birth to other people's babies. Many are married to men in the military.

Lorraine Ali and Raina Kelley
NEWSWEEK
Updated: 2:55 PM ET Mar 29, 2008


Jennifer Cantor, a 34-year-old surgical nurse from Huntsville, Ala., loves being pregnant. Not having children, necessarily—she has one, an 8-year-old daughter named Dahlia, and has no plans for another—but just the experience of growing a human being beneath her heart. She was fascinated with the idea of it when she was a child, spending an entire two-week vacation, at the age of 11, with a pillow stuffed under her shirt. She's built perfectly for it: six feet tall, fit and slender but broad-hipped. Which is why she found herself two weeks ago in a birthing room in a hospital in Huntsville, swollen with two six-pound boys she had been carrying for eight months. Also in the room were Kerry Smith and his wife, Lisa, who ran her hands over the little lumps beneath the taut skin of Cantor's belly. "That's an elbow," said Cantor, who knew how the babies were lying in her womb. "Here's a foot." Lisa smiled proudly at her husband. She is, after all, the twins' mother.

It is an act of love, but also a financial transaction, that brings people together like this. For Kerry and for Lisa—who had a hysterectomy at the age of 20 and could never bear her own children—the benefits are obvious: Ethan and Jonathan, healthy six-pound, 12-ounce boys born by C-section on March 20. But what about Cantor? She was paid, of course; the Smiths declined to discuss the exact amount, but typically, surrogacy agreements in the United States involve payments of $20,000 to $25,000 to the woman who bears the child. She enjoyed the somewhat naughty pleasure of telling strangers who asked about her pregnancy, "Oh, they aren't mine," which invariably provoked the question, "Did you have sex with the father?" (In case anyone is wondering, Lisa's eggs were fertilized in vitro with Kerry's sperm before they were implanted on about day five.)

But what kind of woman would carry a child to term, only to hand him over moments after birth? Surrogates challenge our most basic ideas about motherhood, and call into question what we've always thought of as an unbreakable bond between mother and child. It's no wonder many conservative Christians decry the practice as tampering with the miracle of life, while far-left feminists liken gestational carriers to prostitutes who degrade themselves by renting out their bodies. Some medical ethicists describe the process of arranging surrogacy as "baby brokering," while rumors circulate that self-obsessed, shallow New Yorkers have their babies by surrogate to avoid stretch marks. Much of Europe bans the practice, and 12 states, including New York, New Jersey and Michigan, refuse to recognize surrogacy contracts. But in the past five years, four states—Texas, Illinois, Utah and Florida—have passed laws legalizing surrogacy, and Minnesota is considering doing the same. More than a dozen states, including Pennsylvania, Massachusetts and, most notably, California, specifically legalize and regulate the practice.

Today, a greater acceptance of the practice, and advances in science, find more women than ever before having babies for those who cannot. In the course of reporting this story, we discovered that many of these women are military wives who have taken on surrogacy to supplement the family income, some while their husbands are serving overseas. Several agencies reported a significant increase in the number of wives of soldiers and naval personnel applying to be surrogates since the invasion of Iraq in 2003. At the high end, industry experts estimate there were about 1,000 surrogate births in the United States last year, while the Society for Assisted Reproductive Technology (SART)—the only organization that makes an effort to track surrogate births—counted about 260 in 2006, a 30 percent increase over three years. But the number is surely much higher than this—in just five of the agencies NEWSWEEK spoke to, there were 400 surrogate births in 2007. The numbers vary because at least 15 percent of clinics—and there are dozens of them across the United States—don't report numbers to SART. Private agreements made outside an agency aren't counted, and the figures do not factor in pregnancies in which one of the intended parents does not provide the egg—for example, where the baby will be raised by a gay male couple. Even though the cost to the intended parents, including medical and legal bills, runs from $40,000 to $120,000, the demand for qualified surrogates is well ahead of supply.

Another reason for the rise in surrogacies is that technology has made them safer and more likely to succeed. Clinics such as Genetics & IVF Institute in Virginia, where Cantor and the Smiths underwent their IVF cycles, now boast a 70 to 90 percent pregnancy success rate—up 40 percent in the past decade. Rather than just putting an egg into a petri dish with thousands of sperm and hoping for a match, embryologists can inject a single sperm directly into the egg. The great majority of clinics can now test embryos for genetic diseases before implantation. It's revolutionizing the way clinics treat patients. Ric Ross, lab director at La Jolla IVF in San Diego, says these advances have helped "drop IVF miscarriage rates by 85 percent."

IVF has been around only since the 1970s, but the idea of one woman bearing a baby for another is as old as civilization. Surrogacy was regulated in the Code of Hammurabi, dating from 1800 B.C., and appears several times in the Hebrew Bible. In the 16th chapter of Genesis, the infertile Sarah gives her servant, Hagar, to her husband, Abraham, to bear a child for them. Later, Jacob fathers children by the maids of his wives Leah and Rachel, who raise them as their own. It is also possible to view the story of Jesus' birth as a case of surrogacy, mediated not by a lawyer but an angel, though in that instance the birth mother did raise the baby.

The most celebrated case of late, though, resulted in the legal and ethical morass known as the "Baby M" affair. Mary Beth Whitehead, age 29 in 1986, gave birth to a girl she had agreed to carry for an infertile couple. But Whitehead was also the baby's biological mother and tried to keep her after the birth, leading to a two-year custody battle. (In the end, she was denied custody but awarded visitation rights.) As a result, surrogacy agreements now almost always stipulate that the woman who carries the baby cannot also donate the egg.

But even as surrogacy is becoming less of a "Jerry Springer" spectacle and more of a viable family option for those who can afford it, the culture still stereotypes surrogates as either hicks or opportunists whose ethics could use some fine-tuning. Even pop culture has bought into the caricature. In the upcoming feature film "Baby Mama," a single businesswoman (Tina Fey) is told by a doctor she is infertile. She hires a working-class gal (Amy Poehler) to be her surrogate. The client is a savvy, smart and well-to-do health-store-chain exec while Poehler is an unemployed, deceitful wild child who wants easy money.

When Fey's character refers to her surrogate as "white trash," we're supposed to laugh. "I just don't understand how they can think that," says surrogate Gina Scanlon of the stereotypes that influenced the film. Scanlon, 40, is a married mother of three who lives in Pittsburgh, and a working artist and illustrator who gave birth to twin girls for a gay New Jersey couple 18 months ago. The couple—a college professor and a certified public accountant—chose Scanlon because she was "emotionally stable," with a husband and children of her own. Unlike egg donors, who are usually in their 20s, healthy women as old as 40 can serve as surrogates; Scanlon two weeks ago underwent an embryo transfer and is now pregnant again for a new set of intended parents. "Poor or desperate women wouldn't qualify [with surrogacy agencies]," she says. As for the implication that surrogates are in it only for the money, she notes that there are many easier jobs than carrying a baby 24 hours a day, seven days a week. (And most jobs don't run the risk of making you throw up for weeks at a time, or keep you from drinking if you feel like it.) "If you broke it down by the hour," Scanlon says wryly, "it would barely be minimum wage. I mean, have [these detractors] ever met a gestational carrier?" And even if they have, how would they know?

Very little is understood about the world of the surrogate. That's why we talked to dozens of women across America who are, or have been, gestational carriers. What we found is surprising and defies stereotyping. The experiences of this vast group of women—including a single mom from Murrietta, Calif., a military spouse from Glen Burnie, Md., and a small-business owner from Dallas—range from the wonderful and life-affirming to the heart-rending. One surrogate, Scanlon, is the godmother of the twins she bore, while another still struggles because she has little contact with the baby she once carried. Some resent being told what to eat or drink; others feel more responsible bearing someone else's child than they did with their own. Their motivations are varied: one upper-middle-class carrier in California said that as a child she watched a family member suffer with infertility and wished she could help. A working-class surrogate from Idaho said it was the only way her family could afford things they never could before, like a $6,000 trip to Disney World. But all agreed that the grueling IVF treatments, morning sickness, bed rest, C-sections and stretch marks were worth it once they saw their intended parent hold the child, or children (multiples are common with IVF), for the first time. "Being a surrogate is like giving an organ transplant to someone," says Jennifer Cantor, "only before you die, and you actually get to see their joy."

That sense of empowerment and self-worth is one of the greatest rewards surrogate mothers experience. "I felt like, 'What else am I going to do with my life that means so much?' " says Amber Boersma, 30, of Wausau, Wis. She is blond, outgoing and six months pregnant with twins for a couple on the East Coast who could not bear children on their own due to a hysterectomy. Boersma, married to a pharmaceutical rep, is a stay-at-home mom with a 6-year-old girl and 4-year-old boy, and a college graduate with a communications degree. "Some people can be successful in a major career, but I thought I do not want to go through this life meaning nothing, and I want to do something substantial for someone else. I want to make a difference."

Then there's the money. Military wife Gernisha Myers, 24, says she was looking through the local San Diego PennySaver circular for a job when she saw the listing: "Surrogate Mothers Wanted! Up to $20,000 Compensation!" The full-time mother of two thought it would be a great way to make money from home, and it would give her that sense of purpose she'd lacked since she left her job as an X-ray technician in Phoenix. In 2004, Myers and her husband, Tim, a petty officer third class in the Navy, were transferred from Arizona to California. Ever since, she had missed bringing home a paycheck, helping other people—and being pregnant. She loved the feel of her belly with a baby inside, and the natural high that comes from "all those rushing hormones." So last fall she signed with one of the many surrogacy agencies near the 32nd Street Naval Station, where her husband is assigned. Her grandmother was not pleased with Myers's decision. "She said, 'Gernisha! We just do not do that in this family'," recalls Myers. "My uncle even said he was disgusted. But you know what? I'm OK with it because I know I am doing something good for somebody else. I am giving another couple what they could never have on their own—a family."

Like Myers, military wives are largely young stay-at-home moms who've completed their own families before they hit 28. IVF clinics and surrogate agencies in Texas and California say military spouses make up 50 percent of their carriers. "In the military, we have that mentality of going to extremes, fighting for your country, risking your life," says Jennifer Hansen, 25, a paralegal who's married to Army Sgt. Chase Hansen. They live in Lincoln, Neb., and have two young kids, and Chase has been deployed to Iraq for two of the past five years. "I think that being married to someone in the military embeds those values in you. I feel I'm taking a risk now, in less of a way than he is, but still a risk with my life and body to help someone." Surrogate agencies target the population by dropping leaflets in the mailboxes of military housing complexes, such as those around San Diego's Camp Pendleton, and placing ads in on-base publications such as the Military Times and Military Spouse. Now surrogate agencies say they are solicited by ad reps from these publications.
Military wives who do decide to become surrogates can earn more with one pregnancy than their husbands' annual base pay (which ranges for new enlistees from $16,080 to $28,900). "Military wives can't sink their teeth into a career because they have to move around so much," says Melissa Brisman of New Jersey, a lawyer who specializes in reproductive and family issues, and heads the largest surrogacy firm on the East Coast. "But they still want to contribute, do something positive. And being a carrier only takes a year—that gives them enough time between postings."

Dawne Dill, 32, was a high-school English teacher before she married her husband, Travis, a Navy chief, and settled in Maryland. She's now a full-time mother with two boys of her own, and is carrying twins for a European couple who prefer to remain anonymous. Dill is due in May. The attraction of surrogacy for her, apart from wanting to feel useful, was that the money could help pay for an occupational-therapy gym for her older son, who is autistic. "We're thinking of building the gym in our basement so he can get to it whenever he needs," says Dill. She worried that having an autistic child might disqualify her as a surrogate, but fortunately the agency was unconcerned. "They said because I was not genetically related to the twins, that it was just not an issue, and my IPs [intended parents] never brought it up to me personally. I assume they're OK with it, but maybe think it's too touchy of a subject to discuss openly with me," says Dill. As a prepartum gift, the couple sent Dawne and her husband to the Super Bowl.

Military wives are attractive candidates because of their health insurance, Tricare, which is provided by three different companies—Humana, TriWest and Health Net Federal Services—and has some of the most comprehensive coverage for surrogates in the industry. Fertility agencies know this, and may offer a potential surrogate with this health plan an extra $5,000. Last year military officials asked for a provision in the 2008 defense authorization bill to cut off coverage for any medical procedures related to surrogate pregnancy. They were unsuccessful—there are no real data on how much the government spends on these cases. Tricare suggests that surrogate mothers who receive payment for their pregnancy should declare the amount they're receiving, which can then be deducted from their coverage. But since paid carriers have no incentive to say anything, most don't. "I was told by multiple people—congressional staff, doctors and even ordinary taxpayers—that they overheard conversations of women bragging about how easy it was to use Tricare coverage to finance surrogacy and delivery costs and make money on the side," says Navy Capt. Patricia Buss, who recently left the Defense Department and now holds a senior position with Health Net Federal Services. The subject of Tricare surrogacy coverage is becoming a hot topic throughout the military world; on Web sites such as militarySOS.com, bloggers with sign-on names such as "Ms. Ordinance" and "ProudArmyWife" fiercely debate the subject.

Surrogacy is not just an American debate—it is global. Thanks to reproductive science, Gernisha Myers, who is African-American, is now 18 weeks pregnant with the twins of Karin and Lars, a white couple who live in Germany. They are one of many international couples who turned to America to solve their infertility issues because surrogacy is not allowed in their own country. Couples have come to the United States from many countries, including Iceland, Canada, France, Japan, Saudi Arabia, Israel, Australia, Spain and Dubai in recent years. Although some couples are now turning to India for cheaper fertility solutions—yes, even surrogacy is being outsourced at a tenth of the price—the trend has yet to diminish America's draw as a baby mecca.

Karin and Lars picked Myers after they read her agency profile. Myers says that the psychological screening is one of the most grueling, invasive and odd parts of the process. "The [questionnaire] asked some weird questions, like 'Do you think about killing people sometimes?' Or 'Would you want to be a mountain ranger if you could?' Or 'Do you find yourself happier than most?' But when they asked 'Are you afraid you're going to get attached to the babies?' I said, 'In a way, yes, even though I know they're not mine.' They said, 'Believe it or not, some GCs [gestational carriers] never feel any kind of bond.' I found that hard to believe back then, but now I know what they're talking about. I don't feel that motherly bond. I feel more like a caring babysitter."

Myers's psychological detachment has a lot to do with the fact that, like most carriers today, she's in no way biologically related to the baby inside her—the legacy of the "Baby M" case. The most recent significant case involving a surrogacy dispute, Johnson v. Calvert in 1993, was resolved in favor of the intended parents, and against a surrogate who wanted to keep the baby. John Weltman, president of Circle Surrogacy in Boston, says that parents who work with a reputable agency have a "99 percent chance of getting a baby and a 100 percent chance of keeping it." But up until just about two years ago, Weltman says every single intended parent asked, "Will she [the carrier] try and keep the baby?" Now, he says, a third of his clients don't even mention it.

That doesn't mean that it's gotten any easier for the surrogate to give up the baby. Most gestational carriers say it is still the hardest part of the job, and some have a rougher time than others. Gina Scanlon recalls the days after the birth of her first pair of surrogate twins: "When you go home it's so quiet," she says. "The crash comes. It's not the baby blues. It's not postpartum depression. It's that the performance is over. I was practically a celebrity during the pregnancy—someone was always asking me questions. After I had them, no one was calling. Now nobody cares. You're out. You're done. It's the most vain thing. I felt guilty and selfish and egotistical."

Stephanie Scott also found that life after surrogacy was not what she expected, especially since everything hummed along so nicely when she was pregnant. Seven and a half months in, she was feeling great—all except for those damn nesting urges. The stay-at-home mom tried to stay out of the baby stores and avoid those sweet pink onesies and baby booties shaped like tiny ballet slippers—but it was near impossible to resist. Her mind-set should have served as a warning. Although she knew the baby in her swollen belly belonged to a couple on the East Coast, she hadn't prepared herself for that biological surge that keeps stores like Babies "R" Us in business. "I showed up to the delivery room with six months' worth of baby clothes," admits Scott, 28. "They ended up being my gift to the baby's intended parents. Sort of like a baby shower in reverse. I know, it's weird." But that was nothing compared to the childbirth: "When she was born, they handed her to me for a second," she says. "I couldn't look, so I closed my eyes tight, counted 10 fingers and 10 toes, then gave her away. I cried for a month straight. I was devastated."

The baby Scott gave birth to is now 3, and photos of the toddler come twice a year, on the child's birthday and Christmas. Scott says she thinks things would have been different had she been counseled more by the agency on attachment issues, but it was a small and less than professional operation (and there are plenty of those in the unregulated world of surrogacy agencies). It's one of the reasons Scott opened her own business in Dallas, Simple Surrogacy. "I would never just throw a girl out there like that. Surrogates need to know what lies ahead."

Any comprehensive road map of surrogacy should include not just potential attachment but an entire pull-down sheet on the second most difficult area of terrain: the relationship between surrogate and intended parent. The intentions and expectations of both parties are supposed to be ironed out ahead of time through a series of agency questionnaires and meetings. What kind of bond do they seek with one another—distant, friendly, close? Do they agree on difficult moral issues, like abortion and selective termination? And what requests do the IPs have of potential carriers? The parties are then matched by the agency, just as singles would be through a dating service. And the intended parents—or parent—are as diverse as the surrogates: gay, straight, single, married, young and old. Much of the time it works, even though it does often resemble an experiment in cross-cultural studies. "In what other world would you find a conservative military wife forming a close bond with a gay couple from Paris?" says Hilary Hanafin, chief psychologist for the oldest agency in the country, Center for Surrogate Parenting. And a good match doesn't necessarily equal a tight connection like that of Jennifer Cantor and Lisa Smith. Christina Slason, 29, who delivered a boy in January for same-sex partners from Mexico City, felt as the couple did—that a close relationship was not necessary. "We agreed that we would keep in touch, but neither of us felt the need to really bond," says Slason, a mother of three who lives in San Diego with her husband, Joseph, a Navy corpsman. "We were there to have a baby, nothing more. We were all clear on that."

But things are not always that clear. For Joseph, a single father from Massachusetts who asked to be identified only by his first name for privacy reasons, the process of finding a suitable surrogate on his own was frustrating, particularly when the first match got cold feet and pulled out. Intended parents Tamara and Joe Bove were troubled when the carrier for their triplets refused to go on bed rest even when a doctor advised her the babies' lives would be at risk if she did not: "She had delivered monstrously large twins vaginally before, even though one of them was breech. So she was kind of surprised that this could happen to her and just wouldn't cooperate." Tamara was plagued with worry. "Our plan was to keep in touch even after the babies were born, but then she stopped listening to the doctors. But you still have to keep acting like everything is fine because she's in control until the babies are born." (Despite Tamara's worries, the triplets were born healthy at 31 weeks via a C-section.)

Control, not surprisingly, is a sore point. A favorite pastime among surrogates—most of whom join support groups at the request of their agencies—is sharing stories of the most bizarre IP requests they've heard. One military surrogate was told if her husband was deployed anywhere in Asia, she was not to have sex with him when he returned for fear that he was unfaithful and carrying an STD.

Jennifer Hansen, the surrogate from Nebraska, says she had a few requests from her intended parents that were odd to her "as a Midwestern girl." Hansen says she's been asked not to pump her own gas. "They believe it leads to miscarriage," she says. "I've also been asked to change my cleaning supplies to all green, natural products. I'm a Clorox girl, and have no idea where to even buy these products. So they just box them up and send them to me from California." What most surrogates don't realize, according to Margaret Little, a professor of philosophy at Georgetown University and fellow at the Kennedy School of Ethics, is that the contracts governing their conduct during the pregnancy are not enforceable. A surrogate does have to surrender the baby once it is born, but she cannot be forced to have (or not have) an abortion, or to obey restrictions on what she can eat, drink or do. The intended parents' only recourse is to withhold payment; they cannot police her conduct. "Surrogacy raises important red flags," Little says, "because you are selling use of the body, and historically when that's happened, that hasn't been good for women."

On the other hand, some agencies report that concerned surrogates have pumped and shipped their breast milk to the intended parents for weeks after the birth, out of fear that the newborn will not build a strong immune system without it.

As for Jennifer Cantor, resting at home last week after delivering Jonathan and Ethan, she intends to stay in touch with the family whose lives are now inextricably bound up with hers. Before returning to their home in Georgia, Lisa and Kerry brought the twins for a visit with the stranger who bore them, and with Cantor's daughter, Dahlia, whose relationship to them doesn't even have a word in the language yet. Lisa described her babies as "the true meaning of life … absolutely perfect." Next time they're hoping for girls. They're also hoping to find someone like Cantor—who, however, does not plan to be a surrogate again, much as she enjoyed it. She is relieved that she can sit normally and put her arms around Dahlia again, without a big belly in between them. She was happy that she had been able to fulfill her dream of bearing a child for someone else. "It was exactly," she said last week, "the experience I imagined it would be."
URL: http://www.newsweek.com/id/129594
© 2008

AMA Apologizes for History of Racial Inequality and Works to Include and Promote Minority Physicians

July 10, 2008

CHICAGO, Ill. - The American Medical Association (AMA) today apologizes for its history of racial inequality toward African-American physicians, and shares its current efforts to increase the ranks of minority physicians and their participation in the AMA. In 2005, the AMA convened and supported an independent panel of experts to study the history of the racial divide in organized medicine, and the culmination of this work prompted the apology. Details of the panel's work will be made public on the Web site of the AMA's Institute for Ethics to coincide with publication in the July 16 Journal of the American Medical Association (JAMA).*

"The AMA is proud to support research about the history of the racial divide in organized medicine because by confronting the past we can embrace the future," said AMA Immediate-Past President Ronald M. Davis, M.D. "The AMA is committed to improving its relationship with minority physicians and to increasing the ranks of minority physicians so that the workforce accurately represents the diversity of America’s patients."

The AMA created the Minority Affairs Consortium (MAC) to address the specific needs of minority physicians and to stimulate and support efforts to train more minority physicians. The philanthropic arm of the AMA each year provides $10,000 scholarships to medical student winners of the AMA Foundation Minority Scholars Award, in collaboration with the MAC. This year, 12 students received the award.

"Five years ago, the AMA joined with the National Medical Association and the National Hispanic Medical Association to create the Commission to End Health Care Disparities," said Dr. Davis. "Our goal is to identify and study racial and ethnic health care disparities in order to eradicate them. We strongly support the ‘Doctors Back to School’ program, which the AMA founded, to inspire minority students to become the next generation of minority physicians."

The Doctors Back to School program, which was developed by the AMA and adopted by the Commission, has visited more than 100 schools, ranging from elementary schools to undergraduate colleges, nationwide. The program has reached out to nearly 13,000 students to urge them to consider a career in medicine. More information about the program and the Commission is available on the AMA Web site.