When my father was 14 years old, instead of dangling a fishing pole with his father on Saturday mornings, he dashed downstairs to assist him with tonsillectomies. A light fastened on his forehead like a coal miner's lamp, he peered past the tonsillar arch into the long tunnel leading to the
enigmatic interior, confirming what he had known since he was 5 years old: he too wanted to be a surgeon. Inquisitive, he questioned his father about the purpose of their mission. This was Brooklyn in 1935, a time when removing the tonsils was believed to be a prophylactic measure that would deter downstream health crises.
The tonsils were conceptualized to be the repository of streptococcal bacteria, the ubiquitous source of rheumatoid arthritis, nephritis, and rheumatic fever. To be a responsible physician meant helping patients to avoid harm. Consequently, two generations lost their tonsils to good medicine. Because the only indication needed for a tonsillectomy was an occasional sore throat and the presence of tonsils, business boomed for my grandfather in those Depression years. Mrs. Derringer, the nurse, made a salary of $8 per week, while the removal of each pair of tonsils cost $15. (This globally included the surgical fee, home-office operating room, anesthesia, and 6 hours in recovery.)
More than a dozen of these procedures were performed each day in my grandfather’s basement clinic. The standard of care was plain and clear: the surgery relieved patients of otherwise anticipated pain and suffering, kept doctors in their position of useful service, and constituted a procedure that the profession had learned to do well. By the time I was 4 years old in 1950, my grandfather, by then at Long Island College Hospital, removed my tonsils. Today tonsillectomies are no longer believed to confer any protective benefit against degenerative disease. The original theory has been disavowed and the rate at which they are performed has plummeted.
Standards of medical practice are established by well-intentioned authorities first and, ideally, validated by science later.

At this moment, Sally's head is covered in peach fuzz, though her chemotherapy ended 4 months ago. A 38-year-old mother of three (Billy, her eldest, is not yet 9), Sally joined the 184,000 women diagnosed with breast cancer last year, hoping not to become one of the 46,000 to die of the disease this year. She did not hesitate to comply with the advice of her doctor to undergo a mastectomy and the removal of a sampling of the lymph glands under her arm (to track traces of the disease that may have spread beyond the breast). When Sally awoke, still groggy, she asked, "How did it go?" Her physician replied confidently, "Don't worry, we got it all," and Sally, relieved, believed him.
Although there is no ambiguity on Sally’s part about wanting to do whatever may be required to stay alive and raise her children, there is a woeful gap between our collective wish for a remedy for breast cancer and medicine’s ability to furnish one. There is a further schism between what we know and what we do–a split that could be mended. Incongruities, misconceptions, and illusions surround the prevailing rituals employed in the management of breast cancer. Perhaps doctors, rather than being a syndicate of sinister conspirators, are prey to the simpler motive of wanting to rescue and redeem; and women, in their eagerness to be saved, are willing to surrender and endure, by any means necessary.
Reassurance by Sally's doctor is emblematic. Much of the profession has mistakenly confused its best hopes for women with a prognostic and therapeutic competence that does not exist. Regrettably, there has been no significant improvement in the survival of women in 100 years,1,2 despite publication of William Stewart Halsted's 1894 paper3 heralding his results on "operations for the cure of cancer of the breast." For an entire century, the principle of the Halsted mastectomy has been the cornerstone for the management of breast malignancy, even though a review of the data reveals that mortality hardly declined between the years of 1925 and 1990.4
Science is nothing if not an attempt to let the evidence speak for itself, assume its own authority, contradict hypotheses once taken for granted, and, if necessary, remake the rules. Medicine, guided by science, takes its lead from that which is proven–if not in laboratories, then in clinical study. Its Hippocratic dictum is to do no harm, but what does this mean?
At a symposium on breast cancer in 1984, pathologist Edwin Fisher5 remarked, “Conceptual aspects of most diseases in medicine–such as breast cancer–have been notoriously rigid. Historically, practitioners have been resistant to change.” Surgeon Anaxagoras Papaioannou6,7 comments that “[a] conceptual dichotomy has thus evolved: we accept breast cancer basically as a systemic disease but we persist in treating it primarily as a locoregional problem…. [T]here are some limited, uncontrolled, but intriguing data in women with breast cancer that suggest that the less physicians do, either by surgery, irradiation, or by both, the better the patients do.”
After Sally was told she had breast cancer, she was unequivocal about what she wanted from her friends: absolute support for the decisions she was making. I had the impulse to share with her what I knew from 30 years of conversations with my father, whose specialty was breast surgery. But it was too late. She made it clear that to be her friend meant not to question her doctor’s opinions. “He’s not my congressman,” she said, “he’s my lifeline.” Agitated by fear and muddled by the conflicting opinions of experts, she was focused single-mindedly on heeding her doctor’s advice.
Breast cancer is a disease enmeshed in contentious debate. Friction does not revolve solely around techniques, but becomes heated as theoretical models diverge. Inquisitions have been held over contested portraits of reality. As irrefutably as diabetes is a medical rather than surgical problem, breast cancer wobbles across boundaries, straddling internal medicine, surgery, radiology, and oncology. Sally couldn't consider that anything other than surgical intervention would deliver and protect her from harm. In her haste to make it better, she was incapable of considering her options.
Exposing and exploring the premises that have shaped the menu of current choices is itself worthy. How breast cancer is experienced may change as the perception of it shifts. In the summer of 1989 my father traveled to France to witness the early laparoscopic cholecystectomies–removal of the gall bladder via a surgical instrument inserted into small incisions in the belly rather than the former open-abdomen operation. At present, the newer, less invasive surgery has virtually replaced the former operation, reducing patient recuperation time and expense. Continuing with the old operation (except in special circumstances) is now considered unforgivable–no surgeons could justify the more extensive procedure. Yet lumpectomy for breast cancer has still not "caught on." Marc Lippman, a renowned breast cancer researcher, says, "I am puzzled as to what combination of educational, prejudicial, financial, and historical issues have failed to get lumpectomies going…. Most [women] do not choose mastectomies…." Yet they have them anyway. The problem, he said, "is the doctors" (New York Times, May 5, 1993).
Unlike gall bladder surgery, about which there is no controversy, different postulates underlie the rationale for mastectomy and lumpectomy. Until the 1960s, breast cancer was conceived as a methodical march from a central encampment outward like a company of soldiers filing from a barracks to outlying regions via two terrains: along mountainous muscles and through marshes of lymph. Now, it is indisputably held that cancerous cells travel to distant sites (metastasize) via the bloodstream. It is also indisputable that, in a majority of women, this happens years before a tumor is or can be detected. By the time detection has occurred, either by palpation or mammography, the tumor has been germinating for approximately 10 years.
The Achilles heel is that, although cancer originates in the breast, it does not stay there: it has the potential to spread. The word "cancer" derives from the Latin word for crab because it claws and crawls into other tissue. Women do not perish from the local problem, but from the systemic one–and whether or not they do, and when, is dependent on the biological properties of the tumor: how fast and aggressively it multiplies, scatters, and infiltrates. It is now believed that individual body ecology is also a factor–the relationship between the seeds of the disease and the body-soil in which they are planted. Some think that within this tumor-host relationship, immunity is as significant as the virulence of the malignancy. To know whether the tumor has shed cells that have migrated to other parts of the body is only possible in retrospect–after there is evidence of malignant breast tissue growing in the bone, liver, lungs, or brain. No satisfactory method exists for detecting micrometastases or the trajectory of single cells that travel through the blood and lymph–some finding a home and colonizing. Even less mechanistic theories have been proposed, suggesting that genetic factors (inherited or mutagenic) cause normal cells to transform and become malignant, a process that is wholly out of reach of the surgeon's scalpel.
It is all the more baffling that, fully aware of these data, Sally's doctor assured her by saying, "We got it all." What he meant was, "I hope that you have no cells maturing in a distant site, but there is no way for me to know that. What I know is that the 1.5-cm tumor that was in your breast is no longer there, and that this would be the case whether we'd done a mastectomy or lumpectomy. The reason I did a mastectomy is to prevent local recurrence, even though I'm aware that local recurrence itself has no impact on survival and that women who have lumpectomy live just as long as those who have mastectomy. Survival depends on the systemic picture." If the horse bolted before the stable door was shut, no repair of the barn or its latch will be of consequence. Similarly, no use will come of removing more and more breast, or the chest wall, or nearby lymph tissue, if the malignant cells have taken up residence in the femur, liver, or lungs.
By now it’s well known: it is not necessary for a woman to lose her breast in an effort to save her life. Yet the majority of physicians still subscribe to the belief that mastectomy is the “gold standard,” even though they are fully cognizant of equivalent outcomes for the less invasive lumpectomy. Despite the National Cancer Institute’s (NCI) declaration in 1990 that lumpectomy followed by radiation is the preferred therapy (note 1), only 26% of diagnosed women today receive the breast-conserving lumpectomy. Most doctors advise in favor of mastectomies, and most women have them, demonstrating that data alone are not powerful enough to spur change–in medical or social practice.
In Vienna in 1848, for example, Ignaz Semmelweis discovered that women who died of puerperal fever following childbirth had been infected as a result of physicians failing to wash their hands between deliveries. Yet because of the complete entrenchment of practice, doctors not only offered up great resistance to his ideas, but fiercely ridiculed him for suggesting that respectable physicians needed to wash their hands. Even though the death rate in Semmelweis's clinic dropped immediately, it was not until Louis Pasteur presented his theory of germs 3 decades later that routine cleanliness was integrated into practice.
Thirty years after encouraging the continuation of the war in Vietnam, former Secretary of Defense Robert McNamara8 said in retrospect that we should have left sooner. What is categorical and undisputed in one epoch may be reversed in another. One hundred years after William Stewart Halsted popularized what came to be known as the Halsted mastectomy, the large majority of women diagnosed with breast cancer still undergo a modified version of his procedure even though no good data indicate that mastectomies have ever effectively resolved the wracking actuality of cancer.
This rather drastic surgical routine has flourished during the same century that ushered in the domain of science with its stringent standards of efficacy. How did a speculative hypothesis become converted into an unquestionable dogma, slipping through the net of scientific rigor and leading even the most conscientious to forsake corroboration? Somehow the correlative has been twisted and tangled into a confused web of causality, and fingers have been pointed in mistaken directions.
Scientific facts are not merely discovered–they are produced. Laboratories are not sterile environments from which subjectivity is hygienically excised, but a place where physicians immersed in their own value systems rely on conceptual models and draw on their personal experience. Medicine is as much a cultural product as a scientific endeavor. Acceptance of medical ideas hinges not solely upon evaluations of impartial evidence, but also upon social networking, political savvy, patronage, and an adherence to protocols in vogue. Medical knowledge, like any other, is contingent on the context within which it is constructed. Subject to voluminous acts of interpretation, it is a perpetual challenge to keep a keen eye on clinical efficacy.9
Recent History: Labor Pains and the Birth of a Profession
Today medical institutions have such massive weight and are so embedded within our social landscape that they appear as creations of nature, like the Rocky Mountains. It was not so long ago, however, that the medicine familiar to us today was born. To fathom how Halsted’s promulgation of the mastectomy has advanced virtually uncontested since the close of the 19th century, the ground from which it emerged must be sifted, including medical thinking in Europe toward the close of the 19th century, the cultural climate in America, the social history of American medicine, and the burgeoning professionalization of surgery. The singular influence of William Stewart Halsted himself must also be pondered.
There was tremendous activity in medicine toward the end of the 19th century. The profession of nursing originated in London after the Crimean War, when Florence Nightingale founded a school for nurses in 1860. In America the Bellevue Hospital School for Nurses was founded in 1873. By 1900 there were 432 nursing schools, and by 1910 there were 1129. The trajectory of surgery as a profession mimicked nursing: whereas Midwesterners William and Charles Mayo recorded only 54 operations in the 3 years before 1893, in 1900 they chronicled 612; by 1904 the number exceeded 1000. Wilhelm Roentgen's discovery of x-ray technology in 1895 improved diagnosis, and by 1916, with the aid of Marie Curie, new treatments were being generated as well. Furthermore, the flowering of surgery can be attributed to the discovery of ether as anesthesia, permitting operations to be performed without undue pain; the disinfectant carbolic acid, averting sepsis; and the honing of specialized skills, distinguishing expert surgical craftsmen from the less competent general practitioners.
In Vermont in 1843 there was a 50-cent fee for a doctor’s visit at less than half a mile, a $1 fee between a half and 2 miles, $1.50 for 2 to 4 miles, and $2.50 for more than 4 miles. In a 1910 survey, 96 physicians using horses reported costs that worked out to 13 cents per mile, whereas for 116 doctors using cars for which they paid less than $1000, the cost per mile was 5.6 cents. The advent of the automobile considerably widened the market.10
Although ether anesthesia was first demonstrated at the Massachusetts General Hospital in 1846, postsurgical infections caused such high mortality that major surgery was nicknamed a “capital operation.” Neither carbolic acid, needed to eliminate microorganisms during surgery, nor sterile procedures were accepted practice until much later. Joseph Lister published papers on antisepsis in 1867 and lectured for 3 hours on the subject at a medical congress in Philadelphia in 1876.11 But at the first meeting of the American Surgical Association in 1883, more speakers opposed his principles than supported them, steadfastly disregarding reports that in European hospitals that implemented his methods, postsurgical problems such as gangrene were no longer rampant. As late as 1900 most surgeries were conducted in the home because hospitals were feared as filthy, foul houses of death.
In 1847, the American Medical Association (AMA) was founded in an effort to upgrade the profession. It vowed that raising educational standards was its ticket, but it took another 60 years for its train to pull into the station. The AMA tracked the career choices of 12,400 men graduating from elite colleges between 1800 and 1850, finding that only 8% became physicians, while more than three times that many entered the clergy and legal professions. The AMA interpreted this as signifying a disdain for medicine among "educated talent."10 Confirming its suspicions, in 1880 fewer students at medical schools had bachelor's degrees than at either law or divinity schools.
In those days, medicine offered more status than wealth–doctors were a cut above manual laborers. Unable to earn a living solely by practicing medicine, doctors cultivated livestock, pulled teeth, mixed herbal preparations, nursed patients through long and difficult nights, and embalmed the dead. Throughout medical history surgeons were regarded as the least sophisticated and learned craftsmen in the guild, trained principally in the use of their hands through apprenticeship, many with barber-surgeons. A carryover of this remains in England today–internists are referred to as doctor, and surgeons as mister, denoting their lesser rank.
As for the specialty of surgery, in 1876 Samuel Gross of Philadelphia wrote his observations about the American surgical scene: “Although this paper is designed to record the achievements of American surgeons, there are, strange to say, as a separate and distinct class, no such persons among us. It is safe to affirm that there is not a medical man on this continent who devotes himself exclusively to the practice of surgery.”12 It was Gross who founded the American Surgical Association in 1880, but it would take at least another 2 decades for surgery to become established as a legitimate profession.
Although doctors were aspiring to an image of erudition, few actually completed much higher education. Both of my grandfathers graduated from Long Island College Hospital in 1914 with high school diplomas, never having attended college. Henry Beinfield went on to perform tonsillectomies, earning $900 a week while his nurse and chauffeur earned $8, whereas Harry Roster set up his own research hospital, frequently publishing in JAMA and Archives of Surgery.
Princeton sociologist Paul Starr10(pp79-80) says, "Acknowledged skills and cultural authority are to the professional classes what land and capital are to the propertied. They are the means of securing income and power. For any group, the accumulation of authority requires the resolution of at least two distinct problems. One is the internal problem of consensus; the other is the external problem of legitimacy."
In terms of consensus, physicians were struggling mightily to come to agreement about their common rules and standards. Internal divisions beset the profession from the mid-19th century until the early part of the 20th century. Concerning legitimacy, in Europe medical degrees granted deference and respect, but in America the meager educational requirements left physicians with a perilously slender margin between themselves and their patients–and sometimes no margin at all. Therefore their powers of persuasion, along with their ability to kindle feelings of confidence and trust, were critical to their success. In America a physician’s standing was tied to his own family background, as well as the social rank of his patients. At the top were men who, like William Stewart Halsted, had graduated from elite colleges, attended medical school, and received further instruction in Europe.
Although the AMA was in its infancy in the mid-19th century, it wasn’t until 50 years later–the early 1900s–that medical societies began replacing the internal dissension and competitive relationships among doctors with a brotherhood of shared interests (note 2). But in 1900 the AMA still sought to address the issue that had motivated its formation: control of medical education. This was the chosen methodology to consolidate the profession: standardized schooling would ensure both conforming ideas and uniform practice.
Reform of medical education had its beginnings in the 1870s when the Quaker merchant Johns Hopkins died in 1874, willing half of his $7 million estate to found a university, and the other half to build a hospital. At the time, this was the most substantial endowment in American history, setting the precedent for linking laboratory research with clinical patient care. The prototype came from Europe, where laboratories in physiology, chemistry, pathology, and histology were transforming hospitals. Johns Hopkins University opened in 1876, the hospital in 1889, and the medical school in 1893. Johns Hopkins School of Medicine had the hitherto unheard-of admission requirement of a Bachelor of Arts degree, and the curriculum was lengthened from 3 to 4 years. The crucial half million dollars needed to complete the school was donated by wealthy Baltimore women who made their offer contingent upon the admission of women on the same basis as men.
Johns Hopkins was the paragon of virtue in the eyes of the AMA. This single institution had–and continues to have–enormous leverage on the course of medicine. The policies ensconced there determined which institutions survived to govern the field, how they were structured and administered, and what ideology would triumph (note 3).
Finally standards were being set for medical education as graduate study, with strength in both science and clinical medicine. The next advance was creating residencies in specialized fields. Two towering giants in medicine, William Welch, a pathologist, and William Osler, an internist, were dedicated to building Johns Hopkins as the archetype for training not only physicians, but medical scientists as well. Welch, however, vied for the interests of research, whereas Osler championed the interests of clinical medicine. Osler admonished: “Care more for the individual patient than for the special features of the disease…. Put yourself in his place … enter into his feelings, scan gently his faults. The kindly word, the cheerful greeting, the sympathetic look–these the patient understands.”13 Osler further expressed concern that patient care might suffer if it became completely subservient to research, but Welch differed, determined to elevate the role of science in medicine.
Then and Now: History of Cancer of the Breast
The earliest known chronicle of breast tumors was recorded on Egyptian papyrus more than 3000 years ago, but no treatment was described. During the Middle Ages, mastectomy was wielded as torturous punishment against women accused of religious or political deviation, such as Saint Agatha, patron saint of the breast. In France in the 1820s it was hypothesized that certain personality dispositions were more prone to breast cancer than others. Women with a "bilious constitution and a dejected, melancholy character," for example, were especially predisposed.14 In Italy in 1842 Domenico Rigoni-Stern analyzed statistical data from death registries and noted that breast cancer incidence increases with age, and that unmarried women are at greater risk than married women. Judgments that blamed sad or angry women for bringing breast cancer on themselves, as well as the prescient insight that childbearing affects the incidence of this disease, both occurred more than a century ago, echoing across a historical canyon.
One of the earliest recorded nonpunitive mastectomies was performed by the accomplished surgeon James Syme (whose daughter married Joseph Lister) in a surgical amphitheater in Edinburgh in 1830. Dr John Brown, recollecting the event in 1863, tells us,
Allie stepped up on a seat, and laid herself on the table … arranged herself, gave a rapid look at James [her husband], shut her eyes … and took my hand. The operation was at once begun; it was necessarily slow; and chloroform–one of God's best gifts to his suffering children–was then unknown. The surgeon did his work. The pale face showed its pain, but was still and silent…. It is over: she is dressed, steps gently and decently down from the table, looks for James, then, turning to the surgeon and the students, she curtsies–and in a low, clear voice, begs their pardon as if she has behaved ill. The students–all of us–wept like children; the surgeon wrapped her up carefully, and resting on James and me, Allie went to her room.4
Four days later, Allie Noble developed an infection that she did not survive. Surgery was excruciating and death from it a likely possibility before anesthesia and asepsis. Interestingly, Syme also noted in 1842: "The result of operations for carcinoma when the glands are affected is almost always unsatisfactory, however perfectly they may seem to have [been] taken away. The reason for this is probably that the glands do not participate in the disease unless the system is strongly disposed to it."4(p650) British breast physician Michael Baum4(p650) comments that "[t]his statement is of great historical significance for two reasons. Firstly, it illustrates that surgeons long before the Halsted era were attempting perfect clearance of the axilla. In addition, it also illustrates that they recognized that such efforts were futile in the presence of extensive involvement, a sentiment that was ignored for a further 120 years."
Accidents in which women’s breasts were caught in the wringers of old-fashioned washing machines were commonplace. After such a mishap, women would visit their doctors because of tenderness, swelling, and pain. Probably because an existing lump was noticed following such an accident, it was supposed that breast cancer was caused by trauma. This is one early example of the coincidental being mistaken for the causative. But for the most part, before routine palpation or mammography, breast cancer was not recognized by subtle signs. Instead, women suffered from glaring complaints, such as oozing ulcerations and the malodorous weeping of distended, deformed, throbbing, eroded flesh. Such agonizing symptoms, even more than implacable death, caused the worst despair. In the context of the times, it was highly desirable to seek a cure for these symptoms–to remove the field upon which the game was played, annihilate the messenger (if not the message), and abort the short-term pain (if not the final demise). Today, though there is a pervasive amorphous panic, a spiny dread of death, in this country there are never open, rank, leaking wounds.
In the 19th century, women with breast cancer were, in a social sense, considered lepers–the disease was a disgrace as well as a medical problem. As late as the 1960s breast cancer was not publicly discussed and women did not openly volunteer that they had the disease. (It was not until 1974 that First Lady Betty Ford, by announcing her breast cancer diagnosis, pierced the public veil on the subject. President Ford did not hesitate to decide that she would have a mastectomy.)
Perhaps what Halsted meant when he promised to “cure” carcinoma of the breast was to remove the immediate and recurring misery, not the disease itself or its eventual outcome. In Halsted’s paper, published in 1894, he acknowledges: “The efficiency of an operation is measured truer in terms of local recurrence than of ultimate cure.”3(p302) But Halsted’s zealous victory over local recurrence assumed a life of its own and later followers confused elimination of symptoms with a remedy for the disease. In the urgency to effect an absolute cure, progressively more and more tissue was expunged in an attempt to avert “recurrence.” The concept that removing the breast would erase the disease was irresistibly seductive. It is useful to trace the intellectual origins of this theory.
A Short History of Ideas: Virchow’s Influence
Tuberculosis, the sovereign disease of the 19th century, was the leading cause of death, as feared as it was widespread. In Europe, the work of the German physician Rudolph Virchow (1821-1902), the father of “cellular pathology,” advanced medical knowledge. His contributions were substantial; for example, he identified leukemia in 1845 and in 1846 articulated the process by which blood clots become obstructive. At a time when medical focus was narrowed to the courses of particular diseases, Virchow both broadened and magnified the lens by gazing into the nature of specific pathophysiological processes. He mapped the tissue reactions of atrophy, hypertrophy, inflammation, embolism, necrosis, tuberculosis, cancer, fibrosis, and calcification. Many of Virchow’s concepts have withstood the test of decades, but a few of his ideas were off course. Because of his immense stature, however, his faulty conclusions were also fully embraced and perhaps disproportionately influential.
Virchow proclaimed the tissue changes characteristic of tuberculosis as emblematic for the disease process in general, and cancer in particular. His revolutionary biological model of breast cancer professed that tumors arose within the skin, rather than as a systemic disorder, invading locally and centrifugally in all directions, spreading along the planes of muscles and through lymphatic channels. Furthermore, Virchow thought that the lymph nodes under the arms acted like filters, blocking the spread of the disease to the organs and skeleton. If the tumor burden penetrated the lymphatic defenses, then the disease progressed in an orderly manner from the center outward to the chest, trunk, upper arms, and thighs.
Virchow was not a clinician. He did not engage in the care of patients, instead focusing solely on tissue reactions in the lab. His positive disdain for clinical evidence became an intellectual trend. A tacit reverence for and acceptance of Virchow’s theory that the lymph is the highway of the cancerous process persist today, though we know that metastases require blood supply (angiogenesis) and also travel through the circulatory system to distant (metastatic) sites.
The Legacy of William Stewart Halsted
Both Virchow and Halsted were men of monumental achievement. Just as Virchow was credited as the most influential early figure in German medicine, so Halsted occupies that position in American surgery. More than any other physician, Halsted was personally instrumental in the genesis and rise of the specialty of surgery. First, he performed operations that only highly trained specialists could duplicate; second, he transformed surgical education by establishing a residency program in surgery, overturning a hierarchy in medicine that had endured for centuries in both Europe and America. Halsted singularly hoisted surgeons to the pinnacle of the social caste of medicine.
In 1852, when Halsted was born, his family owned the textile import firm of Halsted, Haines and Company (note 4). Halsted attended boarding school at age 10, graduated from Phillips Andover, and joined the Yale Class of 1874. He then entered the College of Physicians and Surgeons in New York (which was to become the Columbia School of Medicine) for the customary 3 years, interned at Bellevue in 1876 during medical school, then in 1878 studied for 2 years in the illustrious medical centers of Vienna and Germany. After returning from abroad, Halsted put Virchow's theories into practice, performing operations that removed the entire lymphatic and muscular field surrounding carcinoma. The golden rule for the management of breast cancer hence became the Halsted radical mastectomy.
Halsted introduced techniques and set standards that are now customary, but which at that time were startling surgical innovations–namely, radical en bloc removal of the breast; hernia repair; refined thyroidectomy and intestinal anastomosis operations; a completely bloodless operating field and uncompromising sterility; careful, meticulous, anatomically precise surgical dissection that minimized undue trauma to surrounding tissue; direct blood transfusion; and fastidious closure of the wound, layer by layer, with silk sutures.11(pp386-421) When Halsted’s operating room nurse and soon-to-be wife, Caroline Hampton, developed a rash from handling irritating solutions of mercuric chloride, Halsted wrote to Goodyear Rubber and requested that they produce an experimental pair of thin rubber gloves. On trial, they were so successful that more were ordered, and now no surgery can be imagined without them. (Although Halsted was neither the first surgeon to perform a mastectomy nor the first to use rubber gloves, because he popularized them in America it is he who is given credit for them [note 5].)
Halsted was surgeon-in-chief and professor of surgery at Johns Hopkins at the time the medical school opened in 1893. Having personally observed the European medical nobility (Virchow, Billroth, Kocher, von Volkmann), he emulated them, hitching the pathology laboratory to the surgical theater, splicing science with clinical practice. Reproducing the best of what he had witnessed a dozen years before, Halsted created the first and foremost surgical residency program in America, directing it for 3 decades. The seeds of his philosophy were sown deep, far, and wide–his residents initiated top-notch residency programs across the country, graduating 166 chief residents who bred successive generations of surgeons. Halsted also trained more than 50 teachers–among them men who became professors of surgery at Harvard, Stanford, Yale, Johns Hopkins, Cornell, Pittsburgh, Cincinnati, Virginia, and other exceptional schools of medicine. This group produced a second generation of 139 teachers of various ranks, influencing a prestigious and vast swath across the geographical landscape of medical education.15 They proceeded to teach others, ensuring that Halsted’s views were so broadly disseminated that they became the official guideposts and doctrine of the surgical world.
The early 1880s, a decade before Johns Hopkins Medical School commenced, were productive and prolific for Halsted. He published 20 scientific papers, lectured in anatomy at his alma mater, became an associate in a surgical practice at Roosevelt Hospital, and set up the outpatient clinic there. But by 1885 this had changed and Halsted’s ability to deliver lectures as well as his attendance at professional meetings dramatically waned. Although a well-kept secret at the time (which wasn’t confirmed conclusively until 1969, when the diary of William Osler was unlocked and disclosed), Halsted’s study in Europe had launched him into a cocaine and morphine addiction that was to last the
rest of his life. Halsted’s dependence began in Vienna in 1884, when ophthalmology resident Karl Koller discovered that a few drops of cocaine numbed the surface of the eye. This discovery led to the use of local anesthesia and, curiously enough in the light of history, was proposed by none other than Koller’s friend, Sigmund Freud, then a 28-year-old neurologist (note 6).
Both Freud and Halsted, inspired by Koller, undertook their own investigations. In 1884, Halsted began injecting this remarkable substance into himself and his colleagues to determine its effect in blocking nerve conduction (note 7). From this time forward, Halsted struggled with a successfully clandestine yet sometimes debilitating addiction that profoundly altered his personality, yet never eclipsed his medical life. Welch, the renowned Johns Hopkins pathologist and Halsted’s dutiful friend, took him sailing on a 2-month voyage through the Caribbean in the winter of 1886, hoping to correct his habit. But Halsted was admitted to Butler Hospital in Providence for 7 months later that same year, and for 9 months in 1889. Halsted’s addiction effectively terminated his career in New York. Again Welch rescued him by inviting him to Baltimore and securing him an appointment at Johns Hopkins.
William Osler, Welch’s partner in shaping the medical school as well as its first professor of medicine, regarded as the most eminent clinician of his time, entered in his diary that 6 months after Halsted had been awarded his full position at Johns Hopkins, he saw him in a severe chill, realizing that he was still taking morphia. Having gained one another’s confidence, they discussed that Halsted had never been able to reduce the amount to less than three grains daily (one grain equals about 60 mg). Osler also recorded that he did not think anyone suspected Halsted’s habit–not even Welch, who assumed the addiction had been conquered. Later Osler added that in 1898 Halsted reduced his dose to 1½ grains–nine times the standard 10 mg of morphine prescribed for severe pain today. Halsted permitted the popular deception to persist that he had been “cured” after his second hospitalization; in the public eye, he was clean. His close friends, however, noted that the socially exuberant extrovert who had studied in Europe had returned strangely altered.
Halsted’s distinguished resident, Harvey Cushing–the progenitor of neurosurgery, the chief of surgery at Harvard, and the man for whom Cushing’s Disease was named–knew Halsted only after his temperamental shift. Upon Halsted’s death in 1922, Cushing eulogized his mentor (Yale Alumni Weekly. February 23, 1923), regarded by many as the most eminent surgeon of his time:
- [Halsted] was a man of unique personality, shy, something of a recluse, fastidious in his tastes and in his friendships, … the victim of indifferent health, he nevertheless … may be considered to have established a school of surgery comparable, in a sense, to the school of Billroth in Vienna…. [A]n aristocrat in his breeding, scholarly in his habits … having little interest in private practice, he spent his medical life avoiding patients…. A bed-to-bed ward visit was almost an impossibility for him. If he were interested he would spend an interminable time over a single patient, … carrying the problem to the laboratory and perhaps working on it for weeks.
Halsted’s lack of interest in his patients as people was reminiscent of the heroic Virchow. He was scrupulous and painstaking in the surgery itself, yet harbored an aversion for interaction as a form of caring for his patients. At the same time, he fashioned himself as their savior. Most significantly, the complicated radical mastectomy launched surgeons on a trajectory of prestigious professional accomplishment. Because of the anatomical and technical prowess required, in 1898 surgeon Frederick Gerrish16 said of Halsted’s radical mastectomy: “We now have an operation which should be regarded as unjustifiable for the general practitioner.”
Virchow and Halsted were uncommonly devoted medical scholars and sleuths. Hooked on deciphering pathological mysteries, the interest of science was their priority. Surgery afforded the chance for live dissection, an occasion immensely more instructive than the scrutiny of cadavers. Throughout surgical history, peeking within the pulsing inner sanctum yielded scholarly returns, even when there appeared to be no profit for the patient.
Advances in knowledge sometimes occur in the absence of therapeutic gain–the interests of clinicians and researchers are interdependent, but not necessarily identical. Lithographs of Halsted’s early mastectomies illustrate exceptional textbook learning opportunities, showing the skin vividly peeled back from the chest wall, exposing the vast web of glands and vessels. On the other hand, women were left with a large, open chest wound thick with clots that sometimes took months to heal. Halsted defined success by the tissue samples gleaned and the perfection of the technique employed. Ultimately, however, contrary to concurrent insights, he believed in Virchow’s notion that cancer spread to muscles via lymph.
As late as 1907, in a follow-up paper titled “The Results of Radical Operations for the Cure of Cancer of the Breast,” Halsted17 echoed Virchow’s flawed theory, writing:
- I recall … cases … in which general metastasis was believed, erroneously, I think, to have occurred by way of the bloodvessels [sic]…. We believe, with Handley, that cancer of the breast, in spreading centrifugally … before involving the viscera may become widely diffused along surface planes…. It permeates to the bone rather than metastasizes to it, and, by way of the lymphatics, along fascial planes … the liver may be invaded by way of the deep fascia … the brain by the lymphatics accompanying the middle meningeal artery…. Though the area of disease extends from cranium to knee, breast cancer in the broad sense is a local affection … invariably by process of lymphatic permeation, and not embolic by way of the blood. If extension, the most rapid, takes place beneath the skin along the fascial planes, we must remove not only a very large amount of skin and a much larger area of subcutaneous fat and fascia, but also strip the sheaths from the upper part of the rectus, the serratus magnus, the subscapularis, and, at times, from parts of the latissimus dorsi and the teres major. Both pectoral muscles are, of course, removed. A part of the chest wall should, I believe, be excised in certain cases, the surgeon bearing in mind always that he is dealing with lymphatic, and not blood, metastases…. It must be our endeavor to trace more definitely the routes traveled in the metastases to bone, particularly to the humerus, for it is even possible in case of involvement of this bone that amputation of the shoulder joint, plus a proper removal of the soft parts, might eradicate this disease…. So, too … amputation at the hip joint may seem indicated.
Halsted proposed the notion that more is better, suggesting the removal of the sheath covering all muscles surrounding the breast, the upper part of the abdominal muscle that extends from the rib cage to the pubis, those that control the motion of the shoulder blade and rotate the arm, and, in some cases, removal of the arm and hip as well. Halsted’s hypothesis is captured above: to contain the disease it may be necessary to excise all contiguous areas. Of particular note is that this flawed logic persists today. Cancer continues to be treated more like dry rot in the rafters of a house than microbes in a river.
In 1886 Rudolph Matas (1860-1957), founder of the Tulane School of Medicine and the father of vascular surgery, visited Paris and observed breast operations there. Later, in 1898, Matas18 followed Halsted’s protocol, but remarked,
But if we were to follow this principle of prophylactic extirpation to its legitimate and logical conclusions we would be compelled to control part of the vascular (venous) channels which drain the region, as these are just as likely to serve as avenues of dissemination as the lymph tracts. The impracticability of such a proposition is so grossly apparent that it would be absurd even to refer to it were it not that it demonstrates how imperfect and limited are our surgical resources to cope with this illusive [sic] and far-reaching evil. The new operation will unquestionably greatly diminish the probability of local recurrence, but the patients will die, as a rule, just as quickly by regional and internal metastases as if a superficial operation had been performed.
It was common sense to Matas that cancer was as likely to spread via the blood vessels as via the lymphatic channels, and that if it had disseminated, no amount of local management would be sufficient. He comments that within the abiding logic, all the blood vessels must be removed, along with the lymphatic channels–a patently infeasible process. Although the observations expressed by Matas cast Halsted’s model into doubt, the two were close personal friends. Because Halsted was his senior, Matas never crossed him.
Medical Veracity: Authority vs. Standard of Proof
Early in the 1900s it was popular to employ gold salts in the treatment of tuberculosis and arthritis. Not until 1924 was a critical experiment undertaken in which, out of 24 people with similar disease, 12 received gold salts and 12 received distilled water. Those receiving the gold fared worse than did the untreated (control) group. Commenting on the experiment in retrospect, Harry Dowling said it was noteworthy because it introduced the notion of controlled therapeutic trials to eliminate false claims of efficacy. In addition, Dowling19 contended the following: “The lesson was long overdue. If every therapeutic agent advocated for an infectious disease since 1900 could have been studied as rigorously, the medical profession would have fewer remedies, but the patients would have been exposed to less discomfort and danger, the community would have had less expense, and fewer patients would have died.”
An experiment similar to the test for the efficacy of gold salts has yet to be undertaken for women being treated for breast cancer. Physicians sometimes issue proclamations that appear more like sacred doctrine than secular investigations. Reflecting on the shift of belief from religion to science in the 19th century, philosopher Søren Kierkegaard noted that visits with priests were being replaced by appointments with doctors. It was they who were deciding who was crazy or sane, sick or well, who should serve in the army and who should not. By determining how people are born and die; by naming disease; by interpreting feelings, behaviors, signs, and symptoms; and by issuing prognoses, doctors assume immense authority.
Authority by nature commands obedience. Medicine acquires cultural authority by dictating definitions of reality and forwarding judgments about which schema of meaning will triumph as valid. It is ironic that in an attempt to implement scientific advances, verification is sometimes ignored and the principles of science are set aside. At times the mere newness of a technology is taken as evidence of its superiority. An intrinsic contradiction in medicine also exists: because solutions are often fragmentary and incomplete–sometimes merely analytical and speculative–doctors try to avoid saying “we know,” yet they must act as though they do! There is a grand expectation on the part of patients for deliberate, confident action to relieve suffering.
It is superbly American to, as the Nike advertisements exhort, “Just Do It.” The preference for intervention over reflection is codified by tradition and practice; doctors charge higher fees for performing procedures than for cognitive services. Within this environment, in which deeds are valued more than deliberation, certainty more esteemed than doubt, an inexorable faith in future progress also exists.20 It is paradoxical that a blind faith in reason sometimes supersedes the doctrine of proof.
Medical sociologist Paul Starr10(p55) comments that, in the beginning of the 19th century, “[t]he early empirical investigations showed that accepted techniques [like bloodletting] had no therapeutic value, yet there were no effective alternatives available to replace them.” This bears surprising resemblance to the use of mastectomy–there is little evidence to validate its use when compared with lumpectomy, yet in a vacuum of viable alternatives it persists, because at least it is something that we can do. Surgeons are loyal to the Nike mentality, even if they wear Gucci on their feet. They are an athletic, action-oriented guild. When Halsted championed mastectomy as “the operation for the cure of cancer of the breast,” he did not attempt anything less than complete conquest–a total solution. As captain of both the Yale boxing and football teams, he was nothing if not a man of action, preferring definitive solutions over thorny dilemmas.
A Chronicle of Breast Conservation vs. Removal
The breast-conserving approach to the management of breast cancer is understood as the excision of the tumor itself, the lump, and a small margin of surrounding tissue, but not the entire breast. This is now called lumpectomy. The challenge to Halsted’s teachings met with a legacy of disregard and disrespect within devout surgical gatherings.
Just as Semmelweis had been ridiculed for suggesting hygiene in childbirth decades earlier, so Sir Geoffrey Keynes of Britain was scorned when he introduced the breast-conserving tumorectomy with a radium needle insertion in the 1930s. Five-year survival rates were similar to those of Halsted’s mastectomy, but Keynes was greeted with profound contempt during his lecture tour in the United States. Twenty years later, failure to adhere to surgical dogma was again punished when Scotsman Robert McWhirter spoke at a meeting of the American College of Surgeons. McWhirter suggested replacing radical mastectomy (removal of the breast, pectoral muscles, and lymph nodes) with what is now called a simple or total mastectomy (removal of the breast, leaving muscles and nodes) accompanied by radiation, and thousands of physicians thunderously booed him off the stage. McWhirter was not even challenging the conceptual model–merely simplifying the surgical procedure. Today the modified radical mastectomy (introduced by Patey and Dyson in England in the 1940s) consists of removal of the breast and nodes, leaving the pectoral muscles intact.
In the late 1940s, after attending Yale (Halsted’s alma mater) as an undergraduate and completing medical school, my father, Malcolm Beinfield, did a surgical residency at Harlem Hospital. Harlem housed several prodigious masters of surgery at the time. However, unlike their counterparts at the Mayo
Clinic, Memorial Sloan-Kettering, or Presbyterian hospitals–all
of whom were part of the grand establishment of medicine, replete with highly endowed funding for the best and most advanced research–Harlem depended on old-fashioned empirical observation and pragmatic experience. The surgeons at Harlem questioned the logic of mastectomy for their patients with breast cancer. It was not until 1948 that Harlem’s Louis Wright became the first black surgeon admitted to the American College of Surgeons. Perhaps the forming of independent clinical judgments was facilitated by his status as an outsider.
Joining the clinical faculty at Yale in the 1950s, my father witnessed Drs Ira Goldenberg and Leonard Prosnitz in the 1960s perform lumpectomies followed by radiation therapy. In 1964 he heard George Crile describe animal experiments that refuted the teachings of Virchow and Halsted: cancer cells did not spread predictably, lymph nodes did not act as filters, and access to vital organs occurred via the bloodstream as well as the lymph. In a 1955 article called “Common Sense in Cancer,” Crile21 warned against super-radical attempts to accomplish the impossible. He noted that for many surgeons, the presence of cancer justified anything that they elected to do: “They do not admit that attempts to cure incurable cancers usually do harm. Fear of cancer should not be exploited. Surgeons should not subject patients to useless operations in cancer’s name…. This is not the solution of a problem, it is the definition of one…. When we cannot cure, we must be careful that at least we do no harm.”
Yet at the same time that Crile was rethinking the model and suggesting a less drastic surgical intervention, Owen Wangensteen, himself a surgeon of great distinction at the University of Minnesota, submitted that the reason the Halsted mastectomy did not produce better results was that it was not radical enough. Wangensteen proposed what he called a super-radical mastectomy, removing not only the pectoral muscles and lymph nodes of the breast and underarm, but the nodes adjacent to the sternum as well as a portion of the first rib and collarbone. It was necessary for him to saw through and split the sternum to excise the lymph nodes in the space around the heart. This brutal surgery required at least several weeks of hospitalization, and a number of women did not survive. To Wangensteen’s credit, he noted his rather poor results, reported the operative deaths, and terminated the use of this procedure. He erroneously thought, however, that his “failure may have been in the execution of the concept rather than in the concept itself.”22
In the early 1950s, Wangensteen’s contemporary, Jerome Urban at Memorial Sloan-Kettering, excised a sizable portion of the chest wall in order to reach the internal mammary lymph nodes. Bypassing mortality, Urban performed a comprehensive calisthenic surgery (removing more tissue than anybody else) without any proven gain. Wangensteen and Urban were both clinical investigators whose approach to medicine appeared to regard patients primarily as experimental subjects.
In contrast to the super-radical mastectomy, Crile’s arguments began to be echoed by brothers Bernard and Edwin Fisher, who in 1958 began studies that were to culminate in the genesis of the school of “biological determinism”–meaning that the outcome of treatment was predetermined by the biology of a systemic disease process. Unlike many of his predecessors, Bernard Fisher was a pioneer in the application of clinical research methodology, establishing the importance of prospective randomized studies, which have now become the standard. Prospective means preplanned and randomized means selected by chance (such as every other chart). Through 23 clinical trials with thousands of women over decades, Fisher23 clearly established that mastectomy had no survival advantage over lumpectomy with radiation in women with a tumor size that conformed to the criteria of the study: 4 cm or less.
It was my father’s medical school roommate, Nathaniel Berlin, clinical director of the National Cancer Institute (NCI) through the 1960s and chairman of the NCI Breast Cancer Task Force until 1975, who secured funding for Fisher’s studies after Congress passed the National Cancer Act of 1971. The climate was such that Fisher was unable to recruit enough American surgeons into the study–they were unwilling to venture beyond the conformity of ideas and established standards of practice, though Canadian physicians were willing. The atmosphere surrounding the clinical selection of lumpectomy over mastectomy remained charged well into the ’80s.
My father performed his first lumpectomy in 1978, but not without derision from his colleagues. On occasion, the women he treated would request a second opinion from another surgeon. If a woman had metastatic disease–sometimes years following a lumpectomy–one colleague of my father’s insinuated that had the woman come to him (rather than my father), he would have done the proper operation (mastectomy) and cured her, thus proving he was able to “get it all.”
By modern standards, Halsted’s studies were sloppy and unkempt. This is not completely incomprehensible, though, because his landmark paper proclaiming “operations for the cure of cancer of the breast” was based on research between 1889 and 1894, the same period that his addiction plagued him so heavily. For the bulk of 1889 he was even hospitalized in Providence. Although Halsted’s study covered the period between June 1889 and January 1894, he mistakenly included women in his report from March 1894, three months after the study was closed. Halsted3 stated: “Local recurrence is a return of the disease in the field of operation in the apparent or buried scar.” Yet under the heading of women without local recurrence, he included those who recurred on their scar, contradicting himself. He focused on local recurrence, not survival, and tracked the women he saw for 3 years or less. Out of 50 cases, only 3 women were followed and found to be alive 3 years later. Eighteen were followed for less than 2 years, and 43 were followed for less than 3 years. If lumpectomy studies showed anything less than a 5-year survival, they would have been regarded as statistically laughable. But due to Halsted’s authority and the ideological loyalty he inspired, his research methodology and results, though poor, never seemed to deter multitudes of followers.
One hundred years later, a double standard still remains. Lumpectomies are held to rigorous standards of efficacy, whereas mastectomies have never been subjected to anything close to the same requirements. A recent scandal has also clouded clear thinking. In 1994 Bernard Fisher, professor of surgery at the University of Pittsburgh, was ousted from his chair of the National Surgical Adjuvant Breast and Bowel Project (NSABP) because an investigator from Montreal, Roger Poisson, committed acts of scientific misconduct on Fisher’s watch. Poisson altered surgical biopsy dates for 6 patients so they would be eligible within the Protocol B-06 requirements. His actions, irresponsible because of the deceit involved, did not, however, affect the end results. All 354 patients at his hospital were eliminated from the total group of 2163 women by subsequent auditors, and adequate numbers remained to assure overall credibility for the study, which covered the period between 1976 and 1984.24,25
There was, however, public alarm and a breach of trust over this incident. Even though no patient’s welfare was compromised, and no research outcomes were altered, the safety of lumpectomies was thrown into question by newspaper headlines that did not fully explain the nature of the error, possibly setting back use of this breast-conserving procedure. Now extensive reviews of Fisher’s data have been published, confirming the original conclusions–namely, that mastectomy, lumpectomy, and lumpectomy with radiation provide comparable survival advantage.26
Outmoded Ideas and Practices
It is becoming clear that the Halsted mastectomy was based on an outdated model of breast cancer. Fisher27 revised the model after years of clinical trials, concluding that
“cancer is a systemic disease involving a complex spectrum of host-tumor interrelations and that variations in local-regional therapy are unlikely to substantially affect survival. All of the findings … did not conform to the concepts that served as the basis for the principles of the Halstedian hypothesis but, rather, provided a matrix for the formulation of an alternative thesis, which is biologic, rather than anatomic and mechanistic, in concept. Its components are completely antithetical to those of the Halstedian thesis.”
Fisher further clarified some misconceptions regarding who is eligible for lumpectomy. Tumor size or location does not preclude saving the breast by use of lumpectomy. Large tumors can often be shrunk by preoperative chemotherapy. Women with lymph nodes that are found to have (positive) or not have (negative) cancerous cells are equally eligible. Age is also not a factor–lumpectomy is equally appropriate for older and younger women. Finally, there is the issue of patient choice, and a woman’s preference for mastectomy. To this Fisher27 says, “Patient autonomy will not be compromised and paternalism will not be resurrected if physicians firmly inform patients that, in almost all cases based on current knowledge, mastectomy is no longer justifiable, and lumpectomy followed by breast irradiation will not put them at greater risk of developing systemic disease or of dying than mastectomy would.” Fisher’s reanalysis and results were published in a 1995 report. He found that upon evaluation of three treatments (simple mastectomy, lumpectomy with irradiation, and lumpectomy alone), an average of 60% of patients were alive after 12 years and about 50% had no tangible signs of disease.26
To account for the discrepancy between the research supporting lumpectomy and the persistence of its lack of use, Harvard professor of surgery William Silen28 laments the replacement of data by dogma. “One of the best examples of this,” Silen says, “is the use of the Halsted radical mastectomy for breast cancer.” He identifies several problems, beginning with residency training when the young doctor is indoctrinated into managing situations in the “usual manner because that’s the way we’ve always done it. Such normative behavior is expected to occur automatically and without question.” He continues: “Beyond the period of training, surgical practice is strongly influenced by the leaders of the profession who are not always meticulously scrupulous in attention to the validity of the material they publish.” He chastises the profession to more accurately assess the outcomes of what it does.28
Although remuneration for mastectomy is more than triple that of lumpectomy, financial motives do not account for the hegemony of this procedure. Habits and tradition assume an authority of their own. Is it reasonable to liken surgeons, men or women, to the tribal Africans who perform clitorectomies with the unshakable conviction that they are acting in the best interest of the woman? In both instances, what is best for the woman is associated with maintaining conformity with an outmoded belief. It is neither the women nor the doctors who are to blame; both come to the matter with honorable intentions. Cultural forces conspire: professional recommendations conflict, an irrational fear of keeping the breast is planted in women, and mastectomy constitutes a conclusive sacrificial act that permits women to feel as though they are doing everything they can.
Mastectomy itself is not difficult, nor does it constitute a serious risk. Perhaps it even serves as a form of penance for women who unconsciously feel that they have been bad enough, or foolish enough, to have contracted the disease in the first place. It appears to be the very least they can do to neutralize the offending body part, to cast it, along with some small measure of their fear, aside. Upon encountering the dreaded words of the doctor, “I’m sorry, the mass is malignant,” a woman can be overcome by waves of shock, succeeded by an avalanche of terror, followed by the resolve to beat this disease. It is not uncommon for a woman to respond with offensive resolve, asserting, “I want it out.”
Yet in 1951 Scotsman Wallace Park and Englishman James Lees29 theorized that treatment has little, if any, influence on the natural history of the disease, maintaining that the type of tumor and its biology are determinant.29,30 It is curious that in Europe there has been less resistance to this view than in America. There has yet to be a modern prospective randomized trial of how women fare with and without treatment.
There was, however, a unique study of 250 untreated women between 1805 and 1933 at Middlesex Hospital in London, where women were diagnosed without benefit of mammography–only by the naked eye or palpation of a mass. Middlesex, founded as a hospice for cancer patients in 1792, housed only women with extensive and measurable disease. According to an analysis comparing the Middlesex patients with those treated by Halsted, the untreated women did about as well as those who received the Halsted mastectomy between 1889 and 1933 (note 8). Many did as well as women today who obtain the most advanced therapies.
Highly regarded medical oncologist Craig Henderson, formerly of Harvard’s Dana Farber Cancer Institute and the University of California-San Francisco’s Breast Cancer Center, uses the Middlesex patients as an example. “The median survival time of the untreated patients was 2.7 years,” he says, “and several patients lived almost two decades without treatment after the first symptom or sign of cancer in the breast. The survival of treated patients in the earliest radical mastectomy series was not very different … [and] strikingly similar to that of this subset of American patients with apparently aggressive disease whose tumors were diagnosed and treated more than a century later” (note 9).
Nancy Evans of Breast Cancer Action–herself diagnosed with breast cancer 7 years ago–points out that, as with people diagnosed with HIV, it is unlikely that breast cancer is a curable disease, despite the reality that many women live with it a long time, dying finally of other causes (note 10). In patients followed for 20 to 30 years after initial diagnosis and treatment, 75% to 85% showed some evidence of tumor persistence at the time of death.31 Although “the earliest possible diagnosis” is sometimes helpful, it is not necessarily so.
The notion of a “cure” can be misleading, implying that we are fixed, inoculated against death, our existential state of impermanence magically remedied. There are several different medical uses of the term “cure.” Clinical cure refers to a 10-year period in which there are no known symptoms and no known recurrence. Statistical cure means that a woman diagnosed with breast disease has the same relative survival chances as does the normal population–even though she may die “of” or “with” her tumor. Biological cure means that there is no evidence of malignancy at autopsy (verbal communication, Nat Berlin, MD, February 1996).