Author: Luke Dittrich
And then he came across Egas Moniz's monograph describing Moniz's first experiments with leucotomy. Immediately Freeman knew what he wanted to do, and just a few months later, on September 16, 1936, in collaboration with his neurosurgeon partner, James Watts, Freeman oversaw the first leucotomy on U.S. soil, on an anxious, insomniac housewife from Topeka, Kansas.
In 1939, Moniz was forced into a temporary professional hiatus after one of his former patients confronted him in his home and shot him through the spine. By that time, Freeman and Watts had already overtaken him as the most prolific practitioners of psychiatric surgery. Freeman had also given the new field a name, psychosurgery, and developed what he considered to be an improvement on Moniz's technique, replacing the wire-based leucotome with a scalpel and having Watts make his approach through the sides of the skull rather than taking Moniz's somewhat messier route through the forehead. Freeman described this as his “precision method” and also tweaked the name of the operation, replacing the Greek root leuco with the Greek root lobo, meaning “of the lobes.”
Lobotomy.
That Halloween morning, as Freeman stared down into his fifty-second lobotomy patient's eyes and heard her mistake him for the publishing magnate William Randolph Hearst, his career was ascendant. He was a young, charismatic doctor wielding an exciting new treatment for ancient maladies. Soon he would be more famous than his grandfather ever was. His growing celebrity would be helped along by a number of fawning profiles in major publications (a few of which, coincidentally, were owned by Mr. Hearst). The first New York Times piece about Freeman, which ran on June 7, 1937, carried the headline “Surgery Used on the Soul-Sick” and gushed about his “new surgical technique, known as ‘psycho-surgery,’ which, it is claimed, cuts away sick parts of the human personality, and transforms wild animals into gentle creatures in the course of a few hours.”
By 1939, the age of the lobotomy had arrived, and Freeman was its most fervent evangelist. He had been performing the operations at a frenetic pace, traveling around the country giving demonstrations to scores of curious physicians, who in many cases soon began eagerly trying it on their own patients. He was also preparing for the publication of his first book. The book, Psychosurgery, was as much a call to arms as it was a medical text, and it centered on a simple yet revolutionary thesis:
“In the past,” Freeman wrote, “it's been considered that if a person does not think clearly and correctly, it is because he doesn't have ‘brains enough.’ It is our intention to show that under certain circumstances, an individual can think more clearly and more productively with less brain in actual operation.”
He was glad she didn't recognize him.
If Patient 52 had remained lucid and continued to answer his questions correctly, if her voice had not suddenly been drained of emotion, if her eyes had remained sharp and inquisitive, if she had continued to sing that song, he would have told his colleague to keep cutting deeper, farther. He would have told him to cut until she became confused, until her thoughts became muddy and her personality ebbed away. This is why he kept his patients conscious during the operation: He wanted to make sure that their brains were receiving sufficient damage. He described this operative strategy as his “disorientation yardstick”; he explained that “impairment of memory, confusion, and disorientation usually come on within a few seconds to a few minutes after the fourth quadrant is sectioned” and that “when this disorientation occurs on the operating table we are satisfied that an effective operation has been performed. If the patient is still alert, oriented, and responsive, it is our custom to extend the incision into the upper and lower quadrants for fear that the relief obtained by the operation may be insufficient.” In general, he adhered to a simple strategy: “The best method, of course, is just to cut until the patient becomes confused.”
The trick, though, was to cut enough of the patient's brain to cause a state of confusion but not so much that the patient died or was permanently incapacitated. In successful operations, patients would become immediately disoriented, and perhaps incontinent, but would then over a period of weeks or months recover a measure of lucidity, recognizing those around them and remembering their pasts before the operation. They wouldn't be the same, though. That was the whole point, after all. To do with a few swipes of a blade what years on an analyst's couch, or in an asylum's cell, failed to accomplish.
Freeman expended a great deal of energy trying to gather evidence for these beneficial changes. He kept meticulous records and was diligent about staying in touch with all of his patients, monitoring their progress. He looked for signs of improvement everywhere. Sometimes he saw evidence where others didn't. For example, he always photographed his patients before their operations, then at some interval afterward. He developed the negatives himself and spent time peering into their pre- and postoperative eyes, reading them like tea leaves. In papers and presentations, he liked to point out how the eyes of most of his female patients looked notably more fearful and anxious preoperatively. (He failed to attribute this to the fact that he almost always photographed the women naked for the preoperative pictures, whereas for the postoperative ones he almost always photographed them fully clothed.)
He kept tabs on everything. Regarding postoperative eating habits, he found that there was “a high correlation between improvement and gain in weight” and noted approvingly that one woman, Patient 53, more than doubled her weight in the months after operation, from 85 to 210 pounds. Regarding postoperative personal grooming, he admitted that even his prying eyes were not “sufficiently acquainted with the mysteries of the boudoir to know just what happens following operation in regard to cosmetics, creams, lotions, rouge, lipstick, perfume, and the rest.” But he was confident that most of his patients, after their lobotomies, “have again been able to pay some attention to their personal appearance,” while preoperatively they “did not resort much to this socially acceptable feminine activity.”
This idea of social acceptability was key to what Freeman hoped the lobotomy would achieve. In his view, the world was full of social misfits. Some were plain to see: the hopeless cases caterwauling in the back wards of asylums, the disheveled vagrants wandering the streets muttering to themselves. The majority, though, were less obvious. And although his initial focus was on the extreme cases, he had started to perform lobotomies on people suffering in much more subtle ways: the housewife who displayed “affective incontinence” and descended into crying jags every afternoon, the premature spinster “gradually drifting into seclusiveness,” the obsessive-compulsive who washed his hands “so excessively that the skin was dry, rough, and cracking,” boys and girls who had a tendency to misbehave, throw excessive tantrums, or display an excessive preoccupation with masturbation. They spanned all ages: Freeman's youngest patient was seven years old, his oldest seventy-two. All of these misfits, old and young alike, leading lives of quiet or not so quiet desperation, hungry for relief.
Freeman didn't pretend to understand what, exactly, his lobotomies were doing to his patients. He knew that the swipes of the scalpel were severing many of the connections between the frontal lobes and the brain's deeper structures. And he knew that the frontal lobes were important. (The evolution of Homo sapiens from lower orders of anthropoids can be distilled down to the fact that our simian ancestors have much smaller frontal lobes than we do. The frontal lobes, then, must be crucial to humanity itself.) One of Freeman's contemporaries, the anthropologist Frederick Tilney, expressed what was then the prevailing view when he described the frontal lobes as “the accumulator of experience, the director of behavior, and the instigator of progress.”
But although their importance was unquestioned, their precise function was still a mystery. There were many different theories. The one Freeman was most drawn to imagined the frontal lobes as the rough physical analogue to Freud's concept of the superego. The emotional, animalistic impulses of the id originated in the deepest structures of the brain and radiated outward through the intermediate structures, which gave rise to the self (the ego) before reaching the frontal lobes, where emotions were processed and interpreted and reflected upon and controlled. In a functioning brain, the frontal lobes acted as a regulatory body. A feeling of profound sadness might rise up from the lower structures, and the frontal lobes would allow this sadness to be experienced fully, for a period of time, before tamping it down and cutting it off. In a dysfunctional brain, however, the frontal lobes might lock that feeling of sadness (or of fearfulness, paranoia, shyness, et cetera) into an unending cycle, or downward spiral, creating a neurosis.
Physicians might attempt to treat a neurosis with psychotherapy. Or with a change in environment. Or by placing patients in a copper box and heating them up until they developed a fever of 106 degrees. Freeman, who was nothing if not open-minded about the potential effectiveness of radical treatments, even suggested that a bullet to the brain, as long as one survived it, might have a salutary effect. “There can be no doubt that the first shock of the shooting produces a profound effect psychologically,” he wrote. “From the purely physical side, that is, from the trauma, the pain, the shock, the fever, and possibly the surgical intervention, there is some resemblance to shock methods which are apparently quite effective in treating depressions. These same effects might be expected in relation to any severe trauma whether self-inflicted or not.”
But Freeman believed that the lobotomy was a better, more direct, more scientific approach. By cutting a hole in a person's head, by inserting a scalpel and physically breaking up unhealthy “constellations of neuron patterns” while at the same time slicing through many of the pathways between the brain's emotional centers and the frontal lobes, he believed he could obliterate neuroses and prevent future ones.
Still, although he was a passionate believer in the lobotomy's potential, Freeman didn't think that potential had been fully realized yet. His patients, after their lobotomies, might no longer be the misfits they were before, but neither were they entirely normal. Whereas before they were prey to the whims of irrational emotions, postoperatively they often lacked the basic emotional responses we expect to see in human beings. One might be unable to cry at all, even when her mother died. Another might lose all interest in eating and would have to be prompted, bite by bite, through the course of a meal. A third might simply sit in a corner, mute, rocking back and forth for days and weeks and years on end, speaking only when spoken to. “Following operation,” Freeman wrote in 1942, “there would seem to be a certain emotional bleaching of the individual's concept of himself. How much of [the brain] must be sectioned in order to relieve disabling mental symptoms, and how much must be preserved to enable the individual to function adequately in society, has not yet been definitely established.”
Just as Freeman had refined Moniz's leucotomy into his own customized procedure, he believed future refinements surely lay ahead. He thought new approaches to the operation were necessary and that the ideal lobotomy had yet to be devised. Scores of doctors around the world, intoxicated by psychosurgery's promise, by the prospect of discovering surgical solutions to some of mankind's most intractable problems, had taken up Freeman's call to action. They were opening the skulls of misfits everywhere and rummaging around inside, attempting to find that one simple cut that might make them well.
None would perform as many lobotomies as Freeman, who was as prolific as he was passionate.
My grandfather, however, would come in second.
On the fifth floor of the Francis A. Countway Library, on the campus of Harvard Medical School in downtown Boston, there are a number of glass-fronted cabinets and glass-topped display cases. Their contents constitute the principal holdings of the Warren Anatomical Museum, which was founded in 1847 by a Boston physician who hoped it would stimulate curiosity and a spirit of inquiry among young medical students. There's a placard on one of the display cases with a Latin phrase that sums up the collection's animating principle:
MORTUI VIVOS DOCENT.
The dead teach the living.
The cases are filled with the dead, or remnants of them. A gnarled skeleton of a woman with a severely contorted spine stands beside a photograph of her during life, naked, her face turned away from the camera. A row of four fetal human skeletons, ten, fourteen, eighteen, and twenty-two weeks old respectively, are posed in standing positions, as though they had learned to walk. One entire display case is dedicated to an assortment of kidney stones of all shapes and sizes and colors. In another, a plaster cast of the seven-fingered hand of a nineteenth-century Boston machinist grips a plaster cast of a rock.
All but one of these relics are from anonymous individuals. The exception is so famous that even just a glimpse of his skull might bring his name to your lips.
On September 13, 1848, a twenty-five-year-old construction foreman named Phineas Gage was leaning over a hole he had drilled in a shelf of rock, using an iron tamping rod, three feet seven inches long and an inch and a quarter in diameter, to jam a charge of gunpowder deep inside. He was in the wilderness of Vermont, helping to clear a path for the construction of the Rutland and Burlington Railroad. He was by all accounts a diligent and conscientious man, so it was uncharacteristic of him to have forgotten to place a spark-inhibiting layer of sawdust over the gunpowder.
The blast launched the tamping rod out of the hole in the rock like a missile out of a silo. The upper end of the rod, which was tapered to a dull point, penetrated Gage's skull just under his left cheekbone, then continued upward, moving at a diagonal slant through his frontal lobes before exiting through a hole in the upper right portion of the top of his skull. The rod flew a great distance but was eventually found, and a witness noted that it was “covered with blood and greasy to the touch.” Gage was loaded into an oxcart and rushed to the nearest town. He remained conscious during the entire trip, then walked up a long flight of stairs to a hotel room. When the doctor arrived, Gage calmly showed him the holes in his head and said he hoped he was “not much hurt.”
For the next dozen years, Phineas Gage lived with an odd sort of fame. He attempted to go back to work at the railroad, but his co-workers found that the affable man they'd known had become a surly drunk who flew into unpredictable rages or inappropriate hysterics. A doctor's report indicated that he would indulge “at times in the grossest profanity” and that the balance “between his intellectual faculties and his animal propensities seems to have been destroyed.” The railroad fired him, and P. T. Barnum later hired him to join his traveling circus, where he would sit with the now polished and engraved tamping rod across his knees, a gawker's delight. Eventually Gage tired of the freak show and on a sudden impulse decided to move down to South America, where he tried to start up a streetcar company in the port city of Valparaiso, Chile. A number of scientific papers were written about his case, which helped establish the general notion that the frontal lobes play a part in impulse control. One doctor noted that Gage was constantly on the move and that he “always found something that did not suit him in every place he tried.”
Gage died in San Francisco in 1860, nearly twelve years after his accident. Seven years after that, his body was exhumed and his skull was shipped east, where it came to a permanent rest here in this museum, one shelf above the rod that pierced it.
I hadn't come to the library to see Phineas Gage. I hadn't even known he was there. I'd come to make copies of some old letters from the library's rare books and manuscripts archives and stumbled on the museum by chance.
The letters were between my grandfather and two Harvard scientists, the endocrinologist Fuller Albright and the neurologist Stanley Cobb, and they gave a glimpse of my grandfather's early ambitions, as well as some of his motivating impulses. He had written most of the letters while he was a neurology intern at Bellevue Hospital in New York City, exploring his next steps. He wanted, he wrote in his first letter to Albright, “advice and possible help in getting some first-hand experience in clinical endocrinology, especially as related to a neurological-psychiatric practice.” Endocrinology is the study of the endocrine system, which regulates the body's hormones. He enclosed a copy of his only publication at the time, a solipsistic case study from the Journal of the American Medical Association chronicling his own bout with an at-first-mysterious illness. “A physician,” the paper reads, “aged 28, in the summer of 1934, after drinking raw cow's milk and eating goat cheese in Norway, developed periodic exacerbations of malaise, easy fatigability, and generalized muscle and joint pains.” He described how the aching and exhaustion had laid him low for six months, causing him to put his young career on hold and leading to multiple diagnoses of neurasthenia, a catchall psychiatric term that at the time was often applied to people who were simply unable, mentally, to cope with high levels of stress. The paper chronicled his attempts to find an alternate explanation and ended with his discovery that he tested positive for Brucella, an undulant-fever-causing bacterium carried by cows and goats in, among other places, Norway. The paper's single illustration was a black-and-white photograph of my grandfather's pallid forearm displaying a large abscess that a diagnostic skin test had provoked. He clearly relished proving his physicians wrong and finding a simple, easily treated, biological cause for what they had attributed to a vague and hard-to-target mental condition. “This paper,” he wrote in the closing comment, “suggests one more substitute for that diagnostic wastebasket ‘neurasthenia.’”
In the letter to Albright, my grandfather outlined the career path he'd followed so far: “I attended Yale College, B.A. 1928, Univ. of Penn medical school 1932, two years of general medicine and surgery at the Hartford Hospital and the Presbyterian medical center NYC; one year in psychiatry at the Cornell Medical center; and one year in neurology under Dr. Foster Kennedy at Bellevue Hospital.” Kennedy, incidentally, was an aggressive proponent of eugenics, who in 1942, while he was president of the American Neurological Association, raised eyebrows by arguing that people who suffered from mental retardation should be killed, declaring that “the place for euthanasia, I believe, is for the completely hopeless defective; nature's mistake; something we hustle out of sight, which should not have been seen at all.”
Cobb and Albright offered my grandfather a yearlong dual fellowship, splitting time between Cobb's clinical practice and Albright's laboratory. My grandfather accepted with enthusiasm and agreed to move to Boston as soon as his contract with Bellevue expired. He was excited at the prospect of tackling important research, though he was worried that his enthusiasm might outstrip the constraints of time. “The problems of urine-assay of sex hormones in homosexuals or menopausal or pregnancy psychoses seems too vast and a bit impractical to list in an application for a year's fellowship, don't you think?” he wrote. In the meantime, he wrote, he was open to suggestions for any experimental endocrinological work he might conduct before he left Bellevue. He wondered whether there might be “a simple problem I might work at” while at the hospital, with its “wealth of material but poor laboratory facilities.”
I had to read it twice before I understood what he meant by “material.”
The dual fellowship with Cobb and Albright did not end up being particularly successful. After reviewing some of the research reports my grandfather produced in Albright's lab, Albright wrote to him that “I have been over your manuscripts. The problem with them is this: they contain a lot of different problems but not quite enough observations on any one problem to absolutely clinch it. I feel that there is very little in them which would materially help the progress of medical science.” My grandfather didn't object to the harsh critique: “Thank you for your kind and clear letter re my various articles and data. I quite agree that they constitute a hodge-podge of information of no great value,” he wrote.
But his stint with Cobb and Albright was fruitful in another way, since it was during this time that he discovered his passion for neurosurgery. Albright's laboratory was located at Massachusetts General Hospital, which had a peerless neurosurgery department, and in his free time my grandfather took the opportunity to observe some of the best neurosurgeons in the world at work. He was captivated by what he saw and decided to apply for a neurosurgical residency there, which began the following year, in 1938. He proved a quick study. After a whirlwind of additional residencies at the Boston City Hospital and Johns Hopkins, he founded his own department of neurosurgery at Hartford Hospital in 1939.
He kept in touch with Cobb and Albright, even as their careers diverged. He'd write to them on his Hartford Hospital stationery, telling them about the milestones in his life. “Emmie just had a 7 ½ pound boy, with a magnificent Jewish nose,” he wrote to Albright upon the birth of my uncle Peter, his third child, in late 1939. Albright wrote back, offering his “congratulations on the new prophet which has arrived in your house.” He also let his former bosses know about any new material he came across that might be of interest to them. That same year, for example, he informed Albright about a twenty-eight-year-old man who presented a “beautiful picture of true pituitary hypofunction…. It is impossible to guess his age. He has a soft skin, a high voice, no facial nor body hair, long legs, asthenic habitus, low blood pressure, no libido, etc., etc.” The following year, he wrote to inform him of a similar case, “a twenty-one-year-old dwarf who looks and acts as if he were in his early teens…. His genitalia is about one-half adult size, his pubic hair is abundant but silky and in feminine distribution…. Development and skeleton is symmetrical except for all structures being smaller than normal. Height approximately 3 feet 10 inches or so.”
At the end of that letter he asked his old boss a question. “Are you interested in having him on your experimental ward for study?” He noted that the dwarf was “cooperative and passive” and told Albright to write back “if you want us to send him up.”
The broken illuminate the unbroken.
An underdeveloped dwarf with a misfiring pituitary gland might shine a light on the functional purpose of that gland. An impulsive man with rod-obliterated frontal lobes might provide clues to what intact frontal lobes do.
The history of modern brain science has been particularly reliant on broken brains, and almost every significant step forward in our understanding of cerebral localization (that is, of which functions rely on which parts of the brain) has come from the study of individuals who lacked some portion of their gray matter.
This had not always been the case. Until the nineteenth century, most scientists viewed the brain as an undifferentiated mass. They recognized its importance, understood that it was the seat of emotion and intellect and consciousness, that it mediated our senses, that it more or less was us, but the reigning theory of brain function held that the brain was a perfect democracy, where every part was equal in potential and capability to every other part. On this view, injury to a particular part of the brain would simply cause a generalized diminishment of function, rather than any specific deficit. The movement away from this view faced a lot of resistance. One reason was the rise of phrenology, a pseudoscience that became a worldwide fad in the mid-1800s and held that people's personalities and intellects could be minutely described simply by running your fingers over their heads and reading the contours of their skulls, which reflected the dimensions of the brains they encased. Phrenologists believed in cerebral localization, but their arguments and theories about that localization had more in common with astrology than astronomy. The eventual debunking and stigmatization of phrenology made serious scientists wary of accepting cerebral localization until inescapable evidence for it began to emerge in the form of brain-damaged individuals.
Phineas Gage was pivotal.
Then, in 1861, the year after Gage's death, a French surgeon named Paul Broca wrote a paper describing a new patient who was, in many ways, more scientifically significant than Gage. The patient's name was withheld, but he came to be known in the literature as Monsieur Tan, owing to the fact that he could not speak with any coherence and was able to say only the word tan over and over again. He had maintained his other faculties, however, and was able to understand everything he heard, and write legibly and intelligently. Upon the patient's death, Broca performed an autopsy and discovered that Monsieur Tan had a small and sharply circumscribed lesion in the inferior frontal lobe of his brain's left hemisphere. He surmised, correctly as it turns out, that this region of the brain was crucial for speaking. Today all basic human anatomy courses identify the spot corresponding to the damage in Monsieur Tan's brain as Broca's area, the center for speech articulation. (Monsieur Tan's brain, incidentally, went on to find a home in another museum of anatomical curiosities, this one located in Paris.)
Thirteen years later, in 1874, Carl Wernicke, a German neurologist, described a patient with damage to his posterior left temporal lobe, a man who spoke fluently but nonsensically, unable to form a logical sentence or understand the sentences of others. If Broca's area was responsible for speech articulation, then Wernicke's area, as it came to be known, must be responsible for language comprehension.