In part, I have to thank (or blame) my then-boyfriend and now-husband, Andrew, for this change in focus. He was one of those kids, at a very early age, who announced to strangers in the supermarket that he was going to become a neurosurgeon. He convinced me that pre-med classes at Cornell would be no problem. After all, he was certainly breezing right through them himself. But my Mr. Smarty Pants was off the mark a bit. Organic chemistry was a painful exercise in the short-term memorization of shapes, letters, and numbers and the complex system of all those things merging with one another. The class was held in a massive two-tiered lecture hall with hard-core savants seated all around me. Physics was only a little less agonizing. Andrew did what we still refer to as the wild-limbed “physics dance” when I told him that I got an A- in the class. Unfortunately, I didn’t earn an “organic chemistry dance.”
I returned to my hometown of Cleveland for medical school and lived with my family for the first year. My youngest sister, Elizabeth (fourteen years younger and in elementary school), was at a very curious age and soaking up knowledge like a sponge. After being away for four years of college, I was eager to get back to telling her bedtime stories. Very quickly, though, I exhausted my sketchy repertoire of the classics and decided to transition to a less traditional theme, but one that was more current in my mind: parasitology. I assumed that cardiac physiology and neuroanatomy were just beyond her, but my new expertise in human parasites from around the world proved to be just her speed.
She accused me of telling a tall tale when she heard the one about Dracunculus medinensis, the parasite that causes a disease called dracunculiasis in Africa. I explained that the female worm is long and thin, up to three feet, and usually lives just underneath the skin of the human host’s legs or feet. When she is ready to release her offspring, a skin blister develops, the blister turns into an ulcer, and the end of the worm is exposed to the outside world. In order to remove the long, thin parasite, locals afflicted with this worm wind the exposed end around a stick, and turn the stick, slowly, so as not to break the worm, until the whole thing is released. Despite her suspicions, she requested a retelling of this one at a subsequent tuck-in. Her unflagging curiosity was an inspiration.
Perhaps my proudest moment as an older sister, though, featured an even lowlier parasite, the pinworm. A couple of years had gone by, and I assumed that the parasite parables had faded from her memory. My family was invited to a friend’s home for dinner one summer night, and Elizabeth left the dining room table, mid-meal, to use the bathroom. Upon returning, she stood behind my chair and whispered into my ear: “I think I have pinworm.” I left the table and pulled her aside in the hallway. “How do you know?”
She explained that she had been itchy lately and was certain that she saw one of the tiny worms in the toilet, just then. Plus, she was attending summer camp along with plenty of other grimy kids with questionable hygiene and eating habits—the perfect setup, as she recalled. She even remembered the simple confirmatory “Scotch tape test” that picks up the tiny eggs, commonly deposited around a certain orifice, when pressed against the skin. I told her that we would leave that to her pediatrician.
My mom had never heard of pinworm. However, based on the steadfast conviction of her youngest and the support of her eldest, she was willing to suspend her suspicions, be a good sport, and take a trip to the pediatrician’s office anyway. The diagnosis was confirmed. Elizabeth then rid herself of this annoying, but fairly harmless, infestation upon the ceremonial swallowing of a single antiparasite pill. I beamed with pride, as any big sister would under similar circumstances.
Although I briefly—for about two days—considered going into parasitology as a career, my thoughts changed. The creatures were intriguing but I couldn’t envision building an entire career around them. The first couple years of medical school are like a buffet, where you get a little taste of everything as you go around the table. But eventually, you are presented with a menu and can pick only one dish. That can be difficult for students who are generally and widely curious. In choosing one area, you are excluding the others. As a way of hedging your bets, you can choose a general field, like internal medicine. I didn’t consider that option. I suspected that the specialists got the most flavorful dishes, leaving the internist with the bland staples, like rice. I valued the notion of serving the public, but I couldn’t get excited about treating high blood pressure day in and day out.
In college I deliberated over entering medical school, but once in medical school, I deliberated very little over neurosurgery. The brain was definitely more interesting than the kidney (or the heart, or the bones, or the skin…). The kidney balances electrolytes and produces urine. The brain harbors personality and produces thought. If push comes to shove you can use someone else’s kidney; there’s nothing unique about your own. Your brain, on the other hand, is who you are. So, when presented with the menu of options, I knew that something brain-related was required to feed my curiosity over the span of a career.
The stories of neurologist Oliver Sacks gently tapped me over the edge in the direction that I was already going. His famous book, The Man Who Mistook His Wife for a Hat, featured a quirky guy (Dr. Sacks) noticing quirky things (unusual neurological symptoms in his patients). He was passionate, inquisitive, and thoughtful. Stories like these, I figured, could only be written about the brain. To this day, I have not come across equally intriguing tales concerning the pancreas.
I asked other neurologists what they thought of Oliver Sacks. Some felt he was a good storyteller but a run-of-the-mill neurologist. I found these comments cruel and unfair, first because these neurologists didn’t really know him and, second, because they were probably just jealous. They had certainly never written anything about the brain compelling enough for medical students (and the lay public) to devour during their free time. To my disappointment, a few neurologist-cynics even seemed to have lost their sense of wonder about the brain, which can happen, understandably, to a volume-dependent service provider whose days are packed with fifteen-minute visits, especially once they’ve been sued over one of those fifteen-minute visits.
Despite the influence of Sacks, though, I decided against neurology. There was no real manual component to the job, and the mainstay of intervention consisted of prescribing medication, which I worried might not sustain my interest. So I looked into neurosurgery instead. I remain a big fan of Oliver Sacks, though. A highlight of my neurosurgery residency was driving him back to the airport after he gave a lecture in town (our chairman knew him from decades earlier and was able to arrange for this chauffeur role, to my excitement). Andrew and I had a chance to sit down with him, discuss a few philosophical quandaries, and buy him a drink. (In his absentmindedness about mundane matters, he forgot to bring his wallet on the trip.) True to the literary persona that I had come to know, he was quick to notice and comment on anything peculiar, like the word “standee” on the sign posted over the walking sidewalk in the airport (“Walkers to the left. Standees to the right.”).
Returning to my days as a medical student with a menu in hand, I was faced with a final decision: What specialty? I considered my influences, my curiosity about the brain, and my newly emerging inclination toward surgery. Then, with little thought given to the lifestyle consequences of the most critical career move in my life, I decided to become a neurosurgeon.
FOUR
Acceptance
If you hope to become a neurosurgeon, you have to prove your passion for science to the gatekeepers in the academes of neurosurgery. Even if the manual work is what really excites you, it would be unwise to say, “I just want to open heads” during your interviews. Surgeons are fond of explaining that they could teach a monkey to operate. What they mean is that operating is only part of what a surgeon does and thinks about, and it’s not the hardest part. I remember my father recounting this classic monkey line when I was too young to appreciate the role of sarcasm in conversation. The literal image was unsettling.
At the most competitive residency programs, a mind for science is particularly high on the list of essentials. These programs aim to continue their tradition of prolific research publication, and residents play a key role in cranking out the papers. Program chairmen also want their graduates to remain in the academic system after residency. The thinking here is that the most serious scientist-surgeons are less likely to be lured into lucrative private practices. Regardless, smart graduates often want or need to take advantage of their most lucrative free-market options anyway, especially the financially strapped ones with growing offspring. Medicine is a business, after all, as well as a profession.
So, recognizing the desirability of the scientific mind, savvy medical students do all they can to bolster the research sections of their résumés—the academic equivalent of a peacock’s tail—often to the detriment of their already tenuous social lives. Fear, in addition to savvy, is a factor here, too, as there are always more applicants than spots, similar to musical chairs. Neurosurgeons were those kids who always managed to grab hold of a chair.
In anticipation of becoming a neurosurgeon, I, too, spent a few afternoons a week in a laboratory as a medical student so that my résumé would excite the decision makers on the interview circuit (or, at least, would avoid inciting laughter). It was worth it. If you were to examine the research section of my résumé, you would notice the following title of one of my projects: “Use of amiloride to minimize reperfusion injury in gerbil model of ischemia.” You must admit that the words “amiloride” and “ischemia” are intimidating. Perhaps “reperfusion” gave you pause, too, but you may have seen that word in other contexts. Although I did do my best to come up with an impressive line on my CV, I was trumped by my competitive colleagues who had the foresight to choose a research project in genetics, thus qualifying them to include formidable nonwords such as “p53” in the titles of their projects. (This has nothing to do with page numbers.) There’s no way to prove it, but they may have had a slight edge over me.
Now that more than ten years have passed since I spent sunny afternoons in a dark lab during medical school, I have to disclose which word in my research project title had the most enduring impact on me: it wasn’t “amiloride” or “ischemia,” but, rather, “gerbil.” My research project involved trying my best to cause strokes in these subjects, animals that other people call pets, not subjects. The long-term goal was to prove that the drug could minimize stroke damage in humans whose brains were at risk during certain operations. Unfortunately, the drug had to be given just before the stroke, so wider and more practical outside-world applications were hard to envision.
By the way, I do understand the emotions of those who would oppose such use of rodents in medical research, but I am swayed far more by the emotions of a family gathered around the bedside of a stroke victim, wondering why nothing more could be done.
In order to cause these strokes, I was taught to carefully and completely cinch off the two most significant arteries that feed the brain, the carotid arteries. Even then, despite such extreme measures, the strokes ended up being pretty small. If only humans were so resilient. It always amazed me how hardy a gerbil’s brain was in comparison to that of Homo sapiens. And for what? Evolution really shortchanged us.
Creating gerbil strokes requires a mini-operation performed with mini-instruments on a mini-subject. All told, this was the most satisfying aspect of the project for me. It proved that I had at least the basic manual skills—the monkey part—to be a surgeon. Previous tasks in my life that required coordination, such as mastering the use of chopsticks or playing the piano, were not as clearly translatable. My confidence was bolstered. Never mind that the friendly young lab tech who taught me the technique could do the job just as well, and maybe even faster.
The most disturbing part of the project was not creating the strokes, believe it or not. I could justify that in the name of science and future human welfare. What did get to me, though, was the method of ending the gerbils’ lives. The scientific protocol required the freezing of their brains, literally and figuratively, at a precise moment just a few minutes into the stroke process. Here’s how it’s done: you take a limp anesthetized gerbil by its tail and lower the entire body, head first, into a large silver cylinder filled with liquid nitrogen. It comes out frozen stiff.
Regarding the wonders of liquid nitrogen, I heard from the more seasoned lab researchers (the so-called lab rats) that if you submerge a rose into liquid nitrogen and then drop it on the floor, it shatters into numerous pieces. (Is that what these guys did after hours? What else were they submerging?) With that image in mind, I was especially firm in holding on to the gerbils’ tails upon lifting them out of the cylinder.
After the freezing, I rolled each subject into tin foil like a burrito, labeled them by subject number (they didn’t get names), and placed them in a freezer alongside other people’s subjects. During my next afternoon in the lab, I retrieved my gerbils for the following step: isolating the brains. This required chipping away at their thin skulls with a scalpel, taking care not to violate the underlying cortical brain surface, an inelegant and tedious task similar to whittling a stick, but more precise. I had to hold them with an oversized oven mitt to protect my warm hands from their frozen bodies and vice versa.
It could be that my distaste for the project was rooted in family history. One of my younger sisters always had gerbils as pets. Although I didn’t feel much affection for them myself, I respected the fact that she did. I felt bad when my father had to put one of them to sleep and my sister cried. (He brought home some sort of gas from the hospital. The gas plus a brown paper lunch bag was all that was required to perform this simple act of mercy, which reached a level of urgency when the sickly animal started to gnaw at its own hands.) So much emotional trauma surrounding one single small-brained rodent, and there I was, years later, dunking one after another into liquid nitrogen so that I could become a neurosurgeon.
As a physician, I understand and respect the critical role of selective animal experimentation in advancing science and medicine. Rodent martyrs may well contribute to my own future health and longevity. I learned from my gerbil experiment, though, that I would prefer to leave such critical work to the “lab rats.” All of my projects from that point on were based on human data. Most relied heavily on the civilized review of room-temperature medical records.
Even the seemingly guilt-free projects that I chose, though, the ones that required no sacrifice of gerbils or other living beings, could be a little uncomfortable at times. In an effort to help me get into a good program, a mentor of mine suggested I do a study that would be quick and straightforward: look at a small group of patients with an uncommon type of tumor and see how they did after neurosurgical intervention. The sample size would be small. All I had to do was get the charts, organize the data about the patients, their tumors, and their treatment, and look at outcomes. Outcome meant length of survival, which seemed simple enough. I could have a paper done in no time.
The catch here was that many patients were from out of town and had much of their follow-up elsewhere. They had traveled to the big university for specialized treatment but it was too far for them to return again and again. Some of the charts clearly stated the date of death (a short note usually written in the neat secretary’s handwriting rather than the more cryptic surgeon’s hand, based on information from a phone call or newspaper obituary), but others left me hanging.
One chart documented Mrs. So-and-so’s six-month follow-up visit. She was stable. I could almost picture her, in the surgeon’s office, grateful for her stability. That was two years earlier, though. The chart had remained filed, stagnant, until that day when I happened to reach for it. Mrs. So-and-so was about to play a small role in this incremental step toward achieving my career goals, and I was worried. I didn’t even know Mrs. So-and-so, but I assumed the worst and felt sorry for her husband whose name I had spotted on the face sheet in her chart.
I went back to my mentor and asked what to do with all the patients who fell off the radar screen. “Just call them up!” was the reply, an obvious answer that made my question seem ridiculous. Easy for him to say. He wasn’t the one who was going to have to ask for Mrs. So-and-so, only to endure a long pause from the other end of the line, from her widower. What was worse, not only did I have to find out whether or not a patient was still alive, I had to get the date and cause of death.
I did what I had to do. Luckily, most of the families were not only helpful but gracious. Still, I felt just a little awkward. In a small way, I was relying on these families to help me get ahead. I kept reminding myself that they were contributing valuable information in the name of science, not just in the name of my CV.
I hoped that, in the end, all of my science projects would pay off, landing me a spot at a top-notch residency program. The final selection process, though, was a black box for me as a fourth-year medical student. Not only was it a black box, it was also frighteningly out of my control, especially once all the variables that I could influence or tweak—test scores, research, papers in press, glowing letters from mentors, application essay—had already been influenced or tweaked. Interviewing was the final step. The decision process, after that, remained a mystery.
At least everyone else was in the same boat. Midway through the fourth and final year of medical school, a pack of medical students hoping to become neurosurgery residents spend a wad of money (that they don’t really have) flying around the country to interview. The next couple months after that are spent worrying. You tend to see the same students over and over again, and as you watch a certain guy turn on the charm with all the decision makers, you tend to wonder: Is this the guy who will be taking my spot here? Most programs accept only one or two new recruits per year, so the paranoia is actually rational, not a precursor to schizophrenia. You know that everyone else’s CV must be as good as or better than yours (they got the interview, too), and so charm can be a critical distinction.
As a woman trying to enter a largely male-dominated specialty, what was I supposed to wear to my interviews? I didn’t have many mentors to turn to in this deceptively trivial dilemma. I had interacted with only a couple of female neurosurgeons up to that point. One was exceptionally smart but a bit frumpy, and the other, although also smart, wore higher heels and tighter skirts than I could have managed. And one female neurosurgery resident I knew had a brain tattoo on her hip, before tattoos were commonplace, and I wondered if she used it as proof of her dedication during interviews. (She eventually left neurosurgery to become a radiologist.) Regardless, I wasn’t willing to go that far.
I took the conservative approach with a dark pantsuit, always a safe choice. I still struggle at times with matters of style. I would like to branch out, but I feel a bit constrained. When a sales clerk suggests a great looking but low-cut blouse, I am tempted to explain that patients do not appreciate a hint of cleavage in their surgeon. It does nothing to inspire additional confidence.
Interviewing lore is passed around from program to program and from year to year. There was the chairman who claimed to offer a spot to anyone who could beat him in chess. There was the student who split the back seam of his pants in the bathroom and, in his contortions within the stall to try to remedy the problem, managed to dunk the end of his tie into the toilet. There was the guy who completely mangled his interviewer’s name, and so on.
On the interview circuit, we warned each other about which neurosurgeons tended to ask tough questions and which might actually put you on the spot with an anatomy quiz. I was most impressed by the audacity of one student who was asked to draw a detailed cross-section of the spinal cord (a complex structure) and label all the parts. He drew a simple circle within a circle, pointed to the inner circle with his pen, and stated: “Filum terminale.” Although the filum terminale is technically a part of the spinal cord, it’s really just the simple, spindly, nonfunctional tail end of it. I don’t recall if his cockiness worked to his advantage or disadvantage, but I could see it having gone either way.
Despite the lore, most of the questions posed to me were straightforward: Why do you want to be a neurosurgeon? Where do you see yourself in ten years? How do you know you can handle the stress? And (if they didn’t bother to read my application), what kind of research did you do? One question, though, did stick out as more amusing. It seemed specifically tailored for me: How do you know you can handle all the big drills? I smiled and assured the older, male interviewer that I could handle the big drills. Short of performing a demonstration in his office, that was about all I could do to address the question. And, over the next several years, I had the chance to use big drills on several of his own patients, settling any concerns he may have had.
In general, I don’t bother getting worked up over minor things that could be construed as sexist. Most people (myself included) don’t enjoy working with colleagues who are alarmist, easily outraged, or overly sensitive. I prefer to prove my abilities, naturally, over time. Luckily, in this modern era, I’ve never found the need to storm out of a room, call anyone a chauvinist, or report any transgressions to the authorities. The way for women in surgery has already been paved to a great degree, and I’m grateful for all the women who must have had it harder—much harder—than I did.