Happy Accidents: Serendipity in Major Medical Breakthroughs in the Twentieth Century
By Morton A. Meyers
By the mid-1980s the stage was set for a dramatic coalescence of events. In 1984, after completing a ten-year study conducted in twelve centers in North America involving more than 3,800 men and costing $150 million, the NIH established that lowering blood cholesterol clearly lowers the risk of heart attack. The nation's premier health institute officially recommended that Americans lower the amounts of fat and cholesterol in their diet.[9]
It was now clear that atherosclerosis in the general population was caused by a dangerously high blood level of low-density lipoproteins, resulting from failure to produce enough LDL receptors, and that cholesterol not absorbed into the cells as it courses through the circulatory system sticks instead to the walls of blood vessels, disrupting the flow of blood to the heart and brain. It was now understood why some people can eat more cholesterol-rich foods and not have high blood cholesterol concentrations, while others on low-fat diets continue to have high levels: the difference is in the number and efficiency of the LDL receptors. Some people are born with a greater or lesser capacity to develop adequate receptors. Individuals with "low capacity" LDL receptors have higher cholesterol levels in the bloodstream and an increased risk of coronary heart disease because they remove cholesterol more sluggishly from their circulation. Once this mystery was solved, the therapeutic approach was clear. It lay in developing drugs that block cholesterol synthesis in the liver in order to stimulate production of the receptors and reduce blood cholesterol.
Brown and Goldstein promptly identified the underlying genetic mutation in the Watanabe rabbits, which led them to sequence the human LDL-receptor gene. They found that a variety of mutations occur naturally and can be located anywhere on the gene, resulting in failure to synthesize the LDL receptor protein or in production of deficient proteins. "The multiple mutations may be the reason for the high frequency of defective genes in the population," Brown suggests. Approximately one person in every five hundred carries a single deficient copy of the LDL receptor gene and so incurs a higher risk of heart attack. One person in a million inherits defective genes from both parents; these individuals, who have familial hypercholesterolemia, develop extremely high blood cholesterol concentrations and often die of heart attacks before their twentieth birthday.
Two Defective Genes
A Texas girl named Stormie Jones was literally "one in a million," as one in every million people in the population inherits two defective LDL receptor genes. She had stratospheric cholesterol levels and did not make functional LDL receptors, so drugs were ineffective. She suffered her first heart attack at age six, followed within the next year by another heart attack and two coronary bypass operations. In 1984, at age eight, she underwent the first heart-liver transplant ever performed and, as Brown and Goldstein's theories had predicted, the presence of normal LDL receptors on the transplanted liver produced an 80 percent reduction in her blood cholesterol levels. The double transplant enabled her to live for six more years.
In 1985, thirteen years after they began their work, Goldstein, age forty-five, and Brown, age forty-four, shared the Nobel Prize in Medicine. In an interview with a local newspaper a few days after the Nobel Prize announcement, Brown openly disclosed that their brilliant basic research over thirteen years had benefited from serendipity:
In the beginning, we started with a hypothesis that was incorrect…. Initially we thought that an enzyme had gone wild and was producing the excessive cholesterol. As it turned out, we found the enzyme was not the problem, it was the fact that the [body] cells had trouble getting cholesterol from a lipoprotein [and thus removing it from the blood]…. We did not dream there was any such thing as a receptor. It was not within the worldview of any scientist that such a thing existed.[10]
In 1987 Merck introduced a cholesterol-lowering drug, lovastatin, which works by slowing the liver's production of cholesterol and increasing its ability to remove LDL cholesterol from the blood.[11]
Of the six statins on the American market in 2006, Pfizer's Lipitor was the most popular, taken daily by as many as 16 million Americans. It was the world's best-selling prescription medicine, with $11 billion in sales in 2004. As a group, these drugs were generating revenues of more than $25 billion a year for their manufacturers, which include, along with Pfizer and other American drug companies, Germany's Bayer and the British-Swedish company AstraZeneca.
In July 2004 aggressive new guidelines for LDL levels were proposed. These range from a rock-bottom level of 70 mg/dL or lower for those with the very highest risk factors (heart disease, high blood pressure, smoking) to 160 mg/dL for healthy people with little risk.[12]
Large clinical trials found that lowering LDL to such levels sharply decreases the risk of heart attacks. Under the old guidelines, about 36 million people in the United States should have been taking statins, but fewer than half that number had been.
Brown and Goldstein are still at the University of Texas Health Center, where they are often referred to as “Brownstein” or “the Gemini twins.” They plan research jointly, publish together, and share the podium for lectures. Out of the lab, they are partners at bridge. The two men brought the concept of cholesterol receptors and their specific genetic coding into the light of day. Their turn down an unexpected path is emblematic of the process of basic research: diligent pursuit of fundamental processes of nature, often without a hint of what one may unexpectedly find.
Thinning the Blood
Blood, to maintain its crucial roles in the body, must remain fluid within the vascular system and yet clot quickly at sites of vascular injury. When that balance fails and clots form where they should not, the consequences can be dire: a heart attack due to clot formation within a coronary artery; atrial fibrillation, an abnormal cardiac rhythm that may be complicated by a clot within that chamber of the heart; and deep venous thrombosis, in which clots form in the leg veins (often from prolonged inactivity, such as a long flight) and a piece may become dislodged and travel to the lungs, a frequently life-threatening event.
Drugs used to “thin the blood” to prevent clotting or to dissolve a formed clot have different mechanisms of action. The drugs were chanced upon in unexpected and sometimes very dramatic circumstances.
THIS IS EXACTLY WHAT I WASN'T LOOKING FOR!
In 1916 Jay McLean, a medical student at Johns Hopkins, was assigned a research project to find a natural bodily substance that promotes blood clotting. Under the direction of William Howell, the world's leading expert on how blood clots, McLean set out to characterize factors in the human body that promote clotting. Instead, he discovered the exact opposite: a powerful anticoagulant (blood thinner). When his mentor was skeptical, McLean placed a beaker of blood before him and added the substance.
The blood never clotted, but Howell remained unconvinced. McLean rushed the report of his discovery into print to establish his priority and to indicate that Howell had played a subordinate role.[1] This would be the only paper he would write on the substance.
Howell shortly labeled the substance heparin (from the Greek hepar, the liver) to indicate its abundant occurrence in the liver, although heparin can be extracted from many other body organs.[2]
After its physiological and chemical behavior was detailed and the factor purified, heparin became clinically available in 1936, and in the 1940s its use became standard in treating a variety of venous and arterial clotting disorders. Its major limitation was that it could not be taken orally.
The real nature of the serendipitous discovery was published by McLean forty-three years later. He wrote: "I had in mind, of course, no thought of an anticoagulant, but the experimental fact was before me."[3]
STREPTOKINASE: CLOT-BUSTING
In the early 1930s William Tillett, a bacteriologist at the New York University School of Medicine, was studying streptococci and the body's defensive ability to clump these bacteria together. At the time, the process by which blood clots are formed and dissolved was not his focus of interest. In one phase of his experiments, a particular specimen of blood initially clotted, but just before discarding his test tubes, Tillett happened to look at them again. To his surprise, the contents of the tubes had liquefied.
Tillett pursued this unexpected observation, though it lay outside his area of interest, and in 1933 reported that this species of streptococci produces a protein that inactivates a critical factor in the formation of a clot. Ultimately named "streptokinase," this protein has become useful as a clot-buster when given to a person in the initial stages of a heart attack.[4]
NOT-SO-SWEET CLOVER
One Saturday afternoon in February 1933, in the middle of a howling blizzard, a Wisconsin farmer appeared in the office of chemist Karl Paul Link, where he presented him with a dead cow and a milk can containing blood that would not clot. The man had driven almost two hundred miles from his farm to seek help from the Agricultural Experiment Station at the University of Wisconsin in Madison. But it was a weekend, and the veterinarian's office was closed, so chance led him to the first building he found where the door was not locked: Link's biochemistry building. Many of the farmer's cows had recently hemorrhaged to death. He had been feeding his herd with the only hay he had: spoiled sweet clover.
This hemorrhagic disease had been recognized first in the 1920s after farmers in the northern plains states began planting sweet clover imported from Europe, which survives in the harsh climate and poor soil, as feed for cattle. Fatal spontaneous bleeding in the cows, along with prolonged clotting time, was traced to spoiled sweet clover.
Sweet clover is so named because the plant is sweet-smelling when freshly cut, but some strains are bitter-tasting and avoided by cattle. Link was hired to investigate this problem and found that one particular chemical in the clover, coumarin, was responsible for both characteristics. Coumarin was even used commercially to scent inferior tobacco and some perfumes. There was no suspicion that the sweet-smelling, bitter-tasting chemical was related to spoiled sweet clover disease.
Link's serendipitous encounter with the farmer forever changed the direction of his research, and it took seven years before he solved the mystery of why only spoiled hay caused the bleeding disorder in cattle. When clover spoils, its natural coumarin is chemically transformed into dicumarol, which would be shown to interfere with the role of vitamin K in the clotting process.[5]
After engaging in some creative thinking along the lines of "I've found the solution; what's the problem?" Link and his colleagues reasoned that if too much dicumarol caused a hemorrhage, a minuscule amount might prove to be a useful anticoagulant. Physicians avidly welcomed the new drug.
Link synthesized scores of coumarin variants. Noticing that one of them appeared to induce particularly severe bleeding in rodents, he developed it as a rat poison and gave it the name warfarin. It is still sold today for this purpose under the same generic name.[6]
In early 1951 an army inductee tried to kill himself by eating the rat poison, but his surprising full recovery led to the testing of warfarin on human volunteers, and by 1954 it was proven more potent and longer-lasting than dicumarol. Warfarin was promptly marketed, sometimes under the trade name Coumadin, and it is now the standard treatment for venous or arterial blood clots requiring long-term treatment. It requires careful monitoring. An estimated 2 million Americans take it each day to help prevent blood clots that could result from heart attack, abnormal heart rhythm, stroke, or major surgery.
AN ASPIRIN A DAY KEEPS THE DOCTOR AWAY
Aspirin (acetylsalicylic acid), derived from the bark of the willow tree, has for over a hundred years been marketed for its painkilling and fever-quenching properties. It is the single most commonly used medicine. In the United States alone, more than 30 billion pills are purchased annually.
In the late 1940s a California physician stumbled upon another of aspirin's powers. Tonsillectomy in children was all the rage in the 1940s, when Lawrence Craven was working as an ear, nose, and throat (ENT) physician in Glendale, California. After surgery, Craven would tell his patients to chew Aspergum (which contains aspirin) to relieve their sore throats. He was surprised to find that those who used more gum bled excessively. Craven speculated that Aspergum reduced the tendency of the blood to clot.
At a time in history when health-care officials were recognizing heart disease as a national epidemic, he took it upon himself to treat all his adult male patients with aspirin to prevent blockage of the coronary arteries and heart attacks. Within a few years, Craven was convinced that heart attack and stroke were occurring far less frequently among his patients than among the general population. His reports on his astonishing series of cases were largely ignored, as they were published in rather obscure regional medical journals and his trials had had no controls.[7]