Agitate for Ice Cream

Nancy Johnson of Philadelphia had a problem. She loved ice cream. But she found that making it was a struggle. She’d often spend up to an hour shaking the metal pot containing her mixture of cream and sugar before the stuff would freeze. And all that time she had to keep the pot immersed in a bath of ice and salt. There had to be a better way. So, in 1843, Nancy dreamed up the ice cream maker. She placed a metal can filled with ingredients in a wooden bucket and packed the space around the can with layers of ice and salt. Then she attached a hand crank to a brace positioned across the top of the bucket and ingeniously connected it to a paddle that would churn the mix as it froze. Thanks to Nancy, anyone could now make ice cream at home.

The concept of making ice cream is simple enough. Take some cream, add sugar and flavor, and freeze the mixture. Pure water freezes at zero degrees Celsius, but dissolving any substance in water lowers its freezing point. So the ice cream mix, with all of its dissolved sugar, requires a temperature lower than zero to solidify. Now picture what happens if we place this mix in a container and then immerse it in a bucket packed with ice. The original temperature of the ice is well below zero (just check the temperature in your freezer), but the surfaces in contact with the air quickly warm up to zero degrees and begin to melt. The water from the melted ice will also be at zero degrees, and this mixture of ice and water will remain at that temperature as long as any ice is present. But at zero degrees, the ice cream mix will not freeze. However, if we sprinkle salt on the ice, we create a different scenario. As before, the surface of the ice warms up and melts. The water dissolves the salt, and soon the pieces of ice are swimming in salt water. Since this brine has a lower freezing point than pure water, the ice keeps melting, and the heat absorbed by that melting drags the temperature of the whole ice-and-brine bath down until the new, lower freezing point is reached. In other words, the ice cream container is now surrounded by salt water at a temperature well below zero. The mix freezes.
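
Just how cold can the brine get? For dilute solutions, the freezing-point depression follows a simple rule, and a back-of-envelope calculation gives a feel for the numbers (the figures below are my own illustration, not from the text; the linear law is only approximate at brine strength):

$$\Delta T_f = i \, K_f \, m$$

Here $i$ is the number of particles each dissolved unit contributes (about 2 for salt, which splits into sodium and chloride ions), $K_f = 1.86\,^{\circ}\mathrm{C\,kg/mol}$ for water, and $m$ is the molality. For a brine of 100 grams of salt in 900 grams of water:

$$m = \frac{100\ \mathrm{g} \,/\, 58.4\ \mathrm{g/mol}}{0.900\ \mathrm{kg}} \approx 1.9\ \mathrm{mol/kg}, \qquad \Delta T_f \approx 2 \times 1.86 \times 1.9 \approx 7\,^{\circ}\mathrm{C}$$

So even a moderate brine stays liquid down to about minus seven degrees, and a saturated one can reach roughly minus twenty-one, plenty cold enough to freeze the mix.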

But just freezing the mix won’t give you ice cream. It will yield a dense, solid mass filled with ice crystals. Hardly mouth-watering stuff. If you want good taste, you must agitate. Shaking or mixing the ingredients during the freezing process is the key to making good ice cream. This accomplishes two things. First, it minimizes the size of the ice crystals that form; second, it blends air into the ice cream. The smaller the ice crystals, the smoother the ice cream. But it is the air whipped into the product, measured as the “overrun,” that gives it its prized foamy consistency. Nancy Johnson’s hand-cranked device minimized crystal formation and incorporated air admirably. Indeed, ice cream manufacture today still uses the same principle.
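
Overrun, by the way, is simply the percentage increase in volume of the finished ice cream over the original mix (the typical ranges quoted here are general industry figures, not numbers from this book):

$$\text{overrun} = \frac{V_{\text{ice cream}} - V_{\text{mix}}}{V_{\text{mix}}} \times 100\%$$

Churn one liter of mix into a liter and a half of ice cream and you have fifty percent overrun. Dense premium brands tend to run lower; airy economy brands approach one hundred percent.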

Human ingenuity does come to the fore when ice cream makers are unavailable. During World War II, American airmen stationed in Britain and pining for ice cream discovered that the gunner’s compartment in a bomber had just the right temperature and vibration level for making the sweet treat. They would put the ingredients into a can before a mission, stow it in the gunner’s compartment, and then look forward to returning to base with a batch of freshly made ice cream.

The method of simultaneous mixing and freezing solves the main problems of ice cream manufacture, but it does introduce a complication. Cream essentially consists of tiny fat globules suspended in water. These globules do not coalesce, because each is surrounded by a protein membrane that attracts water, and the water keeps the globules apart. Stirring breaks the protein membrane, the fat particles come together, and the cream rises to the top. This effect may be desirable when we’re making butter, but not when we’re making ice cream. There is a simple solution: we can add an emulsifier to the mix. Emulsifiers are molecules that take the place of the protein membrane; one end dissolves in fat, the other in water. Lecithin, found in egg yolk, is an excellent example. That’s why even the simplest ice cream recipe requires some egg yolk.

There is nothing like freshly made ice cream. Its smooth, airy consistency affords us a break from reality; it’s a throwback to childhood and a less complex world. Storing ice cream, however, does present a problem — the dreaded heat shock. By taking the container out of the freezer, for example, we may cause the surface of the ice cream to melt. When it refreezes, it forms larger ice crystals, resulting in the crunchy texture that so terrifies ice cream lovers. The commercial solution? Add some wood pulp.

Now, don’t get all worried — we’re not talking about adding sawdust to ice cream. Microcrystalline cellulose is a highly purified wood derivative that sops up the water as ice cream melts and prevents it from refreezing into crystals. Cellulose is indigestible, and it comes out in the wash, so to speak. Guar gum, locust bean gum, or carrageenan, all from plant sources, can also be used for the same purpose. Although lecithin is a good emulsifier, there are others that are more commercially viable. Mono- and diglycerides or polysorbates disperse the fat globules very effectively.

For those of you yearning for homemade ice cream but unwilling to deal with the salt and ice, here’s a solution. Find a chemist friend who can provide you with some liquid nitrogen and supervise your activity. Place the mix in a Styrofoam container, add liquid nitrogen, and stir. The mix freezes almost immediately and develops just the right foamy consistency as the nitrogen evaporates. With a little practice, you can outdo Nancy Johnson.

A final word of warning, though. Ice cream may be addictive. A study conducted at the U.S. National Institute on Drug Abuse suggests that eating it stimulates the same receptors in the brain as certain drugs. If you run into this problem, you may want to sample one of the new flavors that commercial manufacturers are tinkering with. Garlic, spinach, pumpkin, or tuna ice cream is sure to curb your craving.

Man Cannot Live on Corn Alone

Italian cuisine is one of my favorites. Except for polenta. I have never developed a taste for that odd corn mush, which was once a dietary staple of poor Italians. When explorers returned home to Europe from North America with corn, it quickly became popular with landowners because of its abundant yield. These landowners often paid the farm workers they hired to grow the corn with a share of the crop, and corn became an important part of the workers’ diet.

By the late 1700s, however, it was becoming evident that the sharecroppers who subsisted on corn were an unhealthy bunch. One could easily recognize them by their crusty, reddened skin. “Pellagra,” from the Italian for “rough skin,” became a common term for the condition. Most people believed that it was caused by eating spoiled corn. Rough skin was not the only symptom the poor sharecroppers had to worry about. The disease was often characterized by a red tongue, a sore mouth, diarrhea, and dementia — before it killed its unfortunate victim. Pellagra came to be referred to as the disease of the “four Ds”: dermatitis, diarrhea, dementia, and death.

By the early 1900s, pellagra had reached epidemic proportions in the southern U.S. It ravaged the poor, especially cotton pickers. Some sort of communicable infection now seemed a more probable cause than contaminated corn. Rupert Blue, the U.S. Surgeon General, stepped in and assigned his top epidemiologist, Dr. Joseph Goldberger, to solve the mystery of pellagra. Many of the pellagra victims ended up in insane asylums, so these institutions seemed appropriate places to start the investigation. Goldberger soon realized that while many inmates had the symptoms of pellagra, no doctor, nurse, or attendant showed signs of the disease. He noted the same phenomenon in orphanages, where children often developed pellagra but staff members never did. This was inconceivable if pellagra were an infectious disease. So Goldberger began to ponder the lifestyle differences between the asylum and orphanage inhabitants and the attending staff of these institutions. He also began to speculate about differences in diet.

Goldberger observed some pretty dramatic differences. Both inmates and staff got plenty of food, but the variety was not the same. While the attendants dined on milk, butter, eggs, and meat, the pellagra sufferers had to subsist mostly on corn grits, corn mush, and syrup. Goldberger suspected that some sort of dietary deficiency might be triggering pellagra. But he uncovered one troublesome finding. In one orphanage he studied, most of the younger children showed symptoms of pellagra, but the older ones seemed to fare much better. This mystery was solved when Goldberger discovered that the resourceful older children were supplementing their diet with food that they snitched from the kitchen.

It was obvious to Goldberger what the next step in his investigation had to be. He must obtain government funding to add meat and dairy products to the diets of the orphans and the asylum inmates. He did so, and the results were miraculous. Almost all of the pellagra victims regained their health. But if he was to prove the dietary connection conclusively, Goldberger would have to conduct one more critical experiment. He would have to show that pellagra could be induced by a faulty diet. And where was he going to find volunteers for such a study? In prison. Convicts would do anything to get out of jail. So Goldberger approached the director of the Rankin Prison Farm in Mississippi and outlined his idea. The director agreed to cooperate: upon completion of the study, he would release any prisoner who had volunteered to take part.

The volunteers were soon lining up to lend Goldberger a hand, especially after the doctor explained the protocol. To the prisoners, it sounded like a cakewalk. For six months, they could eat to their heart’s content, as long as they confined themselves to a menu of corn biscuits, corn mush, corn bread, collard greens, and coffee. Then they would be freed. After about five months, though, the fun went out of the experiment. The convicts began to suffer from stomachaches, red tongues, and skin lesions. Goldberger had proven his point. Unfortunately, he did not have a chance to cure his patients, since, true to his word, he’d had them released. The convicts quickly scattered, wanting no more of Goldberger’s dietary schemes.

It would seem that the problem of pellagra was solved. But many scientists who had pet theories about contagion remained unconvinced. In a letter to his wife, a frustrated Goldberger described these colleagues as “blind, selfish, jealous, prejudiced asses.” He would show them that pellagra was not a contagious disease! Dr. Goldberger organized a series of “filth parties,” at which he swallowed and injected himself, his wife, and supportive colleagues with preparations made from the blood, sputum, urine, and feces of pellagra patients. Nobody came down with the disease. Goldberger had made his point by eating excrement.

Unfortunately, Dr. Goldberger did not live to see the day when the “pellagra-preventative factor” was finally identified. In 1937, scientists put their finger on niacin, one of the B vitamins. Corn, as it turns out, is a very poor source of niacin; when people — like Goldberger’s orphans, inmates, and convicts — eat little else, they develop pellagra, a deficiency disease. It’s a shame that the convicts dispersed before the doctor could arrange to follow them up. It would have been interesting to see how they eventually fared, whether they suffered strokes or age-related macular degeneration, a leading cause of visual impairment. Why? Because recent studies demonstrate that lutein, a pigment abundant in corn, may be protective against both of these conditions.

Ultrasound measurements of carotid artery wall thickness, which is a predisposing factor for stroke when elevated, reveal an inverse correlation with blood levels of lutein. Furthermore, mice that were genetically engineered to develop atherosclerosis and then fed lutein had arterial lesions only half as large as those seen in mice on normal feed. Epidemiological studies have also shown that people who consume foods rich in lutein have a lower risk of macular degeneration. Apparently, lutein concentrates in the eye and protects it from the harmful effects of blue light. Sounds pretty good. It almost makes polenta sound appealing.

Lessons from Popeye

The most famous landmark in Crystal City, Texas, is a statue of Popeye the sailor man. He’s squeezing his trademark can of spinach, ready to save Olive Oyl from the clutches of Bluto. Crystal City, you should know, is the spinach capital of the world. Its citizens erected the statue in 1937 to honor the character who single-handedly boosted spinach consumption and helped save an industry. But there may be more of interest in Crystal City than Popeye’s statue. I think someone should look into the incidence of heart disease there. In Crystal City, spinach is a way of life — and, I suspect, a longer one. That’s because spinach is an outstanding source of folic acid, a B vitamin that is increasingly being linked with a plethora of health benefits. Let me explain.

Our story starts in the hallowed halls of Harvard University, far from the spinach fields of Crystal City. It was here, in 1969, that Dr. Kilmer McCully became involved in the unusual case of a boy who died at the age of eight from a stroke. The boy had suffered from a rare condition in which a substance called homocysteine builds up in the blood. Homocysteine is a normal metabolite of methionine, a common amino acid found in virtually all dietary proteins. A healthy person’s body quickly processes it, but it accumulates in those suffering from homocystinuria, like McCully’s young patient. An autopsy clearly revealed the cause of death. The boy’s arteries were like those of an old man. Could the damage have been caused by excess homocysteine, McCully wondered? To investigate this further, he needed to examine other children who were afflicted with the same condition.

It didn’t take him long to reach a conclusion: children with high homocysteine levels sustain artery damage typical of that seen in older men. And then, to prove his point, McCully injected homocysteine into rabbits, causing artery damage. This was enough evidence to suggest a revolutionary idea: homocysteine is a risk factor for heart disease. McCully proposed that high levels of the substance cause damage quickly, while levels that are only slightly elevated take longer to wreak havoc. Excited by his findings, he submitted a paper to The American Journal of Pathology. But instead of getting famous, McCully got sacked.

Harvard denied him tenure, supposedly because of his unorthodox theory about heart disease. Members of the medical establishment had declared that cholesterol was the main culprit, and they could see no room for homocysteine in their scenario. Eventually, however, Dr. McCully would be vindicated. And, somewhat fittingly, one of the first studies to show the validity of the homocysteine theory was carried out at the Harvard University School of Public Health. In 1992, researchers reported on an analysis of disease patterns in over fourteen thousand male physicians. Those subjects whose blood levels of homocysteine ranked in the top five percent had a heart attack risk that was three times greater than the risk calculated for subjects with the lowest levels. Numerous other studies have shown a similar relationship. A high homocysteine level (above twelve micromoles per liter) seems to be a clear, independent risk factor for heart disease.

Knowing about a risk factor is not much good unless we can do something about it. And in the case of homocysteine, we can. Let’s take a moment to explore the relevant biochemistry. Homocysteine forms through the action of certain enzymes on methionine. Once it has formed, one of two things will happen. It will either be reconverted to methionine or metabolized to glutathione, a powerful antioxidant. Both of these pathways require the presence of B vitamins. The body needs folic acid and vitamin B12 to change homocysteine back to methionine, and it requires vitamin B6 for the glutathione route. You are probably starting to get the picture. Inadequate levels of these B vitamins lead to increased levels of circulating homocysteine, which in turn causes arterial damage and heart disease.
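
Schematically, the two disposal routes look like this (a simplified sketch of the chemistry just described; the real pathways involve several intermediate steps and enzymes):

$$\text{methionine} \longrightarrow \text{homocysteine}
\begin{cases}
\xrightarrow{\ \text{folic acid, B}_{12}\ } \text{methionine} \\
\xrightarrow{\ \text{B}_{6}\ } \text{glutathione}
\end{cases}$$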

The B vitamin doses we need to keep homocysteine in check are not extreme. About four hundred micrograms of folic acid, three micrograms of B12, and three milligrams of B6 should do the job. While we can certainly get these through diet, the fact is that many of us don’t. Indeed, the average intake of folic acid in North America is about two hundred micrograms — far from adequate. This is where spinach comes in. It is an outstanding source of folic acid, particularly if we eat it raw. So, go for that spinach salad. And may I suggest dressing it with orange juice? Just one cup contains two hundred micrograms of folic acid. You can also throw in some green beans or cooked brown beans, also great sources of folate. You’ll be helping your heart, and other parts of your anatomy as well. A recent study of twenty-five thousand women showed that those who consumed the most folic acid were one-third less likely to develop precancerous polyps in their colon. And if that isn’t motivation enough to seek out foods that are rich in folic acid, then consider that it may even lower the risk of Alzheimer’s disease. Yup, you heard right.
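
To see how easily the gap closes, consider a quick tally (the orange juice and average-intake figures come straight from the numbers above; the spinach value, roughly fifty-eight micrograms per cup of raw leaves, is a typical food-composition figure I am adding for illustration):

$$\underbrace{200}_{\text{average diet}} + \underbrace{200}_{\text{1 cup orange juice}} + \underbrace{115}_{\text{2 cups raw spinach}} \approx 515\ \mu\mathrm{g} \;>\; 400\ \mu\mathrm{g}\ \text{target}$$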

Researchers at the University of Kentucky explored the Alzheimer’s connection because they were aware of the extensive evidence showing that women who took folic acid supplements during pregnancy had babies with fewer neurological birth defects, such as spina bifida. Could folic acid affect the nervous system later in life, the researchers wondered? A group of nuns in Minnesota who had willed their bodies to scientific research provided the answer. Those who had ingested adequate amounts of folic acid throughout their lives were less likely to succumb to Alzheimer’s disease. This finding was corroborated by researchers at Tufts University, who fed spinach to rats and found that it not only prevented but also reversed memory loss. It seems that homocysteine can damage nerve cells the same way it damages blood vessels.

What all of this comes down to is that Popeye was right. That’s why I’m dismayed by his fading popularity among children. We could use his nutritional support. Especially when you consider that some researchers suggest we could prevent fifty thousand heart attacks a year in North America simply by increasing our folic acid intake. That spinach salad with orange dressing is looking mighty good.
