
The scale of the devastation was unlike anything seen in Europe since the Black Death. The potato crop failed
again in 1846, and the famine continued as farmers gave up planting potatoes in subsequent years. The people faced not
just starvation, but disease. William Forster, a Quaker who visited Ireland in January 1847, recalled the scene in one village:

The distress was far beyond my powers of description. I was quickly surrounded by a mob of men and women, more like famished
dogs than fellow creatures, whose figures, looks and cries, all showed that they were suffering the ravening agony of hunger
. . . in one [cabin] there were two emaciated men, lying at full length, on the damp floor . . . too weak to move, actually
worn down to skin and bone. In another a young man was dying of dysentery; his mother had pawned everything . . . to keep him
alive; and I never shall forget the resigned, uncomplaining tone in which he told me that all the medicine he wanted was food.

In Ireland around one million people starved to death as a result of the famine or were carried off by the diseases that spread
in its wake. Another million emigrated to escape the famine, many of them to the United States. The potato blight also spread
across Europe, and for two years there were no potatoes to be had anywhere. But Ireland’s unrivaled dependence on the potato
meant that it suffered the most.

As the magnitude of the disaster became apparent in late 1845, the British prime minister, Sir Robert Peel, found himself
in a difficult position. The obvious response to the famine was to import grain from abroad to relieve the suffering in Ireland.
The problem was that such imports were at the time subject by law to a heavy import duty to ensure that homegrown grain would
always cost less, thus protecting domestic producers from cheap imports. The Corn Laws, as they were known, were at the heart
of a long-running debate that had pitted the aristocratic landowners, who wanted the laws to stay in place, against an alliance
of opponents led by industrialists, who demanded their abolition.

The landowners argued that it was better to rely on homegrown wheat than unreliable foreign imports, and warned that farmers
would lose their jobs; they left unspoken their real concern, which was that competition from cheap imports would force them
to reduce the rents they charged the farmers who worked their land. The industrialists said it was unfair to keep the price
of wheat (and hence bread) artificially high, given that most people now bought food rather than growing their own; but they
also knew that abolition would reduce demands for higher wages, since food prices would fall. Industrialists also hoped that
cheaper food would leave people with more money to spend on manufactured goods. And they favored abolition of the Corn Laws
because it would advance the cause of “free trade” in general, ensuring easy access to imported raw materials on one hand,
and export markets for manufactured goods on the other. The debate over the Corn Laws was, in short, a microcosm of the much
larger fights between agriculture and industry, protectionism and free trade. Was Britain a nation of farmers or industrialists?
Since the landowners controlled Parliament, the argument had raged throughout the 1820s and 1830s to little effect.

The outcome was determined by the potato, as the famine in Ireland brought matters to a head. Peel, who had vigorously opposed
the abolition of the Corn Laws in a Parliamentary debate in June 1845, realized that suspending the tariff on imports to Ireland
in order to relieve the famine, but keeping it in place elsewhere, would cause massive unrest in England, where people would
still have to pay artificially high prices. He became convinced that there was no alternative but to abolish the Corn Laws
altogether, a reversal of his government’s policy. At first he was unable to persuade his political colleagues, but some of
them changed their minds as the news from Ireland worsened and it became apparent that the survival of the government itself
was at stake. Finally, with a vote in May 1846, the Corn Laws were repealed. The support of the Duke of Wellington, an aristocratic
war hero who had long been a strong supporter of the Corn Laws, was crucial. He persuaded the landowners who sat in the House
of Lords to back the repeal on the grounds that the survival of the government was more important. But he privately conceded
that “those damned rotten potatoes” were to blame for the demise of the Corn Laws.

The lifting of the tariff on imported grain opened the way for imports of maize from America, though in the event the government
mishandled the aid effort and it made little difference to the situation in Ireland. The removal of the tariff also meant
that wheat could be imported from continental Europe to replace the much diminished Irish supply. In the second half of the
nineteenth century British wheat imports soared, particularly once the construction of railways in the United States made
it easy to transport wheat from the Great Plains to the ports of the East Coast. Within Britain, meanwhile, the shift from
agriculture to industry accelerated. The area of land under cultivation and the size of the agricultural workforce both went
into decline in the 1870s. By 1900, 80 percent of Britain’s main staple, wheat, was being imported, and the proportion of
the labor force involved in agriculture had fallen to less than 10 percent.

Coal was not the only fuel that drove the Industrial Revolution. The growth in agricultural productivity that had started
two centuries earlier (supplemented by sugar from the Caribbean) and the supply of wheat from Ireland (made possible by the
potato) had also played their part in carrying England over the threshold into the new industrial age. And by clearing away
the obstacle to a greater reliance on food imports, the tragedy of the potato famine helped to complete the transformation.

FOOD AND ENERGY REVISITED

It is no exaggeration to suggest that the Industrial Revolution marked the beginning of a new phase in human existence, just
as the Neolithic revolution associated with the adoption of farming had done some ten thousand years earlier. Both were energy
revolutions: Deliberate farming of domesticated crops made a greater proportion of the solar radiation that reaches Earth
available to mankind, and the Industrial Revolution went a step farther, exploiting solar radiation from the past, too. Both
caused massive social changes: a switch from hunting and gathering to farming in the former case, and from agriculture to
industry in the latter. Both took a long time to play out: It was thousands of years before farmers outnumbered hunter-gatherers
globally, and industrialization has only been under way for 250 years, so only a minority of the world’s population lives
in industrialized countries so far—though the rapid development of China and India will soon tip the balance. And both are
controversial: Just as it is possible to argue that hunter-gatherers were better off than farmers and that the adoption of
agriculture was a big mistake, a case can also be made that industrialization has caused more problems than it has solved
(though this argument is most often advanced by disillusioned people in rich, industrialized countries). There have been dramatic
environmental consequences in both cases, too: Agriculture led to widespread deforestation, and industrialization has produced
vast quantities of carbon dioxide and other greenhouse gases that have started to affect the world’s climate.

In this sense the industrialized countries have not escaped Malthus’s trap after all, but have merely exchanged one crisis,
in which the limiting factor was agricultural land, for another, in which the limiting factor is the atmosphere’s ability
to absorb carbon dioxide. The possibility that the switch to fossil fuels might provide only a temporary respite from Malthusian
pressures occurred even to nineteenth-century writers, notably William Stanley Jevons, an English economist and author of The Coal Question, published in 1865. “For the present,” he wrote, “our cheap supplies of coal and our skill in its employment, and the freedom
of our commerce with other wider lands, render us independent of the limited agricultural area of these islands, and apparently
take us out of the scope of Malthus’s doctrine.” The word “apparently” did not appear in the first edition of the book; Jevons added it to a later edition shortly before his death in 1882.

He was right to worry. In the early twenty-first century, concerns about the connection between energy supplies and
the availability of sufficient land for food production have been raised once again by the growing enthusiasm for biofuels,
such as ethanol made from maize and biodiesel made from palm oil. Making fuel from such crops is appealing because it is a
renewable source of energy (you can grow more next year) and over its life cycle it can produce fewer carbon emissions than
fossil fuels. As plants grow, they absorb carbon dioxide from the air; they are then processed into biofuel, and the carbon
dioxide goes back into the atmosphere when the fuel is burned. The whole process would be carbon neutral, were it not for
the emissions associated with growing the crops in the first place (fertilizer, fuel for tractors, and so on) and then processing
them into biofuels (something that usually requires a lot of heat). But exactly how much energy is required to produce various
biofuels, and the level of associated carbon emissions, varies from crop to crop. So some biofuels make more sense than others.

The type that makes the least sense is ethanol made from maize (corn), which is, unfortunately, the predominant form of biofuel,
accounting for 40 percent of world production in 2007, most of it in the United States. The best-guess figures suggest that
burning a gallon of corn ethanol produces only about 30 percent more energy than was needed to produce it, and reduces greenhouse-gas
emissions by about 13 percent compared with conventional fossil fuel. That may sound impressive, but the corresponding figures
for Brazilian sugarcane ethanol are about 700 percent and 85 percent respectively; for biodiesel made in Germany they are
150 percent and 50 percent. Put another way, making a gallon of corn ethanol requires four fifths of a gallon of fossil fuel
(not to mention hundreds of gallons of water), and does not reduce greenhouse-gas emissions by very much. America’s corn-ethanol
drive makes even less sense on economic grounds: To achieve these meager reductions in emissions, the United States government
subsidizes corn-ethanol production to the tune of some seven billion dollars a year, and also imposes a tariff on sugarcane
ethanol from Brazil to discourage imports. Corn ethanol seems to be an elaborate scheme to justify farming subsidies, rather
than a serious effort to reduce greenhouse-gas emissions. England abolished its farmer-friendly Corn Laws in 1846, but America
has just introduced new ones.
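
The arithmetic behind these comparisons can be made explicit. Here is a minimal sketch in Python, assuming the quoted percentages translate directly into energy ratios; the numbers are simply the rounded best-guess figures above, not independent data:

```python
# Rough energy-balance arithmetic for the biofuel figures quoted above.
# "30 percent more energy than was needed to produce it" is read here as
# an energy ratio (energy out / energy in) of about 1.3, and so on.
fuels = {
    # fuel: (energy out per unit of energy in, emission cut vs. fossil fuel)
    "US corn ethanol":             (1.3, 0.13),
    "Brazilian sugarcane ethanol": (8.0, 0.85),  # ~700% more energy out
    "German biodiesel":            (2.5, 0.50),  # ~150% more energy out
}

for fuel, (ratio, emission_cut) in fuels.items():
    # Fossil-energy input needed per unit of biofuel energy delivered.
    fossil_input = 1 / ratio
    print(f"{fuel}: {fossil_input:.2f} units of fossil energy per unit "
          f"of biofuel energy; ~{emission_cut:.0%} emissions reduction")

# For corn ethanol, 1 / 1.3 is about 0.77, roughly the "four fifths of a
# gallon of fossil fuel" per gallon of ethanol cited in the text.
```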

Enthusiasm for corn ethanol and other biofuels is one of the factors that has helped to drive up food prices as crops are
diverted into fuel production, so that they are in effect fed to cars, not people. Opponents of biofuels like to point out that
the maize needed to fill a vehicle’s twenty-five-gallon tank with ethanol would be enough to feed one person for a year. Since
maize is also used as an animal feed, its higher price makes meat and milk more expensive, too. And as farmers switch their
land from growing other crops to growing corn instead, those other crops (such as soy) become scarcer, and their prices also
rise. Food and fuel are, it seems, once again competing for agricultural land. Cheap coal meant that English landowners in
the eighteenth century realized their land was more valuable for growing food than fuel; concern about expensive oil today
means American farmers are making the opposite choice, and growing crops for fuel rather than for food.

Biofuels need not always compete with food production, however. In some cases, it may be possible to grow biofuel feedstocks
on marginal land that is unsuitable for other forms of agriculture. And those feedstocks need not be food crops. One potentially
promising approach is that of cellulosic ethanol, in which ethanol is made from fast-growing, woody shrubs, or even from trees.
In theory, this would be several times more energy efficient even than sugarcane ethanol, could reduce greenhouse-gas emissions
by almost as much (a reduction of around 70 percent compared with fossil fuels), and would not encroach upon agricultural
land. The problem is that the field is still immature, and expensive enzymes are needed to break down the cellulose into a
form that can be made into ethanol. Another approach involves making biofuel from algae, but again the technology is still
in its early days.

What is clear is that the use of food crops for fuel is a step backward. The next logical step forward, after the Neolithic
and Industrial revolutions, is surely to find new ways to harness solar energy beyond growing crops or digging up fossil fuels.
Solar panels and wind turbines are the most obvious examples, but it may also be possible to tinker with the biological mechanism
of photosynthesis to produce more efficient solar cells, or to create genetically engineered microbes capable of churning
out biofuels. The trade-off between food and fuel has resurfaced in the present, but it belongs in the past.

Amateurs talk tactics, but professionals talk logistics.

—ANONYMOUS

The fate of Europe and all further calculations depend upon the question of food. If only I have bread, it will be child’s
play to beat the Russians.

—NAPOLEON BONAPARTE

“MORE SAVAGE THAN THE SWORD”

What is the most devastating and effective weapon in the history of warfare? It is not the sword, the machine gun, the tank,
or the atom bomb. Another weapon has killed far more people and determined the outcomes of numerous conflicts. It is something
so obvious that it is easy to overlook: food, or more precisely, control of the food supply. Food’s power as a weapon has
been acknowledged since ancient times. “Starvation destroys an army more often than does battle, and hunger is more savage
than the sword,” noted Vegetius, a Roman military writer who lived in the fourth century A.D. He quoted a military maxim that
“whoever does not provide for food and other necessities, is conquered without fighting.”

For most of human history, food was literally the fuel of war. In the era before firearms, when armies consisted of soldiers
carrying swords, spears, and shields, food sustained them on the march and gave them the energy to wield their weapons in
battle. Food, including fodder for animals, was in effect both ammunition and fuel. Maintaining the supply of food was therefore
critical to military success; a lack of food, or its denial by the enemy, would lead swiftly to defeat. Before the advent
of mechanized transport, keeping an army supplied with food and fodder often imposed significant constraints on where and
when it could fight, and on how fast it could move. Although other aspects of warfare changed dramatically from ancient times
to the Napoleonic era, the constraints imposed by food persisted. Soldiers could only carry a few days’ worth of supplies
on their backs; using pack animals or carts allowed an army to carry more supplies and equipment, but fodder for the animals
was then needed, and the army’s speed and mobility suffered.

This was recognized in the fourth century B.C. by Philip II of Macedonia, who introduced a number of reforms that were extended
by his son, Alexander, to create the fastest, lightest, and most agile force of its day. Families, servants, and other followers,
who sometimes equaled the soldiers in number, were restricted to an absolute minimum, allowing the army to throw off its
immense tail of slow-moving people and carts. Soldiers were also required to carry much of their own equipment and supplies,
with pack animals rather than carts carrying the rest. With fewer animals there was less need to find fodder, and the army
became more mobile, particularly over difficult terrain. All this gave Alexander’s army a clear advantage, allowing him to
launch lightning strikes that struck fear into his enemies, according to Greek historians. Satibarzanes, a Persian governor,
“learning of Alexander’s proximity and astounded at the swiftness of his approach, fled with a few Arian horsemen.” The Uxians,
a Persian hill tribe, were “astounded by Alexander’s swiftness, and fled without so much as coming to close quarters.” And
Bessus, a treacherous Persian nobleman, was “greatly terrified by Alexander’s speed.” Alexander’s mastery of the mechanics
of supplying his army—a field known today as logistics—enabled him to mount one of the longest and most successful military
campaigns in history, conquering a swath of territory from Greece to the Himalayas.

Armies in history rarely brought along all of their own food supplies, however, and Alexander’s was no exception. Food and
fodder were also drawn from the surrounding country as the soldiers marched through. Such foraging could be an efficient way
to feed an army, but it had the disadvantage that if the soldiers stopped moving, the local area would be rapidly depleted.
Initially the army would have plenty of food at its disposal, but on each successive day foraging parties would have to travel
farther to reach areas that had not yet been stripped of food. Alexander’s rule of thumb, which was still valid centuries
later, was that an army could only forage within a four-day radius of its camp, because a pack animal devours its own load
within eight days. An animal that travels four days through barren country to gather food must carry four days’ worth of food
for its outward journey; it can then load up with eight days’ worth of food, but will consume half of this on the return journey,
leaving four days’ worth—in other words, the amount it started off with. The length of time an army could stay in one place
therefore depended on the richness of the surrounding country, which in turn depended on the population density (more people
would generally have more food that could be appropriated) and the time of year (there would be plenty of food available just
after the harvest, and very little available just before it). Alexander and other generals had to take these factors into
account when choosing the routes of marches and the timing of campaigns.
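
Alexander’s four-day rule is straightforward bookkeeping. A minimal sketch of the calculation, assuming only what the text states, that a pack animal carries at most eight days’ worth of its own rations:

```python
# Net food a pack animal delivers to camp after foraging at a given
# distance, measured in days' worth of the animal's own rations.
CAPACITY_DAYS = 8  # an animal devours its own load within eight days

def net_gain(travel_days: int, capacity: int = CAPACITY_DAYS) -> int:
    """Days' worth of food gained by the camp per round trip.

    The animal carries `travel_days` worth of food outward (drawn from
    camp stores), loads a full `capacity` at the foraging site, and
    eats another `travel_days` worth on the way back.
    """
    carried_out = travel_days            # eaten on the outward leg
    delivered = capacity - travel_days   # what survives the return leg
    return delivered - carried_out       # net change in camp stores

for d in range(1, 6):
    print(f"{d}-day radius: net gain {net_gain(d):+d} days of food")

# At a four-day radius the net gain is zero: the animal returns with
# exactly the amount of food it set out with, so foraging any farther
# from camp costs more food than it brings in.
```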

Delivering supplies in bulk to an army on campaign was best done by ship, which was the only way to move large quantities
of food quickly in the ancient world. Pack animals or carts could then carry supplies the last few miles from the port to
the army’s inland bases when necessary. This compelled armies to operate relatively close to rivers and coasts. As Alexander
conquered the lands around the Mediterranean he was able to rely on his fleet to deliver supplies, provided his soldiers secured
the ports along the coast beforehand. Moving from port to port, the soldiers carried a few days’ worth of supplies and supplemented
them by living off the land when possible. In the centuries after Alexander’s death, the Romans took his logistic prowess
a stage further. They established a network of roads and supply depots throughout their territory to ensure that supplies
could be moved quickly and in quantity when needed. Their depots were resupplied by ship, which made it difficult for Roman
armies to operate more than seventy-five miles from a coast or a large river. This helps to explain why Rome conquered the
lands around the Mediterranean, and why the northern boundaries of its territory were defined by rivers. Maintaining permanent
supply depots meant that a large force could move quickly through Roman territory without having to worry about finding food
or fodder. The Roman army also introduced rules to govern the process of foraging while on campaign.

In enemy territory, demanding food requisitions from the surrounding area served two purposes: It fed the invading army and
impoverished the local community. Food in such situations was literally a weapon: A marauding army could strip a region bare
and cause immense hardship. As a medieval Chinese military handbook puts it, “If you occupy your enemy’s storehouses and
granaries and seize his accumulated resources in order to provision your army continuously, you will be victorious.” Sometimes
merely the threat of seizure was enough. In Alexander’s case, local officials often surrendered to him before he entered their
territory and agreed to provide food for his army, in return for more lenient treatment. As Alexander advanced into the Persian
Empire, this was a deal that local governors were increasingly happy to agree to.

Conversely, removing or destroying all food and fodder in the path of an advancing army (a so-called scorched-earth policy)
provided a way to use food defensively. An early example came during the Second Punic War between Rome and Carthage, during
which Hannibal, the Carthaginian general, humiliated the Romans by rampaging around Italy with his army for several years.
In an effort to stop him, a proclamation was issued that “all the population settled in the districts through which Hannibal
was likely to march should abandon their farms, after first burning their houses and destroying their produce, so that he
might not have any supplies to fall back upon.” This ploy failed, but on other occasions in history it was highly effective.
Another defensive strategy was to deny the enemy access to food-processing equipment. In order to delay the advance of Spanish
troops in 1636, French generals were instructed to “send out before them seven or eight companies of cavalry in a number of
places, with workers to break all the ovens and mills in an area stretching from their own fronts to as close as possible
to the enemy.” Without ovens and mills, seized grain could not be turned into bread, and soldiers would have to make camp
for a couple of days to set up portable ovens.

All these food-related constraints on the waging of war persisted throughout most of human history, despite the emergence
of new technologies such as firearms. But over time the supply systems used by armies became steadily more elaborate. In
particular, warfare in eighteenth-century Europe became increasingly formalized, and armies came to rely less on requisitions
and foraging, which they regarded as old-fashioned and uncivilized, and more on supplies amassed in depots and delivered by
wagon trains. Professional soldiers expected to be fed and paid while on campaign; they did not expect to have to forage for
food. The resulting need to build up supplies beforehand meant that campaigns had to be planned long in advance. With armies
tethered to their supply depots, lightning strikes or long marches were out of the question. One historian has likened wars
of this period to “the jousting of turtles.”

The American Revolutionary War of 1775–1783 provides a microcosm of how logistical considerations could still be crucial in
determining the outcome of a conflict, centuries after Alexander and Hannibal. In theory, the British should easily have been
able to put down the rebellion among their American colonists. Britain was the greatest military and naval power of its day,
presiding over a vast empire. In practice, however, supplying an army of tens of thousands of men operating some three thousand
miles away posed enormous difficulties. Britain’s 35,000 soldiers required 37 tons of food a day among them (a pound of beef
each, plus some peas, bread, and rum); their 4,000 horses needed a further 57 tons.
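
Those totals imply plausible daily rations. A rough check, assuming British long tons of 2,240 pounds (the text does not specify which ton is meant, so the per-head figures are approximate):

```python
# Back-of-the-envelope check of the British supply figures quoted above.
# Assumes British long tons (2,240 lb); the per-head results are rough.
LB_PER_TON = 2240

soldiers, food_tons = 35_000, 37
horses, fodder_tons = 4_000, 57

food_per_soldier = food_tons * LB_PER_TON / soldiers   # ~2.4 lb/day
fodder_per_horse = fodder_tons * LB_PER_TON / horses   # ~31.9 lb/day

print(f"~{food_per_soldier:.1f} lb of food per soldier per day")
print(f"~{fodder_per_horse:.1f} lb of fodder per horse per day")

# About 2.4 lb a day squares with "a pound of beef each, plus some
# peas, bread, and rum"; about 32 lb of hay and oats is roughly in
# line with what a working horse eats.
```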

To start with, the British commanders expected their soldiers’ reliance on supplies delivered across the Atlantic by ship
to be temporary. They hoped that American loyalists would rally to their cause, allowing the army to draw food and fodder
from the country in loyalist areas. But this proved to be impractical, both because of the quantities required and because
requisitioning food alienated the loyalists on whose support the British strategy depended. Many of the British troops, accustomed
to Europe’s more formal style of warfare, lacked experience in foraging and felt that it was beneath them. The troops
found themselves penned up near ports for security, dependent on supplies brought in by sea and unable to move very far inland.
Attempts to enlarge the area under control provided a larger area in which to forage, but they caused resentment among the colonists,
who stopped producing food or mounted guerrilla resistance. Foraging expeditions sent beyond the British lines
required covering forces of hundreds of troops. A small group of rebels could harass a much larger foraging party, picking
off men using ambushes and snipers. The British lost as many men in such skirmishes as they did in larger pitched battles.

Unwilling to venture inland, where their movements would end up being determined by the needs of supply rather than military
strategy, the British concluded that they would need to build up a reserve of at least six months’ worth of food (and ideally
a year’s worth) before mounting a major offensive, a condition that was met only twice over the course of the eight-year war.
The shortage of supplies also meant that the British were unable to press their advantage when the opportunity arose, repeatedly
giving their opponents the chance to regroup. The British failed to strike a decisive blow in the early years of the conflict,
and after other European powers entered the war on America’s side it became clear that Britain could not win.

The American forces suffered from supply problems of their own, but they had the advantage of being on familiar territory,
and could draw manpower and supplies from the country in a way the British could not. As George Washington, the commander
in chief of the American forces, remarked shortly afterward, “It will not be believed that such a force as Great Britain has
employed for eight years in this Country could be baffled in their plan . . . by numbers infinitely less, composed of men
sometimes half starved; always in rags, without pay, and experiencing, at times, every species of distress which human nature
is capable of undergoing.” Britain’s failure to provide adequate food supplies to its troops was not the only cause of its
defeat, and of America’s subsequent independence. But it was a very significant one. Logistical considerations alone do not
determine the outcome of military conflicts, but unless an army is properly fed, it cannot get to the battlefield in the first
place. Adequate food is a necessary, though not sufficient, condition for victory. As the Duke of Wellington put it: “To gain
your [objectives] you must feed.”

“AN ARMY MARCHES ON ITS STOMACH”

In the early hours of October 5, 1795, a promising young artillery officer named Napoleon Bonaparte was put in charge of the
forces defending the French government, known as the National Convention. It had been elected in 1792, in the wake of the
French Revolution that had overthrown the monarchy, but there were still large numbers of royalist sympathizers in the country.
An army of thirty thousand royalists was now advancing on the Tuileries Palace in Paris, where the convention’s members had
taken refuge. Napoleon immediately sent a cavalry officer to fetch forty cannons and their crews, and by dawn he had positioned
them carefully in the streets around the palace and had them loaded with grapeshot. His defending forces were outnumbered
six to one, and at one point Napoleon had his horse shot out from under him as he directed his men. When the royalist columns
launched their main attack, the defending troops managed to channel them toward the main bank of guns, positioned in front
of a church. Napoleon gave the order to fire, and the cannons cut down the royalist troops with devastating effectiveness,
causing the survivors to turn and flee. “As usual I did not receive a scratch. I could not be happier,” Napoleon wrote to
his brother Joseph afterward. It was to prove a turning point in his career.
