
The third reason for the development of probability concepts was the development of jurisprudence. The awareness of the validity of legal arguments was increasing in Europe in those days, as was the recognition that legal proof that leads to conviction or acquittal beyond all reasonable doubt was almost impossible to obtain. Probability considerations in legal arguments had already appeared in ancient times, but with social progress the requirement arose that litigants base their case on quantitative assessments of the probability that their case was correct. In any event there was a growing need to develop terminology and methods of analysis that would help in such quantification.

The questions that arise in an analysis of chances and risks in ensuring payments are totally different from those that arise in, say, legal claims. Claims about betting, insurance, public-opinion surveys, and so on, relate to a random occurrence among many possibilities or to random sampling from a large population, namely, statistics. In contrast, legal claims generally relate to a single, non-repeated event or to the degree of belief in the correctness of a certain argument. There is no apparent reason to expect the same theory and the same mathematical methods to apply to both types of situation, yet the same terminology of probability theory is used in the analysis of repeated and one-off incidents. This is the duality inherent in probability theory and statistics, and the pioneers of the theory were aware of it. We will discuss this in more detail later on.

A few years after the Pascal-Fermat correspondence, in 1657, Christiaan Huygens (1629–1695) published a book that summarized his own work on the theory of probability and the knowledge on the subject that had been accumulated until then. He was encouraged to publish it by Artus Gouffier de Boissy, Duke of Roannez, who was also a patron of Pascal. Despite his efforts, Huygens never met Pascal face-to-face, as Pascal was at that time more involved in theology than mathematics, but he was well aware of Pascal's work. Huygens's book was the first published on the subject of probability and was entirely devoted to an analysis of the randomness related to games of chance, repayments of loans, and so on, in other words, its statistical aspects. Among other things, he wrote about the methods of calculation developed over many years by mathematicians related to random draws, counting methods, and the like. Huygens, who was Dutch, was one of the most respected and esteemed mathematicians, physicists, and astronomers of the period. Apart from his contribution in various other fields, he was famous for his explanation of the propagation of waves before the wave equation had been formulated. He traveled widely on the Continent and to Britain, and in 1663 he was elected to the Royal Society and later to the French Academy.

Huygens's book was also the first to discuss the idea of an average and of expected value. He coined the term in Latin, expectatio ("expectation"). Today, engulfed as we are by statistical data of all sorts on all subjects and in all locations, it is difficult to imagine that until the middle of the seventeenth century the concept of the average was not in widespread use as a statistical quantity. At that time averages were calculated by physicists to obtain a good estimate from inaccurate measurements, for instance of the paths of the celestial bodies, but not for statistical analyses. The mathematical analysis of games of chance and expected future payments led naturally to the development and use of the concept of the average. The first well-ordered presentation of that concept appeared in Huygens's book.

The discussion was incisive. To illustrate: Christiaan's brother Ludwig used the existing mortality tables and found that the average lifetime of someone born in London was eighteen years. “Does that number have any significance?” Ludwig asked his brother. It is known, he claimed, that infant mortality is very high and many children die before they are six years old, whereas those who do survive live to the age of fifty or longer. Indeed, a good question, and in our day the answer would be that different indices are used for different purposes. Christiaan Huygens did not deal with these situations but focused on the subject of gambling, and he stuck to his opinion that expectation was the right measure for estimating the value of bets, with the value being the expected payments weighted according to the chances of winning.
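
To see why Ludwig's question bites, consider a purely illustrative calculation (the numbers are invented, not taken from the London tables): if 60 percent of newborns die at around age two and the remaining 40 percent live to age forty-two, the average lifetime is 0.6 × 2 + 0.4 × 42 = 18 years, a figure that describes neither the children who die young nor the adults who survive.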

Here we will repeat the precise mathematical definition of expectation. If the monetary payments A₁, A₂, …, Aₙ in a draw are won with probabilities p₁, p₂, …, pₙ, respectively, the expectation of the draw is p₁A₁ + p₂A₂ + … + pₙAₙ. Just as with the concept of the average, questions about the justification and interpretation of the idea of expectation were asked, forcefully, because the idea of probability had not been sufficiently clarified. It was recognized only in the context of relative frequency; in other words, p₁ in the definition is the approximate proportion in which the win of A₁ will occur if the draw is repeated many times.
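
As a minimal illustration of the definition (the payments and probabilities below are invented for the example), the expectation of a draw is simply the probability-weighted sum of its possible payments:

```python
# A minimal sketch of Huygens's expectation: the probability-weighted
# sum of the possible payments of a draw. The numbers are illustrative.

payments = [0.0, 10.0, 50.0]        # A1, A2, ..., An: possible winnings in dollars
probabilities = [0.5, 0.4, 0.1]     # p1, p2, ..., pn: chances of each payment

assert abs(sum(probabilities) - 1.0) < 1e-9  # the probabilities must sum to one

expectation = sum(p * a for p, a in zip(probabilities, payments))
print(f"Expectation of the draw: {expectation:.2f} dollars")  # 0.5*0 + 0.4*10 + 0.1*50 = 9.00
```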

One basic question that was asked was: Where do these probabilities come from, and how are they calculated? In throws of a die, for example, as there is no reason that one face should come up more than any other, the probability of each face coming up can be calculated; but how, Huygens had already asked himself, can the probability of catching a particular disease or being injured in an accident be calculated?

Despite the difficulty in establishing the concepts, their relevance and potential uses were clear. The first use was statistical analyses. The word statistics is derived from the word state, in the sense of country, and statistics did in fact deal mainly with subjects related to the management of a country. The most rapid progress was made in Holland, in relation to the pricing of the repayment of loans. This was because both the politician responsible for that subject, Johan de Witt, a leading figure in Dutch political life, and Johannes Hudde, the burgomaster (mayor) of Amsterdam, a city that was hard hit by promises of unrealistic payments, were good mathematicians. They both engaged in and contributed to Cartesian geometry. They consulted Huygens himself, who was then in Amsterdam, and in 1671 de Witt published a book in which he set out the theory as well as the practice of calculating loan repayments under various conditions, with detailed exemplary calculations. The book came with a confirmation by Hudde that the calculations were correct. The method soon spread across all of Europe. Of all countries it was England that was slow in adopting this use of the theory, and even a hundred years later the government authorities there were still selling pensions at prices that were not based on proper mathematical costings.
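
The flavor of such a calculation can be conveyed by a small sketch. It is not de Witt's actual procedure; the survival probabilities and interest rate below are invented, and the computation simply discounts each promised yearly payment and weights it by the chance that the recipient is still alive to collect it:

```python
# A hedged sketch of pricing a life annuity: the expected present value of a
# stream of yearly payments, each weighted by an (invented) probability that
# the annuitant survives to receive it and discounted at an assumed interest rate.

annual_payment = 100.0          # dollars paid at the end of each year (illustrative)
interest_rate = 0.04            # assumed yearly interest rate
# Invented probabilities of surviving at least 1, 2, ..., 5 more years:
survival_probabilities = [0.95, 0.89, 0.82, 0.74, 0.65]

price = sum(
    p_survive * annual_payment / (1 + interest_rate) ** (year + 1)
    for year, p_survive in enumerate(survival_probabilities)
)
print(f"Fair price of the annuity: {price:.2f} dollars")
```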

Alongside the development of the practical statistical use of the mathematics of randomness, progress was also made in theory regarding the connection between probability and legal evidence. Leibniz was the leader in this field; in 1665 he published a paper on probability and law, with a more detailed version published in 1672. Leibniz's interpretation of probability was similar to the sense in which Aristotle viewed it, that is, the likelihood of an event in light of partial information. Leibniz, who came from a family of lawyers, tried to present quantitative measures of the correctness of legal claims. After a visit to Paris, and after becoming familiar with the Fermat-Pascal correspondence as well as Pascal's wager, Leibniz realized that a similar analysis could also be used in cases in which one had to assess the probability that a claim or a certain piece of evidence, even if it is one-off, that is, non-repeated, is correct or not. He analyzed the logic underlying the information brought before a judge and proposed measuring the likelihood of a conclusion, giving it a value between zero (in the event that the conclusion is clearly wrong) and one (in the case where it is without doubt correct). Thus Leibniz laid the foundation of the analogy between the likelihood of an occurrence and the mathematics of situations that are repeated randomly. Herein apparently lies the secret of the great impact of the Fermat-Pascal correspondence and Pascal's wager: they used the same mathematical tools to discuss events that can be repeated, such as a game of chance stopped in the middle, and non-repeated occurrences, such as the question of the existence of God. Yet neither Leibniz nor others who dealt with the concepts of likelihood and probability reached an understanding or consensus on what these probabilities derive from or how they are formed.

39. THE MATHEMATICS OF PREDICTIONS AND ERRORS

A crucial step forward in establishing the link between the concept of expectation and the practical use of statistics was made by Jacob Bernoulli (1654–1705), one of the most prominent members of a family that had a great influence on mathematics. He first analyzed repeated tosses of a coin, assuming that the chances of its falling on either side were equal. By sophisticated use of Newton's binomial formulae, Bernoulli examined the chance that in repeated tosses of a coin the number of times a particular side, say heads, comes up would be close to 50 percent of the total number of flips. He found that this chance grew closer and closer to certainty the greater the number of flips of the coin. Clearly, in a given series of tosses the proportion of heads to the total number of throws could have any value between zero and one. Nevertheless, as Bernoulli showed, as the number of throws increases, it is almost certain that the number of times heads appears will be close to 50 percent of the total number of throws. Such trials are still called Bernoulli trials today, and the corresponding mathematical law is called the weak law of large numbers (the formulation of the strong law of large numbers would have to wait for twentieth-century developments).
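
A short simulation conveys what Bernoulli proved (the code is illustrative, not part of his argument): as the number of fair-coin tosses grows, the proportion of heads settles ever closer to one half.

```python
# Illustrative simulation of Bernoulli trials and the weak law of large numbers:
# the observed fraction of heads in n fair-coin tosses approaches 1/2 as n grows.
import random

random.seed(0)  # fixed seed so the run is reproducible

for n in [10, 100, 1_000, 10_000, 100_000]:
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>7} tosses: fraction of heads = {heads / n:.4f}")
```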

Bernoulli himself and others who contributed to this innovative line of research extended the results to cases more general than repeated flips of a coin, including repeated sampling from a large population and random errors in inexact measurements. As we have noted previously, to assess a physical quantity whose measurement entails error, physicists used an average of many measurements. The mathematical result confirmed that the greater the number of repetitions or measurements, provided the repetitions are carried out entirely independently of each other, the more likely it is that the average of all the measurements will be close to the true value, with a likelihood that converges to certainty. At the same time, Bernoulli discussed the question of what creates the different probabilities and how confident we can be in their numerical values. He was apparently the first to distinguish between a priori probability, which we can calculate and derive from the conditions of the experiment, and a posteriori probability, which we observe after performing a series of experiments. He made the development of methods for calculating a posteriori probabilities an objective in itself, and it played an important role in future progress.
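
A small illustration of the distinction (again only a sketch): for a fair die the a priori probability of throwing a six is 1/6, derived from the symmetry of the die alone, while the a posteriori probability is the proportion of sixes actually observed in a series of throws; the two draw together as the number of throws grows.

```python
# Illustrative comparison of an a priori probability (1/6, from the symmetry of
# a fair die) with the a posteriori estimate obtained from simulated throws.
import random

random.seed(1)
a_priori = 1 / 6

for n in [60, 600, 6_000, 60_000]:
    sixes = sum(random.randint(1, 6) == 6 for _ in range(n))
    print(f"{n:>6} throws: a posteriori = {sixes / n:.4f}   (a priori = {a_priori:.4f})")
```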

Bernoulli's weak law of large numbers is one of the limit rules that refer to statistical aspects of large samples or many repetitions of trials. Even then, discrepancies were found between claims regarding large numbers and human intuition. Further on we will discuss in greater detail the discrepancy between intuition and the mathematics of randomness, but here we will give just two examples.

The first discrepancy is generally referred to as the gambler's fallacy. Many gamblers continue betting even when they are losing, believing that the laws of large numbers ensure that they will get their money back eventually. Their mistake is that the law says only that, on average, their winnings will be close to the expectation; it says nothing about the size of the accumulated win or loss. Even if the average of the wins and losses in a series of games is one dollar, the loss itself, in the case of many repeated bets, may be ten thousand or a million dollars. The difference between the average and the actual value gives an enormous advantage to the gambler who is wealthy and can finance the loss until the probability turnaround arrives. That difference has led to the bankruptcy of many gamblers who did not have deep enough pockets. Evolution did not give us an intuitive understanding of the difference between the average and the value itself when dealing with large numbers; the reason, apparently, is that in the course of evolution humans did not encounter so many repetitions of events.
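
A simulation makes the distinction concrete (illustrative code only): in a long series of fair one-dollar bets, the average result per bet stays close to zero, exactly as the law promises, while the accumulated win or loss can wander to hundreds of dollars.

```python
# Illustrative simulation: in fair one-dollar bets the average result per bet
# approaches the expectation (zero), yet the cumulative total can drift far from it.
import random

random.seed(2)
total = 0
n_bets = 100_000

for i in range(1, n_bets + 1):
    total += 1 if random.random() < 0.5 else -1  # win or lose one dollar
    if i in (100, 10_000, 100_000):
        print(f"after {i:>7} bets: average per bet = {total / i:+.4f}, "
              f"cumulative total = {total:+d} dollars")
```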

The second discrepancy between intuition and probability concepts is known as the St. Petersburg paradox, named after the city in which Daniel Bernoulli, nephew of Jacob Bernoulli, presented the problem to the Imperial Academy of Sciences. Remember that Huygens regarded the expectation of a lottery as a fair measure of the cost of participating in it. That approach proved itself as an accounting basis for calculating loan repayments or the price of participating in a lottery. Now consider a lottery in which a coin is tossed many times, say a million. If the first time the coin falls tails up is on the nth throw, that is, until then it fell n − 1 times with heads up, the participant receives 2ⁿ dollars. A simple calculation shows that the expected winnings are one million dollars! Would you agree to pay one hundred thousand dollars, or even ten thousand dollars, to participate in this lottery? I do not know anyone who would agree to do so, and this difference between the theory and the practice is the paradox. Daniel Bernoulli had a social explanation for this, and we will discuss it in the next chapter. A different explanation, which we will also expand on in due course, is the gulf between mathematics and intuition. The latter tells us that the coin will not fall on the same side a large number of times and ignores the chance, small but carrying high winnings, that such an event will occur.
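
Both sides of the paradox can be seen in a short computation (the code is only an illustration). With the game capped at one million tosses, each possible stopping round contributes (1/2)ⁿ × 2ⁿ = 1 dollar to the expectation, so the expectation is indeed about a million dollars; yet simulated plays show that the typical winnings are only a few dollars.

```python
# Illustrative look at the St. Petersburg game, capped at 1,000,000 tosses:
# the expectation is enormous, but typical winnings in simulated plays are small.
import random

CAP = 1_000_000

# Each possible stopping round n contributes (1/2)**n * 2**n = 1 dollar,
# so with the cap the expectation is one dollar per round.
expectation = CAP * 1
print(f"Expected winnings: {expectation:,} dollars")

random.seed(3)

def play_once() -> int:
    """Toss a fair coin until tails appears (or the cap is hit); pay 2**n dollars."""
    n = 1
    while n < CAP and random.random() < 0.5:  # heads: keep tossing
        n += 1
    return 2 ** n

winnings = [play_once() for _ in range(10)]
print("Winnings in ten simulated plays:", winnings)
```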
