Theodore M. Porter
Abstract--The quantification of uncertainty has since the time of Pascal and Fermat
been tied to a program of social rationalization and enlightenment. Probability and
statistics, though their historical roots are distinct, have always been united in this:
that they provide a way of understanding and hence controlling the uncertainties of
change. Mathematical probability arose out of legal traditions involving the valuation
of evidence, and by the eighteenth century was generally conceived as the scientific
surrogate for native good sense in a world where this was too often lacking. Statistics,
a science of nineteenth-century origins, began as the application of numerical
reasoning to the problems of society. Its job was to uncover the order of large
numbers that prevailed beneath the conspicuous turbulence and unpredictability of
surface events, and to provide the legislator with the knowledge needed to combat
potential instabilities in social development. The ideas and techniques that were
developed in the context of these social and philosophical projects have since become
standard tools of the sciences, natural as well as social. Mathematical statistics might
thus appear to reflect the social interests of those who have invented and applied it
rather than to provide true knowledge about nature, or, in the current jargon, to be
"socially constructed.
This paper is a historical exploration of the mathematical assault on chance
and risk in the centuries before acting under uncertainty became a preoccupation of
specialized social and behavioral sciences. I seek to show how thoroughly the history
of probability and statistics contravenes the standard view of the hierarchy of the
sciences, to show that the origins of some of the most fundamental ideas and
assumptions of mathematical statistics, statistical physics, and population genetics lie
in the social sciences, and even in social ideologies. I aim also to clarify the relations
between the content of the sciences and their social contexts by defining this sharply
for the case of statistics. The paper will defend a thoroughly contextualist approach
to the history of science, but not one that reduces science to institutional forms and
social relations.
I. CLASSICAL PROBABILITY
The St. Petersburg problem arises from a game in which Peter will toss a coin until it turns up heads.
If this occurs on the first try, Paul owes him $1; if on the second, $2; if
on the third, $4; and if not until the nth toss, Paul must pay him $2^(n-1).
The question is: what is a fair price for Peter to pay Paul for the rights
to the proceeds of this game?
Mathematicians quite naturally looked to probability for an
answer. More than that, they required probability to give a good
answer. It was after all the logic of good sense. Failure to do so would
virtually amount to a falsification of the theory. And this is why the St.
Petersburg problem was called a paradox. For probability did give a
simple answer--one that, though unproblematical mathematically,
manifestly contradicted good sense. The fair price in a gambling
problem was defined in probability theory as the mathematical
expectation of gain. Here the expectation was [(1/2) + 2(1/4) + 4(1/8)
+ ... + 2"'1(1/2") + ... ] = [1/2 + 1/2 + 1/2 + ... ], which, since the game
can continue indefinitely, is infinite. But, as Bernoulli remarked, no
sensible person would pay more than a few dollars to play such a game.
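The divergence is easy to verify numerically. In the following minimal sketch (the function name is my own, not the paper's), each round contributes payoff times probability, that is 2^(n-1) times (1/2)^n, or exactly 1/2, so the partial sums grow without bound:

```python
from fractions import Fraction

def partial_expectation(n_terms):
    """Sum the first n_terms of the St. Petersburg expectation:
    the nth round pays 2^(n-1) with probability (1/2)^n, so each
    term is exactly 1/2 and the series diverges."""
    return sum(Fraction(2 ** (n - 1), 2 ** n) for n in range(1, n_terms + 1))

print(partial_expectation(10))    # 10 terms of 1/2, prints 5
print(partial_expectation(1000))  # prints 500; the sum never settles
```

Exact rational arithmetic (`Fraction`) makes the point plainly: no matter how many terms are taken, the expectation only keeps climbing.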
Hence it was necessary either to give up the calculus of
probabilities, or to tinker with it. Daniel Bernoulli ([1738] 1954), a
cousin of Nicholas, chose the latter option. In a widely influential paper
that is now best remembered as the first intimation of the theory of
diminishing marginal utility, he defined a "moral expectation" which
increased not linearly with wealth but only in proportion to its
logarithm. After all, one dollar means much less to a person who
already has millions than to one on the verge of starvation. Hence very
small probabilities of proportionately large gains are worth almost
nothing, so that the sequence declines sharply and the total moral
expectation of the game is a modest number. Jean d'Alembert (see
Daston, 1979) took a more radical course, proposing to solve the St.
Petersburg problem with a change that radically undermined the
conventional theory of probabilities. A very long run of tails, he argued,
is mathematically possible but physically impossible. The probability of
a new throw of heads must go down as the number of consecutive
identical tosses goes up. Good sense supports this, he held, for if tails
were tossed even 100 times in a row, nobody would believe the coin was
fair. D'Alembert's solution, in contrast to Bernoulli's, was condemned
unanimously by probability writers, and it caused something of a scandal
that a mathematician so brilliant could be so stubbornly wrongheaded
about probability. But we can at least see that he, like Bernoulli, was
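Both rescues lend themselves to quick numerical checks. The sketch below is my own illustration, not a computation found in the sources: it evaluates a certainty equivalent in the spirit of Bernoulli's logarithmic moral expectation, and the probability of the run of 100 tails that d'Alembert declared physically impossible.

```python
import math

def moral_price(wealth, n_terms=200):
    """Sure sum worth as much as the St. Petersburg gamble under
    log utility, for a player with the given starting wealth.
    (A sketch in Bernoulli's spirit, not his own computation.)"""
    expected_log_wealth = sum(
        (0.5 ** n) * math.log(wealth + 2 ** (n - 1))
        for n in range(1, n_terms + 1)
    )
    return math.exp(expected_log_wealth) - wealth

# The 'moral' worth of the game stays modest at any realistic wealth.
for w in (10, 100, 1000):
    print(f"wealth {w}: fair price about {moral_price(w):.2f}")

# d'Alembert's physically 'impossible' event: 100 tails in a row.
print(f"P(100 tails in a row) = {0.5 ** 100:.1e}")
```

For a player holding ten dollars the price comes out near three dollars, and it rises only slowly with wealth, which is exactly Bernoulli's point that the game is worth a modest sum; the hundred-tail run, meanwhile, has a probability on the order of 10^-31.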
II.
III.
the nineteenth century, when the statistical method had become well
established, a few scientists and philosophers began arguing that it was
unnecessary to endorse determinism even as a working assumption.
Now they had research methods that were no less capable of dealing
with a universe of chance. From the 1860s, social thinkers in much of
Europe and North America developed this view in a vigorous discussion.
Holistically-inclined German historical economists defended the idea of
a nondeterministic social science most strenuously. Precisely because
statisticians take averages over large numbers of individuals, they held,
the most impressive regularities are still quite compatible with the
existence of radical, unexplained human diversity--if not pure chance,
with which they were rather less enthralled (Porter, 1987).
In several essays from the early 1870s, Maxwell (e.g. 1873)
repeated this argument in a gentle criticism of Buckle, and proceeded
to draw out its implications for physics. Since the kinetic gas theory is
obliged to use the statistical method if it is to make any sense of this
chaos of colliding particles, it must fall short of a perfect, dynamical
explanation. Thus physics can no more exclude the operation of an
element of chance, or of some unknown cause, at the molecular level,
than social science can deduce from statistics who will commit suicide.
Against Buckle, Maxwell insisted that statistics is bound by its nature to
uncertainty, and can in no way provide evidence against free will.
Others maintained the opposite view. In physics for example,
Ludwig Boltzmann held stubbornly to his early faith in atomistic
reductionism and the complete determination of physical processes by
natural laws. Boltzmann, Maxwell's most brilliant successor in the
kinetic gas theory, was an admirer of Buckle. In one of his landmark
technical papers on the kinetic gas theory he offered the unexpected
argument that the adoption of a statistical method in physics no more
implies uncertainty in its results than does its employment by social
scientists, who have shown the most remarkable regularities to prevail
from year to year (Boltzmann, [1877] 1968). Statistics, then, was a
highly flexible source of philosophical arguments. If some statisticians
and physicists chose to interpret their results as reflecting variability and
uncertainty, we must look at least partly to external factors to
understand why. For German statisticians and historical economists like
G. F. Knapp and Wilhelm Lexis, these seem to reflect mainly their
political and social views, while Maxwell was moved more by religious
and philosophical considerations. Still, we can hardly imagine these
IV.
V. STATISTICS AS THE METHODOLOGY OF SCIENCE
VI.
VII.
REFERENCES