
CHAPTER 3

THE QUANTIFICATION OF UNCERTAINTY AFTER


1700: STATISTICS SOCIALLY CONSTRUCTED?

Theodore M. Porter

Abstract--The quantification of uncertainty has since the time of Pascal and Fermat
been tied to a program of social rationalization and enlightenment. Probability and
statistics, though their historical roots are distinct, have always been united in this:
that they provide a way of understanding and hence controlling the uncertainties of
change. Mathematical probability arose out of legal traditions involving the valuation
of evidence, and by the eighteenth century was generally conceived as the scientific
surrogate for native good sense in a world where this was too often lacking. Statistics,
a science of nineteenth-century origins, began as the application of numerical
reasoning to the problems of society. Its job was to uncover the order of large
numbers that prevailed beneath the conspicuous turbulence and unpredictability of
surface events, and to provide the legislator with the knowledge needed to combat
potential instabilities in social development. The ideas and techniques that were
developed in the context of these social and philosophical projects have since become
standard tools of the sciences, natural as well as social. Mathematical statistics might
thus appear to reflect the social interests of those who have invented and applied it
rather than to provide true knowledge about nature, or, in the current jargon, to be
"socially constructed.
This paper is a historical exploration of the mathematical assault on chance
and risk in the centuries before acting under uncertainty became a preoccupation of
specialized social and behavioral sciences. I seek to show how thoroughly the history
of probability and statistics contravenes the standard view of the hierarchy of the
sciences, to show that the origins of some of the most fundamental ideas and
assumptions of mathematical statistics, statistical physics, and population genetics lie
in the social sciences, and even in social ideologies. I aim also to clarify the relations
between the content of the sciences and their social contexts by defining this sharply
for the case of statistics. The paper will defend a thoroughly contextualist approach
to the history of science, but not one that reduces science to institutional forms and
social relations.

G. M. von Furstenberg (ed.), Acting under Uncertainty: Multidisciplinary Conceptions. Springer Science+Business Media Dordrecht, 1990.

Probability theory arose as the mathematics of uncertainty. The
term itself reveals a monumental act of scientific imperialism. For
originally "probability" had nothing to do with mathematics, or even
numbers. As Edith Sylla explains in this volume, a doctrine had
traditionally been called probable if, while lacking certainty, it came on
authority reliable enough to be worthy of belief, and of being acted
upon. In the aftermath of the striking new approach to gambling
problems developed by Blaise Pascal and Pierre de Fermat, a few
natural philosophers began to wonder if this non-demonstrative
knowledge, probability, might be reduced to mathematics. Pascal
([1670] 1962) himself, in perhaps the most famous of his Pensées,
applied the reasoning of mathematical expectation to belief in God,
arguing that the prospect of an infinite reward made Christian faith a
sensible wager no matter how great the odds against its truth. Thus was
mathematical probability, from its very inception, the instrument of a
brave campaign to bring important decisions into the domain of
calculation. Statistics, a science of early nineteenth-century origins, had
only slightly less extravagant ambitions. It was to be the quantitative
science of the state, or the method of science--counting and comparing
numbers--as applied to society. Statisticians aimed to promote social
reforms, leading towards a rational polity and a prosperous economy.
By the end of the nineteenth century, statistics and probability had come
together, forming what is still the standard mathematical tool for
dealing with uncertainty.
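Pascal's reasoning can be rendered in modern expectation notation (a reconstruction; the formalism, of course, is not his). If faith carries any probability p > 0 of an infinite reward, then

$$E[\text{belief}] \;=\; p \cdot \infty + (1 - p)\,c \;=\; \infty \;>\; E[\text{unbelief}]$$

for any finite worldly stake c, so the wager comes out favorable however long the odds against the truth of the faith.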
These aims were not mere embellishments of a pure
mathematical, or even a pure scientific, program of research. On the
contrary, the history of mathematized rationality and of quantitative
social science comes close to being a history of statistical mathematics
in its formative period. Such at least is the perspective offered in this
paper. I am concerned here with the effort to subject variation and
uncertainty to the guiding compass of mathematics in the centuries
before acting under uncertainty became a specialization within the
modern social sciences. The coverage is necessarily selective, and will
emphasize the ways in which the social mission of providing a guide to
action was interwoven with the invention of some key methods and
concepts of what has become the statistical method.
This, then, is a highly contextual history. Statistics will be
treated here as a social construct. Indeed, I am concerned here with the
relations of statistics to the broader society, and not merely with social
processes that are internal to the scientific community. This contextual
approach provides only a partial view of the history of science, but it is
an essential one if science is to be understood as a part of history. It is
not to be construed as an assault on the value of statistical methods, but
as an exploration of the growing role of quantification in modern history.

I.

CLASSICAL PROBABILITY

Jakob Bernoulli's Ars Conjectandi ([1713] 1968) set out a
program to make belief an exact science, to put numbers to all our
uncertainties, to reduce rationality to mathematics. That campaign
culminated in Pierre Simon Laplace's use of what is now called Bayes'
Theorem to define what rationality means--whether in judicial decisions,
business calculation, evaluation of historical testimony, or theory choice
in science. Lorraine Daston's (1988) remarkable study of probability in
the Enlightenment shows that mathematical probability was esteemed
a scientific surrogate for good sense in a world where this was too often
lacking. In place of the wise intuition of the enlightened few, it offered
calculation of degrees of belief.
It would substitute public
demonstration for ineffable wisdom.
The classical probabilists made much of their epistemological
modesty. Like Newton, they renounced fanciful hypotheses (see
Molland's paper in this volume). They were willing, at least
provisionally, to do without knowledge of deep causes, and to proceed
instead by counting instances. They would learn about the causes of sex
at birth through careful registration, and then by comparing empirical
ratios from London and Paris, or city and country. They would use
mortality records to decide on the advisability of inoculation for
smallpox, while recognizing that the conclusion might not be applicable
to any particular individual. And even for large numbers, they conceded
the inevitability of error, just as there was always error in astronomical
measurements. As Laplace ([1814] 1951) put it, famously, a perfect
intelligence would have no need of probability, but for mortal man it is
indispensable. In a world of uncertainty, it seemed an immense
advantage to be able to quantify partial belief.
But if probability involved some compromise of the standards
of rational demonstration that had traditionally been associated with
philosophy, we must not associate this with diminished scientific
ambitions. Laplace did protest his humility too much. He accepted a
loosened standard of rigor for scientific belief, but only in order to
expand the domain of science beyond all limits. Probability could not
eliminate uncertainty from the world, but it promised a rational way of
managing it. The dream of mechanized decision criteria based on the
maximization of probability values or mathematical expectations had
already been dreamed in the eighteenth century. Ian Hacking (1975) has
argued that probability emerged from the low sciences, such as alchemy
and medicine. Here, knowledge was based on similitude rather than
demonstration. Probability, he suggests, became possible upon the
breakdown of this Renaissance world of signs and likenesses, and their
replacement by the idea of evidence. The notion of evidence figures
prominently also in Daston's account of early probability theory: she
argues that probability arose mainly from legal and commercial
reasoning. She points out that the earliest quantitative study of games
of chance was conceived in terms of the fair contract, and expressed in
terms of expectations. The shift to a style of reasoning whose elemental
concept was that continuous magnitude between zero and one, the
chance of an event, reflected the influence of a different branch of legal
reasoning. Jurists had long made use of a theory of partial proofs,
which were added together to make a full proof that would justify
conviction (in both senses of the term). G. W. Leibniz and Jakob
Bernoulli, both trained in the law, developed a mathematical grounding
for legal probabilities and sought to use this as a model for apportioning
belief whenever certainty was not attainable.
Probability thus came to provide a model of enlightened good
sense. Indeed it was more than a model. Probability theory was nothing
else than the mathematics of good sense. Like mechanics, probability
was part of "mixed mathematics," that most rationalistic branch of
natural knowledge in the eighteenth-century classification of the
sciences. Again like mechanics, probability was not merely required to
produce agreement between premises and conclusions, but to make
accurate predictions about the world. The relation of the calculus of
probabilities to rational belief was much like that of optics to light. If
it proved to condone judgments or actions that manifestly lacked good
sense, then evidently there was something wrong with the theory, which
required correction.
Daston's best example of the subordination of mathematics to
enlightened good sense is the St. Petersburg paradox, first stated by
Jakob Bernoulli's nephew Nicholas (printed in Montmort, 1713). This
arises from a game in which Peter will toss a coin until it turns up heads.
If this occurs on the first try, Paul owes him $1; if on the second, $2; if
on the third, $4; and if not until the nth toss, Paul must pay him $2^(n-1).
The question is: what is a fair price for Peter to pay Paul for the rights
to the proceeds of this game?
Mathematicians quite naturally looked to probability for an
answer. More than that, they required probability to give a good
answer. It was after all the logic of good sense. Failure to do so would
virtually amount to a falsification of the theory. And this is why the St.
Petersburg problem was called a paradox. For probability did give a
simple answer--one that, though unproblematical mathematically,
manifestly contradicted good sense. The fair price in a gambling
problem was defined in probability theory as the mathematical
expectation of gain. Here the expectation was (1/2) + 2(1/4) + 4(1/8)
+ ... + 2^(n-1)(1/2^n) + ... = 1/2 + 1/2 + 1/2 + ..., which, since the game
can continue indefinitely, is infinite. But, as Bernoulli remarked, no
sensible person would pay more than a few dollars to play such a game.
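The paradox is easy to exhibit numerically. The sketch below is a modern illustration with arbitrary parameters, not part of the historical record; it simulates the game and compares the empirical average with the infinite expectation.

```python
import random

def st_petersburg_payoff():
    """One play: toss a fair coin until heads; if heads first appears
    on toss n, the payoff is 2**(n - 1) dollars."""
    n = 1
    while random.random() < 0.5:  # tails: keep tossing
        n += 1
    return 2 ** (n - 1)

random.seed(1)
plays = 100_000
average = sum(st_petersburg_payoff() for _ in range(plays)) / plays
print(f"average payoff over {plays} plays: ${average:.2f}")
# The mathematical expectation is infinite, yet the empirical average over
# even very many plays stays modest (on the order of ten dollars here),
# echoing Bernoulli's remark that no sensible person would pay much to enter.
```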
Hence it was necessary either to give up the calculus of
probabilities, or to tinker with it. Daniel Bernoulli ([1738] 1954), a
cousin of Nicholas, chose the latter option. In a widely influential paper
that is now best remembered as the first intimation of the theory of
diminishing marginal utility, he defined a "moral expectation" which
increased not linearly with wealth but only in proportion to its
logarithm. After all, one dollar means much less to a person who
already has millions than to one on the verge of starvation. Hence very
small probabilities of proportionately large gains are worth almost
nothing, so that the sequence declines sharply and the total moral
expectation of the game is a modest number. Jean d'Alembert (see
Daston, 1979) took a more radical course, proposing to solve the St.
Petersburg problem with a change that radically undermined the
conventional theory of probabilities. A very long run of tails, he argued,
is mathematically possible but physically impossible. The probability of
a new throw of heads must go down as the number of consecutive
identical tosses goes up. Good sense supports this, he held, for if tails
were tossed even 100 times in a row, nobody would believe the coin was
fair. D'Alembert's solution, in contrast to Bernoulli's, was condemned
unanimously by probability writers, and it caused something of a scandal
that a mathematician so brilliant could be so stubbornly wrongheaded
about probability. But we can at least see that he, like Bernoulli, was
troubled by the chasm between the mathematics of probability and good
sense.
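Bernoulli's resolution can be checked in stripped-down form. Working in logarithms base 2 and ignoring the player's prior fortune (Bernoulli himself used natural logarithms and made existing wealth explicit), the moral expectation of the game is

$$\sum_{n=1}^{\infty} \frac{1}{2^{n}} \log_2\!\left(2^{\,n-1}\right) \;=\; \sum_{n=1}^{\infty} \frac{n-1}{2^{n}} \;=\; 1,$$

whose certainty equivalent is $2^{1} = 2$ dollars: just the sort of modest number that good sense demanded.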

II.

THE SOCIAL ORIGINS OF STATISTICS

The development of statistical mathematics, too, cannot be
understood without taking into account the social and ethical aspirations
of the "statists" who studied it. Statistics, like probability, was originally
intended to provide a way of quantifying the most important aspects of
human existence. Both aimed to make reason the guide of private and
public action. There is, however, an important distinction to be made
here. The eighteenth-century science of probability was normative and
individualistic, concerned mainly with the rational apportionment of
belief. The nineteenth-century science of statistics aimed first of all to
describe the laws governing a social collective, albeit with the
expectation that they would have implications for social reform. These
aims, while not incompatible, reflected rather different sensibilities
about how order was to be imposed on political, social, and economic
life. Still, their shared goals and presuppositions would seem to
outweigh this divergence. Quantification in each case meant subduing
the irrational and inexplicable. It admitted a degree of uncertainty, or
at least a level of existence (the individual) for which prediction was not
possible. As compensation it offered a means of extraordinary range
and flexibility for discovering order where it had hitherto gone
unperceived and creating it where it had been lacking.
Although probabilists, and most notably M.-J.-A.-N. Condorcet
(Baker, 1975), tried to make their science the foundation of policy, it
had flourished principally in the rarified domain of mathematicians and
natural philosophers. Statistics, in contrast, was located mainly outside
the academies and universities. Emerging at first in Great Britain and
France early in the nineteenth century, it looked rather more like a
movement than a science. Official statisticians began collecting
numbers to promote mobilization for war in the Napoleonic period. In
the private sphere, "statists" were generally moved by poverty, disease,
crime, and urban unrest. They campaigned vigorously for public
education and for public health measures. They claimed the objectivity
of science, which they identified with numbers, but did so mainly in
order to invoke its authority against their opponents.
The issue here is not whether ideological commitments stood
behind the statistical movement. That much is unquestionable. What
may appear more doubtful is whether that movement contributed
markedly to the development of statistical method and statistical
mathematics. After all, the statists were for the most part thoroughly
practical men--men of business, the law, and medicine--and were often
untrained in science. Very few had any knowledge of probability, or any
formal conception of how to deal with uncertainty and error. And yet
we can say with only slight exaggeration that these reformers and
publicists introduced to the world the most basic concepts of statistical
thinking. As worshipers of numbers, they took for granted that
something important could be learned about the causes of crime, births
out of wedlock, and mortality from laboriously-collected statistics.
Since it was the "mass" of humanity with which they were concerned,
they were untroubled by the impossibility of making predictions about
individuals. Rather than seeking to explain this or that suicide,
statisticians wondered why suicide in Prussia was up this year, or why
the mean height of conscripts was higher in northern than southern
France. They assumed, that is, the regularity of large numbers. They
even believed that a science of society could be based on them.
There was, however, a critical intellectual shift that had
provided a rationale for investigations of this sort. The statistical
research of the nineteenth century was predicated on the new
conception of "society" that was developed in the wake of the French and
Industrial Revolutions. The threat of social turmoil had called the
statistical movements into existence. The regularities they discovered
offered the statisticians hope that society was on track, that disorder was
exceptional, or that undesirable activities such as crime somehow fit into
the larger scheme of things. But the decisive importance of the concept
of society, for our purposes, is that it provided statistics with a proper
object, a level of existence above the individual. Society was an entity
that statistical regularities could be properties of. Crime rates and
suicide frequencies could scarcely be understood as normal
characteristics of every man or woman. On the contrary, it was widely
agreed that these were dependent on the state of society, and could be
reduced, or increased, if the social order was reformed.
This view of statistical regularities, of crucial importance for the
subsequent development of statistical thinking, was worked out around
1830. It must be understood that certain regularities of large numbers
were well known in the eighteenth century. Jakob Bernoulli ([1713]
1968) had proven an important theorem about the convergence of
frequencies of outcomes, say of coin tosses, to the underlying
probabilities. The uniformity from year to year of the ratio of male to
female births, of marital fertility and age of death, was well known to
demographic quantifiers such as the Prussian pastor Johann Peter
Süssmilch (1765). But these reflected the natural order of things, even
the biological character of mankind. They did not lead to a general
expectation of regularity in all social numbers, as we can see from the
shock with which the first statistical tabulations of crime were greeted,
even by men whom we might have expected to know better. A stable
crime rate could be no natural consequence of free individual activity.
Instead, it seemed to imply an alarming fatalism, some mysterious force
that drove people to murder or suicide, even against their will. Such, for
instance, was the reaction of the Belgian astronomer turned social
statistician Adolphe Quetelet (1829), who more than anyone else was
responsible for formulating and popularizing the idea that immoral acts
were subject to statistical laws.
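In modern statement (the notation is ours, not Bernoulli's), the theorem says that if S_n counts successes in n independent trials, each with probability p, then for every ε > 0

$$\Pr\!\left(\left|\frac{S_n}{n} - p\right| > \varepsilon\right) \longrightarrow 0 \quad \text{as } n \to \infty.$$

It was this convergence of frequencies that the statists generalized, well beyond anything Bernoulli had proven, into an expectation of regularity in all social numbers.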
Quetelet's distress, however, was only momentary. For a
troubled liberal, deeply fearful of revolution (especially after 1830), the
paradoxical discovery of laws of crime was cause for celebration. And
celebrate Quetelet did, in a stream of memoirs and books intended for
scientists, functionaries, rulers, and the general public (e.g. Quetelet,
1835). He drew two general conclusions from his figures. First was the
seemingly conservative, but in fact merely reassuring, one that society
was on track. Human errors could not permanently derail society from
the course of progress. His second, and most controversial, conclusion
involved moral accountability. Society was more powerful than
individuals, and hence also more responsible. The repetition of the
same crimes with the same frequencies every year proved that society,
not the individual criminal, was the real malefactor. It was the task of
the legislator to put into law those reforms whose advisability was shown
by statistical study, and thus to reduce this terrible, needless slaughter.
One or the other of these two morals, both enunciated by Quetelet in
the early 1830s, was drawn virtually every time the "laws of statistics"
were discussed during the remainder of the century. Which one
depended on ideological temperament. Quetelet's English admirer, the
militantly liberal historian Henry Thomas Buckle (1857), inferred from
statistics that the laws of history were supreme, and that neither a
meddling state nor an obscurantist church could block the inevitable
progress of society. German "socialists of the chair," eschewing the
"atomistic" determinism of "social physics" and Manchestertum,
preferred the opposite conclusion, that a benevolent state was both
capable and morally obligated to institute the reforms needed to
improve the social condition (e.g. Knapp, 1871; 1872).
We thus find sharp ideological conflict concerning the most
basic of statistical assumptions, that conclusions can be drawn from
large numbers when the constituent acts (such as suicide) are too
unpredictable or too numerous to be studied individually. The same can
be said of the first application of the normal curve to statistical
variation, a step of scarcely less importance. That, too, was Quetelet's
work, though here again we have less an act of pure originality than a
creative conflation of ideas. The normal curve had been derived as the
limit of the binomial distribution by Abraham De Moivre early in the
eighteenth century. Laplace put it to use to estimate, for example, the
error in population estimates made by multiplying the total of births
registered in a year by the ratio of births to population in a certain
district chosen to be typical of all France. In 1809, Gauss showed how
the curve could be used to measure another kind of error, when an
astronomical quantity (such as a relative stellar position) was estimated
from repeated measurements. It is for this that the curve is commonly
called the Gaussian, though it was known almost universally as the error
curve until late in the nineteenth century. It was as a law of error that
the normal curve became a standard tool of astronomers, geodesists,
and surveyors (Stigler, 1986).
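De Moivre's limit result is easy to verify numerically. The sketch below is a modern illustration with arbitrary parameters; it compares exact binomial probabilities with the corresponding values of the error curve.

```python
import math

def binomial_pmf(k, n, p):
    """Probability of exactly k heads in n tosses of a coin with bias p."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def error_curve(x, mu, sigma):
    """The astronomers' error law: the normal density."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

n, p = 100, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
for k in (40, 45, 50, 55, 60):
    print(k, f"binomial={binomial_pmf(k, n, p):.5f}",
          f"normal={error_curve(k, mu, sigma):.5f}")
# The two columns agree to about three decimal places already at n = 100,
# and the agreement improves as n grows: De Moivre's limit theorem at work.
```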
Quetelet (1846), whose great mission in statistics was to show
that "social physics" used the same methods and exhibited the same
principles as celestial mechanics, extended the range of the error law to
the physical and moral variation of humans in society. The crucial result
of this was to vindicate, and indeed exalt, the concept of l'homme
moyen, the average man. If, as he inferred, all real individuals are
accidental deviates from an archetype, then the laws of the
corresponding society can be learned from the study of mean values
alone, with due regard also to the average error. This was very
convenient for social physics, which aimed to plot a trajectory for
society through time by extrapolating from a series of measurements of
the average man. It also had gratifying implications for the future
prospects of society. If a society is adequately represented by mean
values, and if these mean values exhibit impressive stability from year
to year, then the threat of discontinuous change can be defined away.
Deviation is mere error. What remains is the true mean, the point of
virtue that is always to be found between vicious extremes. Thus was
Aristotelian moral philosophy vindicated by statistical mathematics.
These may seem rather strained analogies. But they are not
unusual. A fuller discussion (Porter, 1986) would show how Quetelet
built his social physics out of an elaborate metaphor connecting the
physical, the social, and the ethical. His own conception of social
progress appeared in this scheme as deriving both from physical and
moral laws. He also generalized freely, usually on the basis of analogy.
His evidence that human variation was normally distributed consisted
of what in retrospect seems an indifferent fit between the astronomer's
error law and some physical measurements, mainly from records of
military conscription. From this he concluded that all human traits,
including moral and psychological ones, are distributed in the same
fashion as errors of observation in astronomy. For example, every
individual has a quantitative "penchant for crime." The values of these
individual penchants vary randomly from the mean penchant for crime
(the total of crimes committed annually divided by the population).
This idea seemed to many statisticians to carry the denial of individual
responsibility too far. But such reasoning was entirely typical of
Quetelet's intellectual style.
Among the pioneers of mathematical statistics, Quetelet's
analogical imagination was slightly unusual for its lack of discipline, but
not at all exceptional in its fertility. Analogies were crucial for the
application of statistical concepts and methods to new subject areas.
After Quetelet, social science itself became a crucial source of such
analogies, analogies that were applied back to physics and biology.
Quetelet's ideas formed the bridge between the social science of
statistics and the modern branch of applied mathematics that has taken
its name. Here, then, the hierarchy of the sciences that we have come
to regard as customary fell apart, or rather was folded back on itself.
What is more, these social analogies sustained the bond between the
methods of statistics and the social perceptions through which they had
originated, even after the cutting edge of statistical innovation had
shifted back to the sciences of nature.

III.

THE STATISTICS OF MOLECULES

Statistical physics is among the most successful fields to borrow
methods from the social science of statistics. In the great pioneering
work of statistical gas theory by James Clerk Maxwell ([1860] 1890), we
already find strong hints of his debt to the statistical tradition. The first
three propositions in this paper show that a system of colliding particles
(a gas) must quickly lose all trace of its initial configuration of positions
and velocities. The result is complete disorder. But, argued Maxwell in
a leap reminiscent of the social statistician, the disorder of innumerable
objects will have an order of its own. He proceeded to determine that
the gas particles must in fact come to equilibrium at the distribution of
velocities given by the astronomer's error law. Strikingly, his derivation
of that curve was almost precisely the one introduced by the astronomer
and polymath John Herschel (1850) in a review of one of Quetelet's
books. Herschel considered his derivation appropriate because of its
simplicity and generality, suitable for what Quetelet had shown to be the
extraordinarily widespread applicability of the error curve. Thus did
Quetelet's faith in universal social order contribute to the recognition
of a kind of molecular order. To the untrained eye, the motions and
collisions of many billions of molecules would seem to be utterly beyond
the grasp of science. Even three (gravitating) bodies introduced
mathematical complexities so great that no formal solution was possible.
Only one versed in statistics could take for granted that millions of
colliding particles would yield order at a higher level. When Maxwell
was writing, even the existence of molecules was still completely
hypothetical.
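Herschel's derivation, in compressed modern paraphrase: assume the two components of an error (or, in Maxwell's reading, of a molecular velocity) are independently distributed, and that their joint distribution depends only on the distance from the origin. Then

$$f(x)\,f(y) \;=\; \phi\!\left(x^{2} + y^{2}\right),$$

a functional equation whose only well-behaved solutions have the form $f(x) = C\,e^{-x^{2}/2\sigma^{2}}$. The error law falls out of independence and symmetry alone, without any appeal to the mechanics of collisions, which is precisely what made the argument portable from Quetelet's conscripts to Maxwell's molecules.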
Maxwell was, to be sure, no passive recipient of the technical
virtuosity of social science. Already in 1860 he was able to use elegant
combinatorial mathematics to deduce properties of gases that had no
analogues in society. But he took an active interest in many of the same
problems that engaged the statisticians. The publication of Buckle's
History of Civilization in England (1857) had set off a fierce
controversy about the compatibility of statistical regularity with free
will. Quetelet (1847), ambiguously, and Buckle, quite forthrightly, had
been inclined to assume that there was no room for the capriciousness
of human freedom where mathematical order had been shown to prevail.
The availability of mathematical rules governing chance and
uncertainty, however, could also lead in the opposite direction. Late in
the nineteenth century, when the statistical method had become well
established, a few scientists and philosophers began arguing that it was
unnecessary to endorse determinism even as a working assumption.
Now they had research methods that were no less capable of dealing
with a universe of chance. From the 1860s, social thinkers in much of
Europe and North America developed this view in a vigorous discussion.
Holistically-inclined German historical economists defended the idea of
a nondeterministic social science most strenuously. Precisely because
statisticians take averages over large numbers of individuals, they held,
the most impressive regularities are still quite compatible with the
existence of radical, unexplained human diversity--if not pure chance,
with which they were rather less enthralled (Porter, 1987).
In several essays from the early 1870s, Maxwell (e.g. 1873)
repeated this argument in a gentle criticism of Buckle, and proceeded
to draw out its implications for physics. Since the kinetic gas theory is
obliged to use the statistical method if it is to make any sense of this
chaos of colliding particles, it must fall short of a perfect, dynamical
explanation. Thus physics can no more exclude the operation of an
element of chance, or of some unknown cause, at the molecular level,
than social science can deduce from statistics who will commit suicide.
Against Buckle, Maxwell insisted that statistics is bound by its nature to
uncertainty, and can in no way provide evidence against free will.
Others maintained the opposite view. In physics for example,
Ludwig Boltzmann held stubbornly to his early faith in atomistic
reductionism and the complete determination of physical processes by
natural laws. Boltzmann, Maxwell's most brilliant successor in the
kinetic gas theory, was an admirer of Buckle. In one of his landmark
technical papers on the kinetic gas theory he offered the unexpected
argument that the adoption of a statistical method in physics no more
implies uncertainty in its results than does its employment by social
scientists, who have shown the most remarkable regularities to prevail
from year to year (Boltzmann, [1877] 1968). Statistics, then, was a
highly flexible source of philosophical arguments. If some statisticians
and physicists chose to interpret their results as reflecting variability and
uncertainty, we must look at least partly to external factors to
understand why. For German statisticians and historical economists like
G. F. Knapp and Wilhelm Lexis, these seem to reflect mainly their
political and social views, while Maxwell was moved more by religious
and philosophical considerations. Still, we can hardly imagine these
debates assuming the shape they did in the absence of flourishing
statistical approaches to physics and social science.

IV.

SOCIAL BIOLOGY AND MATHEMATICAL STATISTICS

The social analogy also contributed crucially to the shape of
statistical biology, or biometry. Just as Maxwell had learned of the
application of the normal law to real variation from an interpreter of
Quetelet, Francis Galton ([1909] 1911) reported in his Memories that
another of the Belgian statistician's advocates, the geographer William
Spottiswoode, first explained to him the properties of this remarkable
curve. Galton's initial work as a statistical biologist involved social, or
at least sociobiological, studies. His first use of higher mathematics was
to apply the normal distribution to the physical and mental variation of
humans, very much in the Quetelet tradition.
Galton's real originality in statistical methods began with an
explicit social analogy. This was the theory of Pangenesis, introduced
by his cousin Charles Darwin in 1868 and elaborated in an appendix to
Galton's Hereditary Genius in 1869. Pangenesis explained heredity
as the result of innumerable genetic elements, called gemmules,
combining into an embryo. This process was, Galton reported, precisely
comparable to the formation of a new town out of a great many
individuals migrating from the vicinity. All the most important
phenomena of heredity--atavism, sudden changes of racial type, "sports"
of nature--could be understood in terms of an analogy of migrations and
elections. He emphasized, in the liberal spirit that prevailed even in
eugenics during its earliest years, that these were free individuals, so
that no centralizing power was required. After all, it had been clearly
shown by statistics that a stable ordering of social behavior could be
maintained simply through the free choices of individuals.
Galton's analogy was strongly suggestive of a method of analysis.
The science that dealt with these demographic phenomena was statistics;
the same mathematics ought to be applicable to the biological problem.
Galton knew exactly how to set it up from the analogy with sampling a
population. If a child was formed out of the gemmules passed down
from his parents, any given trait, such as his height, ought to equal the
mean of his parents (corrected for sexual differences), plus an error
term governed by the normal curve.
A few years later (Galton, 1877), when he actually performed
the experiment on an organism for which sufficient experimental control
was possible to give clean results, he found that this was not quite right.
Indeed, he decided, it could not be right, at least as a general rule, for
the addition of a new error term in every generation would cause the
distribution to get wider and wider. In fact, Galton's attempt to apply
the methods of social statistics to a biological problem led to a much
more interesting conclusion than replication of old results. He got
instead a linear regression. On average, the offspring were closer to the
mean than their parents. Why, then, did the population not simply
converge over time to the mean? Having asked this question, Galton
was led to divide the variation in the offspring into a portion explained
by the parental variation, plus a residual, unexplained portion that
would appear even if both parents were at the mean. Here, for the first
time, was a statistical method of correlating variables. For years,
Galton always interpreted these results strictly in terms of biological
categories: "reversion" to ancestors more remote than the parents, or
later "regression" towards the mean of the race. Finally in 1888
(published as 1888-89), when the same mathematics arose in a quite
different context, he suddenly realized that this was an abstract
mathematical tool, applicable to interdependent variables from almost
every field of knowledge. He gave it the name "correlation," and
distinguished it from "regression" only by its much greater generality
(Galton, 1890).
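Galton's scheme is easily simulated. In the sketch below every number is an illustrative assumption, not Galton's data; the point is to display the two features together: offspring regress toward the mean, while a fresh residual keeps the population variance stable across generations.

```python
import random

random.seed(2)
N = 20_000
mean_h, sd = 68.0, 2.5  # illustrative height scale in inches
r = 0.5                 # assumed fraction of parental deviation transmitted

midparents = [random.gauss(mean_h, sd) for _ in range(N)]
# Each offspring inherits the fraction r of the midparent's deviation from
# the mean (regression), plus a fresh residual. Setting the residual spread
# to sd * sqrt(1 - r**2) keeps the variance constant across generations;
# under a naive "midparent plus full error" scheme, a new error term would
# widen the distribution every generation.
resid_sd = sd * (1 - r ** 2) ** 0.5
offspring = [mean_h + r * (m - mean_h) + random.gauss(0, resid_sd)
             for m in midparents]

def std(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

print("midparent sd:", round(std(midparents), 2))
print("offspring sd:", round(std(offspring), 2))  # about the same
# With equal variances, the regression slope of offspring on midparent
# equals r -- the quantity Galton recognized in 1888 as a perfectly general
# measure of interdependence.
```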
Thus Galton, like Maxwell, set out from a social analogy but was
soon led by imperfections of the analogy to novel results of
extraordinary importance. How he, a mediocre mathematician, could
have been practically the founder of modern mathematical statistics, is
an interesting problem. Victor Hilts (1973) suggested what has become
the usual interpretation of Galton's statistical originality. In contrast to
the social statisticians, Hilts argues, Galton was much less interested in
mean values than in variation and its causes. The souls of statisticians,
Galton (1889, p. 62) once wrote, "seem as dull to the charm of variety as
that of the native of one of our flat English counties, whose retrospect
of Switzerland was that, if its mountains could be thrown into its lakes,
two nuisances would be got rid of at once." This emphasis on natural
diversity, in turn, is usually connected with, and sometimes attributed to,
Galton's commitment to eugenics, the improvement of the human race
through artificial selection. Eugenics provided, after all, the main
incentive for Galton to take up statistical biology. If progress was to
come through artificial selection rather than, as with Quetelet, the
universal diffusion of knowledge, exceptional individuals must be of
much greater interest than average ones.
An appreciation of the role of eugenics is crucial to any
satisfactory account of Galton's statistical creativity. Yet it goes too far
to argue, as for example Ruth Cowan (1972) has, that the influence of
eugenics is almost a sufficient explanation of the genesis of the
correlation concept. To believe that important scientific concepts were
ripe for the picking by anyone armed with the appropriate social
ideology is even less justifiable than the implication of some older
histories that the great discoveries required no more than a willingness
to value empirical facts over bookish preconceptions. Correlation
involved some subtle difficulties, and even where the will to find such
a tool was present, the way could prove elusive. Moreover, Galton was
not so blinded by his ideological commitments, nor so indifferent to
purely intellectual endeavors, as he appears in Cowan's portrayal. By
the mid 1880s he had become convinced that regression to the mean
reflected the inherent stability of certain types, and hence that real
eugenic progress would have to come through discontinuous variation,
of a sort that could not be grasped by his statistical formalism. This is
why Galton's work provided as much encouragement to the
anti-statistical Mendelians as to Karl Pearson and the biometricians
when these two parties had their definitive falling out at the turn of the
century (Provine, 1971). But Galton did not give up his statistical study
of continuous variation on this account. It presented problems that
greatly intrigued him, such as the asymmetry between parental
regressions on offspring and offspring regressions on the parents, which
he was so pleased to have figured out in about 1885.
There is another compelling reason not to assign exclusive
explanatory value to eugenic ideology in accounting for Galton's
statistical ideas. Eugenic concerns were not the only ones that could
lead away from mean values towards diversity as the principal focus of
statistical interest. In late nineteenth-century Germany, the
paternalistic socialism that prevailed among statisticians and
historical-school economists was seen as sharply at odds with Quetelet's
social physics and its emphasis on the average man. German social
thinkers stressed the mutual responsibility implied by a highly
differentiated organicist state, and a society of average men would be
missing precisely what was needed to make a moral community.
Mathematically-inclined statisticians developed tools to comprehend a
highly differentiated society. For example, Lexis (1877) worked out a
much-noted formula for assessing the stability of statistical series. He
concluded that such series were not so surprisingly stable, and that their
fluctuations reflected the interdependence and diversity of human
beings. By the late nineteenth century, statisticians in Germany and
elsewhere routinely pursued the search for causes by asking about the
relations between education and crime, or religion and suicide, or
wealth and duration of life. For this the method of correlation might
have been highly welcome. In England and America Galton's
correlation concept was applied almost immediately to existing
problems in social statistics, and also to meteorology, anthropology,
economics, and education. In none of these fields, however, was it
discovered independently. I have suggested here, and argued elsewhere
(Porter, 1986, chapter 9), that Galton's path to correlation depended on
some distinctive features of the problem of hereditary transmission. He
began with a social analogy, but was quickly led beyond it. This reflects
certain peculiarities of his subject matter, and while Galton came to it
mainly because of eugenic concerns, we cannot reasonably speak of
correlation as merely the product of eugenic ideology.
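Lexis's dispersion test, mentioned above, can be sketched in modern terms; the formulation and the numbers below are illustrative reconstructions, not his own notation. A series is "supernormally" dispersed, betraying the heterogeneity the Germans emphasized, when the ratio Q of its observed variance to the binomial benchmark exceeds one.

```python
import random

random.seed(3)

def lexis_ratio(counts, n):
    """Ratio of the observed variance of a series of yearly rates to the
    variance a pure binomial (coin-tossing) model would produce."""
    rates = [c / n for c in counts]
    p = sum(rates) / len(rates)
    observed = sum((r - p) ** 2 for r in rates) / (len(rates) - 1)
    expected = p * (1 - p) / n
    return observed / expected

def yearly_count(n, p):
    return sum(random.random() < p for _ in range(n))

n, years = 2000, 25
stable = [yearly_count(n, 0.05) for _ in range(years)]
drifting = [yearly_count(n, 0.05 + random.uniform(-0.02, 0.02))
            for _ in range(years)]
print("stable series:   Q =", round(lexis_ratio(stable, n), 2))    # near 1
print("drifting series: Q =", round(lexis_ratio(drifting, n), 2))  # well above 1
```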

V.

STATISTICS AS THE METHODOLOGY OF SCIENCE

After Galton, mathematical statistics became more like a
discipline, rapidly generating a body of methods and a mathematical
structure that had to be mastered by all serious practitioners. In these
early years, we seem to have almost a holy line of statistical founders.
Galton begat Karl Pearson; Pearson begat G. Udny Yule, W. S. Gosset
("Student"), and R. A. Fisher. Fisher fell out of favor, but then
accomplished a heroic patricide. Pearson also begat Egon S. Pearson
(literally) and the Polish mathematician Jerzy Neyman, whose criticism
of Fisher initiated a new round of internecine strife. If in fact these are
the blessed forefathers, we must concede a certain weakness of family
loyalty. Indeed, few fields can match statistics for the duration and
fierceness of its controversies. The differences that separated Fisher
from Pearson, then from Neyman and E. S. Pearson, and (more
recently) both parties from the so-called Bayesians, were deep ones.
Statistics was and remains a long way from consensus on fundamentals.
The one point on which Galton, Pearson, and Fisher seem to be
united is a preoccupation with biometry--inspired to a considerable
extent by eugenic concerns. But the biometric line is far from pure, and
its dominance was never complete. Stephen Stigler (1986) shows how
Pearson drew upon the economist Francis Edgeworth and then, with
Yule, mined the astronomical tradition of error theory for its
mathematics. Student developed his t-test for quality control in
brewing. The various editions of Arthur Bowley's (1901) statistical
textbook for social researchers may be contrasted with the textbook
tradition started by Yule (1911), or with Fisher's (1925; 1935) classic
books on statistics for research workers and on experimental design.
Bowley relied on Edgeworth (e.g. 1885), and on the German tradition
epitomized by Lexis and Ladislaus von Bortkiewicz (e.g. 1901). Social
statistics was by this time rapidly becoming more mathematical, but
decades passed before it incorporated the tools developed within the
(broadly) biometric tradition.
Of all the founders Fisher seems closest to biology. His
immense statistical prowess was instrumental in reconciling biometry
with Mendelian genetics in the evolutionary synthesis (Fisher, 1930).
His methodology of experimental design, however, was associated more
closely with practical agriculture than with evolutionary biology. What
he found in the agricultural literature could not have come from the
Pearsonian corpus, for Pearson's statistics, like his philosophy,
emphasized description and correlation rather than experiment and
causation (Gigerenzer et al., 1989; Box, 1978). Fisher's concept of
degrees of freedom came from physics.
"Statistics," in fact, continues to encompass an extremely diverse
subject matter. But if the field remains a fractious one, users and
consumers of statistics now generally believe in a single, reasonably
unified methodology for designing surveys or experiments and analyzing
the data. In the extreme case, statistics has been treated as an answer
to the dream of mechanized induction. As Gerd Gigerenzer, especially,
has argued (1987; Gigerenzer et al., 1989), textbook writers of the 1940s
deserve most of the credit for this apparent unification of statistical
methodology. They wrote mainly for the sciences of life, behavior, and
society, many of whose practitioners were eager to believe that statistics
provided an easy path to the grail of scientific respectability. The first
generation of statistics textbooks for the human sciences were
consciously written for researchers in fields where very little
mathematical sophistication was either demanded or available.
Accordingly, they provided a product that was as simple and
unambiguous as possible.
In place of the controversies and
inconsistencies that still prevailed, they created what has been called a
hybrid statistics. For sociologists, psychologists, and researchers in
education and medical therapeutics, this was simply Statistics. If its
logical foundations were insecure, its recipes were certified as tested in
the kitchens of mathematical specialists.
In some subfields,
experimental study with numerous repetitions yielding in the end a 0.05
significance level became practically the definition of sound research
(Morrison and Henkel, 1970). Fisher encouraged this attitude with
loose remarks, including one in The Design of Experiments (1935)
which seemed to reduce science to the rejection of null hypotheses.
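The recipe the textbooks canonized survives essentially unchanged in modern software. Below is a minimal sketch with invented data, assuming the NumPy and SciPy libraries (which postdate everything described here) are available.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=10.0, scale=2.0, size=30)    # invented measurements
treatment = rng.normal(loc=11.2, scale=2.0, size=30)

t_stat, p_value = stats.ttest_ind(treatment, control)
# The hybrid recipe: declare the result "significant" whenever p < 0.05,
# with no reference to the Fisher versus Neyman-Pearson disputes over what
# such a rejection actually licenses.
verdict = "significant" if p_value < 0.05 else "not significant"
print(f"t = {t_stat:.2f}, p = {p_value:.4f} -> {verdict}")
```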
The statistics textbooks were also welcomed in bureaucratic
circles. A unified statistical methodology would permit potentially
controversial decisions to be taken out of the political domain. They
could instead appear to be made objectively, that is, according to fixed
rules, preferably using somewhat arcane methods sanctioned by science.
For such purposes, mathematical statistics is second in importance only
to cost-benefit analysis, to which it also often contributes. This can
make public decision-making seem almost mechanical, as in the area of
drug testing, which in the United States is governed by a rigid
methodology of double-blind experiments (where possible) and analysis
for statistical significance. Statistics is also central to the burgeoning
new field of risk analysis, where it has achieved its greatest notoriety in
the study of nuclear power plants. Here, as in the most controversial
health matters, such as treatments for AIDS, the authority of
quantitative objectivity tends to break down. But these examples should
suffice to make it clear that the progress of statistics continues to be
shaped by, as well as to shape, social needs and wants.

VI.

THE QUESTION OF SOCIAL CONSTRUCTION

The prominent social role that has come to be played by
probability and statistics is readily apparent. In recent years, scholars
have gone a long way towards writing a history of probability and
statistics in these terms, challenging the older understanding assumed
by most statistician-historians who treat the subject mainly in terms of
the advance of technical methods. A few works on the history of
statistics, and particularly on the British eugenist-statisticians, have
been written as case studies of social construction. A broader literature,
upon which I have drawn heavily here, suggests that other aspects of the
history of statistics are no less conducive to a broadly social approach.
Most of this work, to be sure, is not argued along the somewhat rigid
lines associated with an uncompromising sociological interpretation.
But this very diversity makes the history of statistics an excellent subject
matter for investigating this historiographic movement. We need to ask
not just what the idea of social construction contributes to the history
of science, but also, and more crucially, what if any interpretations of
social construction can be reconciled with this history.
There is a sense in which it is trivially true to call science a
social construct. Science is, after all, made by human beings.
Somewhat less trivially, modern science is impossible without an
advanced economy and sophisticated forms of social organization. We
are concerned here with a somewhat different, and not at all trivial,
relationship between society and the content of science. Not all cultures
place much value on scientific knowledge in the form it has customarily
taken in elite western institutions during the last three centuries. There
has been almost continuous resistance to it even within Europe and
America. Charles Gillispie's (1960) classic Edge of Objectivity
illustrates how profound were the differences separating rival programs
for physics, chemistry, and biology, as late as the eighteenth and
nineteenth centuries, even in France, Germany, and Britain. It took A.
L. Lavoisier's pneumatic chemistry, he suggests, to discredit the mystical
chemistry of pneuma, and in biology the strong claims of teleological
and broadly romanticist approaches to nature remained influential
decades after Darwin's 1859 Origin of Species.
Quantification and statistics are for many the defining
characteristic of scientific knowledge. They also, however, have often
been seen as eliminating from a subject precisely what is most important
for science to understand. We now hear such criticism most often in
relation to the social sciences, but the biological and physical sciences
have by no means been exempt. As late as the 1830s, G. F. Ohm's
mathematical analysis of the electric circuit was denounced by the
Hegelian natural philosopher G. F. Pohl as unjustifiably austere and
abstract, rather like a traveler's description of a journey which gave
nothing but the traveler's stops and the velocity in between (Jungnickel
and McCormmach, 1986, p. 56). Both Auguste Comte and Claude
Bernard rejected the use of statistics by medicine because they saw
detailed explanation as essential, especially for a science whose object
was to cure the individual patient (Porter, 1986). Statistics, in fact, has
been affected by such considerations more than most fields, partly
because it involves "mere" probability, and partly because of its
imperialistic tendencies. A field so influential and so controversial can
hardly be understood in purely internalistic terms, as the result of
mathematical advances and factual discoveries.
To include sociological factors means first of all to examine
carefully the workings of the scientific community responsible for
advancing knowledge in the relevant discipline. The activity of
disciplinary communities as makers of knowledge is the subject of a
rapidly growing literature. Such studies aim to show that even the
microlevel of science, ostensibly its most unproblematical, unideological
aspect, can be illuminated by sociological analysis. What had once been
seen as the unproblematical domain of discovery is now portrayed as the
locus of continuous struggle and controversy (Latour, 1987).
Controversies, it is argued, expose real alternatives, and only in
retrospect does knowledge seem to be driven by experiment,
observation, and the logic of exact theory toward the "truths" that now
fill our textbooks.
Harry Collins (1985) argues that knowledge should be viewed as
emerging out of a process of negotiations within the scientific
community. Like Latour, he prefers not to talk about nature in itself,
arguing instead that scientific facts or laws only come into being when
the relevant community of specialists reaches agreement on them. They
are, however, far from denying the importance of observation and
experiment; on the contrary, theirs is the perspective most responsible
for reviving interest in experimentation among historians, philosophers,
and sociologists of science. Indeed, they may be less subversive than
Latour's memorable epigrams would seem to imply. His interpretation
of science as clever rhetoric requires that scientific instruments, and the
"inscriptions" they produce, be the cleverest rhetoricians of all, which is
to say he imports a kind of realism through the back door. Ian Hacking
(1983) argues this way explicitly. Since modern experiments often
produce phenomena which never occur outside of experiments, the
"construction" of scientific facts can have a concrete meaning. When,

The Quantification of Uncertainty After 1700

65

however, experimental control permits certain results to be obtained so


reliably that they can be incorporated unproblematically into other
experiments, then one is justified in speaking of the reality of those
phenomena.
Collins (1985) disagrees. He holds that true replication is much
rarer and more difficult than most discussions of the methods of science
imply. One of his case studies, from physics, concerns the TEA laser,
which is not particularly remarkable as high technology, but which
nonetheless could never be duplicated, despite several attempts, except
by scientists who had actually worked in a laboratory where it was
already functioning. If this is typical, as a few other studies suggest and
as much sociological literature now assumes, it tends to exalt tacit
knowledge at the expense of the "public knowledge" of theory and even
of scientific description. It also makes generalization from experimental
results rather problematical, and casts some doubt on the extent to
which scientific results deserve the authority which in the modern world
is customarily attached to them (Barnes and Edge, 1982).
This line of interpretation offers definite promise as an
approach to the history of statistics. Statisticians of the present century
have been noteworthy for their passionate and enduring controversies,
perhaps even to the extent of providing a counterexample to the usual
assumption that an eventual establishment of consensus is the rule in
science. For that reason, the patched-together version of statistics that
prevails in the textbooks is more than usually vulnerable to
deconstructive criticism. If, however, there are disagreements about the
underlying logic, on the surface statistics appears as a method of
astonishing power and flexibility, a clear triumph of a general method
over recalcitrant particulars. Statisticians, after all, use similar
mathematical tools to analyze data from every branch of science, and
often from outside of science as well. In a certain sense, statistics is the
quintessential form of public knowledge, one that depends on
standardization of experimental or observational procedures and helps
to render results strictly comparable. In the same sense, the rise of
statistics has made knowledge more "objective"--that is, capable of being
interpreted or further analyzed without the benefit of an intimate
knowledge of the circumstances under which the results were obtained
(Swijtink, 1987). To say this, of course, is not to refute Collins.
Statistics only goes part way towards changing tacit into explicit
knowledge, and the reduction of all results to numbers is often used to
disguise the subjective aspects of an investigation. Still, the problem of
replication has more to do with the experimental or observational
methods than with the statistical analysis. Many statistical procedures
are so routine and mechanical that they can be done by computers.
Collins and Latour provide valuable insights, but their exclusive
concern with subdisciplinary communities is inadequate for a topic so
broad as the rise of probability and statistics over the last three
centuries. More useful for our purposes is the perspective made famous
by sociologists at Edinburgh under the label "sociological strong
program." This label, despite a recent move by its advocates in the
microsociological direction (Shapin, 1982), continues to be associated
with an interest in macrosociological determinants of scientific
knowledge. A number of studies have gone so far as to deny the
autonomy of the discipline or subdiscipline in favor of explanations
involving a different unit of analysis, usually social class. Donald
MacKenzie's (1981) recent study is the most noteworthy of these. He
argues that eugenics gained credence in Britain because it best met the
ideological needs of the professional middle classes. He seeks then to
show that the outcomes of certain key episodes in the history of
statistics--Galton's studies of correlation, the Biometrician-Mendelian
controversy, the debate between Pearson and Yule over contingency
analysis--were determined by eugenic ideology.
Advocates of the strong program do not stand alone in drawing
such connections. They have only been most forceful, and perhaps
dogmatic. Be that as it may, their research has greatly enriched our
perspective on science. It enables us to see that not just scientific
theories and techniques, but also the standards of truth or
methodological soundness by which particular contributions are judged,
have varied over time and place, even among the elite scientific
communities of the modern west. Moreover, it shows that the content
of science is often closely related to circumstances falling within the
domain of political, social, economic, intellectual, and religious history.
Edinburgh sociologists are correct to deny that the course of scientific
change is to be explained simply in terms of factual discoveries and the
working out of the logic of theories. The historically contingent intrudes
at every level in the making of science.
This line of interpretation is wonderfully helpful in illuminating
the history of statistics, which exhibits the permeability of science to
external influences more strikingly and more frequently than do most
fields. A considerable struggle was required for statistical reasoning to
become acceptable to the scientific community at all. The model of
statistical laws came from social science, where, ironically, it survived
partly on borrowed authority from the physical sciences. The
mathematical tools of probability and statistics reflected the social views
and related intellectual ambitions of their inventors: to reduce good
sense to calculus, or to mechanize scientific inference; to collapse
natural diversity into a system of mean values, to depict a highly
differentiated, organic society, or to probe the causes of the
perpetuation of variability from parents to offspring; to demonstrate
that even life, mind, and society are rigidly determined by natural laws,
or to show that there is room for chance, or free will, even in the laws
of physics; to deny uncertainty, to celebrate it, or to subject it to
mathematical order.
An approach to science that assumes the autonomy of disciplines
must be thoroughly inadequate in regard to the history of statistics.
Concepts arising as a part of natural or social science informed its
mathematical structure. An almost dizzying play of analogies promoted
its extension into new territories. The unification and integration of
statistical methods was promoted by the longstanding dream that
probability might one day be accepted as the logic of uncertainty, or of
social decisions, and thus cut the ground away from demagogues and
disputants. Nevertheless, rival alternatives have repeatedly been
formulated, giving rise to some of the most vitriolic debates in the
history of science.
Granting this, we might still wonder how far we should go with
the Edinburgh sociologists. Their claims seem to derive from an urge
to debunk science. Or rather, they subordinate science to sociology by
arguing that science is the appropriate form of knowledge for societies
like ours. The last phrase is hard to make precise. Does the existence
of other forms of knowledge as depicted by anthropology suffice to call
science a form of ideology? It is hard to see how history can provide an
adequate grounding for this, an epistemological claim (Gordon, 1989).
Less abstractly, the sociological interpretation of scientific change
seems almost to invite an objection much like Samuel Johnson's
refutation of Berkeley. Science works. Our high technology would be
inconceivable without it. How can it be a mere ideology?
This is by no means unanswerable. A simple response,
consistent with the social constructivist viewpoint, is that the criterion
of truth implicit here--technological fruitfulness, or perhaps prediction
and control--is itself socially conditioned. It rests on a special attitude
towards nature, one whose emergence in the modern west is a historical
problem of the greatest interest. But the objection cannot be dismissed
as completely misdirected. Radical social constructionists tend to be
satisfied with an explanation of scientific change only when they have
found an ideological factor, or perhaps a struggle for resources and
authority within the scientific community, to which decisive importance
can be assigned. This cannot be the only way to write about science.
We can note that science, as a social activity, necessarily has a rhetoric,
the more so as controversies arise frequently in the making of
knowledge. We can observe that respect for science is still far from
universal, and often more dependent on its relations to wealth and
power than on the explanations it gives. We might go so far as to call
the norms and ideals that regulate science cultural preferences. But
these are, at the least, longstanding and reasonably stable cultural
preferences. Some of the most crucial have been with us since the
seventeenth century, or even classical antiquity, and are presupposed in
any activity upon which we would confer the name "science." The body
of knowledge, techniques, standards, and organizational forms that
constitute the modern scientific enterprise has considerable coherence
and durability.
Science can reasonably be called a social construct, but it does
not require a separate act of social creation every time it moves. To
suggest, as Latour does, that the reasons for a particular scientific
position become sound only in retrospect is to imply that every scientific
controversy, every unresolved issue, could equally well be decided one
way as another. This view would have science perpetually inhabiting a
state of nature. It would cancel out everything called knowledge as soon
as it is created. The absurdity of doing so is implicitly recognized even
in the most radically sociological accounts of scientific change, including
Latour's. If scientific ideas and methods had no standing--if they were
powerless to affect the future development and uses of science--their
history could not be of the slightest importance. Science may be socially
constructed, but it is not reinvented every moment.

VII.

CONCLUSION: CONSTRUCTING STATISTICS

The recent move by social constructivists to put less emphasis
on the relations of science to its social and intellectual context reflects
a concern to avoid appearing crudely reductionistic. This, however,
misses the point. A social interpretation is crude when it refuses to
respect the content of science, seeing in it nothing but a pale reflection
of social relations. A more nuanced view of social construction is fully
compatible with attentiveness to the interactions between science and
the broader culture. Such an approach must certainly be balanced with
study of the micro-processes that characterize disciplinary (and
subdisciplinary) communities in the big science of our day. Even that
science is not wholly insulated, however, and it rests on the shoulders of
a scientific enterprise that often was closely bound up with
philosophical, religious, and political concerns.
Few subjects could illustrate this point better than the history of
statistics. It was long on the fringes of science, both because of its close
ties to efforts to expand science into the human domain and because its
methods were hard to reconcile with traditional ideals of scientific
demonstration. Again and again, ideas and ideologies from outside the
sciences have exerted a formative influence on it. The process by which
its fundamental concepts originated and gained acceptance is of the
highest interest, for its influence on the social and natural sciences has
been extraordinary. Its acceptability was not a foregone conclusion; as
late as 1900 it was possible to reject statistical models in physics on the
ground that they added nothing to our knowledge of the macroscopic
laws of thermodynamics but uncertainty.
The history of statistical method, I have argued here, is
inseparable from the history of politics and society. Even so, it offers
scant support for those who see nothing but clever rhetoric and power
struggles in the making of science. If statistical mathematics had almost
no disciplinary autonomy before 1900, the fields with which it
interacted, including even social science, did. Quetelet, Maxwell, and
Galton had to meet certain minimal expectations of the scientific
audiences to which they addressed their work. These expectations need
not lead inexorably to some unique formulation of a problem, much less
to a single set of answers. (Consider, for example, how different is the
physicist's from the chemist's atom.) But they do set limits, strenuous
ones, on acceptable scientific work.
The standards of acceptable science are very hard to state
formally. They are not identical in every discipline, and there have been
sharply variant national traditions. They evolve over time. Modern
statistics, for example, has changed them considerably; it had to, for by
the standards of science prevailing up to about 1800, statistical
explanation was no explanation at all. But since the seventeenth
century, at least, successful science has involved convincing other
scientists or natural philosophers that one has attained a certain mastery
over nature. This can mean agreement between a theory and existing
data, or the prediction of new phenomena, or demonstrated
experimental control. Such requirements are easy to belittle, for nature
does not present itself naked for observation and measurement. It is
now a truism that scientific description is necessarily theory-laden. Still,
to say this is not to imply that nature is utterly passive, or that every
theoretical framework (or experimental apparatus) stands up equally
well.
How else can we interpret the shock that we find expressed at
some of the real turning points in the early history of probability and
statistics? The classical probabilists were genuinely alarmed to find that
their reasonable calculus was subject to a paradox like the St. Petersburg
problem. Quetelet reported his astonishment upon learning that even
crime was subject to a considerable degree of statistical regularity.
Maxwell anticipated that his statistical gas theory would be written off
as interesting but misdirected speculation until experiment confirmed
what seemed the implausible prediction that gaseous friction should be
independent of density. Galton expressed wonder on several notable
occasions, among them his recognition while graphing some
anthropometric data that hereditary "regression" and biological
"correlation" (of parts) have the same mathematical solution (Galton,
1890). He concluded that he had reached a solution to the problem of
"co-relation," which was much more abstract than any biological theory.
The existence of scientific surprise does not imply that the facts
speak for themselves and must in the end triumph over mere theory.
Surprise, too, is an interpretation, and (for example) subsequent
statisticians have had great difficulty seeing "astonishing" regularity in
Quetelet's favorite results. But it does reflect a certain engagement with
one's materials. Our protagonists were perhaps more than normally
susceptible to the delight and consternation of surprise because they
occupied a field with strong centrifugal tendencies. Working, as they
did, outside and between disciplinary traditions, they were perpetually
confronting the materials of one discipline with methods and
conceptions borrowed from another. Far from the well-worn paths, the
fit between method or theory and object was imperfect and
unpredictable.
Perhaps, then, it is precisely because statistical mathematics had
so little autonomy that the confrontation between expectations and
subject matter was so often surprising. This is almost the same reason
that statistics was so readily accessible to ideas and influences from
outside the scientific community. It suggests that an intense struggle to
master a recalcitrant subject matter and an openness to a wide variety
of influences, including perhaps social ideologies, are far from
contradictory. It makes good sense that we should see such influences
in a new, ill-defined field, and especially in one whose strongly
expansionist tendencies brought it into contact with a variety of objects
and intellectual traditions. Both surprise and openness were encouraged
also by the curious status of statistics, an insecure social science aiming
to imitate the methods of physics, while actually developing new
conceptions that would eventually prove valuable in the study of nature.
If statistics is unusual, however, it is not unique. A balanced
account of any development in science must integrate technical,
microsociological, and broader contextual factors. To say that statistics,
or any area of science, is "socially constructed" is not to say very much.
The phrase does suggest, however, that history, philosophy, and
sociology of science should pay attention to the relations between
knowledge and the societies within which it is nurtured and applied.
The history of statistics, at least, cannot be adequately understood in
any other way.

REFERENCES

Baker, Keith (1975), Condorcet: From Natural Philosophy to
Social Mathematics, Chicago: University of Chicago Press.
Barnes, Barry and Edge, David, eds. (1982), Science in Context,
Cambridge: MIT Press.
Bernoulli, Daniel ([1738] 1954), "Exposition of a New Theory on the
Measurement of Risk," Econometrica, January, 22, pp. 23-36.
Bernoulli, Jakob ([1713] 1968), Ars Conjectandi, Brussels: Culture et
Civilisation.
Boltzmann, Ludwig ([1877] 1968), "Ueber die Beziehung zwischen dem
zweiten Hauptsatz der mechanischen Wärmetheorie und der
Wahrscheinlichkeitsrechnung," in Ludwig Boltzmann,
Wissenschaftliche Abhandlungen, Fritz Hasenöhrl, ed., 3
vols., reprinted New York: Chelsea, vol. 2, pp. 164-223.
Bortkiewicz, Ladislaus von (1901), "Anwendungen der
Wahrscheinlichkeitsrechnung auf Statistik," in Encyklopädie
der mathematischen Wissenschaften, Leipzig: B. G. Teubner,
vol. I, part 2, pp. 821-51.
Bowley, Arthur L. (1901), Elements of Statistics, London: P. S. King.
Box, Joan Fisher (1978), R. A. Fisher: The Life of a Scientist, New
York: Wiley.
Buckle, Henry Thomas (1857), History of Civilization in England,
vol. 1, London: J. W. Parker.
Collins, H. M. (1985), Changing Order, London: Sage.
Cowan, Ruth S. (1972), "Francis Galton's Statistical Ideas: The
Influence of Eugenics," Isis, December, 63, pp. 509-28.
Cullen, Michael (1975), The Statistical Movement in Early
Victorian Britain, Hassocks: Harvester.
Daston, Lorraine J. (1979), "D'Alembert's Critique of Probability
Theory," Historia Mathematica, August, 6, pp. 259-79.
Daston, Lorraine J. (1988), Classical Probability in the
Enlightenment, Princeton: Princeton University Press.
Edgeworth, Francis (1885), "Methods of Statistics," Journal of the
Royal Statistical Society, Jubilee Volume, pp. 181-217.
Fisher, Ronald A. (1925), Statistical Methods for Research
Workers, Edinburgh: Oliver and Boyd.
Fisher, Ronald A. (1930), The Genetical Theory of Natural
Selection, Oxford: Clarendon.
Fisher, Ronald A. (1935), The Design of Experiments, Edinburgh:
Oliver and Boyd.
Galton, Francis (1869), Hereditary Genius, London: Macmillan.
Galton, Francis (1877), "Typical Laws of Heredity," Proceedings of
the Royal Institution of Great Britain, 8, pp. 282-301.
Galton, Francis (1888-89), "Co-relations and their Measurement, Chiefly
from Anthropometric Data," Proceedings of the Royal
Society of London, meeting of December 20, 1888, 45, pp.
135-45.
Galton, Francis (1889), Natural Inheritance, London: Macmillan.
Galton, Francis (1890), "Kinship and Correlation," North American
Review, April, 50, pp. 419-31.
Galton, Francis ([1909] 1911), Memories of my Life, London:
Macmillan.
Gigerenzer, Gerd (1987), "Probabilistic Thinking and the Fight against
Subjectivity," in Krtiger, Gigerenzer and Morgan, pp. 11-33.
Gigerenzer, Gerd et al. (1989), The Empire of Chance: How
Probability Changed Science and Everyday Life,
Cambridge: Cambridge University Press.
Gillispie, Charles C. (1960), The Edge of Objectivity, Princeton:
Princeton University Press.
Gillispie, Charles C. (1963), "Intellectual Factors in the Background of
Analysis by Probabilities," in A. C. Crombie, ed., Scientific
Change, New York: Basic Books, pp. 431-53.
Gordon, Scott (1989), The Proper Study of Mankind: An
Introduction to the History and Philosophy of Social
Science, Brookfield, Vt.: Edward Elgar.
Hacking, Ian (1975), The Emergence of Probability, Cambridge:
Cambridge University Press.
Hacking, Ian (1983), Representing and Intervening: Introductory
Lectures on the Philosophy of Science, Cambridge:
Cambridge University Press.
[Herschel, John] (1850), "Quetelet on Probabilities," Edinburgh
Review, July, 92, pp. 1-57.
Hilts, Victor L. (1973), "Statistics and Social Science," in R. N. Giere
and R. S. Westfall, eds., Foundations of Scientific Method:
The Nineteenth Century, Bloomington: Indiana University
Press, pp. 206-33.
Jungnickel, Christa and McCormmach, Russell (1986), Intellectual
Mastery of Nature: Theoretical Physics from Ohm to
Einstein, vol. 1, The Torch of Mathematics 1800-1870,
Chicago: University of Chicago Press.
Knapp, Georg Friedrich (1871), "Die neueren Ansichten über
Moralstatistik," Jahrbücher für Nationalökonomie und
Statistik, 16, pp. 237-50.
Knapp, Georg Friedrich (1872), "A. Quetelet als Theoretiker,"
Jahrbücher für Nationalökonomie und Statistik, 17, pp.
89-124.
Krüger, Lorenz; Daston, L. J. and Heidelberger, M., eds. (1987), The
Probabilistic Revolution, vol. 1: Ideas in History,
Cambridge: MIT Press.
Krüger, Lorenz; Gigerenzer, G. and Morgan, M., eds. (1987), The
Probabilistic Revolution, vol. 2: Ideas in the Sciences,
Cambridge: MIT Press.
Laplace, Pierre Simon ([1814] 1951), Philosophical Essay on
Probabilities, F. W. Truscott and F. L. Emory, trans., New
York: Dover.
Latour, Bruno (1987), Science in Action, Cambridge: Harvard
University Press.
Lexis, Wilhelm (1877), Zur Theorie der Massenerscheinungen in
der menschlichen Gesellschaft, Freiburg: Wagner.
MacKenzie, Donald (1981), Statistics in Britain, 1865-1930: The
Social Construction of Scientific Knowledge, Edinburgh:
Edinburgh University Press.
Maxwell, James Clerk ([1860] 1890), "Illustrations of the Dynamical
Theory of Gases," in James Clerk Maxwell, Scientific Papers,
W. D. Niven, ed., 2 vols., Cambridge: Cambridge University
Press, vol. 1, pp. 377-409.
Maxwell, James Clerk ([1873] 1882), "Science and Free Will," in Lewis
Campbell and William Garnett, Life of James Clerk
Maxwell, London: Macmillan, pp. 434-44.
Montmort, Pierre Rémond de (1713), Essai d'Analyse sur les Jeux
de Hazard, Paris: Quillau, 2nd Ed.
Morrison, D. E. and Henkel, R. E., eds. (1970), The Significance
Test Controversy, Chicago: Aldine.
Pascal, Blaise ([1670] 1962), Pensées, Louis Lafuma, ed., Paris:
Delmas.
Porter, Theodore M. (1986), The Rise of Statistical Thinking,
1820-1900, Princeton: Princeton University Press.
Porter, Theodore M. (1987), "Lawless Society: Social Science and the
Reinterpretation of Statistics in Germany, 1850-1880," in
Krüger, Daston and Heidelberger, eds., pp. 351-75.
Provine, William B. (1971), The Origins of Theoretical Population
Genetics, Chicago: University of Chicago Press.
Quetelet, Lambert-Adolphe-Jacques (1829), "Recherches Statistiques
sur le Royaume des Pays-Bas," Nouveaux Mémoires de
l'Académie Royale des Sciences et Belles-Lettres de
Bruxelles, 5, separate pagination.
Quetelet, Lambert-Adolphe-Jacques (1835), Sur l'Homme et le
Développement de ses Facultés, ou Essai de Physique
Sociale, Paris: Bachelier.
Quetelet, Lambert-Adolphe-Jacques (1846), Lettres sur la Théorie
des Probabilités, Brussels: Hayez.
Quetelet, Lambert-Adolphe-Jacques (1847), "De l'Influence du Libre
Arbitre de l'Homme sur les Faits Sociaux," Bulletin de la
Commission Centrale de Statistique (of Belgium), 3, pp.
135-55.
Shapin, Steve (1982), "History of Science and its Sociological
Reconstructions," History of Science, September, 20, pp.
157-211.
Stigler, Stephen M. (1986), The History of Statistics: The
Measurement of Uncertainty before 1900, Cambridge:
Harvard University Press.
Süssmilch, Johann Peter (1765), Die göttliche Ordnung in den
Veränderungen des menschlichen Geschlechts, 2 vols.,
Berlin: Verlag der Buchhandlung der Realschule, 3d Ed.
Swijtink, Zeno G. (1987), "The Objectification of Observation:
Measurement and Statistical Methods in the Nineteenth
Century," in Kruger, Daston and Heidelberger, eds., pp. 261-85.
Yule, G. Udny (1911), An Introduction to the Theory of
Statistics, London: Griffin.
