Random number tables have been used in statistics for tasks such as selecting random samples. This was much more effective than manually selecting random samples (with dice, cards, etc.). Nowadays, tables of random numbers have been replaced by computational random number generators.
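
In practice, such a table was read off directly to draw a sample. A minimal sketch in Python, assuming a short invented digit string in place of a real published table:

```python
# A minimal sketch of table-based sampling. The digit string below is
# invented for illustration; in practice it would be read from a
# published table of random digits.
table_digits = "9264835017462950183746290187"

# Draw a sample of 5 from a population numbered 1-90, reading the
# table two digits at a time and discarding out-of-range or
# repeated values.
population_size = 90
sample, pos = [], 0
while len(sample) < 5 and pos + 2 <= len(table_digits):
    candidate = int(table_digits[pos:pos + 2])
    pos += 2
    if 1 <= candidate <= population_size and candidate not in sample:
        sample.append(candidate)

print(sample)  # -> [64, 83, 50, 17, 46]
```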

Tables of random numbers have the desired properties no matter how the numbers are chosen from the table: by row, column, diagonal, or irregularly. The first such table was published by a student of Karl Pearson's in 1927, and a number of other such tables have been developed since. The first tables were generated in a variety of ways: one (by L.H.C. Tippett) took its numbers "at random" from census registers, another (by R.A. Fisher and Frank Yates) used numbers taken "at random" from logarithm tables, and in 1939 a set of 100,000 digits was published by M.G. Kendall and B. Babington Smith, produced by a specialized machine in conjunction with a human operator. In the mid-1940s, the RAND Corporation set about developing a large table of random numbers for use with the Monte Carlo method, and using a hardware random number generator produced A Million Random Digits with 100,000 Normal Deviates. The RAND table used an electronic simulation of a roulette wheel attached to a computer, the results of which were carefully filtered and tested before being used to generate the table. The RAND table was an important breakthrough in delivering random numbers: such a large and carefully prepared table had never before been available (the largest previously published table was a tenth its size), and it was also available on IBM punch cards, which allowed its use in computers. In the 1950s, a hardware random number generator named ERNIE was used to draw British Premium Bond numbers.

The first "testing" of random numbers for statistical randomness was developed by M.G.
Kendall and B. Babington Smith in the late 1930s, and was based upon looking for certain
types of probabilistic expectations in a given sequence. The simplest test looked to make
sure that roughly equal numbers of 1s, 2s, 3s, etc. were present; more complicated tests
looked for the number of digits between successive 0s and compared the total counts with
their expected probabilities. Over the years more complicated tests were developed. Kendall
and Smith also created the notion of local randomness, whereby a given set of random
numbers would be broken down and tested in segments. In their set of 100,000 numbers, for
example, two of the thousands were somewhat less "locally random" than the rest, but the
set as a whole would pass its tests. Kendall and Smith advised their readers not to use those
particular thousands by themselves as a consequence.
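
As a concrete illustration of these ideas, the following Python sketch (using a generated stand-in sequence rather than Kendall and Smith's actual table) applies the simplest frequency test via a chi-squared statistic, first to the whole sequence and then, in the spirit of local randomness, to each block of 1,000 digits:

```python
import random
from collections import Counter

def frequency_chi2(digits):
    """Chi-squared statistic for the frequency test: are the ten
    digits 0-9 present in roughly equal numbers?"""
    expected = len(digits) / 10
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

# 95% critical value of the chi-squared distribution with 9 degrees of
# freedom; a statistic above this suggests the digits are not uniform.
CRITICAL_9DF = 16.92

digits = [random.randrange(10) for _ in range(100_000)]  # stand-in sequence

# Test of the set as a whole.
print(f"whole set: chi2 = {frequency_chi2(digits):.2f}")

# Local randomness: retest each thousand separately. Even in a good
# table a few blocks will exceed the critical value by chance, which
# is why Kendall and Smith flagged rather than discarded them.
for start in range(0, len(digits), 1000):
    chi2 = frequency_chi2(digits[start:start + 1000])
    if chi2 > CRITICAL_9DF:
        print(f"digits {start}-{start + 999}: chi2 = {chi2:.2f} (less locally random)")
```
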
If carefully prepared, the filtering and testing processes remove any noticeable bias or asymmetry from the hardware-generated original numbers, so that such tables provide the most "reliable" random numbers available to the casual user.

Note that any published (or otherwise accessible) random data table is unsuitable for cryptographic purposes, since the accessibility of the numbers makes them effectively predictable, and hence their effect on a cryptosystem is also predictable. By contrast, genuinely random numbers that are accessible only to the intended encoder and decoder allow literally unbreakable encryption of a similar or lesser amount of meaningful data (using a simple exclusive-OR operation).
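
The scheme alluded to here is the one-time pad. A minimal sketch in Python, assuming the pad is genuinely random, kept secret between encoder and decoder, and never reused (`secrets.token_bytes` stands in for that key material):

```python
import secrets

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    """Exclusive-OR each message byte with the corresponding pad byte."""
    assert len(pad) >= len(data), "pad must be at least as long as the message"
    return bytes(m ^ p for m, p in zip(data, pad))

message = b"ATTACK AT DAWN"
pad = secrets.token_bytes(len(message))  # secret, single-use key material

ciphertext = xor_bytes(message, pad)    # encrypt
recovered = xor_bytes(ciphertext, pad)  # decrypt: the same pad undoes the XOR
assert recovered == message
```

Because XOR is its own inverse, applying the same pad twice recovers the message; the security rests entirely on the pad being secret, random, and used only once.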
