[Figure: frequency distribution of height, Ht (cm), 1996; N = 713]
A Bernoulli trial has two possible outcomes: Success (S) or Failure (F).
Examples: toss of a coin (heads or tails); sex of a newborn (male or female).
[Portrait: Jacob Bernoulli (1654-1705)]
Let p be the probability of success and q = 1 − p the probability of failure.
Example: toss of a coin (S = head): p = 0.5, q = 0.5
Example: a coin is tossed 5 times. Assume p remains constant from trial to trial and that the trials are statistically independent of each other.
The Binomial Distribution
Overview
Example
What is the probability of obtaining 2 heads from a coin that
was tossed 5 times?
Any one particular sequence with x = 2 heads in n = 5 tosses, e.g. S S F F F, has probability
P = p · p · q · q · q = p^x q^(n−x)
Counting all such sequences gives the binomial probability:

P(x) = [n! / (x!(n − x)!)] · p^x q^(n−x)

where n! / (x!(n − x)!) is the number of ways to obtain x successes in n trials, and i! = i (i − 1) (i − 2) … 2 · 1.
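As a quick check, the binomial formula can be evaluated directly in a short Python sketch (the function name is illustrative, not from the slides):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(x successes in n trials) = C(n, x) * p^x * q^(n-x)."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# Probability of obtaining exactly 2 heads in 5 tosses of a fair coin
print(binomial_pmf(2, 5, 0.5))  # 0.3125
```

Here C(5, 2) = 10 ways to place the 2 heads, each sequence having probability 0.5^5 = 1/32, giving 10/32 = 0.3125.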
[Figure: binomial distributions Bin(p, 5) for p = 0.1, 0.3, 0.5, 0.7, 0.9; x = 0…5 successes]
The Poisson Distribution
Overview
When there is a large number of trials, but a small probability of success, the binomial calculation becomes impractical.
Example: number of deaths from horse kicks in the Army in different years.
[Portrait: Simeon D. Poisson (1781-1840)]
The mean number of successes from n trials is µ = np.
Example: 64 deaths in 20 years from thousands of soldiers.
P(x) = e^(−µ) µ^x / x!
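A minimal Python sketch of the Poisson formula, using the horse-kick example above (interpreting 64 deaths in 20 years as a mean of µ = 3.2 deaths per year; the function name is illustrative):

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    """P(x) = e^(-mu) * mu^x / x!"""
    return exp(-mu) * mu**x / factorial(x)

mu = 64 / 20  # 64 deaths in 20 years -> mean of 3.2 deaths per year
for x in range(5):
    print(x, round(poisson_pmf(x, mu), 4))
```

Note that, unlike the binomial, the Poisson formula needs only the mean µ, not n and p separately.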
What is p?
Random events
Regular events
Clumped events
[Figure: Poisson distributions for several values of µ (panels labelled 0.1, 0.5, 1, 2, …); x = 0…12]
The Expected Value of a Discrete Random Variable

E(X) = Σ_{i=1..n} a_i p_i = a_1 p_1 + a_2 p_2 + … + a_n p_n

The Variance of a Discrete Random Variable

σ²(X) = E[(X − E(X))²] = Σ_{i=1..n} p_i a_i² − (Σ_{i=1..n} a_i p_i)²
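These two sums translate directly into code. A short sketch using a fair six-sided die as the discrete random variable (the example and function names are not from the slides):

```python
def expected_value(values, probs):
    # E(X) = sum of a_i * p_i
    return sum(a * p for a, p in zip(values, probs))

def variance(values, probs):
    # sigma^2(X) = sum of p_i * a_i^2, minus (sum of a_i * p_i)^2
    mean = expected_value(values, probs)
    return sum(p * a**2 for a, p in zip(values, probs)) - mean**2

# A fair six-sided die: a_i = 1..6, each with p_i = 1/6
faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
print(expected_value(faces, probs))  # 3.5
print(variance(faces, probs))        # 35/12, about 2.9167
```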
Uniform random variables
The closed unit interval contains all numbers between 0 and 1, including the two end points 0 and 1.
Example on the interval [0, 10]: the probability density function is
f(x) = 1/10 for 0 ≤ x ≤ 10; f(x) = 0 otherwise
so any subinterval of length 1, such as [3, 4] or [5, 6], has probability 0.1.
[Figure: the density f(x) = 1/10 plotted for x = 0…10, P(X) on the vertical axis, with subintervals [3,4] and [5,6] marked]
The Expected Value of a Continuous Random Variable

E(X) = ∫ x f(x) dx

For a uniform random variable X, where f(x) is defined on the interval [a, b] and a < b,

E(X) = (a + b)/2 and σ²(X) = (b − a)²/12
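The integral definition can be checked numerically against the closed-form results for the uniform case. A sketch using a simple midpoint-rule integration (the function is illustrative, not from the slides):

```python
def uniform_moments(a, b, steps=100_000):
    """Numerically integrate x*f(x) and (x - mean)^2 * f(x)
    for the uniform density f(x) = 1/(b - a) on [a, b]."""
    f = 1 / (b - a)
    dx = (b - a) / steps
    xs = [a + (i + 0.5) * dx for i in range(steps)]  # midpoints
    mean = sum(x * f * dx for x in xs)
    var = sum((x - mean)**2 * f * dx for x in xs)
    return mean, var

mean, var = uniform_moments(0, 10)
print(mean, var)  # close to (0 + 10)/2 = 5 and (10 - 0)^2/12 = 8.333...
```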
The Normal Distribution
Overview
Discovered in 1733 by de Moivre as an approximation to the binomial distribution when the number of trials is large.
[Portrait: Abraham de Moivre (1667-1754)]
Derived in 1809 by Gauss.
Its importance lies in the Central Limit Theorem, which states that the sum of a large number of independent random variables (binomial, Poisson, etc.) will approximate a normal distribution.
[Portrait: Karl F. Gauss (1777-1855)]
f(x) = (1 / (σ √(2π))) e^(−(x − µ)² / (2σ²))

[Figure: the normal density f(x) plotted for x = −3…3]
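As a sanity check on the density formula, a short Python sketch (the function name is illustrative, not from the slides):

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    """f(x) = 1/(sigma*sqrt(2*pi)) * exp(-(x - mu)^2 / (2*sigma^2))"""
    return exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

# Peak of the standard normal density, at x = 0: 1/sqrt(2*pi)
print(normal_pdf(0))  # 0.3989...
```

The value at the peak matches the top of the curve in the figure (just under 0.40), and the density is symmetric about µ.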
The Normal Distribution
Overview
0.25
0.20
0.15
0.10
0.05
0.00
25 26 27 28 29 30 31 32 33 34 35
0.25
0.20
0.15
0.10
0.05
0.00
25 26 27 28 29 30 31 32 33 34 35
0.25
0.20
0.15
0.10
0.05
0.00
25 26 27 28 29 30 31 32 33 34 35
5000
4000
3000
2000
1000
0
-6 -4 -2 0 2 4
Useful properties of the normal distribution
1. Normal random variables can be added: for independent X and Y,
E(X + Y) = E(X) + E(Y) and σ²(X + Y) = σ²(X) + σ²(Y)
2. They can be transformed with shift and change-of-scale operations.
Consider two random variables X and Y. Let X ~ N(µ, σ) and let Y = aX + b, where a and b are constants.
Change of scale is the operation of multiplying X by a constant a: one unit of X becomes a units of Y.
Shift is the operation of adding a constant b to X: we simply move our random variable X b units along the x-axis.
If X is a normal random variable, then the new random variable Y created by these operations on X is also a normal random variable.
For X ~ N(µ, σ) and Y = aX + b:
E(Y) = aµ + b
σ²(Y) = a²σ²
A special case of a change-of-scale and shift operation, with a = 1/σ and b = −µ/σ:
Y = (1/σ)X − µ/σ = (X − µ)/σ
which gives E(Y) = 0 and σ²(Y) = 1.
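This special case is the familiar z-score transformation. A minimal sketch, using hypothetical measurements with assumed µ = 30 and σ = 2 (none of these numbers come from the slides):

```python
mu, sigma = 30.0, 2.0
xs = [26.0, 28.0, 30.0, 32.0, 34.0]  # hypothetical measurements

# Standardize: z = (x - mu)/sigma, i.e. a = 1/sigma and b = -mu/sigma
zs = [(x - mu) / sigma for x in xs]
print(zs)  # [-2.0, -1.0, 0.0, 1.0, 2.0]
```

Each z value says how many standard deviations the measurement lies from the mean, so the transformed values are centred at 0 with unit spread.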
The Central Limit Theorem
Standardizing any random variable that is itself a sum or average of a set of independent random variables results in a new random variable that is nearly the same as a standard normal one.
The only caveats are that the sample size must be large enough and that the observations themselves must be independent and all drawn from a distribution with common expectation and variance.
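The theorem can be illustrated with the binomial, which is itself a sum of n independent Bernoulli trials: for large n, Bin(n, p) should be close to a normal with µ = np and σ² = npq. A deterministic Python sketch comparing the two (parameter choices are illustrative):

```python
from math import comb, exp, pi, sqrt

# Bin(n, p) is a sum of n independent Bernoulli trials, so for large n
# it should approximate a normal with mu = n*p and sigma^2 = n*p*q.
n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))

exact = comb(n, 50) * p**50 * (1 - p)**50                        # binomial P(X = 50)
approx = exp(-(50 - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))  # normal density
print(exact, approx)  # ~0.0796 vs ~0.0798
```

Already at n = 100 the normal density at the mean agrees with the exact binomial probability to about two parts in a thousand.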
Exercise
Location of the
measurement