
Discrete Random Variables and Their Probability Distributions: Part II

Cyr Emile MLAN, Ph.D.
mlan@stat.uconn.edu

Introduction
Text Reference: Introduction to Probability and Its Applications, Chapter 4.
Reading Assignment: Sections 4.3-4.6, February 25-March 2

In this set of notes, we will discuss several common and useful distribution models for discrete random variables.


Bernoulli Distribution
Many experiments (public opinion polls, consumer preference studies, drug testing, quality control in industrial settings) have only two possible outcomes: yes/no, head/tail, male/female, fail/pass, success/failure, defective/non-defective.

Bernoulli Trial
(i) Two possible outcomes: success or failure.
(ii) p = P(success) and q = P(failure) = 1 - p.
Let

    X = 1 if success, and X = 0 if failure.

Then, X has a Bernoulli distribution.



Bernoulli Distribution
The probability mass function is given by

    x      0        1
    p(x)   1 - p    p

or simply,

    p(x) = p^x q^{1-x}, \qquad x = 0, 1.

The mean and variance are

    E[X] = p, and
    V(X) = E[X^2] - (E[X])^2 = p - p^2 = p(1 - p) = pq.

Note that X^2 = X in this case, so E[X^2] = E[X] = p.

A simple example of a Bernoulli trial is to toss a coin once and let X = 1 if the head faces up and X = 0 if the tail comes up.
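A minimal simulation sketch of a Bernoulli random variable (the values p = 0.3 and 100,000 draws are assumed for illustration); the sample mean and variance should land near p and pq:

```python
# Simulate i.i.d. Bernoulli(p) draws and compare sample moments with p and pq
# (assumed illustration values: p = 0.3, 100,000 draws).
from scipy.stats import bernoulli

p = 0.3
draws = bernoulli.rvs(p, size=100_000, random_state=1)

print(draws.mean())              # close to E[X] = p = 0.3
print(draws.var())               # close to V(X) = pq = 0.21
print(bernoulli.pmf([0, 1], p))  # exact pmf: [0.7, 0.3]
```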

Binomial Distribution
Definition 3.5:
A binomial experiment possesses the following properties:
1. Each experiment consists of a fixed number, n, of independent and identical Bernoulli trials.
2. The random variable of interest is X, the number of successes observed during the n trials.

For an experiment to be a binomial experiment, the characteristics above must all be satisfied.

X has a binomial distribution if its probability mass function is

    p(x) = \binom{n}{x} p^x q^{n-x}, \qquad x = 0, 1, 2, \ldots, n,

where 0 < p < 1.
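To make the formula concrete, here is a small sketch that evaluates the binomial pmf both directly and through scipy.stats.binom (the values n = 12, p = 0.25, x = 5 are assumed for illustration):

```python
# Evaluate the Binomial(n, p) pmf at x two ways (assumed values n = 12, p = 0.25, x = 5).
from math import comb
from scipy.stats import binom

n, p, x = 12, 0.25, 5
direct = comb(n, x) * p**x * (1 - p)**(n - x)  # C(n, x) p^x q^(n-x)
library = binom.pmf(x, n, p)                   # scipy's binomial pmf
print(direct, library)                         # both about 0.1032
```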

Binomial Distribution
We use the following notation to denote this distribution: X ~ Binomial(n, p).

Another View of a Binomial Random Variable

Let

    X_i = 1 if "success" in the ith Bernoulli trial,
    X_i = 0 if "failure" in the ith Bernoulli trial.

Then X_1, X_2, . . . , X_n are independent and identically distributed (i.i.d.) as the Bernoulli distribution with p being the probability of success.
Now, write

    X = \sum_{i=1}^{n} X_i .

Then, X ~ Binomial(n, p).
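The sum-of-Bernoullis representation is easy to check by simulation; the sketch below (assumed values n = 12, p = 0.15, 50,000 replications) compares the empirical frequencies of the row sums with binom.pmf:

```python
# Sum of n i.i.d. Bernoulli(p) indicators behaves like Binomial(n, p)
# (assumed values: n = 12, p = 0.15, 50,000 replications).
from scipy.stats import bernoulli, binom

n, p, reps = 12, 0.15, 50_000
trials = bernoulli.rvs(p, size=(reps, n), random_state=0)  # reps x n matrix of 0/1 outcomes
x = trials.sum(axis=1)                                     # each row sum is one draw of X

for k in range(4):
    print(k, (x == k).mean(), binom.pmf(k, n, p))          # empirical vs. exact probability
```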

The Binomial Distribution


[Figure: probability histograms of the Binomial(n = 12, p) distribution for p = .85, p = .15, p = .08, and p = .50. The horizontal axis shows the possible values x = 0, 1, . . . , 12 and the vertical axis shows the probability of each value.]
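Plots like the four panels above can be reproduced with a few lines of code; the layout and styling below are assumed, not taken from the original slides:

```python
# Reproduce the Binomial(n = 12, p) probability histograms for four values of p.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import binom

n = 12
x = np.arange(n + 1)
fig, axes = plt.subplots(2, 2, figsize=(8, 6))

for ax, p in zip(axes.flat, [0.15, 0.85, 0.50, 0.08]):
    ax.bar(x, binom.pmf(x, n, p))          # pmf as a bar chart
    ax.set_title(f"Binomial(n=12, p={p})")
    ax.set_xlabel("x")
    ax.set_ylabel("probability")

fig.tight_layout()
plt.show()
```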

The Binomial Distribution


Example 4.14:
Corinne is a basketball player who makes 75% of her free throws over the course of a season. In a game, Corinne shoots 12 free throws and misses 5 of them. The fans think that she failed because she was nervous. Is it unusual for Corinne to perform this poorly? Should the coach or the player be terribly upset about this?

Solution:
To answer this question, assume that free throws are independent. (Studies of long sequences of free throws have found no evidence that they are dependent, so this is a reasonable assumption.) Let X = the count of missed free throws in 12 attempts, so that X ~ Binomial(n = 12, p = 0.25). We want P(X ≥ 5) = 0.1576. That is, Corinne will miss 5 or more out of 12 free throws about 16% of the time, or in roughly one of every six games. While this is above her average of 3 missed free throws in 12 attempts, the performance is well within the range of the usual chance variation in her shooting.
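The value 0.1576 quoted above can be reproduced directly with scipy.stats.binom:

```python
# P(X >= 5) for X ~ Binomial(12, 0.25): the chance Corinne misses 5 or more free throws.
from scipy.stats import binom

print(1 - binom.cdf(4, 12, 0.25))  # about 0.1576 (equivalently binom.sf(4, 12, 0.25))
```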

The Binomial Distribution


Example 4.15:
Experience has shown that 30% of all persons afflicted by a certain illness recover. A drug company has developed a new medication. Ten people with the illness were selected at random and received the medication; nine recovered shortly thereafter. Suppose that the medication is absolutely worthless: what is the probability that at least eight of the ten people receiving the medication will recover?

Solution:
Let X = the count of people who recover among the 10 people. If the medication is worthless, then X ~ Binomial(n = 10, p = 0.3). We want P(X ≥ 8) = 1 - P(X ≤ 7) = 1 - 0.998 = 0.002. If the medication is ineffective, the probability of observing at least eight recoveries is extremely small. One of two things may have occurred: (1) either the medication is really worthless and we have observed a rare event, or (2) the medication is indeed useful in curing the illness. Statisticians adhere to (2), and that is the basis of their final decision.
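The 0.002 figure can be checked the same way:

```python
# P(X >= 8) for X ~ Binomial(10, 0.3): chance of 8 or more recoveries if the drug is worthless.
from scipy.stats import binom

print(1 - binom.cdf(7, 10, 0.3))  # about 0.0016, i.e. roughly 0.002
```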

Mean and Variance of a Binomial Random Variable


Theorem 4.7:
Let X be a binomial random variable based on n trials and success probability p. Then,

    \mu = E(X) = np   and   \sigma^2 = V(X) = npq .

There are at least two approaches for computing the mean and variance of a binomial random variable.

Approach based on the binomial series:

    E(X) = \sum_{x=0}^{n} x \binom{n}{x} p^x q^{n-x}
         = \sum_{x=1}^{n} \frac{n!}{(x-1)!(n-x)!} p^x q^{n-x}
         = np \sum_{x=1}^{n} \frac{(n-1)!}{(x-1)!(n-x)!} p^{x-1} q^{n-x}
         = np \sum_{z=0}^{n-1} \binom{n-1}{z} p^z q^{n-1-z}
         = np (p + q)^{n-1} = np ,

where z = x - 1.


Mean and Variance of a Binomial Random Variable




Similarly,

    E[X(X-1)] = \sum_{x=0}^{n} x(x-1) \binom{n}{x} p^x q^{n-x}
              = \sum_{x=2}^{n} \frac{n!}{(x-2)!(n-x)!} p^x q^{n-x}
              = n(n-1) p^2 \sum_{x=2}^{n} \frac{(n-2)!}{(x-2)!(n-x)!} p^{x-2} q^{n-x}
              = n(n-1) p^2 \sum_{z=0}^{n-2} \binom{n-2}{z} p^z q^{n-2-z}
              = n(n-1) p^2 (p + q)^{n-2} = n(n-1) p^2 ,

where z = x - 2. Hence,

    V(X) = E(X^2) - (E(X))^2 = E[X(X-1)] + E(X) - (E(X))^2
         = n(n-1) p^2 + np - (np)^2 = npq .


Mean and Variance of a Binomial Random Variable


Approach based on Bernoulli random variables:
Again let

    X_i = 1, if success in the ith Bernoulli trial,
    X_i = 0, if failure in the ith Bernoulli trial.

Then E[X_i] = p and Var(X_i) = p(1 - p).

Since X = X_1 + X_2 + ... + X_n, we have

    E[X] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^{n} p = np

and, because the X_i are independent,

    Var(X) = \sum_{i=1}^{n} Var(X_i) = \sum_{i=1}^{n} p(1 - p) = np(1 - p).


The Binomial Distribution


Example 4.16 (Example 4.14 revisited): On average, how many free throws is Corinne expected to miss?

Solution: On average, Corinne is expected to miss E[X] = np = 12(0.25) = 3 free throws in 12 attempts.

Example 4.17:
A hospital has backup generators for critical systems should the electricity go out. Independent and identical backup generators are installed so that the probability that at least one will operate correctly when called on is no less than 0.99. Let n denote the number of backup generators in a hospital. How large must n be to achieve the specified probability of at least one generator operating, if
1. p = 0.95?
2. p = 0.8?


The Binomial Distribution


Solution:
Let X denote the number of correctly operating generators, and let p be the probability that a single generator operates correctly when called on. If the generators are identical and independent, X has a binomial distribution. Thus,

    P(X \ge 1) = 1 - P(X = 0) = 1 - (1 - p)^n

is at least 0.99 if

    (1 - p)^n \le 0.01
    n \ln(1 - p) \le \ln(0.01)
    n \ge \frac{\ln(0.01)}{\ln(1 - p)} .

1. When p = 0.95, n ≥ ln(0.01)/ln(0.05) ≈ 1.54, so installing two backup generators will satisfy the specifications.
2. When p = 0.8, n ≥ ln(0.01)/ln(0.2) ≈ 2.86, so installing three backup generators will satisfy the specifications.
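The same bound can be computed programmatically; a minimal sketch:

```python
# Smallest n with 1 - (1 - p)^n >= 0.99, i.e. n >= ln(0.01) / ln(1 - p).
import math

for p in (0.95, 0.8):
    n = math.ceil(math.log(0.01) / math.log(1 - p))
    print(p, n)  # p = 0.95 -> 2 generators, p = 0.8 -> 3 generators
```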

Joke

Where there's a will, I want to be in it.

The accident was caused by me waving to the man I hit last week.

The Geometric Distribution


Definition

Consider a sequence of independent and identical Bernoulli trials with common probability of success, p.
Let X = the number of Bernoulli trials required prior to achieving the first success, i.e., the number of failures preceding the first success (FF...FS, with x F's before the S).
Then, X has a geometric distribution with

    p(x) = (1 - p)^x p = q^x p , \qquad x = 0, 1, 2, \ldots,

where q = 1 - p. We write X ~ Geo(p).
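A small sketch of this pmf (p = 0.5 is assumed for illustration). Note that scipy.stats.geom counts the trial on which the first success occurs (support 1, 2, . . .), so the failure-count variable used in these notes corresponds to geom.pmf(x + 1, p):

```python
# Geometric pmf for X = number of failures before the first success (assumed p = 0.5).
from scipy.stats import geom

p = 0.5
for x in range(5):
    direct = (1 - p)**x * p        # q^x p, the pmf used in these notes
    shifted = geom.pmf(x + 1, p)   # scipy counts trials, so shift by one
    print(x, direct, shifted)
```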

The Geometric Distribution


Example 4.18:
Flip a fair coin (independently) until we get a head. Let X = the number of flips prior to obtaining that head. The possible values of X are {0, 1, 2, . . . }.

    event                    probability
    H                        P(X = 0) = 1/2
    TH                       P(X = 1) = (1/2)^2
    TTH                      P(X = 2) = (1/2)^3
    ...                      ...
    TT...TH (k tails)        P(X = k) = (1/2)^{k+1}
    ...                      ...

So, the probability mass function is given by

    x      0     1         2         3         4         ...
    p(x)   1/2   (1/2)^2   (1/2)^3   (1/2)^4   (1/2)^5   ...

The Geometric Distribution


We can also compute the following probabilities:

    P(X is odd) = P(X = 1) + P(X = 3) + P(X = 5) + \cdots
                = (1/2)^2 + (1/2)^4 + (1/2)^6 + \cdots
                = \sum_{x=1}^{\infty} (1/2)^{2x} = \sum_{x=1}^{\infty} (1/4)^x
                = \frac{1/4}{1 - 1/4} = \frac{1}{3} .

    P(X = 1 \mid X < 3) = \frac{P(\{X = 1\} \cap \{X < 3\})}{P(X < 3)}
                        = \frac{P(X = 1)}{P(X = 0) + P(X = 1) + P(X = 2)}
                        = \frac{(1/2)^2}{1/2 + (1/2)^2 + (1/2)^3}
                        = \frac{1/4}{7/8} = \frac{2}{7} .
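Both values can be verified numerically from the pmf p(x) = (1/2)^{x+1}; a quick sketch with a truncated sum:

```python
# Numerical check of P(X is odd) = 1/3 and P(X = 1 | X < 3) = 2/7, using p(x) = (1/2)**(x + 1).
p_odd = sum((1/2)**(x + 1) for x in range(1, 200, 2))      # odd x, truncated series
p_cond = (1/2)**2 / sum((1/2)**(x + 1) for x in range(3))  # P(X = 1) / P(X < 3)
print(p_odd, p_cond)                                       # about 0.3333 and 0.2857
```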

Mean and Variance of a Geometric Random Variable


Theorem 4.8:
Let X be a geometric random variable with success probability p. Then,

    \mu = E(X) = \frac{q}{p}   and   \sigma^2 = V(X) = \frac{q}{p^2} .

Approach based on the geometric series:

Note that

    \frac{d}{dq} \sum_{x=0}^{\infty} q^x = \sum_{x=0}^{\infty} \frac{d}{dq} q^x = \sum_{x=0}^{\infty} x q^{x-1} = \frac{1}{q} \sum_{x=0}^{\infty} x q^x

and

    \frac{d}{dq} \sum_{x=0}^{\infty} q^x = \frac{d}{dq} \frac{1}{1-q} = \frac{1}{(1-q)^2} = \frac{1}{p^2} .


Mean and Variance of a Geometric Random Variable

Combining the two expressions above,

    E(X) = p \sum_{x=0}^{\infty} x q^x = pq \cdot \frac{1}{p^2} = \frac{q}{p} ,

and, differentiating twice,

    E[X(X-1)] = p \sum_{x=0}^{\infty} x(x-1) q^x = p q^2 \frac{d^2}{dq^2} \frac{1}{1-q} = p q^2 \cdot \frac{2}{p^3} = \frac{2q^2}{p^2} .

Thus,

    V(X) = E[X(X-1)] + E(X) - [E(X)]^2 = \frac{2q^2}{p^2} + \frac{q}{p} - \frac{q^2}{p^2} = \frac{q}{p^2} .



The Geometric Distribution


Example 4.19:
Suppose an NBA player misses 20 percent of his free throws. Assume that successive free throws are independent. On average, how many baskets will he make before his first miss?

Solution:
Let X = the number of baskets made before the first miss. Treating a miss as the "success," X ~ Geo(0.2), so E[X] = q/p = 0.8/0.2 = 4. On average, the player makes 4 baskets before his first miss.


The Geometric Distribution


Property
The geometric distribution is used to model waiting times. An important feature of this distribution is the memoryless property:
for integers s > t ≥ 0, P(X ≥ s | X ≥ t) = P(X ≥ s - t). That is, the probability of observing at least s - t additional failures, given that we have already observed t failures, is just the probability of observing at least s - t failures at the start of the experiment.
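A quick numerical check of the memoryless property, using the fact that P(X ≥ k) = q^k for the failure-count geometric distribution (p = 0.3, s = 7, t = 3 are assumed values):

```python
# Memoryless check: P(X >= s | X >= t) equals P(X >= s - t), using P(X >= k) = q**k.
p, q = 0.3, 0.7
s, t = 7, 3

lhs = q**s / q**t  # P(X >= s) / P(X >= t)
rhs = q**(s - t)   # P(X >= s - t)
print(lhs, rhs)    # both equal 0.7**4 = 0.2401
```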


The Negative Binomial Distribution


Definition

Consider a sequence of independent and identical Bernoulli trials with common probability of success, p.
Let X = the number of Bernoulli trials that result in a failure prior to achieving the rth success.
Then, X has a Negative Binomial distribution, X ~ NB(r, p), with probability mass function

    p(x) = P(r - 1 \text{ successes in the first } x + r - 1 \text{ trials}) \cdot P(\text{success})
         = \binom{x+r-1}{r-1} p^{r-1} (1-p)^x \cdot p = \binom{x+r-1}{r-1} p^r (1-p)^x ,

for x = 0, 1, 2, . . . . A typical such sequence consists of r - 1 S's and x F's, in some order, over the first x + r - 1 trials, followed by an S.
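scipy.stats.nbinom uses this same failure-counting convention, so the pmf can be checked directly (r = 5, p = 0.2 are assumed illustration values):

```python
# Negative binomial pmf: number of failures before the r-th success (assumed r = 5, p = 0.2).
from math import comb
from scipy.stats import nbinom

r, p = 5, 0.2
for x in range(4):
    direct = comb(x + r - 1, r - 1) * p**r * (1 - p)**x  # C(x+r-1, r-1) p^r q^x
    library = nbinom.pmf(x, r, p)                        # scipy uses the same convention
    print(x, direct, library)
```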

Mean and Variance of a Negative Binomial Random Variable


Mean and Variance

Theorem 4.9:
Let X be a negative binomial random variable with success probability p, counting the number of failures before the rth success occurs. Then,

    \mu = E(X) = \frac{r(1-p)}{p} = \frac{rq}{p}   and   \sigma^2 = V(X) = \frac{r(1-p)}{p^2} = \frac{rq}{p^2} .

Mean and Variance of a Negative Binomial Random Variable

Approach based on geometric random variables:
Let W_i = the number of failures between the (i - 1)th success and the ith success.
The W_i are independent and identically distributed geometric random variables with parameter p, and X = W_1 + W_2 + ... + W_r.
Hence,

    E[X] = \sum_{i=1}^{r} E[W_i] = \sum_{i=1}^{r} \frac{q}{p} = \frac{rq}{p}

and, because the W_i are independent,

    Var(X) = \sum_{i=1}^{r} Var(W_i) = \sum_{i=1}^{r} \frac{1-p}{p^2} = \frac{rq}{p^2} .


The Negative Binomial Distribution


Example 4.20:
A pediatrician wishes to recruit 5 couples, each of whom is expecting their third child, to participate in a new natural childbirth regimen. Let p = P(a randomly selected couple agrees to participate). If p = .2, what is the probability that exactly 15 couples must be asked before 5 are found who agree to participate? That at most 15 couples must be asked before 5 are found who agree to participate?

Solution:
Let X = the number of couples who refuse before 5 couples are found who agree to participate, so that X ~ NB(r = 5, p = .2), and let Y = X + 5 = the total number of couples asked. Asking exactly 15 couples corresponds to X = 10 refusals, and asking at most 15 couples corresponds to X ≤ 10. We have

    P(Y = 15) = P(X = 10) = \binom{14}{4} (.2)^5 (.8)^{10} = 0.034 ,

    P(Y \le 15) = P(X \le 10) = \sum_{x=0}^{10} \binom{x+4}{4} (.2)^5 (.8)^x = 0.164 .
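Both probabilities can be reproduced with scipy.stats.nbinom, which counts the refusals (failures) before the 5th success:

```python
# Example 4.20 probabilities via the negative binomial distribution of refusals.
from scipy.stats import nbinom

r, p = 5, 0.2
print(round(nbinom.pmf(10, r, p), 3))  # P(exactly 15 couples asked), about 0.034
print(round(nbinom.cdf(10, r, p), 3))  # P(at most 15 couples asked), about 0.164
```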

The Negative Binomial Distribution


Example 4.21:
A large stockpile of used pumps contains 20% that are in need of repair. A maintenance worker is sent to the stockpile with three repair kits. She selects pumps at random and tests them one at a time. If a pump works, she sets it aside for future use. However, if a pump does not work, she uses one of her repair kits on it. Suppose that it takes 10 minutes to test a pump that is in working condition and 30 minutes to test and repair a pump that does not work. Find the mean and standard deviation of the total time it takes the maintenance worker to use up her three repair kits.

Solution: Let X = the number of working pumps tested before the third nonfunctioning pump is found. Then X ~ NB(r = 3, p = .2), so E(X) = rq/p = 12 and V(X) = rq/p^2 = 60. However, the random variable of interest here is the total time T = 10 X + 3(30) = 10 X + 90. Hence,

    E(T) = 10 E(X) + 90 = 210 minutes,
    V(T) = 10^2 V(X) = 100(60) = 6000, and
    \sigma_T = \sqrt{6000} = 77.46 minutes.
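As a check, the same numbers follow from scipy.stats.nbinom and the linear transformation T = 10X + 90:

```python
# Mean and standard deviation of T = 10 X + 90, where X ~ NB(r = 3, p = 0.2).
import math
from scipy.stats import nbinom

mean_x, var_x = nbinom.stats(3, 0.2, moments="mv")
mean_t = 10 * mean_x + 90      # E(T) = 10 E(X) + 90
sd_t = math.sqrt(100 * var_x)  # sd(T) = 10 sd(X)
print(mean_t, sd_t)            # 210.0 and about 77.46
```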

Joke

The older I got, the better I used to be.

