JOHN S. LOUCKS
St. Edward's University
Chapter 4
Introduction to Probability
Counting Rule for Multiple-Step Experiments
Markley Oil: n1 = 4
Collins Mining: n2 = 2
Total number of experimental outcomes: n1n2 = (4)(2) = 8
Tree Diagram
Markley Oil (Stage 1), then Collins Mining (Stage 2); each path gives one of the 8 experimental outcomes and its total payoff:

(10, 8)    Gain $18,000
(10, -2)   Gain $8,000
(5, 8)     Gain $13,000
(5, -2)    Gain $3,000
(0, 8)     Gain $8,000
(0, -2)    Lose $2,000
(-20, 8)   Lose $12,000
(-20, -2)  Lose $22,000
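As a cross-check, here is a minimal Python sketch (variable names are mine) that enumerates the two-stage outcomes with itertools.product and reproduces both the count n1n2 = 8 and the payoffs above:

    from itertools import product

    markley = [10, 5, 0, -20]   # Stage 1: Markley Oil outcomes (thousands of $)
    collins = [8, -2]           # Stage 2: Collins Mining outcomes (thousands of $)

    outcomes = list(product(markley, collins))
    assert len(outcomes) == len(markley) * len(collins) == 8  # counting rule: n1 * n2

    for m, c in outcomes:
        total = (m + c) * 1000
        label = "Gain" if total > 0 else "Lose"
        print(f"({m}, {c}): {label} ${abs(total):,}")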
Counting Rule for Combinations
The number of combinations of N objects taken n at a time is
C(N, n) = N! / (n!(N - n)!)
where N! = N(N - 1)(N - 2) ... (2)(1), n! = n(n - 1)(n - 2) ... (2)(1), and 0! = 1.
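As a sanity check on the formula, Python's standard library exposes both the factorial and the combination count directly; N = 5 and n = 2 below are illustrative values, not from the slides:

    from math import comb, factorial

    N, n = 5, 2  # illustrative values
    # C(N, n) = N! / (n! * (N - n)!)
    by_formula = factorial(N) // (factorial(n) * factorial(N - n))
    assert by_formula == comb(N, n) == 10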
Assigning Probabilities
Classical Method: Assigning probabilities based on the assumption of equally likely outcomes.
Relative Frequency Method: Assigning probabilities based on experimentation or historical data.
Subjective Method: Assigning probabilities based on the assignor's judgment.
Classical Method
If an experiment has n possible outcomes, this
method
would assign a probability of 1/n to each
outcome.
Example
Experiment: Rolling a die
Sample Space: S = {1, 2, 3, 4, 5, 6}
Probabilities: Each sample point has a 1/6
chance
of occurring.
13
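A minimal sketch of the classical method for the die experiment, using exact fractions so each of the n = 6 outcomes receives probability 1/n:

    from fractions import Fraction

    sample_space = [1, 2, 3, 4, 5, 6]
    n = len(sample_space)
    probabilities = {outcome: Fraction(1, n) for outcome in sample_space}

    assert probabilities[3] == Fraction(1, 6)
    assert sum(probabilities.values()) == 1  # probabilities must sum to 1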
Relative Frequency Method
Example: The number of polishers rented per day was recorded for the last 40 days. Each probability is the frequency divided by the total:

Number of Polishers Rented    Number of Days    Probability
0                              4                .10 (= 4/40)
1                              6                .15 (= 6/40)
2                             18                .45
3                             10                .25
4                              2                .05
Total                         40               1.00
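The relative frequency assignment is just each frequency divided by the total number of days; a short sketch using the table above (the dict layout is mine):

    days_rented = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}  # polishers rented -> number of days
    total_days = sum(days_rented.values())           # 40

    probabilities = {k: d / total_days for k, d in days_rented.items()}
    print(probabilities)  # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}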
Subjective Method
Example: An analyst assigned a subjective probability to each of the eight experimental outcomes; for example, P(10, 8) = .20.
Complement of an Event
The complement of event A, denoted Ac, is the event consisting of all sample points that are not in A.
P(A) = 1 - P(Ac)
Union of Two Events
The union of events A and B, denoted A ∪ B, is the event containing all sample points belonging to A or B or both.
Example: Markley Oil or Collins Mining Profitable
Let M be the event that Markley Oil is profitable and C the event that Collins Mining is profitable. Then
M ∪ C = {(10, 8), (10, -2), (5, 8), (5, -2), (0, 8), (-20, 8)}
P(M ∪ C) = P(10, 8) + P(10, -2) + P(5, 8) + P(5, -2) + P(0, 8) + P(-20, 8)
         = .20 + .08 + .16 + .26 + .10 + .02
         = .82
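With the sample-point probabilities stored in a dict, P(M ∪ C) is the sum over the sample points in the union; a sketch using the slide's values:

    # Sample-point probabilities for the outcomes in M ∪ C (from the slide)
    p = {(10, 8): .20, (10, -2): .08, (5, 8): .16,
         (5, -2): .26, (0, 8): .10, (-20, 8): .02}

    m_union_c = [(10, 8), (10, -2), (5, 8), (5, -2), (0, 8), (-20, 8)]
    print(round(sum(p[pt] for pt in m_union_c), 2))  # 0.82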
Intersection of Two Events
The intersection of events A and B, denoted A ∩ B, is the event containing the sample points belonging to both A and B.
Example: Markley Oil and Collins Mining Profitable
M ∩ C = {(10, 8), (5, 8)}
P(M ∩ C) = P(10, 8) + P(5, 8) = .20 + .16 = .36
Addition Law
The addition law provides a way to compute the probability that event A or event B or both occur:
P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
Addition Law
Example: Markley Oil or Collins Mining Profitable
We know: P(M) = .70, P(C) = .48, P(M ∩ C) = .36
Thus: P(M ∪ C) = P(M) + P(C) - P(M ∩ C)
              = .70 + .48 - .36
              = .82
This result is the same as that obtained earlier using the definition of the probability of an event.
Conditional Probability
The probability of an event given that another event has already occurred is called a conditional probability:
P(A|B) = P(A ∩ B) / P(B)
Conditional Probability
Example: Collins Mining Profitable Given Markley Oil Profitable
P(C|M) = P(C ∩ M) / P(M) = .36 / .70 = .51
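The same division in code; note that .36/.70 = .5143, which the slides round to .51:

    p_m = 0.70         # P(M): Markley Oil profitable
    p_m_and_c = 0.36   # P(M ∩ C): both profitable

    p_c_given_m = p_m_and_c / p_m
    print(round(p_c_given_m, 2))  # 0.51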
Multiplication Law
The multiplication law provides a way to compute the probability of an intersection of two events:
P(A ∩ B) = P(B) P(A|B) = P(A) P(B|A)
Multiplication Law
Example: Markley Oil and Collins Mining Profitable
We know: P(M) = .70, P(C|M) = .51
Thus: P(M ∩ C) = P(M) P(C|M)
              = (.70)(.51)
              = .36
This result is the same as that obtained earlier using the definition of the probability of an event.
Independent Events
Events A and B are independent if P(A|B) = P(A) or, equivalently, P(B|A) = P(B).
Multiplication Law for Independent Events: P(A ∩ B) = P(A) P(B)
Example: Are events M and C independent? Does P(M ∩ C) = P(M) P(C)?
We know P(M ∩ C) = .36, while P(M) P(C) = (.70)(.48) = .336. Since .336 ≠ .36, M and C are not independent.
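A quick check of the independence condition with the values above:

    p_m, p_c, p_m_and_c = 0.70, 0.48, 0.36

    # M and C are independent only if P(M ∩ C) equals P(M) * P(C)
    print(round(p_m * p_c, 3))                # 0.336, which differs from P(M ∩ C) = 0.36
    print(round(p_m * p_c, 3) == p_m_and_c)   # False -> M and C are not independent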
Bayes' Theorem
Often we begin with initial or prior probability estimates; when new information is obtained, Bayes' theorem lets us revise them:
Prior Probabilities → New Information → Application of Bayes' Theorem → Posterior Probabilities
Example: L. S. Clothiers
A proposed shopping center will provide strong competition for downtown businesses like L. S. Clothiers. If the shopping center is built, the owner of L. S. Clothiers feels it would be best to relocate.
The shopping center cannot be built unless a zoning change is approved by the town council. The planning board must first make a recommendation, for or against the zoning change, to the council. Let:
A1 = town council approves the zoning change
A2 = town council disapproves the change
Prior Probabilities: Using subjective judgment, P(A1) = .7 and P(A2) = .3.
Example: L. S. Clothiers
New Information
The planning board has recommended against the zoning change. Let B denote the event of a negative recommendation by the planning board.
Given that B has occurred, should L. S. Clothiers revise the probabilities that the town council will approve or disapprove the zoning change?
Conditional Probabilities
Past history with the planning board and the town council indicates the following:
P(B|A1) = .2
P(B|A2) = .9
Example: L. S. Clothiers
Tree Diagram
P(A1) = .7:  P(B|A1) = .2  →  P(A1 ∩ B) = .14
             P(Bc|A1) = .8 →  P(A1 ∩ Bc) = .56
P(A2) = .3:  P(B|A2) = .9  →  P(A2 ∩ B) = .27
             P(Bc|A2) = .1 →  P(A2 ∩ Bc) = .03
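Each joint probability on the tree is a prior times a conditional; a minimal sketch (the dict layout is mine):

    priors = {"A1": 0.7, "A2": 0.3}      # P(A1), P(A2)
    p_b_given = {"A1": 0.2, "A2": 0.9}   # P(B|Ai)

    for a in priors:
        joint_b = priors[a] * p_b_given[a]          # P(Ai ∩ B)
        joint_bc = priors[a] * (1 - p_b_given[a])   # P(Ai ∩ Bc)
        print(a, round(joint_b, 2), round(joint_bc, 2))
    # A1 0.14 0.56
    # A2 0.27 0.03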
Bayes' Theorem
For mutually exclusive events A1, ..., An whose union is the entire sample space,
P(Ai|B) = P(Ai) P(B|Ai) / [P(A1) P(B|A1) + P(A2) P(B|A2) + ... + P(An) P(B|An)]
Example: L. S. Clothiers
Posterior Probabilities
Given the planning board's recommendation not to approve the zoning change, we revise the prior probabilities as follows:
P(A1|B) = P(A1) P(B|A1) / [P(A1) P(B|A1) + P(A2) P(B|A2)]
        = (.7)(.2) / [(.7)(.2) + (.3)(.9)]
        = .14 / .41
        = .34
Conclusion
The planning board's recommendation is good news for L. S. Clothiers. The posterior probability of the town council approving the zoning change is .34, versus a prior probability of .70.
Tabular Approach
Step 1: Prepare three columns: (1) the mutually exclusive events Ai for which posterior probabilities are desired, (2) the prior probabilities P(Ai), and (3) the conditional probabilities P(B|Ai) of the new information B given each event.
Step 2: In column (4), compute the joint probabilities P(Ai ∩ B) by multiplying the prior probabilities in column (2) by the corresponding conditional probabilities in column (3).
Step 3: Sum the joint probabilities in column (4) to obtain the probability of the new information: P(B) = .14 + .27 = .41.
Step 4: In column (5), compute the posterior probabilities P(Ai|B) = P(Ai ∩ B) / P(B).

(1)        (2)            (3)              (4)            (5)
Events     Prior          Conditional      Joint          Posterior
Ai         P(Ai)          P(B|Ai)          P(Ai ∩ B)      P(Ai|B)
A1         .7             .2               .14            .3415
A2         .3             .9               .27            .6585
           1.0                             P(B) = .41     1.0000
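The entire tabular approach condenses to a few lines of Python; a sketch reproducing columns (4) and (5):

    priors = {"A1": 0.7, "A2": 0.3}   # column (2): P(Ai)
    cond = {"A1": 0.2, "A2": 0.9}     # column (3): P(B|Ai)

    joint = {a: priors[a] * cond[a] for a in priors}   # column (4): P(Ai ∩ B)
    p_b = sum(joint.values())                          # P(B) = .41
    posterior = {a: joint[a] / p_b for a in joint}     # column (5): P(Ai|B)

    for a in priors:
        print(a, round(joint[a], 2), round(posterior[a], 4))
    # A1 0.14 0.3415
    # A2 0.27 0.6585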
End of Chapter 4