Scanned by CamScanner
Contents

4  Random Variables  107
   Introduction  107
   Section 1. The Notion of a Random Variable  107
   Section 2. The Distribution Function  115
      2.1 The Definition of a Distribution Function, 2.2 Properties of a Distribution Function
   Section 3. Classification of Random Variables  130
      3.1 Discrete Random Variables, 3.2 Absolutely Continuous Random Variables, 3.3 Mixed Distributions, 3.4 Singular Distributions

5  Some Special Distributions  157
   Introduction  157
   Section 1. Discrete Distributions  157
      1.1 Bernoulli Distribution, 1.2 The Binomial Distribution, 1.3 The Hypergeometric Distribution, 1.4 The Geometric Distribution, 1.5 The Negative Binomial Distribution, 1.6 The Poisson Distribution
   Section 2. Absolutely Continuous Distributions  175
      2.1 The Uniform Distribution, 2.2 The Normal Distribution, 2.3 The Gamma Distribution, 2.4 The Cauchy Distribution, 2.5 The Laplace Distribution

6  Functions of a Random Variable  195
   Introduction  195
   Section 1. The Mathematical Formulation  195
   Section 2. The Distribution of a Function of a Random Variable  198
      2.1 The Discrete Case, 2.2 The Continuous Case

7  Expectation-A Single Random Variable
      1.2 … of a Random Variable, 1.3 Some Properties of Expectation, 1.4 The Variance of a Random Variable, 1.5 Conditional Expectation
   Section 2. Expectations of Some Special Distributions  242

   Section 2. Marginal Distributions  292
      2.1 A General Discussion, 2.2 The Discrete Case, 2.3 The Absolutely Continuous Case

9  Conditional Distributions and Independent Random Variables  311
   Introduction  311
   Section 1. Conditional Distributions  311
      1.1 Conditional Distribution Given an Event of Positive Probability, 1.2 Conditional Distribution Given a Specific Value
   Section 2. Independent Random Variables  327
   Section 3. More Than Two Random Variables  343
      3.1 The Joint Distribution Function, 3.2 The Discrete Case, 3.3 The Absolutely Continuous Case

10  Functions of Several Random Variables  351
   Introduction  351
   Section 1. The Discrete Case  352
   Section 2. The Continuous Case  357
      2.1 Distribution of the Sum, 2.2 Distribution of the Product, 2.3 Distribution of the Quotient, 2.4 Distribution of the Maximum, 2.5 Distribution of the Minimum
   Section 3. Miscellaneous Examples  373

11  Expectation-Several Random Variables  389
   Introduction  389
   Section 1. Expectation of a Function of Several Random Variables  389
      1.1 The Definition, 1.2 Basic Properties of Expectation, 1.3 Covariance
   Section 2. …
      2.2 … Value of a Random Variable by Conditioning, 2.3 Probabilities by Conditioning

12  Generating Functions  435
   Introduction  435
Preface

… measure and their interpretations. I find this particularly desirable since the student can be made to realize that some problems which seem inaccessible on first appearance can in fact be attempted in a routine way.

Part 2, consisting of Chapters 4 through 7, deals with single random variables, and part 3, consisting of Chapters 8 through 11, treats several random variables. There is a common theme adopted in the development of these two parts. I have found that considerable mileage can be gained if, before embarking on part 3, the student is made aware that the broad approach adopted in part 2 is maintained in part 3. This approach uses the following sequential developments: (1) mathematical description of a function defined on the sample space; (2) introduction of the concept of a distribution function along with its properties; (3) classification of random variables on the basis of the nature of the distribution function; (4) treatment of functions of random variables; and (5) the treatment of expectation. It is also helpful to make the student aware of how, for instance, the definitions of random vector, distribution functions, and so on mimic those in part 2.

Part 4 consists of Chapter 12, treating generating functions, and Chapter 13, which involves the study of limit theorems in probability.

There are a wide variety of illustrative examples throughout the text, and I consider this to be one of its strong points. Thorough explanations are given so that the student can read these on his own, thereby allowing the instructor more time to discuss questions of a more fundamental nature. Some of the examples contain important results developed in subsequent chapters. Such examples are indicated by marking them with a solid circle. The reader would be well advised to familiarize himself with their essence. In Section 1 of Chapter 4, some examples are marked with an asterisk. These might be omitted at first reading, especially if the interest of the reader is nonmathematical.

No mathematical book at this level is complete without an adequate number of exercises. I have met this requirement by providing a wealth of exercises which touch on every aspect of the theory discussed in the text. They are given at the end of each section and, as far as possible, are arranged in the order in which the material is developed in the particular section. No important results which are needed for further development of the subject are relegated to the exercises. The exercises are initiated with simple routine problems which increase in complexity, but none should be considered beyond the prowess of a diligent student. Hints are appended for problems which might call for undue insight.

The extent of coverage in a semester or a quarter will depend largely on the level and background of the students. Even so, it is inconceivable that the entire book would be covered in a one-semester offering. Based on my own experience, a one-semester course can be outlined as follows: most of the topics in Chapters 1 through 9, with varying degrees of emphasis; Section 1 of Chapter 11; a brief touch on the contents of Chapter 12; finally, Chebyshev's inequality and the central limit theorem in Chapter 13.

In a two-quarter course the pace could be more leisurely, allowing more time to discuss topics in Chapters 12 and 13. In this type of offering, the first … chapters, with the balance of the book to be covered in the second quarter.

There is no denying the fact that I have derived heavily from the existing literature on the subject, and I acknowledge my indebtedness to these sources. Some are mentioned at the end of the text; the interested reader might consult these to broaden his perspective.

On a personal note, during the typing of this manuscript the author lost, in the death of Paul Van Wulven, a good friend and a typist of uncanny genius. The final chapters were typed by Cheryl Richards, who, in spite of no previous experience with mathematical typing, rose to great heights and did a superb job.

Ramakant Khazanie
4 / Basic Probability Theory and Applications

The classical theory was not equipped to handle problems of loaded dice or biased coins. Consideration of problems of this type led to the axiomatic theory. A giant step in this direction was taken as a result of the pioneering work of A. Kolmogorov, who provided a sound mathematical foundation for the subject of probability.

How does one devise a set of axioms in a mathematical discipline? Of course, it is always possible to propose a system of axioms and derive results from them. The only requirement would be that the axioms be consistent. However, if the axioms are such that they have no connection with reality, then the whole exercise becomes purely academic and of very little practical use. To have any relevance at all, the axioms should be motivated by our experience in the real world and should reflect it as closely as possible. In other words, the axioms should serve to provide an idealization of what we observe in nature. Such an axiomatic presentation governing the behavior of chance phenomena was given by A. Kolmogorov (1933) in The Foundations of Probability Theory. Our introduction to the subject will be mainly axiomatic. The classical theory will turn out to be a special case.

1. ELEMENTS OF SET THEORY

1.1 The Notion of a Set

Since the concepts of set theory are at the very heart of the treatment of probability, we shall begin by presenting a detailed outline of the basic ideas.

The word set is meant to indicate a gathering of objects which we choose to isolate because they have some common characteristic. However, any attempt to define a set is fraught with logical difficulties. For our purpose, we shall adopt the intuitively familiar notion and regard a set as a collection of objects, requiring only that it be possible to determine unambiguously whether or not any given object is a member of the collection.

When a complete list of the members of a set is given, it is customary to write them within braces, separated by commas. For example, a set that contains the four letters a, b, c, and d may be written as {a, b, c, d}. Since we are talking only about the objects in the set, there is no reason why the members should be written in any particular order. For example, the sets {a, b, c, d}, {d, b, a, c}, {d, c, a, b} represent the same collection and consequently the same set. Hence order is irrelevant in listing members of a set. Also, no purpose is served by repeating the same element, so only distinct elements are listed in a set.

We shall denote sets by upper case letters and the elements of these sets by lower case letters. If x is in the set A we shall write

    x ∈ A

to mean "x is an element of the set A." If x is not in A, we write

    x ∉ A

to mean "x is not an element of the set A." For example, if A = {Tom, Dick, Mary}, then Tom ∈ A and Jack ∉ A.

If a set has a large number of elements, it might be tedious or sometimes impossible to specify the set by a complete list of its elements. A notation that has been devised to describe such sets is the so-called set-builder notation. If we represent a typical member of the set by x, then the set of all elements x such that x has some property, say property P, is written as

    {x | x has the property P}

For example, we could write the set of real numbers greater than 4 as

    {x | x a real number, x > 4}

As another example, the set consisting of pairs of real numbers where the first component is twice the second component can be written as

    {(u, v) | u, v real numbers, and u = 2v}

The braces should be read as "the set of all ..." and the vertical bar as "such that."

A very important set is the set of all the real numbers. We shall denote it by R. Using the set-builder notation,

    R = {x | x a real number, −∞ < x < ∞}

In the sequel we shall also need the following sets. Suppose a and b are real numbers with a < b. Then

    [a, b] = {x | x ∈ R, a ≤ x ≤ b}   (closed interval)
    (a, b) = {x | x ∈ R, a < x < b}   (open interval)
    [a, b) = {x | x ∈ R, a ≤ x < b}
    (a, b] = {x | x ∈ R, a < x ≤ b}
    [a, ∞) = {x | x ∈ R, a ≤ x < ∞}
    (a, ∞) = {x | x ∈ R, a < x < ∞}
    (−∞, a] = {x | x ∈ R, −∞ < x ≤ a}
    (−∞, a) = {x | x ∈ R, −∞ < x < a}

A set which has no elements in it is called the empty set, or the void set. We shall denote it by the symbol ∅. The following are examples of empty sets:

    (i) The set of all equilateral triangles with one angle equal to 45°
    (ii) The set of all odd integers divisible by 4
    (iii) {(x, y) | x, y ∈ R, |x| + |y| < 0}
    (iv) {x | x ∈ R, x² = −1}

A set which is not empty is called nonempty.
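Set membership, the irrelevance of listing order, and the set-builder notation all have direct counterparts in a modern programming language. The following is a minimal sketch in Python (not from the text; the particular elements are illustrative):

```python
# Order of listing and repetition are irrelevant for a set.
A = {"a", "b", "c", "d"}
B = {"d", "b", "a", "c"}
print(A == B)        # True: same collection, hence the same set

# Membership: x ∈ A and x ∉ A.
print("a" in A)      # True
print("z" not in A)  # True

# Set-builder notation {x | x has property P} maps to a comprehension.
evens = {x for x in range(1, 13) if x % 2 == 0}
print(evens == {2, 4, 6, 8, 10, 12})  # True

# The empty set ∅.
empty = set()
print(len(empty) == 0)  # True
```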
6 / Basic Probability Theory and Applications

Let A and B be two sets. If every member of B is also a member of A, then B is called a subset of A, and we write B ⊂ A. Thus

    B ⊂ A means that if x ∈ B then x ∈ A

B is a proper subset of A if B ⊂ A and there is some member of A which is not in B. We give the following examples:

    (i) If A = {Tom, Dick, Harry}, then {Tom, Dick} is a subset of A.
    (ii) The set of equilateral triangles is a subset of the set of the isosceles triangles; in fact, a proper subset.
    (iii) The set of spades in a bridge deck of 52 cards is a subset of the deck.
    (iv) {(x, y) | x, y ∈ R, …} ⊂ {(x, y) | x, y ∈ R}

If A ⊂ B and B ⊂ A, then there is nothing in A which is not in B, and nothing in B which is not in A. In view of this comment, we say that the sets are equal and write A = B. For example, if A represents the set of triangles with all sides equal, and B the set of triangles with all angles equal, then A = B.

If set A is not equal to set B, we write A ≠ B. For instance, if A = {a, b, c} and B = {a, b, c, d}, then A ≠ B.

… we have

    A ∩ B = {x | x ∈ A and x ∈ B}

Building Blocks of the Probability Structure / 7

The notation AB, juxtaposing the two letters A and B, is also used in place of A ∩ B. We shall find this latter notation more convenient for our purpose. Consider the following examples:

    (i) If A = {a, b, c, d} and B = {a, d, c, f, g}, then AB = {a, c, d}.
    (ii) If A is the set of people who wear a tie, and B the set of people who wear a jacket, then AB represents the set of people who wear a tie and a jacket.
    (iii) Let

        A = {(x, y) | x, y ∈ R, x > 3}
        B = {(x, y) | x, y ∈ R, y > −1}

    Then

        AB = {(x, y) | x, y ∈ R, x > 3 and y > −1}

Two sets A and B are said to be disjoint if they have no elements in common; that is, if AB = ∅. More generally, if A₁, A₂, ... is a collection of sets, then we say that the sets are pairwise disjoint if, whenever i ≠ j, Aᵢ and Aⱼ are disjoint.

The concept of intersection carries over to any arbitrary collection of sets and, in particular, to a countable collection of sets. Thus, if A₁, A₂, ... represents a … Notice that 5 is in the intersection since 5 ∈ Aᵢ for every i. For the same reason, 2 is also in the intersection.
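The intersection examples above are easy to check mechanically. A Python sketch follows (the juxtaposition AB of the text is written `A & B` here; the extra set C is my own illustration of disjointness):

```python
A = {"a", "b", "c", "d"}
B = {"a", "d", "c", "f", "g"}

# AB = A ∩ B, the members common to A and B.
AB = A & B
print(AB == {"a", "c", "d"})  # True, as in example (i)

# Disjoint sets: the intersection is the empty set.
C = {"p", "q"}
print(A.isdisjoint(C))   # True
print((A & C) == set())  # True
```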
12 / Basic Probability Theory and Applications

EXERCISES-SECTION 1

1. Suppose S = {1, 2, 3, ..., 12}, A = {1, 3, 4, 6, 10, 11}, B = {2, 3, 5, 6, 9, 10, 12}.
Find:
    (a) A ∪ B  (b) AB  (c) A − B
    (d) B − A  (e) A′

2. Suppose I⁺ is the set of positive integers and A, B, C are three subsets of it defined as follows:
    A = {2k | k ∈ I⁺}
    B = {2k − 1 | k ∈ I⁺}
    C = {3k | k ∈ I⁺}
Describe the sets in words.

3. Let A, B, C be as described in exercise 2. Describe in words the following sets:
    (a) A ∪ B  (b) (A ∪ C)B  (c) A ∪ C
    (d) A ∪ (BC)  (e) AB  (f) BC

4. Use Venn diagrams to establish the following:
    (a) If A ⊂ B and B ⊂ C, then A ⊂ C.
    (b) A(B − A) = ∅.
    (c) If A ⊂ B, then A = AB.
    (d) If AB = ∅ and C ⊂ B, then AC = ∅.
    (e) If AB = ∅, then A′B = B.
    (f) (A ∪ B) − B = A − AB.
    (g) (A − AB) ∪ B = A ∪ B.
    (h) (A ∪ B) − AB = (AB′) ∪ (A′B).

5. State whether the following are true or false for all sets A and B.
    (a) If A ∪ B = ∅, then A = B = ∅.
    (b) If AB = ∅, then A − B = ∅.
    (c) If A = ∅ or B = ∅, then AB = ∅.
    (d) If A ⊂ B, then A ∪ B = A.

6. Prove that A ⊂ B if and only if B′ ⊂ A′.

7. Prove the following identities:
    (a) (A′)′ = A for any set A.
    (b) S′ = ∅ and ∅′ = S.
    (c) A ∪ A′ = S for any set A.
    (d) AA′ = ∅ for any set A.

8. If S is the universal set, and if A is a subset of S, find:
    (a) SA  (b) (∅ ∪ A)′  (c) S′A′
    (d) SA′  (e) (∅A)′  (f) (A′ ∪ A)′

9. Let S = {x ∈ R | 0 < x < 4}, and let A and B be subsets of S defined as A = {x | 1 < x < 3}, B = {x | x > 2}. Find:
    (a) A′  (b) B′  (c) A ∪ B
    (d) A′ ∪ B′  (e) (A ∪ B)′  (f) A′B′
    (g) (AB)′

10. Suppose S = R × R, where R denotes the set of real numbers. Let A = {(x, y) ∈ S | x < u}, B = {(x, y) ∈ S | y < v}.
    (a) Find AB, A ∪ B, and A − B.
    (b) Using the Cartesian plane and the appropriate regions of the plane, indicate (as Venn diagrams) the sets AB, A ∪ B, and A − B.

11. Let A = {a, b, c}. Answer true or false:
    (a) ∅ ∉ A, but ∅ ∈ P(A).
    (b) a ∈ A, but a ∉ P(A).
    (c) {a, b} ∉ A, but {a, b} ∈ P(A).
    (d) A is an element of P(A) but is not a subset of it.
    (e) {{a, b}, {c}} ⊂ P(A).

12. Let A = {a, b, c}, B = {6, 7}, C = {7, 8}. Find:
    (a) A × (B ∪ C)  (b) (A × B) ∪ (A × C)
    (c) A × (BC)  (d) (A × B)(A × C)

13. Let A = {1, 2, 3} and B = {1, 2, 5, 6}. List the elements in the following sets:
    (a) (A − B) × (A − B)  (b) A × (A − B)  (c) (A × A) − (B × B)

14. Consider the following sets which are subsets of R × R. Indicate which ones are Cartesian product sets, and write them in product form.
    (a) {(x, y) | x < 2}
    (b) {(x, y) | 0 < x < 1, y < 2}
    (c) {(x, y) | 0 < x < 1, −2 < y < 3}
    (d) {(x, y) | y > 2}
    (e) {(x, y) | x = 2y}

15. Let S = {1, 2, 3, ...} and Aₙ = {x ∈ S | x > n}. Explain why lim Aₙ (as n → ∞) exists. Find it.

16. Consider the following sequences {Aₙ} of subsets of R. In each case state whether the sequence is contracting or expanding and find lim Aₙ (as n → ∞).
    (a) Aₙ = {x | 1 + 1/n < x < 4 − 1/n²}
    (b) Aₙ = {x | 1 − 1/n < x < 4 + 1/n}
    (c) Aₙ = {x | x > 2 − 1/n}
    (d) Aₙ = {x | x > 2 + 1/n}
    (e) Aₙ = {x | 4 − 1/n < x < 4 + 1/n}

2. SAMPLE SPACE AND EVENTS

From the viewpoint of … the universal set will be called the sample space. As a mathematical entity the sample space is only a set. Throughout we shall denote this set by S. Every conceivable outcome of the experiment judged pertinent to a discussion is in this set, and there is no member in this set which is not a possible outcome of the experiment. Members of S are called sample points. Consider the following examples:
14 / Basic Probability Theory and Applications

(i) If we toss a coin, using the letter H for heads and the letter T for tails, we could represent S as S = {H, T}.

(ii) If we roll a die once, we could write S as S = {1, 2, 3, 4, 5, 6}.

(iii) If we roll a die and toss a coin, writing the outcomes as pairs with the first component representing the outcome on the die and the second the outcome on the coin, we could write

    S = {(1, H), (1, T), (2, H), (2, T), ..., (6, H), (6, T)}

(iv) If we pick one card from a standard deck of 52 cards, then we have

    S = {A_sp, K_sp, ..., 2_sp, A_h, K_h, ..., 2_h, A_d, K_d, ..., 2_d, A_cl, K_cl, ..., 2_cl}

with an obvious notation. For example, Q_sp denotes the queen of spades.

(v) Suppose a basketball player keeps throwing a ball at the basket until he makes the basket for the first time. Once he scores he quits. Let us agree to write m if he misses and s if he scores the basket. What are the possibilities? He could score on the first attempt, or miss on the first attempt and score on the second, or miss on the first two and score on the third attempt, and so on. Writing these possibilities as s, ms, mms, and so on, we have

    S = {s, ms, mms, mmms, ...}

(vi) Suppose a person speculates on the length of time he will have to wait at the bus stop for the bus to come. In this case, we have S = {x | x ≥ 0, x ∈ R}.

It should be noted that there is no unique way of describing a sample space, and considerable skill and insight is called upon in deciding what outcomes will be relevant. Faulty descriptions of the underlying sample spaces have led to several paradoxes in the theory of probability. Let us consider the following examples:

(vii) Suppose a coin is flipped. In this case the sample space can be given as S = {head, tail}, assuming that the coin will land either heads or tails. However, if we entertain the possibility that the coin might land on its edge, then S = {head, tail, edge} would be a more appropriate sample space.

(viii) Suppose a box contains three letters a, b, c and we pick two of these letters, one by one, without returning the first letter to the box before the second one is picked. Here one might give the sample space as S₁ = {ab, ba, ac, ca, bc, cb}, listing the order in which the letters are picked. For example, ba would represent that the letter b was picked first and the letter a was picked next. With this description of the sample space we are in a position to answer a question such as "Which outcomes resulted in picking the letter b first?" Now suppose we choose to give the sample space as S₂ = {ab, ac, bc}, recording all the outcomes on the basis of what two distinct letters are picked, but with no regard to the order. This second description, though satisfactory to answer some questions, is totally inadequate to answer the question "Which outcomes resulted in picking the letter b first?" since it fails to record the outcomes according to the order in which the letters are picked. Of the two sample spaces S₁, S₂, it would seem that S₁ is more appropriate for our purpose.

In describing a sample space, it is not always possible to decide ahead of time the kinds of questions that one might want answered. To be on the safe side, it is best to provide a description of the sample space as fully as possible, so that it is adequate to answer all the possible questions one might pose.

… pose a question of the following type: What is the probability that an even number will show up? We will say that we are interested in the event "an even number will show up." This event will occur if and only if 2 shows up, or 4 shows up, or 6 shows up. In other words, we are interested in the members of the set {2, 4, 6}. Thus any verbal description of an event has a set-theoretic representation to it. On account of this, an event is defined as a subset of the sample space. An event consisting of a single outcome is called an elementary event, or a simple event.

An axiomatic treatment of probability requires a more formal definition of an event; this aspect will be discussed in the next section. From now on, we shall identify a verbal description of an event with the underlying set.

An event E will be said to have occurred if one of the outcomes belonging to the set E takes place.

Understanding this is crucial for a good grasp of the algebra of events, where we combine two or more events to get a new event. The algebra of events, as we shall soon see, is precisely the algebra of sets that we have already discussed.

The event consisting of all members of S is called the sure event. The name is quite suggestive because one of the outcomes belonging to S is bound to occur when the experiment is performed. Hence the sure event is the set S.

The event which contains no members of S, namely ∅, is called the impossible event. This makes sense because the event ∅ will never occur, since it has no outcomes in it.

Example 2.1. Suppose we roll two dice. Write as sets the following verbal descriptions of events.
    (i) E₁ = The sum on the two dice is 7.
    (ii) E₂ = The two dice show the same number.
    (iii) E₃ = The sum on the two dice is a prime number.

Solution. The sample space can be written as a set of ordered pairs in the following way:

    S = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
         (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
         (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
         (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
         (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
         (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}
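The events of Example 2.1 can be generated by filtering the 36 ordered pairs. A Python sketch follows (not from the text; the prime test is a small helper of my own):

```python
from itertools import product

# Sample space: ordered pairs (first die, second die).
S = set(product(range(1, 7), repeat=2))

def is_prime(n):
    return n > 1 and all(n % d != 0 for d in range(2, n))

E1 = {p for p in S if sum(p) == 7}        # sum on the two dice is 7
E2 = {p for p in S if p[0] == p[1]}       # same number on both dice
E3 = {p for p in S if is_prime(sum(p))}   # sum is prime

print(len(S))   # 36
print(len(E1))  # 6
print(len(E2))  # 6
print(len(E3))  # 15  (sums 2, 3, 5, 7, 11)
```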
26 / Basic Probability Theory and Applications

… is a sigma field. Because of the way we have constructed it, it is the smallest sigma field containing … It is called the sigma field generated by …

As a further illustration, if S = {a, b, c, d}, then it can be easily verified that (i) the sigma field generated by {{a}, {b}} is {∅, {a}, {b}, {a, b}, {c, d}, {a, c, d}, {b, c, d}, S}, and (ii) that generated by {{a}, {b}, {c}} is the power set of {a, b, c, d}.

We are now in a position to discuss the Borel field of the real line. To construct it, we use the following procedure:

(i) We begin by including all the intervals (−∞, a], where a is any real number. Thus the intervals of the type (−∞, a] form a part of the collection 𝔅.

(ii) For 𝔅 to be a sigma field we now require, on account of the axiom on complements, that it contain the complements of the intervals that we included under (i). Since the complement of (−∞, a] is (a, ∞), the collection 𝔅 will contain all the intervals of the type (a, ∞), where a is any real number. For example, intervals of the type (1, ∞), (2, ∞), (√3, ∞) will be members of 𝔅.

(iii) Suppose a and b are any two real numbers with a < b. Since by (i) (−∞, b] ∈ 𝔅, and by (ii) (a, ∞) ∈ 𝔅, their intersection, (−∞, b] ∩ (a, ∞) = (a, b], is also in 𝔅. In other words, all the intervals of the type (a, b], where a and b are real numbers with a < b, are in 𝔅. For example, (2, 3], (−2, √2], and so on are in 𝔅.

… we note from (iii) that sets of the type (a − 1/n, a] are in 𝔅 for every natural number n.

…sistencies in defining probabilities, this is precisely what we do. But there are situations where inconsistencies crop up. This happens when S has an uncountable number of sample points. We shall take this problem up in Chapter 2.

EXERCISES-SECTION 3

1. If S = {a, b, c, d, e}, give three distinct sigma fields of subsets of S.

2. If 𝔉 is a sigma field, show that for any A ∈ 𝔉, B ∈ 𝔉, it follows that (A − B) ∪ (B − A) ∈ 𝔉.

3. Suppose A and B are subsets of S, neither of which is S or the empty set. Find the sigma field generated by {A, B}.

4. Discuss the Borel field of the subsets of the interval [0, 1]. Hint: Start with the set of intervals of the form {x | 0 ≤ x ≤ b}, where 0 ≤ b ≤ 1.

5. Discuss the Borel field of the subsets of S = R × R = {(x, y) | x, y ∈ R}. Hint: Start with the collection of sets of the form I_{a,b} = {(x, y) | −∞ < x ≤ a, −∞ < y ≤ b}, where a, b are any real numbers.
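For a finite S, the sigma field generated by a collection can be computed by brute-force closure under complements and unions. The helper below is my own sketch, not the text's construction; it reproduces the eight-event sigma field generated by {{a}, {b}} on S = {a, b, c, d} discussed above:

```python
from itertools import combinations

def generated_sigma_field(S, generators):
    """Close a collection of subsets of a finite S under complement and union."""
    field = {frozenset(), frozenset(S)} | {frozenset(g) for g in generators}
    changed = True
    while changed:
        changed = False
        for A in list(field):
            c = frozenset(S) - A          # complement
            if c not in field:
                field.add(c)
                changed = True
        for A, B in combinations(list(field), 2):
            u = A | B                     # pairwise union
            if u not in field:
                field.add(u)
                changed = True
    return field

S = {"a", "b", "c", "d"}
F = generated_sigma_field(S, [{"a"}, {"b"}])
print(len(F))  # 8 events, as in the text
```

Closure under intersections then follows automatically, by De Morgan's law, from closure under complements and unions.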
30 / Basic Probability Theory and Applications

(C1) The probability of the impossible event is zero. That is, P(∅) = 0.

For example,

    A = A ∪ ∅ ∪ ∅ ∪ ...

Since A ∩ ∅ = ∅ and ∅ ∩ ∅ = ∅, we get, using axiom (P3),

    P(A) = P(A) + P(∅) + P(∅) + ...

Now, using the fact that P(A) is finite, this gives P(∅) = 0.

Comment. This result incidentally proves that P is finitely additive. That is, if A₁, A₂, ..., Aₙ are n mutually exclusive events, then P(A₁ ∪ ... ∪ Aₙ) = Σᵢ₌₁ⁿ P(Aᵢ). This follows simply from the fact that

    P(A₁ ∪ ... ∪ Aₙ) = P(A₁ ∪ ... ∪ Aₙ ∪ ∅ ∪ ∅ ∪ ...)
                     = P(A₁) + ... + P(Aₙ) + P(∅) + P(∅) + ...

by axiom (P3). Hence P(A₁ ∪ ... ∪ Aₙ) = P(A₁) + ... + P(Aₙ), since P(∅) = 0.

Note: A collection of events is said to be mutually exclusive if no two of them can occur simultaneously.

(C2) For any event A, P(A′) = 1 − P(A).

In words, the probability that an event will not occur is equal to one minus the probability that it will occur. We can prove this as follows: From the definition of A′ we know that S = A ∪ A′. Now by axiom (P2), P(S) = 1. Hence

    P(A ∪ A′) = 1

Therefore, since A and A′ are mutually exclusive,

    P(A) + P(A′) = 1

Consequently,

    P(A′) = 1 − P(A)

Definition of Probability / 31

(C3) If A and B are any two events, then

    P(A − B) = P(A) − P(AB)

(Recall that A − B represents the event that A occurs but not B; that is, of the two events A and B, only A will occur.)

To prove this, we note that we can write the event A as the union of the two mutually exclusive events A − B and AB. (See the Venn diagram in Figure 1.1.) Hence

    P(A) = P(A − B) + P(AB)

so that

    P(A − B) = P(A) − P(AB)

Comment. We must be careful to take note of the fact that, in general, P(A − B) is not equal to P(A) − P(B). However, in the special case when B ⊂ A, the result is true. That is,

    if B ⊂ A then P(A − B) = P(A) − P(B)

(C4) (Monotone Property) If events A and B are such that B ⊂ A, then P(A) ≥ P(B).

The proof of this relies on the comment at the end of the proof of (C3). We know that if B ⊂ A, then

    P(A − B) = P(A) − P(B)

But P(A − B) ≥ 0, by axiom (P1). Therefore, P(A) − P(B) ≥ 0 and consequently

    P(A) ≥ P(B)

Figure 1.1
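The consequences (C1) through (C4) can be sanity-checked on a small equally likely sample space, where P(E) = |E|/|S|. The following Python sketch is my own illustration (the particular sets A, B, C are arbitrary choices):

```python
from fractions import Fraction

S = frozenset(range(1, 13))  # a 12-outcome equally likely sample space

def P(E):
    # Equally likely outcomes: P(E) = |E| / |S|, computed exactly.
    return Fraction(len(E), len(S))

A = frozenset({1, 2, 3, 4, 5, 6})
B = frozenset({4, 5, 6, 7, 8})
C = frozenset({1, 2})                 # C ⊂ A

print(P(frozenset()) == 0)            # (C1): P(∅) = 0
print(P(S - A) == 1 - P(A))           # (C2): complement rule
print(P(A - B) == P(A) - P(A & B))    # (C3): P(A − B) = P(A) − P(AB)
print(C < A and P(C) <= P(A))         # (C4): monotone property
```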
Therefore,

    P(A₁ ∪ A₂ ∪ A₃) = P(A₁) + [P(A₂) + P(A₃) − P(A₂A₃)]
                       − [P(A₁A₂) + P(A₁A₃) − P(A₁A₂A₃)]
                    = P(A₁) + P(A₂) + P(A₃) − P(A₁A₂)
                       − P(A₁A₃) − P(A₂A₃) + P(A₁A₂A₃)

That is,

    P(A₁ ∪ A₂ ∪ A₃) = Σᵢ₌₁³ P(Aᵢ) − Σ_{i<j} P(AᵢAⱼ) + P(A₁A₂A₃)

An astute observation might lead one to conjecture that, if A₁, A₂, ..., Aₙ are n events, then

    P(∪ᵢ₌₁ⁿ Aᵢ) = Σᵢ₌₁ⁿ P(Aᵢ) − Σ_{i<j} P(AᵢAⱼ) + Σ_{i<j<k} P(AᵢAⱼAₖ) − ... + (−1)ⁿ⁺¹ P(A₁A₂...Aₙ)

This conjecture is indeed correct and can be proved by induction. The proof of this result will be left to the exercises.

Comment. The preceding formula is useful for finding the probability that at least one of the events A₁, A₂, ..., Aₙ will occur, if we can find the probability of the simultaneous occurrence of any subcollection of the events. Also note that if the events are mutually exclusive, then the probability of the simultaneous occurrence of two or more events is zero, and consequently P(∪ᵢ₌₁ⁿ Aᵢ) will equal Σᵢ₌₁ⁿ P(Aᵢ), as it should.

Example 1.1. Suppose A and B are two events for which P(A) = 0.6, P(B) = 0.7, and P(AB) = 0.4. Find the following probabilities:
    (a) P(A ∪ B)  (b) P(AB′)  (c) P(BA′)
    (d) P((AB)′)  (e) P((A ∪ B)′)  (f) P(A′B′)
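Example 1.1 follows from the rules derived earlier. The sketch below works the six parts with exact fractions; the answers shown are my own arithmetic, obtained by applying (C2), (C3), the addition rule, and De Morgan's law, not a solution quoted from the text:

```python
from fractions import Fraction

PA, PB, PAB = Fraction(3, 5), Fraction(7, 10), Fraction(2, 5)  # 0.6, 0.7, 0.4

ans_a = PA + PB - PAB        # (a) P(A ∪ B), addition rule
ans_b = PA - PAB             # (b) P(AB′) = P(A − B), by (C3)
ans_c = PB - PAB             # (c) P(BA′) = P(B − A), by (C3)
ans_d = 1 - PAB              # (d) P((AB)′), by (C2)
ans_e = 1 - ans_a            # (e) P((A ∪ B)′), by (C2)
ans_f = ans_e                # (f) P(A′B′) = P((A ∪ B)′), De Morgan

print(ans_a, ans_b, ans_c, ans_d, ans_e, ans_f)
# 9/10 1/5 3/10 3/5 1/10 1/10
```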
40 / Basic Probability Theory and Applications

If {Aₙ} is an expanding sequence with lim Aₙ = S (as n → ∞), then

    lim P(Aₙ) = P(lim Aₙ) = P(S) = 1

(a) For any real number r we can write {r} = ∩ₙ₌₁^∞ (r − 1/n, r]. Since {(r − 1/n, r]} is a contracting sequence of intervals, we get

    P({r}) = P(∩ₙ₌₁^∞ (r − 1/n, r]) = lim P((r − 1/n, r])

Definition of Probability / 41

… This leads to an inconsistency because P((0, 1]) = 1, whereas Σ P(Eᵢ) is either … members of the Borel field. However, they are, of course, members of the power set of [0, 1].
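The contracting-interval argument can be illustrated numerically with the uniform (length) measure on [0, 1], under which P((a, b]) is simply the interval's length. The measure choice and the code are my own illustration, not the text's:

```python
# Length measure on subintervals of [0, 1].
def P_interval(a, b):
    return max(0.0, min(b, 1.0) - max(a, 0.0))

r = 1.0
for n in (1, 2, 10, 100, 10000):
    # The intervals (r - 1/n, r] contract down to the singleton {r}.
    print(n, P_interval(r - 1.0 / n, r))
# The probabilities 1/n tend to 0, consistent with P({r}) = 0.
```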
' Definition o f Probability 143
42 ¡Bene Probability Theory and Appticanons
yy elementary events. This « suflfcent. l-.o b .b iim « » « th«n ..signed m 3 nan.nl
#. A student is taking two courses. History and English. If the probability that he
will pass either of the courses is 0.7, that he will pass both the courses is 0.2, and wav to ail the events as follows: . , . _ _
that he «ill fail in History is 0.6, find the probability that— Suppose A = l«/t .1 ,,____ I t i *«•> * oulcomcs-Then A ™ b‘ ' Xp" W d **
r tj (a) he will pass History union of k mutually exclusive elementary events as
(b) he will pass English
(c) he will pass exactly one course.

9. Suppose A, B, C, D are four events. Derive an expression for the probability that exactly k of the events occur (k = 1, 2, 3, 4) in terms of the probabilities of their intersections.

10. Ann, Betty, Cathy, and Dorothy are invited to attend a party. Let A, B, C, and D represent respectively the events that Ann, Betty, Cathy, and Dorothy attend the party. If P(A) = P(B) = P(C) = P(D) = 0.6, P(AB) = P(AC) = P(AD) = P(BC) = P(BD) = P(CD) = 0.36, P(ABC) = P(ABD) = P(ACD) = P(BCD) = 0.216, and P(ABCD) = 0.1296, find the probability that exactly k girls attend the party, k = 0, 1, 2, 3, 4.

11. Suppose S = {1, 2, . . .} and P({i}) = k/3^i for all i in S, where k is a constant.
(a) Determine k.
(b) Find the probability of (i) the set of even numbers, (ii) the set of odd numbers.

12. Prove by induction that

P(A_1 ∪ A_2 ∪ . . . ∪ A_n) = Σ_i P(A_i) - Σ_{i<j} P(A_i A_j) + Σ_{i<j<k} P(A_i A_j A_k) - . . . + (-1)^(n+1) P(A_1 A_2 . . . A_n)

13. Establish the following inequalities:
(a) P(AB) ≥ 1 - P(A') - P(B')
(b) P(∪_i A_i) ≤ Σ_i P(A_i)

2. FINITE SAMPLE SPACES

We shall devote the rest of this chapter to the discussion of finite sample spaces; that is, sample spaces which have only a finite number of outcomes. Consider a sample space S with N outcomes, so that we can write S = {s_1, s_2, . . . , s_N}. Recall that the power set of S has 2^N members. In other words, there are 2^N possible events. If we define a function P which assigns numerical values to these events in a way which is consistent with the three probability axioms ((P1), (P2), and (P3)), then the function P is a probability measure. Thus, in order to define a probability function on a sample space with N outcomes, we need specify at most 2^N values. In practice, this is accomplished by assigning probabilities to the elementary events. If A is an event consisting of the outcomes s_{i1}, s_{i2}, . . . , s_{ik}, we can write

A = {s_{i1}} ∪ {s_{i2}} ∪ . . . ∪ {s_{ik}}

Using axiom (P3), we therefore have

P(A) = P({s_{i1}}) + P({s_{i2}}) + . . . + P({s_{ik}})

Thus for a finite sample space, the probability of an event A is equal to the sum of the probabilities assigned to each of the outcomes that make up the event A.

The classical definition of the probability of an event is based on two fundamental assumptions. One of these is to assume that the performance of an experiment results in a finite number of outcomes. The other is to assume that all the elementary events have the same probability; that is, the outcomes are equally likely or equiprobable.

In what follows let us assume that the outcomes are equally likely; that is,

P({s_1}) = P({s_2}) = . . . = P({s_N}) = p

Then, since Σ_{i=1}^{N} P({s_i}) = 1, we get Np = 1, so that p = 1/N.

Hence, if the outcomes of a sample space S with N outcomes are equally likely, then the probability of each elementary event is 1/N, the reciprocal of the number of outcomes in S.

Next, suppose A is an event with k outcomes, s_{i1}, s_{i2}, . . . , s_{ik}. Then, as we have already seen,

P(A) = P({s_{i1}}) + P({s_{i2}}) + . . . + P({s_{ik}}) = k/N

Comment. We shall have occasion to use phrases like "an unbiased coin is tossed," "a fair die is rolled," "an object is picked at random," and so on. They are all meant to suggest that the outcomes in the sample space are equally likely.

Example 2.1. Let the sample space S = {s_1, s_2, s_3, s_4, s_5, s_6} be given. Probabilities are assigned to some events as follows:
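The probabilities asked for in exercise 10 can be checked numerically. The sketch below is not part of the text's solutions; it assumes the standard inclusion-exclusion formula P(exactly k of n) = Σ_{j≥k} (-1)^(j-k) C(j, k) S_j, where S_j is the sum of the probabilities of all j-wise intersections, and the helper name p_exactly is ours. With the data of exercise 10, every j-wise intersection has probability 0.6^j:

```python
from math import comb

# S[j] = sum of probabilities of all j-wise intersections; with the data
# of exercise 10 each j-wise intersection has probability 0.6**j, and
# there are C(4, j) of them.
S = [comb(4, j) * 0.6**j for j in range(5)]

def p_exactly(k, S, n=4):
    """Inclusion-exclusion: probability that exactly k of the n events occur."""
    return sum((-1)**(j - k) * comb(j, k) * S[j] for j in range(k, n + 1))

probs = [p_exactly(k, S) for k in range(5)]
print([round(p, 4) for p in probs])  # -> [0.0256, 0.1536, 0.3456, 0.3456, 0.1296]
```

Notice that the five values are exactly the binomial probabilities C(4, k) 0.6^k 0.4^(4-k), which is consistent with the data's product structure (P(AB) = 0.6^2, and so on).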
Basic Probability Theory and Applications / Definition of Probability

In each case, whether the sampling is carried out with or without replacement, we may or may not be interested in the order in which the objects are picked. As a result, we have the following four situations:

Case 1: without replacement, with order
Case 2: without replacement, without order
Case 3: with replacement, with order
Case 4: with replacement, without order

Comment. When n objects are picked and the order is important, it is convenient to write the sample points as ordered n-tuples (x_1, x_2, . . . , x_n), where the ith component x_i represents the ith object picked. Thus x_1 represents the result of the first draw, x_2 that of the second draw, and so on.

Let us consider the following illustration, which brings out the essential ingredients of our discussion. Suppose there are four distinct objects represented by the letters a, b, c, d, and two of these letters are picked. The following four cases are possible:

Case 1. Without replacement, with order:
    ab ac ad
    ba bc bd
    ca cb cd
    da db dc

Case 2. Without replacement, without order:
    ab ac ad
       bc bd
          cd

Case 3. With replacement, with order:
    aa ab ac ad
    ba bb bc bd
    ca cb cc cd
    da db dc dd

Case 4. With replacement, without order:
    aa ab ac ad
       bb bc bd
          cc cd
             dd

In cases 1 and 2, the sampling is carried out without replacement, and consequently there are no possibilities like aa, bb, cc, dd. This explains why there are no entries along the diagonal in these cases. In case 2, moreover, we are not interested in order so that, for example, bc is listed, but not cb.

In case 3, we list all sixteen possibilities. In case 4, since the sampling is with replacement, we certainly have outcomes like aa, bb, cc, dd. However, since order is not relevant, ab is the same as ba, and so on. Hence there are no entries below the diagonal.

Comment. When order matters, each possibility is called an arrangement, or a permutation. If order does not matter, it is called a combination.

We shall now provide a general formula in each of the above four cases. Towards this, we state the following basic rule of counting techniques.

The Basic Counting Principle. If a certain experiment can be performed in r ways and, corresponding to each of these ways, a second experiment can be performed in k ways, then the combined experiment can be performed in rk ways.

To understand this principle, suppose the outcomes of the first experiment are written as A = {a_1, a_2, . . . , a_r} and those of the second experiment as B = {b_1, b_2, . . . , b_k}. Then the outcomes of the combined experiment can be represented in a rectangular array as ordered pairs (a_i, b_j):

    (a_1, b_1)  (a_1, b_2)  . . .  (a_1, b_k)
    (a_2, b_1)  (a_2, b_2)  . . .  (a_2, b_k)
      . . .
    (a_r, b_1)  (a_r, b_2)  . . .  (a_r, b_k)

In other words, the outcomes of the combined experiment can be represented as the Cartesian product A × B. Clearly, there are rk pairs. Indeed, this shows that n(A × B) = n(A) · n(B).

Another way of illustrating the above principle is by a tree diagram, as shown in Figure 3.1. First we list all the outcomes of one experiment, and then, corresponding to each of these, those of the other experiment. The total number of branches, namely rk, gives all the combined possibilities.

The basic counting principle can be extended to any number of experiments in an obvious way. We shall now give some examples.

(i) If a die is tossed twice, then there are 6 × 6 = 36 possible outcomes.
(ii) If a person has 8 different shirts, 6 different ties, and 5 different jackets, then he can get dressed for an occasion in 8 × 6 × 5 = 240 ways.
(iii) If the purchaser of an automobile has a choice of 3 makes, 5 body styles, and 6 colors, then he can choose from 3 × 5 × 6 = 90 different models.
(iv) Suppose license plates are formed with three distinct letters followed by three distinct digits. Then there are 26 choices for the first letter, 25 for the second, and 24 for the third. Also, there are 10 choices for the first digit, 9 for the second, and 8 for the third. Therefore, there are 26 × 25 × 24 × 10 × 9 × 8 = 11,232,000 different license plates.
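The four sampling situations above can be enumerated directly. As it happens, Python's itertools module provides one constructor for each case, which makes the counts easy to verify for the letters a, b, c, d with n = 2 (a small sketch, not part of the original text):

```python
from itertools import permutations, combinations, product, combinations_with_replacement

letters = "abcd"  # M = 4 distinct objects, pick n = 2

cases = {
    "case 1 (no replacement, order)":    list(permutations(letters, 2)),
    "case 2 (no replacement, no order)": list(combinations(letters, 2)),
    "case 3 (replacement, order)":       list(product(letters, repeat=2)),
    "case 4 (replacement, no order)":    list(combinations_with_replacement(letters, 2)),
}

for name, outcomes in cases.items():
    print(name, len(outcomes))
# case 1: 4*3 = 12, case 2: 6, case 3: 4*4 = 16, case 4: 10
```

The counts 12, 6, 16, 10 match the four tables above (case 2 and case 4 are the entries on or above the diagonal).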
Case 2: Without replacement, without order (combinations)

We shall discuss this case in conjunction with case 1. We have seen that if we pick three letters out of a, b, c, and d, and if order is important, then we get 24 permutations. In the present case, however, we are not interested in order, and as such there are just 4 possibilities, namely, abc, abd, acd, and bcd. Each of these possibilities is called a combination. Among the 24 permutations of case 1, the first column consists of the permutations of the letters a, b, c and, as we know, there are 3! of these. This is why there are 3! = 6 arrangements in column 1. The same is true of columns 2, 3, and 4. Consequently, we get from our example that the number of combinations, multiplied by 3!, is the number of permutations.

Let us now take up the general case where we pick n objects without replacement from M distinct objects, where order is not important. Symbolically, we shall denote the number of ways of doing this by C(M, n) and call it the number of combinations of n objects from a set of M. Our objective is to derive an expression for C(M, n).

Towards this, we see that if a combination has n elements, then there are n! possible arrangements of its elements. Each combination gives rise to n! arrangements, thereby giving rise to all the permutations, namely, M(M - 1) . . . (M - n + 1). Hence we have

C(M, n) · n! = M(M - 1) . . . (M - n + 1) = M!/(M - n)!

Therefore,

C(M, n) = M!/(n!(M - n)!)

is the number of unordered samples of size n that can be drawn without replacement from M distinct objects.

For example:
(i) The number of ways of choosing a set of 3 books to read from a set of 8 books is C(8, 3) = 56. (Note that we are not interested in the order in which the books are read.)
(ii) The number of five-card hands that can be drawn from a deck of 52 cards is C(52, 5) = 2,598,960.
(iii) From a group of 8 seniors, 6 juniors, and 4 sophomores, there are C(18, 5) ways of picking a five-member committee.

For convenience, the following conventions are adopted:

C(M, 0) = 1, and C(M, n) = 0 if n < 0 or n > M

Comments. (1) Picking n objects out of M to form a group is tantamount to picking M - n objects out of M not to belong to the group. Thus, for example, the number of ways of choosing 3 books to read from a set of 8 books is the same as the number of ways of picking 5 books not to read from the 8. Therefore we always have

C(M, n) = C(M, M - n)

This can also be seen by observing that C(M, n) and C(M, M - n) are both equal to

M!/(n!(M - n)!)

(2) For any two real numbers x and y, the expansion of (x + y)^M can be written as

(x + y)^M = Σ_{n=0}^{M} C(M, n) x^n y^(M-n)

This is called the binomial expansion. Since C(M, n) occurs as the coefficient of x^n y^(M-n) in the binomial expansion, the numbers C(M, n), n = 0, 1, . . . , M, are called the binomial coefficients.

(3) If a set has M objects, then the number of different subsets of size n is C(M, n). This is because, as we know, order is not important in listing the members of a set.

(4) Setting x = y = 1 in the binomial expansion gives the following identity:

2^M = C(M, 0) + C(M, 1) + . . . + C(M, M)

This shows that the total number of subsets that can be formed from a set with M elements is 2^M. (Recall that we mentioned in Chapter 1 that the power set of a set with n elements has 2^n members.)
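The facts above (the value C(8, 3) = 56, the symmetry C(M, n) = C(M, M - n), the subset-count identity, and the binomial expansion) can all be confirmed with Python's math.comb. A minimal check, not part of the original text:

```python
from math import comb

assert comb(8, 3) == 56        # choosing 3 books out of 8
assert comb(52, 5) == 2598960  # five-card hands from a deck

M = 10
for n in range(M + 1):
    assert comb(M, n) == comb(M, M - n)               # symmetry C(M, n) = C(M, M - n)
assert sum(comb(M, n) for n in range(M + 1)) == 2**M  # total number of subsets is 2^M

# Binomial expansion checked at a numeric point, here x = 3, y = 2:
x, y = 3, 2
assert (x + y)**M == sum(comb(M, n) * x**n * y**(M - n) for n in range(M + 1))
print("all identities hold for M =", M)
```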
Case 3: With replacement, with order

The number of ways of picking n objects from M distinct objects is M^n when the objects are picked with replacement and when order is important. This is easy to see because at every draw there are M different choices.

For example:
(i) With the eight digits 1, 2, 3, 4, 5, 7, 8, 9, one can form 8^3 distinct three-digit numbers.
(ii) If there are M cells, then n objects can be placed in them in M^n ways. (We are assuming that a cell can have more than one object.) Placing an object in a cell amounts to picking one of the M cells, and allowing a cell to have more than one object amounts to sampling with replacement.
(iii) If 10 people are in a train which stops at 6 stations, then there are 6^10 possible ways that the 10 can get off the train. Notice that a person can get off at any one of the 6 stations, so that he has 6 choices. This is true of each of the 10 people. Also, if one person gets off at a station, it does not preclude other persons from getting off at that same station.

Case 4: With replacement, without order

The derivation of a general formula in this case is rather tricky and we shall not pursue the matter here. For our purpose it will suffice to know that the number of unordered samples of size n when objects are picked with replacement from M distinct objects is C(M + n - 1, n).

For example, the number of ways of placing n nondistinguishable balls into M cells is C(M + n - 1, n). (Try to see the analogy between the indistinguishable balls and the irrelevance of order.)

In summary, the number of ways of picking n objects from M distinct objects is:

(1) (M)_n = M(M - 1) . . . (M - n + 1), if sampling is ordered, without replacement
(2) C(M, n), if sampling is unordered, without replacement
(3) M^n, if sampling is ordered, with replacement
(4) C(M + n - 1, n), if sampling is unordered, with replacement.

Example 3.1. Suppose 5 cards are picked without replacement from a standard deck of 52 cards. What is the probability that there are 3 black cards and 2 red cards? We shall consider the following two cases:
(a) the cards are seen one by one;
(b) the cards are seen all at once.

Solution
(a) In this case we are interested in the order. Since we are picking 5 cards without replacement and the order is relevant, there are (52)_5 = 52 × 51 × 50 × 49 × 48 possible outcomes in the sample space. How many of these (52)_5 outcomes are favorable to the event that there are 3 black cards and 2 red cards? Let us call this event A. First of all, we observe that there are 5 locations, of which 3 are to be assigned to the black cards and 2 to the red cards. This can be done in C(5, 3) = 10 ways. Consider just one of these, and say we have black cards in the first, third, and fourth locations, and red cards in the second and fifth. There are 26 × 25 × 24 ways of filling the first, third, and fourth locations with the black cards and, corresponding to any of these, there are 26 × 25 ways to fill locations two and five with the red cards. Hence, by the basic rule of counting, there are C(5, 3) × 26 × 25 × 24 × 26 × 25 outcomes favorable to A. Hence

P(A) = C(5, 3) (26)_3 (26)_2 / (52)_5

(b) In this case order is not of interest. Hence there are C(52, 5) possible samples of size 5. Now there are C(26, 3) ways of picking 3 black cards out of the 26 black cards and, corresponding to each of these ways, there are C(26, 2) ways of picking the 2 red cards. Therefore, by the basic counting rule, there are C(26, 3) C(26, 2) possible samples of size 5 where each sample has exactly 3 black cards and 2 red cards. Consequently,

P(A) = C(26, 3) C(26, 2) / C(52, 5)

Comment. In Example 3.1 we see that P(A) is the same in both cases so that, for finding P(A), it does not make any difference whether we observe the cards all at once or one by one. It should be borne in mind that the event A specifies that there are so many cards of one kind (black) and so many of the other kind (red).

Example 3.2. If a person is dealt 13 cards from a standard deck of cards, what is the probability that he is dealt:
(a) the complete suit of spades;
(b) a complete suit?
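The two answers in Example 3.1 can be compared exactly with Python's fractions module; the ordered and unordered counts differ, but the two probabilities agree, as the Comment asserts. A sketch, not part of the original text (math.perm(n, k) is the falling factorial (n)_k):

```python
from fractions import Fraction
from math import comb, perm

# (a) ordered draws: C(5, 3) color patterns, falling factorials for each color
ordered = Fraction(comb(5, 3) * perm(26, 3) * perm(26, 2), perm(52, 5))

# (b) unordered hand: hypergeometric-style count
unordered = Fraction(comb(26, 3) * comb(26, 2), comb(52, 5))

assert ordered == unordered
print(float(ordered))  # both equal, approximately 0.3251
```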
Consider a lottery that sells 100 tickets and offers five cars and eight motorcycles as prizes, so that 87 of the tickets offer no prize. Suppose a person picks four of the tickets at random. Consequently,

P(A) = ...

(b) The event B specifies that the person picks 0 tickets from among those that offer a car, 0 from those that offer a motorcycle, and 4 from those that offer no prize. Hence

P(B) = C(5, 0) C(8, 0) C(87, 4) / C(100, 4)

(c) The event C is simply the complement of the event B above. Therefore,

P(C) = 1 - P(B) = 1 - C(87, 4) / C(100, 4)

In the following, we present a slight variation of the hypergeometric probabilities, in that the sampling is carried out with replacement. To start with, consider the following example.

Example 3.9. Suppose 5 cards are picked from a standard deck of 52 cards with replacement. What is the probability that there are 3 black cards and 2 red cards?

Solution. Since we are picking 5 cards with replacement, there are 52^5 ways of doing this. We shall next find the number of outcomes favorable to the event A representing 3 black cards and 2 red cards. To find this number, we note that there are 5 locations, of which 3 are to be assigned to the black cards and 2 to the red cards. There are C(5, 3) = 10 ways of doing this. Consider one of these ways, and suppose we have black cards in the first, third, and fourth locations, and red cards in the second and fifth. There are 26 × 26 × 26 ways of filling the first, third, and fourth locations with the black cards and, corresponding to any of these, there are 26 × 26 ways of filling locations two and five with the red cards. Applying the basic counting rule, there are C(5, 3) × 26^3 × 26^2 outcomes favorable to A. Consequently,

P(A) = C(5, 3) 26^3 26^2 / 52^5 = C(5, 3) (1/2)^3 (1/2)^2
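Since each draw in Example 3.9 is with replacement and half the deck is black, the count C(5, 3) · 26^3 · 26^2 over 52^5 collapses to C(5, 3)(1/2)^5. A quick exact check, not part of the original text:

```python
from fractions import Fraction
from math import comb

p_count = Fraction(comb(5, 3) * 26**3 * 26**2, 52**5)  # favorable / total
p_binom = Fraction(comb(5, 3), 2**5)                   # C(5,3) * (1/2)^3 * (1/2)^2

assert p_count == p_binom
print(p_count)  # -> 5/16
```

The value 5/16 = 0.3125 is close to, but not the same as, the without-replacement answer of Example 3.1 (about 0.3251).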
The problem boils down to the following: There are ... 3 courses and we are picking without replacement. Notice that there is only 1 course in probability for the ... students to pick from, and there are 2 courses in statistics for the 7 students to pick from. The probability of the desired event is therefore equal to ...

In the rest of this section we shall consider miscellaneous examples which unify different ideas developed thus far.

Example 3.11. Find the probability that in a bridge game North, East, South, and West get, respectively, i, j, k, and l spades (i + j + k + l = 13).

Solution. The number of ways of dealing 13 cards to one player is C(52, 13). There are 39 cards left, from which the second player can receive 13 cards in C(39, 13) ways. Continuing the argument, the third player can be dealt 13 cards in C(26, 13) ways and, finally, the fourth player can be dealt the remaining cards in C(13, 13) ways. By the basic counting rule, there are

C(52, 13) C(39, 13) C(26, 13) C(13, 13) = 52!/(13! 13! 13! 13!)

ways to deal four bridge hands.

Example 3.12. In a bridge game, find the probability that North gets exactly k aces.

Solution. The required probability is C(4, k) C(48, 13 - k) / C(52, 13): of North's 13 cards, the k aces can be chosen in C(4, k) ways and the remaining 13 - k cards in C(48, 13 - k) ways. We observe that this probability is the same as the probability that an arbitrary hand of 13 cards contains exactly k aces.

Example 3.13. Find the probability that eight players on a team will all have their birthdays on:
(a) Monday or Tuesday (but not all on one day)
(b) exactly two days of the week.

Solution. There are 7 days of the week on which each of the players could be born. Hence there are 7^8 possibilities.
(a) If each person is born on Monday or Tuesday, then each person has two choices of days, and as a result there are 2^8 possible ways this can happen. However, the players cannot all have birthdays on Monday, nor all on Tuesday. Therefore, there are 2^8 - 2 outcomes favorable to the event, and consequently the desired probability is equal to (2^8 - 2)/7^8.
(b) There are C(7, 2) ways of picking 2 days out of 7. Hence the probability of having all of the birthdays on exactly 2 days of the week is

C(7, 2)(2^8 - 2)/7^8
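The two answers of Example 3.13 can be evaluated exactly. A sketch with rational arithmetic, not part of the original text:

```python
from fractions import Fraction
from math import comb

total = 7**8       # each of the 8 players is born on one of 7 days
fav_a = 2**8 - 2   # Monday/Tuesday only, minus the two all-on-one-day cases

p_a = Fraction(fav_a, total)
p_b = Fraction(comb(7, 2) * fav_a, total)  # choose the 2 days, then as in (a)

print(p_a, p_b)  # -> 254/5764801 762/823543
```

(Fraction reduces 21 · 254 / 7^8 by the common factor 7, which is why part (b) prints with denominator 7^7.)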
In the following, n is a nonnegative integer. Use the fact that

(1 + x)^n = Σ_{k=0}^{n} C(n, k) x^k, for any x,

to establish the following identities:
(a) Σ_{k=0}^{n} (-1)^k C(n, k) = 0
(b) Σ_{k=0}^{n} k C(n, k) = n 2^(n-1)   Hint: Differentiate and set x = 1.
(c) ... = 0
(d) ...   Hint: Consider (1 + x)^(m+n) = (1 + x)^m (1 + x)^n.

5. A die is tossed six times and a coin is tossed four times. How many outcomes are there in the sample space?

6. On a test, a student has a choice of answering eight out of ten questions.
(a) How many ways are there to answer the test?
(b) How many ways are there if questions 1 and 2 are obligatory?

7. Find how many positive, integral divisors 3500 has.

8. Find the number of ways of arranging the letters of the word SUCCESSION.

9. If the letters of the word VOLUMES are arranged in all possible ways, find the probability that:
(a) the word ends with a vowel
(b) the word starts with a consonant and ends with a vowel.

10. Two digits are picked at random, without replacement, from the digits 1 through 9. Find the probability that the digits are consecutive digits.

11. Two numbers are picked without replacement from the following numbers: 8, 9, 11, 12, 17, 18. Find the probability that they are relatively prime.

12. A box contains eight balls marked 1, 2, 3, . . . , 8. If four balls are picked at random, find the probability that the balls marked 1 and 5 are among the four selected balls.

13. A laundry bag contains four black and eight white gloves. If two gloves are picked one by one at random, what is the probability that:
(a) they are both black
(b) they are of the same color.

14. A box contains ten items, of which four are defective. If three items are picked without replacement, find the probability that:
(a) all are defective
(b) exactly two are defective
(c) at most two are defective
(d) at least two are defective.

15. From an ordinary deck of 52 cards, 4 cards are picked at random. Find the probability that:
(a) exactly one is an ace
(b) exactly one is a face card
(c) all are black cards
(d) each is from a different suit
(e) at least two are aces.

16. Find the probability that a hand of five cards selected from a standard deck has:
(a) an ace, king, queen, jack, and ten of spades
(b) an ace, king, queen, jack, and ten of the same suit
(c) an ace, king, queen, jack, and ten.

17. Two cards are drawn, with replacement, from an ordinary deck of 52 cards. Find the probability that both cards belong to the same suit.

18. Work exercise 17, this time assuming that the cards are drawn without replacement.

19. In a five-card poker hand, find the probability that:
(a) there are three kings and two aces
(b) there are exactly three kings
(c) there are exactly three kings and one ace
(d) there is one ace and at least three kings
(e) there are at least three kings.

20. From a group of 10 lawyers, 8 doctors, 6 businessmen, and 9 professors, a committee of six is selected at random. Find the probability that:
(a) the committee consists of 2 lawyers, 2 doctors, one businessman, and one professor
(b) the committee contains no lawyers
(c) the committee contains at least one lawyer.

21. From a group of 5 lawyers, 7 accountants, and 9 doctors, a committee of three is selected at random. What is the probability that the committee has more lawyers than doctors?

22. Suppose eight books are arranged on a shelf in a random order. What is the probability that three particular books will be next to each other?

23. The numbers 1, 2, . . . , n are arranged in all possible ways. Assuming that all the arrangements are equally likely, find the probability that:
(a) 1, 2, 3, and 4 appear next to each other in the order indicated
(b) 1, 2, 3, and 4 appear next to each other.

24. Suppose 4 letters are placed at random in 4 addressed envelopes. What is the probability that exactly k of the letters are in their correct envelope, k = 0, 1, 2, 3, 4.

25. Eight cards are dealt from a standard deck of 52 cards. Find the probability of obtaining either 3 aces or 3 kings or 3 queens. (This does not preclude, for instance, getting 3 aces and 3 kings.)

26. Suppose 13 cards are dealt from a standard deck of 52 cards. Find the probability that the hand will contain all the face cards in at least one suit.
An elevator stops at ten floors. If there are ... people in the elevator, find the probability that ...

3

Conditional Probability
and Independent Events

(i) What is the probability that the person is in the United States? Assuming that there are 200 million people in the United States and 3 billion people in the world, the answer would be 200/3000 = 1/15.
(ii) Given that the person is in Australia, what is the probability that he is in the United States? The answer would be, obviously, 0.
(iii) Given that the person is in Iowa, what is the probability that he is in the United States? Here the probability is, of course, 1.

We see from the above three situations that prior knowledge of the person's location influences the probability of his being found in the United States. Thus it often happens that partial information is available about the outcome of the underlying experiment and this, in turn, leads to appropriate adjustment of the probabilities of the associated events. In summary, the notion of conditional probability involves the probability of an event, say A, given the information that an event B has occurred.
Example 1.7. Three balls are picked at random one by one and without replacement from a box containing four white and eight black balls. Let

A = The first ball is white
B = The second ball is white
C = The third ball is white

Find: (a) P(A) (b) P(B|A) (c) P(C|AB)

Solution
(a) P(A) = 4/12 = 1/3, since 4 of the 12 balls are white.
(b) Given that the first ball is white, there are 11 balls left, of which 3 are white. Hence P(B|A) = 3/11.
(c) We have P(C|AB) = P(ABC)/P(AB). Now P(ABC) = (4 × 3 × 2)/(12 × 11 × 10) and P(AB) = (4 × 3 × 10)/(12 × 11 × 10), so that

P(C|AB) = [(4 × 3 × 2)/(12 × 11 × 10)] / [(4 × 3 × 10)/(12 × 11 × 10)] = 2/10

In many situations this will be the nature of our argument in finding conditional probabilities. For instance, one can observe directly that, given that the first 2 balls are white, there are 10 balls left, of which 2 are white. Wherever convenient, the reader should adopt this type of argument. It is assumed that he is aware of the background leading to such an argument.

Example 1.8. Three balls are picked at random one by one from a box containing four white and eight black balls. Find the following probabilities:
(a) The first and third ball are white.
(b) The third ball is white, given that the first ball is white.

Solution. Let A = The first ball is white, B = The second ball is white, and C = The third ball is white. Then ...

Suppose (S, F, P) is a probability space and B is a fixed event with P(B) > 0. We have just defined the concept of conditional probability. In essence we have introduced a function P(·|B) which, for any event A (that is, A in F), is given by P(A|B) = P(AB)/P(B). To make the following result more appealing, we write P_B(A) for P(A|B). One can verify that:
(i) P_B(A) ≥ 0 for every event A
(ii) P_B(S) = 1
(iii) if A_1, A_2, . . . are mutually exclusive events, then P_B(∪_i A_i) = Σ_i P_B(A_i)

The properties (i) and (ii) are obvious, and we shall prove only (iii). We have

P_B(∪_i A_i) = P(∪_i A_i | B) = P((∪_i A_i)B)/P(B) = P(∪_i (A_i B))/P(B), by the distributive rule

Now, since the events {A_i} are mutually exclusive, so are the events {A_i B}. Hence

P_B(∪_i A_i) = Σ_i P(A_i B)/P(B) = Σ_i P(A_i|B) = Σ_i P_B(A_i)

It follows from properties (i), (ii), (iii) that the function P_B, that is, P(·|B), satisfies all the axioms of a probability function, and hence is a probability measure.

Comment. For completeness, we should add the following, keeping in mind that our treatment of the subject will not be impaired if such fine details are relegated to obscurity: Conditioning on the event B amounts to choosing B as the new sample space. As such, the appropriate sigma field is a sigma field of subsets of B. Denoting this sigma field by F_B, it is given by F_B = {AB | A in F}.
Example 1.9. Suppose A and B are two events with P(A|B) = 0.4, P(A|B') = 0.5, and P(B) = 0.2. Find: (a) P(A'|B) (b) P(A'|B') (c) P(A'B') (d) P(A ∪ B).

Solution. We have:
(a) P(A'|B) = 1 - P(A|B) = 1 - 0.4 = 0.6
(b) P(A'|B') = 1 - P(A|B') = 1 - 0.5 = 0.5
(c) P(A'B') = P(B') P(A'|B') = 0.8 × 0.5 = 0.4
(d) P(A ∪ B) = 1 - P((A ∪ B)') = 1 - P(A'B') = 1 - 0.4 = 0.6

We could have found P(A ∪ B) in an alternate way by noting that P(A ∪ B) = P(AB') + P(B). (Draw a Venn diagram to see this.)

Example 1.10. The probability that a person passes a test on the first attempt is 0.5; the probability that he passes on the second attempt is 0.7 (given, of course, that he failed on the first attempt); and the probability that he passes on the third attempt is 0.8 (given that he failed on the first two attempts). If the person is allowed three attempts, what is the probability that he will pass the test?

Solution. Let A_i represent the event that the person passes the test on the ith attempt, i = 1, 2, 3. Then A_1 ∪ (A_1'A_2) ∪ (A_1'A_2'A_3) is the event that the person passes the test. The probability of this event is equal to P(A_1) + P(A_1'A_2) + P(A_1'A_2'A_3) (why?). Next,

P(A_1) = 0.5,   P(A_1'A_2) = P(A_1') P(A_2|A_1') = 0.5 × 0.7 = 0.35

and

P(A_1'A_2'A_3) = P(A_1') P(A_2'|A_1') P(A_3|A_1'A_2') = 0.5 × 0.3 × 0.8 = 0.12

Hence the probability of passing the test is equal to

0.5 + 0.35 + 0.12 = 0.97

Example 1.11. A number is picked at random from the integers 1, 2, 3, . . . , 1000. If the number is known to be divisible by four, what is the probability that:
(a) it is divisible by six or eight?
(b) it is divisible by six, but not by eight?
(c) it is divisible by exactly one of the integers six, eight?

Solution. Let B represent the event that the number is a multiple of 4, A_1 that it is a multiple of 6, and A_2 that it is a multiple of 8.
(a) We want to find P(A_1 ∪ A_2|B) which, as we know, is equal to P(A_1|B) + P(A_2|B) - P(A_1A_2|B). Now

P(A_1|B) = P(A_1B)/P(B) = (83/1000)/(250/1000) = 83/250
P(A_2|B) = P(A_2B)/P(B) = (125/1000)/(250/1000) = 125/250
P(A_1A_2|B) = P(A_1A_2B)/P(B) = (41/1000)/(250/1000) = 41/250

(A number is divisible by both 4 and 6 precisely when it is divisible by 12, and there are 83 multiples of 12 among 1, . . . , 1000; similarly, there are 125 multiples of 8 and 41 multiples of 24.) Hence

P(A_1 ∪ A_2|B) = 83/250 + 125/250 - 41/250 = 167/250

(b) In this case we want to find P(A_1A_2'|B); this is equal to P(A_1|B) - P(A_1A_2|B). Hence

P(A_1A_2'|B) = 83/250 - 41/250 = 42/250 = 21/125

(c) The probability that the number is divisible by exactly one of six, eight is

P(A_1|B) + P(A_2|B) - 2P(A_1A_2|B) = (83 + 125 - 82)/250 = 126/250 = 63/125

EXERCISES-SECTION 1

1. Suppose probabilities are assigned to the simple events of S = {s_1, s_2, s_3, s_4, s_5, s_6} as follows:

P({s_1}) = 2P({s_2}) = 3P({s_3}) = 4P({s_4}) = 5P({s_5}) = 6P({s_6})

Find the conditional probability of:
(a) {s_1, s_3, s_4} given {s_2, s_3}
(b) {s_2, s_3} given {s_1, s_3, s_4}
(c) {s_1, s_2, s_3, s_4} given {s_2, s_4, s_5, s_6}
(d) {s_1, s_4} given {s_1, s_2, s_3, s_4}.

2. Suppose A and B are two events with P(A|B) = 0.3, P(A'|B') = 0.4, and P(B) = 0.7. Find:
(a) P(A|B') (b) P(A) (c) P(B|A)

3. Suppose P(A_1|B) = 0.7, P(A_2|B) = 0.4, and P(A_1A_2|B) = 0.3. Given that B has occurred, find the probability that:
(a) at least one of the events A_1, A_2 occurs
(b) exactly one of the events A_1, A_2 occurs
(c) only A_1 occurs.

4. Suppose P(A_1|B) = 0.5, P(A_1A_2|B) = 0.3, and P(A_1'A_2'|B) = 0. Given the event B, find the probability that exactly one of the events A_1, A_2 occurs.
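Both Example 1.10 and Example 1.11 are easy to confirm mechanically: the first by multiplying out the conditional chain, the second by enumerating the integers 1 to 1000. A sketch, not part of the original text:

```python
from fractions import Fraction

# Example 1.10: pass on 1st, or fail-then-pass on 2nd, or fail-fail-pass on 3rd
p_pass = 0.5 + 0.5 * 0.7 + 0.5 * 0.3 * 0.8
assert abs(p_pass - 0.97) < 1e-12

# Example 1.11: condition on divisibility by 4
B = [n for n in range(1, 1001) if n % 4 == 0]          # 250 numbers
a = sum(1 for n in B if n % 6 == 0 or n % 8 == 0)      # six or eight
b = sum(1 for n in B if n % 6 == 0 and n % 8 != 0)     # six but not eight
c = sum(1 for n in B if (n % 6 == 0) != (n % 8 == 0))  # exactly one of six, eight

print(Fraction(a, len(B)), Fraction(b, len(B)), Fraction(c, len(B)))
# -> 167/250 21/125 63/125
```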
EXERCISES-SECTION 2

1. Three boxes contain white, red, and black balls in the numbers given below. A box is picked at random, and a ball is drawn from it at random.
Example 3.3. It is known from past experience that the probability that a person has cancer is 0.2, and the probability that he has heart disease is 0.1. If the two events are independent, what is the probability that a person has (a) at least one ailment? (b) precisely one ailment?

Solution. Let A = The person has cancer, and B = The person has heart disease.
(a) We want to find P(A ∪ B). Now

P(A ∪ B) = P(A) + P(B) - P(AB)
         = P(A) + P(B) - P(A) · P(B), since the events are independent
         = 0.2 + 0.1 - 0.2 × 0.1 = 0.28

(b) The probability of precisely one ailment is equal to P(A) + P(B) - 2P(AB) = 0.26.

Suppose, on the other hand, P(A) = 1/2, P(B) = 1/3, and P(AB) = 0. Then P(AB) ≠ P(A) · P(B), so that A and B are not independent events.

If A and B are two events with P(A) > 0 and P(B) > 0, then, as can be seen immediately:
(i) if A and B are independent, they cannot be mutually exclusive; and
(ii) if A and B are mutually exclusive, they cannot be independent.

Example 3.4. Suppose we draw a card from a standard bridge deck of cards. Give two events B and D which are:
(a) mutually exclusive and independent
(b) mutually exclusive, but not independent
(c) independent, but not mutually exclusive
(d) neither independent nor mutually exclusive

Solution
(a) Let
D = The card is an ace
B = The card is a black card of hearts
(Since B is impossible, P(B) = 0; the events cannot occur together, and P(DB) = 0 = P(D) · P(B).)
(b) Let
D = The card is an ace of spades
B = The card is a 4 of a red suit
(c) Let
D = The card is an ace
B = The card is a spade
(d) Let
D = The card is a king
B = The card is a face card

Comment. The reader should be aware of the distinction between independent events and mutually exclusive events. The two concepts are often confused. Two events are mutually exclusive when they are not compatible; that is, they cannot occur together. "Mutually exclusive" is a property of sets: in this case AB = ∅, so that P(AB) = 0. The occurrence of one event precludes the occurrence of the other. Two events are independent when the occurrence of one event does not influence the occurrence of the other; thus no inference can be drawn regarding the occurrence of one event on the basis of the knowledge of the occurrence of the other. Independence is a property of the probability measure: in this case P(AB) = P(A) · P(B). As a matter of fact, the following result shows how divergent the two concepts are:

If A and B are independent events, then
(i) A and B' are independent
(ii) A' and B are independent
(iii) A' and B' are independent

We shall prove (i) and leave the other cases to the reader. We want to show that P(AB') = P(A) · P(B').
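The distinction drawn in the Comment can be made concrete on the 52-card deck of Example 3.4: "ace" and "spade" are independent but not mutually exclusive, while "ace" and "king" are mutually exclusive but not independent. A sketch, not part of the original text:

```python
from fractions import Fraction
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = list(product(ranks, suits))  # 52 equally likely cards

def prob(event):
    return Fraction(sum(1 for card in deck if event(card)), len(deck))

ace   = lambda c: c[0] == "A"
spade = lambda c: c[1] == "spades"
king  = lambda c: c[0] == "K"

# independent, not mutually exclusive: P(ace and spade) = P(ace) P(spade)
assert prob(lambda c: ace(c) and spade(c)) == prob(ace) * prob(spade)

# mutually exclusive, not independent: intersection empty, product positive
assert prob(lambda c: ace(c) and king(c)) == 0
assert prob(ace) * prob(king) > 0
print("checks pass")
```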
Conditional Probability and Independent Events / 91

With this assignment we see that the events A and B are independent. As a matter of fact, it can be seen that any event determined only by the first toss and any event determined only by the second toss are independent. In order for the trials to be independent, this is the only way to assign probabilities to the outcomes of the composite experiment.

To generalize from the above discussion, consider an experiment consisting of n identical trials, each trial defined by the sample space S with a finite number of outcomes. Let P₁ be the probability measure of the events of S. The sample space appropriate for the composite experiment consisting of n trials is the Cartesian product Sⁿ, where

Sⁿ = {(s₁, s₂, …, sₙ) | sᵢ is the outcome of the ith trial, i = 1, 2, …, n}

An event B (that is, a subset of Sⁿ) is said to be determined by the ith trial if

B = S × S × … × S × C × S × … × S

with C in the ith position. Thus, if

B₁ = C₁ × S × S × … × S
B₂ = S × C₂ × S × … × S
⋮
Bₙ = S × S × … × S × Cₙ

then

P(B₁) = P₁(C₁),  P(B₂) = P₁(C₂),  …,  P(Bₙ) = P₁(Cₙ)

and

P(B₁ ∩ B₂ ∩ … ∩ Bₙ) = P₁(C₁) · P₁(C₂) · … · P₁(Cₙ)

Comment. What is the benefit of all this discussion of independent trials? The important fact is that if the trials are independent, then we can compute the probabilities of the events in the composite experiment on the basis of the probabilities of the events in the basic experiment. For instance, if we want to find the probability that, in rolling a fair die three times, we get an even number on the first toss, a 5 on the second toss, and a multiple of 3 on the third toss, we do not have to consider the set of triplets {(x, 5, z) | x an even number, z a multiple of 3} from among the 6³ outcomes in the composite experiment. Instead, we can argue as follows: the probability of getting an even number on a roll of a die is 1/2, of getting a 5 is 1/6, and of getting a multiple of 3 is 1/3; consequently, the probability of the desired event is (1/2) · (1/6) · (1/3) = 1/36.

At least one, and exactly k, of n independent events

We open this discussion with the following example:

Example 3.11. Suppose A, B, C are mutually independent events with P(A) = P(B) = P(C) = p. Find the probability that (a) exactly k (k = 0, 1, 2, 3) of the events occur, (b) at least one of the events occurs.

Solution
(a) We shall calculate only the case k = 2. We see that {exactly two of the events occur} = (A ∩ B ∩ C′) ∪ (A ∩ B′ ∩ C) ∪ (A′ ∩ B ∩ C), so that, by independence, the probability is 3p²(1 − p).
Basic Probability Theory and Applications

(Why?) Hence we are given that

1 − (1 − p)ᵏ = 0.999936

that is,

(1 − p)ᵏ = 0.000064

Hence 1 − p = 0.2 and, consequently, p = 0.8.
(d) The probability function of X is given as

P(X = k) = C(…, k)(0.8)ᵏ(0.2)^(…−k)

(b) Here we want P(X ≥ 3). We have … = 0.9830, using the table.

b(k; n, p) increases with k if k < (n + 1)p and decreases with k if k > (n + 1)p. If (n + 1)p is an integer, say equal to m, then b(m − 1; n, p) = b(m; n, p). The integral part of the number (n + 1)p represents the most probable number of successes. If (n + 1)p is an integer m, the largest value of the probability b(k; n, p) is attained for the two integers m − 1 and m.

For instance:
(a) Suppose n = 20 and p = 0.30. Then (n + 1)p = 6.3, so that b(k; 20, 0.3) increases monotonically as k goes from 0 to 6 and then decreases as k goes from 6 to 20.
(b) Suppose n = 24 and p = 0.4. Since (n + 1)p = 10, an integer, b(k; 24, 0.4) increases as k goes from 0 to 9 and decreases as k goes from 10 to 24, with b(9; 24, 0.4) = b(10; 24, 0.4).
(c) Consider the graphs of binomial probabilities in Figure 1.3. Figure 1.3(a) corresponds to n = 10, p = 0.25; in this case, (n + 1)p = 2.75 and the maximum value is attained for k = 2, the integral part of 2.75. Figure 1.3(b) corresponds to n = 5, p = 0.50. Here (n + 1)p = 3.0, an integer, and the maximum value is attained for k = 2 and k = 3.

Example 1.6. Thirteen machines are in operation. The probability that, at the end of one day, a machine is still in operation is …. If the machines function independently, find the most probable number of machines in operation at the end of that day and the probability that that many machines are operating.
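The two numerical claims in (a) and (b) above are easy to confirm by computing b(k; n, p) directly:

```python
# b(k; n, p): binomial probability of exactly k successes in n trials.
from math import comb

def b(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# (a) n = 20, p = 0.30: (n+1)p = 6.3, so the most probable value is k = 6.
assert max(range(21), key=lambda k: b(k, 20, 0.30)) == 6

# (b) n = 24, p = 0.4: (n+1)p = 10, so b(9; 24, 0.4) = b(10; 24, 0.4),
#     and the maximum is attained at both k = 9 and k = 10.
assert abs(b(9, 24, 0.4) - b(10, 24, 0.4)) < 1e-12
```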
1.4 The Geometric Distribution

The geometric distribution finds applications in situations of the following nature: A person tosses a coin until heads show up for the first time; or a basketball player attempts a basket until he scores one; or a billiards player keeps shooting until he misses a shot. Thus, as an idealization describing these situations, the experiment consists of a sequence of independent Bernoulli trials with probability of success p on any trial, where 0 < p < 1, and the random variable X represents the number of trials required for the first success to occur. The random variable is commonly called a geometric random variable; it is also referred to as the waiting time for the first success. It should be realized that, unlike the binomial distribution, where the number of trials is fixed, in the present case the number of trials is the random variable of interest.

The possible values of X are obviously 1, 2, 3, …, and

{X = r} = {the first r − 1 trials are failures and the rth trial is a success}

Therefore, since the trials are independent,

P(X = r) = (1 − p)^(r−1) p,  r = 1, 2, 3, …

Solution
(a) The distribution of X is clearly geometric. Since p = 0.002, we have

P(X = r) = (1 − 0.002)^(r−1)(0.002) = (0.998)^(r−1)(0.002),  r = 1, 2, …
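A small sketch confirms that this probability function behaves as a pmf; the truncation point 20000 used to approximate the infinite sum is an arbitrary choice of ours.

```python
# Geometric pmf with p = 0.002: P(X = r) = (0.998)^(r-1) * 0.002.
p = 0.002
pmf = lambda r: (1 - p)**(r - 1) * p

assert abs(pmf(1) - 0.002) < 1e-15

# Partial sums of a geometric series: sum_{r=1}^{N} pmf(r) = 1 - (1-p)^N,
# so the truncated sum plus the tail mass should be 1.
N = 20000
partial = sum(pmf(r) for r in range(1, N + 1))
assert abs(partial - (1 - (1 - p)**N)) < 1e-9
```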
Hence

The pdf and the D.F. for the Cauchy distribution with b = 0 and a = 1 are drawn in Figure 2.13. The reader will see a close resemblance between the above graphs and those for the normal distribution. However, it should be realized that the two distributions are quite different.

Figure 2.13
Functions of a Random Variable

… where h is a real-valued function from R to R. For any point s ∈ S, h(X)(s) is defined by h(X(s)). We shall not verify the details here, but since h(X) satisfies the definition of an r.v., it follows that h(X) is a random variable.

Let us denote the random variable h(X) by Y. Now, as we are aware, an r.v. induces a probability measure on the Borel sets of the real line. In our discussion, there are two random variables involved, namely X and Y, and these will induce two probability measures which, using our previous notation, we shall denote respectively by P_X and P_Y. Thus, for any Borel set B of the real line, we have

P_X(B) = P({s | X(s) ∈ B})

and

P_Y(B) = P({s | Y(s) ∈ B}) = P({s | h(X)(s) ∈ B})

The question is, how do the two probability measures P_X and P_Y relate to each other? To answer this, suppose C is a Borel set of the real line. Let

B = {x ∈ R | h(x) ∈ C},  A = {s ∈ S | X(s) ∈ B}

A is a subset of S that consists precisely of preimages (under X) of the members of B, which, in turn, is a subset of R and consists precisely of preimages (under h) of the members of C. (See Figure 1.3.)

Figure 1.1  Figure 1.2  Figure 1.3
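The relation P_Y(C) = P_X(h⁻¹(C)) is easiest to see in the discrete case, where the pmf of Y = h(X) is obtained by accumulating the pmf of X over preimages. The five-point distribution and the choice h(x) = x² below are our own illustrative assumptions.

```python
# Pushforward of a discrete distribution: p_Y(c) = sum of p_X over h^{-1}({c}).
from collections import defaultdict

p_X = {x: 0.2 for x in (-2, -1, 0, 1, 2)}   # X uniform on {-2,-1,0,1,2}
h = lambda x: x * x                          # Y = h(X) = X^2

p_Y = defaultdict(float)
for x, pr in p_X.items():
    p_Y[h(x)] += pr                          # accumulate P_X over preimages

assert abs(p_Y[4] - 0.4) < 1e-12   # h^{-1}({4}) = {-2, 2}
assert abs(p_Y[0] - 0.2) < 1e-12   # h^{-1}({0}) = {0}
assert abs(sum(p_Y.values()) - 1.0) < 1e-12
```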
Basic Probability Theory and Applications — Expectation—A Single Variable / 243

… expected number of defective items, given that there are at most three defective items.

If X has a continuous distribution with the pdf given by

f(x) = …,  0 < x < 2;  0, elsewhere

then E(X) = b − ∫ F(x) dx.

The proof depends on the easily proved fact that if the distribution of X is symmetric about a, then X − a and −X + a have the same distribution. (See Chapter 6, exercise 2 on page 217.) Therefore,

E(X − a) = E(−X + a)
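Combining this equality with the linearity of expectation yields the conclusion E(X) = a:

```latex
E(X - a) = E(-X + a)
\;\Longrightarrow\; E(X) - a = -E(X) + a
\;\Longrightarrow\; 2\,E(X) = 2a
\;\Longrightarrow\; E(X) = a .
```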
Distribution      Parameters               p(x) or f(x)                                              Expectation   Variance

Binomial          0 < p < 1; n = 1, 2, …   p(x) = C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, …, n          np            np(1 − p)
Geometric         0 < p < 1                p(x) = p(1 − p)^(x−1),  x = 1, 2, …                        1/p           (1 − p)/p²
Poisson           λ > 0                    p(x) = e^(−λ) λ^x / x!,  x = 0, 1, …                       λ             λ
Hypergeometric*   N = 1, 2, …; n; p        p(x) = C(Np, x)C(N − Np, n − x)/C(N, n),  x = 0, 1, …, n   np            np(1 − p)(N − n)/(N − 1)
Normal            −∞ < a < ∞; b > 0        f(x) = (1/(b√(2π))) e^(−(x−a)²/2b²),  −∞ < x < ∞           a             b²
Exponential       λ > 0                    f(x) = λe^(−λx), x > 0;  0, elsewhere                      1/λ           1/λ²
Gamma*            λ > 0; p > 0             f(x) = (λ^p/Γ(p)) x^(p−1) e^(−λx), x > 0;  0, elsewhere    p/λ           p/λ²

*The computation of the expectation and variance are left to the exercises. (Also, an alternate method of computation will be given for these cases in Chapter 11.)
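The tabulated expectations and variances for the discrete entries can be verified by direct summation; the parameter values below are arbitrary choices made for the check.

```python
# Verify mean/variance formulas for binomial, geometric, and Poisson pmf's.
from math import comb, exp, factorial

def mean_var(pmf_pairs):
    m = sum(x * p for x, p in pmf_pairs)
    v = sum((x - m)**2 * p for x, p in pmf_pairs)
    return m, v

# Binomial(n, p): mean np, variance np(1-p)
n, p = 12, 0.3
m, v = mean_var([(x, comb(n, x) * p**x * (1 - p)**(n - x)) for x in range(n + 1)])
assert abs(m - n * p) < 1e-12 and abs(v - n * p * (1 - p)) < 1e-12

# Geometric(p): mean 1/p, variance (1-p)/p^2 (infinite sum truncated)
p = 0.25
m, v = mean_var([(x, p * (1 - p)**(x - 1)) for x in range(1, 400)])
assert abs(m - 1 / p) < 1e-6 and abs(v - (1 - p) / p**2) < 1e-6

# Poisson(lam): mean lam, variance lam (truncated)
lam = 3.5
m, v = mean_var([(x, exp(-lam) * lam**x / factorial(x)) for x in range(80)])
assert abs(m - lam) < 1e-9 and abs(v - lam) < 1e-9
```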
Joint and Marginal Distributions / 291

10. If X and Y have a joint distribution F, show that

F_X(x) + F_Y(y) − 1 ≤ F(x, y) ≤ √(F_X(x) · F_Y(y))

for all x, y. Hint: Consider the regions of the xy-plane …

11. If the distribution function of X and Y is such that F(x, y) = u(x) · v(y) for every x, y (u is a function of x only and v is a function of y only), show that F(x, y) = F_X(x) · F_Y(y).

12. The joint D.F. of X and Y is given by

F(x, y) = { 0,            x < 0 or y < 0
            …,            0 ≤ x < 1 and 0 ≤ y < 1
            ½(x² + x),    0 ≤ x < 1 and y ≥ 1
            ½(y² + y),    x ≥ 1 and 0 ≤ y < 1
            1,            x ≥ 1 and y ≥ 1 }

Find: (a) P(… < X ≤ 1, …), (b) P(… < X ≤ ½, Y > ½), (c) the joint pdf of X and Y, (d) P(X > 2Y), (e) P(X + Y < 1).

13. For each of the discrete probability functions given below, obtain the joint distribution function.
(a) p(x, y) = { (1/15)(2x − y + 1), x = 0, 1, 2, y = 0, 1;  0, elsewhere }
(b) p(x, y) = { (1/39) x(x + y), x = 2, 3, y = −1, 0, 1;  0, elsewhere }
(c) p(x, y) = { …, x = 1, 2, 3, y = 1, 2;  0, elsewhere }

14. For each of the joint pdf's given below, obtain the joint distribution function.
(a) f(x, y) = { …, 0 < x < 1, 0 < y < 1;  0, elsewhere }
(b) f(x, y) = { …, x > 0, y > 0;  0, elsewhere }

15. Show why the function p given below cannot represent a joint probability function for any choice of c.

p(x, y) = { cx(2x − y), x = 0, 1, 2, y = 0, 3;  0, elsewhere }

16. Show why the function f defined by

f(x, y) = { cx(3x − y), 0 < x < 2, −1 < y < 4x;  0, elsewhere }

cannot represent a pdf for any choice of c.

17. The functions p defined below represent joint probability functions for appropriate choice of the constant c. Determine c.
(a) p(x, y) = { …, x = 0, 2, y = −2, −1;  0, elsewhere }
(b) p(x, y) = { c|x − y|, x = −2, 0, 2, y = −2, 3;  0, elsewhere }

18. Determine for what constant c the functions given below will represent pdfs.
(a) f(x, y) = { c(… + 3y), 0 < x < 1, 0 < y < 1;  0, elsewhere }
(b) f(x, y) = { …;  0, elsewhere }
(c) f(x, y) = { cxy, 0 < y < 2, …;  0, elsewhere }
(d) f(x, y) = { …;  0, elsewhere }
(e) f(x, y) = { ce^…, y < …, x < 2y;  0, elsewhere }
(f) f(x, y) = { …;  0, elsewhere }

19. Two random variables X and Y have a joint discrete distribution with probability function p(x, y) as described in the following table:

          y = −3   y = 0   y = 2   y = 4
  x = …     …        0       …       …
  x = …     0        …       0       …
  x = …     …        0       …       …

Find:
(a) P(X + Y < 2)  (b) P(X > 2Y)
Basic Probability Theory and Applications

f(x, y) = { …(1 − x), 0 < x < 1, 0 < y < …;  0, elsewhere }

Let us prove this. Since (X, Y) is a bivariate random vector, by definition, … Similarly,
Joint and Marginal Distributions / 299

           y₁          y₂          …   yⱼ          …   P(X = xᵢ)
  x₁    p(x₁, y₁)   p(x₁, y₂)    …  p(x₁, yⱼ)    …   Σⱼ p(x₁, yⱼ)
  x₂    p(x₂, y₁)   p(x₂, y₂)    …  p(x₂, yⱼ)    …   Σⱼ p(x₂, yⱼ)
  ⋮
  xᵢ    p(xᵢ, y₁)   p(xᵢ, y₂)    …  p(xᵢ, yⱼ)    …   Σⱼ p(xᵢ, yⱼ)

As can be seen, the totals in the vertical and horizontal margins in fact represent, respectively, the probability functions of X and Y. It is because of this feature that the individual distributions of X and Y are often called the marginal distributions.

In Section 2.1, we saw that distinct joint distributions can give rise to the same marginal distributions. As another example of this, consider the family of joint probability functions given below (where 0 < ε/2 < 1/18 (why?)):

…

This table describes a family of joint probability functions for different values of ε. But no matter what ε is (as long as 0 < ε/2 < 1/18), we always get the same marginal probability function of X, namely P(X = x₁) = …, P(X = x₂) = …, and P(X = x₃) = …, and the same marginal probability function of Y, namely P(Y = y₁) = …, P(Y = y₂) = …, and P(Y = y₃) = ….

Example 2.3. If X and Y have the joint probability function given by

P(X = x, Y = y) = (1/42)(x + y²),  x = 1, 4,  y = −1, 0, 1, 3

find the marginal distributions of X and Y.

Solution. The probability function of X is obtained by summing the joint probabilities with respect to all the possible values of y. Therefore, for x = 1, 4,

P(X = x) = Σ_y p(x, y) = (1/42)[(x + 1) + (x + 0) + (x + 1) + (x + 9)]

thus

P(X = x) = (4x + 11)/42,  x = 1, 4

Similarly, the probability function of Y is given by

P(Y = y) = (2y² + 5)/42,  y = −1, 0, 1, 3

Example 2.4 (The trinomial distribution). Suppose X and Y have the trinomial distribution with n trials and parameters p, q. (See Example 1.9.) Find the marginal distributions of X and Y.

Here

p(i, j) = [n!/(i! j! (n − i − j)!)] pⁱ qʲ (1 − p − q)^(n−i−j)

where i = 0, 1, …, n; j = 0, 1, …, n − i. Therefore, for i = 0, 1, …, n, we have

P(X = i) = Σ_{j=0}^{n−i} [n!/(i! j! (n − i − j)!)] pⁱ qʲ (1 − p − q)^(n−i−j)
         = C(n, i) pⁱ Σ_{j=0}^{n−i} C(n − i, j) qʲ (1 − p − q)^(n−i−j)
         = C(n, i) pⁱ [q + (1 − p − q)]^(n−i)
         = C(n, i) pⁱ (1 − p)^(n−i)

Thus, if X and Y have the trinomial distribution with n trials and parameters p, q, then X has the binomial distribution with n trials and the probability of success p. Similarly, Y has the binomial distribution with n trials and the probability of success q.
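The marginal computation above can be confirmed numerically; the values of n, p, q below are arbitrary choices made for the check.

```python
# Marginal of the trinomial: summing p(i, j) over j gives the binomial pmf.
from math import comb, factorial

n, p, q = 10, 0.3, 0.5

def trinomial(i, j):
    coef = factorial(n) // (factorial(i) * factorial(j) * factorial(n - i - j))
    return coef * p**i * q**j * (1 - p - q)**(n - i - j)

for i in range(n + 1):
    marginal = sum(trinomial(i, j) for j in range(n - i + 1))
    binom = comb(n, i) * p**i * (1 - p)**(n - i)
    assert abs(marginal - binom) < 1e-12
```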
2.3 The Absolutely Continuous Case — Joint and Marginal Distributions /

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx,  −∞ < y < ∞

F_X(u) = P(X ≤ u) = ∫_{−∞}^{u} [∫_{−∞}^{∞} f(x, y) dy] dx

Therefore,

f_X(x) = ∫ 0 dy + ∫ f(x, y) dy + ∫ 0 dy

A similar argument shows that Y is absolutely continuous with pdf ∫_{−∞}^{∞} f(x, y) dx.

Example 2.6. Let … If 0 < x < 1, it can be seen from Figure 2.2 that

f_X(x) = ∫ 0 dy + ∫ … dy + ∫ 0 dy = 5x⁴

Combining these results, we get

f_X(x) = { 5x⁴, 0 < x ≤ 1;  0, elsewhere }
Basic Probability Theory and Applications — Conditional Distributions and Independent Random Variables / 313

… the conditional distribution function of the random variable X given …

Example 1.1. Let X and Y have a joint distribution given by the following pdf:

f(x, y) = { …;  0, elsewhere }

Find: … (c) the conditional distribution function of X given 0 < Y < 1.

(c) F(u) = { 8u³…,        0 < u < ½
             …(20u³ − 1),  ½ ≤ u < 1
             1,            u ≥ 1 }

The discrete case

Suppose X and Y have a joint discrete distribution where the possible values of X are … and P(X = xᵢ, Y = yⱼ) …
Basic Probability Theory and Applications

Solution
(a) It can be shown that the marginal pdf of X is

f_X(x) = { …, x > 0;  0, x ≤ 0 }

Since f_{Y|X}(y | x₀) is defined only when f_X(x₀) > 0, we can consider the conditional pdf only if x₀ > 0. Now,

f_{Y|X}(y | x₀) = f(x₀, y)/f_X(x₀) = { e^(−(y−x₀)), y > x₀;  0, y ≤ x₀ }

Therefore,

P(3 < Y < 4 | X = 2) = F_{Y|X}(4 | 2) − F_{Y|X}(3 | 2)  (why?)
                     = (1 − e^(−(4−2))) − (1 − e^(−(3−2)))
                     = e^(−1) − e^(−2)

Example 1.7. Suppose the joint distribution function of X and Y is given by

F(x, y) = { 0,                x < 0 or y < 0
            xy(x + y − xy²),  0 ≤ x < 1 and 0 ≤ y < 1
            y(1 + y − y²),    x ≥ 1 and 0 ≤ y < 1
            x,                0 ≤ x < 1 and y ≥ 1
            1,                x ≥ 1 and y ≥ 1 }

Find:
(a) the conditional pdf of Y given X = x₀
(b) the conditional pdf of X given Y = y₀
(c) P(X > ½ | Y = ½)

Solution. Right away, we get the marginal distribution functions of X and Y as

F_X(x) = lim_{y→∞} F(x, y) = { 0, x < 0;  x, 0 ≤ x < 1;  1, x ≥ 1 }

and

F_Y(y) = lim_{x→∞} F(x, y) = { 0, y < 0;  y(1 + y − y²), 0 ≤ y < 1;  1, y ≥ 1 }

In order to find the conditional pdf's, we require the joint pdf and the marginal pdf's. These are obtained as follows:

f(x, y) = { 2(x + y − 3xy²), 0 < x < 1, 0 < y < 1;  0, elsewhere }

(c) We shall first find P(X > ½ | Y = y₀) for any 0 < y₀ < 1.

P(X > ½ | Y = y₀) = 1 − P(X ≤ ½ | Y = y₀)
                  = 1 − ∫₀^{1/2} f_{X|Y}(x | y₀) dx
                  = 1 − ∫₀^{1/2} [2(x + y₀ − 3xy₀²)/(1 + 2y₀ − 3y₀²)] dx
                  = 1 − (1 + 4y₀ − 3y₀²)/(4(1 + 2y₀ − 3y₀²))

Hence, substituting y₀ = ½,

P(X > ½ | Y = ½) = 11/20
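The value 11/20 can be checked by a midpoint-rule integration of the joint pdf of Example 1.7; the grid size is an arbitrary choice of ours.

```python
# f(x, y) = 2(x + y - 3xy^2) on the unit square (Example 1.7's joint pdf).
f = lambda x, y: 2 * (x + y - 3 * x * y**2)

N = 20000
xs = [(i + 0.5) / N for i in range(N)]

# f_Y(1/2) = integral over x of f(x, 1/2) = 1 + 2(1/2) - 3(1/2)^2 = 5/4
fy_half = sum(f(x, 0.5) for x in xs) / N
# numerator: integral of f(x, 1/2) over x in (1/2, 1)
num = sum(f(x, 0.5) for x in xs if x > 0.5) / N

assert abs(fy_half - 1.25) < 1e-6
assert abs(num / fy_half - 11 / 20) < 1e-4   # P(X > 1/2 | Y = 1/2)
```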
Conditional Distributions and Independent Random Variables / 323

If the joint pdf of X and Y is given as f(x, y) = K·u(x, y), where K is a constant, there is no need to compute the constant for finding the conditional pdf. This follows because, for instance,

f_{X|Y}(x | y) = f(x, y)/f_Y(y) = K u(x, y)/(K ∫ u(x, y) dx) = u(x, y)/∫_{−∞}^{∞} u(x, y) dx

where K cancels out.

Next, we note that

f(x, y₀) = { K(x² + y₀²), −√(1 − y₀²) < x < √(1 − y₀²);  0, elsewhere }

Therefore, for −1 < y₀ < 1,

f_{X|Y}(x | y₀) = { 3(x² + y₀²)/(2√(1 − y₀²)(1 + 2y₀²)),  −√(1 − y₀²) < x < √(1 − y₀²);  0, elsewhere }

(b) Substituting y₀ = ½, the conditional pdf of X given Y = ½ is …

Example 1.9 (The standard bivariate normal distribution). Let X and Y have the joint pdf

f(x, y) = [1/(2π√(1 − ρ²))] exp{−(x² − 2ρxy + y²)/(2(1 − ρ²))},  −∞ < x < ∞, −∞ < y < ∞

Therefore, from the functional form of the conditional pdf, we recognize that the conditional distribution of Y, given X = x, is normal with mean ρx and variance 1 − ρ². Similarly, the conditional distribution of X, given Y = y, is normal with mean ρy and variance 1 − ρ².

It follows immediately from the definition of the conditional probability density function that f(x, y) = f_X(x) · f_{Y|X}(y | x).

Example 1.10. A nonnegative number X is picked with the probability law given by the following pdf:

f_X(x) = { xe^(−x), x > 0;  0, elsewhere }

If X = x, a number Y is picked with the uniform distribution over the interval [0, x]. Find:
(a) the joint pdf of X and Y
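For part (a), since f_{Y|X}(y | x) = 1/x on [0, x], the joint pdf works out to f(x, y) = f_X(x) · (1/x) = e^(−x) for 0 < y < x. A quick numeric check (our own sketch; the integration cutoff at x = 20 is an assumption that the tail beyond is negligible) confirms that this joint pdf integrates to 1:

```python
# Joint pdf f(x, y) = e^{-x} on the wedge 0 < y < x, integrated numerically.
from math import exp

N = 2000
h = 20.0 / N                  # midpoint rule over x in (0, 20]
total = 0.0
for i in range(N):
    x = (i + 0.5) * h
    # inner integral over y in (0, x) of e^{-x} dy equals x * e^{-x}
    total += x * exp(-x) * h

assert abs(total - 1.0) < 1e-4
```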
Scanned by CamScanner
m
Scanned by CamScanner
Scanned by CamScanner
Basic Probability Theory and Applications — Conditional Distributions and Independent Random Variables /

Definition 2. Two random variables X and Y are said to be independent if, for every pair of real numbers x and y, the two events {s | X(s) ≤ x} and {s | Y(s) ≤ y} are independent. In other words,

P(X ≤ x, Y ≤ y) = P(X ≤ x) · P(Y ≤ y)

for every x and y. This criterion, expressed in terms of the distribution functions, states that two random variables are independent if and only if F(x, y) = F_X(x) · F_Y(y) for every pair of real numbers x, y.

That definition 1 implies definition 2 is easy to verify: one has only to take B₁ = (−∞, x] and B₂ = (−∞, y]. To prove that definition 2 implies definition 1 is much harder, and is omitted.

In Chapter 8 we saw that joint distributions uniquely determine the marginal distributions. Also, we showed through examples that the converse is false, in that it is not possible to determine the joint distribution from knowledge of the marginal distributions. We find now that an important exception to this is provided when the random variables are independent, and that, as a matter of fact, in this case the joint distribution is given by F(x, y) = F_X(x) · F_Y(y) for any real numbers x, y.

We give yet another definition of independence.

Definition 3. Two random variables X and Y are said to be independent if for every choice of real numbers a, b (a < b) and c, d (c < d) the pairs of events {a < X ≤ b}, {c < Y ≤ d} are independent; that is,

P(a < X ≤ b, c < Y ≤ d) = P(a < X ≤ b) · P(c < Y ≤ d)

It is left to the reader to show that definition 2 and definition 3 are equivalent. Random variables which are not independent are said to be dependent random variables.

Example 2.1. The joint D.F. of X and Y is given by

F(x, y) = { 0, x < 0 or y < 0;  …, x ≥ 0 and y ≥ 0 }

Are X and Y independent?

Solution. The marginal D.F.'s of X and Y are given by

F_X(x) = lim_{y→∞} F(x, y) = { 0, x < 0;  1 − e^(−x), x ≥ 0 }

and

F_Y(y) = lim_{x→∞} F(x, y) = { 0, y < 0;  1 − e^(−y), y ≥ 0 }

Example 2.2. Suppose two random variables X and Y have the following joint distribution function:

F(x, y) = { 0, x < 0 or y < 0;  …, y > x ≥ 0;  1 − e^(−x) + 2e^(−(x+y)) − 2e^(−y), … }

Show that X and Y are not independent.

Solution. In Example 2.1 of Chapter 8 the marginal distribution functions were found as

F_X(x) = { 0, x < 0;  1 − e^(−x), x ≥ 0 }

and

F_Y(y) = { 0, y < 0;  1 − 2e^(−y) + e^(−2y), y ≥ 0 }

Now, if x, y > 0,

F_X(x) · F_Y(y) = (1 − e^(−x)) · (1 − 2e^(−y) + e^(−2y)) ≠ F(x, y)

Hence X and Y are not independent.

To determine whether two random variables are independent or not, there is actually no need to find the marginal D.F.'s explicitly and then check whether F(x, y) = F_X(x) · F_Y(y). It suffices to check whether the joint D.F. can be factored as a product of two functions (not necessarily D.F.'s), one depending on x only, and the other on y only (see exercise 1). On the basis of this, we could decide immediately that the random variables in Example 2.2 are not independent, without finding the marginal D.F.'s.

Sometimes one can tell if two random variables are not independent pictorially, simply by looking at the domain of the definition of the probability distribution. To see this, suppose X and Y have a joint pdf which is positive over the shaded region indicated in Figure 2.1, and zero outside it. (For convenience we are assuming that X and Y have a joint absolutely continuous distribution.)

Figure 2.1
Basic Probability Theory and Applications — Conditional Distributions and Independent Random Variables /

Comment. If X and Y are independent random variables, then, for any xᵢ, yⱼ,

P(X = xᵢ | Y = yⱼ) = P(X = xᵢ, Y = yⱼ)/P(Y = yⱼ) = p_X(xᵢ) p_Y(yⱼ)/p_Y(yⱼ) = p_X(xᵢ)

That is, the conditional distribution of X, given Y = yⱼ, is the same as the distribution of X and hence does not depend on Y. It is this consequence that might explain the use of the term "independence." Similarly, if X and Y are independent random variables,

P(Y = yⱼ | X = xᵢ) = P(Y = yⱼ) = p_Y(yⱼ)

for any yⱼ.

It is easy to show that this is equivalent to definition 2. Before proving the equivalence, we shall establish the following result: if X and Y are independent, then

F(a, b⁻) = F_X(a) · F_Y(b⁻)

This is true because

F(a, b⁻) = lim_{n→∞} F(a, b − 1/n)
         = lim_{n→∞} F_X(a) · F_Y(b − 1/n),  since X and Y are independent
         = F_X(a) · lim_{n→∞} F_Y(b − 1/n) = F_X(a) · F_Y(b⁻)

We shall now show that definitions 2 and 4(a) are equivalent.

(i) Assume X and Y are independent as per definition 2, that is, F(x, y) = F_X(x) · F_Y(y) for every x, y. Then

P(X = xᵢ, Y = yⱼ) = F(xᵢ, yⱼ) − F(xᵢ⁻, yⱼ) − F(xᵢ, yⱼ⁻) + F(xᵢ⁻, yⱼ⁻)
 = F_X(xᵢ)F_Y(yⱼ) − F_X(xᵢ⁻)F_Y(yⱼ) − F_X(xᵢ)F_Y(yⱼ⁻) + F_X(xᵢ⁻)F_Y(yⱼ⁻)
 = [F_X(xᵢ) − F_X(xᵢ⁻)] · [F_Y(yⱼ) − F_Y(yⱼ⁻)]

That is,

P(X = xᵢ, Y = yⱼ) = P(X = xᵢ) · P(Y = yⱼ)

This implies independence according to definition 4(a).

Example 2.3. If P(X = i, Y = j) = 1/2^(i+j), i, j = 1, 2, …, show that X and Y are independent.

Solution. In Example 1.3 we showed that

p_X(i) = 1/2ⁱ,  i = 1, 2, …

and

p_Y(j) = 1/2ʲ,  j = 1, 2, …

Since p(i, j) = p_X(i) · p_Y(j) for every i, j = 1, 2, …, the random variables X and Y are independent.
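The factorization in Example 2.3 can be checked mechanically; the truncation point used to approximate the marginal sums below is our own choice.

```python
# p(i, j) = 1/2^(i+j) factors as (1/2^i)(1/2^j), so X and Y are independent.
N = 30  # truncation; the neglected mass is of order 2^-N
p = lambda i, j: 2.0 ** -(i + j)
p_X = lambda i: sum(p(i, j) for j in range(1, N))
p_Y = lambda j: sum(p(i, j) for i in range(1, N))

for i in range(1, 10):
    assert abs(p_X(i) - 2.0**-i) < 1e-6          # marginal is geometric-type
    for j in range(1, 10):
        assert abs(p(i, j) - p_X(i) * p_Y(j)) < 1e-6
```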
(ii) Conversely, assume independence according to definition 4(a), that is, P(X = xᵢ, Y = yⱼ) = P(X = xᵢ) · P(Y = yⱼ). Then, by the definition of a joint distribution function,

F(x, y) = Σ_{yⱼ ≤ y} Σ_{xᵢ ≤ x} P(X = xᵢ, Y = yⱼ)
        = Σ_{yⱼ ≤ y} Σ_{xᵢ ≤ x} P(X = xᵢ) · P(Y = yⱼ),  by assumption
        = [Σ_{xᵢ ≤ x} P(X = xᵢ)] · [Σ_{yⱼ ≤ y} P(Y = yⱼ)]
        = F_X(x) · F_Y(y)

This gives independence according to definition 2. Hence, from (i) and (ii), we have the equivalence of the two definitions.

Example 2.4. Show that the random variables X and Y which have the trinomial distribution with n trials and parameters p, q are not independent.

Solution. We know from Example 2.4 of Chapter 8 that the marginal distribution of X is binomial with n trials and the probability of success p. Also, we saw in Example 1.4 that the conditional distribution of X given Y = j is binomial with n − j trials and the probability of success p/(1 − q). Since the conditional distribution of X given Y = j is not the same as the marginal distribution of X, we can conclude that the random variables are not independent.

Example 2.5. Consider a sample space S = {s₁, s₂, …} where all the outcomes are equally likely. Let X and Y be two random variables defined on S as shown below:

…
Basic Probability Theory and Applications — Conditional Distributions and Independent Random Variables /

Show that X and Y are independent, and find
(a) P(… < X < …)
(b) P(… < Y < 1)
) Bask Probability Theory end Applications ( M y /»««««*« Independent Random Vriabki / U ■
r r t i w I I / 1/ 1 0 < x < I Having discussed the general nature o f the distributions o f several random
fx ix ) = / / fix , y , z ) d v d z = jo o
variables, we now give a result which deserves special mention:
' 0. elsewhere
If X t , X } ......... X„ are (mutually) independent random variables, then any sub
l3 x 2, 0<x < 1
set o f these random variables is independent.
10, elsewhere
Similarly. The reader will get the idea o f the proof in the general case if we show in
particular tha! X tr X 2, . . , X k (where k < n) are independent whenever
/ 7 ‘ 12xV* 0<^< l *2......... X k , X k , X n are. We observe first that, for any u |. w2 . ltk ‘
fy iy ) = J o o
(o elsewhere \X i < U |......... X t < Ufr, X k . x < ° ° , . . . , X„ < ° ° l
= lArl < u l ln lA r 2 < M 2l n . . . n | A r ft< M * i n i A r Jk, l < » t n . . . n } j r n < e* l
_ \2 y , 0< y< \
= lATj ^ U | ......... X k ^ I
10, elsewhere
since \X f <«>1 = 5 for any *.
Functions of Several Random Variables / 353

Example 1.1. X and Y have a joint distribution with the probability function p(x, y) given below, where X takes the values 2, 4, 6 and Y takes the values 3, 5, 7.

  p(x, y)    y = 3   y = 5   y = 7   P(X = x)
  x = 2      1/24    1/12    1/12     …
  x = 4      0       1/12    …        …
  x = 6      …       …       …        …
  P(Y = y)   …       …       …

Consider V = max(X, Y) and W = min(X, Y); that is, V(s) is the maximum of X(s) and Y(s), and W(s) the minimum. For example, if X(s) = 6 and Y(s) = 3, then V(s) = 6 and W(s) = 3.

We have, for example,

{W = 4} = {X = 4, Y = 5} ∪ {X = 4, Y = 7}

and

{V = 7} = {X = 2, Y = 7} ∪ {X = 4, Y = 7} ∪ {X = 6, Y = 7}

Therefore,

P(V = 7) = P(X = 2, Y = 7) + P(X = 4, Y = 7) + P(X = 6, Y = 7) = 5/24

and so on. Continuing, the probability function of W is given by a table over the values 2, 3, 4, 6, with entries …

Suppose X is binomial B(n, p) and Y is binomial B(m, p), with X and Y independent. Find the distribution of Z = X + Y.

Solution. Since the possible values of X are 0, 1, …, n and those of Y are 0, 1, …, m, the possible values of Z are 0, 1, 2, …, n + m. Now

{X + Y = k} = {X = 0, Y = k} ∪ {X = 1, Y = k − 1} ∪ … ∪ {X = k, Y = 0} = ∪_{i=0}^{k} {X = i, Y = k − i}

Therefore,

P(Z = k) = P(X + Y = k) = P(∪_{i=0}^{k} {X = i, Y = k − i})
 = Σ_{i=0}^{k} P(X = i, Y = k − i),  since the events are mutually exclusive
 = Σ_{i=0}^{k} P(X = i) P(Y = k − i),  since X and Y are independent
 = Σ_{i=0}^{k} C(n, i) pⁱ(1 − p)^(n−i) C(m, k − i) p^(k−i)(1 − p)^(m−k+i)

since X is B(n, p) and Y is B(m, p).
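The last sum is the convolution of the two binomial pmf's; carrying it out numerically shows, as one would expect from Vandermonde's identity, that Z is again binomial, B(n + m, p). The parameter values below are arbitrary choices for the check.

```python
# Convolution of two binomial pmf's with a common p gives B(n+m, p).
from math import comb

def binom_pmf(n, p):
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj      # P(Z=k) = sum_i P(X=i) P(Y=k-i)
    return out

n, m, p = 6, 9, 0.35
z = convolve(binom_pmf(n, p), binom_pmf(m, p))
target = binom_pmf(n + m, p)
assert all(abs(a - b) < 1e-12 for a, b in zip(z, target))
```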
Functions o f Serení Rendon: VartcMn / S IJ
Basic Probability Theory and Applications

Comment. There are some situations where one can easily identify the random variables V = max(X, Y) and W = min(X, Y). The random variables max(X_1, X_2, ..., X_n) and min(X_1, X_2, ..., X_n) are referred to as the extremes. Their distributions are very important in reliability and renewal theory. To realize the importance of the distribution of these random variables, consider the following situation: Suppose a machine runs on n electronic components. Let X_i denote the life of the ith component. If the machine breaks down as soon as one component goes bad, then we would be interested in the distribution of min(X_1, X_2, ..., X_n). On the other hand, if the machine breaks down only when the last component goes bad, then our interest would lie in finding the distribution of max(X_1, X_2, ..., X_n).

EXERCISES-SECTION 2
Note: Since computation of the integrals in many of the following problems can be quite involved, it will be sufficient to just set the integrals up.
1. If X and Y are independent random variables with the pdf's

   f_X(x) = 1,  1 < x < 2,  and 0 elsewhere
   f_Y(y) = ...,  y > 1,  and 0 elsewhere

find the distribution functions of-
(a) X + Y   (b) XY   (c) X/Y   (d) max(X, Y)
2. Suppose X and Y are independent, identically distributed random variables, each having the following pdf:

   f_X(x) = 2x,  0 < x < 1,  and 0 elsewhere

5. The joint pdf of X and Y is given by

   f(x, y) = k/(1 + x + y)^3,  x > 0, y > 0,  and 0 elsewhere

where k is a constant. Evaluate k and obtain the distribution function of Z = X + Y.
6. The random variables X and Y have the following joint pdf:

   f(x, y) = (1/4)[1 + xy(x^2 - y^2)],  -1 < x < 1, -1 < y < 1,  and 0 elsewhere

Find the distribution of Z = X + Y.
7. Find the pdf of Z = XY if the joint pdf of X and Y is given by

   f(x, y) = e^{-(x + y)},  x > 0, y > 0,  and 0 elsewhere

8. Find the distribution function of V = max(X, Y) if the joint pdf of X and Y is given by

   f(x, y) = xe^{-x(1 + y)},  x > 0, y > 0,  and 0 elsewhere
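The machine example in the Comment above is easy to probe numerically. The following Python sketch (not part of the text; the parameter choices are illustrative) simulates the two failure times W = min(X_1, ..., X_n) and V = max(X_1, ..., X_n) for independent exponential component lifetimes, for which the standard facts E(W) = 1/(nλ) and E(V) = (1/λ)(1 + 1/2 + ... + 1/n) are known.

```python
import random

# Sketch (not from the text): for n independent component lifetimes
# X_i ~ Exponential(lam), the failure times discussed above are
# W = min(X_1, ..., X_n) and V = max(X_1, ..., X_n).
# Standard facts: W is again exponential with parameter n*lam, so
# E(W) = 1/(n*lam); and E(V) = (1/lam) * (1 + 1/2 + ... + 1/n).

def simulate_extremes(n_components, lam, trials, seed=0):
    rng = random.Random(seed)
    total_min, total_max = 0.0, 0.0
    for _ in range(trials):
        lifetimes = [rng.expovariate(lam) for _ in range(n_components)]
        total_min += min(lifetimes)
        total_max += max(lifetimes)
    return total_min / trials, total_max / trials

mean_min, mean_max = simulate_extremes(n_components=5, lam=1.0, trials=100_000)
print(mean_min)  # close to 1/5 = 0.2
print(mean_max)  # close to 1 + 1/2 + 1/3 + 1/4 + 1/5 ≈ 2.283
```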
Example 3.3. Suppose X and Y have the joint bivariate normal distribution. Writing

   f_{X/Y}(t) = ∫_{-∞}^{∞} f(ty, y)|y| dy

and carrying out the integration, one obtains, in the independent case,

   f_{X/Y}(t) = (σ_1/σ_2) / (π[(σ_1/σ_2)² + t²]),   -∞ < t < ∞

which, as will be recalled from Section 2.4 of Chapter 5, is the Cauchy distribution. In other words, if X is N(0, σ_1²), Y is N(0, σ_2²), and if X and Y are independent, then X/Y has a Cauchy distribution with parameters a = σ_1/σ_2 and b = 0.

If Y is chi-square with n degrees of freedom, we can write T as T = X/Z, where Z = √(Y/n). As can be easily verified, the distribution of Z = √(Y/n) is given as

   f_Z(t) = n^{n/2} t^{n-1} e^{-nt²/2} / (2^{(n/2)-1} Γ(n/2)),   t > 0,  and 0 elsewhere
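As a numerical companion (not in the text), the ratio of two independent centered normals can be simulated and its empirical quartiles compared with the Cauchy scale a = σ_1/σ_2; for a Cauchy centered at 0 with scale a, the quartiles fall at -a and +a. The sample size and seed below are arbitrary choices.

```python
import random

# Sketch (not from the text): if X ~ N(0, s1^2) and Y ~ N(0, s2^2) are
# independent, X/Y is Cauchy with scale a = s1/s2 and median 0, so the
# empirical quartiles of simulated ratios should sit near -a and +a.

rng = random.Random(42)
s1, s2 = 3.0, 2.0                      # the scale should be s1/s2 = 1.5
ratios = sorted(rng.gauss(0.0, s1) / rng.gauss(0.0, s2)
                for _ in range(200_000))
q1 = ratios[len(ratios) // 4]          # empirical 25th percentile
q3 = ratios[3 * len(ratios) // 4]      # empirical 75th percentile
print(q1, q3)  # roughly -1.5 and 1.5
```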
EXERCISES-SECTION 3
1. Suppose X and Y are independent and identically distributed random variables, each having an exponential distribution with parameter λ. Find the distribution of-
(a) X + Y   (b) X/Y   (c) max(X, Y)   (d) min(X, Y)
2. Let 0 < a < b. Two numbers are picked independently, one at random in the interval [a, b], and the other at random in the interval [-b, -a]. If X represents the number in [a, b], and Y the number in [-b, -a], find the distribution of-
(a) the sum of the numbers, X + Y
(b) the product of the numbers, XY.
3. In exercise 2, having obtained the distributions of X + Y and XY, find E(X + Y) and E(XY).
4. If X and Y are independent random variables each having an exponential distribution with parameter λ, find the distribution of Z = X - Y.
7. Given that the length of a rectangle is x, the distribution of the breadth Y is given by the following pdf:

   g(y) = 2(x + y)/(3x² - 2x - 1),  1 < y < x,  and 0 elsewhere

Find the distribution of-
(a) the perimeter
(b) the area.
8. If X and Y are independent, normally distributed random variables, each N(0, σ²), find the distribution of U = √(X² + Y²).
9. The joint pdf of X and Y is given by

   f(x, y) = c(1 - √(x² + y²)),  x² + y² < 1,  and 0 elsewhere

Find the distribution of Z = √(X² + Y²).
one as we have seen in Chapter 10. The following equivalent definition, using the joint distribution of X and Y, avoids this.

The expectation of Z = h(X, Y) is defined by

   E(Z) = E(h(X, Y)) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} h(x, y) f(x, y) dy dx,  in the continuous case
        = Σ_i Σ_j h(x_i, y_j) P(X = x_i, Y = y_j),  in the discrete case

(In the same way, for the sum one writes E(X + Y) = ∫∫ (x + y) f(x, y) dy dx, and the substitution y = z - x connects this with the distribution of the sum.)

Solution. We shall find E(Z) using two methods.
Method 1: In Example 3.1(d) of Chapter 10 we found the distribution of Z as

   f_Z(t) = 2t,  0 < t < 1,  and 0 elsewhere

Therefore,

   E(Z) = ∫_0^1 t f_Z(t) dt = ∫_0^1 t · 2t dt = 2/3

Method 2: Since f(x, y) = 1 if 0 < x < 1 and 0 < y < 1, and f(x, y) = 0 elsewhere,

   E(Z) = ∫_0^1 [ ∫_0^x max(x, y) dy + ∫_x^1 max(x, y) dy ] dx

Now, if 0 < y < x, then max(x, y) = x, and if x < y < 1, then max(x, y) = y. Consequently,

   E(Z) = ∫_0^1 [ ∫_0^x x dy + ∫_x^1 y dy ] dx = ∫_0^1 [ x² + (1 - x²)/2 ] dx = 2/3
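Method 2 above can be cross-checked with a crude midpoint Riemann sum (a sketch, not from the text): the double integral of max(x, y) over the unit square should come out near 2/3.

```python
# Sketch (not from the text): approximate E(max(X, Y)) for X, Y
# independent uniform on (0, 1) by a midpoint Riemann sum over the unit
# square, where h(x, y) = max(x, y) and f(x, y) = 1.

n = 400
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h          # midpoint in the x direction
    for j in range(n):
        y = (j + 0.5) * h      # midpoint in the y direction
        total += max(x, y)
estimate = total * h * h
print(estimate)  # approximately 2/3
```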
Expectation-Several Random Variables

then states that the expected value of a linear combination of random variables is equal to the linear combination of their expected values. This property of the expectation operation is referred to as the linear property:

   E(X + Y) = E(X) + E(Y)
   Cov(X, Y) = E(XY) - E(X)E(Y)

   Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)]
             = E(XY - μ_X Y - μ_Y X + μ_X μ_Y)
             = E(XY) - μ_X E(Y) - μ_Y E(X) + μ_X μ_Y   (why?)
             = E(XY) - E(X)E(Y)

Consequently, E(XY) exists if E(X²) < ∞ and E(Y²) < ∞. Therefore, the definition of covariance is meaningful if E(X²) and E(Y²) are finite.
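As a small illustration (the joint table here is hypothetical, not taken from the text), the formula Cov(X, Y) = E(XY) - E(X)E(Y) can be evaluated directly from a discrete joint probability function:

```python
# Illustrative sketch (the joint distribution is made up, not from the
# text): evaluate Cov(X, Y) = E(XY) - E(X)E(Y) from a joint pmf.

joint = {  # (x, y) -> P(X = x, Y = y); a hypothetical distribution
    (0, 0): 0.2, (0, 1): 0.3,
    (1, 0): 0.4, (1, 1): 0.1,
}
ex  = sum(x * p for (x, y), p in joint.items())      # E(X)
ey  = sum(y * p for (x, y), p in joint.items())      # E(Y)
exy = sum(x * y * p for (x, y), p in joint.items())  # E(XY)
cov = exy - ex * ey
print(cov)  # approximately -0.1 for this table
```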
where i = 0, 1, 2; j = 0, 1, 2, with the understanding that (k choose r) = 0 if r > k or r < 0.
Displaying the joint probability function in tabular form yields

      y:      0      1      2   |  P(X = x)
   x = 0    1/15   4/15   1/15  |   6/15
   x = 1    4/15   4/15    0    |   8/15
   x = 2    1/15    0      0    |   1/15
  P(Y = y)  6/15   8/15   1/15  |

Therefore,

   E(X) = 0(6/15) + 1(8/15) + 2(1/15) = 2/3
   E(X²) = 0(6/15) + 1(8/15) + 4(1/15) = 4/5
   E(Y) = 0(6/15) + 1(8/15) + 2(1/15) = 2/3
   E(Y²) = 0(6/15) + 1(8/15) + 4(1/15) = 4/5
   E(XY) = 1·1(4/15) = 4/15

so that Var(X) = Var(Y) = 4/5 - (2/3)² = 16/45, and

   ρ(X, Y) = [E(XY) - E(X)E(Y)] / √(Var(X) Var(Y)) = [4/15 - 4/9]/(16/45) = -1/2

Solution. We are already familiar with the fact that X has the binomial distribution with n trials and the probability of success p, and Y has the binomial distribution with n trials and the probability of success q. Therefore,

   E(X) = np,   Var(X) = np(1 - p)
and
   E(Y) = nq,   Var(Y) = nq(1 - q)

Also, in Example 1.5 in this section, we saw that

   E(XY) = n(n - 1)pq

Hence:
(a)
   Cov(X, Y) = E(XY) - E(X)E(Y)
             = n(n - 1)pq - np · nq
             = -npq

Is it intuitively reasonable that the covariance of X and Y is negative in this situation?
(b)
   ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y))
           = -npq / √(np(1 - p) · nq(1 - q))
           = -√( pq / ((1 - p)(1 - q)) )

Thus Cov(X, Y) = -npq and ρ(X, Y) = -√( pq / ((1 - p)(1 - q)) ).
   E(XⁿYᵏ) = ∫∫ xⁿyᵏ f(x, y) dy dx
           = ∫_0^1 ∫_0^x xⁿyᵏ · 10x²y dy dx
           = 10 / ((k + 2)(n + k + 5))

Hence:
(a)
   Var(X) = E(X²) - [E(X)]² = 5/7 - (5/6)² = 5/252
1.5 The Method of Indicator Random Variables
For any event A, the indicator r.v. of A was defined in Chapter 4 as one which takes the value 1 at each sample point in A and the value 0 at each sample point in A'. Thus an indicator r.v. assumes only two values, namely, 0 and 1. It is so called because if the value of the random variable is 1, it indicates that the event A has occurred, and if the value is 0, it indicates that A has not occurred.
The following identities are immediate and can be proved routinely.
(i) I_{AB} = I_A · I_B and, in general, I_{A_1 A_2 ... A_n} = I_{A_1} I_{A_2} ... I_{A_n}
(ii) I_{A'} = 1 - I_A
(iii) I_{A∪B} = I_A + I_B - I_{AB} and, in general,

   I_{A_1 ∪ A_2 ∪ ... ∪ A_n} = Σ_i I_{A_i} - Σ_{i<j} I_{A_i A_j} + ... + (-1)^{n-1} I_{A_1 A_2 ... A_n}

If, in particular, A_1, A_2, ..., A_n are mutually exclusive, then

   I_{A_1 ∪ ... ∪ A_n} = Σ_{i=1}^{n} I_{A_i}

(iv) I_{AB'} = I_A - I_{AB}
Actually, (iii) and (iv) follow from (i) and (ii). For example,

   I_{AB'} = I_A · I_{B'},   by (i)
           = I_A(1 - I_B),   by (ii)
           = I_A - I_A · I_B
           = I_A - I_{AB},   by (i)

Let us now find E(I_A). Since I_A(s) = 1 if and only if s ∈ A, and I_A(s) = 0 if and only if s ∈ A', it follows that

   P(I_A = 1) = P(A)   and   P(I_A = 0) = P(A')

Therefore,

   E(I_A) = 1·P(A) + 0·P(A') = P(A)

Hence,

   E(I_A) = P(A) for any event A

This result shows that we can regard the probability of an event as the expected value of the corresponding indicator random variable. In other words, the concept of expectation is an extension of the concept of a probability measure.
This single fact now leads to the various results of the probability measure that are already familiar to us. For example, since I_{A∪B} = I_A + I_B - I_{AB}, using the linear property of expectation it follows that E(I_{A∪B}) = E(I_A) + E(I_B) - E(I_{AB}); that is, P(A∪B) = P(A) + P(B) - P(AB).
The independence of events is closely linked to that of the corresponding indicator random variables. The proof of the following result will be left to the exercise set:

   If A_1, A_2, ..., A_n are n events, then they are independent if and only if the indicator random variables I_{A_1}, I_{A_2}, ..., I_{A_n} are independent.

We shall next find Var(I_A). We immediately have E(I_A²) = 1²·P(A) + 0²·P(A') = P(A). Hence,

   Var(I_A) = P(A) - [P(A)]² = P(A)P(A') for any event A

The covariance between two indicator random variables can be expressed in terms of the probabilities of the underlying events as follows: Suppose A and B are two events. Then

   E(I_A · I_B) = E(I_{AB}) = P(AB)

and we get

   Cov(I_A, I_B) = P(AB) - P(A)P(B), for any two events A, B

The method of indicator random variables turns out to be a very powerful tool in many instances, as the following examples will illustrate.
Example 1.18 (The binomial distribution). Suppose X represents the number of successes in n independent Bernoulli trials, with the probability of success p on each trial. In other words, X is B(n, p). Find E(X) and Var(X).
Solution. We previously found E(X) and Var(X) in Chapter 7 using the direct approach which, as will be recalled, involved some tedious algebraic steps. We now give a much simpler approach.
Let A_i represent the event that there is a success on the ith trial, i = 1, 2, ..., n. Then clearly

   X = I_{A_1} + I_{A_2} + ... + I_{A_n}

where I_{A_1}, I_{A_2}, ..., I_{A_n} are independent r.v.'s, since the events A_1, A_2, ..., A_n are independent.
It follows that

   E(X) = E(I_{A_1}) + ... + E(I_{A_n}) = Σ_{i=1}^{n} P(A_i) = np
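The indicator decomposition in Example 1.18 can be verified by brute force for small n (a sketch, not from the text; the values of n and p are arbitrary): enumerate every outcome sequence with its probability and average X = I_{A_1} + ... + I_{A_n}.

```python
from itertools import product

# Sketch (not from the text): write X = I_{A_1} + ... + I_{A_n} and
# compute E(X) by exact enumeration of all outcome sequences,
# confirming E(X) = n*p.

n, p = 5, 0.3
ex = 0.0
for outcome in product([0, 1], repeat=n):   # outcome = (I_1, ..., I_n)
    prob = 1.0
    for trial in outcome:                   # independent Bernoulli trials
        prob *= p if trial == 1 else 1 - p
    ex += sum(outcome) * prob               # X is the sum of the indicators
print(ex)  # equals n*p = 1.5 up to floating point
```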
ρ(X, Y) equals 1 or -1 when, and only when, the functional relation between X and Y is linear, that is, when there exist constants m ≠ 0 and c for which P(Y = mX + c) = 1.
What significance can be attached to -1 < ρ(X, Y) < 1? This would imply that there is a positive probability that the relation between X and Y is not linear; that is, for any line y = mx + c (m ≠ 0) in the xy-plane, there is a positive probability that (X, Y) will not be on such a line. The correlation coefficient therefore provides a measure of the linear relationship of X and Y. It is due to this reason that it is sometimes called the coefficient of linear correlation.
Example 1.22. We give below three cases of joint distributions of X and Y:
(a) X has a standard normal distribution and Y = 2X + 1.
(b) X has a standard normal distribution and Y = X² + 2X.
(c) The joint pdf of X and Y is

   f(x, y) = (1/π) e^{-(x² + y²)/2},  if x < 0 and y > 0, or x > 0 and y < 0
           = 0,  elsewhere

In each case, comment on the correlation coefficient in the light of the functional relation between X and Y.
Solution
(a) The joint distribution of X and Y is singular since all the probability mass is distributed along the line y = 2x + 1. Consequently, there exists no joint pdf. Since the relation between X and Y is linear with m = 2 > 0, the correlation coefficient has to be equal to 1. We shall compute it directly anyway.
Since X is a standard normal variable, E(X) = 0 and Var(X) = E(X²) = 1. Next,

   E(Y) = E(2X + 1) = 2E(X) + 1 = 1
   E(Y²) = E[(2X + 1)²] = E(4X² + 4X + 1) = 4E(X²) + 4E(X) + 1 = 5
and
   E(XY) = E(X(2X + 1)) = 2E(X²) + E(X) = 2

Hence Var(Y) = 5 - 1 = 4 and Cov(X, Y) = 2, so that ρ(X, Y) = 2/√(1 · 4) = 1, as was anticipated.
(b) The joint distribution of X and Y is singular, since all the probability mass is distributed smoothly on the curve y = x² + 2x, that is, a region whose area is zero. Hence there exists no joint probability density function.
The relation between X and Y is not linear, so that we already know that ρ(X, Y) should be strictly between -1 and 1. Let us compute the actual correlation coefficient. Since X is N(0, 1), we have

   E(X) = 0,   E(X²) = 1,   E(X³) = 0,   E(X⁴) = 3

Therefore,

   E(Y) = E(X² + 2X) = E(X²) + 2E(X) = 1
   E(Y²) = E[(X² + 2X)²] = E(X⁴) + 4E(X³) + 4E(X²) = 7
and
   E(XY) = E(X(X² + 2X)) = E(X³) + 2E(X²) = 2

Hence Var(X) = 1, Var(Y) = 6, and Cov(X, Y) = 2. Consequently,

   ρ(X, Y) = 2/√6

which is strictly between -1 and 1, as we had anticipated.
(c) The joint pdf is positive over the shaded region in Figure 1.1 and is zero outside it. We anticipate here that -1 < ρ(X, Y) < 1. Not only that; we anticipate it to be negative (why?).

Figure 1.1

It can be easily verified that the marginal distributions of X and Y are standard normal. Hence E(X) = E(Y) = 0 and Var(X) = Var(Y) = 1, so that ρ(X, Y) = E(XY).
Next,

   E(XY) = ∫_{-∞}^{0} ∫_{0}^{∞} xy (1/π) e^{-(x² + y²)/2} dy dx + ∫_{0}^{∞} ∫_{-∞}^{0} xy (1/π) e^{-(x² + y²)/2} dy dx

Letting u²/2 = t, it follows that

   ∫_0^∞ u e^{-u²/2} du = ∫_0^∞ e^{-t} dt = 1
Thus, finally,

   ρ(X, Y) = -2/π

which is between -1 and 0.

EXERCISES-SECTION 1
1. Suppose X and Y have a joint distribution. For the marginal distribution of X,

   E(X) = ∫_{-∞}^{∞} x f_X(x) dx

On the other hand, according to the definition given in this chapter,

   E(X) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x f(x, y) dy dx

How do you reconcile the two definitions?
2. If X and Y have the following joint probability function:

        y:    0     1     2
   x = -1   0.1   0.2   0.1
   x =  0   0.1   0.1   0.2
   x =  2   0.1   0.1    0

find-
(a) E(X)   (b) E(X/(Y + 1))
3. The amount X (in dollars) that a babysitter earns on a weekend and the amount Y that she spends in the following week have the following joint distribution:

        y:   1   3   6
   x = 2     …   …   …
   x = 4     …   …   …
   x = 7     0   …   …

Find:
(a) the expected amount earned
(b) the expected amount spent
(c) the correlation coefficient between the amount earned and the amount spent.
4. If X and Y have an absolutely continuous joint distribution with pdf given by

   f(x, y) = …,  and 0 elsewhere

find:
(a) E(X)   (b) E(X²Y)
5. Two points are picked at random and independently inside the interval [0, 1]. Find the expected distance and the variance of the distance between the points.
6. Suppose the distribution of X and Y is given by the following pdf:

   f(x, y) = x + y,  0 < x < 1 and 0 < y < 1,  and 0 elsewhere

Find E(max(X, Y)).
7. Let 0 < a < b. Suppose X and Y are uniformly distributed over the intervals [a, b] and [-b, -a], respectively. Find
(a) E(X + Y)
(b) E(XY) if X and Y are independent.
(Comment: Recall that in exercise 3 of Section 3, Chapter 10, E(X + Y) and E(XY) were found by actually finding the distributions of X + Y and XY. There is no need to go this route!)
8. If X and Y are independent random variables, why is it true that E(X/Y) = E(X) · E(1/Y)? What restriction would be required on Y? Use the above result to find E(X/Y) for the random variables X and Y described in exercise 7.
9. Suppose Z is uniformly distributed over the interval [0, 2π]. Define X and Y as follows:

   X = cos Z and Y = sin Z

Show that-
(a) X and Y are not independent
(b) X and Y are not correlated.
10. If X and Y are two random variables such that each assumes only two values, then show that Cov(X, Y) = 0 implies that X and Y are independent. Hint: There is no loss of generality in assuming that X assumes the values 0, x and Y assumes the values 0, y (why?).
11. Suppose X has a distribution which is symmetric about 0. Let Y = X². Show that X and Y are uncorrelated. Are they independent?
12. The dimensions X, Y, Z of a rectangular parallelepiped are known to be independent random variables with the following pdf's:

   f_X(x) = 1/3,  1 < x < 4,  and 0 elsewhere
   f_Y(y) = (1/2)(3 - y),  1 < y < 3,  and 0 elsewhere
   f_Z(z) = 1/4,  1 < z < 5,  and 0 elsewhere

Find:
(a) the expected surface area
(b) the expected volume of the parallelepiped
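Returning to Example 1.22(b), a quick simulation (not from the text; sample size and seed are arbitrary) corroborates the value ρ(X, Y) = 2/√6 ≈ 0.816 for Y = X² + 2X with X standard normal:

```python
import math
import random

# Sketch (not from the text): estimate the correlation coefficient of
# X ~ N(0, 1) and Y = X^2 + 2X; it should come out near 2/sqrt(6),
# strictly between -1 and 1 even though Y is a function of X,
# because the relation is not linear.

rng = random.Random(7)
n = 200_000
xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
ys = [x * x + 2 * x for x in xs]
mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n
rho = cov / math.sqrt(vx * vy)
print(rho)  # close to 2/sqrt(6) ≈ 0.8165
```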
(Recall that if a ≥ b, then ac ≥ bc if c > 0, and ac ≤ bc if c < 0.)
Therefore,

   P(the needle intersects a line) = ∫_{-π/2}^{π/2} (l cos θ)/(πa) dθ
                                   = (l/(πa)) sin θ |_{-π/2}^{π/2}
                                   = 2l/(πa)

EXERCISES-SECTION 2
1. Two discrete random variables X and Y have the following joint probability function:

        y:    2     3     5
   x = 1    1/18   1/9    …
   x = 2    1/18   1/9   1/9
   x = 3     0     …     …

   P(X = x, Y = y) = 2/(n(n + 1)),   y = 1, 2, ..., x;  x = 1, 2, ..., n,  and 0 otherwise
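The intersection probability 2l/(πa) derived above is easy to check by simulation (a sketch, not from the text; the needle length and line spacing are arbitrary choices with length ≤ spacing):

```python
import math
import random

# Sketch (not from the text): Buffon's needle. Drop a needle of given
# length on lines a fixed spacing apart (length <= spacing); the
# probability of intersecting a line is 2*length/(pi*spacing).

rng = random.Random(3)
length, spacing = 1.0, 2.0
n = 200_000
hits = 0
for _ in range(n):
    d = rng.uniform(0.0, spacing / 2)   # center-to-nearest-line distance
    theta = rng.uniform(-math.pi / 2, math.pi / 2)  # angle from the normal
    if d <= (length / 2) * math.cos(theta):         # intersection condition
        hits += 1
print(hits / n)  # close to 2*1/(pi*2) = 1/pi ≈ 0.3183
```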
Find:
(a) the regression of Y on X
(b) the regression of X on Y
7. Two random variables X and Y have the following joint pdf:

   f(x, y) = kxy²,  0 < y < x < 1,  and 0 elsewhere

Find-
(a) E(Yⁿ | X = x), where n is a positive integer
(b) Var(Y | X = x)
8. Two random variables X and Y have the following joint pdf:

   f(x, y) = e^{-y},  0 < x < y < ∞,  and 0 elsewhere

If y₀ > 0, find-
(a) E(Xⁿ | Y = y₀), where n is a positive integer
(b) Var(X | Y = y₀)
9. If X and Y have the joint uniform distribution over the circle with radius 1 and centered at the origin, find E(Xⁿ | Y = y₀), where -1 < y₀ < 1 and n is any nonnegative integer.
10. Suppose X and Y are independent random variables, each exponentially distributed with parameter λ. Find the pdf of Z = X/(X + Y).
11. A point X₁ is picked at random in the interval [0, 1]. A second point X₂ is then picked at random in the interval [0, X₁]. Show that the distribution of X₂ is identical with that of Y₁Y₂, where Y₁, Y₂ are independent random variables each having the uniform distribution over the interval [0, 1]. Hint: If 0 < u < 1, ...
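For exercise 10, it is a standard fact that Z = X/(X + Y) is uniform on (0, 1) when X and Y are independent exponentials with the same parameter; the following sketch (not from the text) makes that plausible empirically.

```python
import random

# Sketch (not from the text): for X, Y i.i.d. exponential with a common
# parameter, Z = X/(X + Y) is uniform on (0, 1); the parameter cancels
# out of Z. Check the sample mean and a tail frequency.

rng = random.Random(11)
n = 200_000
zs = []
for _ in range(n):
    x = rng.expovariate(2.0)
    y = rng.expovariate(2.0)
    zs.append(x / (x + y))
mean_z = sum(zs) / n
below_quarter = sum(1 for z in zs if z < 0.25) / n
print(mean_z)         # near 0.5
print(below_quarter)  # near 0.25
```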
Generating Functions

This will be used when we consider the mgf of the binomial distribution. Recall that if X is B(n, p), then

   M_X(s) = (q + pe^s)^n,   where q = 1 - p

If X = X₁ + X₂ + ... + Xₙ, where the Xᵢ are independent Bernoulli trials each with P(success) = p, the same mgf can be obtained from the mgfs of the Xᵢ.

The importance of the uniqueness property stems from the following result:

Theorem. If two random variables have the same mgf, then they have the same distribution, and conversely.

In particular, if X and Y are independent, then

   M_{X+Y}(s) = E(e^{s(X+Y)}) = E(e^{sX}) E(e^{sY}) = M_X(s) M_Y(s)

and, in general, if Z = X₁ + X₂ + ... + Xₙ, where X₁, X₂, ..., Xₙ are independent,

   M_Z(s) = Π_{i=1}^{n} M_{Xᵢ}(s)

That is, the mgf of the sum of independent r.v.'s is equal to the product of their mgfs.
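The factorization M_{X+Y}(s) = M_X(s)M_Y(s) can be illustrated numerically (a sketch, not from the text; the two binomial variables and the evaluation point are arbitrary choices):

```python
import math
from itertools import product

# Sketch (not from the text): for independent X ~ B(2, p), Y ~ B(3, p),
# compute the mgf of X + Y both from the convolved pmf and as the
# product M_X(s)*M_Y(s); both should equal (q + p e^s)^5.

def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def mgf(pmf_pairs, s):
    # pmf_pairs: list of (value, probability); returns E(e^{sX})
    return sum(prob * math.exp(s * v) for v, prob in pmf_pairs)

p, s = 0.3, 0.7
x_pmf = [(k, binom_pmf(2, p, k)) for k in range(3)]
y_pmf = [(k, binom_pmf(3, p, k)) for k in range(4)]
z_pmf = {}  # pmf of X + Y via convolution of the independent pmfs
for (xv, xp), (yv, yp) in product(x_pmf, y_pmf):
    z_pmf[xv + yv] = z_pmf.get(xv + yv, 0.0) + xp * yp
lhs = mgf(list(z_pmf.items()), s)
rhs = mgf(x_pmf, s) * mgf(y_pmf, s)
print(lhs, rhs)  # equal; both match (0.7 + 0.3*e^0.7)^5
```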
The reproductive property of the normal distribution. Suppose X₁, X₂, ..., Xᵣ are independent random variables and Xᵢ is N(μᵢ, σᵢ²), i = 1, 2, ..., r. Then

   M_{Xᵢ}(s) = e^{μᵢ s + (σᵢ² s²)/2},   i = 1, 2, ..., r

Since X₁, X₂, ..., Xᵣ are independent, writing Z = X₁ + X₂ + ... + Xᵣ, we get

   M_Z(s) = Π_{i=1}^{r} M_{Xᵢ}(s) = e^{(Σ μᵢ)s + (Σ σᵢ²)s²/2}

so that Z is N(Σ μᵢ, Σ σᵢ²). In particular, if the random variables are identically distributed with a common mean μ and a common variance σ², then X̄ is N(μ, σ²/r). The important fact that is brought out in the above discussion (hitherto not proved) is that the distribution of Σ Xᵢ is normal.

The reproductive property of the binomial distribution with parameter p. Suppose X₁, X₂, ..., Xᵣ are independent random variables where Xᵢ is B(nᵢ, p), i = 1, 2, ..., r. Then M_{Xᵢ}(s) = (q + pe^s)^{nᵢ}, i = 1, 2, ..., r.
Therefore, if Z = X₁ + X₂ + ... + Xᵣ, we get

   M_Z(s) = (q + pe^s)^{n₁ + n₂ + ... + nᵣ}

Since this is the mgf of B(n₁ + n₂ + ... + nᵣ, p), it follows that the distribution of Z is binomial with n₁ + n₂ + ... + nᵣ trials and probability of success p, thereby exhibiting the reproductive property.

The reproductive property of the Poisson distribution. Suppose Xᵢ, i = 1, 2, ..., r, are independent Poisson random variables where the parameter of Xᵢ is λᵢ. Then

   M_{Xᵢ}(s) = e^{λᵢ(e^s - 1)},   i = 1, 2, ..., r

Letting Z = X₁ + X₂ + ... + Xᵣ,

   M_Z(s) = e^{λ₁(e^s - 1)} · ... · e^{λᵣ(e^s - 1)} = e^{(λ₁ + λ₂ + ... + λᵣ)(e^s - 1)}

But this is the mgf of a random variable with a Poisson distribution with parameter λ₁ + λ₂ + ... + λᵣ.

The reproductive property of the chi-square distribution. Suppose X₁, X₂, ..., Xᵣ are independent random variables where the distribution of Xᵢ is chi-square with nᵢ degrees of freedom, i = 1, 2, ..., r. Then, from the comment which follows part (c) of Example 1.2, M_{Xᵢ}(s) = (1 - 2s)^{-nᵢ/2}, i = 1, 2, ..., r.
Hence, letting Z = X₁ + ... + Xᵣ,

   M_Z(s) = (1 - 2s)^{-(n₁ + n₂ + ... + nᵣ)/2}

But this is easily recognized as the mgf of a random variable which has the chi-square distribution with n₁ + n₂ + ... + nᵣ degrees of freedom.
Hence, in conclusion, if X₁, ..., Xᵣ are independent random variables where Xᵢ is chi-square with nᵢ degrees of freedom, i = 1, 2, ..., r, then X₁ + ... + Xᵣ is chi-square with n₁ + ... + nᵣ degrees of freedom.

Example. Ten particles move with velocities that are independent random variables, each distributed N(0, σ²); find the distribution of the total kinetic energy of the particles. The kinetic energy of a particle of mass m moving at a velocity v cm/sec is given by mv²/2.
Solution. Let V₁, V₂, ..., V₁₀ represent the velocities of the particles. Then the total kinetic energy Z is given by

   Z = Σ_{i=1}^{10} (m/2) Vᵢ²
Since the velocities are independent, by the reproductive property of the chi-square distribution it follows that Σ_{i=1}^{10} (Vᵢ/σ)² has a chi-square distribution with 10 degrees of freedom. Hence the pdf of U = Σ_{i=1}^{10} Vᵢ² is given by

   f_U(u) = u⁴ e^{-u/(2σ²)} / (2⁵ σ¹⁰ Γ(5)),   u > 0,  and 0 elsewhere

From this the distribution of Z = (m/2)U can easily be obtained.

EXERCISES-SECTION 1
1. For a random variable X with P(X = c) = 1, obtain the mgf of X.
2. Suppose X has the probability function defined by P(X = 1) = … and P(X = -2) = …. For any positive integer n, find E(Xⁿ) in the following two ways:
(a) By using the basic definition of E(Xⁿ)
(b) By expanding the mgf of X as a power series
3. A random variable X assumes the three values -2, 3, 4 with respective probabilities ….
(a) Find the mgf of X.
(b) Compute E(X), E(X²), and E(X³) by differentiating the mgf.
4. A fair die is rolled repeatedly until a 1 or a 6 shows up. Find the mgf of the number of throws required.
5. If X has a negative binomial distribution with parameters r, p, use the moment generating function of X to find E(X) and E(X²).
6. If X has the pdf

   f(x) = 1 - |x|,  -1 < x < 1,  and 0 elsewhere

find the mgf of X.
7. If the mgf of X is

   M(s) = …,   s < 5

8. For the mgf's in the following cases, identify the underlying distribution of the random variable:
(a) M(s) = …
(b) M(s) = …
(c) M(s) = …
(d) M(s) = …
9. (a) Find E(Xʳ) and E(Yʳ), r = 0, 1, 2, ...
(b) How are the random variables X and Y related?
10. If X is uniformly distributed over the interval [a, b], use the mgf of X to show that

   E(Xʳ) = (b^{r+1} - a^{r+1}) / ((r + 1)(b - a)),   r = 1, 2, ...

11. Suppose X has a continuous distribution with the following pdf:

   f(x) = …,   -∞ < x < ∞

(a) Obtain the mgf of X.
(b) Using the mgf, find E(X), E(Xe^{X/2}), and Var(X).
12. The mgf of a random variable X is given by M(s) = (1 - s)^{-1}, s < 1. Use the power series expansion of M(s) to obtain E(Xʳ) for any nonnegative integer r. Hint: Use 1/(1 - s) = Σ_{r=0}^{∞} sʳ and differentiate twice.
13. Suppose the mgf of X is given by

   M(s) = (0.2 + 0.8e^s)¹⁰

Identify the distribution of X and compute P(4.3 < X < 7.8), using an appropriate table.
14. If the distribution of X is symmetric about c, show that the mgf of X (if it exists) is given by
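As a companion to exercise 13 (not part of the text), the mean of the distribution with mgf M(s) = (0.2 + 0.8e^s)^10, namely E(X) = M'(0) = np = 8 for B(10, 0.8), can be recovered by numerical differentiation:

```python
import math

# Sketch (not from the text): M(s) = (0.2 + 0.8 e^s)^10 is the mgf of a
# B(10, 0.8) random variable, so M'(0) should equal E(X) = 10*0.8 = 8.
# Approximate the derivative with a central difference.

def M(s):
    return (0.2 + 0.8 * math.exp(s)) ** 10

h = 1e-6
mean = (M(h) - M(-h)) / (2 * h)   # numerical M'(0)
print(mean)  # approximately 8.0
```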
However, in the absence of such knowledge, the probabilities can be obtained by differentiating and evaluating the derivatives at zero:

   P(X = 0) = g(0) = (1 - p)ⁿ

   P(X = 1) = g'(0) = n[ps + (1 - p)]^{n-1} p |_{s=0} = np(1 - p)^{n-1}

In general, for r = 0, 1, 2, ..., n, it can be seen that

   P(X = r) = g^{(r)}(0)/r! = C(n, r) pʳ(1 - p)^{n-r}

EXERCISES-SECTION 2
1. In the following cases, obtain the factorial moment generating functions from the mgf's.
(a) X is Poisson with parameter λ.
(b) X is geometric with parameter p.
(c) X is negative binomial with parameters r, p.
(d) X is uniform over the interval [a, b].
(e) X has the mgf given by

   M(s) = …

2. Consider the following factorial moment generating functions. In each case, determine the corresponding mgf.
(a) …
(b) …
(c) …
3. For any random variable X, show that

   g_{aX+b}(s) = sᵇ g_X(sᵃ)

where a and b are any constants.
4. If X and Y are independent random variables, show that

   g_{aX+bY}(s) = g_X(sᵃ) g_Y(sᵇ)

for any constants a, b.
5. For any random variable X, show that

   Var(X) = g''(1) + g'(1) - [g'(1)]²

6. Find E(X(X - 1)(X - 2)) if X is Poisson with parameter λ. Generalize and find E(X(X - 1)...(X - n + 1)) for any positive integer n.
8. Suppose 0 < a < 1. If the probability generating function of a random variable X is given by

   g(s) = …

find E(X).
9. If the probability generating function of a random variable X is given by

   g(s) = … / (3s² - 18s + 16)

use the partial fraction decomposition to find the probability function of X.
10. Suppose a random variable X has the following probability generating function:

   g(s) = …

Find:
(a) P(X = 3)
(b) P(X = 5)
(c) P(X = 6)
(d) P(X = 9)
11. Suppose X, Y, and Z are independent random variables with the following probability functions:

   P(X = r) = …,   P(Y = r) = …,   P(Z = r) = …

Find g_X(s), g_Y(s), and g_Z(s), and use these to find the distribution of X + Y + Z. Hint: Decompose the generating function into partial fractions.
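The identity P(X = r) = g^{(r)}(0)/r! derived above can be checked by expanding the binomial pgf g(s) = (ps + q)^n as a polynomial (a sketch, not from the text; the values of n and p are arbitrary): the coefficient of sʳ in the expansion is exactly g^{(r)}(0)/r!, and it should equal C(n, r)pʳq^{n-r}.

```python
import math

# Sketch (not from the text): expand the binomial pgf (p*s + q)^n by
# repeated polynomial multiplication; coef[r], the coefficient of s^r,
# equals g^(r)(0)/r! and should match C(n, r) p^r q^(n-r).

n, p = 6, 0.4
q = 1 - p
coef = [1.0]
for _ in range(n):
    nxt = [0.0] * (len(coef) + 1)
    for r, c in enumerate(coef):
        nxt[r] += c * q        # multiply by the constant term q
        nxt[r + 1] += c * p    # multiply by the p*s term
    coef = nxt
for r in range(n + 1):
    direct = math.comb(n, r) * p**r * q ** (n - r)
    print(r, coef[r], direct)  # the two columns agree
```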
Limit Theorems in Probability

Chebyshev's inequality is of no value if 0 < k < 1, because it does not tell us anything that we don't already know.
We should realize that Chebyshev's inequality makes a very general, all-encompassing statement regardless of the precise form of the distribution of the random variable and, consequently, can only provide a very crude bound. We may be able to improve on it if more information is available about the distribution. For example, if the distribution is known to be normal, then we can do even better and, from the standard normal table, provide the exact value of the probability. Nothing beats knowing the exact distribution; but in the absence of such knowledge, Chebyshev's inequality gives us a lot of information that we may not have otherwise.
Since P(|X - μ| ≥ ε) ≤ σ²/ε², we see that P(|X - μ| ≥ ε) will be small if the variance σ² is small. Thus, Chebyshev's inequality lends precision to the statement that small variance means that large deviations from the mean are improbable and that the probability distribution tends to be concentrated around the mean. It thus indicates the sense in which the variance may be used as a measure of the scatter of the distribution about the mean.

Example 1.1. Suppose X is uniformly distributed over the interval [0, 2].
(a) Applying Chebyshev's inequality, find an upper bound on the probability P(X ≤ 0.2 or X ≥ 1.8) and compare it with the exact value.
(b) Find an upper bound on P(X < 0.3 or X > 1.8).
Solution. Since X is uniformly distributed over [0, 2], we know that E(X) = 1 and Var(X) = (2 - 0)²/12 = 1/3.
(a) We have

   P(X ≤ 0.2 or X ≥ 1.8) = P(|X - 1| ≥ 0.8) ≤ (1/3)/(0.8)² ≈ 0.52

On the other hand, the exact probability is equal to (0.4)(1/2) = 0.2.
(b) In this case,

   P(X < 0.3 or X > 1.8) ≤ P(X < 0.3 or X > 1.7)
                         = P(|X - 1| > 0.7)
                         ≤ (1/3)/(0.7)²,  by Chebyshev's inequality
                         ≈ 0.68

The exact answer is (0.3)(1/2) + (0.2)(1/2) = 0.25.

Before we proceed with more examples, we introduce the following notation, which will be used in the rest of this chapter. We also state a result from calculus which we will find extremely useful.
Suppose X₁, X₂, ... is a sequence of random variables, that is, a countable collection of random variables. We shall often write Xₙ, n ≥ 1, to denote such a sequence.
A sequence Xₙ, n ≥ 1, is said to constitute an independent sequence of random variables if any finite subcollection of these random variables is independent.
Let Sₙ = Σ_{i=1}^{n} Xᵢ. Then Sₙ is a random variable which represents the sum of a sample of size n, and Sₙ/n represents the sample mean. Whenever convenient, we shall write the sample mean as X̄ₙ, using the subscript n to emphasize the fact that the mean is based on n observations.
We know that if E(Xᵢ) = μᵢ and Var(Xᵢ) = σᵢ², then

   E(X̄ₙ) = (1/n) Σ_{i=1}^{n} μᵢ

and, if the random variables are independent,

   Var(X̄ₙ) = (1/n²) Σ_{i=1}^{n} σᵢ²

In particular, if μᵢ = μ and σᵢ² = σ² for every i, then

   E(X̄ₙ) = μ   and   Var(X̄ₙ) = σ²/n

The result from calculus that we shall find particularly helpful is the following:

   If s is a given real number and cₙ is a sequence of real numbers with lim_{n→∞} cₙ = 0, then

      lim_{n→∞} (1 + (s + cₙ)/n)ⁿ = e^s

We shall accept this result without proof. As a trivial special case of this result, we have

   lim_{n→∞} (1 + s/n)ⁿ = e^s

for any real number s.

Example 1.2. Suppose a fair die is rolled thirty times and the number showing on the die noted each time. Use Chebyshev's inequality to find a lower bound on the probability that the total score will be between 90 and 120, both inclusive.
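Example 1.1(a) can be restated in a few lines of Python (not from the text), making the gap between the Chebyshev bound and the exact probability explicit:

```python
# Sketch (not from the text): for X uniform on [0, 2], E(X) = 1 and
# Var(X) = 1/3, so Chebyshev gives P(|X - 1| >= 0.8) <= (1/3)/0.8^2,
# while the exact probability P(X <= 0.2 or X >= 1.8) is 0.2.

var_x = (2 - 0) ** 2 / 12       # variance of a uniform on [0, 2]
eps = 0.8
bound = var_x / eps**2          # Chebyshev upper bound, about 0.52
exact = 0.2 / 2 + 0.2 / 2       # two tails of length 0.2, density 1/2
print(bound, exact)             # the bound is crude but valid
```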
4 72. Banc Proh«biht\- Picorv and Appliranoni ' I ,»*ti: Theorems in Prnhchflnv ! 4 V
4
12 Consider th e sequence o f independent random variables A, where converges in distrib u tio n to X th en one can use Ihe d ,s .n b u n o n X t o obtain the O
HXf= 2') - PU i = - 2 f) = 2-(I ,,|> approxim ate probabilities. (O f course, one assum es th at
a
such lhal th e probabilities can be obtained fro m n easily I This can he p ro ed o
and
any sequence o f r.v.’s as follows: - ____
nA , = 0) = 1 - 2-1* Suppose X , . X , . . . . converges in d istrib u tio n to X . th at is. Fx „ c 8
FX a u h e c o n tin u ity points o f Fx Suppose a and b ( a < b ) are c o n tin u ity p oints
Show that the sequence obeys ihe WLLN
o f Fx . T hen, by the definition,
” ' “ q“'" ce0 f mu,ually i"J'P enJ' " 1 ra"dom l,m Fx (b ) = Fx (b ) and lim /* „ ( « ) = F x ia l £
variables, with £W () = „ and Vartf,) = o’ for every r. Show that
n-*~ n n~r“
hm C
n~* Hence
2. CONVERGENCE IN DISTRIBUTION

The main result of this section is the central limit theorem. We shall initiate our discussion with the general notion of convergence in distribution.

2.1 The General Notion of Convergence in Distribution

A sequence of random variables X_1, X_2, ... is said to converge in distribution (or in law) to a random variable X if

lim_{n→∞} F_{X_n}(u) = F_X(u)

at each real number u where F_X is continuous. This is written compactly as X_n →d X. (The points where F_X is continuous are called the continuity points of F_X.)

The importance of the notion of convergence in distribution is to be seen in the following fact: often one is interested in finding the probabilities associated with the distribution of X_n when n is large. However, the problem of finding the distribution of X_n may be quite complicated and, sometimes, even if the distribution of X_n is available, the actual computations may be quite involved. For example, suppose X_n is the sum of n independent Bernoulli random variables, each with probability of success p. The distribution of X_n is known for every n as being binomial, B(n, p). But even for moderately small values of n, the computations of the binomial probabilities are messy. Now, if it is known that the sequence X_1, X_2, ... converges in distribution to X, then one can use the distribution of X to obtain the approximate probabilities. (Of course, one assumes that the probabilities can be obtained from F_X easily.) This can be proved for any sequence of r.v.'s as follows:

Suppose X_1, X_2, ... converges in distribution to X, that is, F_{X_n} converges to F_X at the continuity points of F_X. Suppose a and b (a < b) are continuity points of F_X. Then, by the definition,

lim_{n→∞} F_{X_n}(b) = F_X(b)  and  lim_{n→∞} F_{X_n}(a) = F_X(a)

Hence

lim_{n→∞} [F_{X_n}(b) − F_{X_n}(a)] = F_X(b) − F_X(a)

Therefore,

lim_{n→∞} P(a < X_n ≤ b) = P(a < X ≤ b)

[The statement of Example 2.1 is illegible in the scan; its conclusion reads as follows.]

lim_{n→∞} F_{X_n}(u) = Φ(−∞) = 0 if u < 0;  Φ(0) = 1/2 if u = 0;  Φ(∞) = 1 if u > 0

Hence, if we consider a random variable X whose D.F. is given by

F_X(u) = 0 if u < 0;  1 if u ≥ 0

then X_n →d X. (Notice that F_{X_n}(0) = 1/2 for every n and F_X(0) = 1, so that lim_{n→∞} F_{X_n}(0) ≠ F_X(0). But 0 is not a continuity point of F_X, and for convergence in distribution we do not need convergence at the discontinuity points.)

Example 2.2. Discuss the convergence in distribution of the sequences of random variables X_n, n ≥ 1, in the following cases:
(a) For every integer n ≥ 1, the pdf of X_n is given by

f_{X_n}(x) = n x^{n−1} if 0 < x < 1;  0, elsewhere
Let us first find F_{X_n}:

F_{X_n}(u) = 0 if u ≤ 0;  u^n if 0 < u < 1;  1 if u ≥ 1

As n goes to infinity,

lim_{n→∞} F_{X_n}(u) = 0 if u < 1;  1 if u ≥ 1

Hence X_n →d X, where X is the random variable with

F_X(u) = 0 if u < 1;  1 if u ≥ 1

(b) In this case, lim_{n→∞} F_{X_n}(u) = 0 for every real number u. But there is no distribution function which, at its continuity points, will agree with a function which is identically zero. Hence X_n does not converge in distribution. ■
The reader is cautioned that, if X_n →d X, there is nothing implied about the convergence of the sequence X_n, n ≥ 1, to the random variable X per se. To see this, suppose X is a standard normal random variable. Define a sequence X_n, n ≥ 1, by

X_n = (−1)^n X

Thus, for any sample point s ∈ S, X_n(s) = (−1)^n X(s). Now, since X is N(0, 1), it follows that for every integer n ≥ 1, X_n is N(0, 1). (Recall that if X is N(μ, σ^2), then aX + b is N(aμ + b, a^2 σ^2). Here a = (−1)^n and b = 0.) Therefore, it follows that X_n →d X. However, for every s ∈ S with X(s) ≠ 0, the sequence of real numbers (−1)^n X(s), n ≥ 1, diverges. Hence, P{s | X_n(s) diverges} = P(X_n diverges) = 1. In conclusion, although X_n →d X, the sequence of random variables itself diverges.

Poisson approximation to the binomial

In Chapter 5, we enunciated the postulates under which the random variable representing the number of changes during a given interval of length t has the Poisson distribution. We shall now establish that under certain conditions the binomial probabilities b(k; n, p) can be approximated by the Poisson probabilities if the number of trials n is large. Suppose p depends on n in such a way that λ = np stays fixed as n → ∞.

Let us first find the limit of b(0; n, p):

lim_{n→∞} b(0; n, p) = lim_{n→∞} (1 − p)^n = lim_{n→∞} (1 − λ/n)^n = e^{−λ}

Next, taking the ratio of successive probabilities,

b(k; n, p) / b(k − 1; n, p) = [(n − k + 1)/k] · [p/(1 − p)] = (n − k + 1)λ / [k(n − λ)] → λ/k  as n → ∞

Therefore

lim_{n→∞} b(k; n, p) = (λ/k) lim_{n→∞} b(k − 1; n, p)

Proceeding recursively,

lim_{n→∞} b(k; n, p) = (λ/k) · (λ/(k − 1)) lim_{n→∞} b(k − 2; n, p)
                     = ⋯
                     = (λ/k) · (λ/(k − 1)) ⋯ (λ/1) lim_{n→∞} b(0; n, p)
                     = (λ^k / k!) e^{−λ}

Thus we see that, for large n and small p, the binomial probabilities b(k; n, p) can be approximated by the probabilities of a Poisson random variable with parameter λ (= np). (If X is B(n, p), then E(X) = np and Var(X) = np(1 − p) ≈ np when p is small, and λ is indeed the mean and the variance of the Poisson random variable.)
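The quality of the approximation just derived is easy to check: holding λ = np fixed and letting n grow, b(k; n, p) approaches e^{−λ} λ^k / k!. A short sketch (the choice λ = 2, k = 3 is mine):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """b(k; n, p) = C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """Poisson probability e^(-lam) lam^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

lam = 2.0
for n in (10, 100, 1000):
    p = lam / n           # lam = n*p held fixed as n grows
    print(n, binom_pmf(3, n, p), poisson_pmf(3, lam))
```

Already at n = 100 the two values agree to two decimal places, which is why the Poisson probabilities serve as a practical substitute when n is large and p is small.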
Answers to Selected Exercises (Chapters 5–7)

[The answer-key entries on these scanned pages are illegible.]
Binomial Probabilities b(k; n, p)

[The table entries are illegible in the scan.]
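Since b(k; n, p) = C(n, k) p^k (1 − p)^(n−k), any entry of a binomial probability table can be recomputed directly. A minimal sketch (the choice of n = 5 and the p values shown is mine, for illustration only):

```python
from math import comb

def b(k, n, p):
    """Binomial probability b(k; n, p) = C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Rows k = 0..5 for n = 5 at a few common values of p.
ps = (0.1, 0.3, 0.5)
for k in range(6):
    print(k, *(f"{b(k, 5, p):.4f}" for p in ps))
```

Each column sums to 1, which is a quick sanity check when regenerating table values.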