
Kashmir Institute of Technology,
Zakura Campus, Kashmir University,
Jammu and Kashmir.
TABLE OF CONTENTS
INTRODUCTION TO PROBABILITY
EXPECTATION
DEPENDENT EVENT
INDEPENDENT EVENT
CONDITIONAL PROBABILITY
LAWS OF PROBABILITY
THE ADDITION LAW OF PROBABILITY
THE MULTIPLICATION LAW OF PROBABILITY
TYPES OF PROBABILITY
MARGINAL PROBABILITY
JOINT PROBABILITY
CONDITIONAL PROBABILITY
BAYES' THEOREM
RANDOM VARIABLE
DISCRETE PROBABILITY DISTRIBUTIONS
CONTINUOUS PROBABILITY DISTRIBUTION
GRAPHICAL INTERPRETATION
EXPECTATION
BINOMIAL DISTRIBUTION
INDUSTRIAL INSPECTION
POISSON DISTRIBUTION
NORMAL DISTRIBUTION
INTRODUCTION TO PROBABILITY:
Probability is a concept which numerically measures the degree of certainty, and therefore of uncertainty, of the occurrence of events. The probability of something happening is the likelihood or chance of its happening.
If an event A can happen in m ways and fail in n ways, all these ways being equally likely to occur, then the probability of the happening of A is

p = number of favourable cases / total number of mutually exclusive and equally likely cases = m / (m + n)

and that of its failing is defined as

q = n / (m + n)
Values of probability lie between 0 and 1, where 0 represents an absolute impossibility and 1 represents an absolute certainty. The probability of an event happening usually lies somewhere between these two extreme values and is expressed as either a proper fraction or a decimal fraction. Examples of probability are:

EXAMPLES                                                  PROBABILITY

a length of copper wire has zero resistance at 100°C      0

a fair, six-sided dice will stop with a 3 upwards         1/6 or 0.1667

a fair coin will land with a head upwards                 1/2 or 0.5

a length of copper wire has some resistance at 100°C      1

If p is the probability of an event happening and q is the probability of the same event
not happening, then the total probability is p+q and is equal to unity, since it is an
absolute certainty that the event either does or does not occur, i.e.

p+q=1

EXPECTATION

The expectation, E, of an event happening is defined in general terms as the product of the probability p of an event happening and the number of attempts made, n, i.e.

E = pn.

Thus, since the probability of obtaining a 3 upwards when rolling a fair dice is 1/6, the expectation of getting a 3 upwards on four throws of the dice is (1/6) × 4, i.e. 2/3. Thus expectation is the average occurrence of an event.

DEPENDENT EVENT

A dependent event is one in which the probability of an event happening affects the
probability of another event happening. Let 5 transistors be taken at random from a
batch of 100 transistors for test purposes, and the probability of there being a defective
transistor p1, be determined. At some later time, let another 5 transistors be taken at
random from the 95 remaining transistors in the batch and the probability of there being
a defective transistor, p2, be determined. The value of p2 is different from p1 since the batch size has effectively altered from 100 to 95, i.e. probability p2 is dependent on probability p1. Since 5 transistors are drawn, and then another 5 transistors are drawn without replacing the first 5, the second random selection is said to be without replacement.
INDEPENDENT EVENT

An independent event is one in which the probability of an event happening does not
affect the probability of another event happening. If 5 transistors are taken at random
from a batch of transistors and the probability of a defective transistor p1 is determined
and the process is repeated after the original 5 have been replaced in the batch to give
p2, then p1 is equal to p2. Since the 5 transistors are replaced between draws, the
second selection is said to be with replacement.

CONDITIONAL PROBABILITY

Conditional probability is concerned with the probability of say event B occurring,


given that event A has already taken place. If A and B are independent events, then the fact that event A has already occurred will not affect the probability of event B. If A and B are dependent events, then event A having occurred will affect the probability of event B.

LAWS OF PROBABILITY:

THE ADDITION LAW OF PROBABILITY

The addition law of probability is recognized by the word 'or' joining the probabilities. If pA is the probability of event A happening and pB is the probability of event B happening, the probability of event A or event B happening is given by pA + pB (provided events A and B are mutually exclusive, i.e. A and B are events which cannot occur together). Similarly, the probability of events A or B or C or ... or N happening is given by

pA + pB + pC + ... + pN

THE MULTIPLICATION LAW OF PROBABILITY

The multiplication law of probability is recognized by the word 'and' joining the probabilities. If pA is the probability of event A happening and pB is the probability of event B happening, the probability of event A and event B happening is given by pA × pB. Similarly, the probability of events A and B and C and ... and N happening is given by

pA × pB × pC × ... × pN
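The two laws can be sketched with Python's fractions module (an illustrative sketch, not from the text): on a fair dice, the events "roll a 1" and "roll a 2" are mutually exclusive, while two separate rolls are independent.

```python
# Addition and multiplication laws of probability on a fair dice
# (illustrative sketch; event names are assumptions).
from fractions import Fraction

p_A = Fraction(1, 6)   # probability of rolling a 1
p_B = Fraction(1, 6)   # probability of rolling a 2

# Addition law: A and B cannot occur together on one roll,
# so P(A or B) = pA + pB
p_A_or_B = p_A + p_B

# Multiplication law: two independent rolls,
# so P(A and B) = pA * pB
p_A_and_B = p_A * p_B

print(p_A_or_B)    # 1/3
print(p_A_and_B)   # 1/36
```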

PROBLEMS
Problem 1:
Determine the probabilities of selecting at random
(a) a man, and
(b) a woman from a crowd containing 20 men and 33 women.

SOLUTION:

(a) The probability of selecting at random a man, p, is given by the ratio:

number of men / number in crowd

i.e. p = 20 / (20 + 33) = 20/53 or 0.3774

(b) The probability of selecting at random a woman, q, is given by the ratio:

number of women / number in crowd

i.e. q = 33 / (20 + 33) = 33/53 or 0.6226

(Check: the total probability should be equal to 1;

p = 20/53 and q = 33/53

thus the total probability,

p + q = 20/53 + 33/53 = 1

hence no obvious error has been made.)

Problem 2.
Find the expectation of obtaining a 4 upwards with 3 throws of a fair dice.

SOLUTION:

Expectation is the average occurrence of an event and is defined as the probability times the number of attempts. The probability, p, of obtaining a 4 upwards for one throw of the dice is 1/6. Also, 3 attempts are made, hence n = 3 and the expectation, E, is pn, i.e.

E = (1/6) × 3 = 1/2 or 0.50

TYPES OF PROBABILITY:
Probabilities may be marginal, joint or conditional. Understanding their differences and
how to manipulate among them is the key to success in understanding the foundations
of statistics.

MARGINAL PROBABILITY:

It may be thought of as an unconditional probability: it is not conditioned on another event. If A is an event, then the probability of that event occurring, p(A), is a marginal probability. Example:

the probability that a card drawn is red

p(red) = 0.5

Another example: the probability that a card drawn is a four

p(four) = 1/13.
JOINT PROBABILITY:

It is the probability of the intersection of two or more events. If A and B are the events, then the probability of event A and event B occurring is a joint probability. The probability of the intersection of A and B may be written p(A ∩ B). Example:

the probability that a card is a four and red

p(four and red) = 2/52 = 1/26.

(There are two red fours in a deck of 52, the 4 of hearts and the 4 of diamonds.)

CONDITIONAL PROBABILITY:

If A and B are two events, then the conditional probability p(A|B) is the probability of event A occurring, given that event B occurs. Example:

given that you drew a red card, what is the probability that it is a four?

p(four|red) = 2/26 = 1/13.

So out of the 26 red cards (given a red card), there are two fours, so 2/26 = 1/13.

The equation below is a means to manipulate among joint, conditional and marginal probabilities. As you can see in the equation, the conditional probability of A given B is equal to the joint probability of A and B divided by the marginal probability of B. Let's use our card example to illustrate. We know that the conditional probability of a four, given a red card, equals 2/26 or 1/13. This should be equivalent to the joint probability of a four and red (2/52 or 1/26) divided by the marginal P(red) = 1/2. And lo and behold, it works, as 1/13 = 1/26 divided by 1/2. For the diagnostic exam, you should be able to manipulate among joint, marginal and conditional probabilities.

P(A | B) = P(A and B) / P(B)
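The card figures above can be verified by enumerating a full 52-card deck (an illustrative sketch; the rank and suit names are assumptions of the example):

```python
# Marginal, joint and conditional probabilities from a full deck
# (illustrative sketch).
from fractions import Fraction
from itertools import product

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['hearts', 'diamonds', 'clubs', 'spades']   # hearts/diamonds are red
deck = list(product(ranks, suits))                  # 52 cards

red = [c for c in deck if c[1] in ('hearts', 'diamonds')]
red_fours = [c for c in red if c[0] == '4']

p_red = Fraction(len(red), len(deck))                 # marginal: 1/2
p_four_and_red = Fraction(len(red_fours), len(deck))  # joint: 1/26
p_four_given_red = p_four_and_red / p_red             # conditional: 1/13

print(p_red, p_four_and_red, p_four_given_red)   # 1/2 1/26 1/13
```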

BAYES' THEOREM:

An event A corresponds to a number of exhaustive events B1, B2, B3, ..., Bn. If P(Bi) and P(A|Bi) are given, then

P(Bi|A) = P(Bi) P(A|Bi) / Σi P(Bi) P(A|Bi)

PROOF:

By the multiplication law of probability,

P(ABi) = P(A) P(Bi|A) = P(Bi) P(A|Bi)    (1)

so that

P(Bi|A) = P(Bi) P(A|Bi) / P(A)    (2)

Since the event A corresponds to B1, B2, B3, ..., Bn, we have by the addition law of probability,

P(A) = P(AB1) + P(AB2) + ... + P(ABn) = Σi P(ABi) = Σi P(Bi) P(A|Bi)    [by (1)]

Hence from (2), we have

P(Bi|A) = P(Bi) P(A|Bi) / Σi P(Bi) P(A|Bi)

which is known as the theorem of inverse probability.

Problem 3:
There are three bags: the first containing 1 white, 2 red and 3 green balls; the second 2 white, 3 red and 1 green; and the third 3 white, 1 red and 2 green. Two balls are drawn from a bag chosen at random. These are found to be one white and one red. Find the probability that the balls so drawn came from the second bag.

SOLUTION:

Let B1, B2, B3 pertain to the first, second and third bags being chosen, and A: the two balls drawn are one white and one red.

Now P(B1) = P(B2) = P(B3) = 1/3

P(A|B1) = P(a white and a red ball are drawn from the first bag) = (1C1 × 2C1) / 6C2 = 2/15

Similarly P(A|B2) = (2C1 × 3C1) / 6C2 = 2/5, and P(A|B3) = (3C1 × 1C1) / 6C2 = 1/5

By Bayes' theorem,

P(B2|A) = P(B2) P(A|B2) / [P(B1) P(A|B1) + P(B2) P(A|B2) + P(B3) P(A|B3)]

= (1/3 × 2/5) / (1/3 × 2/15 + 1/3 × 2/5 + 1/3 × 1/5) = 6/11.
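The hand calculation above can be checked numerically, using math.comb for the nCr counts (an illustrative sketch):

```python
# Bayes' theorem check for Problem 3 (illustrative sketch).
from fractions import Fraction
from math import comb

bags = {                       # (white, red, green) balls in each bag
    1: (1, 2, 3),
    2: (2, 3, 1),
    3: (3, 1, 2),
}
prior = Fraction(1, 3)         # each bag equally likely to be chosen

def p_white_and_red(bag):
    """P(one white and one red | this bag) = (wC1 * rC1) / totalC2."""
    w, r, g = bags[bag]
    total = w + r + g
    return Fraction(comb(w, 1) * comb(r, 1), comb(total, 2))

likelihoods = {b: p_white_and_red(b) for b in bags}
evidence = sum(prior * likelihoods[b] for b in bags)   # P(A)
posterior_bag2 = prior * likelihoods[2] / evidence     # P(B2 | A)

print(posterior_bag2)   # 6/11
```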

RANDOM VARIABLE:

Suppose that to each point of a sample space we assign a number. We then have a
function defined on the sample space. This function is called a random variable (or
stochastic variable) or more precisely a random function (stochastic function). It is
usually denoted by a capital letter such as X or Y. In general, a random variable has
some specified physical, geometrical, or other significance.

EXAMPLE 2.1
Suppose that a coin is tossed twice so that the sample space is
S = {HH, HT, TH, TT}. Let X represent the number of heads that can come up. With each sample point we can associate a number for X as shown in Table 2-1. Thus, for example, in the case of HH (i.e., 2 heads), X = 2, while for TH (1 head), X = 1. It follows that X is a random variable.
SAMPLE POINT   HH   HT   TH   TT
X               2    1    1    0

Table 2-1

It should be noted that many other random variables could also be defined on this
sample space, for example, the square of the number of heads or the number of heads
minus the number of tails. A random variable that takes on a finite or countably infinite number of values is called a discrete random variable, while one which takes on a noncountably infinite number of values is called a nondiscrete random variable.
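The mapping in Example 2.1 can be written out directly (an illustrative sketch):

```python
# Random variable X = number of heads over the sample space of
# two coin tosses, as in Example 2.1 (illustrative sketch).
from itertools import product

sample_space = [''.join(t) for t in product('HT', repeat=2)]
X = {point: point.count('H') for point in sample_space}

print(X)   # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
```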

DISCRETE PROBABILITY DISTRIBUTIONS:


Let X be a discrete random variable, and suppose that the possible values that it can assume are given by x1, x2, x3, ..., arranged in some order. Suppose also that these values are assumed with probabilities given by

P(X = xk) = f(xk),   k = 1, 2, ...    (1)

It is convenient to introduce the probability function, also referred to as the probability distribution, given by

P(X = x) = f(x)    (2)

For x = xk, this reduces to (1), while for other values of x, f(x) = 0.

In general, f(x) is a probability function if

1. f(x) ≥ 0

2. Σx f(x) = 1

where the sum in 2 is taken over all possible values of x.

EXAMPLE 2.2
Find the probability function corresponding to the random variable X of Example
2.1.

SOLUTION

Assuming that the coin is fair, we have

P(HH) = P(HT) = P(TH) = P(TT) = 1/4

Then

P(X = 0) = P(TT) = 1/4
P(X = 1) = P(HT ∪ TH) = P(HT) + P(TH) = 1/4 + 1/4 = 1/2
P(X = 2) = P(HH) = 1/4

The probability function is thus given by Table 2-2.

x       0     1     2
f(x)   1/4   1/2   1/4
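The same probability function can be obtained by counting equally likely sample points (an illustrative sketch):

```python
# Probability function f(x) for X = number of heads in two tosses
# of a fair coin, by counting sample points (illustrative sketch).
from collections import Counter
from fractions import Fraction
from itertools import product

sample_space = [''.join(t) for t in product('HT', repeat=2)]
counts = Counter(point.count('H') for point in sample_space)
f = {x: Fraction(c, len(sample_space)) for x, c in sorted(counts.items())}

print(f)   # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
print(sum(f.values()))   # 1 (Property 2)
```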
CONTINUOUS PROBABILITY DISTRIBUTION:

A nondiscrete random variable X is said to be absolutely continuous, or simply continuous, if its distribution function may be represented as

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(u) du    (−∞ < x < ∞)    (7)

where the function f(x) has the properties

1. f(x) ≥ 0

2. ∫_{−∞}^{∞} f(x) dx = 1

It follows from the above that if X is a continuous random variable, then the probability that X takes on any one particular value is zero, whereas the interval probability that X lies between two different values, say a and b, is given by

P(a < X < b) = ∫_{a}^{b} f(x) dx    (8)

EXAMPLE 2.4
If an individual is selected at random from a large group of adult males, the probability that his height X is precisely 68 inches (i.e., 68.000 . . . inches) would be zero. However, there is a probability greater than zero that X is between 67.000 . . . inches and 68.500 . . . inches, for example.
A function f(x) that satisfies the above requirements is called a probability
function or probability distribution for a continuous random variable, but it is
more often called a probability density function or simply density function. Any
function f(x) satisfying Properties 1 and 2 above will automatically be a density
function, and required probabilities can then be obtained from (8).
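As a minimal illustration of Properties 1 and 2 and of equation (8), consider the density f(x) = 2x on [0, 1] (a hypothetical example, not from the text), whose distribution function is F(x) = x^2:

```python
# Interval probabilities for the density f(x) = 2x on [0, 1]
# via its closed-form distribution function F(x) = x^2
# (hypothetical example, illustrative sketch).

def F(x):
    """Distribution function for f(x) = 2x on [0, 1]."""
    if x <= 0:
        return 0.0
    if x >= 1:
        return 1.0
    return x * x

# Property 2: total probability is F(1) - F(0) = 1
print(F(1.0) - F(0.0))    # 1.0

# Equation (8): P(0.25 < X < 0.5) = F(0.5) - F(0.25)
print(F(0.5) - F(0.25))   # 0.1875
```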

GRAPHICAL INTERPRETATION

If f(x) is the density function for a random variable X, then we can represent y = f(x) graphically by a curve as in Fig. 2-2. Since f(x) ≥ 0, the curve cannot fall below the x axis. The entire area bounded by the curve and the x axis must be 1 because of Property 2. Geometrically, the probability that X is between a and b, i.e. P(a < X < b), is then represented by the area shown shaded in Fig. 2-2.

The distribution function F(x) = P(X ≤ x) is a monotonically increasing function which increases from 0 to 1 and is represented by a curve as in Fig. 2-3.
EXPECTATION:
The mean value (μ) of the probability distribution of a variate X is commonly known as its expectation and is denoted by E(X). If f(x) is the probability density function of the variate X, then

μ = E(X) = Σi xi f(xi)    (discrete case)

or μ = E(X) = ∫ x f(x) dx    (continuous case)

In general, the expectation of any function φ(x) is given by

E[φ(x)] = Σi φ(xi) f(xi)

or E[φ(x)] = ∫ φ(x) f(x) dx

2. The VARIANCE of a distribution is given by

σ^2 = Σi (xi − μ)^2 f(xi)

or σ^2 = ∫ (x − μ)^2 f(x) dx

where σ is the standard deviation of the distribution.

3. The rth moment about the mean (denoted by μr) is defined by

μr = Σi (xi − μ)^r f(xi)

or μr = ∫ (x − μ)^r f(x) dx

4. The mean deviation from the mean is given by

Σi |xi − μ| f(xi)

or ∫ |x − μ| f(x) dx
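Applying these formulas to the discrete distribution of Example 2.2 (x = 0, 1, 2 heads) gives a worked check (an illustrative sketch):

```python
# Mean, variance and mean deviation for the discrete distribution
# of Example 2.2 (illustrative sketch).
from fractions import Fraction

f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

mu = sum(x * p for x, p in f.items())                # E(X)
var = sum((x - mu) ** 2 * p for x, p in f.items())   # sigma^2
mean_dev = sum(abs(x - mu) * p for x, p in f.items())

print(mu)         # 1
print(var)        # 1/2
print(mean_dev)   # 1/2
```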

BINOMIAL DISTRIBUTION:

The binomial distribution deals with two numbers only, these being the probability that an event will happen, p, and the probability that an event will not happen, q. Thus, when a coin is tossed, if p is the probability of the coin landing with a head upwards, q is the probability of the coin landing with a tail upwards. p + q must always be equal to unity. A binomial distribution can be used for finding, say, the probability of getting three heads in seven tosses of the coin, or in industry for determining defect rates as a result of sampling. One way of defining a binomial distribution is as follows:

if p is the probability that an event will happen and q is the probability that the event will not happen, then the probabilities that the event will happen 0, 1, 2, 3, ..., n times in n trials are given by the successive terms of the expansion of (q + p)^n, taken from left to right.

The binomial expansion of (q + p)^n is:

q^n + n q^(n−1) p + [n(n−1)/2!] q^(n−2) p^2 + [n(n−1)(n−2)/3!] q^(n−3) p^3 + ...

This concept of a binomial distribution is used in Problem 1.

Problem 1:
Determine the probabilities of having (a) at least 1 girl and (b) at least 1 girl
and 1 boy in a family of 4 children, assuming equal probability of male and
female birth.

SOLUTION

The probability of a girl being born, p, is 0.5 and the probability of a girl not being born (male birth), q, is also 0.5. The number in the family, n, is 4. From above, the probabilities of 0, 1, 2, 3 or 4 girls in a family of 4 are given by the successive terms of the expansion of (q + p)^4, taken from left to right. From the binomial expansion:

(q + p)^4 = q^4 + 4q^3 p + 6q^2 p^2 + 4q p^3 + p^4

Hence the probability of no girls is q^4, i.e. 0.5^4 = 0.0625

the probability of 1 girl is 4q^3 p, i.e. 4 × 0.5^3 × 0.5 = 0.2500

the probability of 2 girls is 6q^2 p^2, i.e. 6 × 0.5^2 × 0.5^2 = 0.3750

the probability of 3 girls is 4q p^3, i.e. 4 × 0.5 × 0.5^3 = 0.2500

the probability of 4 girls is p^4, i.e. 0.5^4 = 0.0625

Total probability, (q + p)^4 = 1.0000

(a) The probability of having at least one girl is the sum of the probabilities of having 1, 2, 3 and 4 girls, i.e.

0.2500 + 0.3750 + 0.2500 + 0.0625 = 0.9375

(Alternatively, the probability of having at least 1 girl is 1 − (the probability of having no girls), i.e. 1 − 0.0625, giving 0.9375, as obtained previously.)

(b) The probability of having at least 1 girl and 1 boy is given by the sum of the probabilities of having: 1 girl and 3 boys, 2 girls and 2 boys, and 3 girls and 1 boy, i.e.

0.2500 + 0.3750 + 0.2500 = 0.8750

(Alternatively, this is also 1 − (probability of having no girls + probability of having no boys), i.e. 1 − 2 × 0.0625 = 0.8750, as obtained previously.)
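The same probabilities follow from the general binomial term nCk q^(n−k) p^k, computed here with math.comb (an illustrative sketch):

```python
# Binomial check of the family-of-4 problem using math.comb
# (illustrative sketch).
from math import comb

p = q = 0.5   # probability of a girl / of a boy
n = 4         # children in the family

# Successive terms of (q + p)^n: probabilities of k = 0..4 girls
probs = [comb(n, k) * q ** (n - k) * p ** k for k in range(n + 1)]

at_least_one_girl = 1 - probs[0]            # (a)
at_least_one_each = 1 - probs[0] - probs[4] # (b)

print(round(at_least_one_girl, 4))   # 0.9375
print(round(at_least_one_each, 4))   # 0.875
```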
INDUSTRIAL INSPECTION

In industrial inspection, p is often taken as the probability that a component is defective and q is the probability that the component is satisfactory. In this case, a binomial distribution may be defined as:

the probabilities that 0, 1, 2, 3, ..., n components are defective in a sample of n components, drawn at random from a large batch of components, are given by the successive terms of the expansion of (q + p)^n, taken from left to right.

Problem 3:
A machine is producing a large number of bolts automatically. In a box of
these bolts, 95% are within the allowable tolerance values with respect to
diameter, the remainder being outside of the diameter tolerance values.
Seven bolts are drawn at random from the box. Determine the probabilities
that (a) two and (b) more than two of the seven bolts are outside of the
diameter tolerance values.

SOLUTION

Let p be the probability that a bolt is outside of the allowable tolerance values, i.e. is defective, and let q be the probability that a bolt is within the tolerance values, i.e. is satisfactory. Then p = 5%, i.e. 0.05 per unit, and q = 95%, i.e. 0.95 per unit. The sample number is 7. The probabilities of drawing 0, 1, 2, ..., n defective bolts are given by the successive terms of the expansion of (q + p)^n, taken from left to right. In this problem

(q + p)^n = (0.95 + 0.05)^7

= (0.95)^7 + 7(0.95)^6(0.05) + 21(0.95)^5(0.05)^2 + ...

Thus the probability of no defective bolts is

(0.95)^7 = 0.6983

The probability of 1 defective bolt is

7 × (0.95)^6 × 0.05 = 0.2573

The probability of 2 defective bolts is

21 × (0.95)^5 × (0.05)^2 = 0.0406

and so on.

(a) The probability that two bolts are outside of the diameter tolerance values is 0.0406

(b) To determine the probability that more than two bolts are defective, the sum of the probabilities of 3 bolts, 4 bolts, 5 bolts, 6 bolts and 7 bolts being defective can be determined. An easier way to find this sum is to find 1 − (the sum of the probabilities of 0 bolts, 1 bolt and 2 bolts being defective), since the sum of all the terms is unity. Thus, the probability of there being more than two bolts outside of the tolerance values is:

1 − (0.6983 + 0.2573 + 0.0406), i.e. 0.0038
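The expansion terms can be generated rather than written out by hand (an illustrative sketch using math.comb):

```python
# Binomial probabilities for the bolts of Problem 3, computed from
# the general term nCk * q^(n-k) * p^k (illustrative sketch).
from math import comb

p, q, n = 0.05, 0.95, 7   # defective, satisfactory, sample size

def binom(k):
    """Probability of exactly k defective bolts in the sample."""
    return comb(n, k) * q ** (n - k) * p ** k

print(round(binom(0), 4))   # 0.6983
print(round(binom(1), 4))   # 0.2573
print(round(binom(2), 4))   # 0.0406

more_than_two = 1 - binom(0) - binom(1) - binom(2)
print(round(more_than_two, 4))   # 0.0038
```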


POISSON DISTRIBUTION:
When the number of trials, n, in a binomial distribution becomes large (usually taken as larger than 10), the calculations associated with determining the values of the terms become laborious. If n is large and p is small, and the product np is less than 5, a very good approximation to a binomial distribution is given by the corresponding Poisson distribution, in which calculations are usually simpler. The Poisson approximation to a binomial distribution may be defined as follows:

the probabilities that an event will happen 0, 1, 2, 3, ..., n times in n trials are given by the successive terms of the expression

e^(−λ) (1 + λ + λ^2/2! + λ^3/3! + ...)

taken from left to right.

The symbol λ is the expectation of an event happening and is equal to np.

Problem 6.
If 3% of the gearwheels produced by a company are defective, determine the
probabilities that in a sample of 80 gearwheels (a) two and (b) more than two
will be defective.

SOLUTION

The sample number, n, is large, the probability of a defective gearwheel, p, is small, and the product np is 80 × 0.03, i.e. 2.4, which is less than 5. Hence a Poisson approximation to a binomial distribution may be used. The expectation of a defective gearwheel, λ = np = 2.4

The probabilities of 0, 1, 2, ... defective gearwheels are given by the successive terms of the expression

e^(−λ) (1 + λ + λ^2/2! + λ^3/3! + ...)

taken from left to right, i.e. by

e^(−λ), λ e^(−λ), λ^2 e^(−λ)/2!, ...

Thus the probability of no defective gearwheels is

e^(−λ) = e^(−2.4) = 0.0907

the probability of 1 defective gearwheel is

λ e^(−λ) = 2.4 e^(−2.4) = 0.2177

the probability of 2 defective gearwheels is

λ^2 e^(−λ)/2! = (2.4)^2 e^(−2.4)/2! = 0.2613

(a) The probability of having 2 defective gearwheels is 0.2613

(b) The probability of having more than 2 defective gearwheels is 1 − (the sum of the probabilities of having 0, 1 and 2 defective gearwheels), i.e.

1 − (0.0907 + 0.2177 + 0.2613), that is, 0.4303
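The Poisson terms can be computed directly from λ = np (an illustrative sketch):

```python
# Poisson approximation for the gearwheels of Problem 6, with
# lambda = np = 2.4 (illustrative sketch).
from math import exp, factorial

lam = 80 * 0.03   # np = 2.4

def poisson(k):
    """Probability of exactly k defective gearwheels."""
    return exp(-lam) * lam ** k / factorial(k)

print(round(poisson(0), 4))   # 0.0907
print(round(poisson(1), 4))   # 0.2177
print(round(poisson(2), 4))   # 0.2613

more_than_two = 1 - sum(poisson(k) for k in range(3))
print(round(more_than_two, 4))   # 0.4303
```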


The principal use of a Poisson distribution is to determine the theoretical probabilities when p, the probability of an event happening, is known, but q, the probability of the event not happening, is unknown. For example, the average number of goals scored per match by a football team can be calculated, but it is not possible to quantify the number of goals which were not scored. In this type of problem, a Poisson distribution may be defined as follows:

the probabilities of an event occurring 0, 1, 2, 3, ... times are given by the successive terms of the expression

e^(−λ) (1 + λ + λ^2/2! + λ^3/3! + ...)

taken from left to right.

The symbol λ is the value of the average occurrence of the event.

NORMAL DISTRIBUTION:
When data is obtained, it can frequently be considered to be a sample (i.e. a few
members) drawn at random from a large population (i.e. a set having many
members). If the sample number is large, it is theoretically possible to choose
class intervals which are very small, but which still have a number of members
falling within each class. A frequency polygon of this data then has a large
number of small line segments and approximates to a continuous curve. Such a
curve is called a frequency or a distribution curve.
An extremely important symmetrical distribution curve is called the normal
curve and is as shown in Fig. 58.1.
This curve can be described by a mathematical equation and is the basis of much
of the work done in more advanced statistics.
Many natural occurrences such as the heights or weights of a group of people, the
sizes of components produced by a particular machine and the life length of
certain components approximate to a normal distribution.

Figure 58.1

Normal distribution curves can differ from one another in the following four
ways:

(a) by having different mean values

(b) by having different values of standard deviations

(c) the variables having different values and different units and

(d) by having different areas between the curve and the horizontal axis.

A normal distribution curve is standardized as follows:

(a) The mean value of the unstandardized curve is made the origin, thus making the mean value, x̄, zero.

(b) The horizontal axis is scaled in standard deviations. This is done by letting

z = (x − x̄) / σ

where z is called the normal standard variate, x is the value of the variable, x̄ is the mean value of the distribution and σ is the standard deviation of the distribution.

(c) The area between the normal curve and the horizontal axis is made equal to unity.

When a normal distribution curve has been standardized, the normal curve is called a standardized normal curve or a normal probability curve, and any normally distributed data may be represented by the same normal probability curve. The area under part of a normal probability curve is directly proportional to probability, and the value of the shaded area shown in Fig. 58.2 can be determined by evaluating:

(1/√(2π)) ∫ e^(−z^2/2) dz,  where z = (x − x̄)/σ
Figure 58.2

Problem 1.
The mean height of 500 people is 170 cm and the standard deviation is 9 cm.
Assuming the heights are normally distributed, determine the number of
people likely to have heights between 150cm and 195cm.

SOLUTION

The mean value, x̄, is 170 cm and corresponds to a normal standard variate value, z, of zero on the standardized normal curve. A height of 150 cm has a z-value given by

z = (x − x̄)/σ = (150 − 170)/9, i.e. −2.22 standard deviations.

Using a table of partial areas beneath the standardized normal curve, a z-value of −2.22 corresponds to an area of 0.4868 between the mean value and the ordinate z = −2.22. The negative z-value shows that it lies to the left of the z = 0 ordinate. This area is shown shaded in Fig. 58.3(a). Similarly, 195 cm has a z-value of (195 − 170)/9, that is 2.78 standard deviations. From Table 58.1, this value of z corresponds to an area of 0.4973, the positive value of z showing that it lies to the right of the z = 0 ordinate. This area is shown shaded in Fig. 58.3(b). The total area shaded in Figs. 58.3(a) and (b) is shown in Fig. 58.3(c) and is 0.4868 + 0.4973, i.e. 0.9841 of the total area beneath the curve.

However, the area is directly proportional to probability. Thus, the probability that a person will have a height of between 150 and 195 cm is 0.9841. For a group of 500 people, 500 × 0.9841, i.e. 492 people are likely to have heights in this range. The value of 500 × 0.9841 is 492.05, but since answers based on a normal probability distribution can only be approximate, results are usually given correct to the nearest whole number.
Figure 58.3
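Instead of a table of partial areas, the standard normal cumulative distribution function can be built from math.erf, using Φ(z) = (1 + erf(z/√2))/2 (an illustrative sketch; small differences from the tabulated values are rounding):

```python
# Checking Problem 1 with the standard normal CDF built from math.erf
# (illustrative sketch; no table of partial areas needed).
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return (1 + erf(z / sqrt(2))) / 2

mean, sd = 170, 9
z_low = (150 - mean) / sd    # about -2.22
z_high = (195 - mean) / sd   # about 2.78

p = phi(z_high) - phi(z_low)   # P(150 < height < 195)
print(round(p, 4))             # about 0.9841
print(round(500 * p))          # about 492 people
```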

_____________________________ END . __________________________