
PROBABILITY

1. SAMPLE SPACES AND EVENTS


1.1 Random Experiments
An experiment that can result in different outcomes, even though it is repeated in the same
manner every time, is called a random experiment.
1.2 Sample Spaces
The set of all possible outcomes of a random experiment is called the sample space of the
experiment. The sample space is denoted as S.
A sample space is discrete if it consists of a finite or countable infinite set of outcomes.
A sample space is continuous if it contains an interval (either finite or infinite) of real
numbers.
1.3 Events
An event is a subset of the sample space of a random experiment.
Some of the basic set operations are summarized below in terms of events:
The union of two events is the event that consists of all outcomes that are contained in either
of the two events. We denote the union as E1 ∪ E2.
The intersection of two events is the event that consists of all outcomes that are contained in
both of the two events. We denote the intersection as E1 ∩ E2.
The complement of an event in a sample space is the set of outcomes in the sample space
that are not in the event. We denote the complement of the event E as E′.
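As an illustrative sketch (not taken from the text), these set operations on events can be checked with Python's built-in set type; the sample space and events below are hypothetical examples.

```python
# Illustrative sketch: event operations as Python set operations.
# The sample space S (a single die roll) and events E1, E2 are hypothetical.
S = {1, 2, 3, 4, 5, 6}      # sample space of a single die roll
E1 = {2, 4, 6}              # event: the outcome is even
E2 = {4, 5, 6}              # event: the outcome is at least 4

union = E1 | E2             # E1 ∪ E2 = {2, 4, 5, 6}
intersection = E1 & E2      # E1 ∩ E2 = {4, 6}
complement_E1 = S - E1      # E1′ = {1, 3, 5}

print(union, intersection, complement_E1)
```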
1.4 Counting Technique

2. INTERPRETATIONS OF PROBABILITY
2.1 Introduction
Probability is used to quantify the likelihood, or chance, that an outcome of a random
experiment will occur.
Whenever a sample space consists of N possible outcomes that are equally likely, the
probability of each outcome is 1/N.
For a discrete sample space, the probability of an event E, denoted as P(E), equals the sum of
the probabilities of the outcomes in E.
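A minimal worked example of these two rules, assuming a fair six-sided die (the example is illustrative, not from the text): each outcome has probability 1/6, and the probability of an event is the sum over its outcomes.

```python
from fractions import Fraction

# Equally likely outcomes: each of the N = 6 faces has probability 1/N.
S = [1, 2, 3, 4, 5, 6]
p = {outcome: Fraction(1, len(S)) for outcome in S}

# P(E) equals the sum of the probabilities of the outcomes in E.
E = [2, 4, 6]                            # event: the roll is even
P_E = sum(p[outcome] for outcome in E)
print(P_E)                               # 1/2
```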
3. ADDITION RULES

4. CONDITIONAL PROBABILITY

5. MULTIPLICATION AND TOTAL PROBABILITY RULES


5.1 Multiplication Rule

5.2 Total Probability Rule

6. INDEPENDENCE

7. BAYES THEOREM

8. RANDOM VARIABLES

DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS

1. DISCRETE RANDOM VARIABLES


The distribution of the random variables involved in each of these common systems can be
analyzed, and the results of that analysis can be used in different applications and examples.
2. PROBABILITY DISTRIBUTIONS AND PROBABILITY MASS FUNCTIONS
The probability distribution of a random variable X is a description of the probabilities
associated with the possible values of X. For a discrete random variable, the distribution is
often specified by just a list of the possible values along with the probability of each. In some
cases, it is convenient to express the probability in terms of a formula.
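As a hedged sketch, a discrete distribution given "as a list" can be stored as a mapping from possible values to their probabilities; the values and probabilities below are made up for illustration only.

```python
# Hypothetical PMF of a discrete random variable X, specified as a list of
# value/probability pairs (the numbers are illustrative only).
pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

# A valid probability mass function is nonnegative and its values sum to 1.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-9

# P(X <= 2) = f(0) + f(1) + f(2)
print(sum(p for x, p in pmf.items() if x <= 2))   # 0.9963
```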

3. CUMULATIVE DISTRIBUTION FUNCTIONS

4. MEAN AND VARIANCE OF A DISCRETE RANDOM VARIABLE

5. DISCRETE UNIFORM DISTRIBUTION

6. BINOMIAL DISTRIBUTION

7. GEOMETRIC AND NEGATIVE BINOMIAL DISTRIBUTIONS


7.1 Geometric Distribution

7.2 Negative Binomial Distribution

8. HYPERGEOMETRIC DISTRIBUTION

9. POISSON DISTRIBUTION

CONTINUOUS RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS


1. CONTINUOUS RANDOM VARIABLES
A number of continuous distributions frequently arise in applications. These distributions are
described, and example computations of probabilities, means, and variances are provided in
the remaining sections of this chapter.
2. PROBABILITY DISTRIBUTIONS AND PROBABILITY DENSITY FUNCTIONS

3. CUMULATIVE DISTRIBUTION FUNCTIONS

4. MEAN AND VARIANCE OF A CONTINUOUS RANDOM VARIABLE

5. CONTINUOUS UNIFORM DISTRIBUTION

6. NORMAL DISTRIBUTION

7. NORMAL APPROXIMATION TO THE BINOMIAL AND POISSON DISTRIBUTIONS

8. EXPONENTIAL DISTRIBUTION

9. ERLANG DISTRIBUTION
The random variable that equals the interval length until r counts occur in a Poisson process
is an Erlang random variable.
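A minimal sketch of this relationship (illustrative numbers, not from the text): for a Poisson process with rate λ, the waiting time X until the r-th count satisfies P(X > x) = P(fewer than r counts occur in [0, x]), which gives the Erlang cumulative distribution function directly.

```python
from math import exp, factorial

def erlang_cdf(x, r, lam):
    """P(X <= x) for an Erlang random variable: the waiting time until the
    r-th count in a Poisson process with rate lam.  Uses the identity
    P(X > x) = P(fewer than r counts occur in [0, x])."""
    if x <= 0:
        return 0.0
    poisson_tail = sum(exp(-lam * x) * (lam * x) ** k / factorial(k)
                       for k in range(r))
    return 1.0 - poisson_tail

# Example (illustrative numbers): time until the 3rd count at rate 2 per hour.
print(erlang_cdf(2.0, r=3, lam=2.0))   # ~0.7619
```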

10. GAMMA DISTRIBUTION


The Erlang distribution is a special case of the gamma distribution.

11. WEIBULL DISTRIBUTION


The Weibull distribution is often used to model the time until failure of many different
physical systems.
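A minimal sketch of such a time-to-failure calculation, assuming the standard parameterization with shape β and scale δ, where P(T > t) = exp(-(t/δ)^β); the parameter values are illustrative only.

```python
from math import exp

def weibull_reliability(t, beta, delta):
    """P(T > t) for a Weibull time-to-failure T with shape beta and
    scale delta: R(t) = exp(-(t / delta) ** beta)."""
    return exp(-(t / delta) ** beta)

# Illustrative numbers only: probability a unit survives past 5000 hours
# when beta = 1.5 and delta = 8000 hours.
print(weibull_reliability(5000, beta=1.5, delta=8000))   # ~0.61
```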

12. LOGNORMAL DISTRIBUTION
