CHAPTER 5

PROBABILITY
A numerical measure of the likelihood that an event will occur.

RANDOM EXPERIMENT
A process that generates well-defined experimental outcomes. On any single repetition or trial, the
outcome that occurs is determined by chance.

SAMPLE SPACE
The set of all possible experimental outcomes.

EVENT
A collection of outcomes.

PROBABILITY OF AN EVENT
Equal to the sum of the probabilities of outcomes for the event.

COMPLEMENT OF A
The event consisting of all outcomes that are not in A.

VENN DIAGRAM
A graphical representation of the sample space and operations involving events, in which the sample
space is represented by a rectangle and events are represented as circles within the sample space.

UNION OF A AND B
The event containing the outcomes belonging to A or B or both. The union of A and B is denoted by
A∪B.

ADDITION LAW
A probability law used to compute the probability of the union of events. For two events A and B, the
addition law is P(A∪B)=P(A)+P(B)-P(A∩B). For two mutually exclusive events, P(A∩B)=0, so
P(A∪B)=P(A)+P(B).
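As an illustration (not from the text), the addition law can be checked on a hypothetical fair-die sample space, where an event's probability is the sum of its outcome probabilities:

```python
# Sketch: verifying the addition law on an assumed fair six-sided die.
# A = "roll is even", B = "roll is at least 5" (hypothetical events).
sample_space = {1, 2, 3, 4, 5, 6}
p = {outcome: 1 / 6 for outcome in sample_space}

A = {2, 4, 6}
B = {5, 6}

def prob(event):
    """Probability of an event = sum of its outcome probabilities."""
    return sum(p[o] for o in event)

p_union = prob(A) + prob(B) - prob(A & B)   # addition law
assert abs(p_union - prob(A | B)) < 1e-12   # matches direct computation
```

Since A and B share the outcome 6 here, subtracting P(A∩B) prevents that outcome from being counted twice.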

INTERSECTION OF A AND B
The event containing the outcomes belonging to both A and B. The intersection of A and B is denoted
A∩B.

MUTUALLY EXCLUSIVE EVENTS
Events that have no outcomes in common; A∩B is empty and P(A∩B)=0.

CONDITIONAL PROBABILITY
The probability of an event given that another event has already occurred. The conditional
probability of A given B is P(A|B)=P(A∩B)/P(B).
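A minimal numeric sketch of the conditional-probability formula P(A|B)=P(A∩B)/P(B), using assumed values:

```python
# Hypothetical joint and marginal probabilities, chosen for illustration.
p_A_and_B = 0.12   # P(A ∩ B), assumed value
p_B = 0.30         # P(B), assumed value

# Conditional probability of A given B.
p_A_given_B = p_A_and_B / p_B   # = 0.40
```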

JOINT PROBABILITIES
The probability of two events both occurring; in other words, the probability of the intersection of
two events.

INDEPENDENT EVENTS
Two events A and B are independent if P(A|B)=P(A) or, equivalently, P(B|A)=P(B); the events do
not influence each other.

MARGINAL PROBABILITIES
The values in the margins of a joint probability table that provide the probabilities of each event
separately.

MULTIPLICATION LAW
A law used to compute the probability of the intersection of events. For two events A and B, the
multiplication law is P(A∩B)=P(B)P(A|B) or P(A∩B)=P(A)P(B|A). For two independent events, it
reduces to P(A∩B)=P(A)P(B).
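The multiplication law and its independent-events special case can be sketched with assumed probabilities:

```python
# Hypothetical values for P(B) and P(A|B), chosen for illustration.
p_B = 0.30
p_A_given_B = 0.40

# Multiplication law: P(A ∩ B) = P(B)P(A|B).
p_A_and_B = p_B * p_A_given_B       # = 0.12

# For independent events C and D, the law reduces to P(C ∩ D) = P(C)P(D).
p_C, p_D = 0.5, 0.2
p_C_and_D = p_C * p_D               # = 0.10
```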

PRIOR PROBABILITY
Initial estimates of the probabilities of events.

POSTERIOR PROBABILITIES
Revised probabilities of events based on additional information.

BAYES’ THEOREM
A method used to compute posterior probabilities.
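A small numeric sketch of Bayes' theorem; the two prior events and all probability values below are assumptions for illustration:

```python
# Two mutually exclusive, collectively exhaustive events A1 and A2,
# with hypothetical prior probabilities and likelihoods P(D | Ai)
# for some observed event D (the new information).
priors = {"A1": 0.65, "A2": 0.35}
likelihoods = {"A1": 0.02, "A2": 0.05}

# Total probability of D across both prior events.
p_D = sum(priors[a] * likelihoods[a] for a in priors)

# Posterior probabilities P(Ai | D) by Bayes' theorem.
posteriors = {a: priors[a] * likelihoods[a] / p_D for a in priors}
```

The posteriors revise the priors in light of D and, like any probability distribution over exhaustive events, sum to 1.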

RANDOM VARIABLES
A numerical description of the outcome of an experiment.

DISCRETE RANDOM VARIABLE
A random variable that can take on only specified discrete values.

CONTINUOUS RANDOM VARIABLE
A random variable that may assume any numerical value in an interval or collection of intervals. An
interval can include negative and positive infinity.

PROBABILITY DISTRIBUTION
A description of how probabilities are distributed over the values of a random variable.

PROBABILITY MASS FUNCTION
A function, denoted by f(x), that provides the probability that a discrete random variable x assumes
a particular value.

EMPIRICAL PROBABILITY DISTRIBUTION
A probability distribution for which the relative frequency method is used to assign probabilities.

CUSTOM DISCRETE PROBABILITY DISTRIBUTION
A probability distribution for a discrete random variable for which each value xi that the random
variable assumes is associated with a defined probability f(xi).

EXPECTED VALUE
A measure of the central location, or mean, of a random variable.

VARIANCE
A measure of the variability, or dispersion, of a random variable.

STANDARD DEVIATION
Positive square root of the variance.
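The expected value, variance, and standard deviation can all be computed directly from a discrete probability distribution; a minimal sketch with an assumed distribution:

```python
import math

# Hypothetical discrete probability distribution: value -> probability.
dist = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

# Expected value: probability-weighted mean of the values.
mu = sum(x * f for x, f in dist.items())

# Variance: probability-weighted average squared deviation from the mean.
var = sum((x - mu) ** 2 * f for x, f in dist.items())

# Standard deviation: positive square root of the variance.
sigma = math.sqrt(var)
```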

DISCRETE UNIFORM PROBABILITY DISTRIBUTION
A probability distribution in which each possible value of the discrete random variable has the same
probability.

BINOMIAL PROBABILITY DISTRIBUTION
A probability distribution for a discrete random variable showing the probability of x successes in n
trials.
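The binomial probability function can be written directly from its standard formula; a sketch using only the standard library (the function name is an illustration, not from the text):

```python
from math import comb

def binomial_pmf(x, n, p):
    """Probability of exactly x successes in n independent trials,
    each succeeding with probability p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)
```

For example, binomial_pmf(2, 3, 0.5) gives the probability of exactly 2 successes in 3 fair coin flips.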

POISSON PROBABILITY DISTRIBUTION
A probability distribution for a discrete random variable showing the probability of x occurrences of
an event over a specified interval of time or space.
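The Poisson probability function follows the same pattern; a stdlib-only sketch, where mu is the mean number of occurrences per interval:

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    """Probability of exactly x occurrences in an interval,
    given a mean of mu occurrences per interval."""
    return mu**x * exp(-mu) / factorial(x)
```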

PROBABILITY DENSITY FUNCTION
A function used to compute probabilities for a continuous random variable. The area under the graph
of a probability density function over an interval represents probability.

UNIFORM PROBABILITY DISTRIBUTION
A continuous probability distribution for which the probability that the random variable will assume
a value in any interval is the same for each interval of equal length.
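For a uniform random variable on [a, b], an interval's probability is just its length divided by b − a; a sketch (function name and clipping behavior are illustrative assumptions):

```python
def uniform_prob(c, d, a, b):
    """P(c <= X <= d) for X uniform on [a, b]: the probability of an
    interval is its length, clipped to [a, b], divided by b - a."""
    lo, hi = max(c, a), min(d, b)
    return max(hi - lo, 0) / (b - a)
```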

TRIANGULAR PROBABILITY DISTRIBUTION
A continuous probability distribution in which the probability density function is shaped like a
triangle defined by the minimum possible value a, the maximum possible value b, and the most
likely value m. A triangular probability distribution is often used when only subjective estimates are
available for the minimum, maximum, and most likely values.

NORMAL PROBABILITY DISTRIBUTION
A continuous probability distribution in which the probability density function is bell shaped and
determined by its mean μ and standard deviation σ.
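Normal probabilities are areas under the bell curve and are usually read from tables or software; a sketch of the cumulative probability P(X ≤ x) using the standard library's error function:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """P(X <= x) for a normal random variable with mean mu and
    standard deviation sigma, computed via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))
```

For instance, normal_cdf(1.96, 0, 1) is about 0.975, the familiar value from standard normal tables.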

EXPONENTIAL PROBABILITY DISTRIBUTION
A continuous probability distribution that is useful in computing probabilities for the time it takes to
complete a task or the time between arrivals. The mean and standard deviation for an exponential
probability distribution are equal to each other.
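The exponential cumulative probability has a closed form, P(X ≤ x) = 1 − e^(−x/μ), where μ is the mean (and, as noted above, also the standard deviation); a minimal sketch:

```python
from math import exp

def exponential_cdf(x, mu):
    """P(X <= x) for an exponential random variable whose mean
    (and standard deviation) equals mu."""
    return 1 - exp(-x / mu)
```

For example, the probability that an arrival occurs within one mean waiting time is 1 − e^(−1), about 0.632.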