Random Processes
Outline for first part
• Axioms
• Conditional Probability
• Independence
• Sequences of Independent Experiments
• Random Variables
• What is P(A ∪ B)?
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
• What is P(A ∪ B ∪ C)?
P(A ∪ B ∪ C) = ?
• What is P(A|B)?
P(A|B) = P(A ∩ B)/P(B)
Two fair dice are rolled. Let X1 denote the number that shows up
on die 1, and let X2 be the number that shows up on die 2. Define
the events A: X1 ≥ 4 and B: X1 + X2 is even.
Find the following:
• P(A) = ?
• P(B) = ?
• P(A ∩ B) = ?
• P(A|B) = ?
• P(A|B^c) = ?
Can you express P(A) in terms of the conditional probabilities? Can
you express P(A) in terms of P(A ∩ B) and P(A ∩ B^c)? (A check by
enumeration follows below.)
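A quick way to check these answers is to enumerate the 36 equally likely outcomes directly. The Python sketch below is not part of the original slides; it simply counts outcomes.

# Enumerate all 36 equally likely outcomes of two fair dice and
# check the probabilities asked for above.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # all (X1, X2) pairs

def prob(event):
    # probability = fraction of equally likely outcomes in the event
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: o[0] >= 4                  # A: X1 >= 4
B = lambda o: (o[0] + o[1]) % 2 == 0     # B: X1 + X2 even

P_A, P_B = prob(A), prob(B)
P_AB  = prob(lambda o: A(o) and B(o))
P_ABc = prob(lambda o: A(o) and not B(o))
print(P_A, P_B, P_AB)        # 0.5 0.5 0.25
print(P_AB / P_B)            # P(A|B)   = 0.5
print(P_ABc / (1 - P_B))     # P(A|B^c) = 0.5
print(P_AB + P_ABc)          # equals P(A): total probability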
• X is transmitted symbol (0 or 1)
• Y is received symbol (0 or 1)
• Channel noise may cause X and Y to be different
• Sample space Ω = {(X, Y )} = {(0, 0), (0, 1), (1, 0), (1, 1)}
• Suppose by design, P (X = 0) = P (X = 1) = 0.5, and from
measurements,
P (X = 0, Y = 0) =
• What is P(Y = 0)?
• This is an example of the theorem of total probability, which is
useful for finding unconditional probabilities from conditional
probabilities
P(X = 1|Y = 1) = ? (Bayes' rule; see the numeric sketch below)
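The measured joint probabilities are not preserved in these notes, so the sketch below assumes a symmetric crossover probability eps = 0.1 purely for illustration; it works through total probability and Bayes' rule on this channel.

# Binary channel sketch; eps = 0.1 is an assumed crossover probability.
eps = 0.1
P_X = {0: 0.5, 1: 0.5}                  # by design

# joint PMF P(X = x, Y = y): Y = X with prob 1-eps, flipped with prob eps
P_XY = {(x, y): P_X[x] * ((1 - eps) if y == x else eps)
        for x in (0, 1) for y in (0, 1)}

# theorem of total probability: P(Y = 0) = sum over x of P(X = x, Y = 0)
P_Y0 = P_XY[(0, 0)] + P_XY[(1, 0)]
print(P_Y0)                             # 0.5 here, by symmetry

# Bayes' rule: P(X = 1 | Y = 1) = P(X = 1, Y = 1) / P(Y = 1)
P_Y1 = 1 - P_Y0
print(P_XY[(1, 1)] / P_Y1)              # 0.9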
P (A ∩ B) = P (A)P (B)
P (A|B) = P (A)
P (B|A) = P (B)
p_n(k) = ?
p_8(k) = ?
P(a − ε < X ≤ a) = FX(a) − FX(a − ε)
P (X = a) = FX (a) − FX (a−)
PX(x) = 1/(b − a + 1) for all values a ≤ x ≤ b.
Pascal RV
or equivalently,
∫_{−∞}^{∞} f(y) δ(x − y) dy = f(x)
X = σY + µ
FXY (x, y) = P (X ≤ x, Y ≤ y)
FXY Z (x, y, z) = P (X ≤ x, Y ≤ y, Z ≤ z)
= P (height ≤ x and weight ≤ y and age ≤ z)
P (a < X ≤ b, c < Y ≤ d) =?
Express your answer in terms of joint CDFs.
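For reference, the standard rectangle identity (the answer to the prompt above) is
P(a < X ≤ b, c < Y ≤ d) = FXY(b, d) − FXY(a, d) − FXY(b, c) + FXY(a, c)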
• Joint CDF
• Joint pdf
fXY(x, y) = ∂²FXY(x, y)/∂x∂y
• Marginal CDFs
FX(x) = FXY(x, ∞)
FY(y) = FXY(∞, y)
PXY(xi, yj) = P(X = xi, Y = yj)
for i = 1, 2, . . . ; j = 1, 2, . . .
• Probability of event A is the sum of the PMF over the outcomes in A:
P(A) = ∑_{(xi, yj) ∈ A} PXY(xi, yj)
PX(xi) = P(X = xi)
       = P(X = xi, Y ≤ ∞)
       = ∑_{j=1}^{∞} PXY(xi, yj)
PXY(x, y) = P(X = x, Y = y)
          = P(N = xM + y)
          = (1 − p)p^{xM+y}
P(X = x) = P(N = xM) + P(N = xM + 1) + · · · + P(N = xM + (M − 1))
         = ∑_{j=0}^{M−1} (1 − p)p^{xM+j}
         = (1 − p)p^{xM} · (1 − p^M)/(1 − p)
         = (1 − p^M)(p^M)^x
P(Y = y) = P(N = y) + P(N = M + y) + P(N = 2M + y) + · · ·
         = ∑_{i=0}^{∞} (1 − p)p^{iM+y}
         = (1 − p)p^y/(1 − p^M)
(both marginals are checked numerically in the sketch below)
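A short numeric check of these marginals; p = 0.7 and M = 4 are assumed example values, with P(N = n) = (1 − p)p^n as above, X = N div M and Y = N mod M.

# Check the marginal PMFs of X = N // M and Y = N % M numerically.
p, M = 0.7, 4                              # assumed example values

def P_N(n):
    return (1 - p) * p**n                  # P(N = n), n = 0, 1, 2, ...

# marginal of X: finite sum over y = 0..M-1 vs. closed form (1 - p^M)(p^M)^x
x = 2
print(sum(P_N(x * M + y) for y in range(M)), (1 - p**M) * (p**M)**x)

# marginal of Y: truncated infinite sum vs. closed form (1 - p)p^y / (1 - p^M)
y = 1
print(sum(P_N(i * M + y) for i in range(500)), (1 - p) * p**y / (1 - p**M))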
• For continuous joint pdf fXY(x, y), conditional pdf of Y given X is
fY(y|x) = fXY(x, y)/fX(x)
• For discrete joint PMF PXY(x, y), conditional PMF of Y given X is
PY(y|x) = PXY(x, y)/PX(x)
• In the discrete case, X and Y are independent if
PY(y|x) = PY(y)
or equivalently,
PXY(x, y) = PX(x)PY(y)
fY(y) = −ln y (how?)
• Suppose
Y = g(X) = (X)^+, i.e.
Y = 0 if X < 0
Y = X if X ≥ 0
• Suppose
Y = g(X) = aX + b
for constants a, b
• This is a linear function; note that (for a > 0)
FY(y) = P(Y ≤ y) = P(aX + b ≤ y) = P(X ≤ (y − b)/a)
• Therefore
FY(y) = FX((y − b)/a)
• Suppose
Y = I{X > a}
for some a
• Note Y is discrete with only 2 possible values, 0 or 1, so Y has
PMF:
P (Y = 0) = P (X ≤ a) = FX (a)
P (Y = 1) = P (X > a) = 1 − FX (a)
• Suppose
Y = X²
for continuous random variable X
• Note
P(Y ≤ y) = P(X² ≤ y) = P(−√y ≤ X ≤ √y)
• Therefore
FY(y) = 0 if y < 0
FY(y) = FX(√y) − FX(−√y) if y ≥ 0
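This CDF is easy to verify by simulation. In the sketch below, X is taken standard normal as an assumed example (not specified in the slides), so FX is the normal CDF Φ.

# Verify F_Y(y) = F_X(sqrt(y)) - F_X(-sqrt(y)) for Y = X^2, X ~ N(0,1).
import math
import random

def Phi(x):                                  # standard normal CDF
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

random.seed(0)
samples = [random.gauss(0, 1)**2 for _ in range(200_000)]

y = 1.5
empirical = sum(s <= y for s in samples) / len(samples)
formula = Phi(math.sqrt(y)) - Phi(-math.sqrt(y))
print(empirical, formula)                    # both ≈ 0.78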
• So we need to integrate the joint pdf fXY(x, y) over this region:
FZ(z) = ∫∫_{x+y≤z} fXY(x, y) dx dy
      = ∫_{−∞}^{∞} ( ∫_{−∞}^{z−y} fXY(x, y) dx ) dy
or equivalently,
fZ(z) = ∫_{−∞}^{∞} fXY(z − y, y) dy
• In the common case that X and Y are independent, fXY(x, y) = fX(x)fY(y), so
fZ(z) = ∫_{−∞}^{∞} fX(z − y)fY(y) dy
• Now the convolution is (for λ ≠ γ)
fZ(z) = ∫_0^z λe^{−λx} · γe^{−γ(z−x)} dx
      = λγe^{−γz} ∫_0^z e^{−(λ−γ)x} dx
      = (λγ/(λ − γ)) e^{−γz} (1 − e^{−(λ−γ)z})
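A simulation check of this closed form; λ = 2 and γ = 1 are assumed values (with λ ≠ γ), chosen only for illustration.

# Z = X + Y with X ~ Exp(lam), Y ~ Exp(gam); compare a histogram
# density estimate at z0 against the convolution result above.
import math
import random

lam, gam = 2.0, 1.0                          # assumed rates, lam != gam
random.seed(0)
n = 500_000
zs = [random.expovariate(lam) + random.expovariate(gam) for _ in range(n)]

def f_Z(z):
    return lam * gam / (lam - gam) * math.exp(-gam * z) * (1 - math.exp(-(lam - gam) * z))

z0, h = 1.0, 0.05                            # small bin around z0
density_est = sum(z0 - h / 2 < z <= z0 + h / 2 for z in zs) / (n * h)
print(density_est, f_Z(z0))                  # both ≈ 0.465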
• Bernoulli
E(X) = 1 · p + 0 · (1 − p) = p
• Binomial
E(X) = ∑_{x=0}^{n} x (n choose x) p^x (1 − p)^{n−x} = np
• Geometric
E(X) = ∑_{x=0}^{∞} x (1 − p)^x p = (1 − p)/p
• Poisson
E(X) = ∑_{x=0}^{∞} x e^{−a} a^x/x! = a
(these sums are checked numerically below)
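The closed forms above are easy to confirm by summing each series directly; n = 10, p = 0.3, and a = 2.5 are assumed parameter values.

# Verify the closed-form means by summing each series directly.
from math import comb, exp, factorial

n, p, a = 10, 0.3, 2.5                       # assumed parameters

binom = sum(x * comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))
geom  = sum(x * (1 - p)**x * p for x in range(2000))      # truncated series
poiss = sum(x * exp(-a) * a**x / factorial(x) for x in range(200))

print(binom, n * p)                          # 3.0  3.0
print(geom, (1 - p) / p)                     # ≈ 2.3333
print(poiss, a)                              # 2.5  2.5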
• Moments
• Variance
• Markov and Chebyshev Inequalities
• Sample Mean
• Variance
var(X) = ∫_a^b (x − (a + b)/2)² · 1/(b − a) dx = (b − a)²/12
which is proportional to the square of the width of the interval (a, b)
• Variance is var(X) = E[(X − µ)²] = E(X²) − µ²
• Bernoulli
E(X) = p, var(X) = p(1 − p)
• Binomial
E(X) = np, var(X) = np(1 − p)
• Poisson
E(X) = a, var(X) = a
• Exponential
E(X) = 1/λ, var(X) = 1/λ²
• Rayleigh
E(X) = σ√(π/2), var(X) = (2 − π/2)σ²
• Gaussian
E(X) = µ, var(X) = σ²
• Gamma
E(X) = α/λ, var(X) = α/λ²
(the exponential entry is sanity-checked in the sketch below)
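A quick sample-based check of one row of this table; the exponential entry is used, with λ = 2 assumed.

# Sample-based check that Exp(lam) has mean 1/lam and variance 1/lam^2.
import random
import statistics

random.seed(0)
lam = 2.0
xs = [random.expovariate(lam) for _ in range(200_000)]
print(statistics.mean(xs), 1 / lam)          # ≈ 0.5
print(statistics.variance(xs), 1 / lam**2)   # ≈ 0.25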
• Conditional variance is var(X|y) = E[(X − E(X|y))² | Y = y]
E(X) = ∫_0^∞ x fX(x) dx        (for X ≥ 0)
     ≥ ∫_a^∞ x fX(x) dx
     ≥ ∫_a^∞ a fX(x) dx
     = a P(X ≥ a)
⇒ P(X ≥ a) ≤ E(X)/a        (Markov inequality)
• This uses only the mean to give a bound on the tail probability
• Usually the bound may be quite loose
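A small illustration of that looseness; X ~ Exp(1) is an assumed example, for which E(X) = 1 and the exact tail is P(X ≥ a) = e^{−a}.

# Compare the exact tail P(X >= a) with the Markov bound E(X)/a.
import math

mean = 1.0                                   # E(X) for Exp(1)
for a in (1, 2, 5, 10):
    exact = math.exp(-a)                     # true tail probability
    bound = mean / a                         # Markov bound
    print(a, exact, bound)                   # bound sits far above the tail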
lim_{N→∞} P(|M − µ| ≥ δ) = 0
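This convergence of the sample mean M is easy to see by simulation; the sketch below assumes Uniform(0, 1) samples, so µ = 0.5, with δ = 0.05.

# Estimate P(|M - mu| >= delta) for the sample mean M of N uniforms.
import random

random.seed(0)
mu, delta, trials = 0.5, 0.05, 500

for N in (10, 100, 1000, 10000):
    hits = sum(abs(sum(random.random() for _ in range(N)) / N - mu) >= delta
               for _ in range(trials))
    print(N, hits / trials)                  # tends to 0 as N grows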