
Stat 455 Cheat Sheet

Chapter 3: Conditionals

Discrete: pmf P(X = x). Continuous: pdf f_X(x), with f_X(x) dx ≈ P(x < X ≤ x + dx).
E[X] = ∫ x f_X(x) dx = Σ_{x∈S} x P(X = x)
P(A|B) = P(AB) / P(B)
Bayes: P(A) = P(A|B) P(B) + P(A|B^c) P(B^c)
Conditional pmf: p_{X|Y}(x|y) = P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
E[X | Y = y] = Σ_x x P(X = x | Y = y) = Σ_x x p_{X|Y}(x|y)
E[X] = Σ_y E[X | Y = y] P(Y = y) = ∫ E[X | Y = y] f_Y(y) dy

Chapter 5: Waiting Time

Exponential Distribution
f(x) = λ e^{-λx}, x > 0
F(x) = ∫_0^x f(u) du = 1 − e^{-λx}
E[X] = 1/λ, E[X^2] = 2/λ^2
Memoryless RV: P(X > s + t | X > t) = P(X > s) = e^{-λs}

Gamma Distribution
For X_1, …, X_n drawn iid exp(λ), the sum X_1 + … + X_n is gamma(n, λ):
S_n = Σ_{i=1}^n T_i ~ gamma(n, λ), with density f(t) = λ e^{-λt} (λt)^{n−1} / (n−1)!

Comparing Exponentials
P(X_1 < X_2) = ∫_0^∞ P(X_1 < X_2 | X_1 = x) λ_1 e^{-λ_1 x} dx
             = ∫_0^∞ P(x < X_2) λ_1 e^{-λ_1 x} dx = λ_1 / (λ_1 + λ_2)

Chapter 6: Continuous-Time Markov Chains

{X(t), t ≥ 0} is a CTMC if, for all s, t ≥ 0, nonnegative integers i, j, and any history x(u), 0 ≤ u < s,
P(X(t + s) = j | X(s) = i, X(u) = x(u), 0 ≤ u < s) = P(X(t + s) = j | X(s) = i)

Stationary Transition Probabilities
A CTMC has these if P(X(t + s) = j | X(s) = i) does not depend on s.

CTMC Alternate Definition
A stochastic process having these properties each time it enters state i:
(i) the amount of time it spends in that state before making a transition into a different state is exp(v_i), with mean 1/v_i;
(ii) when the process leaves state i, it enters state j with some probability P_ij, where P_ii = 0 for all i and Σ_j P_ij = 1 for all i.
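The memoryless property and the exponential-comparison formula are easy to sanity-check by simulation. A minimal sketch; the rates lam1, lam2 and the sample size are arbitrary choices, not from the course:

```python
import math
import random

random.seed(455)

lam1, lam2 = 2.0, 3.0   # arbitrary rates for the check, not from the course
n = 200_000

# Memorylessness: P(X > s + t | X > t) should equal P(X > s) = e^{-lam1 * s}
s, t = 0.5, 1.0
xs = [random.expovariate(lam1) for _ in range(n)]
survivors = [x for x in xs if x > t]
p_cond = sum(1 for x in survivors if x > s + t) / len(survivors)
p_theory = math.exp(-lam1 * s)

# Comparing exponentials: P(X1 < X2) = lam1 / (lam1 + lam2)
wins = sum(1 for _ in range(n)
           if random.expovariate(lam1) < random.expovariate(lam2))
p_compare = wins / n
p_compare_theory = lam1 / (lam1 + lam2)
```

With these rates, p_theory = e^{-1} ≈ 0.368 and p_compare_theory = 0.4; both estimates should land within simulation noise of those values.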
Chapter 4: Discrete-Time Markov Chains

Markov Property
P(X_{n+1} = j | X_n = i_n, X_{n−1} = i_{n−1}, …, X_0 = i_0) = P(X_{n+1} = j | X_n = i_n)

Markov Property V2
P(X_{n+m} = j | X_n = i, …, X_0 = i_0) = P(X_{n+m} = j | X_n = i)

Transition Probabilities
P_ij = P(X_{n+1} = j | X_n = i) = P(X_1 = j | X_0 = i)
P_ij^k = P(X_{n+k} = j | X_n = i) = P(X_k = j | X_0 = i)
State j is accessible from state i if P_ij^n > 0 for some n ≥ 0.
Two states communicate (i ↔ j) if they access each other.
If two states communicate they are in the same class; any two classes are identical or disjoint. A MC is irreducible if it has only one class.

Recurrence and Transience
Let N(i) be the number of times we visit state i, and f_i = P(re-enter state i | in state i).
A state is recurrent if it is visited infinitely often:
P(N(i) ≥ k | X_0 = i) = (f_i)^k, f_i = 1, lim_{k→∞} P(N(i) ≥ k | X_0 = i) = 1, P(N(i) = ∞ | X_0 = i) = 1.
If a state is not recurrent it is transient: f_i < 1, lim_{k→∞} P(N(i) ≥ k | X_0 = i) = 0, P(N(i) = ∞ | X_0 = i) = 0.
State i is recurrent iff Σ_{n=1}^∞ P_ii^n = ∞; state i is transient iff Σ_{n=1}^∞ P_ii^n < ∞.
Recurrence is a class property; a class property is one such that if state i has the property and i ↔ j, then j has the property too.
A state is positive recurrent if, for C_j = min{n ≥ 1 : X_n = j}, E[C_j | X_0 = j] < ∞; it is null recurrent if E[C_j | X_0 = j] = ∞.

Periodicity
The period of a state is d = gcd{n ≥ 1 : P_ii^n > 0}.
P_ii^n > 0 implies d | n; if d ∤ n, then P_ii^n = 0.

o(h) Functions
A function f(·) is o(h) if lim_{h→0} f(h)/h = 0. Note that h itself is not o(h).

Transition Probability Function
P_ij(t) = P(X(t) = j | X(0) = i): the probability of moving from state i to state j in a time t; a continuous function of t.

Birth and Death Processes
A system with n people, where {λ_n}_{n=0}^∞ are the arrival/birth rates and {μ_n}_{n=1}^∞ are the departure/death rates:
v_0 = λ_0, v_i = λ_i + μ_i
P_01 = 1, P_{i,i+1} = λ_i / (λ_i + μ_i), P_{i,i−1} = μ_i / (λ_i + μ_i)
q_{i,i+1} = λ_i, q_{i,i−1} = μ_i, q_ii = 0
Knowing the q's can give us P, but knowing P can't give us the q's.

Counting Process
{N(t), t ≥ 0} is a counting process if:
(i) N(t) ≥ 0; (ii) N(t) is integer valued; (iii) if s < t then N(s) ≤ N(t); (iv) for s < t, N(t) − N(s) equals the number of events that occur in the interval (s, t].

Independent Increments
The numbers of events that occur in disjoint time intervals are independent.

Poisson Process
A counting process {N(t), t ≥ 0} is a Poisson process with rate λ if:
(i) N(0) = 0; (ii) N(t) has stationary and independent increments; (iii) the number of events in any time interval of length t is Poisson with mean λt:
P(N(t + s) − N(s) = n) = e^{-λt} (λt)^n / n!, n = 0, 1, …
P(N(t) = k) = e^{-λt} (λt)^k / k!
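The Poisson-process definitions fit together: summing iid exp(λ) interarrival times and counting how many land in [0, t] should produce a Poisson(λt) count with mean and variance λt. A simulation sketch; rate, horizon, and replication count are arbitrary:

```python
import math
import random

random.seed(455)

lam, t_end, reps = 2.0, 5.0, 50_000   # arbitrary rate, horizon, replications

def poisson_count(lam, t_end):
    """Count arrivals in [0, t_end], built from exp(lam) interarrival times."""
    t, count = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > t_end:
            return count
        count += 1

counts = [poisson_count(lam, t_end) for _ in range(reps)]
mean_count = sum(counts) / reps                                  # ~ lam * t_end
var_count = sum((c - mean_count) ** 2 for c in counts) / reps    # ~ lam * t_end

# Check one point of the pmf: P(N(t) = k) = e^{-lam t} (lam t)^k / k!
k = 10
p_emp = counts.count(k) / reps
p_theory = math.exp(-lam * t_end) * (lam * t_end) ** k / math.factorial(k)
```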
Poisson Process 2
A counting process {N(t), t ≥ 0} is a Poisson process with rate λ if:
(i) N(0) = 0; (ii) the process has stationary and independent increments;
(iii) P(N(h) = 1) = λh + o(h); (iv) P(N(h) ≥ 2) = o(h).
Then P(N(h) = 0) = 1 − λh + o(h).

Poisson Process 3
(i) N(0) = 0; (ii) N(t) counts the number of events that have occurred up to time t; (iii) the times between events are iid exp(λ).

Instantaneous Transition Rates
q_ij = v_i P_ij is the rate, when in state i, at which the process makes a transition into state j.
v_i = Σ_j v_i P_ij = Σ_j q_ij
P_ij = q_ij / v_i = q_ij / Σ_j q_ij
T_i, the holding time in state i, is exp(v_i): P(T_i > h) = e^{-v_i h}.

Limiting Probabilities (DTMC)
A state is aperiodic if its period is 1. A MC is ergodic if it is positive recurrent and aperiodic.
For an irreducible and ergodic MC, lim_{n→∞} P_ij^n = π_j exists and is independent of i.
π_j is the solution of π_j = Σ_{i=0}^∞ π_i P_ij, with Σ_{j=0}^∞ π_j = 1.
The π_j are the limiting probabilities. Put them in a vector π = (π_1, …, π_j, …) and that is your stationary distribution.
A MC is reversible wrt {π_i}_{i∈S} if π_i P_ij = π_j P_ji; this is the local balance equation.
For X_n a stationary Markov chain, the reversed process Y_n = X_{m−n} is also a Markov chain.
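The system π = πP, Σ_j π_j = 1 can be solved numerically by power iteration (repeatedly multiplying a starting distribution by P). A sketch on a made-up 3-state chain; the matrix is an arbitrary example, not from the text:

```python
# Power iteration for the stationary distribution of a small DTMC.
# The transition matrix P below is an arbitrary irreducible, aperiodic example.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

pi = [1 / 3, 1 / 3, 1 / 3]            # any starting distribution works
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# pi should now satisfy pi = pi P (componentwise) and sum to 1
residual = max(abs(pi[j] - sum(pi[i] * P[i][j] for i in range(3)))
               for j in range(3))
total = sum(pi)
```

Power iteration converges geometrically at the rate of the second-largest eigenvalue modulus, which is why 200 iterations are plenty for a chain this small.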
Chapman-Kolmogorov Equations
Discrete: P_ij^{n+m} = Σ_{k=0}^∞ P_ik^n P_kj^m, i.e. P(n + m) = P(n) P(m).
One-step case: P_ij^{n+1} = P(X_{n+1} = j | X_0 = i) = Σ_s P(X_{n+1} = j | X_1 = s, X_0 = i) P(X_1 = s | X_0 = i).
Continuous: P_ij(t + s) = Σ_{k=0}^∞ P_ik(t) P_kj(s), for all s, t ≥ 0.

Mean Return Time
Define the mean return time m_jj = E[min{n ≥ 1 : X_n = j} | X_0 = j]. Then π_j m_jj = 1, i.e. π_j = 1 / E[C_j | X_0 = j].

Interarrival Times
T_n is the time between the (n − 1)st and nth events; {T_n} is the sequence of interarrival times, with the T_n iid exp(λ):
P(T_1 > t) = P(N(t) = 0) = e^{-λt}, and P(T_n > t) = e^{-λt} for every n.

Kolmogorov's Backward Equations
P′_ij(t) = Σ_{k≠i} q_ik P_kj(t) − v_i P_ij(t)

Kolmogorov's Forward Equations
P′_ij(t) = Σ_{k≠j} q_kj P_ik(t) − v_j P_ij(t)
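For a two-state chain (0 → 1 at rate λ, 1 → 0 at rate μ) the backward equations P′(t) = G P(t) can be integrated with simple Euler steps and compared against the known closed form P_00(t) = μ/(λ+μ) + λ/(λ+μ) e^{−(λ+μ)t}. A sketch with arbitrary rates:

```python
import math

# Two-state CTMC: 0 -> 1 at rate lam, 1 -> 0 at rate mu (arbitrary rates).
lam, mu = 1.0, 2.0
G = [[-lam, lam], [mu, -mu]]   # generator: q_ij off-diagonal, -v_i on diagonal

t_end, steps = 1.5, 20_000
h = t_end / steps

# Euler integration of the backward equations P'(t) = G P(t), P(0) = I
P = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(steps):
    dP = [[sum(G[i][k] * P[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    P = [[P[i][j] + h * dP[i][j] for j in range(2)] for i in range(2)]

# Closed form: P_00(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu) t)
closed_form = mu / (lam + mu) + lam / (lam + mu) * math.exp(-(lam + mu) * t_end)
err = abs(P[0][0] - closed_form)
```

Because each row of G sums to 0, every Euler step preserves the row sums of P(t) at 1, which is a useful built-in sanity check.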
Limiting Probabilities (CTMC)
P_j = lim_{t→∞} P_ij(t) is the long-run proportion of time that the process is in state j.
v_j = rate of leaving state j; v_j P_j = long-run rate of leaving j; P_i q_ij = long-run rate of going from state i to j.
The limiting probabilities will exist if: (i) all states communicate, and (ii) the chain is positive recurrent.
Use that to get the balance equations (rate leaving = rate entering):
v_j P_j = Σ_{k≠j} q_kj P_k for all states j, with Σ_j P_j = 1.
In the discrete case π may exist but the limiting probabilities may not (if the discrete chain is not aperiodic). In the continuous case there is no similar problem: if P exists it is unique and the above limit holds.

M/M/1: Performance Measures
L = Σ_{n=0}^∞ n P_n = λ / (μ − λ)
W = L / λ = 1 / (μ − λ)
W_Q = W − E[S] = W − 1/μ = λ / (μ(μ − λ))
L_Q = λ W_Q = λ^2 / (μ(μ − λ))

M/M/1 - Finite Capacity
Now we have the limitation that n ≤ N.
Balance equations:
λ P_0 = μ P_1
(λ + μ) P_n = λ P_{n−1} + μ P_{n+1}, 1 ≤ n ≤ N − 1
μ P_N = λ P_{N−1} (state N)
so P_1 = (λ/μ) P_0.
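The finite-capacity balance equations can be checked numerically: the recursion P_n = (λ/μ) P_{n−1} plus normalization produces a vector that satisfies every balance equation. A sketch; λ, μ, N are arbitrary:

```python
lam, mu, N = 2.0, 3.0, 5   # arbitrary arrival rate, service rate, capacity

# Recursion from the balance equations: P_n = (lam/mu) * P_{n-1}, then normalize
unnorm = [1.0]
for n in range(1, N + 1):
    unnorm.append((lam / mu) * unnorm[-1])
Z = sum(unnorm)
P = [u / Z for u in unnorm]
total = sum(P)

# Verify the balance equations directly
ok_edge = (abs(lam * P[0] - mu * P[1]) < 1e-12
           and abs(mu * P[N] - lam * P[N - 1]) < 1e-12)
ok_interior = all(abs((lam + mu) * P[n] - (lam * P[n - 1] + mu * P[n + 1])) < 1e-12
                  for n in range(1, N))
```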
M/M/1 - Finite Capacity (continued)
P_{n+1} = (λ/μ) P_n + (P_n − (λ/μ) P_{n−1}) for 1 ≤ n ≤ N − 1, which gives P_n = (λ/μ)^n P_0, and in particular P_N = (λ/μ)^N P_0.
1 = Σ_{n=0}^N P_n = P_0 Σ_{n=0}^N (λ/μ)^n = P_0 (1 − (λ/μ)^{N+1}) / (1 − λ/μ), so
P_0 = (1 − λ/μ) / (1 − (λ/μ)^{N+1}), P_n = (λ/μ)^n (1 − λ/μ) / (1 − (λ/μ)^{N+1}), n = 0, …, N
L = Σ_{n=0}^N n P_n = λ [1 + N(λ/μ)^{N+1} − (N + 1)(λ/μ)^N] / ((μ − λ)(1 − (λ/μ)^{N+1}))
To find W we consider 2 cases:
λ_a = λ if "customers in system" includes those who never get in;
λ_a = λ(1 − P_N) if it does not.
Either way we get W = L / λ_a.

Embedded Chains
{X(t)}_{t≥0} a CTMC, with
P = (P_j): the stationary distribution of X;
π = (π_j): the stationary distribution of the embedded discrete-time MC.
P_j = (π_j / v_j) / Σ_{i∈S} (π_i / v_i), π_j = P_j v_j / Σ_{i∈S} P_i v_i
π_j: long-run proportion of transitions that the CTMC makes into state j
1/v_j: average time it stays in state j
(π_j / v_j) / Σ_{i∈S} (π_i / v_i): long-run proportion of time the CTMC spends in state j
Note: π may exist and P may not!

Local Balance Equation for CTMC
P_i q_ij = P_j q_ji means rate of flow from i to j = rate of flow from j to i.

Time Reversibility
For a long-running MC, the amount of time the reversed process spends in state i is also exponentially distributed with rate v_i. The discrete-time reversed chain has transition probabilities Q_ij = π_j P_ji / π_i, and the chain is time reversible if
π_i P_ij = π_j P_ji for all i, j (discrete), equivalently P_i q_ij = P_j q_ji for all i, j (continuous).

PASTA
Poisson Arrivals See Time Averages.
Let {X(t)}_{t≥0} be a continuous-time Markov chain with stationary distribution π, and let T_i be the arrival time of the ith element, where elements arrive according to a Poisson process. Then {X(T_n)} has π as a stationary distribution, and a_j = P_j.

Chapter 8

Definitions
L: average # of customers in system
L_Q: average # of customers waiting in queue
W: average amount of time a customer spends in system
W_Q: average amount of time a customer spends in queue
E[S]: average amount of time a customer spends in service
N(t): number of customer arrivals by time t
P_n = lim_{t→∞} P(X(t) = n): the limiting/long-run/steady-state probability that n customers are in the system; also the long-run proportion of time that the system contains n customers
λ_a: average arrival rate of customers, λ_a = lim_{t→∞} N(t) / t
a_n: proportion of customers that find n in the system when arriving
d_n: proportion of customers that find n in the system when leaving

Little's Formula
L = λ_a W, L_Q = λ_a W_Q

Poisson Model
Poisson arrivals see time averages, i.e. P_n = a_n.

Other Useful Stuff
Σ_{n=0}^∞ n x^n = x / (1 − x)^2
X ~ Bernoulli(p): an event with probability of success p.
X ~ geometric(p): X is the number of Bernoulli trials until the first success; p(n) = P(X = n) = (1 − p)^{n−1} p.
X ~ binomial(n, p): X is the number of successes in n trials; p(i) = C(n, i) p^i (1 − p)^{n−i}, where C(n, i) = n! / ((n − i)! i!).
Σ_{i=0}^N C(N, i) = (1 + 1)^N = 2^N
Σ_{i=1}^k m^i = m(1 − m^k) / (1 − m)

Infinitesimal Generator: G
g_ij = q_ij for i ≠ j; g_ii = −v_i (note: these entries are rates, not probabilities).
[P′(t)]_ij = [G P(t)]_ij, i.e.:
P′(t) = G P(t) (Kolmogorov's backward equation)
P′(t) = P(t) G (Kolmogorov's forward equation)
P(t) = e^{tG} = Σ_{n=0}^∞ (tG)^n / n!
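P(t) = e^{tG} can be approximated by truncating the series Σ (tG)^n / n!; the result should have rows summing to 1 and satisfy the Chapman-Kolmogorov property P(2t) = P(t)P(t). A sketch with an arbitrary made-up 3-state generator:

```python
# Truncated series for P(t) = e^{tG}, with an arbitrary 3-state generator.
lam01, lam12, mu10, mu21 = 1.0, 0.5, 2.0, 1.5   # arbitrary rates
G = [[-lam01, lam01, 0.0],
     [mu10, -(mu10 + lam12), lam12],
     [0.0, mu21, -mu21]]                        # g_ii = -v_i, rows sum to 0

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(G, t, terms=60):
    """Approximate e^{tG} = sum_{k>=0} (tG)^k / k! with a truncated series."""
    n = len(G)
    tG = [[t * G[i][j] for j in range(n)] for i in range(n)]
    term = [[float(i == j) for j in range(n)] for i in range(n)]   # (tG)^0 / 0!
    out = [row[:] for row in term]
    for k in range(1, terms):
        term = matmul(term, tG)
        term = [[x / k for x in row] for row in term]
        out = [[out[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return out

t = 0.7
Pt = expm(G, t)
row_err = max(abs(sum(row) - 1.0) for row in Pt)

# Chapman-Kolmogorov check: P(2t) should equal P(t) P(t)
P2t = expm(G, 2 * t)
PtPt = matmul(Pt, Pt)
ck_err = max(abs(P2t[i][j] - PtPt[i][j]) for i in range(3) for j in range(3))
```

The plain series is fine for small t·G like this; production code would use a scaling-and-squaring routine instead.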
Stationary Distribution for a CTMC
π P(t) = π, with Σ_j π_j = 1.
If the initial distribution of X(0) is π, then the distribution of X(t) is also π for all t > 0: P(X(t) = j) = π_j for all j ∈ S, t > 0, i.e. Σ_{i∈S} π_i p_ij(t) = π_j.

Global Balance Equation for CTMC
0 = π G; the jth component of this vector equation is v_j π_j = Σ_{i≠j} q_ij π_i:
long-run rate out of j = long-run rate into state j.

M/M/1
Customers arrive according to a Poisson process with rate λ; the times between successive arrivals are independent rvs with mean 1/λ. If the server is free, the customer goes into service, else into the queue. Service times are exp(μ).
Balance equations:
λ P_0 = μ P_1
(λ + μ) P_n = λ P_{n−1} + μ P_{n+1}
P_1 = (λ/μ) P_0
P_{n+1} = (λ/μ) P_n + (P_n − (λ/μ) P_{n−1}) = (λ/μ)^{n+1} P_0
1 = Σ_{n=0}^∞ P_n = Σ_{n=0}^∞ (λ/μ)^n P_0 = P_0 / (1 − λ/μ), so
P_0 = 1 − λ/μ, P_n = (λ/μ)^n (1 − λ/μ)
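The M/M/1 formulas are mutually consistent: the geometric P_n sum to 1, L = Σ n P_n matches λ/(μ − λ), and Little's formula links L, W and L_Q, W_Q. A quick check with arbitrary λ < μ:

```python
lam, mu = 2.0, 3.0          # arbitrary rates with lam < mu
rho = lam / mu

# Limiting probabilities P_n = rho^n (1 - rho); truncate the tail far out
P = [(rho ** n) * (1 - rho) for n in range(2000)]
total = sum(P)
L_sum = sum(n * p for n, p in enumerate(P))

# Closed forms and Little's formula
L = lam / (mu - lam)
W = 1 / (mu - lam)
W_Q = W - 1 / mu
L_Q = lam * W_Q
```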
