
LECTURE 10. Asymptotics

10.1. Convergence in Probability


Reminder from basic calculus: a sequence $x_n$ has limit $x$, written $\lim_{n \to +\infty} x_n = x$, if for any $\varepsilon > 0$ there is $N(\varepsilon)$ such that $|x_n - x| < \varepsilon$ for all $n > N(\varepsilon)$. Extending the notion of limits and convergence to random variables is not straightforward. There are multiple concepts of convergence (almost sure convergence, convergence in probability, mean square convergence, convergence in distribution). We will focus on two of them.

Definition 10.1. Let $\{X_n\}$ be a sequence of random variables and let $X$ be a random variable. We say that $X_n$ converges in probability to $X$ if for all $\varepsilon > 0$
\[
\lim_{n \to +\infty} P\bigl[\,|X_n - X| \ge \varepsilon\,\bigr] = 0.
\]

We write $X_n \xrightarrow{p} X$ or $\operatorname{plim} X_n = X$.

Properties
(a) If $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then $X_n + Y_n \xrightarrow{p} X + Y$.
(b) If $X_n \xrightarrow{p} X$ and $a$ is a constant, then $aX_n \xrightarrow{p} aX$.
(c) Suppose $X_n \xrightarrow{p} a$ and the real function $g$ is continuous at $a$. Then $g(X_n) \xrightarrow{p} g(a)$.
(d) If $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then $X_n Y_n \xrightarrow{p} XY$.

Theorem 10.2 (Reminder: Chebyshev's inequality). Let $X$ have mean $\mu$ and variance $\sigma^2$. Then for any $\varepsilon > 0$,
\[
P\bigl(|X - \mu| \ge \varepsilon\bigr) \le \frac{\sigma^2}{\varepsilon^2}.
\]
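As a reminder, Theorem 10.2 follows in one line from Markov's inequality, $P(Z \ge a) \le E(Z)/a$ for a nonnegative random variable $Z$ and $a > 0$, applied to $Z = (X - \mu)^2$ with $a = \varepsilon^2$:
\[
P\bigl(|X - \mu| \ge \varepsilon\bigr)
  = P\bigl((X - \mu)^2 \ge \varepsilon^2\bigr)
  \le \frac{E\bigl[(X - \mu)^2\bigr]}{\varepsilon^2}
  = \frac{\sigma^2}{\varepsilon^2}.
\]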

Theorem 10.3 (Weak law of large numbers). Let $\{X_n\}$ be a sequence of iid random variables having common mean $\mu$ and variance $\sigma^2 < \infty$. Let $\bar{X}_n = n^{-1} \sum_{i=1}^{n} X_i$. Then $\bar{X}_n \xrightarrow{p} \mu$.

Proof. To be done on the board.

NB There are various versions of the law of large numbers.

Example 1. Let $X_1, \ldots, X_n$ denote a random sample from a distribution with mean $\mu$ and variance $\sigma^2$. Show that the sample variance $S_n^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X}_n)^2$ is a consistent estimator of the variance $\sigma^2$.
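A minimal sketch of the board proof of Theorem 10.3, via the standard Chebyshev argument: since the $X_i$ are iid,
\[
E[\bar{X}_n] = \mu, \qquad \operatorname{var}(\bar{X}_n) = \frac{\sigma^2}{n},
\]
so by Theorem 10.2, for any $\varepsilon > 0$,
\[
P\bigl(|\bar{X}_n - \mu| \ge \varepsilon\bigr) \le \frac{\operatorname{var}(\bar{X}_n)}{\varepsilon^2} = \frac{\sigma^2}{n\varepsilon^2} \longrightarrow 0 \quad \text{as } n \to +\infty.
\]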

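For Example 1, a sketch (applying the WLLN to $\{X_i^2\}$, which under the version stated above implicitly assumes $\operatorname{var}(X_i^2) < \infty$; versions of the WLLN requiring only a finite mean remove this caveat): rewrite
\[
S_n^2 = \frac{n}{n-1}\left[\frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}_n^2\right].
\]
By the WLLN, $\frac{1}{n}\sum_{i=1}^{n} X_i^2 \xrightarrow{p} E[X_1^2]$ and $\bar{X}_n \xrightarrow{p} \mu$, so by properties (c) and (d), $\bar{X}_n^2 \xrightarrow{p} \mu^2$. Since $n/(n-1) \to 1$,
\[
S_n^2 \xrightarrow{p} E[X_1^2] - \mu^2 = \sigma^2.
\]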
10.2. Convergence in Distribution


Definition 10.4. Let $\{X_n\}$ be a sequence of random variables and let $X$ be a random variable. Let $F_{X_n}$ and $F_X$ be, respectively, the cdfs of $X_n$ and $X$. We say that $X_n$ converges in distribution to $X$ if
\[
\lim_{n \to +\infty} F_{X_n}(x) = F_X(x)
\]
at every point $x$ at which $F_X$ is continuous, and we write $X_n \xrightarrow{d} X$. We sometimes say that $X$ is the limiting distribution of $X_n$.

Properties
(a) If $X_n$ converges to $X$ in probability, then $X_n$ converges to $X$ in distribution.
(b) (Continuous mapping theorem) Suppose $X_n$ converges to $X$ in distribution and $g$ is a continuous function on the support of $X$. Then $g(X_n)$ converges to $g(X)$ in distribution.
(c) (Slutsky's theorem) Let $X_n$, $X$, $A_n$, $B_n$ be random variables and let $a$ and $b$ be constants. If $X_n \xrightarrow{d} X$, $A_n \xrightarrow{p} a$, $B_n \xrightarrow{p} b$, then $A_n + B_n X_n \xrightarrow{d} a + bX$.

Theorem 10.5 (Lindeberg-Levy Central Limit Theorem). Let $X_1, X_2, \ldots, X_n$ denote a sequence of independently and identically distributed (iid) random variables with $E(X_i) = \mu$ and $\operatorname{var}(X_i) = \sigma^2$. Let $\bar{X}_n = n^{-1} \sum_{i=1}^{n} X_i$. Then
\[
n^{1/2}\bigl(\bar{X}_n - \mu\bigr) \xrightarrow{d} N(0, \sigma^2).
\]

NB1 There are various versions of the Central Limit Theorem.
NB2 Note that the theorem does NOT make any distributional assumption on the $X_i$ (beyond the existence of the mean and variance).

Example 2. Suppose that the $X_i$ are iid Bernoulli random variables with probability of success $p = 0.25$. Use a normal approximation to find $P(\bar{X} \le 0.2)$ if $n = 100$.

Theorem 10.6 (Delta method). Let $\{X_n\}$ be a sequence of random variables such that $n^{1/2}(X_n - \theta) \xrightarrow{d} N(0, \sigma^2)$. Suppose the function $g(x)$ is differentiable at $\theta$ and $g'(\theta) \ne 0$. Then
\[
n^{1/2}\bigl(g(X_n) - g(\theta)\bigr) \xrightarrow{d} N\bigl(0, \sigma^2 (g'(\theta))^2\bigr).
\]
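A worked sketch for Example 2 (reading the event as $\bar{X} \le 0.2$): here $E(\bar{X}) = p = 0.25$ and $\operatorname{var}(\bar{X}) = p(1-p)/n = 0.25 \times 0.75 / 100 = 0.001875$, so by the CLT
\[
P(\bar{X} \le 0.2) \approx \Phi\!\left(\frac{0.2 - 0.25}{\sqrt{0.001875}}\right) = \Phi(-1.155) \approx 0.124.
\]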


10.3. Multivariate version


Theorem 10.7. Let $\{X_n\}$ be a sequence of $p$-dimensional random vectors and let $X$ be a random vector. Then $X_n \xrightarrow{p} X$ if and only if $X_{nj} \xrightarrow{p} X_j$ for all $j = 1, \ldots, p$.

Theorem 10.8. Let $\{X_n\}$ be a sequence of iid random vectors with common mean vector $\mu$ and variance-covariance matrix $\Sigma$ which is positive definite. Assume the common moment generating function exists in an open neighborhood of $0$. Then
\[
Y_n = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} (X_i - \mu) \xrightarrow{d} N_p(0, \Sigma).
\]
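Written out for the bivariate case $p = 2$, as an illustrative special case (note that $Y_n = \sqrt{n}(\bar{X}_n - \mu)$, where $\bar{X}_n$ is the vector of componentwise sample means), Theorem 10.8 reads:
\[
\sqrt{n}
\begin{pmatrix} \bar{X}_{n1} - \mu_1 \\ \bar{X}_{n2} - \mu_2 \end{pmatrix}
\xrightarrow{d}
N_2\!\left(
\begin{pmatrix} 0 \\ 0 \end{pmatrix},
\begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{12} & \sigma_2^2 \end{pmatrix}
\right),
\qquad \sigma_{12} = \operatorname{cov}(X_{i1}, X_{i2}).
\]
Each component separately satisfies the univariate CLT (Theorem 10.5); the multivariate version additionally pins down the joint limiting covariance structure.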

10.4. Examples
Example 3. Suppose $X_i$ is distributed NIID$(2,1)$ for $i = 1, \ldots, n$. Let $Y = \bar{X}^2$. Suppose $n = 100$. Find $P(Y \le 3.7)$.

Example 4. Let $Y$ denote the sum of the observations of a random sample of size 12 from a distribution having pmf $p(x) = \frac{1}{6}$, $x = 1, 2, 3, 4, 5, 6$, zero elsewhere. Using a normal approximation, compute an approximate value of $P(36 \le Y \le 48)$.

Example 5. Suppose $X_t = \varepsilon_t + \varepsilon_{t-1}$, where the $\varepsilon_t$ are iid with mean $0$ and variance $1$. (a) Show that $\bar{X} \xrightarrow{p} 0$ and (b) find $V$ in $n^{1/2} \bar{X} \xrightarrow{d} N(0, V)$.
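Worked sketches for Examples 3 and 4 (reading the events as $Y \le 3.7$ and $36 \le Y \le 48$).

Example 3: apply Theorem 10.6 with $g(x) = x^2$, $\theta = 2$, $\sigma^2 = 1$. Since $g'(2) = 4 \ne 0$, $n^{1/2}(\bar{X}^2 - 4) \xrightarrow{d} N(0, 16)$, so with $n = 100$, $Y = \bar{X}^2$ is approximately $N(4, 0.16)$:
\[
P(Y \le 3.7) \approx \Phi\!\left(\frac{3.7 - 4}{0.4}\right) = \Phi(-0.75) \approx 0.227.
\]

Example 4: each observation has mean $7/2$ and variance $35/12$, so $E(Y) = 42$ and $\operatorname{var}(Y) = 35$. Using the normal approximation with a continuity correction,
\[
P(36 \le Y \le 48) \approx \Phi\!\left(\frac{48.5 - 42}{\sqrt{35}}\right) - \Phi\!\left(\frac{35.5 - 42}{\sqrt{35}}\right)
\approx \Phi(1.10) - \Phi(-1.10) \approx 0.73.
\]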

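A sketch for Example 5, under the reading $X_t = \varepsilon_t + \varepsilon_{t-1}$ with $\{\varepsilon_t\}$ iid$(0,1)$ as stated above.

(a) Summing telescopically, $\sum_{t=1}^{n} X_t = \varepsilon_0 + 2\sum_{t=1}^{n-1} \varepsilon_t + \varepsilon_n$, so $E(\bar{X}) = 0$ and
\[
\operatorname{var}(\bar{X}) = \frac{1 + 4(n-1) + 1}{n^2} = \frac{4n - 2}{n^2} \longrightarrow 0,
\]
hence $\bar{X} \xrightarrow{p} 0$ by Chebyshev's inequality.

(b) Write
\[
\sqrt{n}\,\bar{X} = \frac{2}{\sqrt{n}} \sum_{t=1}^{n-1} \varepsilon_t + \frac{\varepsilon_0 + \varepsilon_n}{\sqrt{n}}.
\]
The first term converges in distribution to $N(0, 4)$ by the CLT and the second converges in probability to $0$, so Slutsky's theorem gives $V = 4$ (consistent with $n \operatorname{var}(\bar{X}) = (4n-2)/n \to 4$).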