
Useful Formulae

Special Functions
Factorial For positive integer $x$, $x! = x(x-1)\cdots 1$; $0! = 1$.

Gamma Function For $\alpha > 0$ we define $\Gamma(\alpha) = \int_0^\infty x^{\alpha-1} e^{-x}\,dx$.
For all $\alpha > 0$, $\Gamma(\alpha + 1) = \alpha\,\Gamma(\alpha)$. For positive integer $\alpha$, $\Gamma(\alpha) = (\alpha - 1)!$.

Beta Function Define, for $\alpha > 0$ and $\beta > 0$, $B(\alpha, \beta) = \Gamma(\alpha)\Gamma(\beta)/\Gamma(\alpha + \beta)$.



Binomial Coefficients For any real $r$, $\binom{r}{0} = 1$. For positive integer $x$,
\[ \binom{r}{x} = \frac{r(r-1)\cdots(r-x+1)}{x!}. \]
For $r > x - 1$ this can be written as $\Gamma(r+1)\big/\left[\Gamma(r-x+1)\,x!\right]$, which in turn can be written as $r!\big/\left[(r-x)!\,x!\right]$ if $r$ is also an integer.
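
A minimal Python sketch (assuming scipy is available; the particular arguments are arbitrary) can confirm these identities numerically:

```python
# Numerical check of the special-function identities above (illustrative; values are arbitrary).
from math import factorial, isclose

from scipy.special import beta, binom, gamma

a, b = 3.7, 2.2
assert isclose(gamma(a + 1), a * gamma(a))                      # Gamma(a+1) = a*Gamma(a)
assert isclose(gamma(5), factorial(4))                          # Gamma(n) = (n-1)! for integer n
assert isclose(beta(a, b), gamma(a) * gamma(b) / gamma(a + b))  # B(a,b) = Gamma(a)Gamma(b)/Gamma(a+b)

# Generalised binomial coefficient: r(r-1)...(r-x+1)/x! for real r.
r, x = 2.5, 3
assert isclose(binom(r, x), r * (r - 1) * (r - 2) / factorial(x))
print("special-function identities verified")
```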

Basic properties of random variables



If $X$ has a discrete distribution then $E[g(X)] = \sum_x g(x)\,P(X = x)$.

If $X$ has a continuous distribution with density $f_X(x)$ then $E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx$.

If $E(X) = \mu$ and $\mathrm{Var}(X) = \sigma^2$ then $E(a + bX) = a + b\mu$ and $\mathrm{Var}(a + bX) = b^2\sigma^2$.

$E(X + Y) = E(X) + E(Y)$ for any $X$ and $Y$.

$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)$.

If $X$ and $Y$ are independent then $\mathrm{Cov}(X, Y) = 0$.

$\mathrm{Var}(X) = E(X^2) - [E(X)]^2$.

$\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y)$.

$\mathrm{Corr}(X, Y) = \mathrm{Cov}(X, Y)\big/\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}$.
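
These identities can be sanity-checked by simulation; the sketch below uses arbitrarily chosen distributions for $X$ and $Y$ and compares both sides of each identity:

```python
# Monte Carlo sanity check of the moment identities above (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
X = rng.exponential(scale=2.0, size=n)     # E(X) = 2, Var(X) = 4
Y = X + rng.normal(size=n)                 # deliberately correlated with X
a, b = 3.0, -1.5

print(np.var(a + b * X), b**2 * np.var(X))                        # Var(a+bX) = b^2 Var(X)
print(np.var(X), np.mean(X**2) - np.mean(X)**2)                   # Var(X) = E(X^2) - [E(X)]^2
print(np.cov(X, Y, bias=True)[0, 1],
      np.mean(X * Y) - np.mean(X) * np.mean(Y))                   # Cov(X,Y) = E(XY) - E(X)E(Y)
print(np.var(X + Y),
      np.var(X) + np.var(Y) + 2 * np.cov(X, Y, bias=True)[0, 1])  # Var(X+Y)
print(np.corrcoef(X, Y)[0, 1],
      np.cov(X, Y, bias=True)[0, 1] / np.sqrt(np.var(X) * np.var(Y)))  # Corr(X,Y)
```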

Finite Populations

Let $X = \{x_1, \ldots, x_N\}$ denote a population of real numbers with mean $\mu = \frac{1}{N}\sum_{j=1}^{N} x_j$ and (population) variance $\sigma^2 = \frac{1}{N}\sum_{j=1}^{N} (x_j - \mu)^2$. If $Y_1, \ldots, Y_n$ denotes a random sample taken from $X$ either with or without replacement then, with $T = \sum_{i=1}^{n} Y_i$, we have $E(T) = n\mu$.

Sampling with replacement


If the sampling is taken with replacement then

$Y_1, \ldots, Y_n$ are iid;

$\mathrm{Var}(T) = n\sigma^2$;

if all $x_j$'s are 0 or 1 then, with $p = \mu$, $T$ has a binomial distribution and for appropriate $t$, $P(T = t) = \binom{n}{t} p^t (1 - p)^{n - t}$.
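
A small simulation sketch of the facts above (the population and sample size are arbitrary choices):

```python
# Illustrative simulation: with-replacement sampling from a 0/1 population.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pop = np.array([0, 0, 0, 1, 1, 1, 1, 0, 1, 0])    # arbitrary 0/1 population
N, n = len(pop), 4
p, sigma2 = pop.mean(), pop.var()                 # p = mu; here sigma2 = p(1-p)

T = rng.choice(pop, size=(100_000, n), replace=True).sum(axis=1)
print("simulated Var(T):", T.var(), " theory n*sigma^2:", n * sigma2)
print("simulated P(T=2):", (T == 2).mean(), " binomial pmf:", stats.binom.pmf(2, n, p))
```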

Sampling without replacement
If the sampling is taken without replacement then

$\mathrm{Var}(T) = n\sigma^2\,\frac{N - n}{N - 1}$;

if all $x_j$'s are 0 or 1 then, with $M = N\mu$, $T$ has a hypergeometric distribution and for appropriate $t$, $P(T = t) = \binom{M}{t}\binom{N - M}{n - t}\big/\binom{N}{n}$;

if $x_j = j$ for $j = 1, \ldots, N$ then the $i$-th smallest observation $Y_{(i)}$ satisfies
$P(Y_{(i)} = y) = \binom{y - 1}{i - 1}\binom{N - y}{n - i}\big/\binom{N}{n}$ for appropriate $y$;
$E(Y_{(i)}) = \frac{i(N + 1)}{n + 1}$ and $\mathrm{Var}(Y_{(i)}) = \frac{i(n - i + 1)(N + 1)(N - n)}{(n + 1)^2(n + 2)}$.
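
A sketch checking the finite-population correction, the hypergeometric probability and the order-statistic moments by simulation (the values of $N$, $n$, $M$ and $i$ are arbitrary):

```python
# Illustrative simulation: without-replacement sampling from {1,...,N} (values arbitrary).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
N, n, i, M = 20, 5, 2, 7
pop = np.arange(1, N + 1)
sigma2 = pop.var()

# Each row of `samples` is a simple random sample of size n drawn without replacement.
samples = (np.argsort(rng.random((50_000, N)), axis=1) + 1)[:, :n]

T = samples.sum(axis=1)
print("Var(T):", T.var(), " theory:", n * sigma2 * (N - n) / (N - 1))

# Hypergeometric case: treat x_j = 1 for j <= M, 0 otherwise.
T01 = (samples <= M).sum(axis=1)
print("P(T=2):", (T01 == 2).mean(), " pmf:", stats.hypergeom.pmf(2, N, M, n))

# Order statistic Y_(i), the i-th smallest of the sample.
Yi = np.sort(samples, axis=1)[:, i - 1]
print("E(Y_(i)):", Yi.mean(), " theory:", i * (N + 1) / (n + 1))
print("Var(Y_(i)):", Yi.var(),
      " theory:", i * (n - i + 1) * (N + 1) * (N - n) / ((n + 1) ** 2 * (n + 2)))
```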

Probability Distributions
Discrete Distributions
Beta Binomial X has a beta-binomial distribution if for positive $\alpha$, $\beta$ and integer $n$,
\[ P(X = x) = \frac{\binom{\alpha + x - 1}{x}\binom{\beta + n - x - 1}{n - x}}{\binom{\alpha + \beta + n - 1}{n}} = \binom{n}{x}\,\frac{B(\alpha + x,\, \beta + n - x)}{B(\alpha, \beta)} \quad \text{for } x = 0, 1, \ldots, n; \]
then $E(X) = np$ and $\mathrm{Var}(X) = np(1 - p)\,\frac{N + n}{N + 1}$ where $N = \alpha + \beta$ and $p = \alpha/N$.
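
A numerical check of this pmf and its moments, assuming a scipy version that provides scipy.stats.betabinom (1.4 or later); $\alpha$, $\beta$ and $n$ below are arbitrary:

```python
# Numerical check of the beta-binomial pmf and moments (requires scipy >= 1.4 for betabinom).
from math import isclose

from scipy import stats
from scipy.special import beta, binom

a, b, n = 2.0, 3.0, 10          # arbitrary alpha, beta, n
N, p = a + b, a / (a + b)
X = stats.betabinom(n, a, b)

x = 4
assert isclose(X.pmf(x), binom(n, x) * beta(a + x, b + n - x) / beta(a, b), rel_tol=1e-9)
assert isclose(X.mean(), n * p)
assert isclose(X.var(), n * p * (1 - p) * (N + n) / (N + 1))
print("beta-binomial formulas agree")
```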

Poisson X has a Poisson distribution if for $x = 0, 1, 2, \ldots$, $P(X = x) = e^{-\lambda}\lambda^x/x!$, in which case $E(X) = \mathrm{Var}(X) = \lambda$.

Negative Binomial X has a negative binomial distribution if for $k > 0$, $0 < p < 1$ and $x = 0, 1, 2, \ldots$,
$P(X = x) = \binom{x + k - 1}{x}(1 - p)^x p^k$, in which case $E(X) = k(1 - p)/p$ and $\mathrm{Var}(X) = k(1 - p)/p^2$. For integer values of $k$, $Y = X + k$ is also called negative binomial. The geometric distribution is negative binomial with $k = 1$.
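
Both distributions can be checked against scipy.stats, whose nbinom uses the same "number of failures before the $k$-th success" convention as above; the parameter values are arbitrary:

```python
# Numerical check of the Poisson and negative binomial facts above (values arbitrary).
from math import exp, factorial, isclose

from scipy import stats
from scipy.special import binom

lam = 3.5
P = stats.poisson(lam)
assert isclose(P.pmf(2), exp(-lam) * lam**2 / factorial(2))
assert isclose(P.mean(), lam) and isclose(P.var(), lam)

k, p, x = 4, 0.3, 6             # scipy's nbinom also counts failures before the k-th success
NB = stats.nbinom(k, p)
assert isclose(NB.pmf(x), binom(x + k - 1, x) * (1 - p)**x * p**k)
assert isclose(NB.mean(), k * (1 - p) / p)
assert isclose(NB.var(), k * (1 - p) / p**2)

# Geometric = negative binomial with k = 1 (scipy's geom counts trials, hence the shift by one).
assert isclose(stats.nbinom(1, p).pmf(x), stats.geom(p).pmf(x + 1))
print("Poisson and negative binomial formulas agree")
```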

Continuous Distributions
If $X$ has density $f_X(x)$ and $Y = \mu + \sigma X$ then $Y$ has density $f_Y(y) = \frac{1}{\sigma}\, f_X\!\left(\frac{y - \mu}{\sigma}\right)$.

Uniform $X \sim U(0, 1)$ means $X$ has density $f_X(x) = 1$ for $0 < x < 1$, 0 otherwise. $E(X) = \frac{1}{2}$ and $\mathrm{Var}(X) = \frac{1}{12}$. $Y \sim U(a, b)$ means that $(Y - a)/(b - a) \sim U(0, 1)$.
1 2
Normal $X \sim N(0, 1)$ means $X$ has density $f_X(x) = (2\pi)^{-1/2} e^{-\frac{1}{2}x^2}$. $E(X) = 0$ and $\mathrm{Var}(X) = 1$.
$Y \sim N(\mu, \sigma^2)$ means $(Y - \mu)/\sigma \sim N(0, 1)$.
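
The location-scale density formula and the uniform/normal facts can be verified against scipy's loc/scale machinery (the values of $\mu$, $\sigma$, $a$, $b$ below are arbitrary):

```python
# Check of the location-scale density formula and the uniform/normal facts (values arbitrary).
from math import exp, isclose, pi

from scipy import stats

mu, sigma, y = 1.5, 2.0, 0.7
z = (y - mu) / sigma
assert isclose(stats.norm(mu, sigma).pdf(y), stats.norm().pdf(z) / sigma)   # f_Y(y) = f_X(z)/sigma
assert isclose(stats.norm().pdf(z), (2 * pi) ** -0.5 * exp(-z**2 / 2))      # standard normal density

a, b = -1.0, 3.0                      # Y ~ U(a, b) corresponds to (Y - a)/(b - a) ~ U(0, 1)
assert isclose(stats.uniform(a, b - a).pdf(y),
               stats.uniform().pdf((y - a) / (b - a)) / (b - a))
assert isclose(stats.uniform().mean(), 1 / 2) and isclose(stats.uniform().var(), 1 / 12)
print("location-scale, uniform and normal facts agree")
```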

Beta

$U \sim \mathrm{Beta}(\alpha, \beta)$ means $U$ has pdf $u^{\alpha - 1}(1 - u)^{\beta - 1}/B(\alpha, \beta)$ over the interval $(0, 1)$ and 0 otherwise, where the normalising constant is the beta function.

If $U \sim \mathrm{Beta}(\alpha, \beta)$ then for appropriate $k$, $E(U^k) = B(\alpha + k, \beta)/B(\alpha, \beta)$. In particular, $E(U) = \frac{\alpha}{\alpha + \beta}$ and $\mathrm{Var}(U) = \frac{\alpha\beta}{(\alpha + \beta)^2(\alpha + \beta + 1)}$.

If $U_1, \ldots, U_n$ are independent $U(0, 1)$ then the $i$-th order statistic $U_{(i)} \sim \mathrm{Beta}(i, n - i + 1)$.
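
A sketch checking the moment formula and the order-statistic result (the shape parameters and the values of $k$, $n$ and $i$ below are arbitrary):

```python
# Check of the Beta moment formula and the U(0,1) order-statistic result (values arbitrary).
from math import isclose

import numpy as np
from scipy import stats
from scipy.special import beta as B

a, b, k = 2.5, 4.0, 3
U = stats.beta(a, b)
assert isclose(U.moment(k), B(a + k, b) / B(a, b), rel_tol=1e-7)   # E(U^k)
assert isclose(U.mean(), a / (a + b))
assert isclose(U.var(), a * b / ((a + b) ** 2 * (a + b + 1)))

# i-th order statistic of n iid U(0,1) variables compared with Beta(i, n-i+1) by simulation.
rng = np.random.default_rng(3)
n, i = 7, 3
Ui = np.sort(rng.uniform(size=(100_000, n)), axis=1)[:, i - 1]
ref = stats.beta(i, n - i + 1)
print("simulated mean/var:", Ui.mean(), Ui.var(), " Beta(i, n-i+1):", ref.mean(), ref.var())
```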

Gamma

If, for $\alpha > 0$, $X$ has density $f_X(x) = x^{\alpha - 1} e^{-x}/\Gamma(\alpha)$ for $x > 0$, 0 otherwise, then we say $X$ has a gamma distribution with shape $\alpha$ and unit rate/scale and we write $X \sim \mathrm{gamma}(\alpha, 1)$.

If $X \sim \mathrm{gamma}(\alpha, 1)$ then $E(X^k) = \Gamma(\alpha + k)/\Gamma(\alpha)$ (for $k > -\alpha$), so $E(X) = \mathrm{Var}(X) = \alpha$.

If $X_1 \sim \mathrm{gamma}(\alpha_1, 1)$ and $X_2 \sim \mathrm{gamma}(\alpha_2, 1)$ are independent then $X_1 + X_2 \sim \mathrm{gamma}(\alpha_1 + \alpha_2, 1)$.

If $X \sim \mathrm{gamma}(\alpha, 1)$ then

$Y = X/\beta$ is said to be gamma with shape parameter $\alpha$ and rate parameter $\beta$;

$Z = \beta X$ is said to be gamma with shape parameter $\alpha$ and scale parameter $\beta$.

Exponential is the same as gamma with shape parameter 1.
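
A sketch of these gamma facts; note that scipy.stats.gamma is parametrised by shape and scale, so a rate $\beta$ corresponds to scale $= 1/\beta$ (the values below are arbitrary):

```python
# Check of the gamma facts; scipy.stats.gamma takes shape and scale, so rate beta <-> scale 1/beta.
from math import isclose

import numpy as np
from scipy import stats
from scipy.special import gamma as G

a, k = 2.5, 3                                   # arbitrary shape and moment order
X = stats.gamma(a)                              # shape a, unit scale
assert isclose(X.moment(k), G(a + k) / G(a), rel_tol=1e-7)   # E(X^k) = Gamma(a+k)/Gamma(a)
assert isclose(X.mean(), a) and isclose(X.var(), a)

rate = 2.0
assert isclose(stats.gamma(a, scale=1 / rate).mean(), a / rate)        # Y = X/beta has mean a/beta
assert isclose(stats.expon().pdf(1.2), stats.gamma(1).pdf(1.2))        # exponential = gamma(1, 1)

# Additivity of independent gammas with the same (unit) rate, checked by simulation.
rng = np.random.default_rng(4)
a1, a2 = 1.5, 2.0
S = rng.gamma(a1, size=200_000) + rng.gamma(a2, size=200_000)
print("simulated mean/var of X1+X2:", S.mean(), S.var(), " theory:", a1 + a2, a1 + a2)
```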

Convergence in Probability
A sequence of random variables $X_1, X_2, \ldots$ is said to converge to $\mu$ in probability if for any $\epsilon > 0$,
$P(|X_n - \mu| > \epsilon) \to 0$ as $n \to \infty$.
If $X_n \xrightarrow{P} c$ and a function $g(\cdot)$ is such that $\lim_{x \to c} g(x) = \gamma$ then $g(X_n) \xrightarrow{P} \gamma$.

Delta Method If $X_n \xrightarrow{P} c$ and $g(\cdot)$ is differentiable at $c$ then for large $n$ we may use the approximation
\[ \frac{g(X_n) - g(c)}{X_n - c} \approx g'(c). \]
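
A numerical illustration of the delta-method approximation, taking $g = \log$ and $X_n$ to be the mean of $n$ exponential variables (both choices are arbitrary):

```python
# Numerical illustration of the delta-method approximation with g = log, c = 1 (choices arbitrary).
import numpy as np

rng = np.random.default_rng(5)
c, n, reps = 1.0, 1_000, 5_000
Xn = rng.exponential(scale=c, size=(reps, n)).mean(axis=1)   # Xn -> c in probability

ratio = (np.log(Xn) - np.log(c)) / (Xn - c)                  # should be close to g'(c) = 1/c
print("mean of (g(Xn)-g(c))/(Xn-c):", ratio.mean(), " g'(c):", 1 / c)
print("sd of g(Xn):", np.log(Xn).std(), " delta-method sd |g'(c)| * sd(Xn):", Xn.std() / c)
```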

Inequalities
Markov's Inequality If $X$ is a random variable only taking non-negative values then for any $c > 0$,
$P(X \ge c) \le E(X)/c$.

Chebyshev's Inequality If $X$ has $E(X) = \mu$ and $\mathrm{Var}(X) = \sigma^2 < \infty$ then for any $k > 0$,
$P(|X - \mu| \ge k\sigma) \le 1/k^2$.

Cauchy-Schwarz Inequality If $X$ and $Y$ are random variables with $E(X^2) < \infty$ and $E(Y^2) < \infty$ then $[E(XY)]^2 \le E(X^2)E(Y^2)$, with equality if and only if $Y = cX$ for some constant $c$.
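
These three inequalities can be illustrated by simulation; the distributions of $X$ and $Y$ below are arbitrary choices:

```python
# Simulation illustration of Markov's, Chebyshev's and the Cauchy-Schwarz inequalities.
import numpy as np

rng = np.random.default_rng(6)
X = rng.exponential(scale=2.0, size=500_000)      # non-negative; E(X) = 2, sd(X) = 2

c = 5.0
print("Markov:    P(X >= c) =", (X >= c).mean(), "<=", X.mean() / c)

mu, sigma, k = X.mean(), X.std(), 2.0
print("Chebyshev: P(|X - mu| >= k sigma) =", (np.abs(X - mu) >= k * sigma).mean(), "<=", 1 / k**2)

Y = rng.normal(size=X.size)
print("Cauchy-Schwarz:", np.mean(X * Y) ** 2, "<=", np.mean(X**2) * np.mean(Y**2))
```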

Cramér-Rao Inequality If $\ell'_\theta(y)$ denotes the derivative with respect to $\theta$ of the log-likelihood for a one-parameter family indexed by $\theta$ and $t(Y)$ is an unbiased estimator of $\theta$ then
\[ \mathrm{Var}\left[t(Y)\right] \ge \frac{1}{\mathrm{Var}\left[\ell'_\theta(Y)\right]} \]
with equality if and only if $\ell'_\theta(y) = C\,[t(y) - \theta]$ for some $C$ not depending on $y$.
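
As an illustration of the equality case, for an iid $\mathrm{Poisson}(\theta)$ sample the score is $(\sum y_i - n\theta)/\theta = (n/\theta)(\bar{y} - \theta)$, so the sample mean attains the bound; the sketch below checks this numerically ($\theta$ and $n$ are arbitrary):

```python
# Cramer-Rao for an iid Poisson(theta) sample: the score equals (n/theta)(ybar - theta),
# so the unbiased estimator ybar attains the bound exactly (theta, n chosen arbitrarily).
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 3.0, 25, 200_000
Y = rng.poisson(theta, size=(reps, n))

ybar = Y.mean(axis=1)                              # t(Y): unbiased estimator of theta
score = (Y.sum(axis=1) - n * theta) / theta        # derivative of the log-likelihood at theta
print("Var[t(Y)]:", ybar.var(), " 1/Var[score]:", 1 / score.var(), " theory theta/n:", theta / n)
```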

