
# Proof of the Gauss-Markov Theorem

© Dan Nettleton (Iowa State University), Statistics 511

## The Gauss-Markov Theorem

Under the Gauss-Markov Linear Model, the OLS estimator $\mathbf{c}'\hat{\boldsymbol{\beta}}$ of an estimable linear function $\mathbf{c}'\boldsymbol{\beta}$ is the unique Best Linear Unbiased Estimator (BLUE), in the sense that $\text{Var}(\mathbf{c}'\hat{\boldsymbol{\beta}})$ is strictly less than the variance of any other linear unbiased estimator of $\mathbf{c}'\boldsymbol{\beta}$.


## Unbiased Linear Estimators of $\mathbf{c}'\boldsymbol{\beta}$

If $\mathbf{a}$ is a fixed vector, then $\mathbf{a}'\mathbf{y}$ is a linear function of $\mathbf{y}$.

An estimator that is a linear function of $\mathbf{y}$ is said to be a linear estimator.

A linear estimator $\mathbf{a}'\mathbf{y}$ is an unbiased estimator of $\mathbf{c}'\boldsymbol{\beta}$ if and only if

$$
\begin{aligned}
E(\mathbf{a}'\mathbf{y}) = \mathbf{c}'\boldsymbol{\beta} \;\; \forall\, \boldsymbol{\beta} \in \mathbb{R}^p
&\iff \mathbf{a}'E(\mathbf{y}) = \mathbf{c}'\boldsymbol{\beta} \;\; \forall\, \boldsymbol{\beta} \in \mathbb{R}^p \\
&\iff \mathbf{a}'\mathbf{X}\boldsymbol{\beta} = \mathbf{c}'\boldsymbol{\beta} \;\; \forall\, \boldsymbol{\beta} \in \mathbb{R}^p \\
&\iff \mathbf{a}'\mathbf{X} = \mathbf{c}'.
\end{aligned}
$$
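As a numerical illustration of this equivalence, here is a minimal sketch (not part of the original slides; the design matrix `X`, the vectors `a` and `c`, and `beta` are made-up examples):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[1., 0.],
              [1., 0.],
              [1., 1.],
              [1., 1.]])          # two-group design: intercept + group indicator
beta = np.array([2., 3.])         # true coefficients (arbitrary choice)
c = np.array([1., 1.])            # target c'beta = beta_1 + beta_2 = 5
a = np.array([0., 0., 0.5, 0.5])  # a'y averages the two group-2 responses

print(np.allclose(a @ X, c))      # True: a'X = c', so a'y is unbiased for c'beta

# Monte Carlo check: the average of a'y over many draws approaches c'beta = 5
ys = X @ beta + rng.normal(size=(100_000, 4))
print((ys @ a).mean())            # close to 5.0
```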


## The OLS Estimator of $\mathbf{c}'\boldsymbol{\beta}$ is a Linear Estimator

We have previously defined the Ordinary Least Squares (OLS) estimator of an estimable $\mathbf{c}'\boldsymbol{\beta}$ by $\mathbf{c}'\hat{\boldsymbol{\beta}}$, where $\hat{\boldsymbol{\beta}}$ is any solution to the normal equations $\mathbf{X}'\mathbf{X}\mathbf{b} = \mathbf{X}'\mathbf{y}$.

We have previously shown that $\mathbf{c}'\hat{\boldsymbol{\beta}}$ is the same for any $\hat{\boldsymbol{\beta}}$ that is a solution to the normal equations.

We have previously shown that $(\mathbf{X}'\mathbf{X})^{-}\mathbf{X}'\mathbf{y}$ is a solution to the normal equations for any generalized inverse of $\mathbf{X}'\mathbf{X}$, denoted by $(\mathbf{X}'\mathbf{X})^{-}$.

Thus, $\mathbf{c}'\hat{\boldsymbol{\beta}} = \mathbf{c}'(\mathbf{X}'\mathbf{X})^{-}\mathbf{X}'\mathbf{y} = \boldsymbol{\ell}'\mathbf{y}$ (where $\boldsymbol{\ell}' = \mathbf{c}'(\mathbf{X}'\mathbf{X})^{-}\mathbf{X}'$), so that $\mathbf{c}'\hat{\boldsymbol{\beta}}$ is a linear estimator.
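A minimal sketch of the invariance claim (the rank-deficient one-way design below is an illustrative choice, not from the slides): two different generalized inverses of $\mathbf{X}'\mathbf{X}$ produce the same value of $\boldsymbol{\ell}'\mathbf{y}$ for an estimable $\mathbf{c}'\boldsymbol{\beta}$.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.]])        # intercept + two group indicators (rank 2)
y = rng.normal(size=4)
c = np.array([1., 1., 0.])          # c'beta = mu + tau_1, estimable here

A = X.T @ X                         # singular, so A has many generalized inverses
G1 = np.linalg.pinv(A)              # the Moore-Penrose inverse is one of them
W = rng.normal(size=(3, 3))
G2 = G1 + (np.eye(3) - G1 @ A) @ W  # satisfies A G2 A = A, so it is another one

for G in (G1, G2):
    print(np.allclose(A @ G @ A, A))  # True for both: each is a g-inverse of A
    l = X @ G.T @ c                   # l = (c'(X'X)^- X')' = X G' c
    print(l @ y)                      # c'beta_hat = l'y: identical for G1 and G2
```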

## $\mathbf{c}'\hat{\boldsymbol{\beta}}$ is an Unbiased Estimator of an Estimable $\mathbf{c}'\boldsymbol{\beta}$

By definition, $\mathbf{c}'\boldsymbol{\beta}$ is estimable if and only if there exists a linear unbiased estimator of $\mathbf{c}'\boldsymbol{\beta}$.

It follows from the unbiasedness equivalence above that $\mathbf{c}'\boldsymbol{\beta}$ is estimable if and only if $\mathbf{c}' = \mathbf{a}'\mathbf{X}$ for some vector $\mathbf{a}$.

If $\mathbf{c}'\boldsymbol{\beta}$ is estimable, then

$$
\boldsymbol{\ell}'\mathbf{X} = \mathbf{c}'(\mathbf{X}'\mathbf{X})^{-}\mathbf{X}'\mathbf{X} = \mathbf{a}'\mathbf{X}(\mathbf{X}'\mathbf{X})^{-}\mathbf{X}'\mathbf{X} = \mathbf{a}'\mathbf{P}_{\mathbf{X}}\mathbf{X} = \mathbf{a}'\mathbf{X} = \mathbf{c}'.
$$

Thus, by that same equivalence, $\mathbf{c}'\hat{\boldsymbol{\beta}} = \boldsymbol{\ell}'\mathbf{y}$ is an unbiased estimator of $\mathbf{c}'\boldsymbol{\beta}$ whenever $\mathbf{c}'\boldsymbol{\beta}$ is estimable.
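The sketch below checks $\boldsymbol{\ell}'\mathbf{X} = \mathbf{c}'$ numerically (same illustrative design as above, repeated so the block runs on its own); the non-estimable contrast at the end is a made-up addition showing where the identity fails.

```python
import numpy as np

X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.]])
c = np.array([1., 1., 0.])            # estimable: c' = a'X with a = (1, 0, 0, 0)
G = np.linalg.pinv(X.T @ X)
l = X @ G.T @ c                       # l' = c'(X'X)^- X'
print(np.allclose(l @ X, c))          # True, so E(l'y) = l'X beta = c'beta

c_bad = np.array([0., 1., 0.])        # tau_1 alone is NOT estimable in this design
l_bad = X @ G.T @ c_bad
print(np.allclose(l_bad @ X, c_bad))  # False: the identity fails
```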


## Proof of the Gauss-Markov Theorem

Suppose $\mathbf{d}'\mathbf{y}$ is any linear unbiased estimator other than the OLS estimator $\mathbf{c}'\hat{\boldsymbol{\beta}} = \boldsymbol{\ell}'\mathbf{y}$. Then we know the following:

1. $\mathbf{d} \neq \boldsymbol{\ell} \implies \|\mathbf{d} - \boldsymbol{\ell}\|^2 = (\mathbf{d} - \boldsymbol{\ell})'(\mathbf{d} - \boldsymbol{\ell}) > 0$, and
2. $\mathbf{d}'\mathbf{X} = \boldsymbol{\ell}'\mathbf{X} = \mathbf{c}' \implies \mathbf{d}'\mathbf{X} - \boldsymbol{\ell}'\mathbf{X} = \mathbf{0}' \implies (\mathbf{d} - \boldsymbol{\ell})'\mathbf{X} = \mathbf{0}'$.

We need to show that $\text{Var}(\mathbf{d}'\mathbf{y}) > \text{Var}(\mathbf{c}'\hat{\boldsymbol{\beta}})$.
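Continuing the illustrative design from the earlier sketches (a made-up example, not from the slides), the following block builds one competing linear unbiased estimator $\mathbf{d}'\mathbf{y}$ and checks facts (1) and (2) numerically.

```python
import numpy as np

X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.]])
c = np.array([1., 1., 0.])             # c'beta = mu + tau_1 (estimable)
l = X @ np.linalg.pinv(X.T @ X).T @ c  # OLS coefficient vector l

d = np.array([1., 0., 0., 0.])         # d'y = y_1: also satisfies d'X = c'
print(np.allclose(d @ X, c))           # True: d'y is unbiased for c'beta
print((d - l) @ (d - l) > 0)           # (1): d != l, so ||d - l||^2 > 0
print(np.allclose((d - l) @ X, 0))     # (2): (d - l)'X = 0'
```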


$$
\begin{aligned}
\text{Var}(\mathbf{d}'\mathbf{y}) &= \text{Var}(\mathbf{d}'\mathbf{y} - \mathbf{c}'\hat{\boldsymbol{\beta}} + \mathbf{c}'\hat{\boldsymbol{\beta}}) \\
&= \text{Var}(\mathbf{d}'\mathbf{y} - \mathbf{c}'\hat{\boldsymbol{\beta}}) + \text{Var}(\mathbf{c}'\hat{\boldsymbol{\beta}}) + 2\,\text{Cov}(\mathbf{d}'\mathbf{y} - \mathbf{c}'\hat{\boldsymbol{\beta}},\ \mathbf{c}'\hat{\boldsymbol{\beta}}).
\end{aligned}
$$

For the first term,

$$
\begin{aligned}
\text{Var}(\mathbf{d}'\mathbf{y} - \mathbf{c}'\hat{\boldsymbol{\beta}}) &= \text{Var}(\mathbf{d}'\mathbf{y} - \boldsymbol{\ell}'\mathbf{y}) = \text{Var}((\mathbf{d}' - \boldsymbol{\ell}')\mathbf{y}) = \text{Var}((\mathbf{d} - \boldsymbol{\ell})'\mathbf{y}) \\
&= (\mathbf{d} - \boldsymbol{\ell})'\text{Var}(\mathbf{y})(\mathbf{d} - \boldsymbol{\ell}) = (\mathbf{d} - \boldsymbol{\ell})'(\sigma^2 \mathbf{I})(\mathbf{d} - \boldsymbol{\ell}) \\
&= \sigma^2 (\mathbf{d} - \boldsymbol{\ell})'\mathbf{I}(\mathbf{d} - \boldsymbol{\ell}) = \sigma^2 (\mathbf{d} - \boldsymbol{\ell})'(\mathbf{d} - \boldsymbol{\ell}) > 0 \text{ by (1)}.
\end{aligned}
$$

For the covariance term,

$$
\begin{aligned}
\text{Cov}(\mathbf{d}'\mathbf{y} - \mathbf{c}'\hat{\boldsymbol{\beta}},\ \mathbf{c}'\hat{\boldsymbol{\beta}}) &= \text{Cov}(\mathbf{d}'\mathbf{y} - \boldsymbol{\ell}'\mathbf{y},\ \boldsymbol{\ell}'\mathbf{y}) = \text{Cov}((\mathbf{d} - \boldsymbol{\ell})'\mathbf{y},\ \boldsymbol{\ell}'\mathbf{y}) \\
&= (\mathbf{d} - \boldsymbol{\ell})'\text{Var}(\mathbf{y})\,\boldsymbol{\ell} = \sigma^2 (\mathbf{d} - \boldsymbol{\ell})'\boldsymbol{\ell} \\
&= \sigma^2 (\mathbf{d} - \boldsymbol{\ell})'\mathbf{X}[(\mathbf{X}'\mathbf{X})^{-}]'\mathbf{c} = 0 \text{ by (2)}.
\end{aligned}
$$
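As a quick numerical check of the vanishing cross term (same made-up design and competing estimator as in the sketches above; $\sigma^2 = 1$ is an arbitrary choice, since the factor is common):

```python
import numpy as np

X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.]])
c = np.array([1., 1., 0.])
l = X @ np.linalg.pinv(X.T @ X).T @ c  # OLS coefficient vector: l' = c'(X'X)^- X'
d = np.array([1., 0., 0., 0.])         # competing unbiased estimator from above

sigma2 = 1.0
print(np.isclose(sigma2 * (d - l) @ l, 0))  # True: the cross term vanishes
```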


It follows that

$$
\text{Var}(\mathbf{d}'\mathbf{y}) = \text{Var}(\mathbf{d}'\mathbf{y} - \mathbf{c}'\hat{\boldsymbol{\beta}}) + \text{Var}(\mathbf{c}'\hat{\boldsymbol{\beta}}) > \text{Var}(\mathbf{c}'\hat{\boldsymbol{\beta}}). \qquad \blacksquare
$$
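To close the loop numerically (same illustrative design; under $\text{Var}(\mathbf{y}) = \sigma^2\mathbf{I}$ we have $\text{Var}(\mathbf{d}'\mathbf{y}) = \sigma^2\mathbf{d}'\mathbf{d}$ and $\text{Var}(\mathbf{c}'\hat{\boldsymbol{\beta}}) = \sigma^2\boldsymbol{\ell}'\boldsymbol{\ell}$):

```python
import numpy as np

X = np.array([[1., 1., 0.],
              [1., 1., 0.],
              [1., 0., 1.],
              [1., 0., 1.]])
c = np.array([1., 1., 0.])
l = X @ np.linalg.pinv(X.T @ X).T @ c  # OLS: c'beta_hat = l'y
d = np.array([1., 0., 0., 0.])         # competing unbiased estimator d'y = y_1

sigma2 = 1.0
print(sigma2 * (l @ l))  # 0.5 = Var(c'beta_hat)
print(sigma2 * (d @ d))  # 1.0 = Var(d'y), strictly larger as the theorem asserts
```

In this example $\mathbf{d}'\mathbf{y} = y_1$ uses a single group-1 response, while the OLS estimator averages both group-1 responses, halving the variance.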
