
Autocorrelation

Autocorrelation (sometimes called serial correlation) occurs when one of the Gauss-
Markov assumptions fails and the error terms are correlated across observations,
i.e. Cov(u_t, u_s) ≠ 0 for t ≠ s.
This can be due to a variety of problems, but the main cause is when an important
variable has been omitted from the regression. In the presence of autocorrelation the
OLS estimator is no longer BLUE, because it is no longer the best (minimum variance)
estimator. In this case the t-statistics and other tests are no longer valid.

Testing for Autocorrelation


To test for first-order autocorrelation, we use the Durbin-Watson (DW) d statistic.
Given the following 1st order process:

y_t = α + β x_t + u_t    (1a)
where: u_t = ρ u_{t-1} + ε_t    (1b)

The d statistic is roughly d = 2 − 2ρ̂, where ρ̂ lies between −1 and +1; the statistic
therefore lies between 0 and 4. (The exact statistic is
d = Σ_{t=2}^{T} (û_t − û_{t-1})² / Σ_{t=1}^{T} û_t².) Having calculated the DW
statistic, you need to go to the tables to find the critical values, which take the
form of a lower statistic (dL) and an upper statistic (dU). To determine whether there
is autocorrelation or not, you need to put the values into the following framework:

0 ........ dL : reject H_0 (positive autocorrelation)
dL ....... dU : indecision
dU ..... 4-dU : no first-order autocorrelation
4-dU ... 4-dL : indecision
4-dL ...... 4 : reject H_0 (negative autocorrelation)

If the DW d-statistic is between dU and 4-dU, there is no first-order autocorrelation.
If it is below dL or above 4-dL, then autocorrelation is present.
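
As an illustration (not from the original notes), the following sketch simulates the
first-order process in (1a)-(1b) and computes the d statistic directly from the OLS
residuals. The parameter values (α = 1, β = 0.5, ρ = 0.7), the sample size and all
variable names are assumptions chosen for the example.

import numpy as np

rng = np.random.default_rng(0)
T = 100
alpha, beta, rho = 1.0, 0.5, 0.7           # assumed true parameters

# Generate AR(1) errors: u_t = rho * u_{t-1} + eps_t, as in (1b)
eps = rng.normal(size=T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = rho * u[t - 1] + eps[t]

x = rng.normal(size=T)
y = alpha + beta * x + u                   # the process in (1a)

# OLS of y on a constant and x, then residuals
X = np.column_stack([np.ones(T), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ b

# d = sum (e_t - e_{t-1})^2 / sum e_t^2, which is roughly 2(1 - rho_hat)
d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
print(f"DW d = {d:.3f}")                   # well below 2, signalling positive autocorrelation

With ρ = 0.7 the statistic should come out well below 2, and below the tabulated dL,
so the test rejects the hypothesis of no autocorrelation, as it should.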

Detection of higher order autocorrelation: The Lagrange Multiplier Test

The Lagrange Multiplier test is used for detecting autocorrelation of a more general
form, such as 2nd or 4th order autocorrelation. The test is executed as follows:
i) First decide on the order of autocorrelation that you want to test, say 2;
ii) Run the usual OLS regression of y against the explanatory variable x:

y_t = α + β x_t + u_t    (2)

and save the residuals, û_t;


iii) Run a regression using the residuals from step ii) as the dependent variable
against the explanatory variable x_t (as in ii)) and also lagged values of û (depending
on the order of the autocorrelation, in this case 2 lags):

û_t = α_0 + α_1 x_t + α_2 û_{t-1} + α_3 û_{t-2} + ε_t    (3)
iv) Calculate TR² for this regression (total number of observations multiplied by the
R² value). Under the null hypothesis of no autocorrelation, this statistic has a χ²
(chi-squared) distribution with s (the number of lags on the error term) degrees of
freedom (in this case 2, which has a 5% critical value of 5.99). There are two
important points regarding the Lagrange Multiplier test: firstly, it is a large-sample
test, so caution is needed in interpreting results from a small sample; and secondly,
it detects not only autoregressive autocorrelation but also moving-average
autocorrelation. Again caution is needed in interpreting the results.
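
The sketch below carries out steps i)-iv) by hand, continuing from the simulated data
(x, e, T) of the previous snippet. Using the T − s usable observations of the auxiliary
regression in the TR² statistic is a convention assumed here; some texts simply use T.

from scipy.stats import chi2

s = 2                                       # order of autocorrelation tested
# Auxiliary regression (3): e_t on a constant, x_t, and s lags of e
Z = np.column_stack([np.ones(T - s), x[s:]]
                    + [e[s - j:T - j] for j in range(1, s + 1)])
g = np.linalg.lstsq(Z, e[s:], rcond=None)[0]
resid = e[s:] - Z @ g

# R^2 of the auxiliary regression, then the TR^2 statistic
tss = (e[s:] - e[s:].mean()) @ (e[s:] - e[s:].mean())
r2 = 1 - (resid @ resid) / tss
lm = (T - s) * r2                           # compare with chi2(s)
print(f"LM = {lm:.2f}, 5% critical value = {chi2.ppf(0.95, s):.2f}")  # 5.99 for s = 2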

Cochrane-Orcutt and Unrestricted models for remedying autocorrelation


Given the following model, suffering from first-order autocorrelation:

y_t = α + β x_t + u_t    (4a)
u_t = ρ u_{t-1} + ε_t    (4b)

Lag the first equation (4a) and multiply by ρ:

ρ y_{t-1} = ρα + ρβ x_{t-1} + ρ u_{t-1}    (4c)
(y_t − ρ y_{t-1}) = α(1 − ρ) + β(x_t − ρ x_{t-1}) + (u_t − ρ u_{t-1})    (4d)

This last equation (4d) is the generalised difference equation, formed by subtracting
(4c) from (4a), and is used in the Cochrane-Orcutt approach to eliminating the
autocorrelation. Cochrane and Orcutt then recommend the following steps to estimate ρ:
1. Estimate the two-variable model with the original untransformed data by the
standard OLS routine and obtain the residuals, û_t.
2. Using the estimated residuals, regress û_t against û_{t-1}, the coefficient of
û_{t-1} being an estimate of ρ.
3. Using the ρ̂ obtained in step 2, transform the data and run the generalised
difference equation, namely equation (4d) above:

y_t* = α* + β* x_t* + u_t*    (4e)

4. Collect the residuals (u_t*) from the new generalised difference equation and repeat
steps 1 to 3 until there is no further change in the parameter estimates.
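
A minimal sketch of steps 1 to 4, again continuing with the simulated (y, x) series
from the first snippet; the ols helper, the iteration cap and the convergence
tolerance are illustrative assumptions.

def ols(X, y):
    """OLS coefficients and residuals."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return b, y - X @ b

# Step 1: OLS on the untransformed data, collect the residuals
X = np.column_stack([np.ones(T), x])
_, e = ols(X, y)

rho_hat, rho_prev = 0.0, np.inf
for _ in range(50):                          # iterate steps 2-4 to convergence
    # Step 2: regress e_t on e_{t-1} (no constant); the slope estimates rho
    rho_hat = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])
    if abs(rho_hat - rho_prev) < 1e-8:
        break
    rho_prev = rho_hat
    # Step 3: run the generalised difference equation (4e)
    X_star = np.column_stack([np.ones(T - 1), x[1:] - rho_hat * x[:-1]])
    b_star, _ = ols(X_star, y[1:] - rho_hat * y[:-1])
    alpha_hat = b_star[0] / (1 - rho_hat)    # since alpha* = alpha(1 - rho)
    beta_hat = b_star[1]
    # Step 4: residuals of the original equation with the updated estimates
    e = y - alpha_hat - beta_hat * x

print(f"rho_hat = {rho_hat:.3f}, beta_hat = {beta_hat:.3f}")

The estimates should settle near the assumed ρ = 0.7 and β = 0.5 after a handful of
iterations.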

Autocorrelation may be due to the omission of an important explanatory variable,
such as a particular lag structure. The Cochrane-Orcutt procedure is a way of
introducing a specialised lag structure. The generalised difference equation used in
the Cochrane-Orcutt (C-O) procedure (step 3) can be rewritten as:

y_t = α(1 − ρ) + ρ y_{t-1} + β x_t − ρβ x_{t-1} + ε_t    (5a)

This equation effectively imposes the restriction that the coefficient of x_{t-1} is
equal to minus the product of the coefficients of the other two explanatory variables.
This restriction may not be true, in which case the following more general equation
may be appropriate:

y_t = β_0 + β_1 y_{t-1} + β_2 x_t + β_3 x_{t-1} + ε_t    (5b)

This is an unrestricted version of the C-O difference equation above. To test whether
the restriction is valid, we need to test:

β_3 = −β_1 β_2

To test if the restriction holds:

1) Run equation (5a) using the C-O procedure and collect the Residual Sum of Squares
(RSS_Res) (restricted equation).
2) Run equation (5b) using OLS and collect RSS_Unres (unrestricted equation).

We want to test the null hypothesis H_0: β_3 = −β_1 β_2. The test statistic for the
common factor test is:

T log_e (RSS_Res / RSS_Unres)

where T is the number of observations and the Residual Sums of Squares are from the
restricted and unrestricted regressions. The test statistic follows a chi-squared
distribution with degrees of freedom equal to the number of restrictions (1 in this
case). If the restriction is not rejected, we use the C-O procedure; if it is
rejected, fit equation (5b).

N.B. The RSS, or Residual Sum of Squares, is simply the sum of the squared residuals.
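
Continuing from the snippets above, this sketch computes the common factor statistic:
RSS_Res comes from the generalised difference equation evaluated at the converged ρ̂,
and RSS_Unres from an OLS fit of the unrestricted equation (5b). Using the T − 1
usable observations in place of T is an assumption of the sketch.

# Restricted RSS: residuals of the C-O generalised difference equation (5a)
X_star = np.column_stack([np.ones(T - 1), x[1:] - rho_hat * x[:-1]])
_, e_res = ols(X_star, y[1:] - rho_hat * y[:-1])
rss_res = e_res @ e_res

# Unrestricted RSS: OLS of y_t on a constant, y_{t-1}, x_t and x_{t-1} -- (5b)
X_u = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
_, e_unres = ols(X_u, y[1:])
rss_unres = e_unres @ e_unres

# Test statistic T * ln(RSS_Res / RSS_Unres), chi-squared with 1 df under H0
stat = (T - 1) * np.log(rss_res / rss_unres)
print(f"stat = {stat:.2f}, 5% critical value = {chi2.ppf(0.95, 1):.2f}")  # 3.84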
