
Econometrics

Chapter 6: Multiple Regression Model

Population regression function

Sample regression function
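In standard notation (the symbols here are conventional and assumed, not reproduced from the original slides), the population and sample regression functions with k regressors are:

```latex
% Population regression function: unknown parameters, error term u_i
Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \cdots + \beta_k X_{ki} + u_i

% Sample regression function: estimated coefficients, residual \hat{u}_i
Y_i = \hat{\beta}_0 + \hat{\beta}_1 X_{1i} + \hat{\beta}_2 X_{2i} + \cdots + \hat{\beta}_k X_{ki} + \hat{u}_i
```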

Assumptions of the classical regression model

Ideal conditions that guarantee that estimated parameters are unbiased, consistent, and attain the lowest variance among linear unbiased estimators.

Assumption 1: linearity

Linear relationship between the dependent variable and k independent variables:

Assumption 2: E(u)=0

E(ui)=0

Assumption 3: Homoskedasticity

The error terms have a constant variance (for all possible combinations of X1i, X2i, …, Xki):
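In symbols (standard notation, assuming σ² denotes the common error variance):

```latex
\operatorname{Var}(u_i) = E(u_i^2) = \sigma^2 \quad \text{for all } i = 1, \dots, n
```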

Heteroskedastic vs. homoskedastic error processes

Assumption 4: No autocorrelation

The error terms are independent across observations

Assumption 5: Nonstochastic X

The Xi are nonstochastic (not random). Common violations:

measurement error

endogenous variables

This assumption guarantees that the covariance between the independent variable and the error term will be zero.

Assumption 6: No perfect multicollinearity

None of the independent variables can be written as an exact linear combination of the other independent variables

Near-perfect multicollinearity

Near-perfect multicollinearity: one or more of the independent variables is approximately equal to a linear combination of the other independent variables. Consequence:

imprecise parameter estimates for the affected variables, similar to the effects of a small sample size.

Common phenomenon in time-series data due to common trend and cyclical effects
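Near-perfect multicollinearity is commonly quantified with a variance inflation factor (VIF). The sketch below (simulated data and illustrative variable names, not from the text) regresses one regressor on the other and computes VIF = 1/(1 − R²); a large VIF signals the imprecise estimates described above.

```python
import numpy as np

# Simulate two near-collinear regressors: x2 is almost a linear function of x1.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)

# R^2 from the auxiliary regression of x2 on x1 (with an intercept)
X = np.column_stack([np.ones(n), x1])
beta = np.linalg.lstsq(X, x2, rcond=None)[0]
resid = x2 - X @ beta
r2 = 1 - resid @ resid / ((x2 - x2.mean()) @ (x2 - x2.mean()))

# A large VIF means the coefficient on x2 is estimated imprecisely.
vif = 1 / (1 - r2)
```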

OLS estimation

Ordinary least squares estimation:
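OLS chooses the coefficient estimates to minimize the sum of squared residuals (standard criterion, written here in conventional notation):

```latex
\min_{\hat{\beta}_0, \hat{\beta}_1, \dots, \hat{\beta}_k}
\sum_{i=1}^{n} \left( Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_{1i} - \cdots - \hat{\beta}_k X_{ki} \right)^2
```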

OLS estimators

Derivation requires the use of matrix algebra. Estimators and standard errors are calculated by all statistical and econometric software packages.
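The matrix-algebra formulas alluded to above can be sketched in a few lines of NumPy (simulated data and illustrative names; this is the textbook estimator β̂ = (X′X)⁻¹X′y, not code from the slides):

```python
import numpy as np

# Simulate a regression with an intercept and k = 2 regressors.
rng = np.random.default_rng(42)
n, k = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# OLS estimator: solve (X'X) beta_hat = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Unbiased estimate of the error variance, and the standard errors
resid = y - X @ beta_hat
s2 = resid @ resid / (n - k - 1)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
```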

Standard errors of OLS estimators

The standard errors of the estimators will, in general, be smaller when:

the variance of the error term (σ²) is relatively small, the number of observations is relatively large, and/or there is a relatively large amount of independent variation in each of the variables included on the right-hand side of the regression equation.

Properties of OLS estimators

Under the conditions of the classical regression model, OLS estimators are:

consistent, linear, and unbiased, and they are the best linear unbiased estimators (BLUE) by the Gauss–Markov theorem.

Coefficient of Determination R2
With a bit of algebraic manipulation:

R2

TSS = RSS + ESS

R2 = RSS/TSS, or R2 = 1 − (ESS/TSS)

R2 = the proportion of the variation explained by the regression model

R2 is greater than or equal to zero and less than or equal to one
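The decomposition and the equivalence of the two R² formulas can be checked numerically (simulated data; the naming convention RSS = regression sum of squares, ESS = error sum of squares follows this chapter):

```python
import numpy as np

# Simulate a simple regression and fit it by least squares.
rng = np.random.default_rng(1)
x = rng.normal(size=30)
y = 1.0 + 2.0 * x + rng.normal(size=30)

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ beta

TSS = np.sum((y - y.mean()) ** 2)      # total sum of squares
RSS = np.sum((y_hat - y.mean()) ** 2)  # regression (explained) sum of squares
ESS = np.sum((y - y_hat) ** 2)         # error (residual) sum of squares

r2_a = RSS / TSS
r2_b = 1 - ESS / TSS  # equals r2_a because the model includes an intercept
```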

R2 and the intercept

R2 may be appropriately computed as RSS/TSS only if an intercept term is included in the regression model

R2 and degrees of freedom

As additional variables are added to a regression, R2 tends to increase even if there is no causal relation between the added variables and the dependent variable. R2 = 1 when the number of estimated intercept and slope parameters equals the number of observations.

Adjusted R2

Adjusted R2:
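The adjusted R2 penalizes added regressors; with n observations and k slope parameters, the standard formula is:

```latex
\bar{R}^2 = 1 - \left(1 - R^2\right)\frac{n-1}{n-k-1}
```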

Cautions in interpreting R2

R2 is not a statistic that can be directly used for hypothesis testing. If there is a large random component in the data-generating process, R2 will be low even if the model is correctly specified. R2 is a measure of correlation, not causation.

Forecasting

Forecasts have lower variance when:

the variance of the error term is smaller, or the sample size is larger.

Forecast accuracy is often assessed using the root mean square error:
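A standard definition, assuming m forecast periods with forecasts Ŷt of actual values Yt:

```latex
\text{RMSE} = \sqrt{\frac{1}{m}\sum_{t=1}^{m}\left(Y_t - \hat{Y}_t\right)^2}
```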
