
Tutorial 2

1. Given a simple linear regression model

y_i = \beta_0 + \beta_1 x_i + \epsilon_i, \quad i = 1, 2, \ldots, n, \qquad (1)

prove that

s^2 = \frac{1}{n-2} \sum_{i=1}^{n} \bigl( y_i - (\hat{\beta}_0 + \hat{\beta}_1 x_i) \bigr)^2

is an unbiased estimator of $\sigma^2$. Here $\hat{\beta}_0$ and $\hat{\beta}_1$ are the LSE estimators of $\beta_0$ and $\beta_1$,
and $\sigma^2$ is the variance of the i.i.d. errors $\epsilon_i$.
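
(One possible route, sketched here only as a hint; $S_{xx} = \sum_{i=1}^{n}(x_i - \bar{x})^2$ and $e_i = y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i$ are shorthand introduced just for this sketch.)

\begin{align*}
\sum_{i=1}^{n} e_i^2 &= \sum_{i=1}^{n}(y_i - \bar{y})^2 - \hat{\beta}_1^2 S_{xx}, \\
E\Bigl[\sum_{i=1}^{n}(y_i - \bar{y})^2\Bigr] &= (n-1)\sigma^2 + \beta_1^2 S_{xx}, \qquad
E[\hat{\beta}_1^2] = \operatorname{Var}(\hat{\beta}_1) + \beta_1^2 = \frac{\sigma^2}{S_{xx}} + \beta_1^2, \\
\text{so } E\Bigl[\sum_{i=1}^{n} e_i^2\Bigr] &= (n-1)\sigma^2 + \beta_1^2 S_{xx} - S_{xx}\Bigl(\frac{\sigma^2}{S_{xx}} + \beta_1^2\Bigr) = (n-2)\sigma^2,
\end{align*}

which gives $E[s^2] = \sigma^2$.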

2. Suppose the regression model

y_i = \beta_1 x_i + \epsilon_i, \quad i = 1, 2, \ldots, n.

Find the least squares estimator of $\beta_1$.

Assume that the error terms $\epsilon_i$ are independent $N(0, \sigma^2)$ and that $\sigma^2$ is
known. State the likelihood function for the n sample observations on Y
and obtain the maximum likelihood estimator of $\beta_1$. Is it the same as the
least squares estimator?
Show that the maximum likelihood estimator of $\beta_1$ is unbiased.
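
(A sketch of the standard calculation under the stated assumptions; $Q(\beta_1)$ and $L(\beta_1)$ are notation introduced only here.)

\begin{align*}
Q(\beta_1) &= \sum_{i=1}^{n}(y_i - \beta_1 x_i)^2, \qquad
Q'(\beta_1) = -2\sum_{i=1}^{n} x_i (y_i - \beta_1 x_i) = 0
\;\Longrightarrow\;
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}, \\
L(\beta_1) &= \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\Bigl(-\frac{(y_i - \beta_1 x_i)^2}{2\sigma^2}\Bigr),
\end{align*}

so maximizing $L$ is equivalent to minimizing $\sum_i (y_i - \beta_1 x_i)^2$, and the MLE coincides with the least squares estimator. Unbiasedness follows from
$E[\hat{\beta}_1] = \sum_i x_i E[y_i] / \sum_i x_i^2 = \beta_1 \sum_i x_i^2 / \sum_i x_i^2 = \beta_1$.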

3. In a study of the relationship between a predictor variable X and a response


variable Y, eight independent pairs of observations are obtained as follows.

x_i   1      2      3      4      5      6      7      8
y_i   1.98   0.75   2.67   3.65   3.34   4.82   6.64   5.97

Consider using a simple linear regression model, $y = \beta_0 + \beta_1 x + \epsilon$, $\epsilon \sim N(0, \sigma^2)$,
to study the relationship.

(i) Estimate the regression coefficients, $\beta_0$ and $\beta_1$, with the least squares method.
(ii) Find the standard errors for the estimated regression coefficients in part (i).
(iii) Construct a t-test for the hypothesis that $\beta_1 = 0$. Use a 5% level of significance.
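
For a numerical check of parts (i)-(iii), a minimal sketch in Python is given below; it assumes numpy and scipy are available, and the variable names (Sxx, beta1_hat, etc.) are ours rather than the tutorial's notation. Hand computation with the standard formulas $\hat{\beta}_1 = S_{xy}/S_{xx}$, $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$, $\mathrm{se}(\hat{\beta}_1) = \sqrt{s^2/S_{xx}}$, $\mathrm{se}(\hat{\beta}_0) = \sqrt{s^2(1/n + \bar{x}^2/S_{xx})}$ and $t = \hat{\beta}_1/\mathrm{se}(\hat{\beta}_1)$ with $n-2$ degrees of freedom should give the same values.

# Minimal numerical check for parts (i)-(iii); assumes numpy and scipy.
import numpy as np
from scipy import stats

x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([1.98, 0.75, 2.67, 3.65, 3.34, 4.82, 6.64, 5.97])
n = len(x)

# (i) Least squares estimates: beta1_hat = Sxy / Sxx, beta0_hat = ybar - beta1_hat * xbar.
xbar, ybar = x.mean(), y.mean()
Sxx = np.sum((x - xbar) ** 2)
Sxy = np.sum((x - xbar) * (y - ybar))
beta1_hat = Sxy / Sxx
beta0_hat = ybar - beta1_hat * xbar

# (ii) Standard errors, using s^2 = SSE / (n - 2) from Question 1.
resid = y - (beta0_hat + beta1_hat * x)
s2 = np.sum(resid ** 2) / (n - 2)
se_beta1 = np.sqrt(s2 / Sxx)
se_beta0 = np.sqrt(s2 * (1.0 / n + xbar ** 2 / Sxx))

# (iii) Two-sided t-test of H0: beta1 = 0 at the 5% level, with n - 2 degrees of freedom.
t_stat = beta1_hat / se_beta1
t_crit = stats.t.ppf(0.975, df=n - 2)
p_value = 2 * (1 - stats.t.cdf(abs(t_stat), df=n - 2))

print(f"beta0_hat = {beta0_hat:.4f}, beta1_hat = {beta1_hat:.4f}")
print(f"se(beta0_hat) = {se_beta0:.4f}, se(beta1_hat) = {se_beta1:.4f}")
print(f"t = {t_stat:.3f}, critical value = {t_crit:.3f}, p-value = {p_value:.4f}")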
