
Use of dummy variables in econometrics

Dummy variables

Definition:

"In regression analysis it frequently happens that the dependent variable is influenced not only by variables that can be quantified (i.e., measured in numbers) on some well-known scale, like income, output, price and height, but also by variables that are qualitative in nature, like gender, race, colour, religion, and war and peace."

Explanation:

Such qualitative variables usually indicate the presence or absence of a quality or attribute, such as

Male or Female
Black or White

One method of quantifying such attributes is to construct artificial variables that take the value

0, indicating the absence of the attribute, or

1, indicating the presence of the attribute.

Example:
0 may indicate the person is female
1 may indicate the person is male
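
A minimal sketch of how such a 0/1 dummy enters a regression. The data, variable names and coefficient values below are made up purely for illustration; they are not from the notes.

# Sketch (hypothetical data): a 0/1 dummy variable alongside a quantitative regressor in OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
income = rng.normal(50, 10, n)          # quantitative regressor
male = rng.integers(0, 2, n)            # dummy: 1 = male, 0 = female
# assumed "true" model: the dummy shifts the intercept by 5 units
y = 10 + 0.3 * income + 5 * male + rng.normal(0, 2, n)

X = sm.add_constant(np.column_stack([income, male]))
result = sm.OLS(y, X).fit()
print(result.params)   # estimated intercept, income effect, and dummy (male) effect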

Multicollinearity

Questions

1. What do you understand by multicollinearity? Or: explain the nature of multicollinearity.
2. Discuss the consequences of multicollinearity.
3. How can it be detected?
4. What are the remedial measures?

Answers

1. What do you understand by multicollinearity? Or: explain the nature of multicollinearity.

Ans:
One of the assumptions of the classical multiple regression model is that there exists no exact relationship between the independent variables in the model. If such a relationship exists, we say the independent variables are perfectly collinear, or that perfect collinearity is said to exist.

Example:

Suppose the high-school grade of a student depends upon the following variables:

Y = the grade of the student
X2 = family income
X3 = number of hours of study per day
X4 = number of hours of study per week

So the model to be estimated is

Y = β1 + β2X2 + β3X3 + β4X4 + ε

In this case the variables X3 and X4 are perfectly collinear because

7X3 = X4

This gives rise to perfect collinearity in the model; therefore the OLS estimates (β1, β2, β3, β4) of this model cannot be obtained, and the separate interpretation of β3 and β4 is not possible.
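
A small sketch of why the estimates cannot be obtained under perfect collinearity: with X4 = 7·X3 the design matrix loses rank, so X'X cannot be inverted. The data below are simulated for illustration only.

# Sketch: perfect collinearity (X4 = 7 * X3) makes the design matrix rank-deficient,
# so unique OLS estimates do not exist.
import numpy as np

rng = np.random.default_rng(1)
n = 30
x2 = rng.normal(60, 15, n)     # family income
x3 = rng.uniform(1, 5, n)      # hours of study per day
x4 = 7 * x3                    # hours of study per week: an exact linear function of x3
X = np.column_stack([np.ones(n), x2, x3, x4])

print(np.linalg.matrix_rank(X))   # 3, not 4: one column is redundant
print(np.linalg.det(X.T @ X))     # numerically close to zero: (X'X)^(-1) does not exist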

2. Discuss the consequences of multicollinearity.

Consequences of multicollinearity

• Whenever there is a problem of multicollinearity, the variances of the estimated coefficients will be very high.
  Example: the variances of β2 and β3 will be very large.

• Interpretation of the individual coefficients is of doubtful value.

• The t values of the individual coefficients are very small, but the value of R2 is very high (illustrated in the sketch below).

Note:
For significance of individual coefficients, check the t-test.
For importance of the regressors, check R2.
For the overall goodness of fit of the model, check the F-test.
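
A sketch of the classic symptom described above, using simulated data (all numbers are made up): two nearly identical regressors produce a high R2 together with inflated standard errors and small individual t values.

# Sketch: near-perfect collinearity -> high R^2 but small individual t statistics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 50
x2 = rng.normal(0, 1, n)
x3 = x2 + rng.normal(0, 0.01, n)       # almost identical to x2 (near-perfect collinearity)
y = 1 + 2 * x2 + 3 * x3 + rng.normal(0, 1, n)

res = sm.OLS(y, sm.add_constant(np.column_stack([x2, x3]))).fit()
print(res.rsquared)      # very high: the regressors jointly explain y well
print(res.tvalues)       # individual t values are small / unstable
print(res.bse)           # standard errors of the slope estimates are inflated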

3. How can it be detected?

Detection of multicollinearity

If the simple correlation coefficient between two independent variables is higher than the partial or multiple correlation of the regression, then there is a problem of multicollinearity:

r > R

r = simple correlation coefficient
R = partial (or multiple) correlation coefficient
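
A sketch of this comparison (sometimes called Klein's rule of thumb), again on simulated data: compute the simple correlation r between two regressors and compare it with the multiple correlation R (the square root of the regression's R2).

# Sketch of the detection rule above: compare r (between regressors) with R of the model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 80
x2 = rng.normal(0, 1, n)
x3 = 0.95 * x2 + rng.normal(0, 0.1, n)        # strongly related regressors
y = 1 + 0.5 * x2 + 0.5 * x3 + rng.normal(0, 1, n)

r = np.corrcoef(x2, x3)[0, 1]                 # simple correlation between regressors
R = np.sqrt(sm.OLS(y, sm.add_constant(np.column_stack([x2, x3]))).fit().rsquared)
print(r, R)
if r > R:
    print("simple correlation exceeds R: multicollinearity is likely a problem")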
4. What are the remedial measures?

The remedy depends upon the objective of the model.

There are two main objectives:

• Forecasting
• Structural analysis

If the objective is structural analysis, then one remedy is to increase the sample size (see the sketch below).
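
A brief sketch, on simulated data, of why a larger sample helps: even when the regressors stay highly correlated, the standard errors of the estimated slopes shrink as n grows.

# Sketch: increasing the sample size reduces the variance of the estimates
# even though the regressors remain highly correlated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
for n in (30, 300, 3000):
    x2 = rng.normal(0, 1, n)
    x3 = 0.95 * x2 + rng.normal(0, 0.1, n)
    y = 1 + 0.5 * x2 + 0.5 * x3 + rng.normal(0, 1, n)
    res = sm.OLS(y, sm.add_constant(np.column_stack([x2, x3]))).fit()
    print(n, res.bse[1:])    # standard errors of the two slopes shrink as n grows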

Heteroskedasticity

Questions

1. What is the nature of heteroskedasticity, and what do you understand by heteroskedasticity?
2. What are the major causes of heteroskedasticity?
3. How can it be detected?
4. What are the remedial measures for heteroskedasticity?
Introduction

If all the assumptions of OLS are satisfied, the OLS estimates are BLUE (best linear unbiased estimators). However, if any one of the assumptions is violated, the estimates are no longer BLUE.
Violation of the assumption regarding the variance of the error term causes very serious problems.

Regarding heteroskedasticity, the above questions are answered in turn.

1. What is the nature of heteroskedasticity, and what do you understand by heteroskedasticity?

One of the important assumptions of OLS is that the variance of each error term ε is constant:

Var(εi) = σ²   (homoskedasticity)

But if the variance is not constant, then the problem of heteroskedasticity is said to be present:

Var(εi) = σi²   (heteroskedasticity)
2. What are the major causes of heteroskedasticity?

• It is expected in cross-sectional data.

• As the data collection technique changes, the error variance also changes, which causes heteroskedasticity.
Major consequences of heteroskedasticity (very important from the exam paper point of view)

1. In the presence of heteroskedasticity the OLS estimates are no longer best.
   Or: the estimates are still unbiased, but they are no longer best (efficient).
2. The variances of the estimators are bigger than in the case where there is no heteroskedasticity.
   Or: the variance under heteroskedasticity will be bigger than the variance when there is no heteroskedasticity.
3. In the case of heteroskedasticity, the usual tests such as the t-test and F-test will give misleading results.
   For example:

   t = β̂2 / S.E.(β̂2)

   If the variance of β̂2 is bigger because of heteroskedasticity, then the S.E. of β̂2 will also be bigger; when the S.E. is bigger, the value of t will be smaller, and in that situation you fail to reject H0 (see the sketch below).
Methods of detection of heteroskedasticity

• Graphical
• Mathematical

Graphical
In this method we plot the squared residuals against time (or against an explanatory variable), and by inspection we can determine whether heteroskedasticity is present or not (see the sketch below).
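
A minimal sketch of the graphical check, using simulated data: fit the regression, then scatter the squared residuals against X and look for a fanning or trending pattern.

# Sketch of the graphical method: plot squared residuals and inspect for a pattern.
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

rng = np.random.default_rng(6)
x = rng.uniform(1, 10, 200)
y = 2 + 0.5 * x + rng.normal(0, 1, 200) * x      # error variance rises with x

res = sm.OLS(y, sm.add_constant(x)).fit()
plt.scatter(x, res.resid ** 2)                   # a fan / trend suggests heteroskedasticity
plt.xlabel("X")
plt.ylabel("squared residual")
plt.show()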

Mathematical methods

1. Goldfeld-Quandt test
2. Spearman rank correlation test
3. Park-Glejser test for heteroskedasticity

Spearman rank correlation test for heteroskedasticity

This test is based upon the following steps:

1. Fit the regression of Y on X and obtain the residuals.

2. Ignore the signs of the residuals, then rank both the independent variable X and the absolute residuals in ascending (or descending) order, and compute the Spearman rank correlation coefficient, denoted by rs:

rs = 1 - 6 [ Σ di² / ( n (n² - 1) ) ]

where di is the difference between the two ranks and n is the sample size.

3. Test H0 by a t-test:

t = rs √(n - 2) / √(1 - rs²)

with n - 2 degrees of freedom.

4. If t is significant, i.e. tcal > ttab, reject H0; otherwise fail to reject H0.

Example

Y      X      rank of |ε|   rank of X   d = rank|ε| - rank X   d²
12.4   12.1        2            5              -3               9
14.4   21.4        1            1               0               0
14.6   18.7        7            3               4              16
16     15.4        6            4               2               4
11.3   12.4        5            6              -1               1
10     10.4        4            7              -3               9
16.2   20.6        3            2               1               1
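
A short sketch computing rs and the t statistic from the d² column of the table above (n = 7, so n - 2 = 5 degrees of freedom), following steps 2-4.

# Sketch: Spearman rank correlation test using the d^2 values from the example table.
import numpy as np
from scipy import stats

d2 = np.array([9, 0, 16, 4, 1, 9, 1])         # d^2 column from the table
n = len(d2)                                   # n = 7

rs = 1 - 6 * d2.sum() / (n * (n**2 - 1))      # rs = 1 - 6*sum(d^2)/(n(n^2-1)) ≈ 0.286
t = rs * np.sqrt(n - 2) / np.sqrt(1 - rs**2)  # t statistic with n-2 degrees of freedom
t_tab = stats.t.ppf(0.975, df=n - 2)          # 5% two-sided critical value ≈ 2.571

print(rs, t, t_tab)
# Here t < t_tab, so we fail to reject H0: no significant heteroskedasticity.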

Park-Glejser test for heteroskedasticity

This test also tells us how to correct heteroskedasticity.

There are two main objectives:

• It tells whether there is heteroskedasticity or not.
• It makes a correction for heteroskedasticity.

Fit the auxiliary regression of the absolute residuals on X:

|ε̂| = α + βX + v

If α and β are insignificant, stop and conclude that heteroskedasticity is absent.

But if β is significant (tcal > ttab), then we take the following steps:

1. Transform the original variables into new ones:

Y* = Y / X
X* = 1 / X
ε* = ε / X

2. Estimate the transformed equation

Y* = β + α X* + ε*

(dividing the original model through by X puts the original slope β in the intercept position and the original intercept α on X*). Its OLS estimates, in deviation form, are

α̂ = Σ x* y* / Σ x*²

β̂ = Ȳ* - α̂ X̄*
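
A sketch of the whole procedure on simulated data (all numbers invented for illustration): regress the absolute residuals on X, and if the slope is significant, re-estimate on the deflated variables Y* = Y/X and X* = 1/X.

# Sketch: Glejser auxiliary regression, then the deflation correction described above.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 200
x = rng.uniform(1, 10, n)
y = 2 + 0.5 * x + rng.normal(0, 1, n) * x     # error spread proportional to x

step1 = sm.OLS(y, sm.add_constant(x)).fit()
glejser = sm.OLS(np.abs(step1.resid), sm.add_constant(x)).fit()
print(glejser.tvalues[1])                     # significant slope -> heteroskedasticity

# Transformed (deflated) regression: Y/X = beta + alpha*(1/X) + e/X
y_star = y / x
x_star = 1 / x
step2 = sm.OLS(y_star, sm.add_constant(x_star)).fit()
print(step2.params)   # intercept estimates the original slope, the X* coefficient the original intercept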

Potrebbero piacerti anche