Andrea Beccarini
Center for Quantitative Economics
Winter 2013/2014
Introduction
Objectives
Introduction
Prerequisites
Descriptive Statistics
Probability Theory
Statistical Inference
Introduction
Class and material
Class
Class teacher: Sarah Meyer
Time: Tu., 12:00-14:00
Location: CAWM 3
Start: 22 October 2013
Material
Course page on Blackboard
Slides and class material are (or will be) downloadable
Introduction
Literature
Basics
Definition
Basics
Definition
Typical notations:
x_1, x_2, ..., x_T
or x(1), x(2), ..., x(T)
or x_t, t = 1, ..., T
or (x_t)_{t ≥ 0}

This course is about ...
univariate time series
in discrete time
with continuous states
Basics
Examples
[Figure: example time series, 1995-2010, values roughly between 350 and 650]
Basics
Examples
[Figure: DAX index, 1970-2010, and logarithm of the DAX over the same period]
Basics
Definition
Basics
Definition
Distinguish four cases: both time t and chance ω can be fixed or variable

t fixed, ω fixed: X_t(ω) is a real number
t fixed, ω variable: X_t(ω) is a random variable
t variable, ω fixed: X_t(ω) is a sequence of real numbers (path, realization, trajectory)
t variable, ω variable: X_t(ω) is a stochastic process

(process.R)
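The four cases can be made concrete by simulation. The course's accompanying script is process.R; the following Python/numpy sketch is an assumed stand-in: each row of the array is one path (ω fixed), each column collects draws of one random variable (t fixed).

```python
import numpy as np

# simulate n paths of a random walk X_t = X_{t-1} + eps_t, X_0 = 0
rng = np.random.default_rng(42)
T, n = 100, 5
eps = rng.normal(0.0, 1.0, size=(n, T))
paths = np.cumsum(eps, axis=1)      # one row per realization omega

trajectory = paths[0]               # t variable, omega fixed: one path
x_at_t10 = paths[:, 9]              # t fixed, omega variable: samples of X_10
```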
Basics
Examples
Random walk: X_t = X_{t−1} + ε_t with X_0 = 0 and ε_t ~ NID(0, σ²)
Constant process: X_t = Z for all t, with Z ~ N(0, σ²)
Basics
Moment functions
Expectation function: μ(t) = E(X_t)
Autocovariance function: γ(s, t) = Cov(X_s, X_t) = E[(X_s − μ(s))(X_t − μ(t))]
Autocorrelation function: ρ(s, t) = γ(s, t) / √(γ(s, s) γ(t, t))
Basics
Estimation of moment functions
Suppose n independent realizations of the process are observed:

X_1^(1)  X_1^(2)  ...  X_1^(n)
X_2^(1)  X_2^(2)  ...  X_2^(n)
  ...      ...    ...    ...
X_T^(1)  X_T^(2)  ...  X_T^(n)
Basics
Estimation of moment functions
Given n independent realizations X_t^(i), the expectation function is estimated by the ensemble average

μ̂(t) = (1/n) Σ_{i=1}^{n} X_t^(i)
Basics
Estimation of moment functions
With only a single realization (one path), the expectation is instead estimated by the time average

μ̂ = (1/T) Σ_{t=1}^{T} X_t
Basics
Estimation of moment functions
From n independent realizations, the autocovariance function is estimated by

γ̂(t, t+h) = (1/n) Σ_{i=1}^{n} (X_t^(i) − μ̂(t)) (X_{t+h}^(i) − μ̂(t+h))
Basics
Estimation of moment functions
From a single realization, the autocovariance function is estimated by

γ̂(h) = (1/T) Σ_{t=1}^{T−h} (X_t − μ̂) (X_{t+h} − μ̂)
Basics
Definition
Basics
Restriction of time heterogeneity: Stationarity
Basics
Restriction of time heterogeneity: Stationarity
A stochastic process (X_t)_{t∈T} is (weakly) stationary if
the expectation exists and is constant: E(X_t) = μ for all t ∈ T,
the variance exists and is constant: Var(X_t) = σ² < ∞ for all t ∈ T,
and the autocovariance depends only on the lag: Cov(X_t, X_{t+h}) = γ(h) for all t and h.
Basics
Restriction of time heterogeneity: Stationarity
stationary.R
Examples
Basics
Restriction of memory: Ergodicity
A stationary process is called mean-ergodic if the time average

(1/T) Σ_{t=1}^{T} X_t

converges to μ (in mean square) as T → ∞.
Basics
Restriction of memory: Ergodicity
Similarly, the process is covariance-ergodic if

(1/T) Σ_{t=1}^{T−h} (X_t − μ)(X_{t+h} − μ)

converges to γ(h) as T → ∞.
Basics
Restriction of memory: Ergodicity
A sufficient condition is absolute summability of the autocovariances:

Σ_{h=−∞}^{∞} |γ(h)| < ∞
Basics
Restriction of memory: Ergodicity
Basics
Estimation of moment functions
Summary of estimators
electricity.R
μ̂ = X̄_T = (1/T) Σ_{t=1}^{T} X_t

γ̂(h) = (1/T) Σ_{t=1}^{T−h} (X_t − μ̂)(X_{t+h} − μ̂)

ρ̂(h) = γ̂(h) / γ̂(0)
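These estimators take only a few lines of code. The course's script electricity.R is in R; the following Python/numpy sketch is an assumed analogue (function names are illustrative).

```python
import numpy as np

def acov(x, h):
    # gamma_hat(h) = (1/T) * sum_{t=1}^{T-h} (x_t - mean)(x_{t+h} - mean)
    x = np.asarray(x, dtype=float)
    T = x.size
    mu = x.mean()
    return np.sum((x[:T - h] - mu) * (x[h:] - mu)) / T

def acorr(x, h):
    # rho_hat(h) = gamma_hat(h) / gamma_hat(0)
    return acov(x, h) / acov(x, 0)

rng = np.random.default_rng(0)
x = rng.normal(size=500)            # white noise: rho(h) near 0 for h > 0
```

Note that the divisor is T (not T − h), matching the formulas above.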
Basics
Estimation of moment functions
The estimator μ̂ is unbiased, i.e. E(μ̂) = μ.

The variance of μ̂ is

Var(μ̂) = γ(0)/T + (2/T) Σ_{h=1}^{T−1} (1 − h/T) γ(h)

and, as T → ∞,

T · Var(μ̂) → Σ_{h=−∞}^{∞} γ(h)
Basics
Estimation of moment functions
In finite samples, μ̂ is approximately N(μ, Var(μ̂)) distributed, and asymptotically

√T (μ̂ − μ) → Z ~ N(0, γ(0) + 2 Σ_{h=1}^{∞} γ(h))
Basics
Estimation of moment functions
The vector √T(γ̂(0) − γ(0)), ..., √T(γ̂(K) − γ(K))
is asymptotically multivariate normal with expectation vector (0, ..., 0)′ and

T · Cov(γ̂(h₁), γ̂(h₂)) → Σ_{r=−∞}^{∞} (γ(r) γ(r − h₁ + h₂) + γ(r − h₂) γ(r + h₁))
Basics
Estimation of moment functions
The vector √T(γ̂(0) − γ(0)), ..., √T(γ̂(K) − γ(K))
is asymptotically multivariate normal with expectation vector (0, ..., 0)′ and a
complicated covariance matrix.
Be careful: For small to medium sample sizes the autocovariance and
autocorrelation estimators are biased!
autocorr.R
Basics
Estimation of moment functions
For white noise, the limiting expression simplifies to
T · Cov(ρ̂(h₁), ρ̂(h₂)) → 1 for h₁ = h₂ and 0 else.
i² = −1,
where i is the number which, when squared, equals −1.
The number i is called imaginary (i.e. not real).
√(−16) = √16 · √(−1) = 4i
√(−5) = √5 · √(−1) = √5 i
Further, it is possible to define numbers that contain
both a real part and an imaginary part, e.g. 5 − 8i or, in general, a + bi.
Such numbers are called complex and the set of complex numbers
is denoted as C.
The pair a + bi and a − bi is called conjugate complex.
[Figure: geometric interpretation in the complex plane: the point a + bi has real part a on the real axis, imaginary part b on the imaginary axis, and absolute value r]
Polar coordinates:

z = a + bi = r (cos φ + i sin φ) = r e^{iφ}

with
a = r cos φ, b = r sin φ
r = √(a² + b²), φ = arctan(b/a)
Rules of calculus:
Addition:
(a + bi) + (c + di) = (a + c) + (b + d)i
Multiplication (cartesian coordinates):
(a + bi)(c + di) = (ac − bd) + (ad + bc)i
Multiplication (polar coordinates):
r₁ e^{iφ₁} · r₂ e^{iφ₂} = r₁ r₂ e^{i(φ₁+φ₂)}
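These rules can be checked numerically: Python's built-in complex type and the cmath module follow exactly the cartesian and polar conventions above (a sketch, not course material).

```python
import cmath

z1 = 3 + 4j
z2 = 1 - 2j

# addition: real and imaginary parts add
s = z1 + z2                                  # (3+1) + (4-2)i

# multiplication in cartesian coordinates: (ac - bd) + (ad + bc)i
p = z1 * z2

# multiplication in polar coordinates: moduli multiply, angles add
r1, phi1 = abs(z1), cmath.phase(z1)
r2, phi2 = abs(z2), cmath.phase(z2)
p_polar = r1 * r2 * cmath.exp(1j * (phi1 + phi2))
```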
[Figure: addition in the complex plane: the vectors a + bi and c + di add to (a + c) + (b + d)i]
[Figure: multiplication in the complex plane: the absolute values multiply, r = r₁ r₂, and the angles add, φ = φ₁ + φ₂]
The solutions of the quadratic equation x² + px + q = 0 are

x = −p/2 ± √(p²/4 − q)
Example: x² − 2x + 5 = 0 has the solutions

x = 1 + √((−2)²/4 − 5) = 1 + 2i
and
x = 1 − √((−2)²/4 − 5) = 1 − 2i
Insert z_t = A z^{−t} into the homogeneous difference equation:
A z^{−t} = φ₁ A z^{−(t−1)} + ... + φ_p A z^{−(t−p)}
Dividing by A z^{−t} gives
1 = φ₁ z + ... + φ_p z^p
and thus
1 − φ₁ z − ... − φ_p z^p = 0
Characteristic polynomial, characteristic equation
ARMA models
Definition
An ARMA(p, q) process combines autoregressive and moving-average terms:
X_t = φ₁ X_{t−1} + ... + φ_p X_{t−p} + ε_t + θ₁ ε_{t−1} + ... + θ_q ε_{t−q}
or, in lag-operator notation, φ(L) X_t = θ(L) ε_t.
ARMA models
Lag operator and lag polynomial
L X_t = X_{t−1}

Rules:
L² X_t = L(L X_t) = X_{t−2}
Lⁿ X_t = X_{t−n}
L⁻¹ X_t = X_{t+1}
L⁰ X_t = X_t
ARMA models
Lag operator and lag polynomial
Lag polynomial:
A(L) = a₀ + a₁L + a₂L² + ... + a_p L^p

Example: Let A(L) = 1 − 0.5L and B(L) = 1 + 4L², then
C(L) = A(L)B(L) = (1 − 0.5L)(1 + 4L²) = 1 − 0.5L + 4L² − 2L³
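Multiplying lag polynomials is just convolving their coefficient vectors, which gives a quick numerical check of the example (a Python/numpy sketch, coefficients in ascending powers of L):

```python
import numpy as np

a = np.array([1.0, -0.5])        # A(L) = 1 - 0.5 L
b = np.array([1.0, 0.0, 4.0])    # B(L) = 1 + 4 L^2
c = np.convolve(a, b)            # coefficient convolution = polynomial product
# c corresponds to C(L) = 1 - 0.5 L + 4 L^2 - 2 L^3
```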
ARMA models
Lag operator and lag polynomial
MA(q) process: X_t = ε_t + θ₁ ε_{t−1} + ... + θ_q ε_{t−q}
AR(1) process: X_t = φ₁ X_{t−1} + ε_t
AR(p) process: X_t = φ₁ X_{t−1} + ... + φ_p X_{t−p} + ε_t
ARMA models
MA(q) process
X_t = θ(L) ε_t = ε_t + θ₁ ε_{t−1} + ... + θ_q ε_{t−q}
with ε_t ~ NID(0, σ²)

Expectation function:
E(X_t) = E(ε_t + θ₁ ε_{t−1} + ... + θ_q ε_{t−q})
 = E(ε_t) + θ₁ E(ε_{t−1}) + ... + θ_q E(ε_{t−q})
 = 0
ARMA models
MA(q) process
Autocovariance function
γ(s, t)
 = E[(ε_s + θ₁ ε_{s−1} + ... + θ_q ε_{s−q})(ε_t + θ₁ ε_{t−1} + ... + θ_q ε_{t−q})]
 = E[ε_s ε_t + θ₁ ε_s ε_{t−1} + θ₂ ε_s ε_{t−2} + ... + θ_q ε_s ε_{t−q}
  + θ₁ ε_{s−1} ε_t + θ₁² ε_{s−1} ε_{t−1} + θ₁θ₂ ε_{s−1} ε_{t−2} + ... + θ₁θ_q ε_{s−1} ε_{t−q}
  + ...
  + θ_q ε_{s−q} ε_t + θ₁θ_q ε_{s−q} ε_{t−1} + θ₂θ_q ε_{s−q} ε_{t−2} + ... + θ_q² ε_{s−q} ε_{t−q}]

Only products with equal time indices have non-zero expectation, e.g. for s = t.
ARMA models
MA(q) process
Define θ₀ = 1, then

γ(t, t) = σ² Σ_{i=0}^{q} θ_i²
γ(t−1, t) = σ² Σ_{i=0}^{q−1} θ_i θ_{i+1}
γ(t−2, t) = σ² Σ_{i=0}^{q−2} θ_i θ_{i+2}
...
γ(t−q, t) = σ² θ₀ θ_q = σ² θ_q
γ(s, t) = 0 for s < t − q
Hence, MA(q) processes are always stationary
Simulation of MA(q) processes (maqsim.R)
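A sketch of an MA(q) simulator in the spirit of maqsim.R (which is in R; this Python/numpy version is an assumed analogue). By the formulas above, the sample variance should approach σ² Σ θ_i² = 1.45 for θ = (0.6, 0.3) and σ² = 1.

```python
import numpy as np

def ma_sim(theta, T, sigma=1.0, seed=0):
    # X_t = eps_t + theta_1 eps_{t-1} + ... + theta_q eps_{t-q}
    rng = np.random.default_rng(seed)
    q = len(theta)
    eps = rng.normal(0.0, sigma, T + q)      # q presample innovations
    psi = np.concatenate(([1.0], theta))     # theta_0 = 1
    # convolution of innovations with (1, theta_1, ..., theta_q)
    return np.convolve(eps, psi)[q:T + q]

x = ma_sim([0.6, 0.3], T=10000, seed=1)
```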
ARMA models
AR(1) process
(1 − φ₁L) X_t = ε_t
X_t = φ₁ X_{t−1} + ε_t
with ε_t ~ NID(0, σ²)

Expectation and variance function
ARMA models
AR(1) process
If the process starts with
E(X₀) = 0
Var(X₀) = σ² / (1 − φ₁²)
then, for all t,
E(X_t) = φ₁ E(X_{t−1}) = 0
Var(X_t) = φ₁² Var(X_{t−1}) + σ² = σ² / (1 − φ₁²)
ARMA models
AR(p) process
φ(L) X_t = ε_t
X_t = φ₁ X_{t−1} + ... + φ_p X_{t−p} + ε_t
with ε_t ~ NID(0, σ²)

Assumption: ε_t is independent of X_{t−1}, X_{t−2}, ... (innovations)

Expectation function: in the stable case, E(X_t) = 0 for all t.
ARMA models
Invertibility
X_t = φ₁ X_{t−1} + ε_t
 = φ₁ (φ₁ X_{t−2} + ε_{t−1}) + ε_t
 = φ₁² X_{t−2} + φ₁ ε_{t−1} + ε_t
 ...
 = φ₁ⁿ X_{t−n} + φ₁ⁿ⁻¹ ε_{t−(n−1)} + ... + φ₁² ε_{t−2} + φ₁ ε_{t−1} + ε_t
ARMA models
Invertibility
Since |φ₁| < 1,

X_t = Σ_{i=0}^{∞} φ₁ⁱ ε_{t−i}
 = ε_t + ψ₁ ε_{t−1} + ψ₂ ε_{t−2} + ...
with ψ_i = φ₁ⁱ
A stable AR(1) process can be written as an MA() process
(the same is true for stable AR(p) processes)
ARMA models
Invertibility
In operator notation:
(1 − φ₁L) X_t = ε_t
X_t = (1 − φ₁L)⁻¹ ε_t = Σ_{i=0}^{∞} (φ₁L)ⁱ ε_t

In general, for a stable AR(p) process:
φ(L) X_t = ε_t
X_t = (φ(L))⁻¹ ε_t = ψ(L) ε_t
ARMA models
Invertibility
For an MA(1) process X_t = ε_t + θ₁ ε_{t−1}, note that
θ₁ X_{t−1} = θ₁ ε_{t−1} + θ₁² ε_{t−2},
so we find X_t = θ₁ X_{t−1} + ε_t − θ₁² ε_{t−2}.
Repeated substitution of the ε_{t−i} terms yields
X_t = Σ_{i=1}^{∞} π_i X_{t−i} + ε_t
with π_i = (−1)^{i+1} θ₁ⁱ
ARMA models
Invertibility
Summary
ARMA(p, q) processes are stable if all roots of
φ(z) = 0
are larger than 1 in absolute value.
ARMA(p, q) processes are invertible if all roots of
θ(z) = 0
are larger than 1 in absolute value.
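The summary condition amounts to a polynomial root check, which is easy to automate (a Python/numpy sketch; the helper name is illustrative):

```python
import numpy as np

def roots_outside_unit_circle(coefs):
    # coefs = [c_1, ..., c_k] of the polynomial 1 - c_1 z - ... - c_k z^k;
    # the process is stable/invertible if all roots satisfy |z| > 1
    poly = np.concatenate(([1.0], -np.asarray(coefs, dtype=float)))
    roots = np.roots(poly[::-1])         # np.roots expects descending powers
    return np.all(np.abs(roots) > 1.0)

stable_ar1 = roots_outside_unit_circle([0.5])   # 1 - 0.5 z: root z = 2
unit_root = roots_outside_unit_circle([1.0])    # 1 - z: root z = 1
```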
ARMA models
Invertibility
For a stable and invertible ARMA process φ(L) X_t = θ(L) ε_t:
X_t = (φ(L))⁻¹ θ(L) ε_t (MA(∞) representation)
(θ(L))⁻¹ φ(L) X_t = ε_t (AR(∞) representation)
ARMA models
Deterministic components
Special case D_t = μ:
ARMA(p, q) process with constant (non-zero) expectation μ:
X_t − μ = φ₁ (X_{t−1} − μ) + ... + φ_p (X_{t−p} − μ)
 + ε_t + θ₁ ε_{t−1} + ... + θ_q ε_{t−q}
The process can also be written as
X_t = c + φ₁ X_{t−1} + ... + φ_p X_{t−p} + ε_t + θ₁ ε_{t−1} + ... + θ_q ε_{t−q}
where c = μ (1 − φ₁ − ... − φ_p)
ARMA models
Deterministic components
X_t = Σ_{h=0}^{∞} ψ_h ε_{t−h} + D_t
with ψ₀ = 1 and Σ_{h=0}^{∞} ψ_h² < ∞
ARMA models
Linear processes and filter
A linear process is defined as
X_t = Σ_{h=−∞}^{∞} ψ_h ε_{t−h} = ψ(L) ε_t
where the coefficients are absolutely summable, i.e. Σ_{h=−∞}^{∞} |ψ_h| < ∞.
ARMA models
Linear processes and filter
The trend component (G_t) is chosen to minimize

Σ_{t=1}^{T} (X_t − G_t)² + λ Σ_{t=2}^{T−1} ((G_{t+1} − G_t) − (G_t − G_{t−1}))²
ARMA models
Linear processes and filter
The solution is

(G₁, ..., G_T)′ = A (X₁, ..., X_T)′

where A = (I + λ K′K)⁻¹ and K is the (T−2) × T second-difference matrix

K =
( 1 −2  1  0 ... 0  0  0 )
( 0  1 −2  1 ... 0  0  0 )
( ...            ...     )
( 0  0  0  0 ... 1 −2  1 )
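The closed-form solution can be implemented directly (a Python/numpy sketch, assuming the λ shown above multiplies K′K). As a sanity check: a linear series has zero second differences, so the filter reproduces it exactly.

```python
import numpy as np

def hp_filter(x, lam):
    # trend G = (I + lam * K'K)^{-1} x, K the (T-2) x T second-difference matrix
    x = np.asarray(x, dtype=float)
    T = x.size
    K = np.zeros((T - 2, T))
    for i in range(T - 2):
        K[i, i:i + 3] = [1.0, -2.0, 1.0]
    A = np.linalg.inv(np.eye(T) + lam * (K.T @ K))
    return A @ x

t = np.arange(50, dtype=float)
line = 2.0 + 0.5 * t                  # K @ line = 0, so the trend equals line
trend = hp_filter(line, lam=1600.0)
```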
ARMA models
Linear processes and filter
Commonly used values of the smoothing parameter λ:
annual data: λ = 100
quarterly data: λ = 1600
monthly data: λ = 14400
The AR(p) model with constant can be written as a linear regression system:

( X_{p+1} )   ( 1  X_p      X_{p−1}  ...  X_1     ) ( c   )   ( ε_{p+1} )
( X_{p+2} ) = ( 1  X_{p+1}  X_p      ...  X_2     ) ( φ₁  ) + ( ε_{p+2} )
(   ...   )   ( ...                       ...     ) ( ... )   (   ...   )
(  X_T    )   ( 1  X_{T−1}  X_{T−2}  ...  X_{T−p} ) ( φ_p )   (  ε_T    )

Compact notation: y = Xβ + u
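The system can be estimated by ordinary least squares in a few lines (a Python/numpy sketch with an illustrative helper; in the simulation the true values are c = 1 and φ₁ = 0.6):

```python
import numpy as np

def ar_ols(x, p):
    # regress x_t on (1, x_{t-1}, ..., x_{t-p}) for t = p+1, ..., T
    x = np.asarray(x, dtype=float)
    T = x.size
    y = x[p:]
    X = np.column_stack([np.ones(T - p)] +
                        [x[p - i:T - i] for i in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta                          # (c, phi_1, ..., phi_p)

rng = np.random.default_rng(3)
eps = rng.normal(size=5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 1.0 + 0.6 * x[t - 1] + eps[t]
beta = ar_ols(x, p=1)
```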
The ARMA parameters can be estimated by (nonlinear) least squares, minimizing

Σ_{t=1}^{T} (ε̂_t(d, f₁, ..., f_p, g₁, ..., g_q))²

over the parameters d, f₁, ..., f_p, g₁, ..., g_q.
Assume the sample follows a multivariate normal distribution:

X = (X₁, ..., X_T)′ ~ N(μ, Σ)
Expectation vector:

μ = E(X₁, ..., X_T)′ = (c/(1 − φ₁ − ... − φ_p), ..., c/(1 − φ₁ − ... − φ_p))′

Covariance matrix:

Σ = Cov(X₁, ..., X_T)′ =
( γ(0)     γ(1)     ...  γ(T−1) )
( γ(1)     γ(0)     ...  γ(T−2) )
( ...      ...      ...  ...    )
( γ(T−1)   γ(T−2)   ...  γ(0)   )
L(β; X) = (2π)^{−T/2} (det Σ)^{−1/2} exp(−(1/2)(X − μ)′ Σ⁻¹ (X − μ))

ln L(β; X) = −(T/2) ln(2π) − (1/2) ln(det Σ) − (1/2)(X − μ)′ Σ⁻¹ (X − μ)
Properties of the ML estimators:
consistency
asymptotic efficiency
asymptotically jointly normally distributed
the covariance matrix of the estimators can be consistently estimated
The likelihood-ratio statistic LR = 2 (ln L_U − ln L_R) is asymptotically χ²_m distributed under H₀,
and H₀ is rejected at significance level α if LR > χ²_{m;1−α}.
Disadvantage: two models (restricted and unrestricted) must be estimated.
Test example 1:
H₀: θ₁ = 0
H₁: θ₁ ≠ 0
Test example 2:
H₀: φ = 0
H₁: not H₀
Illustration (arma33.R)
AIC = ln σ̂² + 2 (p + q + 1) / T

where ln σ̂² measures goodness-of-fit and 2 (p + q + 1)/T is the penalty term.
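A sketch of order selection by this criterion for pure AR models (q = 0), with illustrative helper names; the data are simulated from an AR(1) with φ = 0.7, so the selected order should be at least 1.

```python
import numpy as np

def aic_ar(x, p):
    # AIC = ln(sigma_hat^2) + 2 * (p + 1) / T   (AR(p) with constant, q = 0)
    x = np.asarray(x, dtype=float)
    T = x.size
    y = x[p:]
    X = np.column_stack([np.ones(T - p)] +
                        [x[p - i:T - i] for i in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return np.log(resid @ resid / y.size) + 2 * (p + 1) / T

rng = np.random.default_rng(7)
eps = rng.normal(size=2000)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.7 * x[t - 1] + eps[t]
best_p = min(range(5), key=lambda p: aic_ar(x, p))
```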
[Table: frequencies of the ARMA orders (p, q), p, q = 0, ..., 5, selected by AIC and by BIC in a simulation study; the BIC choices are concentrated on far fewer orders than the AIC choices]
Integrated processes
Difference operator
ΔX_t = X_t − X_{t−1} = (1 − L) X_t
Integrated processes
Definition
Integrated processes
Definition
A process (X_t) is integrated of order d, written I(d), if Δ^d X_t is stationary, I(0), while Δ^{d−1} X_t is not. For example, if X_t is I(2), then Δ²X_t is I(0).
Integrated processes
Definition
ΔX_t = X_t − X_{t−1} = b + ε_t
or, more generally, ΔX_t = b + ψ(L) ε_t
Integrated processes
Definition
ΔX_t = b + ε_t − ε_{t−1}
 = b + θ(L) ε_t
Integrated processes
Definition
(1 − φL)(1 − L) X_t = b + ε_t
X_t = b + (1 + φ) X_{t−1} − φ X_{t−2} + ε_t
Integrated processes
Deterministic versus stochastic trends
Integrated processes
Integrated processes and parameter estimation
Integrated processes
Unit root tests
X_t = deterministics + φ X_{t−1} + ε_t
ΔX_t = deterministics + (φ − 1) X_{t−1} + ε_t
with π := φ − 1
Integrated processes
Unit root tests
H₀: φ = 1 (unit root) vs H₁: |φ| < 1
or, equivalently,
H₀: π = 0 (unit root) vs H₁: π < 0

Unit root tests are one-sided; explosive processes are ruled out.
Rejecting the null hypothesis is evidence in favour of stationarity.
If the null hypothesis is not rejected, there could be a unit root.
Integrated processes
DF test and ADF test
ΔX_t = π X_{t−1} + ε_t
or ΔX_t = a + π X_{t−1} + ε_t
or ΔX_t = a + bt + π X_{t−1} + ε_t
Integrated processes
DF test and ADF test
H₀: π = 0 or H₀: π = 0, a = 0
H₁: π < 0 or H₁: π < 0, a ≠ 0
Integrated processes
DF test and ADF test
H₀: π = 0 or H₀: π = 0, b = 0
H₁: π < 0 or H₁: π < 0, b ≠ 0
Integrated processes
DF test and ADF test
The test statistic is the usual t ratio, τ = π̂ / σ̂_π̂; under H₀, however, it follows the Dickey-Fuller distribution rather than the t distribution.
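The τ statistic can be computed by hand from the regression with constant (a Python/numpy sketch; for real applications one would use a packaged implementation together with the Dickey-Fuller critical values). A stationary series gives a strongly negative τ, a random walk does not.

```python
import numpy as np

def df_tau(x):
    # Dickey-Fuller regression: Delta x_t = a + pi * x_{t-1} + eps_t;
    # returns tau = pi_hat / se(pi_hat)
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)
    X = np.column_stack([np.ones(dx.size), x[:-1]])
    beta, *_ = np.linalg.lstsq(X, dx, rcond=None)
    resid = dx - X @ beta
    s2 = resid @ resid / (dx.size - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(5)
eps = rng.normal(size=500)
stationary = np.zeros(500)
for t in range(1, 500):
    stationary[t] = 0.5 * stationary[t - 1] + eps[t]
tau_stat = df_tau(stationary)
tau_rw = df_tau(np.cumsum(rng.normal(size=500)))
```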
Integrated processes
Regression with integrated processes
Integrated processes
Regression with integrated processes
Definition: Cointegration
Two stochastic processes (X_t)_{t∈T} and (Y_t)_{t∈T} are cointegrated if both
processes are I(1) and there is a constant β such that the process
(Y_t − βX_t) is I(0).

If β is known, cointegration can be tested using a standard unit root
test on the process (Y_t − βX_t).
If β is unknown, it can be estimated from the linear regression
Y_t = α + β X_t + u_t
and cointegration is tested using a modified unit root test on the
residual process (û_t)_{t=1,...,T}.
GARCH models
Conditional expectation
GARCH models
Conditional variance
GARCH models
Rules for conditional expectations
GARCH models
Basics
Some economic time series show volatility clusters, e.g. stock returns,
commodity price changes, inflation rates, . . .
Simple autoregressive models cannot capture volatility clusters since
their conditional variance is constant.
Example: stationary AR(1) process, X_t = φ X_{t−1} + ε_t with |φ| < 1; then

Var(X_t) = σ_X² = σ² / (1 − φ²),

and the conditional variance is

Var(X_t | X_{t−1}) = σ².
GARCH models
ARCH(1)-process
Definition: ARCH(1)-process
The stochastic process (Xt )tZ is called ARCH(1)-process if
E(X_t | X_{t−1}) = 0
Var(X_t | X_{t−1}) = σ_t² = α₀ + α₁ X²_{t−1}
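A minimal ARCH(1) simulator (a Python/numpy sketch; the course's own scripts are in R). With the parameters below, the unconditional variance should approach α₀/(1 − α₁) = 1.

```python
import numpy as np

def arch1_sim(a0, a1, T, seed=0):
    # X_t = sigma_t * eps_t with sigma_t^2 = a0 + a1 * X_{t-1}^2
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=T)
    x = np.zeros(T)
    for t in range(1, T):
        sigma2 = a0 + a1 * x[t - 1] ** 2
        x[t] = np.sqrt(sigma2) * eps[t]
    return x

x = arch1_sim(a0=0.5, a1=0.5, T=20000, seed=2)
```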
GARCH models
ARCH(1)-process
Moments of the ARCH(1) process:
E(X_t | X_{t−1}) = 0 and hence E(X_t) = 0
Var(X_t | X_{t−1}) = α₀ + α₁ X²_{t−1} and hence Var(X_t) = α₀ / (1 − α₁)
Cov(X_t, X_{t−i}) = 0 for i > 0
GARCH models
ARCH(1)-process
Rewriting the ARCH(1) process in terms of squares:

X_t² = α₀ + α₁ X²_{t−1} + v_t with v_t = σ_t² (ε_t² − 1)

Thus, squared returns of an ARCH(1) process follow an AR(1) process.
The process (v_t)_{t∈Z} is white noise:
E(v_t) = 0
Var(v_t) = E(v_t²) = const.
Cov(v_t, v_{t−i}) = 0 (i = 1, 2, ...)
GARCH models
Estimation of an ARCH(1)-process
OLS estimator of α₁ in the regression of X_t² on X²_{t−1}:

α̂₁ = Σ_{t=2}^{T} (X_t² − m)(X²_{t−1} − m) / Σ_{t=2}^{T} (X²_{t−1} − m)²

where m denotes the sample mean of the squared observations.

Careful: these estimators are only consistent if the kurtosis exists
(i.e. if α₁ < √(1/3)).
Test of ARCH-effects
H₀: α₁ = 0
H₁: α₁ > 0
GARCH models
Estimation of an ARCH(1)-process
Under H₀, √T α̂₁ is approximately N(0, 1) distributed; reject H₀ if
√T α̂₁ > Φ⁻¹(1 − α).
Alternatively, under H₀,
T α̂₁² ≈ TR² ~ χ²₁ (approximately);
reject H₀ if TR² exceeds the (1 − α) quantile of the χ²₁ distribution.
GARCH models
ARCH(p)-process
Definition: ARCH(p)-process
The stochastic process (Xt )tZ is called ARCH(p)-process if
E(X_t | X_{t−1}, ..., X_{t−p}) = 0
Var(X_t | X_{t−1}, ..., X_{t−p}) = σ_t² = α₀ + α₁ X²_{t−1} + ... + α_p X²_{t−p}
GARCH models
ARCH(p)-process
Example of an ARCH(p)-process
X_t = σ_t ε_t
where (ε_t)_{t∈Z} is white noise with σ_ε² = 1 and
σ_t = √(α₀ + α₁ X²_{t−1} + ... + α_p X²_{t−p})

An ARCH(p) process is weakly stationary if all roots of
1 − α₁ z − α₂ z² − ... − α_p z^p = 0 are outside the unit circle.
Then, for all t ∈ Z, E(X_t) = 0 and

Var(X_t) = α₀ / (1 − Σ_{i=1}^{p} α_i)
GARCH models
Estimation of ARCH(p) models
OLS estimation of

X_t² = α₀ + α₁ X²_{t−1} + ... + α_p X²_{t−p} + v_t
Test of ARCH-effects
H₀: α₁ = α₂ = ... = α_p = 0 vs H₁: not H₀
GARCH models
Maximum likelihood estimation
For independent and identically distributed observations:

L(θ) = Π_{t=1}^{T} f_X(x_t; θ)
ln L(θ) = Σ_{t=1}^{T} ln f_X(x_t; θ)

ML estimate:
θ̂ = argmax_θ ln L(θ)
GARCH models
Maximum likelihood estimation
If the observations are independent,

f_{X₁,...,X_T}(x₁, ..., x_T) = Π_{t=1}^{T} f_{X_t}(x_t)
or
ln f_{X₁,...,X_T}(x₁, ..., x_T) = Σ_{t=1}^{T} ln f_{X_t}(x_t);

if they are identically distributed as well, this equals Σ_{t=1}^{T} ln f_X(x_t).
GARCH models
Maximum likelihood estimation
For dependent observations, the joint density is factorized into conditional densities. Under conditional normality,

f_{X₁,...,X_T}(x₁, ..., x_T) = f_{X₁}(x₁) Π_{t=2}^{T} (1/√(2πσ_t²)) exp(−(1/2)(x_t/σ_t)²)
GARCH models
Maximum likelihood estimation
ln L = ln f_{X₁}(x₁) − ((T−1)/2) ln 2π − (1/2) Σ_{t=2}^{T} ln σ_t² − (1/2) Σ_{t=2}^{T} (x_t/σ_t)²

where σ_t² = α₀ + α₁ x²_{t−1}
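The conditional log-likelihood can be maximized numerically. This Python sketch uses a crude grid search instead of a proper optimizer to stay dependency-free; the data are simulated with true values α₀ = 0.5, α₁ = 0.4.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 5000
x = np.zeros(T)
eps = rng.normal(size=T)
for t in range(1, T):
    x[t] = np.sqrt(0.5 + 0.4 * x[t - 1] ** 2) * eps[t]

def neg_loglik(a0, a1):
    # minus the conditional Gaussian log-likelihood (constant terms dropped)
    s2 = a0 + a1 * x[:-1] ** 2              # sigma_t^2 for t = 2..T
    return 0.5 * np.sum(np.log(s2) + x[1:] ** 2 / s2)

grid = np.arange(0.05, 1.0, 0.05)
a0_hat, a1_hat = min(((a0, a1) for a0 in grid for a1 in grid),
                     key=lambda p: neg_loglik(*p))
```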
GARCH models
GARCH(p,q)-process
Definition: GARCH(p,q)-process
The stochastic process (X_t)_{t∈Z} is called GARCH(p, q) process if
E(X_t | X_{t−1}, X_{t−2}, ...) = 0
Var(X_t | X_{t−1}, X_{t−2}, ...) = σ_t²
 = α₀ + α₁ X²_{t−1} + ... + α_p X²_{t−p} + β₁ σ²_{t−1} + ... + β_q σ²_{t−q}
for t ∈ Z with α_i, β_i ≥ 0.
Often, an additional assumption is that
(X_t | X_{t−1} = x_{t−1}, X_{t−2} = x_{t−2}, ...) ~ N(0, σ_t²)
GARCH models
GARCH(p,q)-process
A GARCH(1, 1) process has an ARCH(∞) representation:

σ_t² = α₀/(1 − β₁) + α₁ Σ_{i=1}^{∞} β₁^{i−1} X²_{t−i}

Unconditional variance:

Var(X_t) = α₀ / (1 − Σ_{i=1}^{p} α_i − Σ_{j=1}^{q} β_j)
GARCH models
GARCH(p,q)-process
The GARCH(p, q) process is weakly stationary if

Σ_{i=1}^{p} α_i + Σ_{j=1}^{q} β_j < 1
GARCH models
Estimation of GARCH(p,q)-processes
As before, the joint density is factorized into conditional normal densities:

f_{X₁,...,X_T}(x₁, ..., x_T) = f_{X₁}(x₁) Π_{t=2}^{T} (1/√(2πσ_t²)) exp(−(1/2)(x_t/σ_t)²)
GARCH models
Estimation of GARCH(p,q)-processes
ln L = ln f_{X₁}(x₁) − ((T−1)/2) ln 2π − (1/2) Σ_{t=2}^{T} ln σ_t² − (1/2) Σ_{t=2}^{T} (x_t/σ_t)²

with σ_t² = α₀ + α₁ x²_{t−1} + β₁ σ²_{t−1} and σ₁² = α₀
GARCH models
Estimation of GARCH(p,q)-processes
Conditional h-step forecast of the volatility σ²_{t+h} in a GARCH(1, 1) model:

E(σ²_{t+h} | X_t, X_{t−1}, ...) = (α₁ + β₁)^h (σ_t² − α₀/(1 − α₁ − β₁)) + α₀/(1 − α₁ − β₁)
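The closed form can be checked against the one-step forecast recursion E(σ²_{t+h}) = α₀ + (α₁ + β₁) E(σ²_{t+h−1}), starting from σ_t² (a numerical sketch with illustrative parameter values; the forecast converges to the unconditional variance α₀/(1 − α₁ − β₁)).

```python
a0, a1, b1 = 0.1, 0.15, 0.8
sigma2_t = 3.0                        # current conditional variance
sbar2 = a0 / (1 - a1 - b1)            # unconditional variance (= 2.0)

def forecast_recursive(h):
    f = sigma2_t
    for _ in range(h):
        f = a0 + (a1 + b1) * f
    return f

def forecast_closed(h):
    return (a1 + b1) ** h * (sigma2_t - sbar2) + sbar2

vals = [(forecast_recursive(h), forecast_closed(h)) for h in range(1, 20)]
```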
GARCH models
Residuals of an estimated GARCH(1,1) model
Careful: Residuals are slightly different from what you know from OLS
regressions
Estimates: α̂₀, α̂₁, β̂₁, μ̂
From σ̂_t² = α̂₀ + α̂₁ X²_{t−1} + β̂₁ σ̂²_{t−1} and X_t = μ + σ_t ε_t we calculate
the standardized residuals

ε̂_t = (X_t − μ̂) / σ̂_t = (X_t − μ̂) / √(α̂₀ + α̂₁ X²_{t−1} + β̂₁ σ̂²_{t−1})
GARCH models
AR(p)-ARCH(q)-models
X_t = ν + φ₁ X_{t−1} + ε_t
σ_t² = α₀ + α₁ ε²_{t−1}
where ε_t ~ N(0, σ_t²)
(mean equation / variance equation)
Maximum likelihood estimation
GARCH models
Extensions of the GARCH model