Time Series
Jorge Bravo
Universidad de Chile
July 2015
The mean, variance and autocovariances of the process y_t are:
μ_t = E(y_t)
γ_{0,t} = E[(y_t - μ_t)²] = Var(y_t)
γ_{j,t} = E[(y_t - μ_t)(y_{t-j} - μ_{t-j})] = Cov(y_t, y_{t-j})
In other words, {y_t} is weakly stationary if its first two moments are
time invariant. Some people refer to weakly stationary processes as
covariance stationary processes.
Notice:
F(y_1, y_3) = F(y_4, y_6) → two periods apart
F(y_1, y_6) = F(y_10, y_15) → five periods apart
i.e. the joint distribution does not depend on t (in weak stationarity
we do not impose these stronger conditions on the joint distribution).
In what follows, when we say stationary we will be referring to the
covariance stationarity definition.
Definition: Ergodicity
A covariance stationary process {y_t}_{t=1}^T is ergodic if sample
moments converge in probability to population moments; i.e. if
ȳ →p μ, γ̂_j →p γ_j and ρ̂_j →p ρ_j.
Intuitively, a stochastic process {y_t}_{t=1}^T is ergodic if any two
collections of random variables partitioned far apart in the sequence
are essentially independent.
More intuition: {y_t}_{t=1}^T is ergodic for the mean if
(1/T) Σ_{t=1}^T y_t → E(y_t) = μ = plim_{I→∞} (1/I) Σ_{i=1}^I y_t^{(i)},
i.e. the time average over one realization converges to the ensemble
average over independent realizations i = 1, ..., I at a fixed date t.
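A small simulation makes the ergodic-for-the-mean statement concrete. This is only an illustrative sketch: the AR(1) process, the parameter values, the sample sizes and the helper function simulate_ar1 are my own choices, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(phi0, phi, sigma, T, rng):
    """Simulate a stationary AR(1): y_t = phi0 + phi*y_{t-1} + eps_t."""
    y = np.empty(T)
    y[0] = phi0 / (1 - phi)          # start at the population mean
    eps = rng.normal(0.0, sigma, T)
    for t in range(1, T):
        y[t] = phi0 + phi * y[t - 1] + eps[t]
    return y

phi0, phi, sigma = 1.0, 0.5, 1.0
mu = phi0 / (1 - phi)                # population mean = 2

# Time average over one long realization (ergodicity for the mean)
time_avg = simulate_ar1(phi0, phi, sigma, T=100_000, rng=rng).mean()

# Ensemble average over many independent realizations at a fixed date t
t_fixed = 500
ensemble = [simulate_ar1(phi0, phi, sigma, T=t_fixed + 1, rng=rng)[t_fixed]
            for _ in range(5_000)]
ensemble_avg = np.mean(ensemble)

print(mu, time_avg, ensemble_avg)    # all three should be close to 2
```

Both averages approach μ, which is the practical content of the definition above.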
The first and second properties are the absence of any serial
correlation or predictability. The third property is conditional
homoskedasticity, i.e. a constant conditional variance.
Remarks:
Notation: ε_t ~ WN(0, σ²).
This process is a Gaussian white noise process if ε_t ~ N(0, σ²).
By construction it is a stationary process (a collection of uncorrelated
random variables with mean 0).
Example: the error term of a classical linear regression model is a white
noise.
[Figure: a simulated Gaussian white noise series, ε_t ~ GWN(0, 1), 500 observations.]
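A series like the one in the figure can be reproduced in a few lines; a minimal sketch (the seed and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
e = rng.normal(loc=0.0, scale=1.0, size=500)   # Gaussian white noise, GWN(0, 1)

# Sample moments: mean near 0, variance near 1
print(e.mean(), e.var())

# Sample autocorrelations at lags 1..5 are all close to 0
for k in range(1, 6):
    r_k = np.corrcoef(e[:-k], e[k:])[0, 1]
    print(k, round(r_k, 3))
```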
Deterministic trend: subtracting a linear trend from the series gives
ỹ_t = y_t - β_1 t = β_0 + ε_t
with
E(ỹ_t) = β_0
Var(ỹ_t) = σ²
[Figure: a simulated trend-stationary series, y_t = 0.2 + 0.1t + e_t, 200 observations.]
Random walk: y_t = y_{t-1} + ε_t. Substituting recursively,
y_t = Σ_{s=0}^{t-1} ε_{t-s} + y_0
Thus
E(y_t) = y_0
Var(y_t) = Σ_{s=0}^{t-1} E(ε²_{t-s}) = tσ²
Random walk with drift: y_t = β_0 + y_{t-1} + ε_t, so that
y_t = β_0 t + Σ_{s=0}^{t-1} ε_{t-s} + y_0
Thus
E(y_t) = β_0 t + y_0
Var(y_t) = Σ_{s=0}^{t-1} E(ε²_{t-s}) = tσ²
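As a quick check of these two formulas, one can simulate many random walks with drift and compare the cross-sectional mean and variance at date t with β_0·t + y_0 and t·σ². A rough sketch (the drift, σ, initial value and number of replications are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, sigma, y0 = 0.1, 1.0, 5.0     # drift, innovation s.d., initial value
T, n_rep = 200, 20_000               # horizon and number of replications

eps = rng.normal(0.0, sigma, size=(n_rep, T))
y = y0 + beta0 * np.arange(1, T + 1) + np.cumsum(eps, axis=1)

t = T - 1                                # look at the last date, time index T
print(y[:, t].mean(), beta0 * T + y0)    # E(y_T) = beta0*T + y0
print(y[:, t].var(),  T * sigma**2)      # Var(y_T) = T*sigma^2
```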
[Figure: Random walk. Two simulated paths of y_t = y_{t-1} + e_t, 1000 observations each.]
Remarks
The effect of the initial value, y_0, stays in the process.
The innovations ε_{t-s} are accumulated, Σ_{s=0}^{t-1} ε_{t-s}. This is called a stochastic trend.
Note that shocks have permanent effects.
In the case of a deterministic trend, shocks have transitory effects. Why? (A small simulation below illustrates the difference.)
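A sketch of the comparison (the parameters and the shock size are arbitrary): add a one-off extra innovation at some date and trace its effect on the rest of the path. In the random walk the shock is carried forward forever; in the trend-stationary series it moves only the period in which it occurs.

```python
import numpy as np

rng = np.random.default_rng(7)
T, shock_date, shock = 300, 100, 5.0
eps = rng.normal(0.0, 1.0, T)
eps_shocked = eps.copy()
eps_shocked[shock_date] += shock          # one-off extra innovation

# Random walk: y_t = y_{t-1} + eps_t  ->  the shock shifts the whole future path
rw   = np.cumsum(eps)
rw_s = np.cumsum(eps_shocked)

# Trend-stationary: y_t = 0.2 + 0.1*t + eps_t  ->  only the shocked date moves
t = np.arange(T)
ts   = 0.2 + 0.1 * t + eps
ts_s = 0.2 + 0.1 * t + eps_shocked

print(rw_s[-1] - rw[-1])   # still 5.0 at the end of the sample (permanent effect)
print(ts_s[-1] - ts[-1])   # 0.0 (transitory effect)
```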
[Figure: a simulated random walk, y_t = y_{t-1} + e_t, alongside a simulated trend-stationary series, y_t = 0.2 + 0.1t + e_t.]
ARMA(p, q): y_t = φ_1 y_{t-1} + ... + φ_p y_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}
In what follows we will develop the tools used to identify and estimate
ARMA models following the Box-Jenkins methodology.
These models are useful for:
Statistical hypothesis testing (to prove some theory)
Forecasting
Some examples?
Example: a small structural model relating y_t, c_t and i_t (equations (1) to (3)), with one parameter restricted to lie in (0, 1) and another restricted to be positive; combining the equations yields a single dynamic equation for the series (equation (4)).
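Estimating and forecasting with such representations is exactly what the Box-Jenkins toolkit delivers. As a preview, here is a rough sketch of fitting an ARMA model to a simulated series; it assumes the statsmodels Python package (not something referenced in the slides), and the orders, parameter values and seed are arbitrary illustrative choices.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an ARMA(1,1): y_t = 0.6*y_{t-1} + eps_t + 0.3*eps_{t-1}
rng = np.random.default_rng(0)
T = 500
eps = rng.normal(0.0, 1.0, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + eps[t] + 0.3 * eps[t - 1]

# Estimate an ARMA(1,1) = ARIMA(1,0,1) by maximum likelihood
res = ARIMA(y, order=(1, 0, 1)).fit()
print(res.params)             # estimated constant, AR and MA coefficients, variance
print(res.forecast(steps=5))  # out-of-sample point forecasts
```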
The lag operator L is defined by L y_t = y_{t-1}. Properties:
L c = c (a constant is unchanged)
α(L y_t) = L(α y_t) = α y_{t-1} (it commutes with multiplication by a constant)
L^i L^j y_t = y_{t-i-j}
(L^i + L^j) y_t = y_{t-i} + y_{t-j} (it is distributive over the addition operator)
L^{-i} y_t = y_{t+i} (forward operator)
We can define the difference operator as:
Δy_t = y_t - y_{t-1} = (1 - L) y_t
Δy_{t-1} = y_{t-1} - y_{t-2} = L(1 - L) y_t
and so on.
MA(1) Process
y_t = θ_0 + ε_t + θ ε_{t-1},   ε_t ~ WN(0, σ²)
y_t = θ_0 + (1 + θL) ε_t
MA(1) Process
Unconditional mean
E[y_t] = θ_0
Unconditional variance
γ_0 = Var[y_t] = (1 + θ²)σ²
Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0
Conditional mean
E[y_t | Ω_{t-1}] = θ_0 + θ ε_{t-1}
Conditional variance
Var[y_t | Ω_{t-1}] = σ²
MA(1) Process
Autocovariance for k ≥ 1:
γ_1 = θσ²
γ_k = 0 for k > 1
Autocorrelation: ρ_j = γ_j / γ_0. Thus we have:
ρ_1 = γ_1/γ_0 = θσ² / [(1 + θ²)σ²] = θ / (1 + θ²)
ρ_k = γ_k/γ_0 = 0 for k > 1
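A sketch checking ρ_1 = θ/(1 + θ²) against the sample autocorrelation of a simulated MA(1); the parameter values, sample size and the helper sample_autocorr are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
theta0, theta, sigma, T = 0.2, 0.5, 1.0, 200_000

eps = rng.normal(0.0, sigma, T + 1)
y = theta0 + eps[1:] + theta * eps[:-1]       # y_t = theta0 + eps_t + theta*eps_{t-1}

def sample_autocorr(x, k):
    """Sample autocorrelation of x at lag k."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

rho1_theory = theta / (1 + theta**2)          # = 0.4 here
print(rho1_theory, sample_autocorr(y, 1))     # close to each other
print(sample_autocorr(y, 2))                  # close to 0, as rho_k = 0 for k > 1
```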
[Figure: simulated MA(1) series, y_t = θ_0 + ε_t + θ ε_{t-1}, with constant 0.2 and θ = 0.5 in one panel and θ = -0.5 in the other, 100 observations each.]
MA(2) Process
y_t = θ_0 + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2},   ε_t ~ WN(0, σ²)
y_t = θ_0 + (1 + θ_1 L + θ_2 L²) ε_t
MA(2) Process
Properties
Unconditional mean
E[y_t] = θ_0
Unconditional variance
γ_0 = Var[y_t] = (1 + θ_1² + θ_2²)σ²
Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0
Conditional mean
E[y_t | Ω_{t-1}] = θ_0 + θ_1 ε_{t-1} + θ_2 ε_{t-2}
Conditional variance
Var[y_t | Ω_{t-1}] = σ²
MA(2) Process
Autocovariance for k ≥ 1:
γ_1 = (θ_1 + θ_1 θ_2)σ²
γ_2 = θ_2 σ²
γ_k = 0 for k > 2
Autocorrelation function:
ρ_1 = (θ_1 + θ_1 θ_2) / (1 + θ_1² + θ_2²)
ρ_2 = θ_2 / (1 + θ_1² + θ_2²)
ρ_k = 0 for k > 2
[Figure: a simulated MA(2) series, y_t = θ_0 + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}, with constant 0.2, θ_1 = 0.5 and θ_2 = 0.4, 100 observations.]
MA(q) Process
y_t = θ_0 + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q},   ε_t ~ WN(0, σ²)
y_t = θ_0 + (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) ε_t
MA(q) Process
Unconditional mean
E[y_t] = θ_0
Unconditional variance
γ_0 = (1 + θ_1² + θ_2² + ... + θ_q²)σ²
Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0
Conditional mean
E[y_t | Ω_{t-1}] = θ_0 + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}
Conditional variance
Var[y_t | Ω_{t-1}] = σ²
MA(q) Process
Autocovariance for k ≥ 1:
γ_k = (θ_k + θ_{k+1}θ_1 + ... + θ_q θ_{q-k})σ²   for k = 1, 2, ..., q
γ_k = 0   for k > q
Autocorrelation:
ρ_k = (θ_k + θ_{k+1}θ_1 + ... + θ_q θ_{q-k}) / (1 + θ_1² + θ_2² + ... + θ_q²)   for k = 1, 2, ..., q
ρ_k = 0   for k > q
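The MA(q) autocorrelation formula translates directly into code. A minimal sketch: the function name ma_acf and the convention of a leading coefficient equal to 1 (so the formula can be written as a single dot product) are my own, not from the slides.

```python
import numpy as np

def ma_acf(theta, n_lags=6):
    """Autocorrelations rho_k of an MA(q) with coefficients theta = [theta_1, ..., theta_q].

    Uses gamma_k proportional to sum_j psi_j * psi_{j+k} with psi = (1, theta_1, ..., theta_q),
    which is the slide's formula written with a leading coefficient of 1.
    """
    psi = np.r_[1.0, np.asarray(theta, dtype=float)]   # (1, theta_1, ..., theta_q)
    q = len(psi) - 1
    gamma0 = np.dot(psi, psi)                           # proportional to 1 + sum theta_j^2
    rho = []
    for k in range(1, n_lags + 1):
        gamma_k = np.dot(psi[:-k], psi[k:]) if k <= q else 0.0
        rho.append(gamma_k / gamma0)
    return np.array(rho)

# MA(2) with theta_1 = 0.5, theta_2 = 0.4: rho_1 and rho_2 nonzero, zero beyond lag 2
print(ma_acf([0.5, 0.4]))
```

The cut-off of the autocorrelations at lag q is what the Box-Jenkins identification step exploits for MA models.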
Autoregressive Process
A first-order autoregressive, or AR(1), process is:
y_t = φ_0 + φ y_{t-1} + ε_t,   ε_t ~ WN(0, σ²)
y_t - φ y_{t-1} = φ_0 + ε_t
(1 - φL) y_t = φ_0 + ε_t
AR(1) Process
Substituting lagged y recursively, we can turn the AR(1) into an MA(∞):
y_t = φ_0 + φ(φ_0 + φ y_{t-2} + ε_{t-1}) + ε_t
    = φ_0 + φφ_0 + φ² y_{t-2} + φ ε_{t-1} + ε_t
    = φ_0 + φφ_0 + φ²(φ_0 + φ y_{t-3} + ε_{t-2}) + φ ε_{t-1} + ε_t
    = φ_0 + φφ_0 + φ²φ_0 + φ³ y_{t-3} + φ² ε_{t-2} + φ ε_{t-1} + ε_t
    = ...
y_t = φ_0 Σ_{j=0}^∞ φ^j + Σ_{j=0}^∞ φ^j ε_{t-j}
AR(1) Process
y_t = φ_0 Σ_{j=0}^∞ φ^j + Σ_{j=0}^∞ φ^j ε_{t-j}
    = φ_0 / (1 - φ) + ε_t + φ ε_{t-1} + φ² ε_{t-2} + ...
AR(1) Process
Notice that |φ| < 1 is required for stationarity. For |φ| < 1 we have:
Unconditional mean
E[y_t] = φ_0 / (1 - φ)
Unconditional variance
γ_0 = Var[y_t] = σ² Σ_{j=0}^∞ φ^{2j} = σ² / (1 - φ²)
Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0
Conditional mean
E[y_t | Ω_{t-1}] = φ_0 + φ y_{t-1}
Conditional variance
Var[y_t | Ω_{t-1}] = σ²
AR(1) Process
Autocovariance
γ_1 = φ σ² / (1 - φ²)
In general,
γ_j = φ^j σ² / (1 - φ²)
AR(1) Process
Thus the autocorrelation function is:
ρ_1 = γ_1/γ_0 = [φ σ²/(1 - φ²)] / [σ²/(1 - φ²)] = φ
ρ_2 = γ_2/γ_0 = [φ² σ²/(1 - φ²)] / [σ²/(1 - φ²)] = φ²
...
ρ_j = γ_j/γ_0 = [φ^j σ²/(1 - φ²)] / [σ²/(1 - φ²)] = φ^j
AR(1) Process
Thus, in general,
ρ_j = γ_j / γ_0 = φ^j
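A sketch comparing ρ_j = φ^j with the sample autocorrelations of a simulated AR(1); φ, the sample size and the seed are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)
phi, sigma, T = 0.8, 1.0, 100_000

y = np.zeros(T)
eps = rng.normal(0.0, sigma, T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]      # AR(1) with phi_0 = 0

yc = y - y.mean()
gamma0 = np.dot(yc, yc)
for j in range(1, 6):
    rho_j = np.dot(yc[:-j], yc[j:]) / gamma0
    print(j, round(phi**j, 3), round(rho_j, 3))   # theory vs sample, nearly equal
```

In contrast with the MA(q), the autocorrelations decay geometrically instead of cutting off at a finite lag.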
[Figure: simulated AR(1) series, y_t = φ y_{t-1} + e_t, with φ = 0.95 and φ = 0.2, 100 observations each.]
[Figure: simulated AR(1) series, y_t = φ y_{t-1} + e_t, with φ = -0.95 and φ = -0.2, 100 observations each.]
AR(2) Process
A second-order autoregressive, or AR(2), process is:
y_t = φ_0 + φ_1 y_{t-1} + φ_2 y_{t-2} + ε_t,   ε_t ~ WN(0, σ²)
y_t - φ_1 y_{t-1} - φ_2 y_{t-2} = φ_0 + ε_t
(1 - φ_1 L - φ_2 L²) y_t = φ_0 + ε_t
Φ(L) y_t = φ_0 + ε_t
Characteristic equation:
1 - φ_1 x - φ_2 x² = 0
AR(2) Process
In the case of the AR(1) we have:
1 - φL = 0 ⟹ L = 1/φ,   and |1/φ| > 1 ⟺ |φ| < 1
For the AR(2):
1 - φ_1 L - φ_2 L² = 0 ⟹ L = [φ_1 ± √(φ_1² + 4φ_2)] / (-2φ_2),   and we need |L| > 1
Example:
y_t = 0.7 y_{t-1} + 0.35 y_{t-2}
(1 - 0.7L - 0.35L²) y_t = 0
Characteristic equation:
1 - 0.7x - 0.35x² = 0
AR(2) Process
Solutions (quadratic formula with a = -0.35, b = -0.7, c = 1):
x_{1,2} = [-b ± √(b² - 4ac)] / (2a) = [0.7 ± √(0.7² - 4·1·(-0.35))] / (2·(-0.35))
⟹ x_1 = -2.964;  x_2 = 0.964
Since |x_2| = 0.964 < 1, one root lies inside the unit circle, so this process is not stationary.
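The same root check can be done numerically. A sketch using numpy.roots on the characteristic polynomial (numpy.roots expects the coefficients ordered from the highest power down):

```python
import numpy as np

phi1, phi2 = 0.7, 0.35

# Characteristic equation 1 - phi1*x - phi2*x^2 = 0,
# written as [-phi2, -phi1, 1] with the highest power first for numpy.roots
roots = np.roots([-phi2, -phi1, 1.0])
print(roots)                       # approximately [-2.964, 0.964]

# Stationarity requires every root to lie outside the unit circle
print(np.all(np.abs(roots) > 1))   # False: this AR(2) is not stationary

# A stationary example: phi1 = 0.2, phi2 = 0.35
print(np.abs(np.roots([-0.35, -0.2, 1.0])))   # both moduli exceed 1
```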
AR(2) Process
Stationarity conditions for the AR(2):
φ_1 + φ_2 < 1
φ_2 - φ_1 < 1
|φ_2| < 1
[Figure: simulated AR(2) series, y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + e_t, with (φ_1 = 0.7, φ_2 = 0.35) and (φ_1 = 0.2, φ_2 = 0.35), 100 observations each.]
AR(2) Process
Unconditional mean
E[y_t] = φ_0 / (1 - φ_1 - φ_2)
Unconditional variance
γ_0 = E[(y_t - E[y_t])²] = (1 - φ_2)σ² / [(1 + φ_2)(1 + φ_1 - φ_2)(1 - φ_1 - φ_2)]
AR(2) Process
Autocovariance:
γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2}
Autocorrelation function:
ρ_j = γ_j / γ_0 = φ_1 ρ_{j-1} + φ_2 ρ_{j-2}   for j > 1
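The recursion is easy to iterate once ρ_0 = 1 and ρ_1 = φ_1/(1 - φ_2) are in hand (the expression for ρ_1 follows from the same Yule-Walker equation evaluated at lag 1, using ρ_{-1} = ρ_1). A short sketch; the function name and the coefficient values are illustrative:

```python
import numpy as np

def ar2_acf(phi1, phi2, n_lags=10):
    """Autocorrelations of a stationary AR(2) via the Yule-Walker recursion."""
    rho = np.empty(n_lags + 1)
    rho[0] = 1.0
    rho[1] = phi1 / (1.0 - phi2)              # from rho_1 = phi1*rho_0 + phi2*rho_1
    for j in range(2, n_lags + 1):
        rho[j] = phi1 * rho[j - 1] + phi2 * rho[j - 2]
    return rho

print(ar2_acf(0.2, 0.35, n_lags=5))
```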
AR(p) Process
A p-th order autoregressive, or AR(p), process is:
y_t = φ_0 + φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t
(1 - φ_1 L - φ_2 L² - ... - φ_p L^p) y_t = φ_0 + ε_t
Φ(L) y_t = φ_0 + ε_t
Characteristic equation:
1 - φ_1 x - φ_2 x² - ... - φ_p x^p = 0
Stationarity requires all roots of the characteristic equation to lie outside the unit circle; in particular, Σ_{j=1}^p φ_j < 1.
AR(p) Process
Substituting lagged y recursively:
y_t = φ_0 + φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t
    = φ_0 + φ_1(φ_0 + φ_1 y_{t-2} + φ_2 y_{t-3} + ... + φ_p y_{t-p-1} + ε_{t-1})
      + φ_2(φ_0 + φ_1 y_{t-3} + φ_2 y_{t-4} + ... + φ_p y_{t-p-2} + ε_{t-2}) + ...
      + φ_p(φ_0 + φ_1 y_{t-p-1} + φ_2 y_{t-p-2} + ... + φ_p y_{t-2p} + ε_{t-p}) + ε_t
If Σ_{j=1}^p φ_j < 1, then
y_t = φ_0 / (1 - φ_1 - φ_2 - ... - φ_p) + Σ_{j=0}^∞ ψ_j ε_{t-j},
where the ψ_j are the moving average weights implied by the AR coefficients.
AR(p) Process
Unconditional mean
E[y_t] = φ_0 / (1 - φ_1 - φ_2 - ... - φ_p)
Unconditional variance
γ_0 = φ_1 γ_1 + φ_2 γ_2 + ... + φ_p γ_p + σ²
Autocovariance
γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p}   for j ≥ 1
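Together with γ_{-k} = γ_k, these equations form a linear system in γ_0, γ_1, ..., γ_p that can be solved numerically. A sketch (the function name and the numerical example are my own) that cross-checks the result against the closed-form AR(2) variance given earlier:

```python
import numpy as np

def ar_autocovariances(phi, sigma2=1.0):
    """Solve the Yule-Walker equations for gamma_0, ..., gamma_p of a stationary AR(p).

    gamma_0 = phi_1*gamma_1 + ... + phi_p*gamma_p + sigma^2
    gamma_j = phi_1*gamma_{j-1} + ... + phi_p*gamma_{j-p},  j = 1, ..., p
    using gamma_{-k} = gamma_k.
    """
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    A = np.zeros((p + 1, p + 1))
    b = np.zeros(p + 1)
    b[0] = sigma2
    for j in range(p + 1):
        A[j, j] += 1.0
        for k in range(1, p + 1):
            A[j, abs(j - k)] -= phi[k - 1]
    return np.linalg.solve(A, b)        # [gamma_0, gamma_1, ..., gamma_p]

# AR(2) with phi_1 = 0.2, phi_2 = 0.35 and sigma^2 = 1
print(ar_autocovariances([0.2, 0.35]))

# Cross-check gamma_0 against the closed form for the AR(2) variance
phi1, phi2 = 0.2, 0.35
print((1 - phi2) / ((1 + phi2) * (1 + phi1 - phi2) * (1 - phi1 - phi2)))
```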