
Univariate Time Series

Jorge Bravo
Universidad de Chile

July 2015



Outline

1) Stationary stochastic processes

2) Non-stationary stochastic processes


Stationary Stochastic Processes


Notation

μ_t = E(y_t)
γ_{0,t} = E[(y_t - μ_t)²] = Var(y_t)
γ_{j,t} = E[(y_t - μ_t)(y_{t-j} - μ_{t-j})] = Cov(y_t, y_{t-j})


Definition: Weakly Stationary

A stochastic process {y_t} is weakly stationary if
1. E(y_t) = μ < ∞ (i.e. a constant) for all t
2. V(y_t) = γ_0 < ∞ (i.e. a constant) for all t
3. Cov(y_t, y_{t-j}) = γ_j, a function depending only on j.

In other words, {y_t} is weakly stationary if its first two moments are time invariant. Some people refer to weakly stationary processes as covariance stationary processes.


Definition: Strictly Stationary

A stochastic process {y_t} is strictly stationary if

F(y_t, y_{t+1}, y_{t+2}, ..., y_{t+s}) = F(y_{t+r}, y_{t+r+1}, y_{t+r+2}, ..., y_{t+r+s})   for all r and s.

In other words, {y_t} is strictly stationary if
1. The distribution of y_t and y_s is the same for all t and s,
2. The joint distribution of (y_t, y_{t+s}) is the same as that of (y_{t+r}, y_{t+r+s}) for all r and s,
3. The joint distribution of (y_t, y_{t+s}, y_{t+s+u}) is identical to that of (y_{t+r}, y_{t+r+s}, y_{t+r+s+u}) for all r, s and u, and so on.


Definition: Strictly Stationary (continued)

Notice:
F(y_1, y_3) = F(y_4, y_6) → two periods apart
F(y_1, y_6) = F(y_{10}, y_{15}) → five periods apart
i.e. the joint distribution does not depend on t (in weak stationarity we do not impose such stability conditions on the joint distribution, only on the first two moments).
In what follows, when we say "stationary" we will be referring to the covariance stationarity definition.


Definition: Ergodicity

A covariance stationary process {y_t}, t = 1, ..., T, is ergodic if sample moments converge in probability to population moments; i.e. if ȳ → μ, γ̂_j → γ_j and ρ̂_j → ρ_j (in probability).

Intuitively, a stochastic process {y_t} is ergodic if any two collections of random variables partitioned far apart in the sequence are essentially independent.

More intuition: {y_t} is ergodic for the mean if

(1/T) Σ_{t=1}^{T} y_t → E(Y_t) = μ = plim_{I→∞} (1/I) Σ_{i=1}^{I} y_t^{(i)},

i.e. the time average over a single realization converges to the same limit as the ensemble average taken over many independent realizations i = 1, ..., I of the process.
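A minimal Python sketch (numpy assumed; the non-ergodic example with a realization-specific level z is an illustrative addition, not from the slides) contrasting the time average of a single realization with the ensemble average:

    import numpy as np

    rng = np.random.default_rng(0)
    T, I = 100_000, 100_000
    mu = 2.0

    # Ergodic case: white noise around mu; the time average of one realization recovers mu
    y_ergodic = mu + rng.normal(size=T)
    print(y_ergodic.mean())                            # close to 2

    # Stationary but non-ergodic case: each realization carries its own permanent level z
    z = rng.normal()                                   # drawn once for this realization
    y_non_ergodic = mu + z + rng.normal(size=T)
    print(y_non_ergodic.mean())                        # close to 2 + z, not to 2

    # The ensemble average across many realizations at a fixed date still equals mu
    z_many = rng.normal(size=I)
    print((mu + z_many + rng.normal(size=I)).mean())   # close to 2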


White Noise Process

It is the building block for our time series models.

ε_t, for t = 1, 2, ..., T, is a white noise process if:
1. E[ε_t] = E[ε_t | ε_{t-1}, ε_{t-2}, ...] = E[ε_t | all information at t-1] = 0
2. Cov(ε_t, ε_{t-j}) = 0, for all t and j ≠ 0.
3. Var[ε_t] = Var[ε_t | ε_{t-1}, ε_{t-2}, ...] = Var[ε_t | all information at t-1] = σ²

The first and second properties amount to the absence of any serial correlation or predictability. The third property is conditional homoskedasticity, i.e. a constant conditional variance.


White Noise Process

Remarks:
Notation: ε_t ~ WN(0, σ²)
This process is a Gaussian white noise process if ε_t ~ N(0, σ²)
By construction it is a stationary process (a collection of uncorrelated random variables with mean 0)
Example: the error term of a classical linear regression model is a white noise.


White Noise Process (example)


Figure: Simulated Gaussian white noise, e_t ~ GWN(0, 1), 500 observations.
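A minimal Python sketch of the kind of simulation behind this figure (numpy and matplotlib assumed; seed and sample size are illustrative):

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(42)
    e = rng.normal(loc=0.0, scale=1.0, size=500)   # e_t ~ GWN(0, 1)

    # Sample moments should be close to the population values 0 and 1
    print(e.mean(), e.var())

    plt.plot(e)
    plt.xlabel("time")
    plt.title("Gaussian White Noise")
    plt.show()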


Stationary Process with a Deterministic Trend

Consider the process

y_t = β_0 + β_1 t + ε_t

Notice that:
E(y_t) = β_0 + β_1 t
Var(y_t) = σ²
y_t is a non-stationary process (its mean depends on t).


Stationary Process with a Deterministic Trend


But the deviation from the trend,

ỹ_t = y_t - β_1 t = β_0 + ε_t,

with
E(ỹ_t) = β_0
Var(ỹ_t) = σ²

is a stationary (detrended) process. Thus the process y_t is called trend-stationary.
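A minimal Python sketch (numpy assumed; the coefficients 0.2 and 0.1 are taken from the example on the next slide) showing that removing the deterministic trend leaves a series with constant mean and variance:

    import numpy as np

    rng = np.random.default_rng(0)
    T = 200
    t = np.arange(1, T + 1)
    eps = rng.normal(size=T)

    y = 0.2 + 0.1 * t + eps          # trend-stationary process
    y_detrended = y - 0.1 * t        # remove the (known) deterministic trend

    # The detrended series has constant mean (about 0.2) and constant variance (about 1)
    print(y_detrended.mean(), y_detrended.var())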

Stationary Process with a Deterministic Trend (example)

Figure: Simulated stationary process with a deterministic trend, y_t = 0.2 + 0.1t + e_t, 200 observations.


Pure Random Walk


Consider the process

y_t = y_{t-1} + ε_t

We can iterate backwards:

y_t = y_0 + Σ_{s=0}^{t-1} ε_{t-s}

Thus
E(y_t) = y_0
Var(y_t) = Σ_{s=0}^{t-1} E(ε²_{t-s}) = tσ²

y_t is non-stationary because its variance grows with t.



Random Walk with Drift


Now consider the process

y_t = a_0 + y_{t-1} + ε_t

As before, we can iterate backwards:

y_t = a_0 t + y_0 + Σ_{s=0}^{t-1} ε_{t-s}

Thus
E(y_t) = a_0 t + y_0
Var(y_t) = Σ_{s=0}^{t-1} E(ε²_{t-s}) = tσ²

y_t is non-stationary because both its mean and its variance grow with t.
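A minimal Python sketch (numpy assumed; the drift 0.15 matches the example plots that follow) showing that the cross-sectional variance of simulated random walks grows linearly with t:

    import numpy as np

    rng = np.random.default_rng(1)
    T, n_paths = 1000, 500
    eps = rng.normal(size=(n_paths, T))

    rw = np.cumsum(eps, axis=1)                  # pure random walk, y_0 = 0
    rw_drift = 0.15 * np.arange(1, T + 1) + rw   # random walk with drift a_0 = 0.15

    # Variance across paths grows roughly linearly in t (about t * sigma^2)
    print(rw.var(axis=0)[[99, 499, 999]])        # about 100, 500, 1000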



Random Walk (example)

Figure: Simulated random walk, y_t = y_{t-1} + e_t, and random walk with drift, y_t = 0.15 + y_{t-1} + e_t, 1000 observations each.


Random Walk with Drift

Remarks:
The effect of the initial value, y_0, stays in the process.
The innovations ε_{t-s} accumulate into Σ_{s=0}^{t-1} ε_{t-s}. This is called a stochastic trend.
Note that shocks have permanent effects.
In the deterministic-trend case, shocks have only transitory effects. Why?


Non-Stationary Processes


Stationary process with a deterministic trend vs Random walk with drift

Figure: Comparison of the stationary process with a deterministic trend, y_t = 0.2 + 0.1t + e_t, and the random walk with drift, y_t = 0.15 + y_{t-1} + e_t.


Univariate Stationary Time Series Models

The Box-Jenkins (1976) methodology proposes to estimate time series models of the form:

y_t = φ_0 + φ_1 y_{t-1} + ... + φ_p y_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}

Such models are called autoregressive moving average (ARMA) time series models.

Notice that we have a stochastic linear difference equation. We are going to review some basic concepts used to solve this type of equation. We will see that the stability conditions are necessary conditions for stationarity.
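As a preview of estimation, a minimal Python sketch (statsmodels assumed; the ARMA(1,1) order and parameter values are purely illustrative, not part of the slides):

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)

    # Simulate an ARMA(1,1): y_t = 0.5 y_{t-1} + eps_t + 0.3 eps_{t-1}
    T = 500
    eps = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = 0.5 * y[t - 1] + eps[t] + 0.3 * eps[t - 1]

    # An ARMA(p, q) is an ARIMA(p, 0, q); estimates should be near 0.5 and 0.3
    result = ARIMA(y, order=(1, 0, 1)).fit()
    print(result.params)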


Univariate Stationary Time Series Models

In what follows we will develop the tools used to identify and estimate ARMA models following the Box-Jenkins methodology.
These models are useful for:
Statistical hypothesis testing (to test a theory)
Forecasting

Some examples?


Univariate Stationary Time Series Models

The Random Walk Hypothesis: The random walk model suggests that day-to-day changes in the price of a stock should have a mean value of zero. If it were known that a capital gain could be made by buying a share on day t and selling it the very next day for an expected profit, efficient speculation would drive up the current price (or vice versa).

y_t = y_{t-1} + ε_t  →  y_{t+1} = y_t + ε_{t+1}

We can estimate

Δy_{t+1} = a_0 + a_1 y_t + ε_{t+1}

and test the null hypothesis H_0: a_0 = a_1 = 0.
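A minimal Python sketch of this test (numpy and statsmodels assumed; a simulated random walk stands in for actual price data):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    y = 100 + np.cumsum(rng.normal(size=1000))   # simulated random-walk "price"

    dy = np.diff(y)                              # Delta y_{t+1} = y_{t+1} - y_t
    X = sm.add_constant(y[:-1])                  # regressors: constant and y_t

    result = sm.OLS(dy, X).fit()
    print(result.params)                         # both should be close to zero
    print(result.f_pvalue)                       # joint test of a_0 = a_1 = 0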


Univariate Stationary Time Series Models

Reduced Form and Structural Equations: Consider the stochastic version of Samuelson's (1939) classic model:

y_t = c_t + i_t                                  (1)
c_t = α y_{t-1} + ε_{ct},   0 < α < 1            (2)
i_t = β(c_t - c_{t-1}) + ε_{it},   β > 0         (3)

In this Keynesian model y_t, c_t and i_t (output, consumption and investment) are endogenous.


Univariate Stationary Time Series Models

Equation (3) is a structural equation since it expresses the endogenous variable i_t as being dependent on the current realization of another endogenous variable, c_t.

A reduced-form equation is one expressing the value of a variable in terms of its own lags, lags of other endogenous variables, current and past values of exogenous variables, and a disturbance term.

Substituting (2) into (3), and then c_t and i_t into (1), we obtain a reduced-form equation for GDP (y_t):

y_t = α(1 + β) y_{t-1} - αβ y_{t-2} + (1 + β) ε_{ct} - β ε_{c,t-1} + ε_{it}      (4)


The lag (or backshift) operator


We define the lag operator L by:

L^i y_t = y_{t-i}

Properties:
Lc = c (the lag of a constant is the constant itself)
L(β y_t) = β(L y_t) = β y_{t-1} (it is commutative with multiplication by a constant)
L^i L^j y_t = y_{t-i-j}
(L^i + L^j) y_t = y_{t-i} + y_{t-j} (it is distributive over the addition operator)
L^{-i} y_t = y_{t+i} (forward operator)

We can define the difference operator as:

Δy_t = y_t - y_{t-1} = (1 - L) y_t
Δy_{t-1} = y_{t-1} - y_{t-2} = L(1 - L) y_t

and so on.
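A minimal Python sketch (numpy assumed) checking that the difference operator Δy_t = (1 - L)y_t matches a direct computation:

    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.normal(size=10).cumsum()

    dy_operator = (y - np.roll(y, 1))[1:]      # (1 - L) y_t, dropping the undefined first term
    dy_numpy = np.diff(y)                      # y_t - y_{t-1}
    print(np.allclose(dy_operator, dy_numpy))  # True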

Moving Average Process


A first-order moving average, or MA(1), process is:

y_t = μ + ε_t + θ ε_{t-1} = μ + (1 + θL) ε_t,   ε_t ~ WN(0, σ²)

The current value y_t is a function of a constant and of the current and lagged unobservable shocks.
Each shock has an impact over two periods: a contemporaneous impact and a one-period delayed impact.
The MA coefficient θ controls the degree of serial correlation. It may be positive or negative.
Note that p = 0 and q = 1 in the ARMA(p, q) model.


MA(1) Process
Unconditional mean
E[y_t] = μ

Unconditional variance
γ_0 = Var[y_t] = (1 + θ²)σ²

Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0, where Ω_{t-1} is the information set at t-1.

Conditional mean
E[y_t | Ω_{t-1}] = μ + θ ε_{t-1}

Conditional variance
Var[y_t | Ω_{t-1}] = σ²


MA(1) Process
Autocovariances for k ≥ 1:

γ_1 = θσ²
γ_k = 0 for k > 1

We define the autocorrelation function: ρ_j = γ_j / γ_0.

Thus we have:

ρ_1 = γ_1 / γ_0 = θσ² / [(1 + θ²)σ²] = θ / (1 + θ²)
ρ_k = γ_k / γ_0 = 0 for k > 1
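A minimal Python sketch (numpy assumed; μ = 0.2 and θ = 0.5 are illustrative) comparing the theoretical ρ_1 = θ/(1 + θ²) with its sample counterpart:

    import numpy as np

    rng = np.random.default_rng(0)
    theta, T = 0.5, 100_000
    eps = rng.normal(size=T + 1)
    y = 0.2 + eps[1:] + theta * eps[:-1]           # MA(1) with mu = 0.2, theta = 0.5

    rho1_theory = theta / (1 + theta ** 2)         # = 0.4
    rho1_sample = np.corrcoef(y[1:], y[:-1])[0, 1]
    print(rho1_theory, rho1_sample)                # close to each other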


MA(1) Process (examples)

Figure: Simulated MA(1) processes, y_t = μ + e_t + θ e_{t-1}, with μ = 0.2 and θ = 0.5 (one panel) and θ = -0.5 (the other), 100 observations each.


MA(2) Process

A second-order moving average, or MA(2), process is:

y_t = μ + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} = μ + (1 + θ_1 L + θ_2 L²) ε_t,   ε_t ~ WN(0, σ²)

Now we have p = 0 and q = 2 in the ARMA(p, q) model.


MA(2) Process
Properties:

Unconditional mean
E[y_t] = μ

Unconditional variance
γ_0 = Var[y_t] = (1 + θ_1² + θ_2²)σ²

Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0.

Conditional mean
E[y_t | Ω_{t-1}] = μ + θ_1 ε_{t-1} + θ_2 ε_{t-2}

Conditional variance
Var[y_t | Ω_{t-1}] = σ²


MA(2) Process
Autocovariances for k ≥ 1:

γ_1 = (θ_1 + θ_1 θ_2)σ²
γ_2 = θ_2 σ²
γ_k = 0 for k > 2

For the autocorrelation function we have:

ρ_1 = γ_1 / γ_0 = (θ_1 + θ_1 θ_2) / (1 + θ_1² + θ_2²)
ρ_2 = γ_2 / γ_0 = θ_2 / (1 + θ_1² + θ_2²)
ρ_k = 0 for k > 2


MA(2) Process (example)

Figure: Simulated MA(2) process, y_t = μ + e_t + θ_1 e_{t-1} + θ_2 e_{t-2}, with μ = 0.2, θ_1 = 0.5, θ_2 = 0.4, 100 observations.


MA(q) Process

A q-th order moving average, or MA(q), process is:

y_t = μ + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q} = μ + (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) ε_t,   ε_t ~ WN(0, σ²)

Now we have an ARMA(0, q) model.


MA(q) Process
Unconditional mean
E[y_t] = μ

Unconditional variance
γ_0 = (1 + θ_1² + θ_2² + ... + θ_q²)σ²

Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0.

Conditional mean
E[y_t | Ω_{t-1}] = μ + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}

Conditional variance
Var[y_t | Ω_{t-1}] = σ²


MA(q) Process

Autocovariances:

γ_k = (θ_k + θ_{k+1} θ_1 + ... + θ_q θ_{q-k})σ²   for k = 1, 2, ..., q
γ_k = 0   for k > q

For the autocorrelation function we have:

ρ_k = γ_k / γ_0 = (θ_k + θ_{k+1} θ_1 + ... + θ_q θ_{q-k}) / (1 + θ_1² + θ_2² + ... + θ_q²)   for k = 1, 2, ..., q
ρ_k = γ_k / γ_0 = 0   for k > q


Autoregressive Process
A first-order autoregressive, or AR(1), process is:

y_t = φ_0 + φ y_{t-1} + ε_t,   ε_t ~ WN(0, σ²)

We can write this in lag operator form:

y_t - φ y_{t-1} = φ_0 + ε_t
(1 - φL) y_t = φ_0 + ε_t

A positive (negative) φ means y_t and y_{t-1} are positively (negatively) correlated.


AR(1) Process
Substituting lagged y recursively, we can turn the AR(1) into an MA(∞):

y_t = φ_0 + φ(φ_0 + φ y_{t-2} + ε_{t-1}) + ε_t
    = φ_0 + φφ_0 + φ² y_{t-2} + φ ε_{t-1} + ε_t
    = φ_0 + φφ_0 + φ²(φ_0 + φ y_{t-3} + ε_{t-2}) + φ ε_{t-1} + ε_t
    = φ_0 + φφ_0 + φ²φ_0 + φ³ y_{t-3} + φ² ε_{t-2} + φ ε_{t-1} + ε_t
    = ...

Iterating all the way back:

y_t = φ_0 Σ_{j=0}^{∞} φ^j + Σ_{j=0}^{∞} φ^j ε_{t-j}


AR(1) Process

y_t = φ_0 Σ_{j=0}^{∞} φ^j + Σ_{j=0}^{∞} φ^j ε_{t-j}

If |φ| < 1, this is a general linear process with geometrically declining coefficients. The impact of a shock becomes smaller and smaller as time passes.

If |φ| = 1, then the sum does not converge:

y_t = ε_t + ε_{t-1} + ε_{t-2} + ...

i.e. shocks have permanent effects: the past never disappears (random walk or unit root process).
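A minimal Python sketch (numpy assumed; φ values illustrative) of the impulse responses φ^j implied by this MA(∞) form, contrasting a stationary case with the unit-root case:

    import numpy as np

    horizons = np.arange(0, 21)

    irf_stationary = 0.9 ** horizons   # |phi| < 1: the effect of a shock dies out geometrically
    irf_unit_root = 1.0 ** horizons    # phi = 1: the effect of a shock never dies out

    print(irf_stationary[[0, 5, 20]])  # 1.0, ~0.59, ~0.12
    print(irf_unit_root[[0, 5, 20]])   # 1.0, 1.0, 1.0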


AR(1) Process
Notice that |φ| < 1 is required for stationarity. For |φ| < 1 we have:

Unconditional mean
E[y_t] = φ_0 / (1 - φ)

Unconditional variance
γ_0 = Var[y_t] = σ² Σ_{j=0}^{∞} φ^{2j} = σ² / (1 - φ²)

Given that ε_t ~ WN(0, σ²), then E[ε_t | Ω_{t-1}] = 0.

Conditional mean
E[y_t | Ω_{t-1}] = φ_0 + φ y_{t-1}

Conditional variance
Var[y_t | Ω_{t-1}] = σ²


AR(1) Process

Autocovariances

γ_1 = φ σ² / (1 - φ²)

In general,

γ_j = φ^j σ² / (1 - φ²)


AR(1) Process
Thus the autocorrelation function is:

ρ_1 = γ_1 / γ_0 = [φ σ² / (1 - φ²)] / [σ² / (1 - φ²)] = φ
ρ_2 = γ_2 / γ_0 = [φ² σ² / (1 - φ²)] / [σ² / (1 - φ²)] = φ²
...
ρ_j = γ_j / γ_0 = [φ^j σ² / (1 - φ²)] / [σ² / (1 - φ²)] = φ^j


AR(1) Process

Thus

ρ_j = γ_j / γ_0 = [φ^j σ² / (1 - φ²)] / [σ² / (1 - φ²)] = φ^j

The autocorrelation function of an AR(1) exhibits geometric decay:
If φ is small, the autocorrelations decay rapidly to zero with j.
If φ is large (close to 1), the autocorrelations decay slowly.
The AR(1) parameter φ describes the persistence of the time series.
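A minimal Python sketch (numpy assumed; φ = 0.95 and φ = 0.2 match the examples on the next slide) comparing sample autocorrelations of a simulated AR(1) with the theoretical φ^j:

    import numpy as np

    def sample_acf(y, j):
        """Sample autocorrelation of y at lag j."""
        return np.corrcoef(y[j:], y[:-j])[0, 1]

    rng = np.random.default_rng(0)
    T = 20_000
    for phi in (0.95, 0.2):
        y = np.zeros(T)
        eps = rng.normal(size=T)
        for t in range(1, T):
            y[t] = phi * y[t - 1] + eps[t]
        sample = [round(sample_acf(y, j), 2) for j in (1, 2, 5)]
        theory = [round(phi ** j, 2) for j in (1, 2, 5)]
        print(phi, sample, theory)   # sample ACF roughly matches phi^j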


AR(1) Processes (examples)

Figure: Simulated AR(1) processes, y_t = φ y_{t-1} + e_t, with φ = 0.2 and φ = 0.95, 100 observations each.


AR(1) Process (examples)

Figure: Simulated AR(1) processes, y_t = φ y_{t-1} + e_t, with φ = -0.2 and φ = -0.95, 100 observations each.


AR(2) Process
A second-order autoregressive, or AR(2), process is:

y_t = φ_0 + φ_1 y_{t-1} + φ_2 y_{t-2} + ε_t,   ε_t ~ WN(0, σ²)

We can write this in lag operator form:

y_t - φ_1 y_{t-1} - φ_2 y_{t-2} = φ_0 + ε_t
(1 - φ_1 L - φ_2 L²) y_t = φ_0 + ε_t
Φ(L) y_t = φ_0 + ε_t

Characteristic equation:

1 - φ_1 x - φ_2 x² = 0

y_t is covariance stationary if all roots of the characteristic equation lie outside the unit circle (stationarity condition).

AR(2) Process
In the case of the AR(1) we have:

1 - φ_1 L = 0  ⇒  L = 1/φ_1,  and |1/φ_1| > 1  ⟺  |φ_1| < 1

In the case of the AR(2) we have:

1 - φ_1 L - φ_2 L² = 0  ⇒  L = [φ_1 ± √(φ_1² + 4φ_2)] / (-2φ_2),  and we require |L| > 1

Example:

y_t = 0.7 y_{t-1} + 0.35 y_{t-2} + ε_t
(1 - 0.7L - 0.35L²) y_t = ε_t

Characteristic equation:

1 - 0.7x - 0.35x² = 0


AR(2) Process

Solutions:

x_{1,2} = [-b ± √(b² - 4ac)] / (2a) = [0.7 ± √(0.7² - 4 × 1 × (-0.35))] / (2 × (-0.35))
⇒ x_1 ≈ -2.964,  x_2 ≈ 0.964

One root is inside the unit circle, so the stationarity condition fails.
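A minimal Python sketch (numpy assumed) verifying these roots and the stationarity check:

    import numpy as np

    # Characteristic polynomial 1 - 0.7x - 0.35x^2 (coefficients from highest degree down)
    roots = np.roots([-0.35, -0.7, 1.0])
    print(roots)                          # approximately [-2.964, 0.964]
    print(np.all(np.abs(roots) > 1))      # False: the stationarity condition fails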


AR(2) Process

A necessary condition for stationarity:

φ_1 + φ_2 < 1

Sufficient conditions for stationarity:

φ_1 + φ_2 < 1
φ_2 - φ_1 < 1
|φ_2| < 1


AR(2) Process (examples)

Figure: Simulated AR(2) processes, y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + e_t, with (φ_1 = 0.7, φ_2 = 0.35) and (φ_1 = 0.2, φ_2 = 0.35), 100 observations each.


AR(2) Process

Unconditional mean

E[y_t] = φ_0 / (1 - φ_1 - φ_2)

Unconditional variance

γ_0 = E[(y_t - E[y_t])²] = (1 - φ_2)σ² / [(1 + φ_2)(1 - φ_1 - φ_2)(1 + φ_1 - φ_2)]


AR(2) Process

Autocovariances

γ_j = E[(y_t - E(y_t))(y_{t-j} - E(y_t))]
γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2}   for j > 1

Autocorrelation function

ρ_j = γ_j / γ_0 = φ_1 ρ_{j-1} + φ_2 ρ_{j-2}   for j > 1


AR(p) Process
A p-th order autoregressive, or AR(p), process is:

y_t = φ_0 + φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t

We can write this in lag operator form:

(1 - φ_1 L - φ_2 L² - ... - φ_p L^p) y_t = φ_0 + ε_t
Φ(L) y_t = φ_0 + ε_t

Characteristic equation:

1 - φ_1 x - φ_2 x² - ... - φ_p x^p = 0

As before, y_t is covariance stationary if all roots of the characteristic equation lie outside the unit circle (stationarity condition).

A necessary condition for stationarity:

Σ_{j=1}^{p} φ_j < 1

AR(p) Process
Substituting lagged y recursively:

y_t = φ_0 + φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t
    = φ_0 + φ_1(φ_0 + φ_1 y_{t-2} + φ_2 y_{t-3} + ... + φ_p y_{t-p-1} + ε_{t-1})
        + φ_2(φ_0 + φ_1 y_{t-3} + φ_2 y_{t-4} + ... + φ_p y_{t-p-2} + ε_{t-2}) + ...
        + φ_p(φ_0 + φ_1 y_{t-p-1} + φ_2 y_{t-p-2} + ... + φ_p y_{t-2p} + ε_{t-p}) + ε_t

If Σ_{j=1}^{p} φ_j < 1, then

y_t = φ_0 / (1 - φ_1 - φ_2 - ... - φ_p) + Σ_{j=0}^{∞} ψ_j ε_{t-j},

where the ψ_j are the coefficients of the implied MA(∞) representation.


AR(p) Process
Unconditional mean

E[y_t] = φ_0 / (1 - φ_1 - φ_2 - ... - φ_p)

Unconditional variance

γ_0 = φ_1 γ_1 + φ_2 γ_2 + ... + φ_p γ_p + σ²

Autocovariances

γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p}   for j ≥ 1
