Supplementary Readings:
Wilks, Chapter 8
Recall:
Statistical Model: [Observed Data] = [Signal] + [Noise]
Cyclostationarity: statistics are a periodic function of time
Strict Stationarity
Time Series Modeling
All linear time series models can be written in the form:

$$x_{t+1} = \mu + \sum_{k=1}^{K} a_k\,(x_{t-k+1} - \mu) + \varepsilon_{t+1} + \sum_{m=1}^{M} b_m\,\varepsilon_{t-m+1}$$
Consider the special case of a simple autoregressive AR(K) model (M = 0):

$$x_{t+1} = \mu + \sum_{k=1}^{K} a_k\,(x_{t-k+1} - \mu) + \varepsilon_{t+1}$$

and, with K = 1,

$$x_{t+1} = \mu + r\,(x_t - \mu) + \varepsilon_{t+1}$$

This is a special case of a Markov process (a process for which the state of the system depends on previous states).
Assume the process has zero mean; then

$$x_{t+1} = r\,x_t + \varepsilon_{t+1}$$

This is a lag-one correlated process, or AR(1) process.
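A minimal NumPy sketch (illustrative only, not one of the course .m scripts) that generates such an AR(1) series directly from this recursion; the values of r, n, and the noise standard deviation below are arbitrary choices:

import numpy as np

def simulate_ar1(r, n, sigma_eps=1.0, seed=0):
    # zero-mean AR(1): x[t+1] = r * x[t] + eps[t+1]
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(n - 1):
        x[t + 1] = r * x[t] + rng.normal(0.0, sigma_eps)
    return x

x = simulate_ar1(r=0.7, n=1000)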
AR(1) Process

For simplicity, we assume zero mean. Multiplying the AR(1) recursion by $y_k$,

$$y_{k+1} = r\,y_k + \varepsilon_{k+1} \quad\Rightarrow\quad y_k\,y_{k+1} = r\,y_k^2 + \varepsilon_{k+1}\,y_k$$

and summing over k,

$$\sum_{k=1}^{n-1} y_k\,y_{k+1} = r \sum_{k=1}^{n-1} y_k^2 + \sum_{k=1}^{n-1} \varepsilon_{k+1}\,y_k$$

Since the noise is uncorrelated with $y_k$, the last sum is small, and

$$\hat r = \frac{\sum_{k=1}^{n-1} y_k\,y_{k+1}}{\sum_{k=1}^{n-1} y_k^2} \approx \frac{\tfrac{1}{n}\sum_i y_i\,y_{i+1}}{\sigma_y^2} = r_1,$$

the lag-1 sample autocorrelation.
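A hedged sketch of this estimator: simulate a zero-mean AR(1) series and form the ratio of lag-1 products to squared values; for large n the estimate should fall close to the true coefficient (parameter values below are arbitrary).

import numpy as np

rng = np.random.default_rng(1)
r_true, n = 0.7, 5000
y = np.zeros(n)
for t in range(n - 1):                        # zero-mean AR(1): y[k+1] = r*y[k] + eps[k+1]
    y[t + 1] = r_true * y[t] + rng.normal()

# r_hat = sum_k y_k y_{k+1} / sum_k y_k^2, the estimate derived above
r_hat = np.sum(y[:-1] * y[1:]) / np.sum(y[:-1] ** 2)
print(r_true, r_hat)                          # r_hat should be near 0.7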
AR(1) Process
[Figure: realizations of simulated AR(1) series and the histogram of their values; blue: r = 0.4, red: r = 0.7.]
Squaring the AR(1) recursion and averaging (the cross term vanishes because $\varepsilon_{k+1}$ is uncorrelated with $y_k$):

$$y_{k+1} = r\,y_k + \varepsilon_{k+1}$$

$$y_{k+1}^2 = (r\,y_k + \varepsilon_{k+1})^2$$

$$\overline{y_{k+1}^2} = r^2\,\overline{y_k^2} + \overline{\varepsilon_{k+1}^2} \quad\Rightarrow\quad \sigma_y^2 = r^2\sigma_y^2 + \sigma_\varepsilon^2$$

$$\sigma_y^2 = \frac{\sigma_\varepsilon^2}{1 - r^2}$$
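A quick numerical check of this variance relation (illustrative sketch; the long series is only to reduce sampling noise):

import numpy as np

rng = np.random.default_rng(2)
r, sigma_eps, n = 0.9, 1.0, 200_000
y = np.zeros(n)
for t in range(n - 1):
    y[t + 1] = r * y[t] + rng.normal(0.0, sigma_eps)

print(np.var(y))                        # sample variance of the AR(1) series
print(sigma_eps**2 / (1 - r**2))        # theoretical sigma_eps^2 / (1 - r^2), about 5.26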
AR(1) Process
[Figure: simulated AR(1) series with blue: r = 0.4, red: r = 0.7, green: r = 0.9.]

$$\sigma_y^2 = \frac{\sigma_\varepsilon^2}{1 - r^2}$$
AR(1) Process
$$y_{k+1} = r\,y_k + \varepsilon_{k+1}$$

With r = 1,

$$y_{k+1} = y_k + \varepsilon_{k+1},$$

a Random Walk (Brownian Motion).

[Figure: a simulated random-walk realization.]

The variance $\sigma_y^2 = \sigma_\varepsilon^2/(1 - r^2)$ is infinite for r = 1: the process is not stationary.
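A sketch illustrating the non-stationarity: across many independent random walks, the variance at step k keeps growing (roughly like k times the noise variance) instead of settling to a finite value.

import numpy as np

rng = np.random.default_rng(3)
n_real, n_steps = 2000, 1000
walks = np.cumsum(rng.normal(size=(n_real, n_steps)), axis=1)   # r = 1: y[k] = y[k-1] + eps[k]

for k in (10, 100, 1000):
    print(k, np.var(walks[:, k - 1]))       # grows roughly linearly with k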
AR(1) Process
Autocorrelation Function

We can approximate the autocorrelation at lag k by

$$r_k = \frac{\sum_{i=1}^{n-k} (x_i - \bar x)(x_{i+k} - \bar x)}{(n-k)\,\mathrm{var}(x)}, \qquad \bar x = \frac{1}{n}\sum_{i=1}^{n} x_i, \quad \mathrm{var}(x) = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar x)^2$$

Let us assume the series x has been de-meaned.
Then:

$$r_k = \frac{\sum_{i=1}^{n-k} x_i\,x_{i+k}}{(n-k)\,\mathrm{var}(x)}, \qquad \mathrm{var}(x) = \frac{1}{n}\sum_{i=1}^{n} x_i^2$$
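A small helper implementing this sample autocorrelation (a sketch, assuming a 1-D NumPy array; the de-meaning mirrors the assumption above):

import numpy as np

def sample_acf(x, max_lag):
    # r_k = sum_i x_i x_{i+k} / ((n - k) * var(x)) for the de-meaned series
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.sum(x * x) / n
    return np.array([np.sum(x[: n - k] * x[k:]) / ((n - k) * var)
                     for k in range(1, max_lag + 1)])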
Autocorrelation Function
Serial correlation function

[Figure: sample serial correlation function (left) of an observed time series spanning 1976-1996 (right).]
Substituting the recursion into itself,

$$y_{k+1} = r\,y_k + \varepsilon_{k+1} = r\,[\,r\,y_{k-1} + \varepsilon_k\,] + \varepsilon_{k+1}$$

$$y_{k+1} = r^2\,y_{k-1} + r\,\varepsilon_k + \varepsilon_{k+1} = r^2\,y_{k-1} + \varepsilon'_{k+1}$$

Recursively, we thus have for an AR(1) process the theoretical autocorrelation function

$$r_k = r^k = \exp(k\,\ln r) = \exp(-k/\tau), \qquad \tau = -1/\ln r$$
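A sketch comparing the sample autocorrelation of a short simulated AR(1) series with this theoretical curve; with only N = 100 points the sample values scatter noticeably around the theory.

import numpy as np

rng = np.random.default_rng(4)
r, n, max_lag = 0.5, 100, 25
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = r * x[t] + rng.normal()

x = x - x.mean()
var = np.sum(x * x) / n
sample = np.array([np.sum(x[: n - k] * x[k:]) / ((n - k) * var) for k in range(1, max_lag + 1)])

tau = -1.0 / np.log(r)                          # e-folding time of the theoretical ACF
theory = r ** np.arange(1, max_lag + 1)         # equivalently exp(-k / tau)
print(tau)
print(np.column_stack([sample[:5], theory[:5]]))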
AR(1) Process
Autocorrelation Function

[Figure: a simulated AR(1) series with r = 0.5, N = 100 (rednoise.m) and its sample (scf.m) and theoretical (tcf.m) autocorrelation functions.]

$$r_k = \frac{\sum_{i=1}^{n-k} x_i\,x_{i+k}}{(n-k)\,\mathrm{var}(x)} \qquad\text{vs.}\qquad r_k = r^k = \exp(k\,\ln r) = \exp(-k/\tau)$$
AR(1) Process
Autocorrelation Function

[Figure: a simulated AR(1) series with r = 0.5, N = 500 (rednoise.m) and its sample and theoretical autocorrelation functions.]
AR(1) Process
Autocorrelation Function

[Figure: glacial varve series (N = 1000) and its sample autocorrelation function compared with the theoretical AR(1) curve for r = 0.23.]
AR(1) Process
Autocorrelation Function

[Figure: Northern Hemisphere temperature series (N = 144) and its sample autocorrelation function compared with the theoretical AR(1) curve for r = 0.75.]
AR(1) Process
Autocorrelation Function

[Figure: linearly detrended Northern Hemisphere temperature series (N = 144) and its sample autocorrelation function compared with the theoretical AR(1) curve for r = 0.54.]
AR(1) Process
Autocorrelation Function

[Figure: Dec-Mar Nino3 index and its sample autocorrelation function out to lag 11.]
AR(1) Process
$$y_{k+1} = r\,y_k + \varepsilon_{k+1}$$

The sampling distribution for r is given by the sampling distribution for the slope parameter in linear regression:

$$\mathrm{se}_b = \left[\frac{s_\varepsilon^2}{\sum_i (x_i - \bar x)^2}\right]^{1/2}$$

Combined with $\sigma_y^2 = \sigma_\varepsilon^2/(1 - r^2)$, this gives

$$\sigma_r^2 \approx \frac{1 - r^2}{n}$$
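A Monte Carlo sketch of this result: repeatedly simulate an AR(1) series, re-estimate r each time, and compare the spread of the estimates with sqrt((1 - r^2)/n). The agreement is only approximate, since the formula is an asymptotic (large-n) result and the estimator carries a small negative bias.

import numpy as np

rng = np.random.default_rng(5)
r, n, n_trials = 0.6, 200, 2000
r_hats = np.empty(n_trials)
for j in range(n_trials):
    y = np.zeros(n)
    for t in range(n - 1):
        y[t + 1] = r * y[t] + rng.normal()
    r_hats[j] = np.sum(y[:-1] * y[1:]) / np.sum(y[:-1] ** 2)

print(np.std(r_hats))                  # Monte Carlo standard deviation of r_hat
print(np.sqrt((1 - r**2) / n))         # approximate theoretical value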
AR(1) Process
$$y_{k+1} = r\,y_k + \varepsilon_{k+1}$$

Serial correlation also inflates the sampling variance of the mean:

$$\mathrm{var}(\bar x) = V\,\frac{\sigma_x^2}{n}, \qquad n' = n/V, \qquad V = 1 + 2\sum_{k=1}^{n-1} r_k$$

where V is the variance inflation factor and n' is the effective sample size.

Recall that for an AR(1) series $r_k = r^k$, so

$$V \approx 1 + 2\sum_{k=1}^{\infty} r^k = 1 + \frac{2r}{1-r} = \frac{1+r}{1-r}, \qquad n' = n\,\frac{1-r}{1+r}$$

Suppose $r \to 1$: then $\ln r \approx r - 1$, so $\tau = -1/\ln r \approx 1/(1-r)$ and

$$V = 1 + \frac{2r}{1-r} \approx \frac{2}{1-r} \approx 2\tau$$
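A one-line consequence worth having as a helper (sketch; assumes the AR(1) closed form V = (1 + r)/(1 - r)): even modest serial correlation shrinks the effective number of independent samples substantially.

def effective_sample_size_ar1(n, r):
    # n' = n / V with V = (1 + r) / (1 - r) for an AR(1) series
    V = (1.0 + r) / (1.0 - r)
    return n / V

for r in (0.2, 0.5, 0.9):
    print(r, effective_sample_size_ar1(1000, r))   # about 667, 333, and 53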
For the AR(K) model,

$$x_{t+1} = \sum_{k=1}^{K} a_k\,x_{t-k+1} + \varepsilon_{t+1}$$

multiply both sides by $x_{t-k'+1}$ and average:

$$\overline{x_{t-k'+1}\,x_{t+1}} = \sum_{k=1}^{K} a_k\,\overline{x_{t-k'+1}\,x_{t-k+1}}$$

Dividing by $\sigma_x^2$ yields the Yule-Walker equations.
AR(K) Process
$$\begin{aligned}
r_1 &= a_1 + a_2 r_1 + a_3 r_2 + \cdots + a_K r_{K-1}\\
r_2 &= a_1 r_1 + a_2 + a_3 r_1 + \cdots + a_K r_{K-2}\\
r_3 &= a_1 r_2 + a_2 r_1 + a_3 + \cdots + a_K r_{K-3}\\
&\;\;\vdots\\
r_K &= a_1 r_{K-1} + a_2 r_{K-2} + a_3 r_{K-3} + \cdots + a_K
\end{aligned}
\qquad\text{(Yule-Walker Equations)}$$

Use the sample estimates

$$r_k = \frac{\sum_{i=1}^{n-k} x_i\,x_{i+k}}{(n-k)\,\mathrm{var}(x)}$$

and solve for the coefficients $a_1, \ldots, a_K$.
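A sketch of this procedure in NumPy: build the K-by-K matrix of sample autocorrelations r_{|i-j|} (with r_0 = 1 on the diagonal) implied by the equations above and solve the linear system for a_1, ..., a_K. Function and variable names are illustrative.

import numpy as np

def yule_walker(x, K):
    # Estimate AR(K) coefficients from the sample autocorrelations via Yule-Walker
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.sum(x * x) / n
    r = np.array([np.sum(x[: n - k] * x[k:]) / ((n - k) * var) for k in range(K + 1)])
    R = np.array([[r[abs(i - j)] for j in range(K)] for i in range(K)])   # r_0 = 1 on the diagonal
    return np.linalg.solve(R, r[1 : K + 1])                               # a_1, ..., a_K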
AR(K) Process

Several results obtained for the AR(1) model generalize readily to the AR(K) model:

$$r_m = \sum_{k=1}^{K} a_k\,r_{m-k}$$

$$\sigma_\varepsilon^2 = \sigma_x^2\left(1 - \sum_{k=1}^{K} a_k\,r_k\right)$$

$$\mathrm{var}(\bar x) = V\,\frac{\sigma_x^2}{n}, \qquad V = 1 + 2\sum_{k=1}^{n-1} r_k$$
The AR(2) model

$$x_{t+1} = a_1\,x_t + a_2\,x_{t-1} + \varepsilon_{t+1}$$

is particularly important because of the range of behavior it can describe with a parsimonious number of parameters. Solving the Yule-Walker equations for K = 2 gives

$$a_1 = \frac{r_1\,(1 - r_2)}{1 - r_1^2}, \qquad a_2 = \frac{r_2 - r_1^2}{1 - r_1^2}$$
For stationarity, we must have:

$$a_1 + a_2 < 1, \qquad a_2 - a_1 < 1, \qquad |a_2| < 1$$

[Figure: the triangular stationarity region in the $(a_1, a_2)$ plane, $-2 \le a_1 \le 2$, $-1 \le a_2 \le 1$.]
AR(2) Process

The Yule-Walker equations for K = 2 also readily give:

$$r_1 = \frac{a_1}{1 - a_2}, \qquad r_2 = a_2 + \frac{a_1^2}{1 - a_2}, \qquad r_m = \sum_{k=1}^{K} a_k\,r_{m-k} \ \ (m > 2)$$

Note that this model allows for independent lag-1 and lag-2 correlation, so that both positive correlation and negative correlation are possible.
[Figure: theoretical autocorrelation functions of AR(2) processes computed with artwo.m, plotted out to lag 20.]
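A sketch of the theoretical AR(2) autocorrelation function (roughly what a script like artwo.m would plot), using r_1 = a_1/(1 - a_2) to start the recursion r_m = a_1 r_{m-1} + a_2 r_{m-2}; the parameter values are arbitrary but lie inside the stationarity triangle.

import numpy as np

def ar2_theoretical_acf(a1, a2, max_lag):
    # Theoretical autocorrelations of a stationary AR(2) model
    r = np.empty(max_lag + 1)
    r[0] = 1.0
    r[1] = a1 / (1.0 - a2)                      # Yule-Walker solution for r_1
    for m in range(2, max_lag + 1):
        r[m] = a1 * r[m - 1] + a2 * r[m - 2]    # r_m = a1*r_{m-1} + a2*r_{m-2}
    return r

print(ar2_theoretical_acf(0.9, -0.5, 10))       # damped, oscillatory: mixed-sign correlations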
AR(K) Process
Selection Rules

Bayesian Information Criterion:

$$\mathrm{BIC}(m) = n\,\ln\!\left[\frac{n\,s_\varepsilon^2(m)}{n - m - 1}\right] + (m+1)\,\ln n$$
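A sketch of BIC-based order selection under the formula above: fit AR(m) for m = 1, ..., max_order by Yule-Walker, estimate the residual variance from $s_\varepsilon^2(m) = \sigma_x^2(1 - \sum_k a_k r_k)$ (the relation given earlier), and look for the minimum. Names and the exact variance estimator are illustrative choices.

import numpy as np

def bic_order_selection(x, max_order):
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.sum(x * x) / n
    r = np.array([np.sum(x[: n - k] * x[k:]) / ((n - k) * var) for k in range(max_order + 1)])
    bic = {}
    for m in range(1, max_order + 1):
        R = np.array([[r[abs(i - j)] for j in range(m)] for i in range(m)])
        a = np.linalg.solve(R, r[1 : m + 1])                 # Yule-Walker AR(m) coefficients
        s2 = var * (1.0 - np.dot(a, r[1 : m + 1]))           # residual (white-noise) variance
        bic[m] = n * np.log(n * s2 / (n - m - 1)) + (m + 1) * np.log(n)
    return bic       # smallest value suggests the preferred order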
ENSO

[Figure: the Multivariate ENSO Index (MEI) time series.]
AR(1) Fit
Autocorrelation Function

[Figure: Dec-Mar Nino3 index and its sample autocorrelation function compared with the theoretical autocorrelation $r_k = r^k$ of the fitted AR(1) model.]
AR(2) Fit
Autocorrelation Function

[Figure: Dec-Mar Nino3 sample autocorrelation function compared with the theoretical autocorrelation of the fitted AR(2) model.]

$$r_1 = \frac{a_1}{1 - a_2}, \qquad r_2 = a_2 + \frac{a_1^2}{1 - a_2}, \qquad r_m = \sum_{k=1}^{K} a_k\,r_{m-k} \ \ (m > 2)$$
AR(3) Fit
Autocorrelation Function

[Figure: Dec-Mar Nino3 sample autocorrelation function compared with the theoretical AR(3) fit.]
AR(K) Fit
Autocorrelation Function

[Figure: Dec-Mar Nino3 sample autocorrelation function compared with theoretical AR(K) fits.]

Minimum in BIC? Minimum in AIC? Favors an AR(K) fit for which K?
MA Model

$$x_{t+1} = \mu + \sum_{k=1}^{K} a_k\,(x_{t-k+1} - \mu) + \varepsilon_{t+1} + \sum_{m=1}^{M} b_m\,\varepsilon_{t-m+1}$$

Now, consider the case K = 0 (and, as before, zero mean):

$$x_{t+1} = \varepsilon_{t+1} + \sum_{m=1}^{M} b_m\,\varepsilon_{t-m+1}$$

For the MA(1) model (M = 1),

$$x_{t+1} = \varepsilon_{t+1} + b_1\,\varepsilon_t$$

$$\sigma_x^2 = (1 + b_1^2)\,\sigma_\varepsilon^2, \qquad r_1 = \frac{b_1}{1 + b_1^2}$$

For the ARMA(1,1) model,

$$x_{t+1} = a_1\,(x_t - \mu) + \varepsilon_{t+1} + b_1\,\varepsilon_t$$

$$\sigma_x^2 = \sigma_\varepsilon^2\,\frac{1 + b_1^2 + 2\,a_1 b_1}{1 - a_1^2}, \qquad r_1 = \frac{(1 + a_1 b_1)(a_1 + b_1)}{1 + b_1^2 + 2\,a_1 b_1}$$
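A quick numerical check of the MA(1) lag-1 correlation (an illustrative sketch; the long series is only to reduce sampling noise):

import numpy as np

rng = np.random.default_rng(6)
b1, n = 0.6, 200_000
eps = rng.normal(size=n)
x = eps[1:] + b1 * eps[:-1]                 # MA(1): x_{t+1} = eps_{t+1} + b1*eps_t

xm = x - x.mean()
r1_sample = np.sum(xm[:-1] * xm[1:]) / ((len(x) - 1) * np.var(x))
print(r1_sample)                            # sample lag-1 autocorrelation
print(b1 / (1 + b1**2))                     # theoretical r_1 = b1 / (1 + b1^2), about 0.44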