
Chapter 3

ARIMA Models
3.1 Some Preliminaries

Univariate time series analysis is a way to introduce the tools necessary for analyzing more complicated models. In this section we will review what stationarity means, how to test whether a time series is stationary, and the methods and operations pertinent to our discussions. These ideas will be essential in discussing models that characterize a stochastic process.

The core of this chapter is the Box-Jenkins approach (named after George Box and Gwilym Jenkins) to modeling time series. This philosophy is centered on finding the best-fitting model of a time series based on its past values.

3.1.1 Stationarity

There are two main considerations when using the Box-Jenkins approach in modeling time series:
1. Is the time series stationary?
2. Is there significant seasonality that should be addressed?

(Strict Stationarity) Recall that there are two types of stationarity: strict stationarity and weak stationarity. A stochastic process $\{Y_t\}$ is strictly stationary if for all $k$, $h$ and $(t_1, t_2, \ldots, t_k)$,

$$(Y_{t_1}, Y_{t_2}, \ldots, Y_{t_k}) \stackrel{d}{=} (Y_{t_1+h}, Y_{t_2+h}, \ldots, Y_{t_k+h})$$

where $\stackrel{d}{=}$ denotes equality in distribution. This type of stationarity is very restrictive and too difficult to verify.

(Weak Stationarity) On the other hand, the process $\{Y_t\}$ is weakly stationary if, for all $t$ and $k$,

$$E(Y_t) = \mu$$
$$\mathrm{Var}(Y_t) = \sigma^2$$
$$\mathrm{Cov}(Y_t, Y_{t-k}) = \gamma(k)$$

The three conditions mean that a (weakly) stationary time series has a constant mean and constant variance, with a covariance structure that does not depend on time $t$ but only on $k$, the lag or time difference between $Y_t$ and $Y_{t+k}$.
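As a minimal sketch of testing for stationarity in practice (Python with statsmodels is an assumed tooling choice, not prescribed by the text), an augmented Dickey-Fuller test can flag a unit root:

import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
rw = np.cumsum(rng.normal(size=500))      # random walk: not stationary
d_rw = np.diff(rw)                        # first difference: stationary

for name, series in [("level", rw), ("difference", d_rw)]:
    stat, pvalue = adfuller(series)[:2]   # ADF statistic and p-value
    print(f"{name}: ADF = {stat:.2f}, p = {pvalue:.3f}")
# A small p-value rejects the unit-root null, supporting stationarity.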

3.1.2 Autocovariance, Autocorrelation and Partial Autocorrelation

(Autocovariance and Autocorrelation Functions) Let $\{Y_t\}$ be a stationary process. We denote the following functions:

i. Autocovariance function:
$$\mathrm{Cov}(Y_t, Y_{t-k}) = \gamma(k) = \gamma_k$$
ii. Autocorrelation function (ACF):
$$\mathrm{Corr}(Y_t, Y_{t-k}) = \frac{\mathrm{Cov}(Y_t, Y_{t-k})}{\sqrt{\mathrm{Var}(Y_t)\,\mathrm{Var}(Y_{t-k})}} = \frac{\mathrm{Cov}(Y_t, Y_{t-k})}{\mathrm{Var}(Y_t)} = \frac{\gamma_k}{\gamma_0} = \rho_k$$

For stationary processes, we expect both $\gamma_k$ and $\rho_k$ to taper off to zero very rapidly. This indicates a property known as the short-memory behavior of the time series. A time series has short-memory behavior if $\sum_{k=0}^{\infty} |\rho_k| < \infty$. Otherwise, it is said to have long memory.

An alternative measure is what we call the partial autocorrelation function, which is a conditional correlation:

iii. Partial autocorrelation function (PACF):

$$\phi_{kk} = \mathrm{Corr}(Y_t, Y_{t-k} \mid Y_{t-1}, Y_{t-2}, \ldots, Y_{t-k+1}).$$

The PACF is the correlation between $Y_t$ and $Y_{t-k}$ after the mutual linear dependency of the intervening variables $Y_{t-1}$ up to $Y_{t-k+1}$ has been removed.

(Example) A stochastic process $\{\varepsilon_t\}$ is called a white noise process if, for all $t$ and nonzero $k$,

i. $E(\varepsilon_t) = 0$
ii. $\mathrm{Var}(\varepsilon_t) = \sigma^2$
iii. $\mathrm{Cov}(\varepsilon_t, \varepsilon_{t-k}) = \gamma_k = 0$

In terms of $\gamma_k$ and $\rho_k$,

$$\gamma_k = \begin{cases} \sigma^2 & k = 0 \\ 0 & \text{otherwise} \end{cases} \qquad \text{and} \qquad \rho_k = \begin{cases} 1 & k = 0 \\ 0 & \text{otherwise.} \end{cases}$$

The white noise process hardly occurs in real-life time series but plays an important role in constructing time series models. Also, a white noise process is called Gaussian if its joint distribution is normal.
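As a quick illustration (a sketch in Python with numpy, an assumed choice), a simulated Gaussian white noise series shows sample autocorrelations near zero at every nonzero lag, anticipating the sample ACF defined in the next subsection:

import numpy as np

rng = np.random.default_rng(42)
eps = rng.normal(loc=0.0, scale=1.0, size=2000)   # Gaussian white noise

yc = eps - eps.mean()
gamma0 = np.mean(yc * yc)                         # sample variance
for k in (1, 2, 3, 4, 5):
    rho_k = np.sum(yc[:-k] * yc[k:]) / len(yc) / gamma0
    print(k, round(rho_k, 3))                     # all close to 0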

(Sample Autocorrelation Function) The ACF and PACF will be vital in determining which univariate model appropriately characterizes a stationary random process. Both are unknown and have to be estimated from the realized time series. Define a time series $\{y_t\}$, $t = 1, 2, \ldots, T$. Let

$$\bar{Y} = \frac{\sum_{t=1}^{T} y_t}{T} \qquad \text{and} \qquad \hat{\gamma}_0 = \frac{\sum_{t=1}^{T} \left(y_t - \bar{Y}\right)^2}{T}$$

be the sample mean and sample variance, respectively. The sample autocovariance, sample autocorrelation, and sample partial autocorrelation functions are defined as

i. Sample autocovariance function:
$$\hat{\gamma}_k = \frac{\sum_{t=1}^{T-k} \left(y_t - \bar{Y}\right)\left(y_{t+k} - \bar{Y}\right)}{T}$$
ii. Sample autocorrelation function:
$$\hat{\rho}_k = \frac{\hat{\gamma}_k}{\hat{\gamma}_0} = \frac{\sum_{t=1}^{T-k} \left(y_t - \bar{Y}\right)\left(y_{t+k} - \bar{Y}\right)}{\sum_{t=1}^{T} \left(y_t - \bar{Y}\right)^2}$$
iii. Sample partial autocorrelation function (derivation is omitted):
$$\hat{\phi}_{11} = \hat{\rho}_1, \qquad \hat{\phi}_{kk} = \frac{\hat{\rho}_k - \sum_{j=1}^{k-1} \hat{\phi}_{k-1,j}\,\hat{\rho}_{k-j}}{1 - \sum_{j=1}^{k-1} \hat{\phi}_{k-1,j}\,\hat{\rho}_j}$$

where $\hat{\phi}_{kj} = \hat{\phi}_{k-1,j} - \hat{\phi}_{kk}\,\hat{\phi}_{k-1,k-j}$, $j = 1, 2, \ldots, k-1$.
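A direct translation of these estimators into code (a sketch in Python/numpy; the helper names acf and pacf are my own, hypothetical choices):

import numpy as np

def acf(y, max_lag):
    # rho_hat_k = gamma_hat_k / gamma_hat_0 for k = 0, ..., max_lag
    y = np.asarray(y, dtype=float) - np.mean(y)
    gamma0 = np.sum(y * y) / len(y)
    rho = [1.0]
    for k in range(1, max_lag + 1):
        rho.append(np.sum(y[:-k] * y[k:]) / len(y) / gamma0)
    return np.array(rho)

def pacf(y, max_lag):
    # phi_hat_kk via the Durbin-Levinson recursion defined above
    rho = acf(y, max_lag)
    phi = np.zeros((max_lag + 1, max_lag + 1))
    phi[1, 1] = rho[1]                          # phi_11 = rho_1
    for k in range(2, max_lag + 1):
        prev = phi[k - 1, 1:k]                  # phi_{k-1,1}, ..., phi_{k-1,k-1}
        num = rho[k] - np.sum(prev * rho[k - 1:0:-1])
        den = 1.0 - np.sum(prev * rho[1:k])
        phi[k, k] = num / den
        # phi_kj = phi_{k-1,j} - phi_kk * phi_{k-1,k-j}, j = 1, ..., k-1
        phi[k, 1:k] = prev - phi[k, k] * prev[::-1]
    return np.array([phi[k, k] for k in range(1, max_lag + 1)])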

[Figures: line plots of CPI (1994–2016), its first difference D(CPI) (1994–2016), and quarterly average rainfall AVE_RAIN (2009 Q1–2013 Q3).]

3.2 Models of Stationary Processes

(Linear process) A stochastic process $\{Y_t\}$ is a linear process if it can be represented in the form

$$Y_t = \mu + \sum_{r=0}^{\infty} c_r \varepsilon_{t-r}.$$

In the expression above, $\mu$ is the common mean, $\{c_r\}$ is a sequence of constants and $\{\varepsilon_t\}$ is white noise.

We will be discussing three linear processes: the autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) processes. All of these are special cases of the linear process.
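To make the connection concrete, here is a brief worked note (not from the text) showing how the first two map onto the linear-process form:

$$\text{MA(1):}\quad Y_t = \mu + \varepsilon_t + \theta\varepsilon_{t-1} \;\Rightarrow\; c_0 = 1,\ c_1 = \theta,\ c_r = 0 \text{ for } r \ge 2;$$
$$\text{AR(1):}\quad Y_t = \varepsilon_t + \phi\varepsilon_{t-1} + \phi^2\varepsilon_{t-2} + \cdots \;\Rightarrow\; c_r = \phi^r, \text{ summable when } |\phi| < 1,$$

where the AR(1) expansion is derived in Section 3.2.1 below.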

3.2.1 Autoregressive Process

If a stochastic process follows an autoregressive process, it is assumed that the current value of the time series depends on its previous value, plus some error. The (linear) autoregressive process is defined as

$$Y_t = \delta + \phi Y_{t-1} + \varepsilon_t,$$

where $\varepsilon_t$ is a white noise process. This model is autoregressive since $Y_t$ is regressed on itself. In particular, this is an autoregressive process of order 1, or an AR(1) process. At first glance the AR(1) process does not look like a linear process, but in fact it is (take $\delta = 0$ for simplicity):

$$\begin{aligned}
Y_t &= \phi Y_{t-1} + \varepsilon_t \\
&= \varepsilon_t + \phi(\varepsilon_{t-1} + \phi Y_{t-2}) = \varepsilon_t + \phi\varepsilon_{t-1} + \phi^2 Y_{t-2} \\
&= \varepsilon_t + \phi\varepsilon_{t-1} + \phi^2(\varepsilon_{t-2} + \phi Y_{t-3}) = \varepsilon_t + \phi\varepsilon_{t-1} + \phi^2\varepsilon_{t-2} + \phi^3 Y_{t-3} \\
&\;\;\vdots \\
&= \varepsilon_t + \phi\varepsilon_{t-1} + \phi^2\varepsilon_{t-2} + \cdots + \phi^{k-1}\varepsilon_{t-(k-1)} + \phi^k Y_{t-k} \\
&= \varepsilon_t + \phi\varepsilon_{t-1} + \phi^2\varepsilon_{t-2} + \phi^3\varepsilon_{t-3} + \cdots
\end{aligned}$$

When is an AR(1) process stationary? It is clear that the mean is constant. Now, if the variance of $\varepsilon_t$ is $\sigma^2$, then

$$\mathrm{Var}(Y_t) = \sigma^2 + \phi^2\sigma^2 + \phi^4\sigma^2 + \cdots + \phi^{2(k-1)}\sigma^2 + \phi^{2k}\,\mathrm{Var}(Y_{t-k}).$$

The form above for the variance is the same for any $Y_{t+h}$, hence we have a constant variance. Also, we need the variance to exist: the right-hand side converges only if $|\phi| < 1$ (we omit the proof), in which case the geometric series gives $\mathrm{Var}(Y_t) = \sigma^2/(1 - \phi^2)$. Thus, $|\phi| < 1$ is a requirement for an AR(1) process to be stationary.

In general, we can define an AR(p) process:

$$Y_t = \delta + \sum_{i=1}^{p} \phi_i Y_{t-i} + \varepsilon_t.$$

We will define the stationarity condition for an AR(p) process later.
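A small simulation (a sketch; Python with numpy assumed) illustrates the AR(1) stationarity condition and the variance $\sigma^2/(1-\phi^2)$ derived above:

import numpy as np

rng = np.random.default_rng(7)
phi, sigma, T = 0.8, 1.0, 100_000           # |phi| < 1, so stationary
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.normal(scale=sigma)

print(y.var())                               # sample variance
print(sigma**2 / (1 - phi**2))               # theoretical variance, ~2.78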

3.2.2 Moving Average Process

The random process can also be modeled by a moving average process, where the current value of the series is a weighted sum of past white noise terms (hence the term moving average):

$$Y_t = \mu + \varepsilon_t + \theta\varepsilon_{t-1}.$$

The process above is a moving average process of order 1, or an MA(1) process. We can think of the past white noise as an innovation or shock: new, stochastically uncorrelated information or "news" shows up at each time step and, combined with other innovations, produces the observable series $Y_t$.

In general, we have an MA(q) process:

$$Y_t = \mu + \varepsilon_t + \sum_{j=1}^{q} \theta_j \varepsilon_{t-j}.$$

The moving average process is a straightforward linear process. Moreover, it is easy to see that a moving average process is covariance stationary: it has a constant mean and a constant (and finite) variance, $\mathrm{Var}(Y_t) = \sigma^2(1 + \theta_1^2 + \cdots + \theta_q^2)$, for any values of the $\theta_j$.
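As a sketch (Python with numpy assumed), a simulated MA(1) series has sample ACF near $\theta/(1+\theta^2)$ at lag 1 and near zero beyond — the lag-1 value is a standard result not derived in the text:

import numpy as np

rng = np.random.default_rng(1)
theta, T = 0.6, 100_000
eps = rng.normal(size=T + 1)
y = eps[1:] + theta * eps[:-1]          # Y_t = eps_t + theta * eps_{t-1}

yc = y - y.mean()
gamma0 = np.mean(yc * yc)
for k in (1, 2, 3):                     # sample ACF cuts off after lag 1
    print(k, np.sum(yc[:-k] * yc[k:]) / len(yc) / gamma0)
print("theory at lag 1:", theta / (1 + theta**2))   # 0.6/1.36 ~ 0.441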
