
Chapter 1

Basics

1.1 Definition

A time series (or stochastic process) is a function $X(t, \omega)$ such that for each fixed $t$, $X(t, \omega)$ is a random variable [denoted by $X_t(\omega)$]. For a fixed $\omega$, $X(t, \omega)$ is simply a function of $t$ and is called a realization of the stochastic process.

\[
\begin{cases} \text{Time Domain Approach} \\ \text{Frequency Domain Approach} \end{cases}
\qquad\qquad
\begin{cases} \text{Continuous Time} \\ \text{Discrete Time} \end{cases}
\]


1.2 Characterization of a time series

A time series is characterized by the joint distribution function of any subset $X_{t_1}, \dots, X_{t_n}$, that is, $F_{X_{t_1}, \dots, X_{t_n}}(x_{t_1}, \dots, x_{t_n})$.

Using the joint distribution function, define
\[
\mu_t = E(X_t), \qquad \gamma_{t,s} = \mathrm{Cov}(X_t, X_s),
\]
provided that they exist. The first two moments, $\{\mu_t\}$ and $\{\gamma_{t,s}\}$, completely characterize a Gaussian process.

$F_{X_{t_1}, \dots, X_{t_n}}$ is, in general, very difficult to analyze. In particular, estimation of $\{\mu_t\}$ and $\{\gamma_{t,s}\}$ appears to be impossible unless a number of different realizations, i.e., repeated observations, are available.

1.3 Stationarity

Assume $F_{X_{t_1}, \dots, X_{t_n}}$ is invariant under a translation of the time indices, that is,
\[
F_{X_{t_1+h}, \dots, X_{t_n+h}}(x_{t_1}, \dots, x_{t_n}) = F_{X_{t_1}, \dots, X_{t_n}}(x_{t_1}, \dots, x_{t_n}) \tag{1}
\]
for all sets of indices $(t_1, \dots, t_n)$ and all $h$. This is called strict stationarity.


Under this assumption, the joint distribution function depends only on the distances between the elements in the index set.

If $\{X_t\}$ is strictly stationary and $E X_t^2 < \infty$, then
\[
E(X_t) = \mu, \quad \forall t \tag{2}
\]
\[
\mathrm{Cov}(X_t, X_s) = \gamma_{|t-s|}, \quad \forall t, s \tag{3}
\]
$\{X_t\}$ is said to be weakly stationary (or "covariance stationary" or simply "stationary") if (2) and (3) hold.

REMARKS

1) (Weak) stationarity does not imply strict stationarity. Nor does strict stationarity imply (weak) stationarity: a strictly stationary process may not possess finite moments (e.g., Cauchy).

2) For a Gaussian process, weak and strict stationarity are equivalent.


$\mu$ and $\gamma_{|t-s|}$ can be estimated by
\[
\hat{\mu} = \bar{X} = \frac{1}{T} \sum_{t=1}^{T} X_t
\]
\[
\hat{\gamma}_h = c_h = \frac{1}{T} \sum_{t=h+1}^{T} (X_t - \bar{X})(X_{t-h} - \bar{X}), \qquad h = 0, 1, \dots
\]

If the process is also ergodic (average asymptotic independence), $\bar{X}$ and $c_h$ are consistent estimators of $\mu$ and $\gamma_h$.
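As an illustration, here is a minimal NumPy sketch of these two estimators (the helper names are ours, not part of the notes):

```python
import numpy as np

def sample_mean(x):
    """X-bar: the sample mean of the series."""
    return x.mean()

def sample_autocov(x, h):
    """c_h = (1/T) * sum_{t=h+1}^{T} (X_t - Xbar)(X_{t-h} - Xbar).

    Note the divisor is T (not T - h), matching the text."""
    T = len(x)
    xbar = x.mean()
    return np.sum((x[h:] - xbar) * (x[:T - h] - xbar)) / T
```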

1.4 Autocovariance and Autocorrelation Functions

The sequence $\{\gamma_h\}$, viewed as a function of $h$, is called the autocovariance function. The autocorrelation function is defined by
\[
\rho_h = \frac{\gamma_h}{\gamma_0};
\]
note $\rho_0 = 1$.

Example 1. (White noise process)

$\{X_t\}$, $X_t \sim \text{iid}(0, \sigma^2)$, $0 < \sigma^2 < \infty$. Then
\[
\mu = 0, \qquad \gamma_h = \begin{cases} \sigma^2 & \text{if } h = 0, \\ 0 & \text{otherwise.} \end{cases}
\]


Example 2. (MA(1) process)

Let $\{\varepsilon_t\}$ be a white noise process with finite variance $\sigma^2$. Let
\[
X_t = \varepsilon_t + \theta \varepsilon_{t-1}.
\]
Then
\[
\mu = E(X_t) = 0
\]
\[
\gamma_0 = E(\varepsilon_t + \theta \varepsilon_{t-1})^2 = E(\varepsilon_t^2) + \theta^2 E(\varepsilon_{t-1}^2) = (1 + \theta^2)\sigma^2
\]
\[
\gamma_h = E[(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-h} + \theta \varepsilon_{t-h-1})] = \begin{cases} \theta \sigma^2 & \text{if } |h| = 1, \\ 0 & \text{if } |h| \ge 2. \end{cases}
\]

Therefore,
\[
\rho_h = \begin{cases} 1 & \text{if } h = 0, \\ \dfrac{\theta}{1 + \theta^2} & \text{if } |h| = 1, \\ 0 & \text{otherwise.} \end{cases}
\]


Suppose $\theta = 0.6$. Then $\rho_1 = 0.6/(1 + 0.6^2) \approx 0.44$.

[Figure: plot of $\rho_h$ against $h = 1, \dots, 5$; $\rho_1 \approx 0.44$ and $\rho_h = 0$ for $h \ge 2$.]

$\rho_h = \rho(h)$: the autocorrelation function.

Sample mean of $\{X_t\}_{t=1}^{T}$: $\bar{X}_T$
Sample variance: $c_0$ $(= \hat{\gamma}_0)$
Sample autocovariance: $c_h$, $h = 1, 2, \dots$ $(= \hat{\gamma}_h)$
Sample autocorrelation: $r_h$ (or $\hat{\rho}_h$) $= c_h / c_0$, $h = 1, 2, \dots$

Note $r_0 = 1$.


A plot of $r_h$ against $h = 0, 1, \dots$ is called the correlogram.

[Figure: a sample correlogram, $r_h$ plotted against $h = 1, \dots, 5$.]

The sample autocorrelations are estimates of the corresponding theoretical autocorrelations and are therefore subject to sampling error.
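A short simulation sketch of this sampling error, using Example 2's MA(1) and the hypothetical `sample_autocov` helper sketched in Section 1.3:

```python
import numpy as np

rng = np.random.default_rng(0)
T, theta = 1000, 0.6

# Simulate an MA(1): X_t = eps_t + theta * eps_{t-1}
eps = rng.standard_normal(T + 1)
x = eps[1:] + theta * eps[:-1]

c0 = sample_autocov(x, 0)
r1 = sample_autocov(x, 1) / c0
print(r1)                      # close to, but not exactly, 0.44
print(theta / (1 + theta**2))  # theoretical rho_1 = 0.4412...
```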

1.5 Lag Operator

The operator $L$ (sometimes denoted by $B$) is defined by
\[
L X_t = X_{t-1}.
\]


Formally, $L$ operates on the whole sequence:
\[
L : \begin{pmatrix} \vdots \\ X_{t-1} \\ X_t \\ X_{t+1} \\ \vdots \end{pmatrix} \longrightarrow \begin{pmatrix} \vdots \\ X_{t-2} \\ X_{t-1} \\ X_t \\ \vdots \end{pmatrix}
\]
The lead operator $L^{-1}$ (sometimes denoted by $F$) is defined by
\[
L^{-1} X_t = X_{t+1}.
\]
Successive application of $L$ yields $L^h X_t = X_{t-h}$, $h = 1, 2, \dots$ We define $L^0 X_t = X_t$.

The lag operator is a linear operator:
\[
L(c X_t) = c \, L X_t
\]
\[
L(X_t + Y_t) = L X_t + L Y_t
\]
and can be manipulated like a usual algebraic quantity.

For example, suppose
\[
y_t = \phi y_{t-1} + \varepsilon_t, \quad \text{and} \quad |\phi| < 1.
\]
Then $(1 - \phi L) y_t = \varepsilon_t$, so
\[
y_t = \frac{\varepsilon_t}{1 - \phi L} = \sum_{j=0}^{\infty} (\phi L)^j \varepsilon_t.
\]

$\Delta = 1 - L$ is called the difference operator.
\[
\Delta^2 y_t = (1 - L)^2 y_t = (1 - 2L + L^2) y_t = y_t - 2 y_{t-1} + y_{t-2}
\]
Equivalently,
\[
\Delta^2 y_t = \Delta(\Delta y_t) = \Delta(y_t - y_{t-1}) = (y_t - y_{t-1}) - (y_{t-1} - y_{t-2}) = y_t - 2 y_{t-1} + y_{t-2}.
\]
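A quick numerical check of the two routes to $\Delta^2 y_t$ (a sketch; `np.diff` plays the role of $\Delta$ on an array):

```python
import numpy as np

y = np.array([3.0, 5.0, 4.0, 8.0, 7.0])

# Delta applied twice: Delta(Delta y)
d2_twice = np.diff(np.diff(y))

# (1 - 2L + L^2) y_t = y_t - 2 y_{t-1} + y_{t-2}
d2_direct = y[2:] - 2 * y[1:-1] + y[:-2]

assert np.allclose(d2_twice, d2_direct)
```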

1.6 General Linear Process

\[
y_t = \varepsilon_t + \psi_1 \varepsilon_{t-1} + \psi_2 \varepsilon_{t-2} + \cdots = \psi(L) \varepsilon_t \quad \longleftarrow \text{linear process}
\]
where $\varepsilon_t \sim \text{iid}(0, \sigma^2)$ and
\[
\psi(L) = 1 + \psi_1 L + \psi_2 L^2 + \cdots
\]
The polynomial in lag operators $\psi(L)$ is sometimes called the transfer function.


A time series $\{y_t\}$ can be viewed as the result of applying a backward (linear) filter to a white noise process:
\[
\varepsilon_t \ \xrightarrow{\ \text{input}\ } \ \text{linear filter } \psi(L) \ \xrightarrow{\ \text{output}\ } \ y_t = \psi(L)\varepsilon_t
\]

The sequence $\{\psi_j : j = 0, 1, \dots\}$ can be finite or infinite. If it is finite of order $q$, we obtain an MA($q$) process. This is clearly a stationary process. If $\{\psi_j\}$ is infinite, we usually assume it is absolutely summable, i.e.,
\[
\sum_{j=0}^{\infty} |\psi_j| < \infty.
\]
Then the resulting process is stationary.
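Under absolute summability, a truncated version of the filter can be simulated directly. A minimal sketch (the weights $\psi_j = 0.8^j$ and the truncation point $J$ are our choices, not the notes'):

```python
import numpy as np

rng = np.random.default_rng(1)
T, J = 500, 50

psi = 0.8 ** np.arange(J)          # example weights psi_j = 0.8^j (absolutely summable)
eps = rng.standard_normal(T + J)   # white noise input

# y_t = sum_j psi_j * eps_{t-j}: a backward (moving-average) filter
y = np.convolve(eps, psi, mode="full")[J:T + J]
```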

To see this,
\[
\mu = 0 \cdot \sum_{j=0}^{\infty} \psi_j = 0
\]
\[
\gamma_0 = \sigma^2 \sum_{j=0}^{\infty} \psi_j^2 \le \sigma^2 \Big( \sum_{j=0}^{\infty} |\psi_j| \Big)^2 < \infty
\]
\[
\gamma_h = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+h} \le \sigma^2 \sum_{j=0}^{\infty} |\psi_j| |\psi_{j+h}| \le \sigma^2 \Big( \sum_{j=0}^{\infty} |\psi_j| \Big)^2 < \infty
\]


The stationarity condition is embodied in the requirement that $\psi(z)$ converge for $|z| \le 1$, i.e., for $z$ on or within the unit circle. Note that absolute summability of $\{\psi_j\}$ is sufficient but not necessary for stationarity:
\[
\because \quad \Big| \sum_{j=0}^{\infty} \psi_j z^j \Big| \le \sum_{j=0}^{\infty} |\psi_j| < \infty \quad \text{for } |z| \le 1.
\]

1.7 Autoregressive Process

The process $y_t$ defined by
\[
y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t \tag{$*$}
\]
is called a $p$-th order autoregressive process and is denoted by $y_t \sim \text{AR}(p)$. The equation ($*$) is sometimes called a stochastic difference equation.


1.7.1 First-Order Autoregressive Process

AR(1) is given by
\[
y_t = \phi y_{t-1} + \varepsilon_t.
\]
By successive substitution,
\[
\begin{aligned}
y_t &= \varepsilon_t + \phi y_{t-1} \\
&= \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 y_{t-2} \\
&= \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \phi^3 y_{t-3} \\
&= \sum_{j=0}^{J-1} \phi^j \varepsilon_{t-j} + \phi^J y_{t-J},
\end{aligned}
\]
implying $E(y_t \mid y_{t-J}) = \phi^J y_{t-J}$. If $|\phi| \ge 1$, the value of $y_{t-J}$ can affect the prediction of future $y_t$, no matter how far ahead.

If $|\phi| < 1$,
\[
y_t = \lim_{J \to \infty} \sum_{j=0}^{J-1} \phi^j \varepsilon_{t-j} + \lim_{J \to \infty} \phi^J y_{t-J} = \sum_{j=0}^{\infty} \phi^j \varepsilon_{t-j}.
\]
Note that
\[
\sum_{j=0}^{\infty} |\phi^j| = \sum_{j=0}^{\infty} |\phi|^j = \frac{1}{1 - |\phi|} < \infty, \quad \text{if } |\phi| < 1,
\]
so $\{\phi^j\}$ is absolutely summable and $\{y_t\}$ is a linear process:
\[
\psi(L) = (1 - \phi L)^{-1} = \sum_{j=0}^{\infty} \phi^j L^j, \quad \text{or} \quad \psi_j = \phi^j.
\]

Now,
\[
E(y_t^2) = E(\phi^2 y_{t-1}^2 + \varepsilon_t^2 + 2\phi y_{t-1}\varepsilon_t)
\]
\[
\gamma_0 = \phi^2 \gamma_0 + \sigma^2,
\]
because
\[
E(y_{t-1}\varepsilon_t) = E[E(y_{t-1}\varepsilon_t \mid y_{t-1})] = E[y_{t-1} E(\varepsilon_t \mid y_{t-1})] = E(y_{t-1} \cdot 0) = 0.
\]
Therefore,
\[
\gamma_0 = \frac{\sigma^2}{1 - \phi^2} < \infty, \quad \text{if } |\phi| < 1.
\]
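A simulation sketch checking $\gamma_0 = \sigma^2/(1 - \phi^2)$ (the parameter values and burn-in are our choices):

```python
import numpy as np

rng = np.random.default_rng(2)
T, phi, sigma = 100_000, 0.6, 1.0

# Simulate AR(1): y_t = phi * y_{t-1} + eps_t
y = np.zeros(T)
eps = sigma * rng.standard_normal(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

print(y[1000:].var())           # sample gamma_0 (after burn-in), approx 1.5625
print(sigma**2 / (1 - phi**2))  # theoretical gamma_0 = 1.5625
```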

1.7.2 Second-Order Autoregressive Process

AR(2) is given by
\[
y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \varepsilon_t
\]
or
\[
\phi(L) y_t = \varepsilon_t, \quad \text{where } \phi(L) = 1 - \phi_1 L - \phi_2 L^2.
\]


Now, suppose $\phi(z)$ can be written as
\[
\phi(z) = (1 - \lambda_1 z)(1 - \lambda_2 z).
\]
Then
\[
\psi(z) = \phi^{-1}(z) = \frac{1}{(1 - \lambda_1 z)(1 - \lambda_2 z)} = \frac{K_1}{1 - \lambda_1 z} + \frac{K_2}{1 - \lambda_2 z},
\]
where
\[
K_1 = \frac{\lambda_1}{\lambda_1 - \lambda_2} \quad \text{and} \quad K_2 = \frac{-\lambda_2}{\lambda_1 - \lambda_2}.
\]
Therefore, $\psi(z)$ converges for $|z| \le 1$ iff $|\lambda_1| < 1$ and $|\lambda_2| < 1$. In other words, for the AR(2) process to be stationary, the roots of $\phi(z) = 1 - \phi_1 z - \phi_2 z^2$ must lie outside the unit circle.

Note that $\phi(0) = 1 > 0$. Let $m_1$ and $m_2$ be the roots. The necessary and sufficient conditions for $|m_1| > 1$ and $|m_2| > 1$ are
\[
|m_1 m_2| = \Big| \frac{1}{\phi_2} \Big| > 1 \iff |\phi_2| < 1
\]
\[
\phi(1) = 1 - \phi_1 - \phi_2 > 0
\]
\[
\phi(-1) = 1 + \phi_1 - \phi_2 > 0
\]

[Figure: graph of $y = 1 - \phi_1 z - \phi_2 z^2$ against $z$, with both roots lying outside the unit interval.]

To have real roots, $\phi_1^2 + 4\phi_2 \ge 0$.

[Figure: the stationarity triangle in the $(\phi_1, \phi_2)$ plane, $-2 < \phi_1 < 2$, $-1 < \phi_2 < 1$, split into real-root and complex-root regions by the curve $\phi_1^2 + 4\phi_2 = 0$.]
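A sketch of the root check using `np.roots` (the helper name is ours; NumPy expects coefficients ordered from the highest degree down):

```python
import numpy as np

def is_stationary_ar2(phi1, phi2):
    """AR(2) stationarity: all roots of 1 - phi1*z - phi2*z^2 outside the unit circle."""
    roots = np.roots([-phi2, -phi1, 1.0])  # coefficients of -phi2*z^2 - phi1*z + 1
    return np.all(np.abs(roots) > 1.0)

print(is_stationary_ar2(0.5, 0.3))   # True: inside the triangle
print(is_stationary_ar2(0.5, 0.6))   # False: phi(1) = 1 - 0.5 - 0.6 < 0
```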


1.7.3 p-th Order Autoregressive Process

AR($p$) is given by
\[
\phi(L) y_t = \varepsilon_t, \quad \text{where } \phi(L) = 1 - \phi_1 L - \cdots - \phi_p L^p.
\]
This process is stationary if all characteristic roots of $\phi(z) = 0$ lie outside the unit circle.

1.8 Autocovariance and Autocorrelation Functions

1.8.1 AR(1) Process

\[
y_t = \phi y_{t-1} + \varepsilon_t
\]
Then $E(y_{t-1} y_t) = E(\phi y_{t-1}^2 + \varepsilon_t y_{t-1}) = \phi E(y_{t-1}^2)$, so that
\[
\gamma_1 = \phi \gamma_0, \qquad \rho_1 = \phi \rho_0 = \phi.
\]


Similarly, noticing that $E(y_{t-h} y_t) = E[\phi y_{t-h} y_{t-1} + y_{t-h} \varepsilon_t] = \phi \gamma_{h-1}$, we have
\[
\gamma_h = \phi \gamma_{h-1} = \phi^h \gamma_0, \qquad \rho_h = \phi^h.
\]

[Figure: $\rho_h = \phi^h$ for $h = 1, \dots, 5$. For $\phi = 0.6$: exponential decay. For $\phi = -0.6$: damped oscillation, alternating in sign.]

1.8.2 AR(2) Process

\[
y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \varepsilon_t
\]
Then
\[
E(y_t y_{t-h}) = \phi_1 E(y_{t-1} y_{t-h}) + \phi_2 E(y_{t-2} y_{t-h}) + E(\varepsilon_t y_{t-h}) \tag{1}
\]
\[
\gamma_h = \phi_1 \gamma_{h-1} + \phi_2 \gamma_{h-2}, \qquad h = 1, 2, \dots
\]
\[
\rho_h = \phi_1 \rho_{h-1} + \phi_2 \rho_{h-2}, \qquad h = 1, 2, \dots \tag{2}
\]


Setting $h = 1$ in (2) yields
\[
\rho_1 = \phi_1 \rho_0 + \phi_2 \rho_1 = \frac{\phi_1}{1 - \phi_2}, \qquad
\rho_2 = \phi_1 \rho_1 + \phi_2 = \frac{\phi_1^2}{1 - \phi_2} + \phi_2.
\]
Setting $h = 0$ in (1) yields
\[
\gamma_0 = \phi_1 \gamma_1 + \phi_2 \gamma_2 + \sigma^2
\]
\[
\gamma_0 (1 - \phi_1 \rho_1 - \phi_2 \rho_2) = \sigma^2
\]
\[
\gamma_0 \Big( 1 - \frac{\phi_1^2}{1 - \phi_2} - \frac{\phi_1^2 \phi_2}{1 - \phi_2} - \phi_2^2 \Big) = \sigma^2
\]
\[
\gamma_0 \, \frac{1 + \phi_2}{1 - \phi_2} \big[ (1 - \phi_2)^2 - \phi_1^2 \big] = \sigma^2
\]
\[
\therefore \quad \gamma_0 = \Big( \frac{1 - \phi_2}{1 + \phi_2} \Big) \frac{\sigma^2}{(1 - \phi_2)^2 - \phi_1^2}.
\]
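A sketch cross-checking these AR(2) closed forms against the relation $\gamma_0 (1 - \phi_1\rho_1 - \phi_2\rho_2) = \sigma^2$ (the parameter values are ours):

```python
phi1, phi2, sigma2 = 0.5, 0.3, 1.0

# Closed forms from the text
rho1 = phi1 / (1 - phi2)
rho2 = phi1 * rho1 + phi2
gamma0 = ((1 - phi2) / (1 + phi2)) * sigma2 / ((1 - phi2)**2 - phi1**2)

# Should recover sigma^2 exactly
print(gamma0 * (1 - phi1 * rho1 - phi2 * rho2))  # 1.0
```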

1.8.3 AR(p) Process

\[
E(y_t y_{t-1}) = E(\phi_1 y_{t-1}^2 + \cdots + \phi_p y_{t-p} y_{t-1} + \varepsilon_t y_{t-1})
\]
\[
\gamma_1 = \phi_1 \gamma_0 + \cdots + \phi_p \gamma_{p-1}
\]
\[
\gamma_0 = \phi_1 \gamma_1 + \cdots + \phi_p \gamma_p + \sigma^2.
\]
If $h \ge p$,
\[
\gamma_h = \phi_1 \gamma_{h-1} + \cdots + \phi_p \gamma_{h-p}
\]
\[
\rho_h = \phi_1 \rho_{h-1} + \cdots + \phi_p \rho_{h-p}
\]
Note that $\rho_h \to 0$ as $h \to \infty$ for an AR($p$) process.

1.9 Moving Average Process

The process $y_t$ defined by
\[
y_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}
\]
is called a $q$-th order moving average, or MA($q$), process. Write
\[
y_t = \theta(L) \varepsilon_t, \quad \text{where } \theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q.
\]
A finite MA process can be regarded as the output $y_t$ from a linear filter with transfer function $\theta(L)$ when the input is white noise (WN) $\varepsilon_t$. A finite MA process is therefore stationary.


The autocovariances are
\[
\gamma_0 = E y_t^2 = E(\varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q})^2 = \sigma^2 (1 + \theta_1^2 + \cdots + \theta_q^2) = \sigma^2 \sum_{j=0}^{q} \theta_j^2 \quad (\text{where } \theta_0 = 1),
\]
\[
\begin{aligned}
\gamma_h &= E(y_t y_{t-h}) \\
&= E[(\varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q})(\varepsilon_{t-h} + \theta_1 \varepsilon_{t-h-1} + \cdots + \theta_q \varepsilon_{t-h-q})] \\
&= \sigma^2 (\theta_h + \theta_1 \theta_{h+1} + \cdots + \theta_{q-h} \theta_q), \quad \text{for } h \le q \\
&= \sigma^2 \sum_{j=h}^{q} \theta_{j-h} \theta_j \quad \text{(convolution)},
\end{aligned}
\]
and $\gamma_h = 0$ for $h > q$. Note that there is a cut-off at $q$. This contrasts with the AR($p$) process, for which $\gamma_h \ne 0$ for any $h$. The $h$-th order autocorrelation is
\[
\rho_h = \frac{\theta_h + \theta_1 \theta_{h+1} + \cdots + \theta_{q-h} \theta_q}{1 + \theta_1^2 + \cdots + \theta_q^2}, \quad \text{for } h = 1, 2, \dots, q,
\]
and $\rho_h = 0$ for $h > q$. The autocorrelation function for an MA process has a cut-off at $h = q$.
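The convolution form translates directly into a short sketch (the helper name `ma_acf` is ours; $\theta_0 = 1$ is prepended as in the text):

```python
import numpy as np

def ma_acf(theta, sigma2=1.0):
    """Autocovariances gamma_0..gamma_q of MA(q): gamma_h = sigma^2 * sum_j theta_j * theta_{j+h}."""
    t = np.concatenate(([1.0], np.asarray(theta)))  # (theta_0, ..., theta_q)
    q = len(t) - 1
    gamma = sigma2 * np.array([t[: q + 1 - h] @ t[h:] for h in range(q + 1)])
    return gamma, gamma / gamma[0]  # (autocovariances, autocorrelations)

gamma, rho = ma_acf([0.6])
print(rho[1])  # MA(1) with theta = 0.6: rho_1 = 0.4412..., as in Example 2
```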

Write
\[
\Pi(L) y_t = \varepsilon_t, \qquad y_t = \theta(L) \varepsilon_t.
\]
If the MA process is invertible,
\[
\Pi(L) = \theta(L)^{-1}.
\]
The invertibility condition for an MA($q$) process is that all roots of the characteristic equation $\theta(z) = 0$ lie outside the unit circle.

Write

yt �q¹j�1p1� ωjLqεt,

If it is invertible,q¹j�1p1� ωjL� ω2

jL2 � � � � qyt � εt,

that is, the finite order MA process is transformed into the infinite order AR

process.

1.9.1 MA(1) Process

\[
y_t = \varepsilon_t + \theta \varepsilon_{t-1} = (1 + \theta L) \varepsilon_t.
\]
$\theta$ must lie in the range $(-1, 1)$ for the process to be invertible.
\[
\gamma_0 = (1 + \theta^2) \sigma^2, \qquad \gamma_1 = \theta \sigma^2, \qquad \gamma_h = 0 \ \text{for } h = 2, 3, \dots
\]


\[
\rho_1 = \frac{\theta}{1 + \theta^2}, \qquad \rho_h = 0 \ \text{for } h = 2, 3, \dots \tag{$*$}
\]
It follows from ($*$) that
\[
\theta^2 - \rho_1^{-1} \theta + 1 = 0,
\]
which implies that if $\theta = \bar{\theta}$ is a solution, then so is $\theta = \bar{\theta}^{-1}$. So, if $\theta = \bar{\theta}$ satisfies the invertibility condition, the other root $\theta = \bar{\theta}^{-1}$ will not satisfy the condition.
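A quick numerical sketch of the root pair (both roots reproduce the same $\rho_1$, but only one is invertible):

```python
import numpy as np

theta = 0.6
rho1 = theta / (1 + theta**2)

# theta^2 - (1/rho1)*theta + 1 = 0 has roots theta_bar and 1/theta_bar
roots = np.roots([1.0, -1.0 / rho1, 1.0])
print(roots)                            # approx [1.6667, 0.6]
print([r / (1 + r**2) for r in roots])  # both give rho_1 = 0.4412...
```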

1.10 Mixed Autoregressive-Moving Average Model

The process $y_t$ defined by
\[
y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q} \tag{1}
\]
is called a (mixed) autoregressive-moving-average process of order $(p, q)$, or ARMA($p, q$) process. This may be thought of as a $p$-th order autoregressive process, $\phi(L) y_t = e_t$, with $e_t$ following the $q$-th order moving average process $e_t = \theta(L) \varepsilon_t$; or, alternatively, as an MA($q$) process $y_t = \theta(L) \eta_t$, with $\eta_t$ following the AR($p$) process $\phi(L) \eta_t = \varepsilon_t$.


The ARMA($p, q$) process (1) is stationary if $\phi(z) = 0$ has all its roots lying outside the unit circle, and is invertible if all roots of $\theta(z) = 0$ lie outside the unit circle.

Multiplying throughout in (1) by $y_{t-h}$ and taking expectations, we see that the autocovariance function satisfies the deterministic difference equation
\[
\gamma_h = \phi_1 \gamma_{h-1} + \cdots + \phi_p \gamma_{h-p} + \gamma_{y\varepsilon}(h) + \theta_1 \gamma_{y\varepsilon}(h-1) + \cdots + \theta_q \gamma_{y\varepsilon}(h-q) \tag{2}
\]
where $\gamma_{y\varepsilon}(\cdot)$ is the cross covariance function between $y$ and $\varepsilon$, defined as $\gamma_{y\varepsilon}(h) = E(y_{t-h} \varepsilon_t)$. It is easy to see that
\[
\gamma_{y\varepsilon}(h) = 0 \ \text{for } h > 0, \qquad \gamma_{y\varepsilon}(h) \ne 0 \ \text{for } h \le 0.
\]

Hence,
\[
\gamma_h = \phi_1 \gamma_{h-1} + \cdots + \phi_p \gamma_{h-p}, \quad \text{for } h \ge q + 1,
\]
and
\[
\rho_h = \phi_1 \rho_{h-1} + \cdots + \phi_p \rho_{h-p}, \quad \text{for } h \ge q + 1,
\]
or
\[
\phi(L) \rho_h = 0, \quad \text{for } h \ge q + 1 \quad \longleftarrow \text{looks like AR}(p) \tag{3}
\]


Setting $h = 0$ in (2), we have
\[
\gamma_0 = \phi_1 \gamma_1 + \cdots + \phi_p \gamma_p + \sigma^2 + \theta_1 \gamma_{y\varepsilon}(-1) + \cdots + \theta_q \gamma_{y\varepsilon}(-q). \tag{4}
\]
From (3), we see that the autocorrelation function of the mixed process eventually takes the same shape as that of the AR process $\phi(L) y_t = \varepsilon_t$.

1.10.1 ARMA(1, 1) process

\[
y_t = \phi y_{t-1} + \varepsilon_t + \theta \varepsilon_{t-1} \tag{5}
\]
\[
(1 - \phi L) y_t = (1 + \theta L) \varepsilon_t
\]
The process is stationary if $|\phi| < 1$, and invertible if $|\theta| < 1$. Using (2) and (4), we obtain
\[
\gamma_0 = \phi \gamma_1 + \sigma^2 + \theta \gamma_{y\varepsilon}(-1)
\]
\[
\gamma_1 = \phi \gamma_0 + \theta \sigma^2
\]
\[
\gamma_h = \phi \gamma_{h-1}, \quad \text{for } h = 2, 3, \dots
\]
Note that $\gamma_{y\varepsilon}(-1) = E(y_t \varepsilon_{t-1}) = E(y_{t+1} \varepsilon_t)$. So, multiplying throughout (5) by $\varepsilon_{t-1}$ and taking expectations, we obtain
\[
\gamma_{y\varepsilon}(-1) = (\phi + \theta) \sigma^2.
\]


Hence,
\[
\gamma_0 = \phi(\phi \gamma_0 + \theta \sigma^2) + \sigma^2 + \theta(\phi + \theta)\sigma^2 = \frac{1 + \theta^2 + 2\phi\theta}{1 - \phi^2} \, \sigma^2
\]
\[
\gamma_1 = \frac{\phi(1 + \theta^2 + 2\phi\theta) + \theta(1 - \phi^2)}{1 - \phi^2} \, \sigma^2 = \frac{(\phi + \theta)(1 + \phi\theta)}{1 - \phi^2} \, \sigma^2
\]
\[
\gamma_h = \phi \gamma_{h-1}, \quad \text{for } h = 2, 3, \dots \quad \longleftarrow \text{looks like AR}(1)
\]
and so
\[
\rho_1 = \frac{(\phi + \theta)(1 + \phi\theta)}{1 + \theta^2 + 2\phi\theta}, \qquad \rho_h = \phi \rho_{h-1}, \quad \text{for } h = 2, 3, \dots
\]

[Figure: $\rho_h$ for $h = 1, \dots, 5$ with $\phi = 0.6$, $\theta = -0.3$: exponential decay.]

\[
\rho_1 = 0.337, \quad \rho_2 = 0.202, \quad \rho_3 = 0.121, \quad \rho_4 = 0.073, \quad \rho_5 = 0.044
\]
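A sketch reproducing these values from the formulas above (under the sign convention of (5), the figure's MA coefficient is $\theta = -0.3$):

```python
import numpy as np

phi, theta = 0.6, -0.3

rho = np.empty(6)
rho[0] = 1.0
rho[1] = (phi + theta) * (1 + phi * theta) / (1 + theta**2 + 2 * phi * theta)
for h in range(2, 6):
    rho[h] = phi * rho[h - 1]   # AR(1)-like decay for h >= 2

print(np.round(rho[1:], 3))     # [0.337 0.202 0.121 0.073 0.044]
```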


Theorem (Granger's Lemma). If $X_t \sim \text{ARMA}(p, m)$ and $Y_t \sim \text{ARMA}(q, n)$, if $X_t$ and $Y_t$ are independent, and if $Z_t = X_t + Y_t$, then $Z_t \sim \text{ARMA}(r, l)$, where $r \le p + q$ and $l \le \max(p + n, q + m)$.

Proof. Let $a_1(L) X_t = b_1(L) \varepsilon_t$ and $a_2(L) Y_t = b_2(L) \eta_t$, where $a_1, a_2, b_1, b_2$ are polynomials in $L$ of order $p, q, m, n$ respectively, and $\varepsilon_t, \eta_t$ are independent WN processes. Multiplying $Z_t = X_t + Y_t$ by $a_1(L) a_2(L)$, we obtain
\[
a_1(L) a_2(L) Z_t = a_1(L) a_2(L) X_t + a_1(L) a_2(L) Y_t = a_2(L) b_1(L) \varepsilon_t + a_1(L) b_2(L) \eta_t.
\]
Since $a_2(L) b_1(L) \varepsilon_t \sim \text{MA}(q + m)$ and $a_1(L) b_2(L) \eta_t \sim \text{MA}(p + n)$, the RHS is $\text{MA}(l)$, where $l \le \max(p + n, q + m)$. The order of $a_1(L) a_2(L)$ is not more than $p + q$, and hence the theorem follows. $\square$

The inequalities are needed in the theorem since $a_1(L)$ and $a_2(L)$ may contain common roots.
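The order bookkeeping in the proof can be sketched with polynomial multiplication; `np.polymul` convolves coefficient arrays (here written in ascending powers of $L$, which works since convolution ignores the ordering convention):

```python
import numpy as np

# Two independent AR(1)'s: a_i(L) = 1 - phi_i * L, b_i(L) = 1
a1, b1 = np.array([1.0, -0.5]), np.array([1.0])
a2, b2 = np.array([1.0, -0.8]), np.array([1.0])

# Z_t = X_t + Y_t satisfies a1(L)a2(L) Z_t = a2(L)b1(L) eps_t + a1(L)b2(L) eta_t
ar_part = np.polymul(a1, a2)   # order 2 -> AR part of Z
ma1 = np.polymul(a2, b1)       # order 1
ma2 = np.polymul(a1, b2)       # order 1 -> MA part of Z has order <= 1

print(ar_part)  # [1. -1.3  0.4]: Z_t is ARMA(2, 1), as Remark (1) states
```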

REMARKS

(1) The theorem implies, for example, that if the series analyzed is the sum of two independent AR(1) processes, then the series will be ARMA(2, 1). If the observed series is the sum of a "true" AR($p$) process plus a WN measurement error, then an ARMA($p, p$) process results.

(2) A mixed model may achieve as good a fit as the AR model but using fewer parameters: the principle of parsimony (Box and Jenkins).

1.11 Autocovariance Generating Function

A compact and convenient way of recording the information contained in a sequence $\{a_j\}$ is by means of the generating function
\[
a(z) = \sum_j a_j z^j,
\]
where $z$ is a possibly complex variable. The individual members of the sequence can easily be recovered from the coefficients associated with the $z^j$'s. The quantity $z$ does not necessarily have any interpretation and should simply be considered as the carrier of the information in the sequence.

Now define
\[
\gamma(z) = \sum_{h=-\infty}^{\infty} \gamma_h z^h \qquad \text{autocovariance generating function}
\]
\[
\rho(z) = \sum_{h=-\infty}^{\infty} \rho_h z^h \qquad \text{autocorrelation generating function}
\]
Compare: $m(t) = E(e^{tX})$ is the moment generating function, and $\phi(t) = E(e^{itX})$ the characteristic function.


Useful properties of generating functions:

(1) Additivity: $c_j = a_j + b_j \longrightarrow c(z) = a(z) + b(z)$

(2) Convolution*: $c_j = \sum_{k=0}^{j} a_k b_{j-k} \longrightarrow c(z) = a(z) b(z)$, where $a(z) = \sum_{j=0}^{\infty} a_j z^j$, $j = 0, 1, 2, \dots$

Given $X_1, X_2, \dots$, the $z$-transform is $X(z) = \sum_t X_t z^t$. Now let
\[
X_T(z) = \sum_{t=1}^{T} X_t z^t;
\]
then
\[
E[X_T(z) X_T(z^{-1})] = \sum_{h=-T}^{T} (T - |h|) \gamma_h z^h.
\]
Dividing by $T$ and passing to the limit, we obtain
\[
\gamma(z) = \lim_{T \to \infty} \frac{1}{T} E[X_T(z) X_T(z^{-1})].
\]

Let $S = t + N$. For a process started at time $-N$,
\[
y_t = \sum_{j=0}^{S} \psi_j \varepsilon_{S-N-j} = \sum_{j=0}^{t+N} \psi_j \varepsilon_{t-j} \longrightarrow \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j} \quad \text{as } N \to \infty.
\]

*$c_0 = a_0 b_0$, $c_1 = a_0 b_1 + a_1 b_0$, $c_2 = a_0 b_2 + a_1 b_1 + a_2 b_0$; that is, $(a_0 + a_1 z + a_2 z^2 + \cdots)(b_0 + b_1 z + b_2 z^2 + \cdots) = a_0 b_0 + (a_0 b_1 + a_1 b_0) z + (a_0 b_2 + a_1 b_1 + a_2 b_0) z^2 + \cdots$


For the process started at time $0$,
\[
\begin{aligned}
y_0 &= \varepsilon_0 \\
y_1 &= \varepsilon_1 + \psi_1 \varepsilon_0 \\
y_2 &= \varepsilon_2 + \psi_1 \varepsilon_1 + \psi_2 \varepsilon_0 \\
&\ \vdots \\
y_T &= \sum_{j=0}^{T} \psi_j \varepsilon_{T-j}.
\end{aligned}
\]
Define
\[
y_T(z) = \sum_{t=0}^{T} y_t z^t, \qquad \varepsilon_T(z) = \sum_{t=0}^{T} \varepsilon_t z^t, \qquad \psi_T(z) = \sum_{t=0}^{T} \psi_t z^t.
\]
Suppose $\{Y_t\}$ is a linear process. Then $Y(z) = \psi(z) \varepsilon(z)$. Hence,
\[
\gamma_Y(z) = \lim_{T \to \infty} \frac{1}{T} E[\psi_T(z) \varepsilon_T(z) \psi_T(z^{-1}) \varepsilon_T(z^{-1})] = \psi(z) \psi(z^{-1}) \sigma^2.
\]

1.11.1 MA(q) process

\[
y_t = \theta(L) \varepsilon_t \implies \gamma(z) = \sigma^2 \theta(z) \theta(z^{-1})
\]
Note that
\[
\theta(z) \theta(z^{-1}) = \sum_{j=0}^{q} \sum_{k=0}^{q} \theta_j \theta_k z^{j-k} = \sum_{h=-q}^{q} z^h \sum_{j=0}^{q} \theta_j \theta_{j+h},
\]
by setting $j - k = h$ and taking $\theta_j = 0$ for $j > q$ (and for $j < 0$).

For MA(1), $y_t = \varepsilon_t + \theta \varepsilon_{t-1}$:
\[
\gamma(z) = \sigma^2 (1 + \theta z)(1 + \theta z^{-1}) = \sigma^2 [(1 + \theta^2) + \theta z + \theta z^{-1}],
\]
so that
\[
\gamma_0 = (1 + \theta^2) \sigma^2, \qquad \gamma_1 = \theta \sigma^2, \qquad \gamma_h = 0 \ \text{for } h = 2, 3, \dots
\]
As a check against the general formula with $q = 1$ ($\theta_0 = 1$, $\theta_1 = \theta$, $\theta_h = 0$ otherwise):
\[
\sum_{h=-1}^{1} z^h \sum_{j=0}^{1} \theta_j \theta_{j+h} = (\theta_0^2 + \theta_1^2) + (\theta_0 \theta_1 + \theta_1 \theta_2) z + (\theta_1 \theta_0) z^{-1} = (1 + \theta^2) + \theta z + \theta z^{-1}.
\]

1.11.2 AR(p) process

\[
\phi(L) y_t = \varepsilon_t \implies y_t = \psi(L) \varepsilon_t,
\]
and in generating function form
\[
\phi(z) y(z) = \varepsilon(z) \implies y(z) = \psi(z) \varepsilon(z), \quad \text{where } \phi(z) \psi(z) = 1.
\]
Therefore,
\[
\gamma(z) = \sigma^2 \psi(z) \psi(z^{-1}) = \frac{\sigma^2}{\phi(z) \phi(z^{-1})}.
\]
For AR(1), $y_t = \phi y_{t-1} + \varepsilon_t$:
\[
\begin{aligned}
\gamma(z) &= \frac{\sigma^2}{(1 - \phi z)(1 - \phi z^{-1})} = (1 + \phi z + \phi^2 z^2 + \cdots)(1 + \phi z^{-1} + \phi^2 z^{-2} + \cdots) \sigma^2 \\
&= [(1 + \phi^2 + \phi^4 + \cdots) + (\phi + \phi^3 + \phi^5 + \cdots) z + \cdots] \sigma^2,
\end{aligned}
\]
so
\[
\gamma_0 = \frac{\sigma^2}{1 - \phi^2}, \qquad \gamma_h = \phi^h \gamma_0.
\]
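A truncated-series sketch of $\gamma(z) = \sigma^2 \psi(z)\psi(z^{-1})$ for this AR(1) case, checking the coefficients $\gamma_h = \phi^h \sigma^2/(1 - \phi^2)$ (the truncation length $J$ is our choice):

```python
import numpy as np

phi, sigma2, J = 0.6, 1.0, 200

psi = phi ** np.arange(J)                       # psi_j = phi^j
gamma_h = sigma2 * np.array(
    [psi[: J - h] @ psi[h:] for h in range(5)]  # sigma^2 * sum_j psi_j * psi_{j+h}
)

print(gamma_h)                                   # approx gamma_0 * phi^h
print(sigma2 / (1 - phi**2) * phi ** np.arange(5))
```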

1.11.3 ARMA(p, q) process

\[
\gamma(z) = \sigma^2 \, \frac{\theta(z) \theta(z^{-1})}{\phi(z) \phi(z^{-1})}
\]
