Algebra and some useful stochastic models

Transcript of an 18-page slide deck.

Page 1:

Algebra

U = ∑_t a(t) Y(t),  V = ∑_t b(t) Y(t)

E{U} = c_Y ∑_t a(t)

cov{U,V} = ∑_s ∑_t a(s) b(t) c_YY(s−t)

U is Gaussian if {Y(t)} is Gaussian.
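A quick numerical check of this algebra (a sketch in R, not from the slides; Y is taken to be a stationary Gaussian series Y(t) = c_Y + Z(t) + θ Z(t−1) with θ = 0.6, so c_YY(0) = 1 + θ², c_YY(±1) = θ, and c_YY(u) = 0 otherwise):

set.seed(1)
theta <- 0.6; c.Y <- 2
c.YY <- function(u) ifelse(u == 0, 1 + theta^2, ifelse(abs(u) == 1, theta, 0))
a <- c(1, 2, 3); b <- c(2, -1, 0)          # coefficients a(t), b(t), t = 1..3
R <- 20000
U <- V <- numeric(R)
for (r in 1:R) {
  z <- rnorm(4)                            # Z(0), Z(1), Z(2), Z(3)
  y <- c.Y + z[2:4] + theta * z[1:3]       # Y(t) = c_Y + Z(t) + theta Z(t-1)
  U[r] <- sum(a * y); V[r] <- sum(b * y)
}
c(mean(U), c.Y * sum(a))                                  # E{U} = c_Y sum a(t)
theory <- sum(outer(a, b) * c.YY(outer(1:3, 1:3, "-")))   # sum a(s) b(t) c_YY(s-t)
c(cov(U, V), theory)                                      # Monte Carlo vs formula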

Page 2:

Some useful stochastic models

Purely random / white noise (i.i.d.)

(often mean assumed 0)

c_YY(u) = cov{Y(t+u), Y(t)} = σ_Y² if u = 0

= 0 if u ≠ 0

ρ_YY(u) = 1, u = 0

= 0, u ≠ 0

A building block
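A one-line illustration in R (a sketch, not from the slides):

set.seed(2)
y <- rnorm(500)                                  # i.i.d. N(0,1): purely random series
round(acf(y, lag.max = 3, plot = FALSE)$acf[1:4], 2)
# lag 0 gives rho_YY(0) = 1 exactly; lags 1-3 are near 0 (sampling error ~ 2/sqrt(500))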

Page 3:

Random walk

Not stationary, but

∆Y(t) = Y(t) − Y(t−1) = Z(t)

Y(t) = Y(t−1) + Z(t), Y(0) = 0

Y(t) = ∑_{i=1}^{t} Z(i)

E{Y(t)} = t μ_Z

var{Y(t)} = t σ_Z²
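A sketch in R (assumed parameters: standard normal steps, so μ_Z = 0, σ_Z² = 1):

set.seed(3)
R <- 5000; Tlen <- 100
Z <- matrix(rnorm(R * Tlen), R, Tlen)   # R independent walks, steps Z(t) ~ N(0,1)
Y <- t(apply(Z, 1, cumsum))             # Y(t) = sum_{i=1}^t Z(i)
c(var(Y[, 50]), var(Y[, 100]))          # ~ 50 and ~ 100: var{Y(t)} = t * sigma_Z^2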

Page 4:

Moving average, MA(q)

Y(t) = β(0)Z(t) + β(1)Z(t-1) +…+ β(q)Z(t-q)

If E{Z(t)} = 0, E{Y(t)} = 0

c_YY(u) = 0, |u| > q

= σ_Z² ∑_{t=0}^{q−u} β(t) β(t+u), u = 0, 1, …, q

c_YY(−u) = c_YY(u): stationary

MA(1): ρ_YY(u) = 1, u = 0

= β(1)/(1 + β(1)²), u = ±1

= 0 otherwise
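Checking the MA(1) lag-1 value by simulation (a sketch; β(1) = 0.6 is an arbitrary choice):

set.seed(4)
y <- arima.sim(list(ma = 0.6), n = 5000)           # Y(t) = Z(t) + 0.6 Z(t-1)
c(acf(y, plot = FALSE)$acf[2], 0.6 / (1 + 0.6^2))  # sample vs beta(1)/(1 + beta(1)^2)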

Page 5:

Backward shift operator (recall the translation operator T^u Y(t) = Y(t+u)):

B^j Y(t) = Y(t−j)

MA(q) in operator form:

X_t = β(B) Z_t,  β(B) = β(0) + β(1)B + … + β(q)B^q

Linear process, MA(∞):

X_t = ∑_{i=0}^{∞} ψ_i Z_{t−i}

Need a convergence condition, e.g. ∑ |ψ_i| < ∞ or ∑ ψ_i² < ∞.
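In R, β(B) acting on a series is a one-sided convolution filter; a sketch with illustrative coefficients β = (1, 0.5, 0.3):

set.seed(5)
z <- rnorm(10)
x <- filter(z, filter = c(1, 0.5, 0.3), method = "convolution", sides = 1)
# x[t] = z[t] + 0.5 z[t-1] + 0.3 z[t-2], i.e. (beta(0) + beta(1) B + beta(2) B^2) Z_t
x[3] - (z[3] + 0.5 * z[2] + 0.3 * z[1])   # 0: B^j shifts the series back j steps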

Page 6:

Autoregressive process, AR(p):

X_t = α_1 X_{t−1} + … + α_p X_{t−p} + Z_t,  i.e. α(B) X_t = Z_t

First-order, AR(1): Markov

X_t = α X_{t−1} + Z_t

For convergence in probability / stationarity need |α| < 1.

Repeated substitution gives the linear-process (invertible) form:

X_t = Z_t + α Z_{t−1} + α² Z_{t−2} + …   (**)
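The recursion and the MA(∞) form (**) can be compared numerically; a sketch in R with α = 0.7 and the series truncated at 25 lags:

set.seed(6)
alpha <- 0.7
z <- rnorm(200)
x.rec <- filter(z, filter = alpha, method = "recursive")   # X_t = alpha X_{t-1} + Z_t
x.ma  <- filter(z, filter = alpha^(0:25), method = "convolution", sides = 1)
max(abs(x.rec - x.ma)[30:200])   # tiny (~ alpha^26): the two forms agree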

Page 7:

a.c.f. of AR(1), from (**) on the previous slide:

c_XX(k) = σ_Z² α^{|k|} / (1 − α²)

ρ_XX(k) = α^{|k|},  k = 0, ±1, ±2, …

p.a.c.f., using normal or linear definitions:

corr{Y(t), Y(t−m) | Y(t−1), …, Y(t−m+1)}

= 0 for m > p when Y is AR(p)

Proof: via multiple regression.
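Both facts are easy to see in simulation (a sketch; the AR orders and coefficients are arbitrary choices):

set.seed(7)
x1 <- arima.sim(list(ar = 0.7), n = 5000)
rbind(sample = acf(x1, plot = FALSE)$acf[2:4],
      theory = 0.7^(1:3))                       # rho_XX(k) = alpha^k
x2 <- arima.sim(list(ar = c(0.5, 0.3)), n = 5000)
round(pacf(x2, plot = FALSE)$acf[1:4], 2)       # ~ 0 beyond lag 2 for an AR(2)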

Page 8:

In the general case,

X_t = α_1 X_{t−1} + … + α_p X_{t−p} + Z_t,  i.e. α(B) X_t = Z_t

Need the roots of α(z) = 0 to lie in |z| > 1 for stationarity.

Useful for prediction.
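The root condition is easy to check numerically; a sketch in R for α(z) = 1 − 0.5z − 0.3z²:

Mod(polyroot(c(1, -0.5, -0.3)))   # both moduli exceed 1: this AR(2) is stationary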

Page 9:

Yule-Walker equations for AR(p).

Sometimes used for estimation

Correlate each side of

X_t = α_1 X_{t−1} + … + α_p X_{t−p} + Z_t

with X_{t−k}:

ρ_XX(k) = α_1 ρ_XX(k−1) + … + α_p ρ_XX(k−p),  k > 0
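R's ar() with method = "yule-walker" solves these equations with sample autocorrelations; a sketch (true coefficients 0.5 and 0.3):

set.seed(9)
x <- arima.sim(list(ar = c(0.5, 0.3)), n = 2000)
ar(x, order.max = 2, aic = FALSE, method = "yule-walker")$ar   # ~ (0.5, 0.3)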

Page 10:

ARMA(p,q)

X_t = α_1 X_{t−1} + … + α_p X_{t−p} + β(0) Z_t + β(1) Z_{t−1} + … + β(q) Z_{t−q}

α(B) X_t = β(B) Z_t
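A sketch in R, simulating an ARMA(1,1) and refitting it (coefficients 0.6 and 0.4 are arbitrary):

set.seed(10)
x <- arima.sim(list(ar = 0.6, ma = 0.4), n = 2000)
round(coef(arima(x, order = c(1, 0, 1))), 2)   # ar1 ~ 0.6, ma1 ~ 0.4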

Page 11:

ARIMA(p,d,q).

∇^d X_t is a stationary ARMA(p,q), where

∇X_t = X_t − X_{t−1},  ∇²X_t = X_t − 2X_{t−1} + X_{t−2}

Random walk: ARIMA(0,1,0),

X_t − X_{t−1} = (1 − B) X_t = Z_t

arima.mle() fits by maximum likelihood, assuming Gaussian noise.
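The slides' arima.mle() is the S-PLUS fitter; in R the analogous call is arima(), which likewise maximizes the Gaussian likelihood. A sketch for an ARIMA(1,1,1):

set.seed(11)
x <- arima.sim(list(order = c(1, 1, 1), ar = 0.5, ma = 0.3), n = 500)
arima(x, order = c(1, 1, 1))$coef   # ~ (0.5, 0.3)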

Page 12:

ARMAX.

α(B) Y_t = β(B) X_t + γ(B) Z_t

arima.mle(…, xreg, …)

State space.

s_t = F_t(s_{t−1}, z_t)

Y_t = H_t(s_t, Z_t); could include X
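A sketch of the xreg mechanism in R's arima(), a special case of ARMAX (regression on X with AR(1) errors; the coefficient 2 is an arbitrary choice):

set.seed(12)
x <- rnorm(500)                                   # exogenous series X
y <- 2 * x + arima.sim(list(ar = 0.7), n = 500)   # Y_t = 2 X_t + AR(1) noise
round(arima(y, order = c(1, 0, 0), xreg = x)$coef, 2)   # recovers ar1 ~ 0.7 and ~ 2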

Page 13:

Next: i.i.d. → mixing stationary process

Mixing has a variety of definitions,

e.g. normal case: ∑_u |c_YY(u)| < ∞; see e.g. Cryer and Chan (2008)

CLT: m_Y = Ȳ = ∑_{t=1}^{T} Y(t)/T

Asymptotically normal, with

E{m_Y} = c_Y

var{m_Y} = T^{−2} ∑_{s=1}^{T} ∑_{t=1}^{T} c_YY(s−t) ≈ T^{−1} ∑_u c_YY(u)

= σ_Y²/T if white noise
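The dependence correction matters; a sketch in R with AR(1) dependence (α = 0.6), comparing the Monte Carlo variance of Ȳ with (1/T) ∑_u c_YY(u) and with the white-noise value:

set.seed(13)
R <- 4000; Tlen <- 200; alpha <- 0.6
m <- replicate(R, mean(arima.sim(list(ar = alpha), n = Tlen)))
c.YY <- function(u) alpha^abs(u) / (1 - alpha^2)   # AR(1) autocovariances
c(var(m),                                          # Monte Carlo var{Y-bar}
  sum(c.YY(-(Tlen - 1):(Tlen - 1))) / Tlen,        # (1/T) sum_u c_YY(u)
  c.YY(0) / Tlen)                                  # white-noise formula: too small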

Page 14:

OLS.

Y(t) = α + βt + N(t)

b = β + ∑_t (t − t̄) N(t) / ∑_t (t − t̄)²

= β + ∑_t u(t) N(t)

E{b} = β

var{b} = ∑_s ∑_t u(s) u(t) c_NN(s−t)
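A sketch in R checking var{b} when N(t) is AR(1) with α = 0.5, via both the double-sum formula and Monte Carlo (β = 0.2 is arbitrary):

set.seed(14)
Tlen <- 100; alpha <- 0.5; beta <- 0.2
tt <- 1:Tlen
u <- (tt - mean(tt)) / sum((tt - mean(tt))^2)        # the weights u(t)
c.NN <- function(h) alpha^abs(h) / (1 - alpha^2)     # AR(1) noise autocovariance
v.formula <- sum(outer(u, u) * c.NN(outer(tt, tt, "-")))
b <- replicate(4000, {
  y <- 1 + beta * tt + arima.sim(list(ar = alpha), n = Tlen)
  coef(lm(y ~ tt))[2]
})
c(var(b), v.formula)   # agree; both exceed the i.i.d. value c.NN(0) * sum(u^2)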

Page 15:

Cumulants.

cum(Y_1, Y_2, …, Y_k)

Extends mean, variance, covariance:

cum(Y) = E{Y}

cum(Y,Y) = var{Y}

cum(X,Y) = cov{X,Y}

DRB (1975)
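Two standard cumulant properties, not stated on this slide but used in the CLT proof below: for independent X and Y, cum_k(X + Y) = cum_k(X) + cum_k(Y) (additivity), and cum_k(aY) = a^k cum_k(Y) (homogeneity of degree k).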

Page 16:

Page 17:

Proof of ordinary CLT.

S_T = Y(1) + … + Y(T)

cum_k(S_T) = T κ_k, by additivity and independence

cum_k(S_T/√T) = T^{−k/2} cum_k(S_T) = O(T^{1−k/2}) → 0 for k > 2, as T → ∞

Normal cumulants of order > 2 are 0.

The normal distribution is determined by its moments.

So (S_T − Tμ)/√T tends in distribution to N(0, σ²).
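A sketch in R of the cumulant argument for i.i.d. exponential(1) summands (μ = 1, σ² = 1, κ_3 = 2):

set.seed(16)
Tlen <- 100
s <- replicate(20000, (sum(rexp(Tlen)) - Tlen) / sqrt(Tlen))
c(mean(s), var(s))             # ~ 0 and ~ 1, as the CLT says
c(mean(s^3), 2 / sqrt(Tlen))   # third cumulant ~ kappa_3 * T^(-1/2): already small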

Page 18:

Stationary series: cumulant functions.

cum{Y(t+u_1), …, Y(t+u_{k−1}), Y(t)} = c_k(t+u_1, …, t+u_{k−1}, t) = c_k(u_1, …, u_{k−1})

k = 2, 3, 4, …

Cumulant mixing:

∑_u |c_k(u_1, …, u_{k−1})| < ∞,  where u = (u_1, …, u_{k−1})