
Forecasting: principles and practice

Rob J Hyndman

3.2 Dynamic regression

Outline

1 Regression with ARIMA errors

2 Stochastic and deterministic trends

3 Periodic seasonality

4 Lab session 14

5 Dynamic regression models


Regression with ARIMA errors

Regression models

yt = β0 + β1x1,t + · · · + βkxk,t + et

yt is modelled as a function of k explanatory variables x1,t, . . . , xk,t.
In regression, we assumed that et was white noise.
Now we want to allow et to be autocorrelated.

Example: ARIMA(1,1,1) errors

yt = β0 + β1x1,t + · · · + βkxk,t + nt,
(1 − φ1B)(1 − B)nt = (1 + θ1B)et,

where et is white noise.


Residuals and errors

Example: nt ∼ ARIMA(1,1,1)

yt = β0 + β1x1,t + · · · + βkxk,t + nt,
(1 − φ1B)(1 − B)nt = (1 + θ1B)et,

Be careful to distinguish nt from et.
Only the errors et are assumed to be white noise.
In ordinary regression, nt is assumed to be white noise and so nt = et.
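The forecast package exposes both series. A minimal sketch, assuming the fpp2 package and its uschange data (the same example used later in this section): residuals(fit, type="regression") returns the regression errors nt, while the default innovation residuals correspond to et.

library(fpp2)  # loads the forecast package and the uschange data

fit <- auto.arima(uschange[, "Consumption"], xreg = uschange[, "Income"])

nt <- residuals(fit, type = "regression")   # regression errors n_t (may be autocorrelated)
et <- residuals(fit, type = "innovation")   # ARIMA innovation errors e_t (should be white noise)

ggAcf(nt)  # typically shows autocorrelation
ggAcf(et)  # should look like white noise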


Estimation

If we minimize ∑ nt² (by using ordinary regression):

1 Estimated coefficients β̂0, . . . , β̂k are no longer optimal as some information is ignored.
2 Statistical tests associated with the model (e.g., t-tests on the coefficients) are incorrect.
3 p-values for coefficients are usually too small (“spurious regression”).
4 The AIC of the fitted models is misleading.

Minimizing ∑ et² avoids these problems.
Maximizing the likelihood is similar to minimizing ∑ et².
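A minimal sketch of the contrast, again assuming the fpp2 uschange data: tslm() fits the ordinary regression (minimizing ∑ nt²), while auto.arima() with xreg estimates the regression with ARIMA errors by maximum likelihood (roughly minimizing ∑ et²).

library(fpp2)

# Ordinary regression: minimises the sum of squared n_t and ignores autocorrelation
fit_ols <- tslm(Consumption ~ Income, data = uschange)
checkresiduals(fit_ols)    # residuals typically show clear autocorrelation

# Regression with ARIMA errors: estimated by MLE (similar to minimising sum of squared e_t)
fit_dyn <- auto.arima(uschange[, "Consumption"], xreg = uschange[, "Income"])
checkresiduals(fit_dyn)    # innovation residuals should look like white noise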


Stationarity

Model with ARIMA(1,1,1) errors

yt = β0 + β1x1,t + · · · + βkxk,t + nt,
(1 − φ1B)(1 − B)nt = (1 + θ1B)et,

Equivalent to model with ARIMA(1,0,1) errors

y′t = β1x′1,t + · · · + βkx′k,t + n′t,
(1 − φ1B)n′t = (1 + θ1B)et,

where y′t = yt − yt−1, x′t,i = xt,i − xt−1,i and n′t = nt − nt−1.
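The equivalence can be checked numerically. A sketch with simulated data (the series y and x below are hypothetical, not from the slides): fitting in levels with ARIMA(1,1,1) errors, or on first differences with ARIMA(1,0,1) errors and no constant, gives essentially the same estimates.

library(forecast)

set.seed(1)
x <- ts(cumsum(rnorm(200)))                               # hypothetical predictor
n <- arima.sim(list(order = c(1, 1, 1), ar = 0.6, ma = 0.3), n = 199)
y <- 10 + 2 * x + n                                       # hypothetical response

fit_levels <- Arima(y, order = c(1, 1, 1), xreg = x)
fit_diffs  <- Arima(diff(y), order = c(1, 0, 1), xreg = diff(x),
                    include.mean = FALSE)

cbind(levels = coef(fit_levels), diffs = coef(fit_diffs))  # estimates agree closely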


Regression with ARIMA errors

Any regression with an ARIMA error can be rewritten as a regression with an ARMA error by differencing all variables with the same differencing operator as in the ARIMA model.

Original data

yt = β0 + β1x1,t + · · · + βkxk,t + nt,
where φ(B)(1 − B)ᵈnt = θ(B)et

After differencing all variables

y′t = β1x′1,t + · · · + βkx′k,t + n′t,
where φ(B)n′t = θ(B)et and y′t = (1 − B)ᵈyt


Model selection

Fit a regression model with automatically selected ARIMA errors.
Check that the et series looks like white noise.
Note that estimation is done on the differenced series to ensure consistent estimators.

Selecting predictors

AICc can be calculated for the final model.
Repeat the procedure for all subsets of predictors to be considered, and select the model with the lowest AICc value (see the sketch below).
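A minimal sketch of this search, assuming the fpp2 uschange data with Consumption as the response and the other four series as candidate predictors (an exhaustive search; in practice you might restrict the candidate subsets):

library(fpp2)

predictors <- c("Income", "Production", "Savings", "Unemployment")

# All non-empty subsets of the candidate predictors
subsets <- unlist(lapply(seq_along(predictors),
                         function(k) combn(predictors, k, simplify = FALSE)),
                  recursive = FALSE)

# Fit a regression with automatically selected ARIMA errors for each subset
aicc <- sapply(subsets, function(vars)
  auto.arima(uschange[, "Consumption"], xreg = uschange[, vars])$aicc)

subsets[[which.min(aicc)]]  # predictor set with the lowest AICc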

US personal consumption & income

[Figure: Quarterly changes in US consumption and personal income — time series panels for Consumption, Income, Production, Savings and Unemployment, 1970–2010.]

US personal consumption & income

[Figure: Scatterplot of quarterly changes in US consumption against quarterly changes in personal income.]

US Personal Consumption and income

No need for transformations or further differencing.
An increase in income does not necessarily translate into an instant increase in consumption (e.g., after the loss of a job, it may take a few months for expenses to be reduced to allow for the new circumstances). We will ignore this for now.

US personal consumption & income

(fit <- auto.arima(uschange[,"Consumption"], xreg=uschange[,"Income"]))

## Series: uschange[, "Consumption"]
## Regression with ARIMA(1,0,2) errors
##
## Coefficients:
##          ar1      ma1     ma2  intercept    xreg
##       0.6922  -0.5758  0.1984     0.5990  0.2028
## s.e.  0.1159   0.1301  0.0756     0.0884  0.0461
##
## sigma^2 estimated as 0.3219:  log likelihood=-156.95
## AIC=325.91   AICc=326.37   BIC=345.29
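Written out, the output above corresponds (to three decimal places) to the fitted model

yt = 0.599 + 0.203xt + nt
nt = 0.692nt−1 + et − 0.576et−1 + 0.198et−2
et ∼ NID(0, 0.322),

where xt is the quarterly change in personal income.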

US personal consumption & income

ggtsdisplay(residuals(fit, type='regression'), main="ARIMA errors")

[Figure: Time plot, ACF and PACF of the regression errors nt (labelled “ARIMA errors”), 1970–2010.]

US personal consumption & income

ggtsdisplay(residuals(fit), main="ARIMA residuals")

[Figure: Time plot, ACF and PACF of the innovation residuals et (labelled “ARIMA residuals”), 1970–2010.]

US Personal Consumption and Income

A Ljung-Box test shows no significant autocorrelation in the residuals.

checkresiduals(fit, plot=FALSE)

##
##  Ljung-Box test
##
## data:  Residuals from Regression with ARIMA(1,0,2) errors
## Q* = 5.8916, df = 3, p-value = 0.117
##
## Model df: 5.   Total lags used: 8

US Personal Consumption and Income

fcast <- forecast(fit, xreg=rep(mean(uschange[,"Income"]), 8), h=8)

autoplot(fcast) + xlab("Year") + ylab("Percentage change") +
  ggtitle("Forecasts from regression with ARIMA(1,0,2) errors")

[Figure: Forecasts from regression with ARIMA(1,0,2) errors, with 80% and 95% prediction intervals; percentage change by year, 1970–2020.]

Forecasting

To forecast a regression model with ARIMA errors, we need to forecast the regression part of the model and the ARIMA part of the model, and combine the results.

Some explanatory variables are known into the future (e.g., time, dummies).
Separate forecasting models may be needed for other explanatory variables (a sketch follows below).
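A minimal sketch of the second case, assuming the fpp2 uschange data: rather than holding future income at its historical mean (as on the earlier slide), forecast the predictor with its own model and feed its point forecasts in as future xreg values. Note that the resulting prediction intervals ignore the uncertainty in the income forecasts.

library(fpp2)

fit <- auto.arima(uschange[, "Consumption"], xreg = uschange[, "Income"])

# Separate model for the predictor ...
income_fc <- forecast(ets(uschange[, "Income"]), h = 8)

# ... whose point forecasts supply the future xreg values
fcast <- forecast(fit, xreg = income_fc$mean)
autoplot(fcast)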

Outline

1 Regression with ARIMA errors

2 Stochastic and deterministic trends

3 Periodic seasonality

4 Lab session 14

5 Dynamic regression models


Stochastic & deterministic trends

Deterministic trend

yt = β0 + β1t + nt

where nt is an ARMA process.

Stochastic trend

yt = β0 + β1t + nt

where nt is an ARIMA process with d ≥ 1.

Difference both sides until nt is stationary:

y′t = β1 + n′t

where n′t is an ARMA process.


International visitors

[Figure: Total annual international visitors to Australia (millions of people), 1980–2010.]

International visitors — Deterministic trend

(fit1 <- auto.arima(austa, d=0, xreg=1:length(austa)))

## Series: austa
## Regression with ARIMA(2,0,0) errors
##
## Coefficients:
##          ar1      ar2  intercept    xreg
##       1.1127  -0.3805     0.4156  0.1710
## s.e.  0.1600   0.1585     0.1897  0.0088
##
## sigma^2 estimated as 0.02979:  log likelihood=13.6
## AIC=-17.2   AICc=-15.2   BIC=-9.28

yt = 0.4156 + 0.1710t + nt
nt = 1.1127nt−1 − 0.3805nt−2 + et
et ∼ NID(0, 0.0298).


International visitors — Stochastic trend

(fit2 <- auto.arima(austa,d=1))

## Series: austa
## ARIMA(0,1,1) with drift
##
## Coefficients:
##          ma1   drift
##       0.3006  0.1735
## s.e.  0.1647  0.0390
##
## sigma^2 estimated as 0.03376:  log likelihood=10.62
## AIC=-15.24   AICc=-14.46   BIC=-10.57

yt − yt−1 = 0.1735 + n′t, where n′t = et + 0.3006et−1. Equivalently,
yt = y0 + 0.1735t + nt
nt = nt−1 + et + 0.3006et−1
et ∼ NID(0, 0.0338).


International visitors

[Figures: Forecasts of international visitors from the deterministic trend model (linear trend with AR(2) errors) and from the stochastic trend model (ARIMA with drift), each with 80% and 95% prediction intervals, 1980–2020.]

Forecasting with trend

Point forecasts are almost identical, but prediction intervals differ.
Stochastic trends have much wider prediction intervals because the errors are non-stationary.
Be careful of forecasting with deterministic trends too far ahead.
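A sketch of the comparison, assuming the austa data and the two fits above (fit1 deterministic, fit2 stochastic); the future values of the time regressor simply continue the index 1, . . . , length(austa).

library(fpp2)

fit1 <- auto.arima(austa, d = 0, xreg = 1:length(austa))  # deterministic trend
fit2 <- auto.arima(austa, d = 1)                          # stochastic trend

h <- 10
fc1 <- forecast(fit1, xreg = length(austa) + 1:h)
fc2 <- forecast(fit2, h = h)

autoplot(austa) +
  autolayer(fc1, series = "Deterministic trend") +
  autolayer(fc2, series = "Stochastic trend") +
  ylab("Millions of visitors")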

Outline

1 Regression with ARIMA errors

2 Stochastic and deterministic trends

3 Periodic seasonality

4 Lab session 14

5 Dynamic regression models


Fourier terms for seasonality

Periodic seasonality can be handled using pairs of Fourier terms:

sk(t) = sin(2πkt/m),   ck(t) = cos(2πkt/m)

yt = ∑ [αk sk(t) + βk ck(t)] + nt,  with the sum over k = 1, . . . , K.

nt is a non-seasonal ARIMA process.
Every periodic function can be approximated by sums of sin and cos terms for large enough K.
Choose K by minimizing the AICc (a sketch of this follows below).
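A minimal sketch of choosing K by AICc, assuming the monthly USAccDeaths series used on the next slide (m = 12, so K can range from 1 to 6):

library(fpp2)

aicc <- sapply(1:6, function(K)
  auto.arima(USAccDeaths, xreg = fourier(USAccDeaths, K = K),
             seasonal = FALSE)$aicc)

which.min(aicc)  # value of K giving the lowest AICc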

US Accidental Deaths

fit <- auto.arima(USAccDeaths, xreg=fourier(USAccDeaths, 5),
                  seasonal=FALSE)

fc <- forecast(fit, xreg=fourier(USAccDeaths, 5, 24))

US Accidental Deaths

autoplot(fc)

[Figure: Forecasts from regression with ARIMA(0,1,1) errors for US accidental deaths, with 80% and 95% prediction intervals, 1973–1981.]

Outline

1 Regression with ARIMA errors

2 Stochastic and deterministic trends

3 Periodic seasonality

4 Lab session 14

5 Dynamic regression models


Lab Session 14


Outline

1 Regression with ARIMA errors

2 Stochastic and deterministic trends

3 Periodic seasonality

4 Lab session 14

5 Dynamic regression models


Dynamic regression models

Sometimes a change in xt does not affect yt instantaneously.

yt = sales, xt = advertising.
yt = stream flow, xt = rainfall.
yt = size of herd, xt = breeding stock.

These are dynamic systems with input (xt) and output (yt).
xt is often a leading indicator.
There can be multiple predictors.


Lagged explanatory variables

The model includes present and past values of the predictor: xt, xt−1, xt−2, . . .

yt = a + ν0xt + ν1xt−1 + · · · + νkxt−k + nt

where nt is an ARIMA process.

Rewrite the model as

yt = a + (ν0 + ν1B + ν2B² + · · · + νkBᵏ)xt + nt = a + ν(B)xt + nt.

ν(B) is called a transfer function since it describes how a change in xt is transferred to yt.
x can influence y, but y is not allowed to influence x.
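A minimal sketch of setting such a model up in R, using simulated data (y, x and the lag structure below are hypothetical): build a matrix of current and lagged predictor values, drop the incomplete leading rows, and hand it to auto.arima() as xreg.

library(forecast)

set.seed(1)
n <- 120
x <- rnorm(n)                                       # hypothetical predictor
y <- 5 + 0.8 * x + 0.3 * c(0, x[-n]) +
     arima.sim(list(ar = 0.6), n = n, sd = 0.5)     # hypothetical response

# Current and lagged predictor values: x_t, x_{t-1}, x_{t-2}
xlags <- cbind(lag0 = x,
               lag1 = c(NA, x[-n]),
               lag2 = c(NA, NA, x[-((n - 1):n)]))

keep <- 3:n                                         # drop rows with missing lags
fit  <- auto.arima(ts(y[keep]), xreg = xlags[keep, ], d = 0)
coef(fit)                                           # intercept, lag weights and ARMA terms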


Example: Insurance quotes and TV adverts

[Figure: Insurance advertising and quotations — monthly Quotes and TV.advert series, 2002–2005.]

Example: Insurance quotes and TV adverts

Advert <- cbind(insurance[,2], c(NA, insurance[1:39,2]))
colnames(Advert) <- paste("AdLag", 0:1, sep="")
(fit <- auto.arima(insurance[,1], xreg=Advert, d=0))

## Series: insurance[, 1]
## Regression with ARIMA(3,0,0) errors
##
## Coefficients:
##          ar1      ar2     ar3  intercept  AdLag0  AdLag1
##       1.4117  -0.9317  0.3591     2.0393  1.2564  0.1625
## s.e.  0.1698   0.2545  0.1592     0.9931  0.0667  0.0591
##
## sigma^2 estimated as 0.2165:  log likelihood=-23.89
## AIC=61.78   AICc=65.28   BIC=73.6

yt = 2.04 + 1.26xt + 0.16xt−1 + nt
nt = 1.41nt−1 − 0.93nt−2 + 0.36nt−3 + et
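How many lags of advertising to include can itself be chosen by AICc. A sketch of that comparison, fitting all candidates on the same rows so the AICc values are comparable (in FPP the model with advertising in the current and previous month is the one selected):

library(fpp2)

Advert <- cbind(AdLag0 = insurance[, 2],
                AdLag1 = c(NA, insurance[1:39, 2]),
                AdLag2 = c(NA, NA, insurance[1:38, 2]),
                AdLag3 = c(NA, NA, NA, insurance[1:37, 2]))

# Compare models with lag 0 only, lags 0-1, 0-2 and 0-3 on rows 4-40
aicc <- sapply(1:4, function(k)
  auto.arima(insurance[4:40, 1], xreg = Advert[4:40, 1:k, drop = FALSE],
             d = 0)$aicc)
aicc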


Example: Insurance quotes and TV adverts

fc <- forecast(fit, h=20,
  xreg=cbind(AdLag0=rep(10,20), AdLag1=c(Advert[40,1], rep(10,19))))

autoplot(fc)

[Figure: Forecasts from regression with ARIMA(3,0,0) errors, with future advertising held at 10 units per month; 80% and 95% prediction intervals, 2002–2007.]

Example: Insurance quotes and TV adverts

fc <- forecast(fit, h=20,
  xreg=cbind(AdLag0=rep(8,20), AdLag1=c(Advert[40,1], rep(8,19))))

autoplot(fc)

[Figure: Forecasts from regression with ARIMA(3,0,0) errors, with future advertising held at 8 units per month; 80% and 95% prediction intervals, 2002–2007.]

Example: Insurance quotes and TV adverts

fc <- forecast(fit, h=20,
  xreg=cbind(AdLag0=rep(6,20), AdLag1=c(Advert[40,1], rep(6,19))))

autoplot(fc)

[Figure: Forecasts from regression with ARIMA(3,0,0) errors, with future advertising held at 6 units per month; 80% and 95% prediction intervals, 2002–2007.]

Dynamic regression models

yt = a + ν(B)xt + nt

where nt is an ARMA process. So

φ(B)nt = θ(B)et,   or   nt = [θ(B)/φ(B)] et = ψ(B)et.

yt = a + ν(B)xt + ψ(B)et

ARMA models are rational approximations to general transfer functions of et.
We can also replace ν(B) by a rational approximation.
There is no R package for forecasting using a general transfer function approach.
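As one standard illustration of what “rational approximation” buys (the symbols ω0 and δ are introduced here, not taken from the slides): a rational form such as ν(B) = ω0/(1 − δB), with |δ| < 1, expands to ω0(1 + δB + δ²B² + · · ·), i.e., lag weights νj = ω0δʲ that decay geometrically. An infinite distributed lag is thus captured with only two parameters, which is the sense in which a ratio of two finite polynomials can approximate a general transfer function.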
