Non-Seasonal Box-Jenkins Models
Four-step iterative procedure
• Model Identification
• Parameter Estimation
• Diagnostic Checking
• Forecasting
Model Identification
• Stationarity
• Theoretical Autocorrelation Function (TAC)
• Theoretical Partial Autocorrelation Function (TPAC)
• Sample Partial Autocorrelation Function (SPAC)
• Sample Autocorrelation Function (SAC)
Stationarity (I)
• A sequence of jointly dependent random variables is called a time series.
Stationarity (II)
• Stationary process properties: a process {y_t} is (weakly) stationary if its mean and variance are constant over time and its autocovariances depend only on the lag, i.e.
E(y_t) = μ for all t
Var(y_t) = σ² for all t
Cov(y_t, y_{t-s}) = γ_s for all t and each lag s
Stationarity (III)
• Example: the white noise series {e_t}, where the e_t are iid N(0, σ_e²). Note that
E(e_t) = 0, Var(e_t) = σ_e², and Cov(e_t, e_s) = 0 for t ≠ s,
so a white noise series is stationary. A simulation check follows below.
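As a quick illustration (not from the original slides; the seed and series length are arbitrary), the snippet below simulates a white noise series with numpy and verifies that its sample autocorrelations are negligible:

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed
e = rng.standard_normal(500)     # white noise: iid N(0, 1)

# Sample autocorrelation at lag k: sum((e_t - m)(e_{t+k} - m)) / sum((e_t - m)^2)
m = e.mean()
r = [np.sum((e[:-k] - m) * (e[k:] - m)) / np.sum((e - m) ** 2) for k in range(1, 6)]
print(np.round(r, 3))  # all close to 0, as expected for white noise
```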
Stationarity (IV)
• Three basic Box-Jenkins models for a stationary time series {y_t}:
(1) Autoregressive model of order p, AR(p):
y_t = δ + φ_1 y_{t-1} + φ_2 y_{t-2} + … + φ_p y_{t-p} + e_t
i.e., y_t depends on its p previous values.
(2) Moving average model of order q, MA(q):
y_t = δ + e_t - θ_1 e_{t-1} - θ_2 e_{t-2} - … - θ_q e_{t-q}
i.e., y_t depends on q previous random error terms.
Stationarity (V)
• Three basic Box-Jenkins models for a stationary time series {y_t} (continued):
(3) Autoregressive-moving average model of orders p and q, ARMA(p,q):
y_t = δ + φ_1 y_{t-1} + … + φ_p y_{t-p} + e_t - θ_1 e_{t-1} - … - θ_q e_{t-q}
i.e., y_t depends on its p previous values and q previous random error terms. A simulation sketch follows below.
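A minimal sketch of simulating all three models, assuming statsmodels is available (a tool choice of this note, not of the slides). Note that statsmodels writes the MA polynomial with plus signs, so its θ coefficients carry the opposite sign to the slides' convention:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(1)
n = 300

# statsmodels lag polynomials include the lag-0 coefficient; the AR side uses
# (1 - phi_1 L), hence the -0.7, and the MA side uses '+', hence -0.4 for theta_1 = 0.4.
ar1 = ArmaProcess(ar=[1, -0.7], ma=[1])            # AR(1): y_t = 0.7 y_{t-1} + e_t
ma1 = ArmaProcess(ar=[1], ma=[1, -0.4])            # MA(1): y_t = e_t - 0.4 e_{t-1}
arma11 = ArmaProcess(ar=[1, -0.7], ma=[1, -0.4])   # ARMA(1,1)

for name, proc in [("AR(1)", ar1), ("MA(1)", ma1), ("ARMA(1,1)", arma11)]:
    y = proc.generate_sample(nsample=n, distrvs=rng.standard_normal)
    print(name, "stationary:", proc.isstationary, "sample var:", round(y.var(), 2))
```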
AR(1) (I)
• Simple AR(1) process without drift:
y_t = φ_1 y_{t-1} + e_t
• By repeated substitution,
y_t = e_t + φ_1 e_{t-1} + φ_1² e_{t-2} + … = Σ_{j=0}^{∞} φ_1^j e_{t-j}
AR(1) (II)
• Now,
E(y_t) = 0, Var(y_t) = σ_e²/(1 - φ_1²), Cov(y_t, y_{t-s}) = φ_1^s σ_e²/(1 - φ_1²)
• Var(y_t) and Cov(y_t, y_{t-s}) are finite and independent of t if and only if |φ_1| < 1, which is the stationarity requirement for an AR(1) process.
AR(1) (IV)
• Special case: φ_1 = 1, i.e.
y_t = y_{t-1} + e_t
This is a "random walk" process. Now, by repeated substitution,
y_t = y_0 + Σ_{j=1}^{t} e_j
• Thus Var(y_t) = t·σ_e², which grows with t, so the random walk is non-stationary.
AR(1) (V)
• Consider the first difference
w_t = y_t - y_{t-1} = e_t
which is stationary. y_t is a homogeneous non-stationary series. The number of times that the original series must be differenced before a stationary series results is called the order of integration. A differencing sketch follows below.
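To illustrate (a sketch, not part of the slides; seed and length are arbitrary): differencing a simulated random walk once yields a stationary series, so the walk is integrated of order 1:

```python
import numpy as np

rng = np.random.default_rng(2)
e = rng.standard_normal(500)
y = np.cumsum(e)        # random walk: y_t = y_{t-1} + e_t
w = np.diff(y)          # first difference: w_t = y_t - y_{t-1} = e_t

# The walk's variance drifts over time; the differenced series' does not.
print("var of first/second half of y:", round(y[:250].var(), 1), round(y[250:].var(), 1))
print("var of first/second half of w:", round(w[:250].var(), 2), round(w[250:].var(), 2))
```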
Theoretical Autocorrelation Function (TAC) (I)
• Autoregressive (AR) processes. Consider an AR(1) process without drift:
y_t = φ_1 y_{t-1} + e_t
Recall that
γ_0 = Var(y_t) = σ_e²/(1 - φ_1²) and γ_k = Cov(y_t, y_{t-k}) = φ_1^k σ_e²/(1 - φ_1²)
Theoretical Autocorrelation Function (TAC) (II)
The autocorrelation function at lag k is
ρ_k = γ_k/γ_0 = φ_1^k
So for a stationary AR(1) process (|φ_1| < 1), the TAC dies down gradually as k increases. A numerical check follows below.
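A quick check (an illustration of this note, assuming statsmodels' arma_acf; φ_1 = 0.7 is arbitrary) that the hand formula ρ_k = φ_1^k matches the package's theoretical ACF:

```python
import numpy as np
from statsmodels.tsa.arima_process import arma_acf

phi = 0.7                      # illustrative AR(1) coefficient
k = np.arange(1, 7)
print(np.round(phi ** k, 4))   # rho_k = phi^k, computed by hand

# Same values from statsmodels; index 0 is lag 0 (= 1), so drop it.
print(np.round(arma_acf([1, -phi], [1], lags=7)[1:], 4))
```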
Theoretical Autocorrelation Function (TAC) (III)
Consider an AR(2) process without drift:
y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + e_t
The TAC satisfies the Yule-Walker recursion
ρ_1 = φ_1/(1 - φ_2) and ρ_k = φ_1 ρ_{k-1} + φ_2 ρ_{k-2} for k ≥ 2.
Theoretical Autocorrelation Function (TAC) (IV)
Hence the TAC dies down according to a mixture of damped exponentials and/or damped sine waves.
• In general, the TAC of a stationary AR process dies down gradually as k increases.
Theoretical Autocorrelation Function (TAC) (V)
• Moving average (MA) processes. Consider a MA(1) process without drift:
y_t = e_t - θ_1 e_{t-1}
Recall that
γ_0 = (1 + θ_1²)σ_e², γ_1 = -θ_1 σ_e², and γ_k = 0 for k ≥ 2.
Theoretical Autocorrelation Function (TAC) (VI)
Therefore the TAC of the MA(1) process is
ρ_1 = -θ_1/(1 + θ_1²) and ρ_k = 0 for k ≥ 2.
The TAC of the MA(1) process "cuts off" after lag k = 1.
Theoretical Autocorrelation Function (TAC) (VII)
Consider a MA(2) process:
y_t = e_t - θ_1 e_{t-1} - θ_2 e_{t-2}
Its TAC is
ρ_1 = (-θ_1 + θ_1θ_2)/(1 + θ_1² + θ_2²), ρ_2 = -θ_2/(1 + θ_1² + θ_2²), and ρ_k = 0 for k ≥ 3.
The TAC of a MA(2) process cuts off after 2 lags.
Theoretical Partial Autocorrelation Function (TPAC) (I)
• Autoregressive processes. By the definition of the PAC, the coefficient of y_{t-k} in the regression of y_t on y_{t-1}, …, y_{t-k} is the kth PAC ρ_kk. Therefore, for an AR(1) process the partial autocorrelation function at lag k is
ρ_11 = φ_1 and ρ_kk = 0 for k ≥ 2.
As mentioned before, if k = 1 then ρ_11 = ρ_1; that is, PAC = AC at lag 1. The TPAC of an AR(1) process "cuts off" after lag 1.
Theoretical Partial Autocorrelation Function (TPAC) (II)
• Moving average processes. Consider an invertible MA(1) process y_t = e_t - θ_1 e_{t-1}, which can be rewritten as
y_t = -θ_1 y_{t-1} - θ_1² y_{t-2} - … + e_t,
a stationary AR process of infinite order. Thus, the partial autocorrelation decays towards zero as the lag increases; see the sketch below.
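A numerical sketch of this (assuming statsmodels' arma_acf/arma_pacf are acceptable here; θ_1 = 0.6 is arbitrary, and statsmodels' MA sign convention is '+', hence the -0.6):

```python
import numpy as np
from statsmodels.tsa.arima_process import arma_acf, arma_pacf

# MA(1) with theta_1 = 0.6 in the slides' convention
ar, ma = [1], [1, -0.6]
print(np.round(arma_acf(ar, ma, lags=6)[1:], 3))   # TAC: one nonzero value, then zeros
print(np.round(arma_pacf(ar, ma, lags=6)[1:], 3))  # TPAC: decays gradually toward zero
```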
Summary of the Behaviors of TAC and TPAC
Behaviors of TAC and TPAC for general non-seasonal models:

Model       TAC                    TPAC
AR(p)       Dies down              Cuts off after lag p
MA(q)       Cuts off after lag q   Dies down
ARMA(p,q)   Dies down              Dies down

Behaviors of TAC and TPAC for specific non-seasonal models:

Model       TAC                                                 TPAC
AR(1)       Dies down in a damped exponential fashion           Cuts off after lag 1
AR(2)       Dies down (damped exponentials and/or sine waves)   Cuts off after lag 2
MA(1)       Cuts off after lag 1                                Dies down in a damped exponential fashion
MA(2)       Cuts off after lag 2                                Dies down (damped exponentials and/or sine waves)
ARMA(1,1)   Dies down from lag 1                                Dies down from lag 1

A sample-based identification sketch follows below.
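To see these patterns on data (a sketch; model, seed, and length are illustrative choices), compute the SAC and SPAC of a simulated AR(1) series with statsmodels:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(3)
y = ArmaProcess(ar=[1, -0.7], ma=[1]).generate_sample(300, distrvs=rng.standard_normal)

# For an AR(1): SAC should die down, SPAC should cut off after lag 1.
print("SAC :", np.round(acf(y, nlags=5)[1:], 2))
print("SPAC:", np.round(pacf(y, nlags=5)[1:], 2))
```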
Sample Autocorrelation Function (SAC) (I)
• For the working series z_b, z_{b+1}, …, z_n, the sample autocorrelation at lag k is
r_k = Σ_{t=b}^{n-k} (z_t - z̄)(z_{t+k} - z̄) / Σ_{t=b}^{n} (z_t - z̄)²
where
z̄ = Σ_{t=b}^{n} z_t / (n - b + 1)
Sample Autocorrelation Function (SAC) (II)
• r_k measures the linear relationship between time series observations separated by a lag of k time units.
• The standard error of r_k is
s_{r_k} = √[(1 + 2 Σ_{j=1}^{k-1} r_j²) / (n - b + 1)]
• The t_{r_k} statistic is
t_{r_k} = r_k / s_{r_k}
Sample Autocorrelation Function (SAC) (III)
• Behaviors of SAC
(1) The SAC can cut off. A spike at lag k exists in the SAC if r_k is statistically large. If
|t_{r_k}| > 2
then r_k is considered to be statistically large. The SAC cuts off after lag k if there are no spikes at lags greater than k in the SAC. A hand computation follows below.
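A sketch computing r_k, its standard error, and t_{r_k} directly from the formulas above (sac_with_t is a hypothetical helper name; the white noise input and seed are arbitrary):

```python
import numpy as np

def sac_with_t(z, max_lag):
    """Sample autocorrelations r_k, Bartlett standard errors, and t statistics."""
    z = np.asarray(z, dtype=float)
    n, zbar = len(z), z.mean()        # n here is n - b + 1 in the slides' notation
    denom = np.sum((z - zbar) ** 2)
    rs, ts = [], []
    for k in range(1, max_lag + 1):
        r_k = np.sum((z[:-k] - zbar) * (z[k:] - zbar)) / denom
        s_k = np.sqrt((1 + 2 * np.sum(np.square(rs))) / n)  # uses r_1 .. r_{k-1}
        rs.append(r_k)
        ts.append(r_k / s_k)
    return np.array(rs), np.array(ts)

rng = np.random.default_rng(4)
z = rng.standard_normal(200)          # white noise: no |t| should exceed 2 (roughly)
r, t = sac_with_t(z, 5)
print(np.round(r, 2), np.round(t, 2))
```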
Sample Autocorrelation Function (SAC) (IV)
(2) The SAC dies down if this function does not cut off but rather decreases in a steady fashion. The SAC can die down in
(i) a damped exponential fashion,
(ii) a damped sine-wave fashion, or
(iii) a fashion dominated by one of, or a combination of, (i) and (ii).
The SAC can die down fairly quickly or extremely slowly.
Sample Autocorrelation Function (SAC) (V)
• The time series values z_b, z_{b+1}, …, z_n should be considered stationary if the SAC of the time series values either cuts off fairly quickly or dies down fairly quickly.
• However, if the SAC of the time series values z_b, z_{b+1}, …, z_n dies down extremely slowly, then the time series values should be considered non-stationary.
Sample Partial Autocorrelation Function (SPAC) (I)
• The sample partial autocorrelation at lag k is
r_11 = r_1 and, for k ≥ 2,
r_kk = (r_k - Σ_{j=1}^{k-1} r_{k-1,j} r_{k-j}) / (1 - Σ_{j=1}^{k-1} r_{k-1,j} r_j)
where
r_{kj} = r_{k-1,j} - r_kk · r_{k-1,k-j} for j = 1, 2, …, k-1.
A code sketch of this recursion follows below.
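A minimal implementation of the recursion above (spac is a hypothetical helper name; as a check, feeding it the theoretical AR(1) autocorrelations r_k = φ_1^k should produce a PACF that cuts off after lag 1):

```python
import numpy as np

def spac(r):
    """Partial autocorrelations r_kk from autocorrelations r_1..r_K
    via the recursion on the slide (Durbin-Levinson)."""
    K = len(r)
    pacf = np.zeros(K)
    prev = np.zeros(K)                 # prev[j-1] holds r_{k-1,j}
    for k in range(1, K + 1):
        if k == 1:
            pacf[0] = r[0]
        else:
            num = r[k - 1] - np.sum(prev[: k - 1] * r[k - 2 :: -1])
            den = 1 - np.sum(prev[: k - 1] * r[: k - 1])
            pacf[k - 1] = num / den
        cur = prev.copy()
        cur[k - 1] = pacf[k - 1]       # r_kk
        for j in range(1, k):          # r_kj = r_{k-1,j} - r_kk * r_{k-1,k-j}
            cur[j - 1] = prev[j - 1] - pacf[k - 1] * prev[k - j - 1]
        prev = cur
    return pacf

phi = 0.7
r = phi ** np.arange(1, 6)             # theoretical AR(1) autocorrelations as input
print(np.round(spac(r), 4))            # [0.7, 0, 0, 0, 0]: cuts off after lag 1
```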
Sample Partial Autocorrelation Function (SPAC) (II)
• r_kk may intuitively be thought of as the sample autocorrelation of time series observations separated by a lag of k time units, with the effects of the intervening observations eliminated.
• The standard error of r_kk is
s_{r_kk} = √[1 / (n - b + 1)]
• The t_{r_kk} statistic is
t_{r_kk} = r_kk / s_{r_kk}
Sample Partial Autocorrelation Function (SPAC) (III)
• The behaviors of the SPAC are similar to those of the SAC. The only difference is that r_kk is considered to be statistically large if
|t_{r_kk}| > 2
for any k.
Sample Partial Autocorrelation Function (SPAC) (IV)
• The behaviors of the SAC and the SPAC of the time series data help to tentatively identify a Box-Jenkins model.
• Each Box-Jenkins model is characterized by its theoretical autocorrelation (TAC) function and its theoretical partial autocorrelation (TPAC) function.
Parameter Estimation
• Given n observations y_1, y_2, …, y_n, the likelihood function L is defined to be the probability (density) of obtaining the data actually observed.
• For non-seasonal Box-Jenkins models, L will be a function of δ, the φ's, the θ's, and σ_e² given y_1, y_2, …, y_n.
• The maximum likelihood estimators (m.l.e.) are those values of the parameters for which the data actually observed are most likely, that is, the values that maximize the likelihood function L. A fitting sketch follows below.
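A sketch of maximum likelihood estimation, assuming statsmodels' ARIMA class (an illustrative tool choice; the simulated AR(1) data are arbitrary):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(5)
y = ArmaProcess(ar=[1, -0.7], ma=[1]).generate_sample(300, distrvs=rng.standard_normal)

res = ARIMA(y, order=(1, 0, 0)).fit()   # MLE for an AR(1) with a constant
print(res.params)                       # constant, phi_1, sigma_e^2
print("log-likelihood:", round(res.llf, 2))
```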
Diagnostic Checking • Often it is not straightforward to determine a single model that most adequately represents the data generating process. The suggested tests include (1) residual analysis, (2) overfitting, (3) model selection criteria.
Residual Analysis
• If an ARMA(p,q) model is an adequate representation of the data generating process, then the residuals should be uncorrelated.
• Use the Box-Pierce statistic
Q = n′ Σ_{k=1}^{K} r_k²(ê)
• or the Ljung-Box-Pierce statistic
Q* = n′(n′ + 2) Σ_{k=1}^{K} r_k²(ê) / (n′ - k)
where n′ is the number of residuals and r_k(ê) is the lag-k sample autocorrelation of the residuals. Under an adequate model, both statistics are approximately χ² with K - p - q degrees of freedom. A usage sketch follows below.
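A sketch of the Ljung-Box check, assuming statsmodels' acorr_ljungbox (the fitted model and data are the illustrative AR(1) from before):

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(6)
y = ArmaProcess(ar=[1, -0.7], ma=[1]).generate_sample(300, distrvs=rng.standard_normal)
res = ARIMA(y, order=(1, 0, 0)).fit()

# Ljung-Box test on the residuals; a large p-value is consistent with white noise.
print(acorr_ljungbox(res.resid, lags=[10]))
```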
Overfitting
• If an ARMA(p,q) model is specified, then we could estimate an ARMA(p+1,q) or an ARMA(p,q+1) process.
• Then we check the significance of the additional parameters (but be aware of multicollinearity problems).
Model Selection Criteria
• Akaike Information Criterion (AIC)
AIC = -2 ln(L) + 2k
• Schwarz Bayesian Criterion (SBC)
SBC = -2 ln(L) + k ln(n)
where L = likelihood function, k = number of parameters to be estimated, and n = number of observations.
• Ideally, the AIC and SBC will be as small as possible. A comparison sketch follows below.
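A sketch comparing candidate orders by AIC and SBC (statsmodels reports SBC under the name bic; the candidate set and data are illustrative):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(7)
y = ArmaProcess(ar=[1, -0.7], ma=[1]).generate_sample(300, distrvs=rng.standard_normal)

# Both criteria should favor the true AR(1) specification here.
for order in [(1, 0, 0), (2, 0, 0), (0, 0, 1), (1, 0, 1)]:
    res = ARIMA(y, order=order).fit()
    print(order, "AIC:", round(res.aic, 1), "SBC:", round(res.bic, 1))
```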
Forecasting
• Given the stationary series z_b, z_{b+1}, …, z_t, we would like to forecast the value z_{t+l}.
ẑ_t(l) = l-step-ahead forecast of z_{t+l} made at time t
e_t(l) = l-step-ahead forecast error = z_{t+l} - ẑ_t(l)
• The l-step-ahead forecast is derived as the minimum mean square error forecast and is given by
ẑ_t(l) = E(z_{t+l} | z_b, z_{b+1}, …, z_t)
Forecasting with AR(1) model (I)
• The AR(1) time series model is
z_t = δ + φ_1 z_{t-1} + e_t
where e_t ~ N(0, σ_e²).
• 1-step-ahead point forecast:
ẑ_t(1) = E(z_{t+1} | z_b, …, z_t) = δ + φ_1 z_t
Forecasting with AR(1) model (II)
• Recall that e_{t+1} is independent of z_b, z_{b+1}, …, z_t and has a zero mean, so E(e_{t+1} | z_b, …, z_t) = 0.
• The forecast error is
e_t(1) = z_{t+1} - ẑ_t(1) = e_{t+1}
• Then the variance of the forecast error is
Var[e_t(1)] = σ_e²
Forecasting with AR(1) model (III)
• 2-step-ahead point forecast:
ẑ_t(2) = δ + φ_1 ẑ_t(1)
• The forecast error is
e_t(2) = z_{t+2} - ẑ_t(2) = e_{t+2} + φ_1 e_{t+1}
• The forecast error variance is
Var[e_t(2)] = σ_e²(1 + φ_1²)
Forecasting with AR(1) model (IV)
• l-step-ahead point forecast:
ẑ_t(l) = δ + φ_1 ẑ_t(l-1)
• The forecast error is
e_t(l) = Σ_{j=0}^{l-1} φ_1^j e_{t+l-j}
• The forecast error variance is
Var[e_t(l)] = σ_e² Σ_{j=0}^{l-1} φ_1^{2j}
A recursion sketch follows below.
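A sketch of the forecast recursion and error variance above (ar1_forecast is a hypothetical helper; all parameter values are illustrative):

```python
import numpy as np

def ar1_forecast(z_t, delta, phi, sigma2, steps):
    """Recursive AR(1) point forecasts and forecast error variances."""
    fc, var = [], []
    prev = z_t
    for l in range(1, steps + 1):
        prev = delta + phi * prev                          # z_hat_t(l) = delta + phi z_hat_t(l-1)
        fc.append(prev)
        var.append(sigma2 * sum(phi ** (2 * j) for j in range(l)))
    return np.array(fc), np.array(var)

fc, var = ar1_forecast(z_t=1.5, delta=0.2, phi=0.7, sigma2=1.0, steps=4)
print(np.round(fc, 3))   # forecasts converge toward the mean delta/(1 - phi)
print(np.round(var, 3))  # variances converge toward sigma2/(1 - phi^2)
```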
Forecasting with MA(1) model (I)
• The MA(1) model is
z_t = δ + e_t - θ_1 e_{t-1}
where e_t ~ N(0, σ_e²).
• l-step-ahead point forecast:
ẑ_t(1) = δ - θ_1 ê_t and ẑ_t(l) = δ for l ≥ 2,
where ê_t is the residual at the forecast origin t.
Forecasting with MA(1) model (II)
• The forecast error is
e_t(1) = e_{t+1} and e_t(l) = e_{t+l} - θ_1 e_{t+l-1} for l ≥ 2
• The variance of the forecast error is
Var[e_t(1)] = σ_e² and Var[e_t(l)] = σ_e²(1 + θ_1²) for l ≥ 2
A forecasting sketch follows below.
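A sketch confirming this behavior with a fitted MA(1), assuming statsmodels' get_forecast interface (the data and θ value are illustrative; recall statsmodels' MA sign convention is '+'):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(8)
# MA(1) with theta_1 = 0.6 in the slides' convention
y = ArmaProcess(ar=[1], ma=[1, -0.6]).generate_sample(300, distrvs=rng.standard_normal)

res = ARIMA(y, order=(0, 0, 1)).fit()
f = res.get_forecast(steps=4)
print(np.round(f.predicted_mean, 3))  # reverts to the estimated mean after lag 1
print(np.round(f.se_mean ** 2, 3))    # variance rises to sigma^2(1 + theta^2), then flat
```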