
MODEL IDENTIFICATION

MODEL IDENTIFICATION. Stationary Time Series. Wold's Theorem.





Presentation Transcript


  1. MODEL IDENTIFICATION

  2. Stationary Time Series

  3. Wold's Theorem • Wold's decomposition theorem states that any stationary time series process with no deterministic components has an infinite moving average (MA) representation, which in turn can be well approximated by a finite autoregressive moving average (ARMA) process. • Therefore, by examining the first- and second-order moments, we can identify a stationary process.
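The decomposition this slide refers to (its formula was an image and did not survive the transcript) has the standard form:

```latex
% Wold decomposition of a stationary process with mean mu:
% a_t is white noise and the psi-weights are square-summable.
Y_t = \mu + \sum_{j=0}^{\infty} \psi_j\, a_{t-j},
\qquad \psi_0 = 1, \qquad \sum_{j=0}^{\infty} \psi_j^2 < \infty
```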

  4. Non-Stationary Time Series

  5. NON-STATIONARY TIME SERIES MODELS • Non-constant in mean • Non-constant in variance • Both

  6. NON-STATIONARITY IN MEAN • Deterministic trend – removed by detrending • Stochastic trend – removed by differencing

  7. DETERMINISTIC TREND • A series has a deterministic trend when it trends because it is an explicit function of time. • Using a simple linear trend model, the deterministic (global) trend can be estimated. This approach is very simple and assumes that the pattern represented by the linear trend remains fixed over the observed time span of the series. A simple linear trend model:
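The model the slide points to (the formula itself was lost with the slide image) is presumably the standard one:

```latex
% Linear (deterministic) trend model: beta_0 is the intercept,
% beta_1 the slope, and epsilon_t a stationary noise term.
Y_t = \beta_0 + \beta_1 t + \varepsilon_t
```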

  8. DETERMINISTIC TREND • The parameter β1 measures the average change in Yt from one period to the next: E(Yt − Yt−1) = β1. • The sequence {Yt} will exhibit only temporary departures from the trend line β0 + β1t. This type of model is called a trend stationary (TS) model.

  9. EXAMPLE

  10. TREND STATIONARY • If a series has a deterministic time trend, then we simply regress Yt on an intercept and a time trend (t = 1, 2, …, n) and save the residuals. The residuals form the detrended series. If Yt is trend stationary, we will get a stationary residual series. If the trend in Yt is stochastic, we do not necessarily get a stationary series.

  11. DETERMINISTIC TREND • Many economic series exhibit “exponential trend/growth”: they grow over time like an exponential function rather than a linear one. • For such series, we want to work with the log of the series:
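The transformed model the slide refers to (not preserved in the transcript) is presumably:

```latex
% Taking logs turns exponential growth into a linear trend;
% beta_1 is then approximately the per-period growth rate.
\ln Y_t = \beta_0 + \beta_1 t + \varepsilon_t
```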

  12. STOCHASTIC TREND • An ARIMA model with differencing order d ≥ 1 (that is, a series with at least one unit root) is the typical model for a time series with a stochastic trend. • Such a series is also called difference stationary, in contrast to trend stationary.

  13. STOCHASTIC TREND • Recall the AR(1) model: Yt = c + φYt−1 + at. • As long as |φ| < 1, it is stationary and everything is fine (OLS is consistent, t-stats are asymptotically normal, …). • Now consider the extreme case where φ = 1, i.e. Yt = c + Yt−1 + at. • Where is the trend? There is no explicit t term.

  14. STOCHASTIC TREND • Let us substitute recursively for the lagged Yt on the right-hand side; a deterministic trend term ct emerges. • This is what we call a “random walk with drift”. If c = 0, it is a “random walk”.
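The recursion the slide carries out (its derivation was an image) looks like this:

```latex
% Recursive substitution in Y_t = c + Y_{t-1} + a_t:
% the ct term is the deterministic (drift) component,
% the accumulated shocks form the stochastic trend.
Y_t = c + Y_{t-1} + a_t
    = 2c + Y_{t-2} + a_{t-1} + a_t
    = \cdots
    = Y_0 + c\,t + \sum_{i=1}^{t} a_i
```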

  15. STOCHASTIC TREND • Each ai shock represents a shift in the intercept. Since all values of {ai} have a coefficient of unity, the effect of each shock on the intercept term is permanent. • In the time series literature, such a sequence is said to have a stochastic trend, since each ai shock imparts a permanent and random change in the conditional mean of the series. To describe this situation, we use Autoregressive Integrated Moving Average (ARIMA) models.

  16. DETERMINISTIC VS STOCHASTIC TREND • They might appear similar, since both lead to growth over time, but they are quite different. • To see why, suppose that through some policy you got a bigger Yt because the noise at is big. What will happen next period? – With a deterministic trend, Yt+1 = c + β(t+1) + at+1. The noise at does not affect Yt+1: your policy had a one-period impact. – With a stochastic trend, Yt+1 = c + Yt + at+1 = c + (c + Yt−1 + at) + at+1. The noise at does affect Yt+1. In fact, the policy will have a permanent impact.

  17. DETERMINISTIC VS STOCHASTIC TREND Conclusions: – When dealing with trending series, we are always interested in knowing whether the growth is a deterministic or a stochastic trend. – There are also economic time series that do not grow over time (e.g., interest rates), but we still need to check whether they behave “similarly” to stochastic trends (φ = 1 instead of |φ| < 1, while c = 0). – A deterministic trend refers to the long-term trend that is not affected by short-term fluctuations in the series. Some shocks, however, are random and may have a permanent effect on the trend. In that case the trend contains both a deterministic and a stochastic component.

  18. DETERMINISTIC TREND EXAMPLE Simulate data from, say, an AR(1) process:
> x = arima.sim(list(order = c(1,0,0), ar = 0.6), n = 100)
Simulate data with a deterministic trend:
> y = 2 + time(x)*2 + x
> plot(y)

  19. DETERMINISTIC TREND EXAMPLE
> reg = lm(y ~ time(y))
> summary(reg)
Call:
lm(formula = y ~ time(y))
Residuals:
     Min       1Q   Median       3Q      Max
-2.74091 -0.77746 -0.09465  0.83162  3.27567
Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) 2.179968   0.250772   8.693 8.25e-14 ***
time(y)     1.995380   0.004311 462.839  < 2e-16 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 1.244 on 98 degrees of freedom
Multiple R-squared: 0.9995, Adjusted R-squared: 0.9995
F-statistic: 2.142e+05 on 1 and 98 DF, p-value: < 2.2e-16

  20. DETERMINISTIC TREND EXAMPLE > plot(y=rstudent(reg),x=as.vector(time(y)), ylab='Standardized Residuals',xlab='Time',type='o')

  21. DETERMINISTIC TREND EXAMPLE
> z = rstudent(reg)
> par(mfrow = c(1,2))
> acf(z)
> pacf(z)
The ACF/PACF of the de-trended series show an AR(1) pattern.

  22. STOCHASTIC TREND EXAMPLE Simulate data from an ARIMA(0,1,1) process:
> x = arima.sim(list(order = c(0,1,1), ma = -0.7), n = 200)
> plot(x)
> acf(x)
> pacf(x)

  23. AUTOREGRESSIVE INTEGRATED MOVING AVERAGE (ARIMA) PROCESSES • Consider an ARIMA(p,d,q) process
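The general form of the process (the slide's equation was an image and is missing here) is presumably:

```latex
% ARIMA(p,d,q): phi_p(B) is the AR polynomial, theta_q(B) the MA
% polynomial, B the backshift operator, and d the differencing order.
\phi_p(B)\,(1 - B)^d\, Y_t = \theta_0 + \theta_q(B)\, a_t
```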

  24. ARIMA MODELS • When d = 0, θ0 is related to the mean of the process. • When d > 0, θ0 is a deterministic trend term. • Non-stationary in mean: first differencing (d = 1) is needed. • Non-stationary in level and slope: second differencing (d = 2) is needed.

  25. RANDOM WALK PROCESS • A random walk is a process where the current value of a variable is composed of its past value plus an error term defined as white noise (a normal variable with zero mean and constant variance). • It is the ARIMA(0,1,0) process: Yt = Yt−1 + at.

  26. RANDOM WALK PROCESS • Behavior of stock markets. • Brownian motion. • Movement of a drunken man. • It is the limiting case of an AR(1) process as φ → 1.

  27. RANDOM WALK PROCESS • The implication of a process of this type is that the best prediction of Y for the next period is the current value; in other words, the process does not allow us to predict the change (Yt − Yt−1). That is, the change in Y is absolutely random. • It can be shown that the mean of a random walk process is constant but its variance is not. Therefore a random walk process is non-stationary, and its variance increases with t. • In practice, the presence of a random walk makes forecasting very simple, since the forecast of every future value Yt+s, s > 0, is simply the current value Yt.
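The moments behind the second bullet, written out (assuming a fixed starting value Y_0 and shock variance σ_a²):

```latex
% From Y_t = Y_0 + sum_{i=1}^t a_i, the accumulated shocks give:
E(Y_t) = Y_0, \qquad \operatorname{Var}(Y_t) = t\,\sigma_a^2
% The variance grows linearly in t, so the process is non-stationary.
```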

  28. RANDOM WALK PROCESS

  29. RANDOM WALK PROCESS

  30. RANDOM WALK WITH DRIFT • The change in Yt is partially deterministic and partially stochastic: Yt = θ0 + Yt−1 + at, so ∇Yt = θ0 + at. • It can also be written as a pure model of a trend (no stationary component).

  31. RANDOM WALK WITH DRIFT After t periods, the cumulative change in Yt is tθ0. Each ai shock has a permanent effect on the mean of Yt.
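Spelled out (the slide's derivation image is missing), recursive substitution gives:

```latex
% Random walk with drift theta_0: the mean follows the line Y_0 + theta_0 t,
% while each shock a_i enters with coefficient one and is never discounted.
Y_t = \theta_0 + Y_{t-1} + a_t
    = Y_0 + \theta_0\, t + \sum_{i=1}^{t} a_i,
\qquad E(Y_t) = Y_0 + \theta_0\, t
```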

  32. RANDOM WALK WITH DRIFT

  33. ARIMA(0,1,1) OR IMA(1,1) PROCESS • Consider the process Yt = Yt−1 + at − θat−1. • Letting Wt = Yt − Yt−1, the differenced series Wt follows an MA(1) process.

  34. ARIMA(0,1,1) OR IMA(1,1) PROCESS • Characterized by the sample ACF of the original series failing to die out, while the sample ACF of the first-differenced series shows the pattern of an MA(1). • In its inverted (AR(∞)) form, the π weights decrease exponentially, so Yt is a weighted MA of its past values.

  35. ARIMA(0,1,1) OR IMA(1,1) PROCESS where α = 1 − θ is the smoothing constant in the method of exponential smoothing.
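The inverted representation behind these two slides, reconstructed since the slide formulas did not survive the transcript:

```latex
% Inverting (1 - B)Y_t = (1 - theta B)a_t gives the AR(infinity) form:
% Y_t is an exponentially weighted moving average of its own past,
% with alpha = 1 - theta the exponential-smoothing constant.
Y_t = (1-\theta)\sum_{j=1}^{\infty} \theta^{\,j-1}\, Y_{t-j} + a_t,
\qquad \alpha = 1 - \theta
```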

  36. REMOVING THE TREND • A series containing a trend will not revert to a long-run mean. The usual methods for eliminating the trend are detrending and differencing.

  37. DETRENDING • Detrending is used to remove deterministic trend. • Regress Yt on time and save the residuals. • Then, check whether residuals are stationary.

  38. DIFFERENCING • Differencing is used to remove a stochastic trend. • The d-th difference of an ARIMA(p,d,q) model is stationary. A series containing unit roots can be made stationary by differencing. • ARIMA(p,d,q) ⇒ d unit roots ⇒ integrated of order d, I(d).

  39. DIFFERENCING • Random walk: Yt = Yt−1 + at is non-stationary; its first difference ∇Yt = Yt − Yt−1 = at is stationary white noise.

  40. KPSS TEST • To test for a deterministic trend vs a stochastic trend, we use the KPSS (Kwiatkowski, Phillips, Schmidt and Shin, 1992) test.

  41. KPSS TEST STEP 1: Regress Yt on a constant and trend and construct the OLS residuals e = (e1, e2, …, en)′. STEP 2: Obtain the partial sums of the residuals. STEP 3: Obtain the test statistic, where the denominator is an estimate of the long-run variance of the residuals.
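The quantities from Steps 2–3 (the formulas on the slide were images) are, in their standard form:

```latex
% Partial sums of the OLS residuals, and the KPSS statistic;
% \hat{\lambda}^2 is a (HAC) estimate of the long-run variance.
S_t = \sum_{i=1}^{t} e_i, \qquad
\mathrm{KPSS} = \frac{1}{n^2\,\hat{\lambda}^2} \sum_{t=1}^{n} S_t^2
```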

  42. KPSS TEST • STEP 4: Reject H0 when the KPSS statistic is large, because that is evidence that the series wanders from its mean. • The asymptotic distribution of the test statistic is based on the standard Brownian bridge. • It is the most powerful unit root test, but if there is a volatility shift it cannot catch that type of non-stationarity.

  43. DETERMINISTIC TREND EXAMPLE
> kpss.test(x, null = c("Level"))
KPSS Test for Level Stationarity
data: x
KPSS Level = 3.4175, Truncation lag parameter = 2, p-value = 0.01
Warning message:
In kpss.test(x, null = c("Level")) : p-value smaller than printed p-value
> kpss.test(x, null = c("Trend"))
KPSS Test for Trend Stationarity
data: x
KPSS Trend = 0.0435, Truncation lag parameter = 2, p-value = 0.1
Warning message:
In kpss.test(x, null = c("Trend")) : p-value greater than printed p-value
Here we have a deterministic trend, i.e. a trend stationary process. Hence we need de-trending to obtain a stationary series.

  44. STOCHASTIC TREND EXAMPLE
> kpss.test(x, null = "Level")
KPSS Test for Level Stationarity
data: x
KPSS Level = 3.993, Truncation lag parameter = 3, p-value = 0.01
Warning message:
In kpss.test(x, null = "Level") : p-value smaller than printed p-value
> kpss.test(x, null = "Trend")
KPSS Test for Trend Stationarity
data: x
KPSS Trend = 0.6846, Truncation lag parameter = 3, p-value = 0.01
Warning message:
In kpss.test(x, null = "Trend") : p-value smaller than printed p-value
Here we have a stochastic trend, i.e. a difference stationary process. Hence we need differencing to obtain a stationary series.

  45. PROBLEM • When an inappropriate method is used to eliminate the trend, we may create other problems, such as non-invertibility. • E.g.:

  46. PROBLEM • If we misjudge a trend stationary series as difference stationary, we take a difference when detrending should actually have been applied. Taking the first difference then creates a non-invertible unit root process in the MA component.
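Concretely, reconstructing the slide's missing algebra for the linear-trend case:

```latex
% Differencing a trend-stationary series Y_t = beta_0 + beta_1 t + e_t:
(1-B)\, Y_t = \beta_1 + \varepsilon_t - \varepsilon_{t-1}
% The result is an MA(1) with theta = 1: a non-invertible unit root
% in the moving-average component.
```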

  47. NON-STATIONARITY IN VARIANCE • We will learn more about this in the future. • For now, we will only learn the variance stabilizing transformation.

  48. VARIANCE STABILIZING TRANSFORMATION • Generally, we use the power function
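The family referenced here (the formula was lost with the slide image) is presumably the Box-Cox power transformation:

```latex
% Box-Cox power transformation; lambda = 0 corresponds to the log.
T(Y_t) =
\begin{cases}
\dfrac{Y_t^{\lambda} - 1}{\lambda}, & \lambda \neq 0 \\[2ex]
\ln Y_t, & \lambda = 0
\end{cases}
```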

  49. VARIANCE STABILIZING TRANSFORMATION • The variance stabilizing transformation is only for positive series. If your series has negative values, add a positive constant to every value so that the whole series becomes positive; then you can assess the need for a transformation. • It should be performed before any other analysis, such as differencing. • It not only stabilizes the variance but also improves the approximation of the distribution by the normal distribution.

  50. Box-Cox TRANSFORMATION
> install.packages("TSA")
> library(TSA)
> oil = ts(read.table('c:/oil.txt', header = T), start = 1996, frequency = 12)
> BoxCox.ar(y = oil)
