Time Series Analysis by Means of Inferential Statistical Methods
R. Werner, Solar Terrestrial Influences Institute - BAS
Inferential statistical analysis of the time series. Now: measures of the significance of extrapolated trends and of causal relations between two variables.
Cross-section analysis: Y is a realization of a stochastic process; for example, the errors must have a given probability distribution.
Time series analysis: a prognosis for y_{t+1}; on this basis the influences of exogenous parameters can be investigated.
A model that describes the probability structure of a series is called a stochastic process. The model includes assumptions about the mechanisms generating the observed time series. A general assumption is stationarity (here: weak stationarity).
4a) Autocovariances with a lag greater than k are assumed to be zero → moving-average (MA) models
4b) Autocovariances of higher order can be calculated from autocovariances of lower order → autoregressive (AR) models
Autoregressive (AR) models of order p:
z_t = φ1 z_{t-1} + φ2 z_{t-2} + … + φp z_{t-p} + a_t
where the error term a_t is white noise.
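As an illustration (not part of the original slides), an AR(p) process can be simulated directly from this recursion; the function below is a minimal pure-Python sketch, with a burn-in period so that the returned sample is approximately stationary.

```python
import random

def simulate_ar(phi, n, seed=0, burn_in=200):
    """Simulate an AR(p) process z_t = phi[0]*z_{t-1} + ... + phi[p-1]*z_{t-p} + a_t,
    where the shocks a_t are Gaussian white noise."""
    rng = random.Random(seed)
    p = len(phi)
    z = [0.0] * p                        # start values; forgotten after the burn-in
    for _ in range(burn_in + n):
        a_t = rng.gauss(0.0, 1.0)        # white-noise error term
        z.append(sum(phi[j] * z[-1 - j] for j in range(p)) + a_t)
    return z[-n:]                        # discard burn-in and start values

# e.g. the quasi-cyclic AR(2) process used later in the slides:
series = simulate_ar([1.7, -0.95], n=500)
```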
[Figure: theoretical autocorrelation functions (ACF) for AR(1) processes]
Yule-Walker equations (multiply the AR equation by z_{t-k} and take expectations):
AR(1): ρ1 = φ1
AR(2): ρ1 = φ1 + φ2 ρ1,  ρ2 = φ1 ρ1 + φ2
AR(p): ρk = φ1 ρ_{k-1} + φ2 ρ_{k-2} + … + φp ρ_{k-p},  k = 1, 2, …
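As a sketch of how these equations are used (function names are my own, not from the slides): for an AR(2) model the two Yule-Walker equations can be solved for (ρ1, ρ2) given (φ1, φ2), and vice versa.

```python
def ar2_acf_from_phi(phi1, phi2):
    """First two autocorrelations of a stationary AR(2), from the Yule-Walker
    equations rho1 = phi1 + phi2*rho1 and rho2 = phi1*rho1 + phi2."""
    rho1 = phi1 / (1.0 - phi2)
    rho2 = phi1 * rho1 + phi2
    return rho1, rho2

def ar2_phi_from_acf(rho1, rho2):
    """Solve the same two equations in the other direction for (phi1, phi2)."""
    denom = 1.0 - rho1 ** 2
    return rho1 * (1.0 - rho2) / denom, (rho2 - rho1 ** 2) / denom
```

The round trip reproduces the original coefficients, which is exactly how the empirical ACF is turned into AR parameter estimates.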
Conditions of stationarity for the AR(2) model (a triangle in the (φ1, φ2) plane):
φ1 + φ2 < 1,  φ2 - φ1 < 1,  |φ2| < 1
In the area under the parabola φ1² + 4φ2 = 0 (complex characteristic roots) the AR(2) model describes a quasi-cyclic process.
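These conditions are easy to check mechanically; the helper below (my own illustration, not from the slides) classifies a pair (φ1, φ2).

```python
def classify_ar2(phi1, phi2):
    """Return (stationary, quasi_cyclic) for an AR(2) model.
    Stationarity: phi1 + phi2 < 1, phi2 - phi1 < 1, |phi2| < 1 (the triangle).
    Quasi-cyclic: complex characteristic roots, i.e. phi1**2 + 4*phi2 < 0."""
    stationary = (phi1 + phi2 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)
    quasi_cyclic = phi1 ** 2 + 4 * phi2 < 0
    return stationary, quasi_cyclic
```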
[Figure: simulated realizations z(1) and z(2) plotted against time]
A tool for model identification is the partial autocorrelation function (PACF), known from cross-section statistics. The coefficients are calculated stepwise from the Yule-Walker equations:
k=1: φ11 = ρ1
k=2: φ22 = (ρ2 - ρ1²) / (1 - ρ1²)
The theoretical PACF of an AR(p) process has values different from zero only for k = 1, 2, …, p!
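This stepwise solution of the Yule-Walker equations is the Durbin-Levinson recursion; a compact sketch (my own, assuming the theoretical ACF is given as a list rho[0..K] with rho[0] = 1):

```python
def pacf_from_acf(rho):
    """Partial autocorrelations phi_kk, k = 1..K, computed stepwise from the
    autocorrelations rho[0..K] (rho[0] == 1) via the Durbin-Levinson recursion."""
    kmax = len(rho) - 1
    pacf = [rho[1]]                       # k = 1: phi_11 = rho_1
    phi = {1: rho[1]}                     # phi_{k-1,j} from the previous step
    for k in range(2, kmax + 1):
        num = rho[k] - sum(phi[j] * rho[k - j] for j in range(1, k))
        den = 1.0 - sum(phi[j] * rho[j] for j in range(1, k))
        phi_kk = num / den
        new_phi = {j: phi[j] - phi_kk * phi[k - j] for j in range(1, k)}
        new_phi[k] = phi_kk
        phi = new_phi
        pacf.append(phi_kk)
    return pacf                           # pacf[k-1] = phi_kk
```

For the exact ACF of an AR(2) process this yields φ11 = ρ1, φ22 = φ2, and φkk = 0 for k > 2, illustrating the cut-off property stated above.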
[Figures: theoretical ACF and PACF for an AR(2) process with φ1 = 1.7, φ2 = -0.95]
Parameter estimation and analysis of the residuals
[Figures: distribution of the residuals; autocorrelation function of the residuals]
Moving-average (MA) models
AR models describe the process as a function of past z values; however, as was shown for the AR(2) process
z_t = 1.7 z_{t-1} - 0.95 z_{t-2} + a_t,
the process is driven by the noise a_t (with a theoretically infinite influence of past shocks). The idea now is, analogously to the AR process, to express the process by a finite series of the a_t with time lags, using a minimal number of parameters:
z_t = a_t - θ1 a_{t-1} - θ2 a_{t-2} - … - θq a_{t-q}
Autocorrelation of an MA(1) process:
ρ1 = -θ1 / (1 + θ1²),  ρk = 0 for k > 1
and of an MA(2) process:
ρ1 = (-θ1 + θ1 θ2) / (1 + θ1² + θ2²),  ρ2 = -θ2 / (1 + θ1² + θ2²),  ρk = 0 for k > 2
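These values follow from the general MA(q) autocovariance formula; a small sketch (my own notation, with the convention z_t = a_t - θ1 a_{t-1} - … - θq a_{t-q}):

```python
def ma_acf(thetas, kmax):
    """Theoretical ACF rho_1..rho_kmax of an MA(q) process
    z_t = a_t - thetas[0]*a_{t-1} - ... - thetas[q-1]*a_{t-q}."""
    psi = [1.0] + [-th for th in thetas]      # MA weights psi_0..psi_q
    gamma0 = sum(p * p for p in psi)          # variance in units of sigma_a**2
    rho = []
    for k in range(1, kmax + 1):
        gk = sum(psi[j] * psi[j + k] for j in range(len(psi) - k))
        rho.append(gk / gamma0)               # zero once k exceeds q: the cut-off
    return rho
```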
Invertibility conditions for the MA(2) model (a triangle in the (θ1, θ2) plane):
θ1 + θ2 < 1,  θ2 - θ1 < 1,  |θ2| < 1
In the area under the parabola θ1² + 4θ2 = 0 the MA(2) model describes a quasi-cyclic process.
The empirical ACF is a tool for identification of the MA order. And the PACF?
Invertibility condition: |θ1| < 1. For an MA(1) process z_t = a_t - θ1 a_{t-1} we then have
a_t = z_t + θ1 z_{t-1} + θ1² z_{t-2} + …
The MA(1) process can thus be represented by an AR(∞) process. In general, an MA(q) process can be represented by an AR(∞) process, and an AR(p) process can be represented by an MA(∞) process.
Box-Jenkins principle (parsimony): models with a minimum number of parameters have to be used.
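The invertibility of MA(1) can be demonstrated numerically: starting the AR(∞) recursion a_t = z_t + θ1 a_{t-1} from an arbitrary initial value, the start-up error decays like θ1^t. A sketch with my own function names:

```python
import random

def simulate_ma1(theta, n, seed=1):
    """MA(1): z_t = a_t - theta*a_{t-1}; returns the series and the true shocks."""
    rng = random.Random(seed)
    a = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]
    z = [a[t] - theta * a[t - 1] for t in range(1, n + 1)]
    return z, a[1:]

def recover_shocks(z, theta):
    """Invert the MA(1) via a_t = z_t + theta*a_{t-1}, starting from a_0 = 0.
    For |theta| < 1 the initial error theta**t * a_0 dies out."""
    a_hat, prev = [], 0.0
    for z_t in z:
        prev = z_t + theta * prev
        a_hat.append(prev)
    return a_hat
```

After a few hundred steps the recovered shocks agree with the true ones to machine precision, which is precisely what invertibility guarantees.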
Other models:
ARMA: mixed model with AR and MA terms
ARIMA: autoregressive integrated moving-average model; it uses differences of the time series
SARIMA: seasonal ARIMA, a model with a constant seasonal figure
VARMA: vector ARMA
Forecast for AR(1):
ẑ_t(l) = φ1^l z_t,  l = 1, 2, …
(the forecast decays geometrically towards the process mean)
For MA(1) it can be shown that
ẑ_t(1) = -θ1 a_t  and  ẑ_t(l) = 0 for l > 1,
so the MA models are not useful for a prognosis more than one step ahead.
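Both forecast rules fit in a few lines; a minimal sketch (function names are my own, not from the slides):

```python
def ar1_forecast(z_t, phi, max_lead):
    """AR(1) forecasts z_hat_t(l) = phi**l * z_t for l = 1..max_lead,
    decaying geometrically to the mean 0."""
    return [phi ** l * z_t for l in range(1, max_lead + 1)]

def ma1_forecast(a_t, theta, max_lead):
    """MA(1) forecasts: -theta*a_t one step ahead, then the process mean 0."""
    return [-theta * a_t] + [0.0] * (max_lead - 1)
```

The comparison makes the slide's point concrete: the AR(1) forecast carries information to every horizon, while the MA(1) forecast collapses to the mean after one step.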
Dynamical regression
Ordinary linear regression: Y_i = α + β X_i + ε_i (X_i may be transformed). Y_i is normally distributed; X_i need not be stochastic (it can, for example, be fixed). α and β can be optimally estimated by ordinary least squares (OLS) under the assumptions:
• E(ε_i) = 0
• ε_i is not autocorrelated: Cov(ε_i, ε_j) = 0 for i ≠ j
• ε_i is normally distributed
• equilibrium conditions hold
For time series the model can formally be written in the same way (i → t):
• The assumption of equilibrium is not necessary.
• However, in time series the error term is often autocorrelated. Then:
• The estimates are not efficient (they do not have minimal variance).
• Autocorrelations of X_t can be transferred to ε; autocorrelations of ε produce deviations of σ_ε from the true value, and this in turn implies an incorrect value of σ_β.
γ: autocorrelation of the residuals; λ: autocorrelation of the predictors
Simple lag model (models dynamical in X):
Y_t = α + β X_{t-k} + ε_t
Distributed lag model: the influence is distributed over k lags,
Y_t = α + β0 X_t + β1 X_{t-1} + … + βk X_{t-k} + ε_t
for example k = 2. Because the lagged X values are strongly correlated, the statistical interpretation of the individual β does not make sense.
Therefore more restrictions are needed. For the model where the influence decreases exponentially with k,
βk = β0 δ^k,  0 < δ < 1,
the model has only three parameters: α, β0, δ.
How to determine the parameters? Koyck transformation: subtracting δ Y_{t-1} from Y_t gives
Y_t = α(1 - δ) + β0 X_t + δ Y_{t-1} + v_t,  where v_t = ε_t - δ ε_{t-1}.
Using OLS, δ and β0 can be estimated, and thereafter the βk.
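The equivalence behind the Koyck transformation can be verified numerically in the noise-free case: build Y_t with the geometric-lag recursion and check that Y_t - δ Y_{t-1} = α(1 - δ) + β0 X_t. A sketch with my own function name:

```python
def koyck_series(x, alpha, beta0, delta):
    """Y_t = alpha + beta0 * sum_k delta**k * x_{t-k} (noise-free),
    built with the running geometric sum s_t = x_t + delta*s_{t-1}."""
    y, s = [], 0.0
    for x_t in x:
        s = x_t + delta * s
        y.append(alpha + beta0 * s)
    return y
```

For every t ≥ 1, y[t] - δ·y[t-1] equals α(1-δ) + β0·x[t] exactly, which is the relation the Koyck-transformed regression estimates.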
Similar models:
adaptive expectation model
partial adjustment model
models with two or more input variables
Model with autocorrelated error term
Remember that ε_t in the linear regression has to be N(0, σ²). Here ε_t is an AR(1) process. Estimation of the regression coefficients by the Cochrane/Orcutt method:
1. Estimate α and β by OLS, calculate the residuals e_t, and estimate the autocorrelation coefficient of the residuals.
2. Fit the new regression equation
Y*_t = α* + β X*_t + u_t,  where Y*_t = Y_t - ρ̂ Y_{t-1}, X*_t = X_t - ρ̂ X_{t-1}, α* = α(1 - ρ̂)
Note: to test whether ε_t is autocorrelated, the Durbin-Watson test can be applied.
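A compact sketch of the two Cochrane/Orcutt steps (plain OLS in pure Python; the function names and the simulated example are my own, not from the slides):

```python
import random

def ols(x, y):
    """OLS for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def cochrane_orcutt_step(x, y):
    """One Cochrane-Orcutt iteration: estimate rho from the OLS residuals,
    then re-run OLS on the quasi-differenced variables."""
    a, b = ols(x, y)
    e = [yi - a - b * xi for xi, yi in zip(x, y)]
    rho = sum(e[t] * e[t - 1] for t in range(1, len(e))) / sum(et ** 2 for et in e)
    x_star = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
    y_star = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
    a_star, b_star = ols(x_star, y_star)
    return a_star / (1.0 - rho), b_star, rho   # alpha recovered from alpha* = alpha*(1 - rho)
```

On data simulated with an AR(1) error term, a single such step already recovers β and ρ quite accurately; in practice the step is iterated until ρ̂ converges.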
[Figure: partial autocorrelation function of the detrended residuals]
Acknowledgement
I want to thank the Ministry of Education and Science for supporting this work under contract DVU01/0120.