STAT 497 LECTURE NOTES 8: ESTIMATION
ESTIMATION • After specifying the order of a stationary ARMA process, we need to estimate the parameters. • We will assume (for now) that: 1. The model order (p and q) is known, and 2. The data has zero mean. • If (2) is not a reasonable assumption, we can subtract the sample mean $\bar{Y}$, fit a zero-mean ARMA model to $X_t = Y_t - \bar{Y}$, and then use $X_t + \bar{Y}$ as the model for $Y_t$.
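As a minimal R sketch of this de-meaning step (assuming an observed series y):

ybar <- mean(y)                                             # sample mean
x <- y - ybar                                               # zero-mean series to be modeled
fit <- arima(x, order = c(1, 0, 0), include.mean = FALSE)   # e.g., a zero-mean AR(1)
predict(fit, n.ahead = 4)$pred + ybar                       # add the mean back for forecasts of Y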
ESTIMATION • Method of Moment Estimation (MME) • Ordinary Least Squares (OLS) Estimation • Maximum Likelihood Estimation (MLE) • Least Squares Estimation • Conditional • Unconditional
THE METHOD OF MOMENT ESTIMATION • It is also known as Yule-Walker estimation. It is easy to apply but not efficient, and it works well only for AR models when n is large. • BASIC IDEA: Equate sample moment(s) to population moment(s), and solve these equation(s) to obtain the estimator(s) of the unknown parameter(s).
THE METHOD OF MOMENT ESTIMATION • Let $\Gamma_n$ be the variance/covariance matrix of $X$ with the given parameter values. • Yule-Walker for AR(p): Regress $X_t$ onto $X_{t-1}, \dots, X_{t-p}$. • Equivalently, apply the Durbin-Levinson algorithm with $\gamma(\cdot)$ replaced by $\hat{\gamma}(\cdot)$. • Yule-Walker for ARMA(p,q): Method of moments. Not efficient.
THE YULE-WALKER ESTIMATION • For a stationary (causal) AR(p) process $X_t = \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + a_t$, multiplying both sides by $X_{t-k}$ and taking expectations gives the Yule-Walker equations $\rho(k) = \phi_1 \rho(k-1) + \dots + \phi_p \rho(k-p)$, $k = 1, \dots, p$.
THE YULE-WALKER ESTIMATION • To find the Yule-Walker estimators, we solve the sample versions $\hat{\Gamma}_p \hat{\phi} = \hat{\gamma}_p$ and $\hat{\sigma}_a^2 = \hat{\gamma}(0) - \hat{\phi}' \hat{\gamma}_p$. • These are forecasting equations. • We can use the Durbin-Levinson algorithm.
THE YULE-WALKER ESTIMATION • If $\hat{\gamma}(0) > 0$, the matrix $\hat{\Gamma}_p$ is nonsingular and the equations have a unique solution. • If {X_t} is an AR(p) process, the sample PACF satisfies $\hat{\phi}_{hh} \approx N(0, 1/n)$ for $h > p$. Hence, we can use the sample PACF to test for the AR order, and we can calculate approximate confidence intervals for the parameters.
THE YULE-WALKER ESTIMATION • If $X_t$ is an AR(p) process and n is large, $\sqrt{n}(\hat{\phi} - \phi) \approx N(0, \sigma_a^2 \Gamma_p^{-1})$. • A 100(1−α)% approximate confidence interval for $\phi_j$ is $\hat{\phi}_j \pm z_{\alpha/2}\, n^{-1/2} \big( \hat{\sigma}_a^2 [\hat{\Gamma}_p^{-1}]_{jj} \big)^{1/2}$.
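A minimal R sketch, assuming a simulated AR(2) series: ar.yw() computes the Yule-Walker estimates via the Durbin-Levinson recursion, and approximate 95% confidence intervals follow from the asymptotic covariance $\hat{\sigma}_a^2 \hat{\Gamma}_p^{-1}/n$:

set.seed(497)
x <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 500)    # true phi = (0.5, 0.3)
fit <- ar.yw(x, order.max = 2, aic = FALSE)                # Yule-Walker estimation
fit$ar                                                     # phi-hat
fit$var.pred                                               # sigma^2-hat
g <- acf(x, lag.max = 1, type = "covariance", plot = FALSE)$acf
Gamma <- matrix(c(g[1], g[2], g[2], g[1]), 2, 2)           # sample Gamma_2
se <- sqrt(diag(fit$var.pred * solve(Gamma)) / length(x))  # asymptotic std. errors
cbind(lower = fit$ar - 1.96 * se, upper = fit$ar + 1.96 * se)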
THE YULE-WALKER ESTIMATION • AR(1): $Y_t = \phi Y_{t-1} + a_t$. Find the MME of $\phi$. It is known that $\rho_1 = \phi$.
THE YULE-WALKER ESTIMATION • So, the MME of $\phi$ is $\hat{\phi} = r_1$. • Also, $\sigma_a^2$ is unknown. • Therefore, using the variance of the process, $\gamma_0 = \sigma_a^2/(1 - \phi^2)$, we can obtain the MME of $\sigma_a^2$: $\hat{\sigma}_a^2 = (1 - r_1^2)\,\hat{\gamma}_0$.
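A quick numerical check of these AR(1) formulas in R, sketched with a simulated series:

x <- arima.sim(model = list(ar = 0.7), n = 300)            # true phi = 0.7
r1 <- acf(x, plot = FALSE)$acf[2]                          # lag-1 sample autocorrelation
g0 <- acf(x, type = "covariance", plot = FALSE)$acf[1]     # gamma0-hat
phi.mme <- r1                                              # MME of phi
sigma2.mme <- (1 - r1^2) * g0                              # MME of sigma_a^2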
THE YULE-WALKER ESTIMATION • AR(2): $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + a_t$. Find the MME of all unknown parameters. • Using the Yule-Walker equations: $\rho_1 = \phi_1 + \phi_2 \rho_1$ and $\rho_2 = \phi_1 \rho_1 + \phi_2$.
THE YULE-WALKER ESTIMATION • So, equate the population autocorrelations to the sample autocorrelations ($\rho_1 = r_1$, $\rho_2 = r_2$) and solve for $\phi_1$ and $\phi_2$.
THE YULE-WALKER ESTIMATION • Using these, we can obtain the MMEs $\hat{\phi}_1 = \frac{r_1(1 - r_2)}{1 - r_1^2}$ and $\hat{\phi}_2 = \frac{r_2 - r_1^2}{1 - r_1^2}$. • To obtain the MME of $\sigma_a^2$, use the process variance formula.
THE YULE-WALKER ESTIMATION • AR(1): $\hat{\sigma}_a^2 = (1 - r_1^2)\,\hat{\gamma}_0$ • AR(2): $\hat{\sigma}_a^2 = (1 - \hat{\phi}_1 r_1 - \hat{\phi}_2 r_2)\,\hat{\gamma}_0$
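The AR(2) formulas can likewise be checked numerically; a sketch assuming a simulated series:

x <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 300)    # true phi = (0.5, 0.3)
r <- acf(x, plot = FALSE)$acf
r1 <- r[2]; r2 <- r[3]                                     # sample autocorrelations
phi1.mme <- r1 * (1 - r2) / (1 - r1^2)
phi2.mme <- (r2 - r1^2) / (1 - r1^2)
g0 <- acf(x, type = "covariance", plot = FALSE)$acf[1]     # gamma0-hat
sigma2.mme <- (1 - phi1.mme * r1 - phi2.mme * r2) * g0     # process variance formula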
THE YULE-WALKER ESTIMATION • MA(1): $Y_t = a_t - \theta a_{t-1}$, so $\rho_1 = \frac{-\theta}{1 + \theta^2}$. • Again using the autocorrelation of the series at lag 1, set $r_1 = \frac{-\hat{\theta}}{1 + \hat{\theta}^2}$ and solve the quadratic $r_1 \hat{\theta}^2 + \hat{\theta} + r_1 = 0$, which gives $\hat{\theta} = \frac{-1 \pm \sqrt{1 - 4 r_1^2}}{2 r_1}$. • Choose the root satisfying the invertibility condition $|\hat{\theta}| < 1$.
THE YULE-WALKER ESTIMATION • For real roots, we need $1 - 4 r_1^2 \geq 0$, i.e., $|r_1| \leq 0.5$. • If $|r_1| = 0.5$, there is a unique real root, but it is non-invertible. • If $|r_1| > 0.5$, no real roots exist and the MME fails. • If $|r_1| < 0.5$, there is a unique real root satisfying invertibility.
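A small R sketch using a hypothetical helper ma1.mme() that returns the invertible root, or NA when the MME fails:

ma1.mme <- function(r1) {
  if (r1 == 0) return(0)                        # theta = 0 when r1 = 0
  if (abs(r1) >= 0.5) return(NA)                # no invertible real root: MME fails
  # roots of r1*theta^2 + theta + r1 = 0; the '+' root is the invertible one
  (-1 + sqrt(1 - 4 * r1^2)) / (2 * r1)
}
ma1.mme(-0.4)   # returns 0.5, since rho1 = -0.5/(1 + 0.25) = -0.4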
THE YULE-WALKER ESTIMATION • This example shows that the MMEs for MA and ARMA models are complicated. • More generally, regardless of AR, MA or ARMA models, the MMEs are sensitive to rounding errors. They are usually used to provide initial estimates needed for a more efficient nonlinear estimation method. • The moment estimators are not recommended for final estimation results and should not be used if the process is close to being nonstationary or noninvertible.
THE MAXIMUM LIKELIHOOD ESTIMATION • Assume that $a_t \sim \text{iid } N(0, \sigma_a^2)$. • By this assumption we can use the joint pdf of $(a_1, \dots, a_n)$ instead of $f(y_1, \dots, y_n)$, which cannot be written as a product of marginal pdfs because of the dependence between time series observations.
MLE METHOD • For the general stationary ARMA(p,q) model $Y_t = \phi_1 Y_{t-1} + \dots + \phi_p Y_{t-p} + a_t - \theta_1 a_{t-1} - \dots - \theta_q a_{t-q}$, or $a_t = Y_t - \phi_1 Y_{t-1} - \dots - \phi_p Y_{t-p} + \theta_1 a_{t-1} + \dots + \theta_q a_{t-q}$.
MLE • The joint pdf of $(a_1, a_2, \dots, a_n)$ is given by $f(a_1, \dots, a_n) = (2\pi\sigma_a^2)^{-n/2} \exp\left( -\frac{1}{2\sigma_a^2} \sum_{t=1}^{n} a_t^2 \right)$. • Let $Y = (Y_1, \dots, Y_n)'$ and assume that the initial conditions $Y_* = (Y_{1-p}, \dots, Y_0)'$ and $a_* = (a_{1-q}, \dots, a_0)'$ are known.
MLE • The conditional log-likelihood function is given by $\ln L_*(\phi, \theta, \sigma_a^2) = -\frac{n}{2} \ln(2\pi\sigma_a^2) - \frac{S_*(\phi, \theta)}{2\sigma_a^2}$, where $S_*(\phi, \theta) = \sum_{t=1}^{n} a_t^2(\phi, \theta \mid Y_*, a_*, Y)$ is the conditional sum of squares. • Initial conditions: a common choice is to replace the unknown presample values by their expected values, $a_* = 0$ and $Y_* = \bar{Y}$ (0 for a zero-mean series).
MLE • Then, we can find the estimators of $\phi = (\phi_1, \dots, \phi_p)$, $\theta = (\theta_1, \dots, \theta_q)$ and $\sigma_a^2$ such that the conditional likelihood function is maximized. Usually, numerical nonlinear optimization techniques are required. After obtaining all the estimators, $\hat{\sigma}_a^2 = \frac{S_*(\hat{\phi}, \hat{\theta})}{\text{d.f.}}$, where d.f. = (number of terms used in the sum of squares) − (number of parameters) = $(n - p) - (p + q + 1) = n - (2p + q + 1)$.
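Both approaches are available in R's arima(): method = "CSS" maximizes the conditional sum-of-squares criterion, while method = "ML" maximizes the exact Gaussian likelihood. A minimal sketch with a simulated zero-mean ARMA(1,1) series (note that R's MA convention puts a '+' sign on theta):

x <- arima.sim(model = list(ar = 0.6, ma = 0.4), n = 200)
arima(x, order = c(1, 0, 1), include.mean = FALSE, method = "CSS")  # conditional likelihood
arima(x, order = c(1, 0, 1), include.mean = FALSE, method = "ML")   # exact likelihood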
MLE • AR(1): $Y_t = \phi Y_{t-1} + a_t$ with $a_t \sim \text{iid } N(0, \sigma_a^2)$ and $|\phi| < 1$, so that $Y_1 \sim N(0, \sigma_a^2/(1 - \phi^2))$. Consider the transformation from $(Y_1, a_2, \dots, a_n)$ to $(Y_1, Y_2, \dots, Y_n)$.
MLE • The Jacobian of this transformation is 1, because the transformation matrix is lower triangular with ones on the diagonal.
MLE • Then, the likelihood function can be written as $L(\phi, \sigma_a^2) = f(Y_1) \prod_{t=2}^{n} f(a_t) = (2\pi\sigma_a^2)^{-n/2} (1 - \phi^2)^{1/2} \exp\left\{ -\frac{1}{2\sigma_a^2} \left[ (1 - \phi^2) Y_1^2 + \sum_{t=2}^{n} (Y_t - \phi Y_{t-1})^2 \right] \right\}$.
MLE • Hence, the log-likelihood function is $\ln L(\phi, \sigma_a^2) = -\frac{n}{2} \ln(2\pi\sigma_a^2) + \frac{1}{2} \ln(1 - \phi^2) - \frac{S(\phi)}{2\sigma_a^2}$, where $S(\phi) = (1 - \phi^2) Y_1^2 + \sum_{t=2}^{n} (Y_t - \phi Y_{t-1})^2$.
MLE • Here, $S_*(\phi) = \sum_{t=2}^{n} (Y_t - \phi Y_{t-1})^2$ is the conditional sum of squares and $S(\phi) = (1 - \phi^2) Y_1^2 + S_*(\phi)$ is the unconditional sum of squares. • To find the value of $\sigma_a^2$ where the likelihood function is maximized, set $\partial \ln L / \partial \sigma_a^2 = 0$. • Then, $\hat{\sigma}_a^2 = S(\hat{\phi})/n$.
MLE • If we neglect $\frac{1}{2}\ln(1 - \phi^2)$, maximizing the likelihood amounts to minimizing $S(\phi)$, i.e., the unconditional LSE. • If we neglect both $\frac{1}{2}\ln(1 - \phi^2)$ and $(1 - \phi^2) Y_1^2$, then MLE = conditional LSE, with the closed form $\hat{\phi} = \frac{\sum_{t=2}^{n} Y_t Y_{t-1}}{\sum_{t=2}^{n} Y_{t-1}^2}$.
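As a sketch with simulated data, the closed-form conditional LSE for AR(1) can be compared with the exact MLE from arima():

y <- arima.sim(model = list(ar = 0.8), n = 200)
n <- length(y)
phi.css <- sum(y[2:n] * y[1:(n - 1)]) / sum(y[1:(n - 1)]^2)   # conditional LSE
phi.mle <- arima(y, order = c(1, 0, 0), include.mean = FALSE,
                 method = "ML")$coef                          # exact MLE
c(phi.css, phi.mle)   # the two estimates are close for large n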
MLE • MLEs are asymptotically unbiased, efficient, consistent and sufficient for large sample sizes, but the joint pdf can be hard to work with.
CONDITIONAL LSE • For AR(1), minimize the conditional sum of squares $S_*(\phi) = \sum_{t=2}^{n} (Y_t - \phi Y_{t-1})^2$. • If the process mean is different from zero, replace $Y_t$ by $Y_t - \mu$ and minimize $S_*(\phi, \mu) = \sum_{t=2}^{n} [(Y_t - \mu) - \phi (Y_{t-1} - \mu)]^2$ over both parameters.
CONDITIONAL LSE • MA(1): $Y_t = a_t - \theta a_{t-1}$. Setting $a_0 = 0$, the errors can be computed recursively as $a_t = Y_t + \theta a_{t-1}$, so $S_*(\theta) = \sum_{t=1}^{n} a_t^2$ is non-linear in the parameter. • This LS problem cannot be solved analytically: $S_*(\theta)$ must be minimized numerically. • We need nonlinear optimization methods such as Newton-Raphson or Gauss-Newton. • There is a similar problem in the ARMA case; see the sketch below.
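A minimal sketch of this numerical approach, using a hypothetical helper css.ma1() and base R's one-dimensional optimizer:

css.ma1 <- function(theta, y) {
  a <- numeric(length(y))
  a[1] <- y[1]                                  # since a_0 = 0
  for (t in 2:length(y)) a[t] <- y[t] + theta * a[t - 1]
  sum(a^2)                                      # S*(theta)
}
# simulate Y_t = a_t - 0.5 a_{t-1}; R's MA sign convention is '+', hence ma = -0.5
y <- arima.sim(model = list(ma = -0.5), n = 200)
optimize(css.ma1, interval = c(-0.99, 0.99), y = y)$minimum   # theta-hat near 0.5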
UNCONDITIONAL LSE • Minimize the unconditional sum of squares $S(\phi, \theta)$. • This is also nonlinear in the parameters. • We need nonlinear optimization techniques.
BACKCASTING METHOD • Obtain the backward form of the ARMA(p,q) model. • Instead of forecasting, backcast the past values of $Y_t$ and $a_t$, $t \leq 0$. • Obtain the unconditional log-likelihood function, then obtain the estimators.
EXAMPLE • If there are only 2 observations in the time series (not realistic), find the MLE of $\phi$ and $\sigma_a^2$ for an AR(1) process.
EXAMPLE • US Quarterly Beer Production from 1975 to 1997
> par(mfrow=c(1,3))
> plot(beer)
> acf(as.vector(beer),lag.max=36)
> pacf(as.vector(beer),lag.max=36)
EXAMPLE (contd.)
> library(uroot)
Warning message:
package 'uroot' was built under R version 2.13.0
> HEGY.test(wts = beer, itsd = c(1, 1, c(1:3)), regvar = 0, selectlags = list(mode = "bic", Pmax = 12))
Null hypothesis: Unit root.
Alternative hypothesis: Stationarity.
---- HEGY statistics:
          Stat.  p-value
tpi_1    -3.339    0.085
tpi_2    -5.944    0.010
Fpi_3:4  13.238    0.010
> CH.test(beer)
Canova & Hansen test
Null hypothesis: Stationarity.
Alternative hypothesis: Unit root.
L-statistic: 0.817
Critical values:
 0.10  0.05 0.025  0.01
0.846  1.01  1.16  1.35
EXAMPLE (contd.)
> plot(diff(beer),ylab='First Difference of Beer Production',xlab='Time')
> acf(as.vector(diff(beer)),lag.max=36)
> pacf(as.vector(diff(beer)),lag.max=36)
EXAMPLE (contd.)
> HEGY.test(wts = diff(beer), itsd = c(1, 1, c(1:3)), regvar = 0, selectlags = list(mode = "bic", Pmax = 12))
---- HEGY test ----
Null hypothesis: Unit root.
Alternative hypothesis: Stationarity.
---- HEGY statistics:
          Stat.  p-value
tpi_1    -6.067     0.01
tpi_2    -1.503     0.10
Fpi_3:4   9.091     0.01
Fpi_2:4   7.136       NA
Fpi_1:4  26.145       NA
EXAMPLE (contd.)
> fit1=arima(beer,order=c(3,1,0),seasonal=list(order=c(2,0,0), period=4))
> fit1

Call:
arima(x = beer, order = c(3, 1, 0), seasonal = list(order = c(2, 0, 0), period = 4))

Coefficients:
          ar1      ar2      ar3    sar1    sar2
      -0.7380  -0.6939  -0.2299  0.2903  0.6694
s.e.   0.1056   0.1206   0.1206  0.0882  0.0841

sigma^2 estimated as 1.79:  log likelihood = -161.55,  aic = 335.1

> fit2=arima(beer,order=c(3,1,0),seasonal=list(order=c(3,0,0), period=4))
> fit2

Call:
arima(x = beer, order = c(3, 1, 0), seasonal = list(order = c(3, 0, 0), period = 4))

Coefficients:
          ar1      ar2      ar3    sar1    sar2    sar3
      -0.8161  -0.8035  -0.3529  0.0444  0.5798  0.3387
s.e.   0.1065   0.1188   0.1219  0.1205  0.0872  0.1210

sigma^2 estimated as 1.646:  log likelihood = -158.01,  aic = 330.01