Chapter 13 Time Series Forecasting
Time Series Forecasting 13.1 Time Series Components and Models 13.2 Time Series Regression: Basic Models 13.3 Time Series Regression: More Advanced Models (Optional) 13.4 Multiplicative Decomposition 13.5 Exponential Smoothing 13.6 Forecast Error Comparisons 13.7 Index Numbers
Time Series Components and Models • Trend: long-run growth or decline • Cycle: long-run up-and-down fluctuation around the trend level • Seasonal: regular periodic up-and-down movements that repeat within the calendar year • Irregular: erratic, very short-run movements that follow no regular pattern
No Trend • When there is no trend, the least squares point estimate b0 of β0 is just the average y value • yt = β0 + εt • That is, the forecast is a horizontal line that crosses the y axis at the average y value
Trend • When sales increase (or decrease) over time, we have a trend • Oftentimes, that trend is linear in nature • A linear trend is modeled using regression • Sales is the dependent variable • Time is the independent variable • Weeks • Months • Quarters • Years • Simple linear regression is the most common choice, but quadratic regression is sometimes used when the trend curves
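As a rough illustration (not taken from the text), a linear trend can be fit with ordinary least squares using the period number as the predictor, and the quadratic variant simply adds a t² term. The sales figures below are made up.

```python
import numpy as np

# Hypothetical weekly sales showing an upward trend
sales = np.array([102, 108, 111, 119, 123, 131, 136, 144])
t = np.arange(1, len(sales) + 1)            # time index: 1, 2, ..., n

# Linear trend: y_t = b0 + b1*t  (np.polyfit returns the highest power first)
b1, b0 = np.polyfit(t, sales, 1)
linear_forecast = b0 + b1 * (len(sales) + 1)          # forecast for the next week

# Quadratic trend: y_t = b0 + b1*t + b2*t^2, for a curving trend
b2, b1q, b0q = np.polyfit(t, sales, 2)
quadratic_forecast = b0q + b1q * (len(sales) + 1) + b2 * (len(sales) + 1) ** 2
```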
Seasonality • Some products have demand that varies a great deal by period • Coats • Bathing suits • Bicycles • This periodic variation is called seasonality • Seasonality alters the linear relationship between time and demand
Modeling Seasonality • Within regression, seasonality can be modeled using dummy variables • Consider the model: yt = β0 + β1t + βQ2Q2 + βQ3Q3 + βQ4Q4 + εt • For Quarter 1, Q2 = 0, Q3 = 0, and Q4 = 0 • For Quarter 2, Q2 = 1, Q3 = 0, and Q4 = 0 • For Quarter 3, Q2 = 0, Q3 = 1, and Q4 = 0 • For Quarter 4, Q2 = 0, Q3 = 0, and Q4 = 1 • Each βQ coefficient then gives the seasonal impact of that quarter relative to Quarter 1 • Negative means lower sales • Positive means higher sales
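A minimal sketch of fitting the dummy-variable model above, assuming statsmodels is available; the quarterly sales numbers are made up, and Q2–Q4 are 0/1 indicators so Quarter 1 is the baseline.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical quarterly sales, three years of data
sales = np.array([10, 31, 43, 16, 11, 33, 45, 17, 13, 34, 48, 19])
t = np.arange(1, len(sales) + 1)
quarter = (t - 1) % 4 + 1                     # 1, 2, 3, 4, 1, 2, ...

# Dummy variables for Quarters 2-4 (Quarter 1 is the base case)
Q2 = (quarter == 2).astype(int)
Q3 = (quarter == 3).astype(int)
Q4 = (quarter == 4).astype(int)

X = sm.add_constant(np.column_stack([t, Q2, Q3, Q4]))
model = sm.OLS(sales, X).fit()
print(model.params)    # estimates of b0, b1, bQ2, bQ3, bQ4
```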
Time Series Regression: More Advanced Models • Sometimes, transforming the sales data makes it easier to forecast • Square roots • Quartic roots • Natural logarithms • While these transformations can make forecasting easier, they make the resulting model harder to interpret
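For example, growth that is roughly exponential often becomes linear after a natural-log transform; the fitted values then have to be transformed back, which is part of why the model is harder to read. A small sketch with made-up numbers:

```python
import numpy as np

sales = np.array([12, 15, 19, 25, 31, 40, 52, 66])   # hypothetical, growing multiplicatively
t = np.arange(1, len(sales) + 1)

# Fit a linear trend to ln(sales), then back-transform the forecast with exp()
b1, b0 = np.polyfit(t, np.log(sales), 1)
next_period_forecast = np.exp(b0 + b1 * (len(sales) + 1))
```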
Autocorrelation • One of the assumptions of regression is that the error terms are independent • With time series data, that assumption is often violated • Positive or negative autocorrelation is common • One type of autocorrelation is first-order autocorrelation • The error term in time period t is related to the one in period t-1 • εt = φεt-1 + at • φ is the correlation coefficient that measures the relationship between successive error terms • at is an error term, often called a random shock
Autocorrelation Continued • We can test for first-order autocorrelation using the Durbin–Watson statistic • Covered in Chapters 11 and 12 • One approach to dealing with first-order autocorrelation is to predict future values of the error term using the model εt = φεt-1 + at
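A sketch of computing the Durbin–Watson statistic on the residuals of a trend regression, assuming statsmodels is available and using made-up data; values near 2 suggest little first-order autocorrelation.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

sales = np.array([102, 108, 111, 119, 123, 131, 136, 144])   # hypothetical
t = np.arange(1, len(sales) + 1)

fit = sm.OLS(sales, sm.add_constant(t)).fit()
dw = durbin_watson(fit.resid)    # near 2: little evidence of first-order autocorrelation
```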
Autoregressive Model • The error term εt can be related to more than just the previous error term εt-1 • This is often the case with seasonal data • The autoregressive error term model of order q: εt = φ1εt-1 + φ2εt-2 + … + φqεt-q + at relates the error term to any number of past error terms • The Box–Jenkins methodology can be used to systematically build a model that relates εt to an appropriate number of past error terms
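As an illustration (not the textbook's Box–Jenkins procedure itself), an autoregressive model can be fit to regression residuals with statsmodels' AutoReg; the residual values below are made up, and the number of lags q would normally be chosen by examining the autocorrelation structure.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

# `residuals` stands in for the error terms from a fitted time series regression
residuals = np.array([1.2, -0.8, 0.5, 1.9, -1.1, -0.4, 0.9, 2.1,
                      -1.5, -0.7, 0.6, 1.8])                     # hypothetical

ar_fit = AutoReg(residuals, lags=4).fit()    # q = 4 past error terms, e.g. quarterly data
print(ar_fit.params)                         # intercept and phi_1 ... phi_4 estimates
future_errors = ar_fit.predict(start=len(residuals), end=len(residuals) + 3)
```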
Multiplicative Decomposition • We can use the multiplicative decomposition method to decompose a time series into its components: • Trend • Seasonal • Cyclical • Irregular
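In symbols, the multiplicative model treats each observation as the product of these four components (standard notation, not quoted verbatim from the text):

```latex
y_t = TR_t \times SN_t \times CL_t \times IR_t
```

where TR is the trend, SN the seasonal factor, CL the cyclical component, and IR the irregular component.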
Steps to Multiplicative Decomposition #1 • Compute a moving average • This eliminates the seasonality • The averaging period matches the seasonal period • Compute a two-period centered moving average • The average from Step 1 needs to be matched up with a specific period • Consider a 4-period moving average • The average of periods 1, 2, 3, and 4 is centered at period 2.5 • This does not match any period • Averaging it with the next moving average, centered at 3.5, gives a value centered at period 3 • Step 2 is not needed if Step 1 uses an odd number of periods
Steps to Multiplicative Decomposition #2 • The original demand for each period is divided by the value computed in Step 2 for that same period • The first and last few periods do not have a value from Step 2 • These periods are skipped • All of the values from Step 3 for season 1 are averaged together to form the seasonal factor for season 1 • This is repeated for every season • If there are four seasons, there will be four factors
Steps to Multiplicative Decomposition #3 • The original demand for each period is divided by the appropriate seasonal factor for that period • This gives us the deseasonalized observation for that period • A forecast is prepared using the deseasonalized observations • This is usually done with simple linear regression • The deseasonalized forecast for each period from Step 6 is multiplied by the appropriate seasonal factor for that period • This returns seasonality to the forecast
Steps to Multiplicative Decomposition #4 • We estimate the period-by-period cyclical-times-irregular component by dividing the deseasonalized observation from Step 5 by the deseasonalized forecast from Step 6 • A three-period moving average of these values averages out the irregular component, giving the estimate of the cyclical component • Dividing the value from Step 8 by the value from Step 9 then isolates the irregular component • Cyclical values close to one indicate a small cyclical component • We are interested in long-term patterns
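Putting the steps together, here is a minimal sketch of the procedure for a quarterly series, assuming pandas and numpy are available; the function name is illustrative and the initialization details (for example, normalizing the seasonal factors) are simplified relative to the textbook.

```python
import numpy as np
import pandas as pd

def multiplicative_decomposition(y, period=4):
    """Sketch of the multiplicative decomposition steps for a seasonal series."""
    y = pd.Series(y, dtype=float).reset_index(drop=True)

    # Steps 1-2: moving average of length `period`, then a two-period centering average
    # so each value lines up with an actual period (centering is skipped for odd periods)
    ma = y.rolling(period).mean()
    if period % 2 == 0:
        centered = ma.rolling(2).mean().shift(-(period // 2))
    else:
        centered = ma.shift(-(period // 2))

    # Step 3: ratio of each observation to its centered moving average
    # (undefined for the first and last few periods)
    ratios = y / centered

    # Step 4: average the ratios season by season to get the seasonal factors
    season_of = pd.Series(np.arange(len(y)) % period)
    seasonal_factors = ratios.groupby(season_of).mean()
    seasonal = season_of.map(seasonal_factors)

    # Step 5: deseasonalized observations
    deseasonalized = y / seasonal

    # Step 6: forecast the deseasonalized series by simple regression on time
    t = np.arange(1, len(y) + 1)
    b1, b0 = np.polyfit(t, deseasonalized, 1)
    trend = pd.Series(b0 + b1 * t)

    # Step 7: put seasonality back into the forecast
    forecast = trend * seasonal

    # Step 8: cyclical-times-irregular estimate
    cl_ir = deseasonalized / trend

    # Step 9: a three-period moving average estimates the cyclical component;
    # Step 10: dividing Step 8 by Step 9 isolates the irregular component
    cyclical = cl_ir.rolling(3, center=True).mean()
    irregular = cl_ir / cyclical

    return forecast, seasonal, cyclical, irregular
```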
Exponential Smoothing • Earlier, we saw that when there is no trend, the least squares point estimate b0 of β0 is just the average y value • yt = β0 + εt • That gave us a horizontal line that crosses the y axis at the average y value • Since we estimate b0 using regression, each period is weighted the same • If β0 is slowly changing over time, we want to weight more recent periods more heavily • Exponential smoothing does just this
Exponential Smoothing Continued • Exponential smoothing takes on the form: ST = αyT + (1 – α)ST-1 • Alpha (α) is a smoothing constant between zero and one • Alpha is typically between 0.02 and 0.30 • Smaller values of alpha represent slower change • We want to test the data and find an alpha value that minimizes the sum of squared forecast errors
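A minimal sketch (with made-up data) of the updating equation and of searching for the alpha that minimizes the sum of squared one-step-ahead forecast errors; the function names are illustrative.

```python
def smooth(y, alpha):
    """Apply S_T = alpha*y_T + (1 - alpha)*S_(T-1), starting from the first observation."""
    s = y[0]
    out = []
    for obs in y:
        s = alpha * obs + (1 - alpha) * s
        out.append(s)
    return out

def sse(y, alpha):
    """Sum of squared one-step-ahead forecast errors for a given alpha."""
    s, total = y[0], 0.0
    for obs in y[1:]:
        total += (obs - s) ** 2            # the forecast for this period is last period's smoothed value
        s = alpha * obs + (1 - alpha) * s
    return total

y = [120, 118, 123, 121, 125, 122, 127, 126]          # hypothetical no-trend series
best_alpha = min((a / 100 for a in range(2, 31)),      # search alpha over 0.02 ... 0.30
                 key=lambda a: sse(y, a))
next_period_forecast = smooth(y, best_alpha)[-1]
```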
Holt–Winters’ Double Exponential Smoothing • Simple exponential smoothing cannot handle trend or seasonality • Holt–Winters’ double exponential smoothing can handle trended data of the form yt = β0 + β1t + εt • Assumes β0 and β1 are changing slowly over time • We first find initial estimates of β0 and β1 • Then we use updating equations to track changes over time • Requires smoothing constants called alpha and gamma • Updating equations are in Appendix K of the CD-ROM
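A sketch of the standard level-and-trend updating equations (the exact form in Appendix K may differ in notation): alpha updates the level estimate and gamma updates the trend estimate. The data and function name are illustrative.

```python
def double_exponential_smoothing(y, alpha=0.2, gamma=0.1):
    """Track a slowly changing level (b0) and slope (b1); returns one-step-ahead forecasts."""
    level, slope = y[1], y[1] - y[0]      # crude initial estimates from the first two points
    forecasts = [None, None]              # no forecasts for the start-up periods
    for obs in y[2:]:
        forecasts.append(level + slope)                              # forecast before seeing obs
        new_level = alpha * obs + (1 - alpha) * (level + slope)      # update the level
        slope = gamma * (new_level - level) + (1 - gamma) * slope    # update the trend
        level = new_level
    return forecasts, level, slope        # forecast tau periods ahead: level + tau * slope

fc, level, slope = double_exponential_smoothing([102, 108, 111, 119, 123, 131, 136, 144])
```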
Multiplicative Winters’ Method • Double exponential smoothing cannot handle seasonality • Multiplicative Winters’ method can handle trended, seasonal data of the form yt = (β0 + β1t) · SNt + εt • Assumes β0, β1, and SNt are changing slowly over time • We first find initial estimates of β0 and β1 and the seasonal factors • Then we use updating equations to track changes over time • Requires smoothing constants called alpha, gamma, and delta • Updating equations are in Appendix K of the CD-ROM
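In practice this is often run through a library. A sketch assuming statsmodels is available: its Holt–Winters implementation with an additive trend and multiplicative seasonality matches the form above, though it estimates the smoothing constants by minimizing squared errors and may differ in detail from Appendix K. The quarterly data are made up.

```python
from statsmodels.tsa.holtwinters import ExponentialSmoothing

y = [10, 31, 43, 16, 11, 33, 45, 17, 13, 34, 48, 19]   # hypothetical quarterly sales

# Additive trend, multiplicative seasonality, four seasons per year
hw = ExponentialSmoothing(y, trend="add", seasonal="mul", seasonal_periods=4).fit()
next_year = hw.forecast(4)    # forecasts for the next four quarters
```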
Forecast Error Comparison • Forecast errors • Error comparison criteria • Mean Absolute Deviation (MAD) • Mean Squared Deviation (MSD)
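The two criteria are defined in the usual way (standard formulas; the slide's own formulas were not preserved), where yt is the actual value, ŷt the forecast, and n the number of forecasts:

```latex
\mathrm{MAD} = \frac{1}{n}\sum_{t=1}^{n} \lvert y_t - \hat{y}_t \rvert
\qquad
\mathrm{MSD} = \frac{1}{n}\sum_{t=1}^{n} \left( y_t - \hat{y}_t \right)^2
```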
Index Numbers • Index numbers allow us to compare changes in a time series over time • We begin by selecting a base period • Every period is converted to an index by dividing its value by the base period value and then multiplying by 100 • Simple (Quantity) Index
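In symbols (standard form), with Q0 the base-period quantity and Qt the quantity in period t:

```latex
I_t = \frac{Q_t}{Q_0} \times 100
```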
Aggregate Price Index • We often wish to compare a group of items • To do this, we compute the total price of the items in each period • We then index this total • Aggregate Price Index
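In standard form, summing the prices of all items in the group, with p0 the base-period prices and pt the period-t prices:

```latex
I_t = \frac{\sum p_t}{\sum p_0} \times 100
```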
Weighted Aggregate Price Index • An aggregate price index assumes all items in the basket are purchased with the same frequency • A weighted aggregate price index takes into account varying purchasing frequency • The Laspeyres index assumes the same mixture of items for all periods as was used in the base period • The Paasche index allows the mixture of items in the basket to change over time as purchasing habits change
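A small numeric sketch of the two weighted indexes using made-up prices and quantities for a three-item basket: Laspeyres weights prices by base-period quantities, Paasche by current-period quantities.

```python
# Hypothetical prices and quantities for a small basket of three items
p0 = [1.50, 2.00, 3.25]   # base-period prices
pt = [1.75, 2.10, 3.60]   # current-period prices
q0 = [10, 5, 2]           # base-period quantities
qt = [12, 4, 3]           # current-period quantities

# Laspeyres: cost of the base-period basket at current vs. base prices
laspeyres = 100 * sum(p * q for p, q in zip(pt, q0)) / sum(p * q for p, q in zip(p0, q0))

# Paasche: cost of the current-period basket at current vs. base prices
paasche = 100 * sum(p * q for p, q in zip(pt, qt)) / sum(p * q for p, q in zip(p0, qt))
```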