Supply Chain Management Lecture 13
Outline • Today • Chapter 7 • Thursday • Network design simulation assignment • Chapter 8 • Friday • Homework 3 due before 5:00pm
Outline • February 23 (Today) • Chapter 7 • February 25 • Network design simulation description • Chapter 8 • Homework 4 (short) • March 2 • Chapter 8, 9 • Network design simulation due before 5:00pm • March 4 • Simulation results • Midterm overview • Homework 4 due • March 9 • Midterm
Summary: Static Forecasting Method • Estimate level and trend • Deseasonalize the demand data • Estimate level L and trend T using linear regression • Obtain deseasonalized demand D̄t • Estimate seasonal factors • Estimate the seasonal factor for each period St = Dt / D̄t • Obtain seasonal factors S̄i = AVG(St) such that t is the same season as i • Forecast • Forecast for future periods is Ft+n = (L + nT)S̄t+n
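The steps above can be sketched in a few lines. This is a simplified, illustrative version (function name and data are assumptions, not from the lecture): it estimates L and T by fitting the regression directly to raw demand rather than first deseasonalizing with a moving average, so the seasonal factors are only approximate.

```python
import numpy as np

def static_forecast(demand, p, n_ahead):
    """Static forecasting sketch: level L, trend T, seasonal factors S.

    demand: historical demand for periods t = 1..len(demand)
    p: periodicity (number of seasons per cycle)
    n_ahead: number of future periods to forecast
    """
    demand = np.asarray(demand, dtype=float)
    t = np.arange(1, len(demand) + 1)
    # Estimate trend T (slope) and level L (intercept) by linear regression.
    T, L = np.polyfit(t, demand, 1)
    fitted = L + T * t                    # approximate deseasonalized demand
    ratios = demand / fitted              # per-period seasonal ratios S_t
    # Average the ratios over all periods t that share the same season i.
    S = np.array([ratios[i::p].mean() for i in range(p)])
    # Forecast F_{t+n} = (L + (t+n) T) * S for the matching season.
    tn = len(demand) + np.arange(1, n_ahead + 1)
    return (L + T * tn) * S[(tn - 1) % p]
```

With seasonal data of periodicity 2, the forecast alternates between the high and low season while following the trend.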
Ethical Dilemma? In 2009, the board of regents for all public higher education in a large Midwestern state hired a consultant to develop a series of enrollment forecasting models, one for each college. These models used historical data and exponential smoothing to forecast the following year’s enrollments. Each college’s budget was set by the board based on the model, which included a smoothing constant (α) for each school. The head of the board personally selected each smoothing constant based on “gut reactions and political acumen.” How can this model be abused? What can be done to remove any biases? Can a regression model be used to bias results?
Time Series Forecasting Observed demand = Systematic component + Random component • L Level (current deseasonalized demand) • T Trend (growth or decline in demand) • S Seasonality (predictable seasonal fluctuation) • The goal of any forecasting method is to predict the systematic component (Forecast) of demand and measure the size and variability of the random component (Forecast error)
1) Characteristics of Forecasts • Forecasts are always wrong! • Forecasts should include an expected value and a measure of error (or demand uncertainty) • Forecast 1: sales are expected to range between 100 and 1,900 units • Forecast 2: sales are expected to range between 900 and 1,100 units
Forecast Error • Error (E) • Measures the difference between the forecast and the actual demand in period t • Want error to be relatively small Et = Ft – Dt
Forecast Error • Bias • Measures the bias in the forecast error • Want bias to be as close to zero as possible • A large positive (negative) bias means that the forecast is overshooting (undershooting) the actual observations • Zero bias does not imply that the forecast is perfect (no error) -- only that the mean of the forecast is “on target” biasn = ∑t=1..n Et
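With illustrative numbers (assumed, not from the lecture), the bias is simply the running sum of the errors Et = Ft – Dt, so positive and negative errors cancel:

```python
# Assumed demand/forecast data for illustration.
forecast = [100, 105, 110, 115]
demand = [98, 104, 112, 114]

# E_t = F_t - D_t for each period t.
errors = [f - d for f, d in zip(forecast, demand)]  # [2, 1, -2, 1]

# bias_n = sum of the errors; near zero means the mean is "on target".
bias = sum(errors)  # 2
```

A bias of 2 over four periods suggests mild overshooting, even though individual errors point in both directions.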
Forecast Error Forecast mean “on target” but not perfect Undershooting
Forecast Error • Absolute deviation (A) • Measures the absolute value of error in period t • Want absolute deviation to be relatively small At = |Et|
Forecast Error • Mean absolute deviation (MAD) • Measures absolute error • Positive and negative errors do not cancel out (as with bias) • Want MAD to be as small as possible • No way to know if MAD error is large or small in relation to the actual data MADn = (1/n) ∑t=1..n At STDEV(Forecast Error) ≈ 1.25*MAD
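A minimal MAD computation on assumed errors; note that, unlike bias, the positive and negative errors no longer cancel, and 1.25·MAD estimates the standard deviation of the forecast error:

```python
# Assumed forecast errors for illustration.
errors = [2, -1, 3, -2]

# A_t = |E_t|; MAD_n averages the absolute deviations.
abs_devs = [abs(e) for e in errors]
mad = sum(abs_devs) / len(abs_devs)  # (2 + 1 + 3 + 2) / 4 = 2.0

# Estimated standard deviation of the forecast error.
sigma_estimate = 1.25 * mad  # 2.5
```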
Forecast Error Not all that large relative to data
Forecast Error • Tracking signal (TS) • Want tracking signal to stay within (–6, +6) • If at any period the tracking signal is outside the range (–6, 6) then the forecast is biased TSt = biast / MADt
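A running tracking-signal computation (a sketch with assumed errors): seven consecutive errors of +2 drive the signal to 1, 2, …, 7, so it leaves the (–6, +6) band and flags a biased forecast.

```python
def tracking_signal(errors):
    """Running TS_t = bias_t / MAD_t after each period t."""
    ts = []
    bias, abs_sum = 0.0, 0.0
    for t, e in enumerate(errors, start=1):
        bias += e          # bias_t: running sum of errors
        abs_sum += abs(e)  # running sum of absolute deviations
        ts.append(bias / (abs_sum / t))  # MAD_t = abs_sum / t
    return ts

# Consistent overshooting (E_t = F_t - D_t = +2 every period).
signal = tracking_signal([2] * 7)
biased = any(abs(s) > 6 for s in signal)  # True once TS exceeds 6
```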
Forecast Error Biased (underforecasting)
Forecast Error • Mean absolute percentage error (MAPE) • Same as MAD, except ... • Measures absolute deviation as a percentage of actual demand • Want MAPE to be less than 10 (though values under 30 are common) MAPEn = (∑t=1..n |Et/Dt|·100) / n
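Because each error is scaled by the actual demand, MAPE is comparable across items of different volume. A small sketch with assumed data:

```python
# Assumed demand/forecast data for illustration.
forecast = [110, 90, 105]
demand = [100, 100, 100]

# |%Error|_t = 100 * |E_t / D_t| for each period t.
pct_errors = [100 * abs(f - d) / d for f, d in zip(forecast, demand)]  # [10.0, 10.0, 5.0]

# MAPE_n averages the absolute percentage errors.
mape = sum(pct_errors) / len(pct_errors)  # 25/3, about 8.33 -- under the 10 target
```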
Forecast Error Smallest absolute deviation relative to demand MAPE < 10 is considered very good
Forecast Error • Mean squared error (MSE) • Measures squared forecast error • Recognizes that large errors are disproportionately more “expensive” than small errors • Not as easily interpreted as MAD, MAPE -- not as intuitive MSEn = (1/n) ∑t=1..n Et² VAR(Forecast Error) = MSE
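The squaring effect is easy to see with two assumed error series that share the same MAD but differ sharply in MSE, because one large error is disproportionately "expensive":

```python
# Two assumed error series with identical MAD (2.0).
errors_a = [2, 2, 2, 2]   # four moderate errors
errors_b = [0, 0, 0, 8]   # one large error

def mse(errors):
    """MSE_n = (1/n) * sum of squared errors."""
    return sum(e * e for e in errors) / len(errors)

mse_a = mse(errors_a)  # 4.0
mse_b = mse(errors_b)  # 16.0 -- the single large error dominates
```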
Measures of Forecast Error • Et = Ft – Dt • biasn = ∑t=1..n Et • MADn = (1/n) ∑t=1..n At • TSt = biast / MADt • MAPEn = (∑t=1..n |Et/Dt|·100) / n • MSEn = (1/n) ∑t=1..n Et²
Summary • What information do the bias and TS provide to a manager? • The bias and TS are used to estimate whether the forecast consistently over- or underforecasts • What information do the MSE and MAD provide to a manager? • MSE estimates the variance of the forecast error • VAR(Forecast Error) = MSEn • MAD estimates the standard deviation of the forecast error • STDEV(Forecast Error) ≈ 1.25 MADn
Forecast Error in Excel • Calculate absolute error: At =ABS(Et) • Calculate mean absolute deviation: MADn =SUM(A1:An)/n or =AVERAGE(A1:An) • Calculate mean absolute percentage error: MAPEn =AVERAGE(…) • Calculate tracking signal: TSt =biast/MADt • Calculate mean squared error: MSEn =SUMSQ(E1:En)/n
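The same spreadsheet columns can be sketched as a single pass in Python (function and key names are assumptions for illustration), returning every measure from the slides at once:

```python
def error_measures(forecast, demand):
    """Compute the forecast-error measures from the slides in one pass."""
    n = len(demand)
    E = [f - d for f, d in zip(forecast, demand)]  # error: E_t = F_t - D_t
    A = [abs(e) for e in E]                        # absolute deviation A_t
    bias = sum(E)
    mad = sum(A) / n
    return {
        "bias": bias,
        "MAD": mad,
        "MAPE": sum(100 * a / d for a, d in zip(A, demand)) / n,
        "MSE": sum(e * e for e in E) / n,          # like =SUMSQ(...)/n
        "TS": bias / mad,
    }
```

For example, `error_measures([100, 105], [98, 104])` gives bias 3, MAD 1.5, MSE 2.5, and TS 2.0.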
Forecast Error in Excel Et = Ft – Dt Forecast Error
Forecast Error in Excel biasn = ∑t=1..n Et Bias
Forecast Error in Excel At = |Et| Absolute Error
Forecast Error in Excel MADn = (1/n) ∑t=1..n At Mean Absolute Deviation
Forecast Error in Excel TSt = biast / MADt Tracking Signal
Forecast Error in Excel |%Error|t = 100·|Et/Dt| |%Error|
Forecast Error in Excel MAPEn = (∑t=1..n |%Error|t) / n Mean Absolute Percentage Error
Forecast Error in Excel MSEn = (1/n) ∑t=1..n Et² Mean Squared Error