Chapter 3 Demand Forecasting
Overview • Introduction • Qualitative Forecasting Methods • Quantitative Forecasting Models • How to Have a Successful Forecasting System • Computer Software for Forecasting
Introduction • Demand estimates for products and services are the starting point for all the other planning in operations management. • Forecasting is integral to production planning. • Long-range survival, growth, and profitability as well as short-range efficiency and effectiveness depend on accurate forecasting.
Forecasting is an Integral Part of Business Planning [Flowchart: market, economic, and other inputs feed the forecast method(s), which produce demand estimates; the management team combines these estimates with business strategy to set the sales forecast, which in turn drives the production resource forecasts.]
Some Reasons Why Forecasting is Essential in OM • New Facility Planning requires long-range forecasts – It can take 5 years to design and build a new factory or design and implement a new production process. • Production Planning requires medium-range forecasts – Demand for products varies from month to month and it can take several months to change the capacities of production processes. • Workforce Scheduling requires short-range forecasts – Demand for services (and the necessary staffing) can vary from hour to hour and employees' weekly work schedules must be developed in advance.
Examples of Production Resource Forecasts
Forecast Horizon | Time Span | Item Being Forecasted | Unit of Measure
Long Range | Years | Product lines, factory capacities | Dollars, tons
Medium Range | Months | Product groups, department capacities | Units, pounds
Short Range | Days, weeks | Specific products, machine capacities | Units, hours
Forecasting Methods • Qualitative Approaches • Quantitative Approaches
Qualitative Approaches • Usually based on judgments about causal factors that underlie the demand for particular products or services • Do not require a demand history for the product or service, therefore are useful for new products/services • Approaches vary in sophistication from scientifically conducted surveys to intuitive hunches about future events • The approach/method that is appropriate depends on a product's life cycle stage
Qualitative Methods • Educated guess (intuitive hunches) • Executive committee consensus • Delphi method • Survey of sales force • Survey of customers • Historical analogy • Market research (scientifically conducted surveys)
Quantitative Forecasting Approaches • Based on the assumption that the “forces” that generated the past demand will generate the future demand, i.e., history will tend to repeat itself • Analysis of the past demand pattern provides a good basis for forecasting future demand • Majority of quantitative approaches fall in the category of time series analysis
Time Series Analysis • A time series is a set of numbers where the order or sequence of the numbers is important, e.g., historical demand • Analysis of the time series identifies patterns • Once the patterns are identified, an appropriate method can be used to develop a forecast
Components of a Time Series • Trends are noted by an upward or downward sloping line. • Cycle is a data pattern that may cover several years before it repeats itself. • Seasonality is a data pattern that repeats itself over the period of one year or less. • Random fluctuation (noise) results from random variation or unexplained causes.
[Chart: Product Demand Charted over 4 Years with Trend and Seasonality — the actual demand line fluctuates around the average demand over four years, showing an upward trend component, seasonal peaks, and random variation in each of years 1 through 4.]
Seasonal Patterns
Length of Time Before Pattern Is Repeated | Length of Season | Number of Seasons in Pattern
Year | Quarter | 4
Year | Month | 12
Year | Week | 52
Month | Day | 28-31
Week | Day | 7
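To make these components concrete, the minimal sketch below (hypothetical numbers, not from the text) builds a quarterly demand series out of a trend, a seasonal pattern that repeats each year, and random noise:

```python
import random

# Hypothetical quarterly demand = (level + trend) x seasonal index + noise
base = 100.0                            # average demand level
trend_per_quarter = 2.0                 # upward trend component
seasonal_index = [0.8, 0.9, 0.7, 1.6]   # pattern repeats every 4 quarters (one year)

demand = []
for t in range(16):                     # 4 years of quarterly observations
    level = base + trend_per_quarter * t        # trend
    season = seasonal_index[t % 4]              # seasonality
    noise = random.gauss(0, 3)                  # random fluctuation
    demand.append(level * season + noise)

print([round(d, 1) for d in demand])
```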
Quantitative Forecasting Approaches • Linear Regression • Simple Moving Average • Weighted Moving Average • Exponential Smoothing (exponentially weighted moving average) • Exponential Smoothing with Trend (double exponential smoothing)
Long-Range Forecasts • Time spans usually greater than one year • Necessary to support strategic decisions about planning products, processes, and facilities • For long-range forecasting, it is important to plot historical time series data in order to identify data patterns (trends, cycles, seasonality) so that a proper forecasting method can be used.
Simple Linear Regression • Linear regression analysis establishes a relationship between a dependent variable and one or more independent variables. • In simple linear regression analysis there is only one independent variable. • If the data is a time series, the independent variable is the time period. • The dependent variable is whatever we wish to forecast.
Simple Linear Regression • Regression Equation The model is of the form: Y = a + bX where: Y = dependent variable X = independent variable a = y-axis intercept (the height of the line when X = 0) b = slope of the regression line (the amount by which Y increases when X increases by one unit)
Simple Linear Regression • Constants a and b The constants a and b are computed from the n historical (x, y) pairs using the least-squares equations: b = (nΣxy - ΣxΣy) / (nΣx² - (Σx)²) a = (Σy - bΣx) / n
Simple Linear Regression • Once the a and b values are computed, a future value of X can be entered into the regression equation and a corresponding value of Y (the forecast) can be calculated.
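A minimal sketch of these steps in plain Python (the helper names fit_simple_regression and forecast are illustrative, not from the text):

```python
def fit_simple_regression(x, y):
    """Return (a, b) for Y = a + bX using the least-squares formulas above."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)

    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a = (sum_y - b * sum_x) / n
    return a, b

def forecast(a, b, x_future):
    """Enter a future value of X into the regression equation."""
    return a + b * x_future
```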
Example: College Enrollment • Simple Linear Regression At a small regional college enrollments have grown steadily over the past six years, as evidenced below. Use time series regression to forecast the student enrollments for the next three years.
Year | Students Enrolled (1000s)
1 | 2.5
2 | 2.8
3 | 2.9
4 | 3.2
5 | 3.3
6 | 3.4
Example: College Enrollment • Simple Linear Regression
x | y | x² | xy
1 | 2.5 | 1 | 2.5
2 | 2.8 | 4 | 5.6
3 | 2.9 | 9 | 8.7
4 | 3.2 | 16 | 12.8
5 | 3.3 | 25 | 16.5
6 | 3.4 | 36 | 20.4
Σx = 21 | Σy = 18.1 | Σx² = 91 | Σxy = 66.5
Example: College Enrollment • Simple Linear Regression b = [6(66.5) - 21(18.1)] / [6(91) - (21)²] = 18.9 / 105 = 0.180 a = [18.1 - 0.180(21)] / 6 = 2.387 Y = 2.387 + 0.180X
Example: College Enrollment • Simple Linear Regression Y7 = 2.387 + 0.180(7) = 3.65 or 3,650 students Y8 = 2.387 + 0.180(8) = 3.83 or 3,830 students Y9 = 2.387 + 0.180(9) = 4.01 or 4,010 students Note: Enrollment is expected to increase by 180 students per year.
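As a quick check, a few lines of Python applying the fitted equation (values taken from the slide) reproduce the forecasts:

```python
# Apply the fitted regression equation Y = 2.387 + 0.180X to years 7-9.
a, b = 2.387, 0.180
for year in (7, 8, 9):
    students = a + b * year                       # in thousands
    print(f"Year {year}: {students:.2f} thousand "
          f"(~{round(students * 1000, -1):.0f} students)")
```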
Simple Linear Regression • Simple linear regression can also be used when the independent variable X represents a variable other than time. • In this case, linear regression is representative of a class of forecasting models called causal forecasting models.
Example: Railroad Products Co. • Simple Linear Regression – Causal Model The manager of RPC wants to project the sales for the next 3 years. He knows that RPC’s long-range sales are tied very closely to national freight car loadings. On the next slide are 7 years of relevant historical data. Develop a simple linear regression model between RPC sales and national freight car loadings. Forecast RPC sales for the next 3 years, given that the rail industry estimates car loadings of 250, 270, and 300 million.
Example: Railroad Products Co. • Simple Linear Regression – Causal Model
Year | RPC Sales ($ millions) | Car Loadings (millions)
1 | 9.5 | 120
2 | 11.0 | 135
3 | 12.0 | 130
4 | 12.5 | 150
5 | 14.0 | 170
6 | 16.0 | 190
7 | 18.0 | 220
Example: Railroad Products Co. • Simple Linear Regression – Causal Model
x | y | x² | xy
120 | 9.5 | 14,400 | 1,140
135 | 11.0 | 18,225 | 1,485
130 | 12.0 | 16,900 | 1,560
150 | 12.5 | 22,500 | 1,875
170 | 14.0 | 28,900 | 2,380
190 | 16.0 | 36,100 | 3,040
220 | 18.0 | 48,400 | 3,960
Σx = 1,115 | Σy = 93.0 | Σx² = 185,425 | Σxy = 15,440
Example: Railroad Products Co. • Simple Linear Regression – Causal Model b = [7(15,440) - 1,115(93)] / [7(185,425) - (1,115)²] = 4,385 / 54,750 = 0.0801 a = [93 - 0.0801(1,115)] / 7 = 0.528 Y = 0.528 + 0.0801X
Example: Railroad Products Co. • Simple Linear Regression – Causal Model Y8 = 0.528 + 0.0801(250) = $20.55 million Y9 = 0.528 + 0.0801(270) = $22.16 million Y10 = 0.528 + 0.0801(300) = $24.56 million Note: RPC sales are expected to increase by $80,100 for each additional million national freight car loadings.
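For those working in Python 3.10 or later, the standard library's statistics.linear_regression offers a quick cross-check of the fitted coefficients and forecasts (minor rounding differences from the slide values are expected):

```python
from statistics import linear_regression

loadings = [120, 135, 130, 150, 170, 190, 220]        # x: car loadings (millions)
sales = [9.5, 11.0, 12.0, 12.5, 14.0, 16.0, 18.0]     # y: RPC sales ($ millions)

fit = linear_regression(loadings, sales)
print(round(fit.intercept, 3), round(fit.slope, 4))   # ~ 0.528  0.0801

for x in (250, 270, 300):                             # industry loading estimates
    print(x, round(fit.intercept + fit.slope * x, 2)) # ~ 20.55, 22.15, 24.56
```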
Multiple Regression Analysis • Multiple regression analysis is used when there are two or more independent variables. • An example of a multiple regression equation is: Y = 50.0 + 0.05X1 + 0.10X2 – 0.03X3 where: Y = firm’s annual sales ($millions) X1 = industry sales ($millions) X2 = regional per capita income ($thousands) X3 = regional per capita debt ($thousands)
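A small sketch of how the example equation could be evaluated in code; the function name and the input values are hypothetical and chosen only to illustrate the call:

```python
def annual_sales(industry_sales, per_capita_income, per_capita_debt):
    """Y = 50.0 + 0.05*X1 + 0.10*X2 - 0.03*X3 (units as defined above)."""
    return (50.0 + 0.05 * industry_sales
            + 0.10 * per_capita_income
            - 0.03 * per_capita_debt)

print(annual_sales(industry_sales=800.0,     # $millions (hypothetical)
                   per_capita_income=40.0,   # $thousands (hypothetical)
                   per_capita_debt=12.0))    # $thousands (hypothetical)
# -> 50 + 40 + 4 - 0.36 = 93.64 ($millions of firm sales)
```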
Coefficient of Correlation (r) • The coefficient of correlation, r, measures the direction and strength of the linear relationship between x and y. • The sign of r shows the direction of the relationship. • The absolute value of r shows the strength of the relationship. • The sign of r is always the same as the sign of b. • r can take on any value between -1 and +1.
Coefficient of Correlation (r) • Meanings of several values of r: -1 a perfect negative linear relationship (as x increases, y decreases along a straight line) +1 a perfect positive linear relationship (as x increases, y increases along a straight line) 0 no linear relationship exists between x and y +0.3 a weak positive relationship -0.8 a strong negative relationship
Coefficient of Correlation (r) • r is computed by: r = (nΣxy - ΣxΣy) / √{[nΣx² - (Σx)²][nΣy² - (Σy)²]}
Coefficient of Determination (r²) • The coefficient of determination, r², is the square of the coefficient of correlation. • The modification of r to r² allows us to shift from subjective measures of relationship to a more specific measure. • r² is determined by the ratio of explained variation to total variation: r² = Σ(ŷ - ȳ)² / Σ(y - ȳ)²
Coefficient of Determination (r²) • The coefficient of determination, r², is useful because it gives the proportion of the variance (fluctuation) of one variable that is predictable from the other variable. • It is a measure that allows us to determine how certain one can be in making predictions from a certain model/graph. • The coefficient of determination is the ratio of the explained variation to the total variation. • The coefficient of determination is such that 0 ≤ r² ≤ 1, and denotes the strength of the linear association between x and y.
Coefficient of Determination (r²) • The coefficient of determination represents the percent of the total variation in the data that is explained by the line of best fit. For example, if r = 0.922, then r² = 0.850, which means that 85% of the total variation in y can be explained by the linear relationship between x and y (as described by the regression equation). The other 15% of the total variation in y remains unexplained. • The coefficient of determination is a measure of how well the regression line represents the data. If the regression line passed exactly through every point on the scatter plot, it would explain all of the variation. The further the line is from the points, the less it is able to explain.
Example: Railroad Products Co. • Coefficient of Correlation
x | y | x² | xy | y²
120 | 9.5 | 14,400 | 1,140 | 90.25
135 | 11.0 | 18,225 | 1,485 | 121.00
130 | 12.0 | 16,900 | 1,560 | 144.00
150 | 12.5 | 22,500 | 1,875 | 156.25
170 | 14.0 | 28,900 | 2,380 | 196.00
190 | 16.0 | 36,100 | 3,040 | 256.00
220 | 18.0 | 48,400 | 3,960 | 324.00
Σx = 1,115 | Σy = 93.0 | Σx² = 185,425 | Σxy = 15,440 | Σy² = 1,287.50
Example: Railroad Products Co. • Coefficient of Correlation r = [7(15,440) - 1,115(93)] / √{[7(185,425) - (1,115)²][7(1,287.5) - (93)²]} = 4,385 / √[(54,750)(363.5)] = .9829
Example: Railroad Products Co. • Coefficient of Determination r² = (.9829)² = .966 96.6% of the variation in RPC sales is explained by national freight car loadings.
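The same r and r² can be verified with a short sketch using the Pearson formula above (data from the slides):

```python
from math import sqrt

loadings = [120, 135, 130, 150, 170, 190, 220]        # x: car loadings (millions)
sales = [9.5, 11.0, 12.0, 12.5, 14.0, 16.0, 18.0]     # y: RPC sales ($ millions)

n = len(loadings)
sx, sy = sum(loadings), sum(sales)
sxy = sum(x * y for x, y in zip(loadings, sales))
sx2 = sum(x * x for x in loadings)
sy2 = sum(y * y for y in sales)

r = (n * sxy - sx * sy) / sqrt((n * sx2 - sx ** 2) * (n * sy2 - sy ** 2))
print(round(r, 4), round(r ** 2, 3))                  # 0.9829 0.966
```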
Ranging Forecasts • Forecasts for future periods are only estimates and are subject to error. • One way to deal with uncertainty is to develop best-estimate forecasts and the ranges within which the actual data are likely to fall. • The ranges of a forecast are defined by the upper and lower limits of a confidence interval.
Ranging Forecasts • The ranges or limits of a forecast are estimated by: Upper limit = Y + t(syx) Lower limit = Y - t(syx) where: Y = best-estimate forecast t = number of standard deviations from the mean of the distribution to provide a given probability of exceeding the limits through chance syx = standard error of the forecast
Ranging Forecasts • The standard error (deviation) of the forecast is computed as: syx = √[(Σy² - aΣy - bΣxy) / (n - 2)]
Example: Railroad Products Co. • Ranging Forecasts Recall that linear regression analysis provided a forecast of annual sales for RPC in year 8 equal to $20.55 million. Set the limits (ranges) of the forecast so that there is only a 5 percent probability of exceeding the limits by chance.
Example: Railroad Products Co. • Ranging Forecasts • Step 1: Compute the standard error of the forecasts, syx. syx = √{[1,287.5 - 0.528(93) - 0.0801(15,440)] / (7 - 2)} = .5748 • Step 2: Determine the appropriate value for t. n = 7, so degrees of freedom = n - 2 = 5. Level of significance = .05. Appendix B, Table 2 shows t = 2.571.
Example: Railroad Products Co. • Ranging Forecasts • Step 3: Compute upper and lower limits. Upper limit = 20.55 + 2.571(.5748) = 20.55 + 1.478 = 22.028 Lower limit = 20.55 - 2.571(.5748) = 20.55 - 1.478 = 19.072 We are 95% confident the actual sales for year 8 will be between $19.072 and $22.028 million.
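The interval can be reproduced with a short sketch (coefficients and sums taken from the slides; small rounding differences are expected):

```python
from math import sqrt

n, a, b = 7, 0.528, 0.0801
sum_y, sum_xy, sum_y2 = 93.0, 15_440, 1_287.5
t = 2.571                             # 5% significance, n - 2 = 5 degrees of freedom

s_yx = sqrt((sum_y2 - a * sum_y - b * sum_xy) / (n - 2))   # ~ 0.5748
forecast_y8 = a + b * 250                                  # ~ 20.55 ($ millions)

lower = forecast_y8 - t * s_yx
upper = forecast_y8 + t * s_yx
print(round(lower, 2), round(upper, 2))                    # ~ 19.07 and 22.03
```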
Seasonalized Time Series Regression Analysis • Select a representative historical data set. • Develop a seasonal index for each season. • Use the seasonal indexes to deseasonalize the data. • Perform linear regression analysis on the deseasonalized data. • Use the regression equation to compute the forecasts. • Use the seasonal indexes to reapply the seasonal patterns to the forecasts.
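A minimal sketch of these six steps for quarterly data; the history values and helper logic are illustrative (not the CPC example below), and the seasonal index is formed as each quarter's average divided by the overall average:

```python
history = [10.0, 8.0, 6.0, 20.0, 11.0, 9.0, 7.0, 22.0]   # 8 quarters (hypothetical)
seasons = 4                                              # quarters per year

# Steps 1-2: develop a seasonal index for each quarter.
overall_avg = sum(history) / len(history)
index = []
for q in range(seasons):
    quarter_values = history[q::seasons]
    index.append((sum(quarter_values) / len(quarter_values)) / overall_avg)

# Step 3: deseasonalize the history.
deseasonalized = [y / index[t % seasons] for t, y in enumerate(history)]

# Step 4: fit Y = a + bX on the deseasonalized data (least-squares formulas).
x = list(range(1, len(history) + 1))
n = len(x)
sum_x, sum_y = sum(x), sum(deseasonalized)
sum_xy = sum(xi * yi for xi, yi in zip(x, deseasonalized))
sum_x2 = sum(xi * xi for xi in x)
b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
a = (sum_y - b * sum_x) / n

# Steps 5-6: forecast the next four quarters, then reapply the seasonal pattern.
forecasts = [(a + b * t) * index[(t - 1) % seasons] for t in range(n + 1, n + 5)]
print([round(f, 1) for f in forecasts])
```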
Example: Computer Products Corp. • Seasonalized Time Series Regression Analysis An analyst at CPC wants to develop next year's quarterly forecasts of sales revenue for CPC's line of Epsilon Computers. She believes that the most recent 8 quarters of sales (shown on the next slide) are representative of next year's sales.
Example: Computer Products Corp. • Seasonalized Time Series Regression Analysis • Representative Historical Data Set
Year | Qtr. | Sales ($ mil.)
1 | 1 | 7.4
1 | 2 | 6.5
1 | 3 | 4.9
1 | 4 | 16.1
2 | 1 | 8.3
2 | 2 | 7.4
2 | 3 | 5.4
2 | 4 | 18.0