Chapter 12: Linear Regression
Introduction
• Regression analysis and analysis of variance are the two most widely used statistical procedures.
• Regression analysis is used for:
• Description
• Prediction
• Estimation
12.1 Simple Linear Regression
The simple linear regression model is
Y = β0 + β1X + ε (12.1)
where ε is a random error term, and the fitted prediction equation is
Ŷ = b0 + b1X (12.2)
where b0 and b1 are the least squares estimates of β0 and β1.
12.1 Simple Linear Regression

The regression equation is
Y = 55.9 - 0.641 X

Predictor       Coef   SE Coef       T      P
Constant      55.923     2.824   19.80  0.000
X           -0.64067   0.04332  -14.79  0.000

S = 0.888854   R-Sq = 95.6%   R-Sq(adj) = 95.2%

Analysis of Variance
Source          DF      SS      MS       F      P
Regression       1  172.77  172.77  218.67  0.000
Residual Error  10    7.90    0.79
Total           11  180.67
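Output like the above can be reproduced from the least squares formulas behind Eqs. (12.1) and (12.2). The sketch below uses made-up (x, y) data, since the data set behind the Minitab output is not reproduced here; the variable names and values are illustrative only.

```python
import numpy as np

# Hypothetical data roughly consistent with the fitted line above;
# the actual data set behind the Minitab output is not shown in this excerpt.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0])
y = np.array([55.2, 54.7, 54.0, 53.3, 52.6, 52.1, 51.4, 50.8, 50.1, 49.5, 48.9, 48.2])

n = len(x)
sxx = np.sum((x - x.mean()) ** 2)
sxy = np.sum((x - x.mean()) * (y - y.mean()))

b1 = sxy / sxx                   # slope estimate
b0 = y.mean() - b1 * x.mean()    # intercept estimate
y_hat = b0 + b1 * x              # fitted values, as in Eq. (12.2)

sse = np.sum((y - y_hat) ** 2)          # residual (error) sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)   # regression sum of squares
s = np.sqrt(sse / (n - 2))              # residual standard deviation ("S")
r_sq = ssr / (ssr + sse)                # R-Sq
f_stat = (ssr / 1) / (sse / (n - 2))    # ANOVA F statistic

print(f"Y = {b0:.3f} {b1:+.5f} X   S = {s:.4f}   R-Sq = {100 * r_sq:.1f}%   F = {f_stat:.2f}")
```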
12.3 Assumptions
• The assumptions concern the error term ε in model (12.1): the errors are assumed to be independent and normally distributed with mean zero and constant variance.
12.7 Regression Control Chart
• The fitted regression line serves as the centerline of the chart, with the control limits given by Eqs. (12.5) and (12.6).
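A minimal sketch of this idea, assuming the common form in which the limits are placed a fixed number of residual standard deviations above and below the fitted line (this is not necessarily the exact form of Eqs. (12.5) and (12.6)):

```python
import numpy as np

def regression_control_limits(x, y, k=3.0):
    """Fit y on x and return the centerline and +/- k*s limits at each x.

    Assumes the simple form of a regression control chart: fitted line as
    centerline, limits at k residual standard deviations on either side.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    fitted = b0 + b1 * x
    s = np.sqrt(np.sum((y - fitted) ** 2) / (n - 2))  # residual std. deviation
    return fitted, fitted - k * s, fitted + k * s

# Example with made-up in-control data
x = np.arange(1, 21, dtype=float)
y = 10.0 + 0.5 * x + np.random.default_rng(1).normal(0, 0.2, size=20)
center, lcl, ucl = regression_control_limits(x, y)
print(np.column_stack([x, y, center, lcl, ucl])[:3])
```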
12.8 Cause-Selecting Control Chart
• The general idea is to distinguish quality problems that occur at one stage of a process from problems that occur at a previous processing step.
• Let Y be the output from the second step and let X denote the output from the first step. The relationship between X and Y would be modeled, and the chart is based on that model (see the sketch below).
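One common way to implement this is to regress Y on X and chart the residuals, so that an out-of-control signal points to the second step rather than to variation inherited from the first step. A sketch under that assumption (specific variants, e.g., using standardized residuals, differ in detail):

```python
import numpy as np

def cause_selecting_chart(x, y, k=3.0):
    """Chart the residuals of Y (second-step output) on X (first-step output).

    Points outside +/- k residual standard deviations suggest a problem at the
    second step rather than one inherited from the first step.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    residuals = y - (b0 + b1 * x)
    s = residuals.std(ddof=2)          # residual standard deviation (n - 2 df)
    flags = np.abs(residuals) > k * s  # True where the second step signals
    return residuals, flags

# Usage: res, flags = cause_selecting_chart(first_step_output, second_step_output)
```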
12.9 Linear, Nonlinear, and Nonparametric Profiles
• A profile refers to the quality of a process or product being characterized by a (linear, nonlinear, or nonparametric) relationship between a response variable and one or more explanatory variables.
• One possible approach is to monitor each parameter in the fitted model with a Shewhart chart (see the sketch below).
• The independent variables must be fixed.
• A control chart for R² can also be used.
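For a linear profile this amounts to fitting the line to each sampled profile at the fixed x values and charting the estimated intercept, slope, and R² separately. A minimal sketch, assuming linear profiles (the function and argument names are illustrative):

```python
import numpy as np

def linear_profile_stats(x, profiles):
    """For each sampled profile (one row of y-values taken at the fixed x
    values), return the estimated intercept, slope, and R-squared, each of
    which can then be plotted on its own Shewhart (individuals) chart.
    """
    x = np.asarray(x, dtype=float)
    sxx = np.sum((x - x.mean()) ** 2)
    stats = []
    for y in np.asarray(profiles, dtype=float):
        b1 = np.sum((x - x.mean()) * (y - y.mean())) / sxx
        b0 = y.mean() - b1 * x.mean()
        fitted = b0 + b1 * x
        r_sq = 1.0 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
        stats.append((b0, b1, r_sq))
    return np.array(stats)   # columns: intercept, slope, R^2
```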
12.10 Inverse Regression
• An important application of simple linear regression for quality improvement is in the area of calibration.
• Assume two measuring tools are available: one is quite accurate but expensive to use, and the other is less expensive but also less accurate. If the measurements obtained from the two devices are highly correlated, then the measurement that would have been made with the expensive device can be predicted fairly well from the measurement made with the less expensive device.
• Let Y = measurement from the less expensive device and X = measurement from the accurate device.
12.10 Inverse Regression: Example

 Y     X
2.3   2.4
2.5   2.6
2.4   2.5
2.8   2.9
2.9   3.0
2.6   2.7
2.4   2.5
2.2   2.3
2.1   2.2
2.7   2.7
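A sketch of the calibration calculation with the ten pairs above: fit Y on X by least squares and then invert the fitted line to predict the accurate-device value corresponding to a new reading from the less expensive device. (The chapter may instead regress X on Y directly; only the inverted-fit version is shown here, and the new reading is hypothetical.)

```python
import numpy as np

# Data from the example above: Y = less expensive device, X = accurate device
y = np.array([2.3, 2.5, 2.4, 2.8, 2.9, 2.6, 2.4, 2.2, 2.1, 2.7])
x = np.array([2.4, 2.6, 2.5, 2.9, 3.0, 2.7, 2.5, 2.3, 2.2, 2.7])

# Fit Y on X by least squares
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Calibration: given a new reading y_new from the less expensive device,
# predict what the accurate device would have read by inverting the line.
y_new = 2.5                      # hypothetical new reading
x_pred = (y_new - b0) / b1
print(f"Y = {b0:.3f} + {b1:.3f} X,  predicted X for Y = {y_new}: {x_pred:.3f}")
```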
12.12 Issues in Multiple Regression
12.12.1 Variable Selection
12.12.3 Multicollinear Data
• Problems occur when two or more of the regressors are strongly related (nearly linearly dependent).
• Solutions:
• Discard one or more of the variables causing the multicollinearity
• Use ridge regression (see the sketch below)
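Ridge regression stabilizes the coefficient estimates by adding a constant λ to the diagonal of X'X before solving the normal equations. A minimal sketch of the standard ridge estimator on standardized regressors (the value of λ here is arbitrary; how it should be chosen is not covered in this excerpt):

```python
import numpy as np

def ridge_coefficients(X, y, lam=1.0):
    """Ridge estimator b = (X'X + lam*I)^(-1) X'y on standardized regressors.

    Standardizing the columns first is the usual practice so that the ridge
    penalty treats all regressors on the same scale; the intercept is left
    unpenalized because the response is centered.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardized regressors
    yc = y - y.mean()                                   # centered response
    p = Xs.shape[1]
    b = np.linalg.solve(Xs.T @ Xs + lam * np.eye(p), Xs.T @ yc)
    return b  # coefficients on the standardized scale
```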