ENGR 610 Applied Statistics, Fall 2007 - Week 10
Marshall University CITE
Jack Smith
Overview for Today
• Collect and go over Exam #2
• Simple Linear Regression, Ch 12
  • Types of Regression Models
  • Least-Squares Linear Regression
  • Measures of Variation in Regression
  • Assumptions of Regression (and Correlation)
  • Residual Analysis
  • t and F Tests for Slope
  • Confidence and Prediction Intervals
  • Correlation
• Homework assignment
Regression Modeling
• Analysis of variance used to “fit” a predictive model for a response (dependent) variable to a set of one or more explanatory (independent) variables
• Interpolative over the relevant range, i.e., not extrapolative
• Typically linear, but may be curvilinear or more complex
• Related to correlation analysis, which measures the strength of association between variables
Types of Regression Models
• Scatter plots: Y vs. X (dependent vs. independent)
• Linear models
  • Positive, negative, or no slope
  • Zero or non-zero intercept
• Curvilinear models
  • Positive, negative, or no “slope”
  • Positive, negative, or varied curvature
  • May be U-shaped, with extrema
  • May be asymptotically or piece-wise linear
  • May be polynomial, exponential, inverse, …
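For concreteness, a few standard functional forms behind these categories (illustrative examples, not from the slides):

\[
\begin{aligned}
\text{Linear:} \quad & Y = \beta_0 + \beta_1 X + \varepsilon \\
\text{Quadratic (curvilinear):} \quad & Y = \beta_0 + \beta_1 X + \beta_2 X^2 + \varepsilon \\
\text{Exponential:} \quad & Y = \beta_0\, e^{\beta_1 X} \\
\text{Inverse:} \quad & Y = \beta_0 + \beta_1 / X + \varepsilon
\end{aligned}
\]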
Least-Squares Linear Regression
• Simple linear model (for the population): $Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$
  • $X_i$ = value of the independent variable
  • $Y_i$ = observed value of the dependent variable
  • $\beta_0$ = Y-intercept (Y at X = 0)
  • $\beta_1$ = slope ($\Delta Y / \Delta X$)
  • $\varepsilon_i$ = random error for observation i
• Predicted value: $Y_i' = b_0 + b_1 X_i$
  • $b_0$ and $b_1$ are called regression coefficients
• Residual: $e_i = Y_i - Y_i'$
• Minimize $\sum e_i^2$ for the sample with respect to $b_0$ and $b_1$ (see the sketch below)
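A minimal sketch of this fit in Python; the X and Y values are made up for illustration, not taken from the course:

```python
# Least-squares fit of Y = b0 + b1*X by minimizing the sum of squared residuals.
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # independent variable (illustrative)
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])  # observed dependent variable (illustrative)

SSX  = np.sum((X - X.mean()) ** 2)              # sum of squares of X about its mean
SSXY = np.sum((X - X.mean()) * (Y - Y.mean()))  # cross-product sum

b1 = SSXY / SSX                # slope
b0 = Y.mean() - b1 * X.mean()  # Y-intercept

Y_pred = b0 + b1 * X           # predicted values Yi'
e = Y - Y_pred                 # residuals ei
print(b0, b1, np.sum(e ** 2))  # coefficients and SSE
```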
Partitioning of Variation (see Fig 12.7, p 577)
• Total variation about the mean response: SST
• Regression (explained) variation: SSR
• Random (error) variation: SSE
• $SST = SSR + SSE$
• Coefficient of determination: $r^2 = SSR / SST$
• Standard error of the estimate: $s_{YX}$ (defined below)
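The slide's formula images did not survive extraction; the standard definitions are:

\[
SST = \sum_{i=1}^{n} (Y_i - \bar{Y})^2,
\qquad
SSR = \sum_{i=1}^{n} (Y_i' - \bar{Y})^2,
\qquad
SSE = \sum_{i=1}^{n} (Y_i - Y_i')^2,
\qquad
s_{YX} = \sqrt{\frac{SSE}{n-2}}.
\]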
Assumptions of Regression (and Correlation)
• Normality of error about the regression line
• Homoscedasticity (equal variance) along X
• Independence of errors with respect to X
  • No autocorrelation in time
• Analysis of residuals to test these assumptions (see the sketch below)
  • Histogram, box-and-whisker plots
  • Normal probability plot
  • Ordered plots (by X, by time, …)
• See figures on pp 584-5
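A minimal residual-analysis sketch in Python, again with made-up data; a pattern or fanning in the left plot, or strong skew in the histogram, would suggest a violated assumption:

```python
# Fit a line, then inspect the residuals for patterns and non-normality.
import numpy as np
import matplotlib.pyplot as plt

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # illustrative data
Y = np.array([2.2, 3.8, 6.1, 8.2, 9.7, 12.3])

b1, b0 = np.polyfit(X, Y, 1)   # least-squares slope and intercept
e = Y - (b0 + b1 * X)          # residuals

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.scatter(X, e)                    # residuals vs X: look for patterns or fanning
ax1.axhline(0.0, color="black")
ax1.set_xlabel("X"); ax1.set_ylabel("residual")
ax2.hist(e, bins=6)                  # rough check of normality of errors
ax2.set_xlabel("residual")
plt.show()
```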
t Test for Slope
• $H_0$: $\beta_1 = 0$
• Critical t value based on the chosen level of significance, $\alpha$, and $n-2$ degrees of freedom
• The test statistic is reconstructed below
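The slide's equation image is missing; the standard test statistic is:

\[
t = \frac{b_1 - \beta_1}{S_{b_1}} = \frac{b_1}{S_{b_1}} \ \text{under } H_0,
\qquad
S_{b_1} = \frac{s_{YX}}{\sqrt{SSX}},
\qquad
SSX = \sum_{i=1}^{n} (X_i - \bar{X})^2.
\]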
F Test for Slope
• $F = MSR / MSE$
• Reject $H_0$ if $F > F_U(\alpha, k, n-k-1)$ [or if $p < \alpha$]
• Note: $t^2(\alpha, n-2) = F_U(\alpha, 1, n-2)$
• One-way ANOVA summary (reconstructed in the table below)
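A reconstruction of the standard regression ANOVA summary table the slide refers to (k = number of explanatory variables; k = 1 for simple linear regression):

Source       df         Sum of Squares   Mean Square           F
Regression   k          SSR              MSR = SSR / k         MSR / MSE
Error        n - k - 1  SSE              MSE = SSE / (n-k-1)
Total        n - 1      SST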
Confidence and Prediction Intervals
• Confidence interval estimate for the slope
• Confidence interval estimate for the mean response
• Prediction interval estimate for an individual response
• See Fig 12.16, p 592
• Standard formulas are sketched below
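The slide's formula images are missing; the standard forms for simple regression are:

\[
\begin{aligned}
\text{Slope:} \quad & b_1 \pm t_{\alpha/2,\,n-2}\, S_{b_1} \\
\text{Mean response at } X_i: \quad & Y_i' \pm t_{\alpha/2,\,n-2}\, s_{YX} \sqrt{h_i} \\
\text{Individual response at } X_i: \quad & Y_i' \pm t_{\alpha/2,\,n-2}\, s_{YX} \sqrt{1 + h_i}
\end{aligned}
\qquad
h_i = \frac{1}{n} + \frac{(X_i - \bar{X})^2}{SSX}.
\]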
Pitfalls
• Not testing the assumptions of least-squares regression by analyzing the residuals, looking for:
  • Patterns
  • Outliers
  • Non-uniform distribution about the mean
  • See Figs 12.18-19, pp 597-8
• Not being aware of alternatives to least-squares regression when assumptions are violated
• Not knowing the subject matter being modeled
Computing by Hand
• Slope
• Y-intercept
• The formulas are reconstructed below
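The slide's equations did not survive extraction; the standard hand-computation formulas are:

\[
b_1 = \frac{SSXY}{SSX}
    = \frac{\sum X_i Y_i - \dfrac{(\sum X_i)(\sum Y_i)}{n}}
           {\sum X_i^2 - \dfrac{(\sum X_i)^2}{n}},
\qquad
b_0 = \bar{Y} - b_1 \bar{X}.
\]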
Computing by Hand
• Measures of variation: SST, SSR, SSE
• Computational forms are reconstructed below
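Again the formula images are missing; the usual computational forms, equivalent to the definitions given earlier, are:

\[
\begin{aligned}
SST &= \sum Y_i^2 - \frac{(\sum Y_i)^2}{n}, \\
SSR &= b_0 \sum Y_i + b_1 \sum X_i Y_i - \frac{(\sum Y_i)^2}{n}, \\
SSE &= \sum Y_i^2 - b_0 \sum Y_i - b_1 \sum X_i Y_i.
\end{aligned}
\]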
Coefficient of Correlation
• For a regression: $r = \pm\sqrt{r^2}$, taking the sign of $b_1$
• In general, defined via the covariance (see below)
• Also called Pearson’s product-moment correlation coefficient
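The general definition the slide refers to (its formula images are missing):

\[
r = \frac{\operatorname{cov}(X, Y)}{s_X\, s_Y},
\qquad
\operatorname{cov}(X, Y) = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{n - 1}.
\]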
t Test for Correlation
• $H_0$: $\rho = 0$
• Critical t value based on the chosen level of significance, $\alpha$, and $n-2$ degrees of freedom
• Or compare $t^2(\alpha, n-2)$ to $F_U(\alpha, 1, n-2)$
• The test statistic is reconstructed below
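The standard test statistic (the slide's equation image is missing):

\[
t = \frac{r - \rho}{\sqrt{\dfrac{1 - r^2}{n - 2}}}
  = \frac{r\,\sqrt{n-2}}{\sqrt{1 - r^2}} \ \text{under } H_0.
\]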
Homework
• Work through Appendix 12.1
• Work and hand in Problem 12.56
• Read Ch 13, “Multiple Regression”