Section 10.3: Regression Objective Given two linearly correlated variables (x and y), find the linear function (equation) that best describes the trend.
Equation of a line Recall that the equation of a line is given by its slope and y-intercept: y = mx + b
Regression For a set of data (with variables x and y) that is linearly correlated, we want to find the equation of the line that best describes the trend. This process is called regression.
Definitions x: The predictor variable (also called the explanatory variable or independent variable) y: The response variable (also called the dependent variable) Regression Equation: The equation that describes the algebraic relationship between the two variables Regression Line: The graph of the regression equation (also called the line of best fit or least-squares line)
Definitions Regression Equation: ŷ = b0 + b1x, where b0 is the y-intercept and b1 is the slope. Regression Line: the graph of the regression equation.
Notation for the Regression Equation
            y-intercept   Slope   Equation
Population  β0            β1      y = β0 + β1x
Sample      b0            b1      ŷ = b0 + b1x
Requirements 1. The sample of paired (x, y) data is a random sample of quantitative data. 2. Visual examination of the scatterplot shows that the points approximate a straight-line pattern. 3. Any outliers must be removed if they are known to be errors. Consider the effects of any outliers that are not known errors.
Rounding b0 and b1 Round to three significant digits. If you use the formulas from the book, do not round intermediate values.
Example 1 Refer to the sample data given in Table 10-1 in the Chapter Problem. Find the equation of the regression line in which the explanatory variable (x-variable) is the cost of a slice of pizza and the response variable (y-variable) is the corresponding cost of a subway fare. (CPI=Consumer Price Index, not used)
Example 1 x : 0.15 0.35 1.00 1.25 1.75 2.00 y : 0.15 0.35 1.00 1.35 1.50 2.00 1. Enter data in StatCrunch (columns)
Example 1 x : 0.15 0.35 1.00 1.25 1.75 2.00 y : 0.15 0.35 1.00 1.35 1.50 2.00 2. Stat – Regression – Simple Linear
Example 1 x : 0.15 0.35 1.00 1.25 1.75 2.00 y : 0.15 0.35 1.00 1.35 1.50 2.00 3. Select var1 and var2 (i.e., the x and y values), then click Calculate
Example 1 x : 0.15 0.35 1.00 1.25 1.75 2.00 y : 0.15 0.35 1.00 1.35 1.50 2.00 b0 = 0.0345, b1 = 0.945 Regression Equation: ŷ = 0.0345 + 0.945x
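The coefficients StatCrunch reports can also be computed directly with the standard least-squares formulas. The sketch below (plain Python, no libraries) reproduces b0 and b1 for the Example 1 data:

```python
# Least-squares estimates for the Example 1 data, computed from the
# standard textbook formulas (a check on the StatCrunch output).
x = [0.15, 0.35, 1.00, 1.25, 1.75, 2.00]  # cost of a slice of pizza
y = [0.15, 0.35, 1.00, 1.35, 1.50, 2.00]  # cost of a subway fare

n = len(x)
sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)

# Slope: b1 = (n*Sxy - Sx*Sy) / (n*Sx2 - Sx^2); intercept: b0 = ybar - b1*xbar
b1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
b0 = sum_y / n - b1 * (sum_x / n)

print(round(b0, 4), round(b1, 3))  # matches the slide values to rounding
```

Note that intermediate values are not rounded, per the rounding rule above; only the final coefficients are rounded to three significant digits.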
Example 1 Regression Equation: ŷ = 0.0345 + 0.945x
Using the Regression Equation for Predictions 1. The predicted value of y is ŷ = b0 + b1x. 2. Use the regression equation for predictions only if the graph of the regression line on the scatterplot confirms that the regression line fits the points reasonably well. 3. Use the regression equation for predictions only if the linear correlation coefficient r indicates that there is a linear correlation between the two variables.
Using the Regression Equation for Predictions 4. Use the regression line for predictions only if the value of x does not go much beyond the scope of the available sample data. Predicting too far beyond the scope of the available sample data is called extrapolation, and it could result in bad predictions. 5. If the regression equation does not appear to be useful for making predictions, the best predicted value of a variable is its point estimate, which is its sample mean ȳ.
Using the Regression Equation for Predictions Source: www.xkcd.com
Using the Regression Equation for Predictions If the regression equation is not a good model, the best predicted value of y is simply ȳ (the mean of the y values). Remember, this strategy applies to linear patterns of points in a scatterplot.
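The prediction rules above can be sketched as a small helper. This is an illustrative function (the name and parameters are assumptions, not from the book): if the model fits well, use ŷ = b0 + b1x; otherwise fall back to ȳ.

```python
# Prediction sketch following the rules above: use the regression equation
# only when the model is a good fit; otherwise the best prediction is ybar.
def predict(x_new, b0, b1, y_values, good_model):
    """Return b0 + b1*x_new if the model is usable, else the mean of y."""
    if good_model:
        return b0 + b1 * x_new
    return sum(y_values) / len(y_values)

y_vals = [0.15, 0.35, 1.00, 1.35, 1.50, 2.00]  # Example 1 subway fares

# Good model: predict the subway fare for a $1.50 slice (inside the data range,
# so this is not extrapolation): 0.0345 + 0.945 * 1.50
print(predict(1.50, 0.0345, 0.945, y_vals, good_model=True))

# Poor model: the best prediction is simply ybar
print(predict(1.50, 0.0345, 0.945, y_vals, good_model=False))
```

Choosing x inside the range of the sample data (0.15 to 2.00) respects rule 4; a value such as x = 10 would be extrapolation.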
Definition For a pair of sample x and y values, the residual is the difference between the observed sample value of y and the y-value that is predicted by using the regression equation. That is, Residual = (observed y) – (predicted y) = y – ŷ
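As a concrete instance of this definition, here is the residual for one data point from Example 1, using the fitted line from the slides:

```python
# Residual for one Example 1 point, using the fitted line yhat = 0.0345 + 0.945x.
x_obs, y_obs = 1.25, 1.35            # observed (x, y) pair from the table
y_pred = 0.0345 + 0.945 * x_obs      # predicted y from the regression equation
residual = y_obs - y_pred            # (observed y) - (predicted y)
print(round(y_pred, 5), round(residual, 5))
```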
Definition A straight line satisfies the least-squares property if the sum of the squares of the residuals is the smallest sum possible. The regression line satisfies this property (which is why it is also called the least-squares line).
Least-Squares Property sum = (−5)² + 11² + (−13)² + 7² = 364 (any other line would yield a sum larger than 364)
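The least-squares property can be checked numerically on the Example 1 data: the sum of squared residuals for the regression line is smaller than for nearby alternative lines. A minimal sketch (the nudge amounts are arbitrary illustrations):

```python
# Checking the least-squares property on the Example 1 data: nudging the
# fitted line in any direction increases the sum of squared residuals.
x = [0.15, 0.35, 1.00, 1.25, 1.75, 2.00]
y = [0.15, 0.35, 1.00, 1.35, 1.50, 2.00]

def sse(b0, b1):
    """Sum of squared residuals (y - yhat)^2 for the line yhat = b0 + b1*x."""
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

best = sse(0.0345, 0.945)            # the regression line from Example 1
for db0, db1 in [(0.1, 0), (-0.1, 0), (0, 0.05), (0, -0.05)]:
    # Every nudged line yields a larger sum of squared residuals.
    assert sse(0.0345 + db0, 0.945 + db1) > best
print(round(best, 4))
```

(The sum 364 on the slide comes from a separate illustrative dataset with residuals −5, 11, −13, and 7, not from the Example 1 data used here.)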