
Simple Linear Regression



Presentation Transcript


  1. Simple Linear Regression

  2. Correlation vs. Regression • A scatter plot (or scatter diagram) can be used to show the relationship between two variables • Correlation analysis is used to measure strength of the association (linear relationship) between two variables • Correlation is only concerned with strength of the relationship • No causal effect is implied with correlation • Correlation was first presented in Chapter 3

  3. Introduction to Regression Analysis • Regression analysis is used to: • Predict the value of a dependent variable based on the value of at least one independent variable • Explain the impact of changes in an independent variable on the dependent variable • Dependent variable: the variable we wish to explain • Independent variable: the variable used to explain the dependent variable

  4. Simple Linear Regression Model • Only one independent variable, X • Relationship between X and Y is described by a linear function • Changes in Y are assumed to be caused by changes in X

  5. Types of Relationships • Linear relationships vs. curvilinear relationships [four scatter-plot panels: two linear, two curvilinear]

  6. Types of Relationships (continued) • Strong relationships vs. weak relationships [four scatter-plot panels]

  7. Types of Relationships (continued) • No relationship [two scatter-plot panels]

  8. Simple Linear Regression Model • The population regression model: Yi = β0 + β1Xi + εi, where Yi = dependent variable, Xi = independent variable, β0 = population Y intercept, β1 = population slope coefficient, and εi = random error term • β0 + β1Xi is the linear component; εi is the random error component
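The population model can be sketched in code. A minimal simulation, assuming hypothetical values for β0, β1, and the error standard deviation (none of these come from the slides):

```python
import random

# Hypothetical population parameters (for illustration only)
beta0 = 98.0    # population Y intercept
beta1 = 0.11    # population slope coefficient
sigma = 40.0    # standard deviation of the random error term

random.seed(1)  # reproducible illustration

def simulate_y(x):
    """One draw from the population model Y = beta0 + beta1*X + error."""
    error = random.gauss(0, sigma)       # random error component
    return beta0 + beta1 * x + error     # linear component + random error

xs = [1100, 1400, 1700, 2000, 2300]
ys = [simulate_y(x) for x in xs]
```

Each observed Y is the value on the population line plus a random error, which is why observed points scatter around the line in the diagram on the next slide.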

  9. Simple Linear Regression Model (continued) [Diagram: for a given Xi, the observed value of Y lies a random error εi above or below the predicted value of Y on the line; intercept = β0, slope = β1]

  10. Simple Linear Regression Equation • The simple linear regression equation provides an estimate of the population regression line: Ŷi = b0 + b1Xi, where Ŷi = estimated (or predicted) Y value for observation i, b0 = estimate of the regression intercept, b1 = estimate of the regression slope, and Xi = value of X for observation i • The individual random error terms ei have a mean of zero

  11. Least Squares Method • b0 and b1 are obtained by finding the values of b0 and b1 that minimize the sum of the squared differences between Yi and Ŷi: min Σ(Yi − Ŷi)² = min Σ(Yi − (b0 + b1Xi))²
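A minimal sketch of the least-squares computation, using the standard closed-form solution for the minimization above; the small four-point data set here is hypothetical, not the slides' house-price sample:

```python
def least_squares(xs, ys):
    """Return (b0, b1) minimizing sum((Yi - (b0 + b1*Xi))**2)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # b1 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    b1 = sxy / sxx
    b0 = y_bar - b1 * x_bar  # the fitted line passes through (Xbar, Ybar)
    return b0, b1

b0, b1 = least_squares([1, 2, 3, 4], [2, 4, 5, 8])  # hypothetical toy data
```

For this toy data the fit is b0 = 0 and b1 = 1.9; these are the same values Excel's Regression tool would report for it.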

  12. Finding the Least Squares Equation • The coefficients b0 and b1, and other regression results in this chapter, will be found using Excel • Formulas are shown in the text at the end of the chapter for those who are interested

  13. Interpretation of the Slope and the Intercept • b0 is the estimated average value of Y when the value of X is zero • b1 is the estimated change in the average value of Y as a result of a one-unit change in X

  14. Simple Linear Regression Example • A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet) • A random sample of 10 houses is selected • Dependent variable (Y) = house price in $1000s • Independent variable (X) = square feet

  15. Sample Data for House Price Model

  16. Graphical Presentation • House price model: scatter plot

  17. Regression Using Excel • Tools / Data Analysis / Regression

  18. Excel Output The regression equation is: house price = 98.24833 + 0.10977 (square feet)

  19. Graphical Presentation • House price model: scatter plot and regression line Slope = 0.10977 Intercept = 98.248

  20. Interpretation of the Intercept, b0 • b0 is the estimated average value of Y when the value of X is zero (if X = 0 is in the range of observed X values) • Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet

  21. Interpretation of the Slope Coefficient, b1 • b1 measures the estimated change in the average value of Y as a result of a one-unit change in X • Here, b1 = .10977 tells us that the average value of a house increases by .10977($1000s) = $109.77 for each additional square foot of size

  22. Predictions using Regression Analysis • Predict the price for a house with 2000 square feet: Ŷ = 98.24833 + 0.10977(2000) = 317.79 ($1,000s) • The predicted price for a house with 2000 square feet is approximately $317,788
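The prediction step in code, using the rounded coefficients from the Excel output shown on the earlier slides (with these rounded values the estimate comes out to about 317.79 in $1,000s):

```python
# Rounded coefficients from the slides' Excel output
b0 = 98.24833   # intercept ($1,000s)
b1 = 0.10977    # slope ($1,000s per square foot)

def predict_price(square_feet):
    """Predicted house price in $1,000s: Yhat = b0 + b1 * X."""
    return b0 + b1 * square_feet

price = predict_price(2000)   # about 317.79 ($1,000s)
```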

  23. Interpolation vs. Extrapolation • When using a regression model for prediction, only predict within the relevant range of the data • The relevant range of observed X values is suitable for interpolation; do not try to extrapolate beyond the range of observed X's

  24. Measures of Variation • Total variation is made up of two parts: SST = SSR + SSE (Total Sum of Squares = Regression Sum of Squares + Error Sum of Squares), where: Ȳ = average value of the dependent variable, Yi = observed values of the dependent variable, Ŷi = predicted value of Y for the given Xi value

  25. Measures of Variation (continued) • SST = total sum of squares • Measures the variation of the Yi values around their mean Ȳ • SSR = regression sum of squares • Explained variation attributable to the relationship between X and Y • SSE = error sum of squares • Variation attributable to factors other than the relationship between X and Y
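These three sums of squares can be computed directly. A minimal sketch using a hypothetical four-point data set together with its least-squares coefficients (b0 = 0, b1 = 1.9), which also confirms SST = SSR + SSE:

```python
def variation_decomposition(xs, ys, b0, b1):
    """Return (SST, SSR, SSE) for the fitted line Yhat = b0 + b1*X."""
    y_bar = sum(ys) / len(ys)
    yhats = [b0 + b1 * x for x in xs]
    sst = sum((y - y_bar) ** 2 for y in ys)               # total variation
    ssr = sum((yh - y_bar) ** 2 for yh in yhats)          # explained variation
    sse = sum((y - yh) ** 2 for y, yh in zip(ys, yhats))  # unexplained variation
    return sst, ssr, sse

# Hypothetical data; (0.0, 1.9) is its least-squares fit
sst, ssr, sse = variation_decomposition([1, 2, 3, 4], [2, 4, 5, 8], 0.0, 1.9)
```

Note that SST = SSR + SSE holds exactly only for the least-squares fit; for any other line the cross term does not cancel.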

  26. Measures of Variation (continued) [Diagram: for each observation, the total deviation splits into an explained and an unexplained part: SST = Σ(Yi − Ȳ)², SSR = Σ(Ŷi − Ȳ)², SSE = Σ(Yi − Ŷi)²]

  27. Coefficient of Determination, r2 • The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable • The coefficient of determination is also called r-squared and is denoted as r2: r2 = SSR / SST • note: 0 ≤ r2 ≤ 1
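r2 follows directly from the sums of squares. A minimal sketch, again with a hypothetical four-point data set and its least-squares coefficients (b0 = 0, b1 = 1.9):

```python
def r_squared(xs, ys, b0, b1):
    """Coefficient of determination: r^2 = SSR / SST."""
    y_bar = sum(ys) / len(ys)
    yhats = [b0 + b1 * x for x in xs]
    sst = sum((y - y_bar) ** 2 for y in ys)       # total variation
    ssr = sum((yh - y_bar) ** 2 for yh in yhats)  # explained variation
    return ssr / sst

# Hypothetical data; (0.0, 1.9) is its least-squares fit
r2 = r_squared([1, 2, 3, 4], [2, 4, 5, 8], 0.0, 1.9)
```

Here r2 is about 0.963, meaning roughly 96.3% of the variation in Y is explained by variation in X for this toy data.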

  28. Examples of Approximate r2 Values • r2 = 1: perfect linear relationship between X and Y; 100% of the variation in Y is explained by variation in X [two scatter-plot panels with all points on the line]

  29. Examples of Approximate r2 Values (continued) • 0 < r2 < 1: weaker linear relationships between X and Y; some but not all of the variation in Y is explained by variation in X [two scatter-plot panels]

  30. Examples of Approximate r2 Values (continued) • r2 = 0: no linear relationship between X and Y; the value of Y does not depend on X (none of the variation in Y is explained by variation in X) [scatter-plot panel with a horizontal fitted line]

  31. Excel Output 58.08% of the variation in house prices is explained by variation in square feet

  32. Standard Error of Estimate • The standard deviation of the variation of observations around the regression line is estimated by SYX = √(SSE / (n − 2)), where SSE = error sum of squares and n = sample size
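A minimal sketch of the standard error computation, using a hypothetical four-point data set and its least-squares coefficients (b0 = 0, b1 = 1.9); the divisor is n − 2 because two parameters, b0 and b1, are estimated from the data:

```python
import math

def standard_error_estimate(xs, ys, b0, b1):
    """S_YX = sqrt(SSE / (n - 2)); n - 2 because b0 and b1 are estimated."""
    n = len(xs)
    sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
    return math.sqrt(sse / (n - 2))

# Hypothetical data; (0.0, 1.9) is its least-squares fit
syx = standard_error_estimate([1, 2, 3, 4], [2, 4, 5, 8], 0.0, 1.9)
```

S_YX is in the same units as Y, which is why the next slide judges its magnitude relative to the Y values in the sample.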

  33. Excel Output

  34. Comparing Standard Errors • SYX is a measure of the variation of observed Y values from the regression line [two scatter-plot panels: small SYX vs. large SYX] • The magnitude of SYX should always be judged relative to the size of the Y values in the sample data; i.e., SYX = $41.33K is moderately small relative to house prices in the $200K to $300K range

  35. Assumptions of Regression • Normality of Error • Error values (ε) are normally distributed for any given value of X • Homoscedasticity • The probability distribution of the errors has constant variance • Independence of Errors • Error values are statistically independent

  36. Residual Analysis • The residual for observation i, ei, is the difference between its observed and predicted value • Check the assumptions of regression by examining the residuals • Examine for linearity assumption • Examine for constant variance for all levels of X (homoscedasticity) • Evaluate normal distribution assumption • Evaluate independence assumption • Graphical Analysis of Residuals • Can plot residuals vs. X
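Computing the residuals is a one-liner. A minimal sketch, using a hypothetical four-point data set and its least-squares coefficients (b0 = 0, b1 = 1.9); these values would then be plotted against X for the graphical checks on the next slides:

```python
def residuals(xs, ys, b0, b1):
    """e_i = Y_i - Yhat_i for each observation."""
    return [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

# Hypothetical data; (0.0, 1.9) is its least-squares fit,
# so the residuals sum to (essentially) zero
e = residuals([1, 2, 3, 4], [2, 4, 5, 8], 0.0, 1.9)
```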

  37. Residual Analysis for Linearity [Residual plots: a curved pattern in residuals vs. X indicates the relationship is not linear; a random scatter indicates linearity]

  38. Residual Analysis for Homoscedasticity [Residual plots: a fanning spread in residuals vs. X indicates non-constant variance; an even band indicates constant variance]

  39. Residual Analysis for Independence [Residual plots: a systematic pattern in residuals vs. X indicates the errors are not independent; a random scatter indicates independence]
