
Chapter 4

Chapter 4. Linear Regression with One Regressor (SW Chapter 4).


  1. Chapter 4 Linear Regression with One Regressor

  2. Linear Regression with One Regressor (SW Chapter 4) • Linear regression allows us to estimate, and make inferences about, population slope coefficients. Ultimately our aim is to estimate the causal effect on Y of a unit change in X – but for now, just think of the problem of fitting a straight line to data on two variables, Y and X.

  3. Statistical, or econometric, inference about the slope entails: Estimation: How should we draw a line through the data to estimate the (population) slope? (Answer: ordinary least squares.) What are the advantages and disadvantages of OLS? Hypothesis testing: How do we test whether the slope is zero? Confidence intervals: How do we construct a confidence interval for the slope? The problems of statistical inference for linear regression are, at a general level, the same as for estimation of the mean or of the differences between two means.

  4. Linear Regression: Some Notation and Terminology (SW Section 4.1)

  5. The Population Linear Regression Model – general notation

  6. This terminology in a picture: Observations on Y and X; the population regression line; and the regression error (the “error term”):

  7. The Ordinary Least Squares Estimator (SW Section 4.2)

  8. Mechanics of OLS

  9. The OLS estimator solves:
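The slide's formulas did not survive transcription. As a sketch: the OLS estimator solves min over (b0, b1) of Σ(Yi − b0 − b1·Xi)², which yields β̂1 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)² and β̂0 = Ȳ − β̂1·X̄. A minimal Python illustration on simulated (hypothetical) data:

```python
import numpy as np

# Hypothetical data, for illustration only: Y = 2 + 0.5*X + u
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 0.5 * x + rng.normal(size=100)

# Closed-form OLS estimates:
# beta1_hat = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# beta0_hat = Ybar - beta1_hat * Xbar
beta0_hat = y.mean() - beta1_hat * x.mean()

# Predicted values and residuals (slide 12's objects)
y_hat = beta0_hat + beta1_hat * x   # fitted values on the estimated line
u_hat = y - y_hat                   # residuals; they sum to zero by construction
```

With an intercept in the regression, the residuals always sum to exactly zero; this is one of the first-order conditions of the minimization problem.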

  10. Application to the California Test Score – Class Size data

  11. Interpretation of the estimated slope and intercept

  12. Predicted values & residuals:

  13. OLS regression: STATA output

  14. Measures of Fit (SW Section 4.3)

  15. The Standard Error of the Regression (SER)

  16. Example of the R2 and the SER
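The slide's worked example was lost in transcription; as a sketch on simulated (hypothetical) data, the two measures of fit are R² = 1 − SSR/TSS (the fraction of the sample variance of Y explained by X) and SER = √(SSR/(n − 2)) (an estimator of the standard deviation of the regression error):

```python
import numpy as np

# Hypothetical data, for illustration only
rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

# OLS fit (closed form)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
u_hat = y - (b0 + b1 * x)

ssr = np.sum(u_hat ** 2)            # sum of squared residuals
tss = np.sum((y - y.mean()) ** 2)   # total sum of squares of Y
r2 = 1 - ssr / tss                  # R^2: fraction of var(Y) explained by X
ser = np.sqrt(ssr / (n - 2))        # SER: divides by n-2 (two estimated coefficients)
```

In the one-regressor case R² also equals the squared sample correlation between X and Y, which gives an easy numerical check.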

  17. The Least Squares Assumptions (SW Section 4.4)

  18. The Least Squares Assumptions

  19. Least squares assumption #1: E(u|X = x) = 0.

  20. Least squares assumption #1, ctd.

  21. Least squares assumption #2: (Xi,Yi), i = 1,…,n are i.i.d.

  22. Least squares assumption #3: Large outliers are rare. Technical statement: E(X⁴) < ∞ and E(Y⁴) < ∞

  23. OLS can be sensitive to an outlier:
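The slide's scatterplot did not survive transcription. A minimal sketch (on hypothetical simulated data) of the point it makes: because OLS minimizes *squared* errors, a single extreme observation can drag the estimated slope far from the value the rest of the data imply.

```python
import numpy as np

def ols_slope(x, y):
    """Closed-form OLS slope: sum((Xi-Xbar)(Yi-Ybar)) / sum((Xi-Xbar)^2)."""
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Hypothetical data, for illustration only: true slope is 1
rng = np.random.default_rng(2)
x = rng.normal(size=50)
y = 1.0 + 1.0 * x + 0.1 * rng.normal(size=50)

slope_clean = ols_slope(x, y)        # close to the true slope of 1

# Add one extreme, high-leverage point far from the cloud
x_out = np.append(x, 10.0)
y_out = np.append(y, -20.0)
slope_outlier = ols_slope(x_out, y_out)  # the single outlier flips the slope's sign
```

This is why assumption #3 (finite fourth moments, so large outliers are rare) matters for the reliability of OLS.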

  24. The Sampling Distribution of the OLS Estimator (SW Section 4.5)

  25. Probability Framework for Linear Regression

  26. The Sampling Distribution of β̂1

  27. The mean and variance of the sampling distribution of β̂1

  28. Now we can calculate E(β̂1) and var(β̂1):

  29. Next calculate var(β̂1):

  30. What is the sampling distribution of β̂1?

  31. Large-n approximation to the distribution of β̂1:

  32. The larger the variance of X, the smaller the variance of β̂1

  33. The larger the variance of X, the smaller the variance of β̂1, ctd.

  34. Summary of the sampling distribution of β̂1:
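The slide's equations were lost in transcription. A reconstruction of the large-sample result from SW Section 4.5 (valid under the three least squares assumptions): β̂1 is unbiased and consistent for β1, and in large samples

```latex
\hat\beta_1 \;\stackrel{a}{\sim}\; N\!\left(\beta_1,\; \sigma^2_{\hat\beta_1}\right),
\qquad
\sigma^2_{\hat\beta_1} \;=\; \frac{1}{n}\,
\frac{\operatorname{var}\!\big[(X_i-\mu_X)\,u_i\big]}{\big[\operatorname{var}(X_i)\big]^2}.
```

The variance formula shows the two facts on the preceding slides: σ²_{β̂1} shrinks at rate 1/n, and a larger var(Xi) in the denominator makes β̂1 more precise.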
