
Lecture 4 Regression (Continued): Analysis of Variance, Standard Errors & Confidence Intervals, Prediction Intervals, Examination of Residuals

This lecture continues the discussion of regression analysis, covering analysis of variance, standard errors and confidence intervals, prediction intervals, and the examination of residuals. Supplementary readings are recommended.


Presentation Transcript


  1. LECTURE 4 REGRESSION (CONTINUED) Analysis of Variance; Standard Errors & Confidence Intervals; Prediction Intervals; Examination of Residuals. Supplementary Readings: Wilks, chapters 6 and 9; Bevington, P.R., and Robinson, D.K., Data Reduction and Error Analysis for the Physical Sciences, McGraw-Hill, 1992.

  2. Recall from last time… Define the differences between each observed value and the corresponding fitted value; we call these residuals. What should we require of them?

  3. Recall from last time… What should we require of them? That they be GAUSSIAN.
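The equations on these two slides did not survive extraction; under the standard simple-linear-regression setup they presumably read as follows (a reconstruction, not the slides' original notation):

```latex
% Residuals of the fitted line \hat{y}_i = a + b x_i
e_i = y_i - \hat{y}_i = y_i - (a + b x_i)
% Standard requirement: independent, zero-mean Gaussian errors
% with constant variance
e_i \sim N(0, \sigma_e^2), \qquad i = 1, \dots, n
```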

  4. Recall from last time… Analysis of Variance (“ANOVA”)? χ²(n=5) Gaussian data

  5. Analysis of Variance (“ANOVA”). A zero sum of residuals is guaranteed by the linear regression procedure. Why “n-2”?

  6. Analysis of Variance (“ANOVA”) Define:

  7. Analysis of Variance (“ANOVA”) Define: 1 and n-2 degrees of freedom
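The definitions these slides refer to were lost in extraction; the standard sums of squares and F statistic for simple linear regression, consistent with the table on the next slide, are presumably:

```latex
\mathrm{SST} = \sum_{i=1}^{n} (y_i - \bar{y})^2, \quad
\mathrm{SSR} = \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2, \quad
\mathrm{SSE} = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2
% Partition of variance: SST = SSR + SSE
F = \frac{\mathrm{MSR}}{\mathrm{MSE}}
  = \frac{\mathrm{SSR}/1}{\mathrm{SSE}/(n-2)}
\quad \text{with 1 and } n-2 \text{ degrees of freedom}
```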

  8. Analysis of Variance (“ANOVA”)

     Source       df     SS     MS          F-test
     Total        n-1    SST
     Regression   1      SSR    MSR = SSR   MSR/MSE
     Residual     n-2    SSE    MSE = se²

     F has 1 and n-2 degrees of freedom

  9. Analysis of Variance (“ANOVA”) for Simple Linear Regression

     Source       df     SS     MS          F-test
     Total        n-1    SST
     Regression   1      SSR    MSR = SSR   MSR/MSE
     Residual     n-2    SSE    MSE = se²

     We’ll discuss ANOVA further in the next lecture (“multivariate regression”)
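The quantities in the ANOVA table above can be sketched in plain Python (a minimal illustration assuming ordinary least squares; the function and variable names are my own, not from the slides):

```python
def anova_simple_regression(x, y):
    """ANOVA decomposition for a simple linear regression of y on x."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # Ordinary-least-squares slope and intercept
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = ybar - b * xbar
    yhat = [a + b * xi for xi in x]
    # Sums of squares: SST (n-1 df) = SSR (1 df) + SSE (n-2 df)
    sst = sum((yi - ybar) ** 2 for yi in y)
    ssr = sum((yh - ybar) ** 2 for yh in yhat)
    sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    mse = sse / (n - 2)          # se^2 in the table
    f = (ssr / 1) / mse          # F with 1 and n-2 degrees of freedom
    return {"SST": sst, "SSR": ssr, "SSE": sse, "MSE": mse, "F": f}
```

Note that SSR + SSE reproduces SST exactly, which is the partition of variance the table expresses.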

  10. ‘Goodness of Fit’

  11. ‘Goodness of Fit’ For simple linear regression
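The goodness-of-fit formula this slide presented was lost in extraction; for simple linear regression it is presumably the coefficient of determination:

```latex
R^2 = \frac{\mathrm{SSR}}{\mathrm{SST}} = 1 - \frac{\mathrm{SSE}}{\mathrm{SST}}
% For simple linear regression, R^2 = r^2,
% the square of the linear correlation coefficient
```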

  12. ‘Goodness of Fit’ Outside the “support” of the regression, in general,

  13. ‘Goodness of Fit’ Outside the “support” of the regression, in general,

  14. ‘Goodness of Fit’ Reliability Bias

  15. Analysis of Variance (“ANOVA”). Under Gaussian assumptions, the estimates of the parameters a and b from linear regression represent unbiased estimates of the means of a Gaussian distribution, where the standard errors in the regression parameters are:
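The standard-error formulas the slide points to did not survive extraction; the usual expressions for the slope and intercept of a simple linear regression are presumably:

```latex
s_b = \frac{s_e}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2}}, \qquad
s_a = s_e \sqrt{\frac{1}{n} + \frac{\bar{x}^2}{\sum_{i=1}^{n} (x_i - \bar{x})^2}}
% with s_e^2 = MSE = SSE/(n-2) from the ANOVA table
```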

  16. Confidence Intervals The estimated regression slope ‘b’ is likely to be within some range of the true ‘b’

  17. Confidence Intervals This naturally defines a t test for the presence of a trend:
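The confidence interval and trend test sketched on these two slides, with the missing formulas restored under the standard setup, are presumably:

```latex
% Confidence interval for the true slope
b \pm t_{\alpha/2,\,n-2}\; s_b
% t test for a trend: under H_0 : \text{slope} = 0,
t = \frac{b}{s_b} \sim t_{n-2}
```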

  18. Prediction Intervals. The MSE in a predicted value (the ‘prediction error’) is larger than the nominal MSE, increasing as the predictand value departs from the mean. Note that sy approaches se as the ‘training’ sample becomes large.
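The prediction-error formula this slide refers to was lost in extraction; the standard expression, which shows both behaviors described in the text (growth away from the mean, and convergence to se for large n), is presumably:

```latex
s_y^2 = s_e^2 \left[ 1 + \frac{1}{n}
      + \frac{(x_0 - \bar{x})^2}{\sum_{i=1}^{n} (x_i - \bar{x})^2} \right]
% As n \to \infty the bracketed term \to 1, so s_y \to s_e
```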

  19. Linear Correlation ‘r’ suffers from sampling error both in the regression slope and the estimates of variance…

  20. Linear Correlation ‘r’ suffers from sampling error both in the regression slope and the estimates of variance…

  21. Linear Correlation Coefficient
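The definition this slide presented did not survive extraction; the linear (Pearson) correlation coefficient is presumably:

```latex
r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}
         {\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2}\,
          \sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^2}}
```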

  22. Examining Residuals: Heteroscedasticity. A trend in residual variance violates the assumption of Gaussian residuals…

  23. Examining Residuals: Heteroscedasticity. Often a simple transformation of the original data will yield more closely Gaussian residuals…

  24. Examining Residuals. Leverage points can still be a problem!

  25. Examining Residuals: Autocorrelation. The Durbin-Watson Statistic.
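The Durbin-Watson statistic named on this slide can be computed from a residual series as follows (a minimal sketch; the function name is my own, not from the slides):

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic for serial correlation in residuals.

    Values near 2 indicate no serial correlation; values well below 2
    suggest positive autocorrelation, well above 2 negative.
    """
    # Sum of squared successive differences over sum of squared residuals
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den
```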

  26. Examining Residuals: Autocorrelation. Suppose we have the simple (‘first order autoregressive’) model, for example. Then we can still use all of the results based on Gaussian statistics, but with the modified sample size:
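The model and modified sample size this slide refers to were lost in extraction; the usual AR(1) model and the associated effective sample size for tests of the mean are presumably:

```latex
% First-order autoregressive model with lag-one autocorrelation \rho
y_t = \rho\, y_{t-1} + \varepsilon_t
% Effective (reduced) sample size
n' = n\,\frac{1 - \rho}{1 + \rho}
```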

  27. Examining Residuals: Autocorrelation. Suppose we have the simple (‘first order autoregressive’) model. Then we can still use all of the results based on Gaussian statistics, but with the modified sample size: different for tests of variance.

  28. Examining Residuals: Autocorrelation. Suppose we have the simple (‘first order autoregressive’) model. Then we can still use all of the results based on Gaussian statistics, but with the modified sample size: different again for correlations.

  29. Examining Residuals. Suppose we have the simple (‘first order autoregressive’) model. We can remove the serial correlation through:
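The transformation this slide names did not survive extraction; for an AR(1) model the usual way to remove serial correlation is the prewhitened (quasi-differenced) series, presumably:

```latex
y'_t = y_t - \rho\, y_{t-1}, \qquad t = 2, \dots, n
% The y'_t are then serially uncorrelated under the AR(1) model
```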
