
Part 4 Chapter 15 General Least Squares and Non-Linear Regression



  1. Part 4 Chapter 15: General Least Squares and Non-Linear Regression

  2. Part Organization • Chapter 14 • Brief Review of Statistics • Linear Regression (How to determine the best fit) • Linearization of Nonlinear Equations • Chapter 15 • Polynomial Regression • Multiple Linear Regression • Chapter 16 – Skip • Chapter 17 – Interpolation • Chapter 18 – Spline Interpolation

  3. Chapter 15 – General Least Squares • Some engineering data is poorly represented by a straight line. • For these cases a curve is better suited to fit the data. • In Chapter 14 we looked at techniques to linearize other models. • That approach allowed us to use linear regression. • An alternate approach is to use polynomial regression.

  4. Taylor’s Theorem • Recall that any smooth function can be approximated by a polynomial • Polynomial regression fits a polynomial to a set of data points

  5. General Linear Least Squares • The sum of the squared residuals, Sr, is minimized by taking its partial derivative w.r.t. each of the coefficients and setting the resulting equation equal to zero.
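
The slide's equation is an image that does not survive in the transcript; the standard general linear least-squares formulation it refers to (a model that is a linear combination of m + 1 basis functions z0, …, zm) is:

$$y = a_0 z_0 + a_1 z_1 + \cdots + a_m z_m + e$$

$$S_r = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j z_{ji} \right)^2$$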

  6. For Simple Linear Regression
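
For reference (the slide equation is an image), the simple linear regression model and its sum of squared residuals are:

$$y = a_0 + a_1 x + e, \qquad S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2$$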

  7. Least-Squares Fit of a Straight Line To minimize Sr, we need to find the partial derivatives with respect to a0 and a1 and set them equal to 0
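
The two partial derivatives referred to on the slide are the standard ones:

$$\frac{\partial S_r}{\partial a_0} = -2 \sum \left( y_i - a_0 - a_1 x_i \right) = 0$$

$$\frac{\partial S_r}{\partial a_1} = -2 \sum \left[ \left( y_i - a_0 - a_1 x_i \right) x_i \right] = 0$$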

  8. Coefficients for a Simple Linear Regression
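
Solving those two equations simultaneously gives the standard results (shown on the slide as an image):

$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}$$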

  9. For 2nd Order Polynomial Regression
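
The quadratic model and its sum of squared residuals (standard formulation; the slide image is not in the transcript):

$$y = a_0 + a_1 x + a_2 x^2 + e, \qquad S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2$$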

  10. Example 15.1 • Fit this data to a second-order polynomial: x = 0, 1, 2, 3, 4, 5; y = 2.1, 7.7, 13.6, 27.2, 40.9, 61.1

  11. As before, we need to minimize the sum of the squares of the residuals. Take the derivatives with respect to the coefficients (a) and set them equal to 0.

  12. 3 equations and 3 unknowns
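
In matrix form, these are the standard normal equations for a quadratic fit:

$$\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}$$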

  13. This is starting to get cumbersome • We could solve for the coefficients using matrix algebra

  14. This is starting to get cumbersome • We could solve for the coefficients using matrix algebra…. • But… we could also use the built-in MATLAB function polyfit

  15. polyfit has the advantage that you can use it for higher-order polynomials
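
A minimal MATLAB sketch using the Example 15.1 data (the variable names are mine, not the slide's):

    % Example 15.1 data
    x = [0 1 2 3 4 5]';
    y = [2.1 7.7 13.6 27.2 40.9 61.1]';

    % Fit a 2nd-order polynomial; polyfit returns the coefficients
    % in descending powers of x, i.e. p = [a2 a1 a0]
    p = polyfit(x, y, 2)

Changing the last argument fits a higher-order polynomial, which is the advantage the slide mentions.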

  16. Or use left division

  17. General Linear Least Squares • As before, the sum of the squared residuals is minimized by taking its partial derivative w.r.t. each of the coefficients and setting the resulting equation equal to zero.

  18. For 2nd Order Polynomial Regression

  19. Writing the quadratic model at each of the six data points gives a linear system in a0, a1, a2:

      a0·1 + a1·0 + a2·0  =  2.1
      a0·1 + a1·1 + a2·1  =  7.7
      a0·1 + a1·2 + a2·4  = 13.6
      a0·1 + a1·3 + a2·9  = 27.2
      a0·1 + a1·4 + a2·16 = 40.9
      a0·1 + a1·5 + a2·25 = 61.1

  20. The same system, with a note on MATLAB's \ operator: if you have an equal number of equations and unknowns, the \ operator uses a modified Gaussian elimination strategy.

  21. Here, though, there are more equations (6) than unknowns (3): the system is over-specified, and the \ operator uses QR factorization to find the best fit.
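
A sketch of the corresponding left-division solution, continuing from the polyfit sketch above (variable names assumed):

    % Build the matrix Z, one column per basis function (1, x, x^2)
    Z = [ones(size(x)) x x.^2];

    % 6 equations, 3 unknowns: \ solves the overdetermined system in
    % the least-squares sense (QR factorization under the hood)
    a = Z \ y        % a = [a0; a1; a2], ascending powers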

  22. The general linear least-squares technique can be applied to more than just polynomials: it works for any model that is a linear combination of basis functions. As before, Sr is minimized by taking its partial derivative w.r.t. each of the coefficients and setting the resulting equation equal to zero.
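
As a hypothetical illustration (the basis functions here are my choice, not the slide's), the same left-division pattern fits any model that is linear in its coefficients:

    % e.g. fit y = a0 + a1*cos(x) + a2*sin(x) with the same pattern
    Z = [ones(size(x)) cos(x) sin(x)];
    a = Z \ y;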

  23. What if you want to do a multiple linear regression? Use an approach similar to that outlined in Section 15.2.

  24. Written out explicitly, the normal equations are pretty complicated, and they're only good for 2 variables plus a constant!
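
The normal equations the slide refers to, for the model y = a0 + a1·x1 + a2·x2 (standard result; the slide image is not in the transcript):

$$\begin{bmatrix} n & \sum x_{1i} & \sum x_{2i} \\ \sum x_{1i} & \sum x_{1i}^2 & \sum x_{1i} x_{2i} \\ \sum x_{2i} & \sum x_{1i} x_{2i} & \sum x_{2i}^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_{1i} y_i \\ \sum x_{2i} y_i \end{bmatrix}$$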

  25. The matrix form of the normal equations, [Z^T Z]{a} = {Z^T y}, does the same thing, generalized for m dimensions.

  26. Or… use left division

  27. Or… use left division. Writing the model at each data point:

      a0·1 + a1·0   + a2·0 =  5
      a0·1 + a1·2   + a2·1 = 10
      a0·1 + a1·2.5 + a2·2 =  9
      a0·1 + a1·1   + a2·3 =  0
      a0·1 + a1·4   + a2·6 =  3
      a0·1 + a1·7   + a2·2 = 27

      With more equations than unknowns, the \ operator solves this by QR factorization.
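
A MATLAB sketch of that system (variable names assumed):

    % Multiple linear regression data from the slide
    x1 = [0 2 2.5 1 4 7]';
    x2 = [0 1 2 3 6 2]';
    y  = [5 10 9 0 3 27]';

    % Design matrix: constant, x1, x2
    Z = [ones(size(x1)) x1 x2];
    a = Z \ y        % least-squares solution via QR factorization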

  28. Using the Interactive Curve Fitting Tools • MATLAB 7 includes interactive plotting tools. • They include • basic curve fitting • more complicated curve fitting • statistical tools

  29. Use the curve fitting tools… • Create a graph • Making sure that the figure window is the active window, select Tools -> Basic Fitting • The Basic Fitting window will open on top of the plot

  30. Plot generated using the Basic Fitting Window

  31. Residuals are the difference between the actual and calculated data points
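
For example, the residuals can be computed directly (p, x, y as in the polyfit sketch above):

    % Residuals: actual minus calculated values
    r = y - polyval(p, x);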

  32. Basic Fitting Window

  33. Data Statistics Window • You can also access the data statistics window from the figure menu bar: select Tools -> Data Statistics from the figure window. This window allows you to calculate statistical functions interactively, such as mean and standard deviation, based on the data in the figure, and allows you to save the results to the workspace.

  34. Save to your Current Directory

  35. Reload a figure by double-clicking its name in the Current Directory

  36. Or reopen it from the command window
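
For instance (the filename here is hypothetical):

    % Reopen a saved figure from the command window
    openfig('myplot.fig')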
