Part 4 Chapter 15 General Least Squares and Non-Linear Regression
Part Organization • Chapter 14 • Brief Review of Statistics • Linear Regression (How to determine the best fit) • Linearization of Nonlinear Equations • Chapter 15 • Polynomial Regression • Multiple Linear Regression • Chapter 16 – Skip • Chapter 17 – Interpolation • Chapter 18 – Spline Interpolation
Chapter 15 – General Least Squares • Some engineering data is poorly represented by a straight line. • For these cases a curve is better suited to fit the data. • In Chapter 14 we looked at techniques to linearize other models. • This approach allowed us to use linear regression. • An alternative approach is to use polynomial regression.
Taylor’s Theorem • Recall that Taylor’s theorem tells us any smooth function can be approximated locally by a polynomial. • Polynomial regression fits a polynomial to a set of data points.
General Linear Least Squares The sum of the squares of the residuals, Sr = Σ (yi − Σj aj·zji)², is minimized by taking its partial derivative w.r.t. each of the coefficients and setting the resulting equation equal to zero
Least-Squares Fit of a Straight Line To minimize Sr = Σ (yi − a0 − a1·xi)², we need to find the partial derivatives with respect to a0 and a1 and set them equal to 0
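Setting those two partial derivatives to zero yields the familiar normal equations for the slope and intercept. A minimal NumPy sketch of that calculation, using made-up sample data (not from the chapter), might look like this:

```python
import numpy as np

# Hypothetical sample data chosen only to illustrate the formulas.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])

n = len(x)
# Normal equations from dSr/da1 = 0 and dSr/da0 = 0:
a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
a0 = np.mean(y) - a1 * np.mean(x)
print(a0, a1)
```

The same coefficients come back from any least-squares routine (e.g. a first-order `polyfit`), since both minimize the same Sr.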
Example 15.1 Fit this data to a second-order polynomial
As before, we need to minimize the sum of the squares of the residuals. Take the derivatives with respect to each of the coefficients (a0, a1, a2) and set them equal to 0
This is starting to get cumbersome • We could solve for the coefficients using matrix algebra… • But we could also use the built-in MATLAB function polyfit
polyfit has the advantage that you can use it for higher order polynomials
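NumPy’s `np.polyfit` is the direct analog of MATLAB’s `polyfit`; both return the coefficients from highest power to lowest. A sketch using the example data from the slides that follow (y measured at x = 0 through 5):

```python
import numpy as np

x = np.arange(6)                       # 0, 1, 2, 3, 4, 5
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

# Second-order (quadratic) fit; coefficients come back highest power first.
p = np.polyfit(x, y, 2)
print(p)
```

Changing the last argument (the polynomial order) is all it takes to fit a higher-order polynomial to the same data.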
Written out, the quadratic model evaluated at each data point gives the system Z·a = y:

a0·1 + a1·0 + a2·0  =  2.1
a0·1 + a1·1 + a2·1  =  7.7
a0·1 + a1·2 + a2·4  = 13.6
a0·1 + a1·3 + a2·9  = 27.2
a0·1 + a1·4 + a2·16 = 40.9
a0·1 + a1·5 + a2·25 = 61.1

If you have an equal number of equations and unknowns, the \ operator uses a modified Gaussian elimination strategy. If you have more equations than unknowns, as here, the system is over-determined, and the \ operator uses QR factorization to find the least-squares best fit.
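The NumPy counterpart of MATLAB’s left division for an over-determined system is `np.linalg.lstsq`, which likewise returns the least-squares solution. A sketch building the same Z matrix with columns [1, x, x²]:

```python
import numpy as np

x = np.arange(6)
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

# Build the over-determined system Z*a = y, one column per basis function.
Z = np.column_stack([np.ones(len(x)), x, x**2])

# MATLAB: a = Z\y.  NumPy's analog returns the least-squares solution.
a, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(a)   # a0, a1, a2
```

The coefficients agree with a second-order `polyfit` on the same data (allowing for the reversed coefficient order).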
The general linear least-squares technique can be applied to more than just polynomials: it works for any model that is linear in its coefficients. As before, Sr is minimized by taking its partial derivative w.r.t. each of the coefficients and setting the resulting equation equal to zero
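To illustrate the point with a non-polynomial model, here is a sketch with made-up data where the basis functions are 1, sin(x), and e^(−x). The model is nonlinear in x but linear in the coefficients, so the same Z\y machinery applies (shown via NumPy’s `lstsq`):

```python
import numpy as np

# Illustrative noise-free data generated from y = 1 + 2*sin(x) + 3*exp(-x).
x = np.linspace(0.0, 5.0, 20)
y = 1.0 + 2.0 * np.sin(x) + 3.0 * np.exp(-x)

# One column of Z per basis function: constant, sin(x), exp(-x).
Z = np.column_stack([np.ones_like(x), np.sin(x), np.exp(-x)])
a, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(a)
```

Because the data is noise-free, the fit recovers the generating coefficients (1, 2, 3) essentially exactly.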
What if you want to do a multiple linear regression? Use an approach similar to that outlined in Section 15.2
This is pretty complicated, and it’s only good for 2 variables plus a constant!
Or… use left division. The model y = a0 + a1·x1 + a2·x2 evaluated at each data point gives the over-determined system, solved by QR factorization:

a0·1 + a1·0   + a2·0 =  5
a0·1 + a1·2   + a2·1 = 10
a0·1 + a1·2.5 + a2·2 =  9
a0·1 + a1·1   + a2·3 =  0
a0·1 + a1·4   + a2·6 =  3
a0·1 + a1·7   + a2·2 = 27
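The same left-division pattern handles multiple regression: stack one column per predictor plus a column of ones. A NumPy sketch using the slide’s data (x1 and x2 read from the coefficient rows above):

```python
import numpy as np

# Data from the slide: y depends on two variables, x1 and x2.
x1 = np.array([0.0, 2.0, 2.5, 1.0, 4.0, 7.0])
x2 = np.array([0.0, 1.0, 2.0, 3.0, 6.0, 2.0])
y  = np.array([5.0, 10.0, 9.0, 0.0, 3.0, 27.0])

# Columns of Z: constant, x1, x2  (the MATLAB version would be a = Z\y).
Z = np.column_stack([np.ones_like(x1), x1, x2])
a, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(a)   # this particular data is fit exactly by y = 5 + 4*x1 - 3*x2
```

Adding a third predictor is just one more column in Z, which is exactly why the matrix form scales where the hand-derived normal equations become unwieldy.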
Using the Interactive Curve Fitting Tools • MATLAB 7 includes interactive plotting tools. • They include • basic curve fitting • more complicated curve fitting • statistical tools
Use the curve fitting tools… • Create a graph. • Making sure that the figure window is the active window, select Tools-> Basic Fitting. • The Basic Fitting window will open on top of the plot.
Residuals are the difference between the actual and calculated data points
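The same residuals the interactive tool plots can be computed directly: evaluate the fitted polynomial at each x and subtract from the measured y. A sketch using the chapter’s quadratic example:

```python
import numpy as np

x = np.arange(6)
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

p = np.polyfit(x, y, 2)
y_fit = np.polyval(p, x)
residuals = y - y_fit     # actual minus calculated, one per data point
print(residuals)
```

A useful sanity check: for any least-squares fit that includes a constant term, the residuals sum to zero (up to round-off).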
Data Statistics Window • You can also access the Data Statistics window from the figure menu bar: select Tools-> Data Statistics from the figure window. • This window allows you to calculate statistical functions, such as the mean and standard deviation, interactively based on the data in the figure, and allows you to save the results to the workspace.
Reload a figure by double-clicking its name in the Current Directory window