Curve Fitting and Regression EEE 244
Descriptive Statistics in MATLAB • MATLAB has several built-in commands to compute and display descriptive statistics. Assuming some column vector s: • mean(s), median(s), mode(s) • Calculate the mean, median, and mode of s. In older MATLAB releases, mode required the Statistics Toolbox. • min(s), max(s) • Calculate the minimum and maximum values in s. • var(s), std(s) • Calculate the variance (the square of the standard deviation) and the standard deviation of s. • Note - if a matrix is given, the statistics will be returned for each column.
Measurements of a Voltage • Find the mean, median, mode, min, max, variance, and standard deviation using the appropriate MATLAB functions, as sketched below.
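A minimal sketch of this exercise, assuming a hypothetical column vector of voltage measurements (the values are invented for illustration):

    s = [10.1; 10.3; 9.8; 10.3; 10.0; 9.9; 10.2];   % hypothetical voltage readings
    mean(s)     % arithmetic mean
    median(s)   % middle value of the sorted data
    mode(s)     % most frequently occurring value
    min(s)      % smallest reading
    max(s)      % largest reading
    var(s)      % variance (square of the standard deviation)
    std(s)      % standard deviation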
Histograms in MATLAB • [n, x] = hist(s, x) • Determine the number of elements in each bin of data in s. x is a vector containing the center values of the bins. • Open the MATLAB help and run the example with 1000 randomly generated values for s. • [n, x] = hist(s, m) • Determine the number of elements in each bin of data in s using m bins; x will contain the centers of the bins. The default case is m = 10. • Repeat the previous example with m = 5. • hist(s) produces the histogram plot directly, as sketched below.
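A minimal sketch of the histogram example, assuming normally distributed random data as in the MATLAB help example:

    s = randn(1000, 1);       % 1000 normally distributed random values
    [n, x] = hist(s);         % default m = 10 bins; n = counts, x = bin centers
    [n5, x5] = hist(s, 5);    % repeat with m = 5 bins
    hist(s)                   % draw the histogram plot directly
    % Note: recent MATLAB releases recommend histogram(s) over hist(s).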
EEE 244 Regression
Linear Regression • Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), …, (xn, yn): y = a0 + a1x + e • a1 - slope • a0 - intercept • e - error, or residual, between the model and the observations
Linear Least-Squares Regression • Linear least-squares regression is a method to determine the "best" coefficients in a linear model for a given data set. • "Best" for least-squares regression means minimizing the sum of the squares of the estimate residuals. For a straight line model, this gives: Sr = Σ (yi − a0 − a1xi)², summed over the n data points.
Least-Squares Fit of a Straight Line • Using the model y = a0 + a1x, the slope and intercept producing the best fit can be found using: a1 = (n Σxiyi − Σxi Σyi) / (n Σxi² − (Σxi)²) and a0 = ȳ − a1x̄, where x̄ and ȳ are the means of the x and y data. A direct MATLAB sketch follows.
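A minimal sketch applying these formulas directly, assuming column vectors x and y of equal length (the data are invented for illustration):

    x = [1; 2; 3; 4; 5; 6; 7];                  % hypothetical independent data
    y = [0.5; 2.5; 2.0; 4.0; 3.5; 6.0; 5.5];    % hypothetical dependent data
    n = length(x);
    a1 = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);   % slope
    a0 = mean(y) - a1*mean(x);                                       % intercept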
Standard Error of the Estimate • Regression data showing (a) the spread of the data around the mean of the dependent data and (b) the spread of the data around the best-fit line. • The reduction in spread represents the improvement due to linear regression. • The standard error of the estimate quantifies the spread of the data around the regression line: sy/x = sqrt( Sr / (n − 2) )
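Continuing the least-squares sketch above (a0, a1, x, y, and n as computed there), the spread measures can be evaluated as follows; the coefficient of determination r² is one common way to quantify the improvement:

    Sr = sum((y - a0 - a1*x).^2);    % sum of squares of the residuals
    St = sum((y - mean(y)).^2);      % spread around the mean of y
    syx = sqrt(Sr / (n - 2));        % standard error of the estimate
    r2 = (St - Sr) / St;             % fraction of the spread explained by the fit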
MATLAB Functions • MATLAB has a built-in function polyfit that fits a least-squares nth order polynomial to data: • p = polyfit(x, y, n) • x: independent data • y: dependent data • n: order of polynomial to fit • p: coefficients of the polynomial f(x) = p1x^n + p2x^(n−1) + … + pnx + pn+1 • MATLAB's polyval command can be used to compute a value using the coefficients: • y = polyval(p, x) • A short sketch of both follows.
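A minimal sketch of polyfit and polyval together; the data and the choice of a cubic fit are illustrative assumptions:

    x = 0:0.5:4;                                     % hypothetical independent data
    y = [1.0 1.4 2.1 3.2 4.9 7.1 9.8 13.2 17.1];     % hypothetical dependent data
    p = polyfit(x, y, 3);         % least-squares cubic: p(1)x^3 + ... + p(4)
    yhat = polyval(p, 2.25);      % evaluate the fitted polynomial at x = 2.25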
Polyfit Function • Can be used to perform regression if the number of data points is much larger than the number of coefficients. • p = polyfit(x, y, n) • x: independent data (Vce, 10 data points) • y: dependent data (Ic) • n: order of polynomial to fit; n = 1 gives a linear fit • p: coefficients of the polynomial (two coefficients) f(x) = p1x^n + p2x^(n−1) + … + pnx + pn+1 • A sketch with hypothetical transistor data follows.
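A minimal sketch of the linear fit described above, assuming hypothetical collector-characteristic data (Vce in volts, Ic in mA; the values are invented):

    Vce = [0.5 1.0 1.5 2.0 2.5 3.0 3.5 4.0 4.5 5.0];   % 10 data points
    Ic  = [2.1 2.2 2.2 2.3 2.4 2.4 2.5 2.6 2.6 2.7];   % hypothetical Ic values
    p = polyfit(Vce, Ic, 1);     % n = 1: linear fit, two coefficients
    % p(1) is the slope and p(2) the intercept of Ic ≈ p(1)*Vce + p(2)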
Polynomial Regression • The least-squares procedure can be readily extended to fit data to a higher-order polynomial. Again, the idea is to minimize the sum of the squares of the estimate residuals. • The figure shows the same data fit with: • A first order polynomial • A second order polynomial
Process and Measures of Fit • For a second order polynomial, the best fit would mean minimizing: Sr = Σ (yi − a0 − a1xi − a2xi²)² • In general, for an mth-order polynomial, this would mean minimizing: Sr = Σ (yi − a0 − a1xi − a2xi² − … − amxi^m)²
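A minimal sketch comparing first- and second-order fits by their residual sums of squares, reusing the x and y from the polyfit sketch above (purely illustrative):

    p1 = polyfit(x, y, 1);                 % first-order fit
    p2 = polyfit(x, y, 2);                 % second-order fit
    Sr1 = sum((y - polyval(p1, x)).^2);    % residual sum of squares, order 1
    Sr2 = sum((y - polyval(p2, x)).^2);    % order 2; never larger than Sr1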
EEE 244 Interpolation
Polynomial Interpolation • You will frequently have occasions to estimate intermediate values between precise data points. • The function you use to interpolate must pass through the actual data points - this makes interpolation more restrictive than fitting. • The most common method for this purpose is polynomial interpolation, where an (n−1)th order polynomial is solved that passes through n data points: f(x) = a1 + a2x + a3x² + … + anx^(n−1)
Determining Coefficients using Polyfit • MATLAB's built-in polyfit and polyval commands can also be used - all that is required is making sure the order of the fit for n data points is n − 1, as sketched below.
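A minimal sketch of polynomial interpolation via polyfit, assuming four hypothetical data points (so the order is n − 1 = 3):

    x = [1 2 3 4];                       % n = 4 data points
    y = [3.0 5.5 4.2 6.1];               % hypothetical values
    p = polyfit(x, y, length(x) - 1);    % order n - 1 = 3: passes through all points
    yi = polyval(p, 2.5);                % interpolated estimate at x = 2.5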
Newton Interpolating Polynomials • Another way to express a polynomial interpolation is to use Newton's interpolating polynomial. • The differences between a simple polynomial and Newton's interpolating polynomial for first and second order interpolations are: • First order: f1(x) = b1 + b2(x − x1) • Second order: f2(x) = b1 + b2(x − x1) + b3(x − x1)(x − x2)
Newton Interpolating Polynomials (contd.) • The first-order Newton interpolating polynomial may be obtained from linear interpolation and similar triangles. • The resulting formula based on known points x1 and x2 and the values of the dependent function at those points is: f1(x) = f(x1) + [ (f(x2) − f(x1)) / (x2 − x1) ] (x − x1)
Newton Interpolating Polynomials (contd.) • The second-order Newton interpolating polynomial introduces some curvature to the line connecting the points, but still goes through the first two points. • The resulting formula based on known points x1, x2, and x3 and the values of the dependent function at those points is: f2(x) = b1 + b2(x − x1) + b3(x − x1)(x − x2), where b1 = f(x1), b2 = (f(x2) − f(x1)) / (x2 − x1), and b3 = [ (f(x3) − f(x2)) / (x3 − x2) − (f(x2) − f(x1)) / (x2 − x1) ] / (x3 − x1). A short MATLAB sketch follows.
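A minimal sketch of the second-order Newton formula, assuming three hypothetical data points:

    x = [1 2 4];   y = [3.0 5.5 4.2];           % hypothetical points
    b1 = y(1);                                  % b1 = f(x1)
    b2 = (y(2) - y(1)) / (x(2) - x(1));         % first finite divided difference
    b3 = ((y(3) - y(2)) / (x(3) - x(2)) - b2) / (x(3) - x(1));   % second difference
    xq = 3;                                     % point to interpolate at
    f2 = b1 + b2*(xq - x(1)) + b3*(xq - x(1))*(xq - x(2));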
Lagrange Interpolating Polynomials • Another method that uses shifted values to express an interpolating polynomial is the Lagrange interpolating polynomial. • The differences between a simple polynomial and the Lagrange interpolating polynomials for first and second order polynomials are: • First order: f1(x) = L1 f(x1) + L2 f(x2) • Second order: f2(x) = L1 f(x1) + L2 f(x2) + L3 f(x3) where the Li are weighting coefficients that are functions of x.
Lagrange Interpolating Polynomials (contd.) • The first-order Lagrange interpolating polynomial may be obtained from a weighted combination of two linear interpolations. • The resulting formula based on known points x1 and x2 and the values of the dependent function at those points is: f1(x) = [ (x − x2) / (x1 − x2) ] f(x1) + [ (x − x1) / (x2 − x1) ] f(x2)
As with Newton's method, the Lagrange version has an estimated error of: Rn ≈ f[x, xn, xn−1, …, x1] · (x − x1)(x − x2)…(x − xn), where the bracketed term is the finite divided difference that includes the unknown point x.
Lagrange Interpolating Polynomials (contd.) • In general, the Lagrange polynomial interpolation for n points is: fn−1(x) = Σ (i = 1 to n) Li(x) f(xi) where Li is given by: Li(x) = Π (j = 1 to n, j ≠ i) (x − xj) / (xi − xj). A direct implementation is sketched below.
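A minimal sketch implementing the general Lagrange formula directly, assuming vectors x and y of n known points (illustrative values):

    x = [1 2 4 5];   y = [3.0 5.5 4.2 6.1];    % hypothetical data points
    xq = 3;                                    % point to interpolate at
    n = length(x);
    fq = 0;
    for i = 1:n
        L = 1;
        for j = [1:i-1, i+1:n]
            L = L * (xq - x(j)) / (x(i) - x(j));   % build Li(xq)
        end
        fq = fq + L * y(i);                        % accumulate Li(xq)*f(xi)
    end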