
Analyzing Systematic Relationships in Nonlinear Regression

Explore systematic relationships between variables using graphical and quantitative methods. Learn to assess, diagnose, and test the accuracy of parameter estimates. Visualize data trends through scatterplots and evaluate correlations for multiple regression analysis.


Presentation Transcript


  1. CHEE824 Nonlinear Regression Analysis, J. McLellan, Winter 2004

  2. Module 1: Linear Regression

  3. Outline - • assessing systematic relationships • matrix representation for multiple regression • least squares parameter estimates • diagnostics • graphical • quantitative • further diagnostics • testing the need for terms • lack of fit test • precision of parameter estimates, predicted responses • correlation between parameter estimates

  4. The Scenario We want to describe the systematic relationship between a response variable and a number of explanatory variables (multiple regression). We will consider the case which is linear in the parameters.

  5. Assessing Systematic Relationships Is there a systematic relationship? Two approaches: • graphical • scatterplots, casement plots • quantitative • form correlations between response and explanatory variables • consider forming the correlation matrix - a table of pairwise correlations between the response and each explanatory variable, and between pairs of explanatory variables • correlation between explanatory variables leads to correlated parameter estimates
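
As an aside, forming the correlation matrix is one line with pandas; a minimal sketch, assuming the data sit in a DataFrame (the values and column names here are illustrative, not from the course):

    import pandas as pd

    # hypothetical tooth-discoloration data; values and names are illustrative
    df = pd.DataFrame({
        "discoloration": [3.2, 4.1, 5.0, 6.3, 7.1],
        "fluoride":      [0.5, 1.0, 1.5, 2.0, 2.5],
        "brushings":     [3, 2, 4, 1, 2],
    })

    # table of pairwise correlations between all pairs of variables
    print(df.corr())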

  6. Graphical Methods for Analyzing Data Visualizing relationships between variables Techniques • scatterplots • scatterplot matrices • also referred to as “casement plots” • time sequence plots

  7. Scatterplots … are also referred to as “x-y diagrams” • plot values of one variable against another • look for systematic trend in data • nature of trend • linear? • exponential? • quadratic? • degree of scatter - does spread increase/decrease over range? • indication that variance isn’t constant over range of data

  8. Scatterplots - Example • tooth discoloration data - discoloration vs. fluoride • trend - possibly nonlinear?

  9. Scatterplot - Example • tooth discoloration data - discoloration vs. brushing • significant trend? - doesn’t appear to be present

  10. Scatterplot - Example • tooth discoloration data - discoloration vs. brushing • variance appears to decrease as # of brushings increases

  11. Scatterplot matrices … are a table of scatterplots for a set of variables Look for - • systematic trend between “independent” variable and dependent variables - to be described by estimated model • systematic trend between supposedly independent variables - indicates that these quantities are correlated • correlation can negatively influence model estimation results • not independent information • scatterplot matrices can be generated automatically with statistical software, or manually using Excel (see the sketch below)
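
A minimal sketch of the automatic route, using pandas (the data values and column names are illustrative):

    import pandas as pd
    import matplotlib.pyplot as plt
    from pandas.plotting import scatter_matrix

    # hypothetical tooth data; values and column names are illustrative
    df = pd.DataFrame({
        "discoloration": [3.2, 4.1, 5.0, 6.3, 7.1, 5.8],
        "fluoride":      [0.5, 1.0, 1.5, 2.0, 2.5, 1.8],
        "brushings":     [3, 2, 4, 1, 2, 3],
    })

    # one scatterplot per pair of variables; histograms on the diagonal
    scatter_matrix(df, diagonal="hist")
    plt.show()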

  12. Scatterplot Matrices - tooth data

  13. Time Sequence Plot - for naphtha 90% point, which indicates the amount of heavy hydrocarbons present in gasoline-range material • excursion - sudden shift in operation • meandering about average operating point - time correlation in data

  14. What do dynamic data look like?

  15. Assessing Systematic Relationships Quantitative Methods • correlation • formal def’n plus sample statistic (“Pearson’s r”) • covariance • formal def’n plus sample statistic These provide a quantitative measure of systematic LINEAR relationships.

  16. Covariance Formal Definition • given two random variables X and Y, the covariance is Cov(X,Y) = E{(X - E{X})(Y - E{Y})} • E{ } - expected value • sign of the covariance indicates the sign of the slope of the systematic linear relationship • positive value --> positive slope • negative value --> negative slope • issue - covariance is SCALE DEPENDENT

  17. Covariance • motivation for covariance as a measure of systematic linear relationship • look at pairs of departures about the mean of X, Y [two scatterplots illustrating departures about the means of X and Y]

  18. Correlation • is the “dimensionless” covariance • divide covariance by standard dev’ns of X, Y • formal definition: rho = Cov(X,Y) / (sigma_X sigma_Y) • properties • dimensionless • range: -1 <= rho <= +1 • rho near -1 --> strong linear relationship with negative slope • rho near +1 --> strong linear relationship with positive slope Note - the correlation gives NO information about the actual numerical value of the slope.

  19. Estimating Covariance, Correlation … from process data (with N pairs of observations) Sample Covariance: s_XY = (1/(N-1)) * sum_i (x_i - xbar)(y_i - ybar) Sample Correlation: r = s_XY / (s_X s_Y), where s_X and s_Y are the sample standard deviations
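
A minimal sketch of the two estimators in Python with numpy (the function name is mine); np.cov(x, y)[0, 1] and np.corrcoef(x, y)[0, 1] return the same values:

    import numpy as np

    def sample_cov_corr(x, y):
        """Sample covariance s_xy and sample (Pearson) correlation r."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        n = len(x)
        dx = x - x.mean()                              # departures about the mean
        dy = y - y.mean()
        s_xy = np.sum(dx * dy) / (n - 1)               # sample covariance
        r = s_xy / (x.std(ddof=1) * y.std(ddof=1))     # sample correlation
        return s_xy, r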

  20. Making Inferences The sample covariance and correlation are STATISTICS, and have their own probability distributions. Confidence interval for sample correlation - • the following is approximately distributed as the standard normal random variable: z = sqrt(N-3) * (tanh^-1(r) - tanh^-1(rho)) • derive confidence limits for tanh^-1(rho) and convert to confidence limits for the true correlation using tanh

  21. Confidence Interval for Correlation Procedure 1. find z_alpha/2 for the desired confidence level 2. the confidence interval for tanh^-1(rho) is tanh^-1(r) +/- z_alpha/2 / sqrt(N-3) 3. convert these limits to confidence limits for the correlation by taking tanh of the limits in step 2 A hypothesis test can also be performed using this function of the correlation and comparing to the standard normal distribution
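
A sketch of this three-step procedure as a Python function (the function name is mine; scipy is assumed only for the normal quantile):

    import numpy as np
    from scipy import stats

    def corr_confidence_interval(r, n, level=0.95):
        """Confidence interval for the true correlation, via tanh^-1(r)."""
        z = stats.norm.ppf(1.0 - (1.0 - level) / 2.0)  # step 1: z_alpha/2
        half_width = z / np.sqrt(n - 3)
        lo = np.arctanh(r) - half_width                # step 2: limits in tanh^-1(rho)
        hi = np.arctanh(r) + half_width
        return np.tanh(lo), np.tanh(hi)                # step 3: take tanh of the limits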

  22. Example - Solder Thickness Objective - study the effect of temperature on solder thickness Data - in pairs:

  Solder Temperature (C)   Solder Thickness (microns)
  245                      171.6
  215                      201.1
  218                      213.2
  265                      153.3
  251                      178.9
  213                      226.6
  234                      190.3
  257                      171.0
  244                      197.5
  225                      209.8

  23. Example - Solder Thickness [scatterplot of solder thickness vs. solder temperature]

  24. Example - Solder Thickness Confidence Interval • z_alpha/2 = 1.96 (95% confidence level) • limits in tanh^-1(rho): (-2.329837282, -0.848216548) • limits in rho: (-0.981238575, -0.690136605)
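
Continuing the sketch above with the solder data (for these 10 pairs the sample correlation works out to roughly r = -0.920), the function reproduces the slide's limits:

    import numpy as np

    temp  = np.array([245, 215, 218, 265, 251, 213, 234, 257, 244, 225], float)
    thick = np.array([171.6, 201.1, 213.2, 153.3, 178.9, 226.6,
                      190.3, 171.0, 197.5, 209.8])

    r = np.corrcoef(temp, thick)[0, 1]             # approx -0.920
    print(corr_confidence_interval(r, len(temp)))  # approx (-0.9812, -0.6901)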

  25. Empirical Modeling - Terminology • response • “dependent” variable - responds to changes in other variables • the response is the characteristic of interest which we are trying to predict • explanatory variable • “independent” variable, regressor variable, input, factor • these are the quantities that we believe have an influence on the response • parameter • coefficients in the model that describe how the regressors influence the response

  26. Models When we are estimating a model from data, we consider the following form: Y = f(x_1, …, x_p; beta_1, …, beta_p) + epsilon where Y is the response, the x’s are the explanatory variables, the beta’s are the parameters, and epsilon is the “random error”

  27. The Random Error Term • is included to reflect the fact that measured data contain variability • successive measurements under the same conditions (values of the explanatory variables) are likely to be slightly different • this is the stochastic component • the functional form describes the deterministic component • random error is not necessarily the result of mistakes in experimental procedures - it reflects inherent variability • “noise”

  28. Types of Models • linear/nonlinear in the parameters • linear/nonlinear in the explanatory variables • number of response variables • single response (standard regression) • multi-response (or “multivariate” models) From the perspective of statistical model-building, the key point is whether the model is linear or nonlinear in the PARAMETERS.

  29. Linear Regression Models • linear in the parameters • can be nonlinear in the regressors • e.g., Y = beta_0 + beta_1 x + beta_2 x^2 + epsilon is linear in the parameters even though it is quadratic in the regressor x

  30. Nonlinear Regression Models • nonlinear in the parameters • e.g., Arrhenius rate expression k = A exp(-E/(RT)) • nonlinear in the activation energy E • linear in the pre-exponential factor A (if E is fixed)

  31. Nonlinear Regression Models • sometimes transformably linear • start with k = A exp(-E/(RT)) and take ln of both sides to produce ln k = ln A - E/(RT), which is of the form y = beta_0 + beta_1 (1/T), linear in the parameters

  32. Transformations • note that linearizing the nonlinear equation by transformation can lead to misleading estimates if the proper estimation method is not used • transforming the data can alter the statistical distribution of the random error term
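
As an illustration of the transformation approach, a sketch fitting the linearized Arrhenius form by ordinary least squares (the rate data here are synthetic); per the caveat above, least squares on ln k is only appropriate if the error on the log scale is roughly constant-variance:

    import numpy as np

    R = 8.314                                       # gas constant, J/(mol K)
    T = np.array([300.0, 320.0, 340.0, 360.0, 380.0])       # temperatures, K
    k = np.array([2.1e-3, 6.0e-3, 1.5e-2, 3.4e-2, 7.1e-2])  # synthetic rates

    # linearized form: ln k = ln A - (E/R) * (1/T)
    slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
    E_hat = -slope * R                              # activation energy estimate
    A_hat = np.exp(intercept)                       # pre-exponential estimate
    print(E_hat, A_hat)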

  33. Ordinary LS vs. Multi-Response • single response (ordinary least squares) • multi-response (e.g., Partial Least Squares) • issue - joint behaviour of responses, noise We will be focussing on single response models.

  34. Linear Multiple Regression Model Equation: Y_i = beta_1 x_i1 + … + beta_p x_ip + epsilon_i where Y_i is the i-th observation of the response (i-th data point), x_i1 is the i-th value of explanatory variable X_1, x_ip is the i-th value of explanatory variable X_p, and epsilon_i is the random noise in the i-th observation of the response. The intercept can be considered as corresponding to an X which always has the value “1”
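
A short sketch of the intercept-as-a-constant-regressor idea in numpy (array names and values are mine):

    import numpy as np

    # illustrative values of two explanatory variables
    x1 = np.array([245.0, 215.0, 218.0, 265.0])
    x2 = np.array([3.0, 2.0, 4.0, 1.0])

    # the intercept enters as a regressor that always has the value 1
    X = np.column_stack([np.ones(len(x1)), x1, x2])  # N x p matrix, here p = 3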

  35. Assumptions for Least Squares Estimation Values of explanatory variables are known EXACTLY • random error is strictly in the response variable • practically - a random component will almost always be present in the explanatory variables as well • we assume that this component has a substantially smaller effect on the response than the random component in the response • if random fluctuations in the explanatory variables are important, consider alternative method (“Errors in Variables” approach)

  36. Assumptions for Least Squares Estimation The form of the equation provides an adequate representation for the data • can test adequacy of model as a diagnostic Variance of random error is CONSTANT over range of data collected • e.g., variance of random fluctuations in thickness measurements at high temperatures is the same as variance at low temperatures • data is “heteroscedastic” if the variance is not constant - different estimation procedure is required • thought - percentage error in instruments?

  37. Assumptions for Least Squares Estimation The random fluctuations in each measurement are statistically independent from those of other measurements • at same experimental conditions • at other experimental conditions • implies that random component has no “memory” • no correlation between measurements Random error term is normally distributed • typical assumption • not essential for least squares estimation • important when determining confidence intervals, conducting hypothesis tests

  38. Least Squares Estimation - graphically least squares - minimize sum of squared prediction errors [plot of response (solder thickness) vs. T: data points scattered about the deterministic “true” relationship; the vertical distance from each point to the line is the prediction error, or “residual”]

  39. More Notation and Terminology Random error is “independent, identically distributed” (IID) - we can say that it is IID Normal. Capitals - Y - denote random variables - except in the case of explanatory variables, where a capital is used to denote the formal def’n. Lower case - y, x - denotes measured values of variables. [Model and Measurement equations: the model in terms of the random variable Y, the measurement in terms of the observed value y]

  40. More Notation and Terminology Estimate - denoted by “hat” • examples - estimate of the response (y_hat), estimate of a parameter (beta_hat) Residual - difference between measured and predicted response: e_i = y_i - y_hat_i

  41. Matrix Representation for Multiple Regression We can arrange the observations in “tabular” form - a vector of observations Y = [Y_1 … Y_N]' and a matrix of explanatory values X whose i-th row is [x_i1 … x_ip]

  42. Matrix Representation for Multiple Regression The model is written as: Y = X beta + epsilon where Y is an Nx1 vector, epsilon is an Nx1 vector, beta is a px1 vector, and X is an Nxp matrix N --> number of data observations p --> number of parameters

  43. Least Squares Parameter Estimates We make the same assumptions as in the straight line regression case: • independent random noise components in each observation • explanatory variables known exactly - no randomness • variance constant over experimental region (identically distributed noise components)

  44. Residual Vector Given a set of parameter values beta, the residual vector is formed from the matrix expression: e = y - X beta

  45. Sum of Squares of Residuals … is the same as before, but can be expressed as the squared length of the residual vector: SSE = e'e = (y - X beta)'(y - X beta)

  46. Least Squares Parameter Estimates Find the set of parameter values that minimize the sum of squares of residuals (SSE) • apply necessary conditions for an optimum from calculus (stationary point) • system of N equations in p unknowns, with number of parameters < number of observations : over-determined system of equations • solution - set of parameter values that comes “closest to satisfying all equations” (in a least squares sense)

  47. Least Squares Parameter Estimates The solution is: beta_hat = (X'X)^-1 X'y The quantity (X'X)^-1 X' is the generalized matrix inverse of X - a generalization of the standard concept of a matrix inverse to the case of a non-square matrix
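
A sketch of the estimate in numpy; the explicit normal-equations form mirrors the formula above, and the library solver np.linalg.lstsq is the numerically safer route in practice:

    import numpy as np

    def least_squares_estimates(X, y):
        """beta_hat = (X'X)^-1 X'y via the normal equations (for illustration)."""
        return np.linalg.solve(X.T @ X, X.T @ y)

    # preferred in practice: np.linalg.lstsq(X, y, rcond=None)[0]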

  48. Example - Solder Thickness Let’s analyze the data considered for the straight line case: Model: thickness_i = beta_0 + beta_1 (temperature_i) + epsilon_i

  49. Example - Solder Thickness In matrix form: y = [171.6, 201.1, 213.2, 153.3, 178.9, 226.6, 190.3, 171.0, 197.5, 209.8]' and X is the 10x2 matrix whose rows are [1, 245], [1, 215], [1, 218], [1, 265], [1, 251], [1, 213], [1, 234], [1, 257], [1, 244], [1, 225]

  50. Example - Solder Thickness In order to calculate the Least Squares Estimates, form X'X and X'y and solve the normal equations (X'X) beta_hat = X'y (see the sketch below):
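
Putting the pieces together for the solder data, a sketch of the full calculation (the printed estimates, roughly beta_0 = 458.1 and beta_1 = -1.127 microns per degree C, are my own computation from the tabulated data, not taken from the slides):

    import numpy as np

    temp  = np.array([245, 215, 218, 265, 251, 213, 234, 257, 244, 225], float)
    thick = np.array([171.6, 201.1, 213.2, 153.3, 178.9, 226.6,
                      190.3, 171.0, 197.5, 209.8])

    X = np.column_stack([np.ones(len(temp)), temp])   # column of ones = intercept
    beta_hat = np.linalg.solve(X.T @ X, X.T @ thick)  # (X'X)^-1 X'y
    residuals = thick - X @ beta_hat
    sse = residuals @ residuals                       # sum of squares of residuals
    print(beta_hat, sse)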
