LINEAR REGRESSION By: Emmalyn S. Sumpay MSCN
Linear regression is used to make predictions about a single value. Simple linear regression involves finding the equation of the line that best fits the given data. That linear equation is then used to predict values for the data.
A technique in which a straight line is fitted to a set of data points to measure the effect of a single independent variable. The slope of the line is the measured impact of that variable.
Linear regression analyzes the relationship between two variables, X and Y. • The goal of linear regression is to find the line that best predicts Y from X. It does this by finding the line that minimizes the sum of the squares of the vertical distances of the points from the line.
Slope and intercept • Prism reports the best-fit values of the slope and intercept, along with their standard errors and confidence intervals. • The slope quantifies the steepness of the line. It equals the change in Y for each unit change in X. It is expressed in the units of the Y-axis divided by the units of the X-axis. If the slope is positive, Y increases as X increases. If the slope is negative, Y decreases as X increases. • The Y intercept is the Y value of the line when X equals zero. It defines the elevation of the line.
A linear regression line has an equation of the form Y = a + bX, where X is the explanatory variable and Y is the dependent variable. The slope of the line is b, and a is the intercept (the value of Y when X = 0).
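As a sketch of how the slope b and intercept a can be computed directly from data, the standard closed-form least-squares formulas can be coded in a few lines. The data values below are made up for illustration:

```python
# Sketch: fitting Y = a + bX with the closed-form least-squares
# formulas (example data is made up for illustration).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope b: sum of cross-deviations divided by sum of squared
# X-deviations (covariance over variance, up to a constant).
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)

# Intercept a: the value of Y when X = 0.
a = mean_y - b * mean_x

def predict(x):
    """Predict Y from X using the fitted line Y = a + bX."""
    return a + b * x

print(f"Y = {a:.2f} + {b:.2f}X")
print(f"prediction at X = 6: {predict(6.0):.2f}")
```

A positive b here matches the slide's interpretation: Y increases as X increases, by b units of Y per unit of X.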
Least-Squares Regression • The most common method for fitting a regression line is the method of least-squares. This method calculates the best-fitting line for the observed data by minimizing the sum of the squares of the vertical deviations from each data point to the line (if a point lies on the fitted line exactly, then its vertical deviation is 0). Because the deviations are first squared, then summed, there are no cancellations between positive and negative values.
The names of the variables on the X and Y axes vary according to the field of application. Some of the more common usages are: