ECONOMETRICS I CHAPTER 3: TWO VARIABLE REGRESSION MODEL: THE PROBLEM OF ESTIMATION • Textbook: Damodar N. Gujarati (2004) Basic Econometrics, 4th edition, The McGraw-Hill Companies
3.1 THE METHOD OF ORDINARY LEAST SQUARES • PRF: Yi = β1 + β2Xi + ui • SRF: Yi = β̂1 + β̂2Xi + ûi • How is the SRF determined? • We do not minimize the sum of the residuals! • Why not? Because large positive and negative residuals cancel each other out, so even a badly fitting line can yield a sum of residuals equal to zero.
3.1 THE METHOD OF ORDINARY LEAST SQUARES • We adopt the least-squares criterion: we want to minimize the sum of the squared residuals. • This sum is a function of the estimated parameters: Σûi² = Σ(Yi − β̂1 − β̂2Xi)² • Differentiating with respect to β̂1 and β̂2 and setting the derivatives equal to zero gives the normal equations: ΣYi = nβ̂1 + β̂2ΣXi and ΣXiYi = β̂1ΣXi + β̂2ΣXi²
3.1 THE METHOD OF ORDINARY LEAST SQUARES • Solving the normal equations simultaneously, we obtain: β̂2 = (nΣXiYi − ΣXiΣYi) / (nΣXi² − (ΣXi)²) and β̂1 = Ȳ − β̂2X̄ • β̂2 can alternatively be expressed in deviation form: β̂2 = Σxiyi / Σxi², where xi = Xi − X̄ and yi = Yi − Ȳ
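The slope and intercept formulas above translate directly into a few lines of code. Below is a minimal sketch in Python/NumPy using a small made-up (X, Y) sample (not the textbook data); all variable names are illustrative only.

```python
import numpy as np

# Small made-up sample (NOT the textbook data), for illustration only
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])

x = X - X.mean()                              # deviations of X from its sample mean
y = Y - Y.mean()                              # deviations of Y from its sample mean

beta2_hat = (x * y).sum() / (x ** 2).sum()    # slope: sum(x*y) / sum(x^2)
beta1_hat = Y.mean() - beta2_hat * X.mean()   # intercept: Ybar - beta2_hat * Xbar

Y_hat = beta1_hat + beta2_hat * X             # fitted values from the SRF
u_hat = Y - Y_hat                             # residuals

print(beta1_hat, beta2_hat)
```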
Three Statistical Properties of OLS Estimators I. The OLS estimators are expressed solely in terms of the observable quantities (i.e., X and Y). Therefore they can easily be computed. II. They are point estimators (not interval estimators): given the sample, each estimator provides only a single (point) value of the relevant population parameter. III. Once the OLS estimates are obtained from the sample data, the sample regression line can be easily obtained.
The properties of the regression line • 1. It passes through the sample means of Y and X.
The properties of the regression line • 2. The mean value of the fitted Ŷi is equal to the mean value of the actual Yi. • 3. The mean value of the residuals ûi is zero.
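As a quick numerical check, these properties can be verified directly from the fitted values and residuals. The sketch below reuses the same hypothetical sample and OLS formulas from section 3.1.

```python
import numpy as np

# Same made-up sample and OLS fit as in the sketch for section 3.1
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])
x, y = X - X.mean(), Y - Y.mean()
beta2_hat = (x * y).sum() / (x ** 2).sum()
beta1_hat = Y.mean() - beta2_hat * X.mean()
Y_hat = beta1_hat + beta2_hat * X
u_hat = Y - Y_hat

print(np.isclose(beta1_hat + beta2_hat * X.mean(), Y.mean()))  # line passes through (Xbar, Ybar)
print(np.isclose(Y_hat.mean(), Y.mean()))                      # mean of fitted Y equals mean of actual Y
print(np.isclose(u_hat.mean(), 0.0))                           # mean residual is zero
```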
3.2 The Classical Linear Regression Model: The Assumptions Underlying the Method of Least Squares • Assumption 1: The regression model is linear in the parameters. • Assumption 2: X values are fixed in repeated sampling. • Assumption 3: Zero mean value of the disturbance ui: E(ui | Xi) = 0. • Assumption 4: Homoscedasticity (equal variance of ui): var(ui | Xi) = σ². • Assumption 5: No autocorrelation between the disturbances: cov(ui, uj | Xi, Xj) = 0 for i ≠ j. • Assumption 6: Zero covariance between ui and Xi. • Assumption 7: The number of observations n must be greater than the number of parameters to be estimated. • Assumption 8: Variability in the X values: var(X) must be a finite positive number. • Assumption 9: The regression model is correctly specified. • Assumption 10: There is no perfect multicollinearity among the explanatory variables. • Example of perfect multicollinearity: X1 = 2X2 + X3 (see the sketch below).
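To see what goes wrong under perfect multicollinearity, the short sketch below uses made-up regressor values satisfying X1 = 2X2 + X3: the cross-product matrix X'X becomes singular, so the normal equations have no unique solution.

```python
import numpy as np

# Made-up regressor values; X1 is an exact linear combination of X2 and X3
X2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X3 = np.array([2.0, 1.0, 4.0, 3.0, 6.0])
X1 = 2 * X2 + X3

# Design matrix with a constant term
X = np.column_stack([np.ones_like(X1), X1, X2, X3])

print(np.linalg.matrix_rank(X))   # 3, less than the 4 columns
print(np.linalg.det(X.T @ X))     # (numerically) zero: X'X is singular
```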
PRECISION OR STANDARD ERRORS OF LEAST SQUARES ESTIMATES • var: variance; se: standard error • var(β̂2) = σ² / Σxi² and se(β̂2) = σ / √(Σxi²) • var(β̂1) = (ΣXi² / nΣxi²)·σ² and se(β̂1) = √(ΣXi² / nΣxi²)·σ • σ²: the constant homoscedastic variance of ui • σ̂² = Σûi² / (n − 2): the OLS estimator of σ² • σ̂: the standard error of the estimate
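These variance and standard-error formulas can be computed directly once the residuals are available. The sketch below continues the same hypothetical sample used earlier, with σ² estimated by Σûi²/(n − 2).

```python
import numpy as np

# Same made-up sample and OLS fit as in the earlier sketches
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])
x, y = X - X.mean(), Y - Y.mean()
beta2_hat = (x * y).sum() / (x ** 2).sum()
beta1_hat = Y.mean() - beta2_hat * X.mean()
u_hat = Y - (beta1_hat + beta2_hat * X)

n = X.size
sigma2_hat = (u_hat ** 2).sum() / (n - 2)                               # estimator of sigma^2
se_beta2 = np.sqrt(sigma2_hat / (x ** 2).sum())                         # se(beta2_hat)
se_beta1 = np.sqrt(sigma2_hat * (X ** 2).sum() / (n * (x ** 2).sum()))  # se(beta1_hat)
sigma_hat = np.sqrt(sigma2_hat)                                         # standard error of the estimate

print(se_beta1, se_beta2, sigma_hat)
```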
Gauss–Markov Theorem • An estimator, say the OLS estimator β̂2, is said to be a best linear unbiased estimator (BLUE) of β2 if the following hold: 1. It is linear, i.e., a linear function of a random variable such as the dependent variable Y. 2. It is unbiased, i.e., its expected value E(β̂2) equals the true value β2. 3. It has minimum variance in the class of all linear unbiased estimators.
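Unbiasedness, one part of the BLUE property, can be illustrated with a small Monte Carlo sketch. The "true" parameters below (β1 = 2, β2 = 0.5, σ = 1) are made up for the simulation: across many repeated samples, the average of the OLS slope estimates is close to the true β2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up "true" population parameters for the simulation
beta1, beta2, sigma = 2.0, 0.5, 1.0
X = np.linspace(1.0, 10.0, 20)     # X fixed in repeated sampling
x = X - X.mean()

slopes = []
for _ in range(10_000):
    u = rng.normal(0.0, sigma, size=X.size)   # homoscedastic, uncorrelated disturbances
    Y = beta1 + beta2 * X + u
    y = Y - Y.mean()
    slopes.append((x * y).sum() / (x ** 2).sum())

print(np.mean(slopes))   # close to the true beta2 = 0.5
```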
The coefficient of determination r² • TSS: total sum of squares, Σ(Yi − Ȳ)² • ESS: explained sum of squares, Σ(Ŷi − Ȳ)² • RSS: residual sum of squares, Σûi² • TSS = ESS + RSS, and r² = ESS/TSS = 1 − RSS/TSS
The coefficient of determination r² • The quantity r² thus defined is known as the (sample) coefficient of determination and is the most commonly used measure of the goodness of fit of a regression line. Verbally, r² measures the proportion or percentage of the total variation in Y explained by the regression model.
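A short sketch of the decomposition, again using the hypothetical sample from the earlier examples: TSS = ESS + RSS, and the two expressions r² = ESS/TSS and r² = 1 − RSS/TSS agree.

```python
import numpy as np

# Same made-up sample and OLS fit as in the earlier sketches
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])
x, y = X - X.mean(), Y - Y.mean()
beta2_hat = (x * y).sum() / (x ** 2).sum()
beta1_hat = Y.mean() - beta2_hat * X.mean()
Y_hat = beta1_hat + beta2_hat * X
u_hat = Y - Y_hat

TSS = ((Y - Y.mean()) ** 2).sum()      # total sum of squares
ESS = ((Y_hat - Y.mean()) ** 2).sum()  # explained sum of squares
RSS = (u_hat ** 2).sum()               # residual sum of squares

r2 = ESS / TSS
print(np.isclose(TSS, ESS + RSS))      # the decomposition holds
print(r2, 1 - RSS / TSS)               # the two expressions for r^2 agree
```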
The coefficient of correlation r • r is the sample correlation coefficient, a measure of the degree of linear association between the two variables. • In the two-variable model, r = ±√r², taking the sign of the slope coefficient β̂2.
Homework • Study the numerical example on pages 87-90. There will be questions on the midterm exam similar to the ones in this example. • The data are given on page 88 of the textbook.