
Do Corrections for Selectivity Bias Improve Forecasts of Navy Officer Retention?


Presentation Transcript


  1. Do Corrections for Selectivity Bias Improve Forecasts of Navy Officer Retention? Principal Investigator: Walter Mayer Research Assistant: Yang Li

  2. General Forecasting Problem Forecast Stay/Leave Decisions of Navy Officers Conditional on: • Index of the military-civilian earnings differential • Conditions in the civilian economy • Individual attributes • Having chosen to stay in previous periods (Selectivity Bias)

  3. Issue: Do “Corrections” for Selectivity Bias Improve Forecasts? • Corrections estimate more parameters or restrict the sample • “Corrections” decrease bias but increase variance • Net effect on mean squared error?
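The tradeoff posed on this slide is the usual bias-variance decomposition of mean squared error. A minimal LaTeX statement (a standard result, not taken from the slides), with \hat{p} a forecast of the true retention probability p:

    % Requires amsmath/amssymb. Corrections shrink the bias term but can
    % inflate the variance term, so the net effect on MSE is ambiguous.
    \[
      \mathrm{MSE}(\hat{p})
        = \mathbb{E}\!\left[(\hat{p} - p)^2\right]
        = \underbrace{\left(\mathbb{E}[\hat{p}] - p\right)^2}_{\text{bias}^2}
        + \underbrace{\mathrm{Var}(\hat{p})}_{\text{variance}}
    \]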

  4. Important Factors? • Sample size • Ratio of “stays” to “leaves” in the sample • Economic conditions

  5. Previous Studies • Not previously investigated for our model • Type 2 Tobit Model: Zuehlke and Zeman (1991, Review of Economics and Statistics); Leung and Yu (1996, Journal of Econometrics) • Evidence favors “uncorrected” techniques for certain sample sizes and model configurations • Did not consider semiparametric techniques

  6. Econometric Model Specification Sequence of Stay/Leave Decisions • Linear Random Effects: y*_it = x_it b + c_i + u_it, where y*_it = net benefit of the decision to stay for the ith officer at decision point t (latent variable), x_it = observable explanatory variables, c_i = unobserved individual effect, u_it = unobserved transitory error term; i = 1,…,N individuals; t = 1,…,T_i decision points.

  7. Sampling Process • Observed variables: (y_it, x_it), where y_it = 1 if y*_it > 0 (stay in Navy), y_it = 0 otherwise (leave Navy) • Sample selection for t > 1: (y_it, x_it) is observed only if z_it = 1, where z_it = 1 if y_i,t-1 = … = y_i1 = 1, and z_it = 0 otherwise.
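A minimal simulation sketch (in Python, not the LIMDEP programs used in the study) of the model and sampling process on slides 6-7: the latent net benefit y*_it = x_it b + c_i + u_it, the observed stay/leave indicator y_it, and the selection indicator z_it that keeps an officer in the sample only while every earlier decision was "stay". The coefficient values, the two covariates, and the error scales are illustrative assumptions, not the study's design.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 1000, 3                     # officers and decision points (as in the simulated data)
    b = np.array([1.0, -0.5])          # illustrative coefficients (assumption)

    x = rng.normal(size=(N, T, 2))     # observable explanatory variables x_it
    c = rng.normal(scale=0.5, size=N)  # unobserved individual effect c_i
    u = rng.normal(size=(N, T))        # transitory error u_it (normal case)
    # u = rng.standard_t(df=3, size=(N, T))  # non-normal case from the simulations (t, df = 3)

    y_star = x @ b + c[:, None] + u    # latent net benefit y*_it
    y = (y_star > 0).astype(int)       # y_it = 1 (stay) if y*_it > 0, else 0 (leave)

    # Selection: for t > 1, (y_it, x_it) is observed only if the officer stayed
    # at every earlier decision point, i.e. z_it = 1 iff y_i1 = ... = y_i,t-1 = 1.
    z = np.ones((N, T), dtype=int)
    for t in range(1, T):
        z[:, t] = z[:, t - 1] * y[:, t - 1]

    for t in range(T):
        n_obs = int(z[:, t].sum())
        n_stay = int(y[z[:, t] == 1, t].sum())
        print(f"DP{t + 1}: observed={n_obs}, stays={n_stay} ({100 * n_stay / n_obs:.0f}%)")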

  8. Forecasting Retention Model and estimate p_it(x_it, Σ) = P(y_it = 1 | y_i,t-1 = 1, …, y_i1 = 1, x_it). Predict “stay” iff p_it(x_it, Σ) ≥ 0.50. • Probit: parametric model; depends on b and Σ; assumes normal errors • MSCORE: semiparametric model; depends on b; Σ is unrestricted; assumes conditional symmetry; use kernel regression to estimate p_it(x_it, Σ)
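A hedged sketch of the forecasting rule on this slide: fit a pooled probit (here via statsmodels) on the selected officer-period observations and predict "stay" whenever the fitted retention probability is at least 0.50. The MSCORE/kernel-regression branch is omitted, and the simulated design repeats the illustrative assumptions of the previous sketch.

    import numpy as np
    import statsmodels.api as sm

    # Compact version of the earlier illustrative simulation.
    rng = np.random.default_rng(0)
    N, T = 1000, 3
    b = np.array([1.0, -0.5])
    x = rng.normal(size=(N, T, 2))
    c = rng.normal(scale=0.5, size=N)
    y = (x @ b + c[:, None] + rng.normal(size=(N, T)) > 0).astype(int)
    z = np.ones((N, T), dtype=int)
    for t in range(1, T):
        z[:, t] = z[:, t - 1] * y[:, t - 1]

    # Pooled probit on the observed (selected) officer-period observations.
    obs = z.ravel() == 1
    X = sm.add_constant(x.reshape(-1, 2)[obs])
    fit = sm.Probit(y.ravel()[obs], X).fit(disp=0)

    # Forecast rule: predict "stay" iff the estimated retention probability >= 0.50.
    p_hat = fit.predict(X)
    predict_stay = (p_hat >= 0.50).astype(int)
    print("share predicted to stay:", predict_stay.mean().round(3))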

  9. Corrections for Selectivity Bias (a) “Uncorrected” methods restrict: p_it(x_it, Σ) = P(y_it = 1 | x_it) = g_it(x_it) • Pooled Probit assumes Σ is diagonal • Pooled MSCORE: apply standard MSCORE to pooled cross-sections

  10. Corrections for Selectivity Bias (b) “Corrected” methods allow: p_it(x_it, Σ) ≠ P(y_it = 1 | x_it) • Pooled Probit with Σ not diagonal. Cost: estimate more parameters • Pooled MSCORE: restrict the sample to the set A_it = {x_it : x_it b ≥ 0 or E(y*_it | z_it = 1, x_it) < 0}, t > 1. Cost: use fewer observations
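A rough sketch of the sample-restriction idea behind the "corrected" MSCORE on this slide, under loud assumptions: the non-diagonal-Σ probit is not implemented (it would need a multiperiod probit with selection); instead, for t > 1 the sketch keeps observations with a non-negative estimated index x_it b, or with evidence that E(y*_it | z_it = 1, x_it) < 0, which is proxied here by a selected-sample fitted probability below 0.5. Both first-stage estimates are stand-ins, not the estimators used in the study.

    import numpy as np
    import statsmodels.api as sm

    # Same compact illustrative simulation as the earlier sketches.
    rng = np.random.default_rng(0)
    N, T = 1000, 3
    b = np.array([1.0, -0.5])
    x = rng.normal(size=(N, T, 2))
    c = rng.normal(scale=0.5, size=N)
    y = (x @ b + c[:, None] + rng.normal(size=(N, T)) > 0).astype(int)
    z = np.ones((N, T), dtype=int)
    for t in range(1, T):
        z[:, t] = z[:, t - 1] * y[:, t - 1]

    obs = z.ravel() == 1
    X_all = sm.add_constant(x.reshape(-1, 2)[obs])
    y_all = y.ravel()[obs]
    t_idx = np.tile(np.arange(T), N)[obs]      # decision point of each pooled row

    # "Uncorrected": pool every selected observation and ignore the selection rule.
    uncorrected = sm.Probit(y_all, X_all).fit(disp=0)

    # "Corrected" (sample-restriction flavour): for t > 1 keep x_it in A_it,
    # i.e. estimated index >= 0 or estimated E(y*_it | z_it = 1, x_it) < 0
    # (the latter proxied by a fitted selected-sample probability below 0.5).
    index_hat = X_all @ uncorrected.params
    p_sel = uncorrected.predict(X_all)
    keep = (t_idx == 0) | (index_hat >= 0) | (p_sel < 0.5)
    corrected = sm.Probit(y_all[keep], X_all[keep]).fit(disp=0)

    print("pooled n:", len(y_all), " restricted n:", int(keep.sum()))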

  11. Overview of Findings: Simulated Data “leaves” 125 (24%), “stays” 397 (76%) • For “leaves”, uncorrected methods are best: MSCORE, Probit • For “stays”, corrected methods are best: MSCORE, Probit • Does not depend on the distribution of errors: normal distribution, t-distribution (df = 3) • Probit versus MSCORE: MSCORE does slightly better with non-normal errors; otherwise, neither method dominates

  12. Overview of Findings: Navy Officers (Nuclear) Data “leaves” 659 (44%), “stays” 847 (56%) • MSCORE: the uncorrected method is best for “leaves”, the corrected method is best for “stays” • Probit: the corrected method is best for “leaves”, the uncorrected method is best for “stays” • Probit versus MSCORE: neither method dominates

  13. Description of Data: Nuclear Officers Data • Source: Navy’s Officer Master File • Data extracted and organized by ONR • Sample characteristics: 4317 nuclear officers

                        DP1          DP2          DP3
    Observations (Y)    4317         2075         1506
    Stays (%)           2075 (48%)   1506 (73%)    847 (56%)

  • 7 independent variables: ACOL, UE, NONWHT, LOS, ACAD, ROTC, DEP

  14. Description of Data: Simulated Data

                        DP1          DP2          DP3
    Observations (Y)    1000          707          522
    Stays (%)            707 (71%)    522 (74%)    397 (76%)

  • Random number generators: LIMDEP • 3 decision points • 2 independent variables • Errors: normal, non-normal (t-distribution, df = 3)

  15. Out-of-sample Prediction Simulation Result: Normal Errors, Decision Point Three • Probit without Correction • Probit with Correction

  16. Out-of-sample Prediction Simulation Result: Normal Errors, Decision Point Three • MSCORE without Correction • MSCORE with Correction

  17. Out-of-sample Prediction Simulation Result: Non-normal Errors, Decision Point Three • Probit without Correction • Probit with Correction

  18. Out-of-sample Prediction Simulation Result: Non-normal Errors, Decision Point Three • MSCORE without Correction • MSCORE with Correction

  19. Out-of-sample Prediction Nuclear Officers Data Result: Decision Point Three • Probit without Correction • Probit with Correction

  20. Out-of-sample Prediction Nuclear Officers Data Result: Decision Point Three • MSCORE without Correction • MSCORE with Correction
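Slides 15-20 present the out-of-sample comparisons as charts that do not survive in this transcript; the findings above report them separately for "stays" and "leaves". A small hedged sketch of how class-specific out-of-sample accuracy at a decision point can be tabulated, using made-up hold-out vectors rather than the study's forecasts:

    import numpy as np

    def class_accuracy(actual, predicted):
        """Share of correctly forecast "stays" (actual = 1) and "leaves" (actual = 0)."""
        actual = np.asarray(actual)
        predicted = np.asarray(predicted)
        return {
            "stays correct": float((predicted[actual == 1] == 1).mean()),
            "leaves correct": float((predicted[actual == 0] == 0).mean()),
        }

    # Illustrative hold-out decisions and 0/1 forecasts (not the study's results).
    actual    = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
    predicted = np.array([1, 0, 0, 1, 1, 1, 1, 0, 1, 0])
    print(class_accuracy(actual, predicted))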
