The (Mis)Use of Statistics in Evaluating ASP Programs O.J. Salinas & Jon McClanahan University of North Carolina School of Law
Successful Program? Bar Prep Program Negatively Correlated With Bar Passage!
Successful Program? ASP Program Causes Significant Improvements in Grades!
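The paradox on these slides can arise from self-selection: if the students most at risk are the ones who enroll, a bar prep program with a genuinely positive effect can still show a negative raw correlation with bar passage. A minimal simulation sketch, where all variable names, effect sizes, and thresholds are hypothetical assumptions:

```python
# Hypothetical simulation: self-selection can make an effective bar prep
# program look harmful. All numbers below are illustrative assumptions.
import random

random.seed(1)

def simulate(n=10_000):
    rows = []
    for _ in range(n):
        ability = random.gauss(0, 1)        # unobserved preparedness
        enrolls = ability < -0.5            # at-risk students self-select in
        boost = 0.4 if enrolls else 0.0     # true POSITIVE program effect
        passes = (ability + boost + random.gauss(0, 1)) > 0
        rows.append((enrolls, passes))
    return rows

rows = simulate()
pass_enrolled = sum(p for e, p in rows if e) / sum(1 for e, p in rows if e)
pass_other = sum(p for e, p in rows if not e) / sum(1 for e, p in rows if not e)
print(f"pass rate, enrolled:     {pass_enrolled:.2f}")
print(f"pass rate, not enrolled: {pass_other:.2f}")
# Enrolled students pass less often even though the program helps them,
# because they started further behind.
```

Comparing enrollees to non-enrollees without accounting for who opts in conflates the program's effect with the baseline gap between the groups.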
Sampling Basics • Population of Interest • Measurement • Sample(s)
Introducing Error During Sampling • Misidentifying Population of Interest • Sampling Bias • Self-Selection (Volunteer) Bias • Exclusion Bias • Using Improper Comparison Groups
Identifying Outcome Measures • Objective Measures • Performance in Individual Courses • GPA / Change in GPA • Performance on the Bar Examination • Subjective Measures • Program-specific Evaluation Forms • School-wide Evaluation Forms
Introducing Error During Measurement • Objective Measures • Instrument Bias • Confounding Variables • Subjective Measures • “Experiential” Bias • Loaded and Compound Questions
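One way a confounding variable distorts an objective measure: if an unmeasured factor (say, motivation) drives both program participation and grade improvement, a program with zero true effect can still show larger GPA gains among participants. A hypothetical sketch, where the variable names and effect sizes are assumptions for illustration only:

```python
# Hypothetical simulation: a confounder ("motivation") makes a program
# with ZERO true effect appear to raise GPA. Illustrative numbers only.
import random

random.seed(2)

def simulate(n=10_000):
    data = []
    for _ in range(n):
        motivation = random.gauss(0, 1)      # unobserved confounder
        joins = motivation > 0.5             # motivated students join ASP
        # True program effect is zero; GPA change tracks motivation only.
        gpa_change = 0.3 * motivation + random.gauss(0, 0.2)
        data.append((joins, gpa_change))
    return data

data = simulate()
in_prog = [g for j, g in data if j]
out_prog = [g for j, g in data if not j]
mean_in = sum(in_prog) / len(in_prog)
mean_out = sum(out_prog) / len(out_prog)
print(f"mean GPA change, ASP participants: {mean_in:+.2f}")
print(f"mean GPA change, non-participants: {mean_out:+.2f}")
```

The gap between the two group means reflects motivation, not the program, which is why a raw before/after GPA comparison cannot by itself support the claim that a program "causes" improvement.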
(Some) Study Design Best Practices • Take stock before implementation. • Proceed step-by-step in making program changes. • Take steps to reduce sampling bias, through outreach and alternatives. • Quantify and capture related data. • Evaluate along several measures.
Avoiding Pitfalls in Analyzing Data and Communicating Results • Provide Context and Baselines. • Acknowledge Limitations in Experiment Design and Data Analysis. • Distinguish Between Profiling and Predicting. • Avoid Overgeneralizations. • Remember: Not Everything is Quantifiable.
Thank You! O.J. Salinas osalinas@email.unc.edu Jon McClanahan jonmc@email.unc.edu