Between a Rock and a Hard Place Scientifically Based Evidence in Evaluation Presentation to the Canadian Evaluation Society Vancouver, B.C. June 3, 2003
What Works Clearinghouse
• A new U.S. federal initiative
• To be an ongoing public resource that will assess and report scientific evidence on “What Works?” in education
• Systematic review processes will be used to report on the quantity, quality, and relevance of evidence, and magnitude of effects of specific educational interventions
What is empirical evidence?
• Scientifically based research from fields such as psychology, sociology, speech & hearing, economics, and neuroscience, and especially from research in educational settings
• Empirical data on performance used to compare, evaluate, and monitor progress
Quality: Levels of Evidence
All evidence is NOT equal for questions of effectiveness (what works)
• Randomized trial (true experiment)
• Comparison groups (quasi-experiment)
• Pre-/post- comparison
• Correlational studies
• Case studies
• Anecdotes
What might this mean for evaluators?
• Evaluation questions reduced to “what works” questions
• Evaluation costs redirected from complex studies of multiple variables to RCTs
• Evaluators’ bookshelves filling up with research design texts and pushing evaluation texts aside
• Logic models falling into disuse
• Evaluations tapped for inclusion in reviews/syntheses often not complete
• Client requests for RCT evaluations
Randomized Trials: The gold standard
• Claims about the effects of an educational intervention on outcomes
• Two or more conditions that differ in levels of exposure to the educational intervention
• Random assignment to conditions
• Tests for differences in outcomes (see the sketch below)
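The basic logic of this design can be illustrated with a minimal sketch, not taken from the presentation: the participant pool, sample sizes, outcome scores, and effect size below are all simulated and hypothetical, and a two-sample t-test stands in for whatever analysis plan a real trial would specify in advance.

```python
# Minimal sketch of a two-condition randomized trial.
# Data, sample sizes, and effect size are simulated (hypothetical example).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2003)

# Randomly assign a hypothetical pool of 200 participants to two conditions.
participants = rng.permutation(200)
treatment_ids, control_ids = participants[:100], participants[100:]

# Simulate outcome scores; the treatment condition gets a small added effect.
treatment_scores = rng.normal(loc=72, scale=10, size=treatment_ids.size) + 3
control_scores = rng.normal(loc=72, scale=10, size=control_ids.size)

# Test for a difference in mean outcomes between the two conditions.
t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
print(f"Treatment mean: {treatment_scores.mean():.1f}")
print(f"Control mean:   {control_scores.mean():.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

In a real evaluation, the assignment mechanism, outcome measures, and statistical test would be defined before data collection rather than simulated after the fact.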