Using interventions to increase the mobilization of knowledge in schools • Ben Levin & Amanda Cooper, with the assistance of Shalini Mascarenhas and Kathy Thompson • AERA 2010
Context • Growing interest in research use in education • Most research focus has been on researchers • Less attention to take-up in the school system • Weak mechanisms for research use in school systems • Interventions have been attempted (primarily in health) • Impact has been hard to demonstrate • Methodological challenges
Research Questions • What knowledge do educational leaders have about some important research findings related to improving secondary schools? • What were the sources of evidence they referred to in relation to these findings? • How do they rate the importance of each source to their acceptance of the findings? • What interventions might improve the availability and use of research in schools?
Survey Sample • 11 school districts • 100 secondary schools • Potential respondents (superintendents, principals, vice-principals, and others in leadership roles) estimated at 350 • Pre-intervention: 188 responses • Post-intervention: 169 responses (implied response rates shown below)
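A derived figure, not stated on the original slide: measured against the estimated pool of about 350 potential respondents, the implied response rates are roughly

$$ \frac{188}{350} \approx 54\% \quad \text{(pre-intervention)}, \qquad \frac{169}{350} \approx 48\% \quad \text{(post-intervention)}. $$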
Findings on Knowledge Claims • Agreement on 3 knowledge claims: • Student disengagement (94% agreement) [Evidence – true] • Similar demographics, different outcomes (79% agreement) [Evidence – true] • Quality of teaching and learning (87% agreement) [Evidence – true] • Disagreement on 3 knowledge claims: • Failing a course affects dropping out (63% agree; 24% disagree) [Evidence – true] • Grades predict post-secondary success (39% agree; 36% disagree) [Evidence – false] • Students believe that school prepares them (37% agree; 36% disagree) [Evidence – false]
Sources of Knowledge Reported Across All Claims • PD events, seminars, conferences • Research reports • Data collected in your school district • Colleagues, professional networks • Personal experience
Implementation Was Difficult • District commitment was variable • Did not appear related to district size • Was related to existing district structures and priorities • Participation was uneven within districts • Lack of skills, capacity, and infrastructure for this work, even in large districts
Measuring Impact • No impacts were detectable in the post-intervention survey data • Promising design • Intermediate outcome (awareness of particular knowledge claims) as an indicator of impact (Lavis et al., 2003) • Method challenges (see the sketch below) • Ensure more consistency in who responds to the pre- and post-surveys • Ensure that respondents actually took part in the interventions
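A minimal sketch of the method challenge named above, not part of the original study: restricting the pre/post comparison to respondents who answered both surveys and who took part in an intervention. All names and data structures (respondent IDs, an "awareness" flag, a participant set) are hypothetical illustrations, not the study's actual instrument.

```python
# Hypothetical sketch: match pre- and post-survey respondents before comparing
# awareness of a knowledge claim. Field names are assumptions for illustration only.

def matched_awareness_change(pre_responses, post_responses, participants):
    """pre_responses / post_responses: dicts mapping respondent_id -> 1 if the
    respondent agreed with (was aware of) a given knowledge claim, else 0.
    participants: set of respondent_ids known to have taken part in an intervention."""
    # Keep only respondents who answered both surveys AND joined an intervention.
    matched = set(pre_responses) & set(post_responses) & participants
    if not matched:
        return None
    pre_rate = sum(pre_responses[r] for r in matched) / len(matched)
    post_rate = sum(post_responses[r] for r in matched) / len(matched)
    return {"n": len(matched), "pre": pre_rate, "post": post_rate,
            "change": post_rate - pre_rate}

# Toy example (fabricated data, for illustration only)
pre = {"a": 1, "b": 0, "c": 0, "d": 1}
post = {"a": 1, "b": 1, "c": 1, "e": 0}
took_part = {"a", "b", "c"}
print(matched_awareness_change(pre, post, took_part))
# -> {'n': 3, 'pre': 0.33..., 'post': 1.0, 'change': 0.66...}
```

The point of the matching step is that an aggregate pre/post comparison over two partially overlapping respondent pools (as in the surveys reported here) cannot separate real changes in awareness from changes in who happened to respond.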
References • Lavis, J., Ross, S., McLeod, C., & Gildiner, A. (2003). Measuring the impact of health research. Journal of Health Services Research & Policy, 8(3), 165-170. • Research Supporting Practice in Education (RSPE) program: www.oise.utoronto.ca/rspe • Thank you!