Improving Teacher Quality Grants, Cycle 3: External Evaluation Report
December 8, 2006
University of Missouri-Columbia Evaluation Team
Evaluation Team
Principal Investigators: Sandra Abell, Fran Arbaugh, James Cole, Mark Ehlert, John Lannin, Rose Marra
Graduate Research Assistants: Kristy Halverson, Kristen Hutchins, Zeynep Kaymaz, Michele Lee, Dominike Merle, Meredith Park Rogers, Chia-Yu Wang
Context of the Evaluation • Improving Teacher Quality grants program, Cycle 3, 2005-2006 • Focus on high-need schools • 9 professional development projects • Science and Mathematics, grades 4-8
Evaluation Model (adapted from Guskey, 2000)
Purpose of Evaluation • Formative evaluation • PD environment evaluation • Summative evaluation • Participant reaction • Participant learning—content knowledge and inquiry • Participant use of knowledge • Organization change • Student learning
Methods—Formative • Site visits • Interviews: teachers and staff • Observations • Formative feedback report
Methods—PD Environment • Teacher Participant Data Questionnaire • Site visits • Interviews: teachers and staff • Observations • Surveys to PIs (Teaching Philosophy Survey and Seven Principles) • PI preliminary report
Methods—Outcomes • Participant reactions • Site visits • Teacher Participant Survey 1 and 2 • Participant learning—content knowledge • Project-specific tests (all 9 projects) • Participant learning—inquiry • Teaching Philosophy Survey • Seven Principles • Participant use of knowledge • Teacher Participant Survey • Interviews • Seven Principles • Implementation Logs
Methods—Outcomes (cont.) • Organization change • Higher Education Impact Survey • Student learning • Teacher-assessed (3 projects) • Teacher Participant Survey • MAP analyses
Participant Summary • 252 participants • 86% female; 81% white • 40% held a master's degree or higher • 76% held their first bachelor's degree in a field other than science or math • Represented 76 different Missouri school districts, 6 private schools, and 2 charter schools • Directly impacted 16,747 students in 2005-2006
Results • PD Environment • Participant Reactions • Outcomes • Participant Content Knowledge • Participant Knowledge of Inquiry • Participant Use of Knowledge of Inquiry • Organization Change • Student Learning
PD Environment—PI Beliefs (n = 19). Scale: 1 = least constructivist response, 3 = neutral, 5 = most constructivist.
Participant Reactions (1-5 scale)
Participant Performance on Content Knowledge—Post/Pre Tests. Posttest scores are presented as a percentage of pretest scores.
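For reference, this metric is presumably computed as (mean posttest score ÷ mean pretest score) × 100: for example, a pretest mean of 20 and a posttest mean of 26 would yield 130%, and any value above 100 indicates a gain in content knowledge.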
Participant Change in Inquiry Knowledge. Significance levels: *p < .05, **p < .01, ***p < .001.
Participant Change in Inquiry Usage. Significance levels: *p < .05, **p < .01, ***p < .001.
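The slides do not state which statistical test produced these significance levels. The sketch below shows one plausible way pre/post change in mean survey scores could be tested, assuming paired ratings on the 1-5 scale; the paired t-test, variable names, and sample data are illustrative assumptions, not the evaluation's actual analysis.

```python
# Illustrative sketch: test whether participants' mean inquiry ratings changed
# from pre- to post-PD. Assumes paired pre/post ratings on a 1-5 scale;
# the data below are hypothetical.
from scipy import stats

pre  = [2.8, 3.1, 2.5, 3.0, 2.9, 3.2, 2.7, 3.0]   # hypothetical pre-PD ratings
post = [3.4, 3.6, 3.1, 3.5, 3.0, 3.8, 3.3, 3.6]   # hypothetical post-PD ratings

t_stat, p_value = stats.ttest_rel(post, pre)       # paired (repeated-measures) t-test

# Map the p-value to the significance stars used in the report tables.
stars = "***" if p_value < .001 else "**" if p_value < .01 else "*" if p_value < .05 else "ns"
print(f"t = {t_stat:.2f}, p = {p_value:.4f} ({stars})")
```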
Participant Use of Knowledge Based on PD Components (n = 116; 0-4 scale)
Organization Change—Impact on Higher Education • Team members from five projects responded to the Higher Education Impact (HEI) Survey • Establishment of new science courses related to the PD projects • Establishment of new education courses • Redesign of courses to include more inquiry-based labs • New or strengthened collaborations between education and science • Increased grant-writing activity on campus
School-Level Performance on MAP • MAP Index and percentage of students in the top two achievement levels • Served vs. not-served schools, by high-need status • Science: 2005-06 performance compared with the average of prior years' performance • Math: no historical comparison possible; examined 2005-06 performance levels by group
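As a rough illustration of the served vs. not-served comparison described above, the sketch below groups a hypothetical set of schools by served status and high-need status and computes the mean change from the prior years' average; the column names and data are placeholders, not the evaluation's actual MAP analysis.

```python
# Illustrative sketch: compare 2005-06 MAP performance for served vs. not-served
# schools, by high-need status. Column names and values are hypothetical.
import pandas as pd

schools = pd.DataFrame({
    "served":              [True, True, False, False, True, False],
    "high_need":           [True, False, True, False, True, True],
    "map_index_2006":      [705.2, 712.8, 698.4, 710.1, 701.9, 695.3],
    "map_index_prior_avg": [700.1, 711.0, 699.0, 708.5, 696.2, 694.8],
})

# Science-style comparison: change from the prior years' average performance.
schools["change"] = schools["map_index_2006"] - schools["map_index_prior_avg"]

# Mean change by served status and high-need status.
print(schools.groupby(["served", "high_need"])["change"].mean())
```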
Summary of Results • Overall, teachers were satisfied with their PD experiences • Valued most: project staff, engaging in activities as their students would, opportunities to improve content knowledge, and working with other teachers • Valued least: lectures, activities geared toward a grade level or subject other than the one they taught, and loosely structured follow-up sessions with no clear purpose
Summary of Results (cont.) • Assessment components were less emphasized than content and inquiry components • Teachers gained content knowledge • Evidence of some improved teacher practice attributed to the projects • Student learning data were mixed • Evidence of impact on higher education is limited but promising in some projects
Conclusions: Effective Project Design Features • Projects demonstrated effective practice to varying degrees • Alignment of projects' content emphasis areas with teacher and school needs is critical • A shared vision and collaboration within the project team were implemented in a variety of ways • Effective emphasis areas: learning science/math through inquiry, collegial learning with other teachers, long-term PD activities, and a sense of community
Conclusions (cont.) • The "smorgasbord" approach, while well intentioned, seemed difficult to carry out • Emphasis on mathematics in the overall Cycle 3 ITQG program was somewhat limited • Individual projects improve over time • Balancing the evaluator's role between the overall program and individual projects continues to be an issue
Limitations • Necessity of sampling • Instruments align with the overall program rather than with specific projects • Low overall response rates (Implementation Logs, end-of-project instruments, higher education impact) • Tension between overall-program evaluation and project-specific evaluation • Lack of student achievement data and poor alignment of the data that were available • Possible impact on the evaluation of the team's ongoing collaboration with PIs and K-12 partners
Recommendations Project Directors: • Continue to build strong relations among PIs and instructional staff. • Build stronger K-12 partnerships. • Balance content and pedagogy. • Emphasize and provide opportunities for practice and feedback on classroom assessment. • Encourage participation in evaluation activities. • Take advantage of formative feedback. • Use literature on best practice when designing and implementing PD.
Recommendations External Evaluators: • Explore ways to reduce participant time on evaluation. • Be proactive in working with PIs and K-12 organizations. • Continue to work with PIs through all phases of evaluation. • Work with MDHE to examine our roles as evaluators.
Recommendations MDHE: • Continue funding multi-year projects. • Encourage true partnerships via RFP wording and reward systems. • Require that the majority of participants are from high-needs districts. • Require minimum hours of PD per project. • Support PI cross-fertilization of best practices.
Questions? Copies of the report and Executive Summary are available at: www.pdeval.missouri.edu