Program Evaluation Strategies to Improve Teaching for Learning
Rossi Ray-Taylor and Nora Martin
Ray.Taylor and Associates
MDE/NCA Spring School Improvement Conference 2008

Overview for the session
• Introductions
• Evaluation comfort levels
• Research-driven school design
Research-driven school design
Rigor, relevance, relationships
Evaluation can provide information about:
• Definition and evidence of the problem (needs assessment)
• Contextual information about the factors related to the problem and solutions
• The input factors and resources
• The interventions, processes, and strategies employed
• Outcomes, results, and effectiveness
Elements of a sound evaluation
• Clear statement of what the project is intended to do and why
• Needs assessment
• Theory of action
• Clear, measurable statement of goal attainment
• Appropriate evaluation methods and tools
• Transparency
Key concepts
• Identify the total system impact of program and financial decisions
• Institutions led by data and evidence of results
Types of Data
• Achievement
• Demographic
• Program
• Perception
• Process
• Costs
Qualities of Data & Information
• Time series, repeated data collection – how does the effect change over time?
• Cohort analysis – overall effect on one group over time
• Benchmarking & standards – makes the data relative, establishes context and comparables
• Triangulation – looking for information in multiple indicators (see the sketch after this list)
• Pattern & trend analysis
• Leading & lagging indicators
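To make these qualities concrete for a building or district data team, here is a minimal sketch in Python. The proficiency rates, cohort figures, indicators, and benchmark are hypothetical placeholders, not data from this presentation; a real analysis would draw on the district's own assessment and survey systems.

```python
# Hypothetical grade 3 reading proficiency rates by year (time series / repeated measures)
reading_by_year = {2005: 0.61, 2006: 0.64, 2007: 0.68, 2008: 0.71}

# Hypothetical results for one cohort followed across grades (cohort analysis)
cohort_2005 = {"grade 3": 0.61, "grade 4": 0.65, "grade 5": 0.70}

# Hypothetical indicators for the same program year (triangulation across data types)
indicators_2008 = {
    "state assessment": 0.71,        # achievement data
    "course completion": 0.78,       # program data
    "teacher survey (agree)": 0.74,  # perception data
}

# Trend: simple year-over-year change
years = sorted(reading_by_year)
changes = [round(reading_by_year[b] - reading_by_year[a], 2) for a, b in zip(years, years[1:])]
print("Year-over-year change:", changes)

# Cohort: overall change for one group over time
grades = list(cohort_2005)
print("Cohort gain, grade 3 to grade 5:",
      round(cohort_2005[grades[-1]] - cohort_2005[grades[0]], 2))

# Triangulation: do multiple indicators agree relative to a benchmark?
benchmark = 0.70  # hypothetical state benchmark
print("All indicators at or above benchmark:",
      all(value >= benchmark for value in indicators_2008.values()))
```

The point is not the arithmetic but the habit: look at the same question over time, for a defined cohort, and through more than one indicator before drawing a conclusion.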
Practical matters
• Accuracy
• Reliability
• Validity
• Accessibility
Evaluation
Employ evaluation strategies from the very beginning of a project, and assemble and review effectiveness data. State measurable outcomes, document processes, and review progress throughout the life of the project.
“Where outcomes are evaluated without knowledge of implementation, the results seldom provide a direction for action because the decision maker lacks information about what produced the outcomes (or lack of outcomes).”
Michael Quinn Patton, quoted in “Data Analysis for Comprehensive Schoolwide Improvement,” Victoria L. Bernhardt, 1998
Central Evaluation Questions
• Did the program / project do what was intended?
• Did the project stick to the plan? There may be valid reasons for varying from the plan; if so, what are they?
What is the theory of change for the project?
• Why is this project being carried out this way, and why is it judged to be the most appropriate way?
• These questions are important because they help when judging the impact of changes in the plan.
What is the context for the project?
• Issues include past trends, local politics, resource distribution, threats, opportunities, strengths, and weaknesses.
What actually happened during the course of the project?
• Who was served by the project? Why?
• Who was not served by the project? Why?
• Was the project valued by the intended audience?
What were the inputs and resources, both real (e.g., financial and material) and intellectual?
• What was the “cost” of the project?
What were the results / outcomes?
• Were goals and objectives met?
• What were the intended and unintended consequences?
• What was the impact on the overall system?
• Was there a process impact – did the project result in a change in the way that business is done?
A successful project begins and ends with a good evaluation design
The best and most “sticky,” lasting interventions have the following components:
• They are based in research and evidence
• They are locally constructed and customized
• They are targeted to a clear view of the problem to be solved
• They are built for sustainability
• They are “owned” – not just a matter of compliance
• They are designed to build local capacity
Understand change and apply research about change to improve teaching for learning
• Change takes trust
• Change takes building relationships
• Change takes endurance (time)
• Change takes knowledge of research
Examine policies and practices that serve as barriers and those that serve as catalysts to achievement
Evaluate
• Audit the system
• Measure results
• Measure change to the goal, not just to awareness or implementation
Meta-evaluation
Use meta-evaluation strategies to look for results across projects and interventions.
Evaluation readiness
• Identify and clearly state program goals and outcomes
• Transform these goals into measurable objectives
• Define program theory and supporting research
Develop the formative and summative evaluation plan
• Develop the plan to gather data, deploy evaluation resources, and gather information from and report to stakeholders (a brief illustrative sketch follows)
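One way to make this tangible is to capture the plan itself as structured data, so measurable objectives, checkpoints, and reporting commitments are explicit and easy to review. The sketch below is illustrative only, in the same spirit as the earlier example; every goal, measure, baseline, target, and date is a hypothetical placeholder rather than part of this presentation.

```python
# A minimal sketch of capturing an evaluation plan as structured data.
# All goals, measures, baselines, targets, and dates are hypothetical placeholders.

evaluation_plan = {
    "goal": "Improve grade 3 reading proficiency",
    "measurable_objectives": [
        {"measure": "state reading assessment, % proficient", "baseline": 0.61, "target": 0.70},
        {"measure": "students receiving tier-2 reading support, % served", "baseline": 0.40, "target": 0.90},
    ],
    "formative_checkpoints": ["October", "January", "March"],  # progress reviews during the year
    "summative_report": "June",                                # end-of-year results report
    "data_sources": ["achievement", "program", "perception"],
    "report_to": ["school board", "building data teams", "parents"],
}

def objective_met(objective, current_value):
    """Summative check: has the measure reached its stated target?"""
    return current_value >= objective["target"]

# Formative check against hypothetical mid-year data
midyear = {
    "state reading assessment, % proficient": 0.66,
    "students receiving tier-2 reading support, % served": 0.85,
}

for obj in evaluation_plan["measurable_objectives"]:
    value = midyear[obj["measure"]]
    status = "above baseline" if value > obj["baseline"] else "at or below baseline"
    print(f'{obj["measure"]}: {value} ({status}); target met: {objective_met(obj, value)}')
```

Writing the plan down in this explicit form, whatever the tool, is what makes formative reviews and the summative report routine rather than an afterthought.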
Consider system evaluation policies and expectations
• At the time of proposal, initiatives, programs, and projects should be designed to include program evaluation.
“It’s easy to make judgments – that’s evaluation. It’s easy to ask questions about impact – that’s evaluation. It’s easy to disseminate reports – that’s evaluation. What’s hard is to put all those pieces together in a meaningful whole which tells people something they want to know and can use about a matter of importance. That’s evaluation.”
Halcolm, quoted in “Data Analysis for Comprehensive Schoolwide Improvement,” Victoria L. Bernhardt, 1998
“In the beginning you think. In the end you act. In between you negotiate the possibilities. Some people move from complexity to simplicity and on into catastrophe. Others move from simplicity to complexity and onward into full scale confusion. Simplification makes action possible in the face of overwhelming complexity. It also increases the odds of being wrong. The trick is to let a sense of simplicity inform our thinking, a sense of complexity inform our actions, and a sense of humility inform our judgments…”
Michael Quinn Patton (p. 143), quoted in “Data Analysis for Comprehensive Schoolwide Improvement,” Victoria L. Bernhardt, 1998
Resources
• Center for Evaluation & Education Policy, www.ceep.indiana.edu
• The Evaluation Center, Western Michigan University
More Resources
• What Works in Schools: Translating Research into Action, by Robert J. Marzano
• Data Analysis for Comprehensive Schoolwide Improvement, by Victoria L. Bernhardt
More Resources
• “The ‘Data Wise’ Improvement Process: Eight Steps for Using Test Data to Improve Teaching and Learning,” by Kathryn Parker Boudett et al., Harvard Education Letter, January/February 2006, Volume 22, Number 1.
Rossi Ray-Taylor, PhD – rossi@raytaylorandassoc.org
Nora Martin, PhD – nora@raytaylorandassoc.org
Ray.Taylor and Associates
2160 S. Huron Parkway, Suite 3
Ann Arbor, Michigan 48104
www.raytaylorandassoc.org