1. Tools for Creating a Culture of Assessment: The CIPP Model and Utilization-Focused Evaluation. Yvonne Belanger, Duke University
Library Assessment Conference
September 25-27, 2006
Charlottesville, VA
2. Key Questions for Libraries
How can we build a culture of evaluation, so that many people contribute to evaluation?
How can we provide a context for evaluation strategies and results?
How can we conduct evaluation that helps with decision making?
Overview
Culture, Context, and Conducting evaluation
Varying evaluation resources are available at different institutions
A drive toward decision-making is acceptable
Today's presentation:
A framework and key issues to consider in evaluation planning
Specific tools, templates, and strategies, with a focus on the CIPP model and utilization-focused evaluation
3. Culture of Assessment “…organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders.”
4. Barriers to a Culture of Assessment
Lack of evaluative thinking (at all levels): intuition-based rather than data-driven decision making, and a lack of systems thinking (how does what I am actually doing connect back to my goals and forward to my intended outcomes?)
Lack of engagement in evaluation
Pseudoevaluations (Stufflebeam, 1999): studies that promote a positive or negative view of a program irrespective of its actual merit and worth, including PR-inspired studies (withholding all negatives) and politically controlled studies
Other factors, discussed by Jim Self, Steve Hiller, and others at this conference: the need for clear leadership and for a specific person or group tasked with assessment
5. Building evaluative thinking: the CIPP Model
Stufflebeam's CIPP Model: Context, Input, Process, and Product evaluation
Focus: decision-making
Purpose: facilitate rational and continuing decision-making, particularly for programs and services with long-term goals
A comprehensive framework for guiding formative and summative evaluations
Based on the presumption that evaluation's most important purpose is not to prove but to improve programs
6. Details of the CIPP Model
Context: Environment & Needs
Input: Strategies & Resources
Process: Monitoring implementation
Product: Outcomes - both quality and significance
More information at www.wmich.edu/evalctr/pubs
The CIPP model was developed by D. Stufflebeam (see the annotated bibliography for references)
Has evolved over 30 years while staying current with ideas from newer approaches, e.g. Patton's utilization-focused evaluation and Guba & Lincoln's stakeholder-focused evaluation
CIPP adapts well to carrying out evaluations on any scale (projects, programs, organizations)
An organizing framework, not a lockstep linear process
Sensitive to needs of decision makers (more detail on that ahead…)
Systems approach – for that reason, using logic modeling to get a systems view of projects and programs can be a useful first step
Multiple observers and informants
Mining existing information
Multiple procedures for gathering data; cross-check qualitative and quantitative
Independent review by stakeholders and outside groups
Feedback from stakeholders (a toy sketch of a CIPP-style plan follows these notes)
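An illustrative aside, not part of the original slides: one way to internalize the four CIPP components is to treat an evaluation plan as structured data. The Python sketch below is a minimal hypothetical example; the component names follow Stufflebeam's model, but every field, question, and data source is an assumption invented for illustration.

from dataclasses import dataclass, field

@dataclass
class CIPPComponent:
    # What this component of the evaluation examines
    focus: str
    # Guiding evaluation questions for this component
    questions: list = field(default_factory=list)
    # Where the evidence will come from
    data_sources: list = field(default_factory=list)

# A hypothetical plan for evaluating a library service; all content is illustrative.
plan = {
    "Context": CIPPComponent(
        focus="Environment & needs",
        questions=["What user needs are unmet?", "What goals and priorities should we set?"],
        data_sources=["user surveys", "institutional data"],
    ),
    "Input": CIPPComponent(
        focus="Strategies & resources",
        questions=["Which competing proposal is most feasible and best aligned with goals?"],
        data_sources=["proposal documents", "budget estimates"],
    ),
    "Process": CIPPComponent(
        focus="Monitoring implementation",
        questions=["Is the service being delivered as planned?"],
        data_sources=["usage logs", "staff observations"],
    ),
    "Product": CIPPComponent(
        focus="Outcomes, both quality and significance",
        questions=["Did the service achieve outcomes that matter to stakeholders?"],
        data_sources=["outcome measures", "stakeholder feedback"],
    ),
}

# The model is an organizing framework, not a lockstep linear process:
# components can be revisited in any order as decisions arise.
for name, component in plan.items():
    print(f"{name}: {component.focus}")

Writing a plan down this way mainly enforces the systems view the notes above recommend: each outcome claim (Product) traces back to a monitored activity (Process), a chosen strategy (Input), and a documented need (Context).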
7. CIPP approach recognizes… “All politics are local”: CIPP offers a tailored evaluation approach designed to answer locally interesting and useful questions; the emphasis is on credibility and usefulness rather than generalizability to other places, times, and audiences.
Tips taken from Stufflebeam's recent writings on using the CIPP approach (OPEN, 2003):
Multiple observers and informants
Multiple procedures for gathering data; cross-checking is often referred to as “triangulating” (a toy example of such cross-checking follows this list)
Mining existing information
Independent review
Stakeholder feedback
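A toy illustration of the cross-checking idea, not from the original slides: track, for each tentative finding, which independent sources support it, and treat agreement across methods as stronger evidence. All findings and source names below are invented for illustration.

# Hypothetical findings from a library evaluation, each checked against
# several independent sources (mixing qualitative and quantitative methods).
findings = {
    "Students want longer evening hours": {
        "user survey": True,
        "focus groups": True,
        "gate counts": True,
    },
    "Chat reference is displacing email reference": {
        "user survey": True,
        "focus groups": False,
        "service logs": True,
    },
}

# Report how broadly each finding is corroborated.
for finding, evidence in findings.items():
    supporting = [source for source, agrees in evidence.items() if agrees]
    print(f"{finding}: supported by {len(supporting)} of {len(evidence)} sources "
          f"({', '.join(supporting)})")

Even this simple bookkeeping makes disagreements visible, which is the point of triangulating: a finding contradicted by one method is a prompt to investigate, not to discard the method.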
8. CIPP View of Institutionalized Evaluation CIPP provides a systematic way of thinking about how evaluation can contribute to short term and long term organizational planning
CIPP for decision makers:
C (Context): Define goals and priorities
I (Input): Assess competing proposals in terms of feasibility and alignment with goals
P (Process): Provide context for interpreting outcomes; plan for service improvement
P (Product): Keep the organization focused on achieving important outcomes; gauge the success of efforts
Connects manager / decision-maker thinking with an evaluation structure that all staff can contribute to and see themselves as a part of
Stufflebeam sees Input as potentially the most neglected type of evaluation (Stufflebeam, OPEN, 2003)
Provides a framework for integrating evaluation as an activity central to achieving broader organizational goals
Illustrates the focus of the model on the use of evaluation information to shape goals, plans, and actions
9. Advantages of the CIPP Model Adapts well to carrying out evaluations on any scale (projects, programs, organizations)
An organizing framework, not a lockstep linear process
Sensitive to needs of decision makers
Systems approach – encourages a systems view of projects and programs
10. Building evaluative thinking and engagement: Utilization-focused evaluation approach Taking a utilization-focused approach means asking…
Why is this evaluation being undertaken?
What decisions need to be made with the results?
Who will be most affected by those decisions?
How can we engage those people in the entire evaluation process?
All participants in an evaluation should be clear about why the evaluation is being conducted; whether the results (or all of the results) will be shared publicly, internally only, or only with key decision-makers; and how the outcomes of the evaluation might affect them. Failing to do so will jeopardize your efforts to build a culture of assessment: it destroys good will for assessment efforts, contributes to a negative view of assessment, and increases the paranoia of any staff who already feel threatened by these efforts. (A minimal checklist sketch of these planning questions follows below.)
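As a hypothetical sketch (the function and field names are assumptions, not anything prescribed by Patton): the four utilization-focused questions, plus the results-sharing question raised in the note above, can serve as a pre-flight checklist that an evaluation plan should answer before work begins.

def open_questions(plan: dict) -> list:
    """Return the utilization-focused questions a draft plan leaves unanswered."""
    required = {
        "purpose": "Why is this evaluation being undertaken?",
        "decisions": "What decisions need to be made with the results?",
        "affected": "Who will be most affected by those decisions?",
        "engagement": "How can we engage those people in the entire evaluation process?",
        "sharing": "Who will see the results: the public, all staff, or only key decision-makers?",
    }
    # Any key that is missing or empty means its question is still open.
    return [question for key, question in required.items() if not plan.get(key)]

# A draft plan that has named its purpose and decisions but nothing else.
draft = {
    "purpose": "Decide the future of a chat reference pilot",
    "decisions": "Continue, modify, or end the pilot",
}
for question in open_questions(draft):
    print("Still open:", question)

Running the sketch on the draft above would flag the stakeholder, engagement, and sharing questions, which are exactly the ones whose neglect damages good will.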
11. Utilization-focused evaluation
Premise: engage stakeholders in the entire evaluation process, from design through implementation of recommendations. As a result:
Evaluation addresses the questions of greatest importance to those in a position to make direct use of its findings
It reduces the cultural barriers that can inhibit use of results, by increasing transparency and empowering stakeholders
12. Another advantage of the Utilization-focused approach “Process Use” benefits
First described by Patton - ‘ways in which being engaged in the processes of evaluation can be useful quite apart from the findings that may emerge from these processes’
Four types of Process Use
1. Enhancing shared understandings, especially about results;
2. Supporting and reinforcing the object of the evaluation through intervention-oriented evaluation;
3. Increasing participants’ engagement, sense of ownership
4. Organizational development (Patton, 1997, Utilization-focused evaluation: The new century text, 3rd ed., pp. 88-91)
13. Process Use & Culture of Assessment Increased capacity to make use of evaluation findings
Knowing how to use evaluation information produces better evaluation users in the organization, who can effectively “weigh evidence, consider contradictions and inconsistencies, articulate values, and examine assumptions” and who, through their experiences, interpret evidence, draw conclusions, and make judgments
14. Example: Evaluation of the Duke iPod experiment & the Duke Digital Initiative…
15. Summary Foster a culture of assessment by:
Adopting frameworks that support decision-making
Engaging staff as stakeholders in the entire process of evaluation, from design to implementation of recommendations
Leveraging the opportunity of Process Use to develop staff and make them more savvy evaluation consumers
16. Final Thoughts… “…evaluation's most important purpose is not to prove, but to improve.”
Daniel Stufflebeam
(CIPP Model)
“Research is aimed at truth. Evaluation is aimed at action.”
Michael Quinn Patton
(Utilization-focused Evaluation)
Michael Patton, former president of AEA and a leader in evaluation: “Research is aimed at truth. Evaluation is aimed at action.” Research efforts often focus on a particular variable and are often narrowly framed to answer only one question.
17. Thank You! Yvonne Belanger, Head, Program Evaluation
Academic Technology & Instructional Services
Perkins Library
Duke University
yvonne.belanger@duke.edu
18. References Stufflebeam, D. (1999). Foundational models for 21st century program evaluation.
Stufflebeam, D. (2003). The CIPP model for evaluation: An update, a review of the model's development, a checklist to guide implementation. Paper presented at the Oregon Program Evaluators Network Conference, Portland, OR. http://www.wmich.edu/evalctr/pubs/CIPP-ModelOregon10-03.pdf
Patton, M. Q. (2004). On evaluation use: Evaluative thinking and process use. The Evaluation Exchange, IX(4).
Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage.