Evaluation as Continuous Improvement: The Health Disparities Service-Learning Collaborative. Suzanne B. Cashman, February 27, 2007
“Good evaluation” is nothing more than “good thinking.” It is the systematic collection of information about the activities, characteristics, and outcomes of programs, personnel, and products, used to reduce uncertainties, improve effectiveness, and inform decisions. Patton, 1997
Evaluation as Assessment/Improvement • Mechanism to tell the story • Becomes less of a burdensome add-on • Useful lessons • For yourself • For others
Why Evaluate? • Reduce uncertainties • Measure program achievement • Improve effectiveness • Demonstrate accountability • Make programmatic decisions • Build constituency • Influence policy
Program Evaluation • Commitment to following the “rules” of social research • But more than the application of methods… • also a political and managerial activity; input into the process of policy making and resource allocation for planning, designing, implementing, and continuing programs
Program Evaluation • Rooted in scientific methodology, but responsive to resource constraints, needs/purposes of stakeholders, and nature of evaluation setting
Key Questions • What is the aim of the assessment? • Who/What wants/needs the information? • What resources are available? • Who will conduct the assessment? • How can you ensure results are used?
Evaluation should: • Strengthen projects • Use multiple approaches • Address real issues • Create a participatory process • Allow for flexibility • Build capacity W.K. Kellogg Foundation, 1998
Evaluation (should tell us…) • What has been done • How well it has been done • How much has been done • How effective the work/program has been
Reasons to Evaluate • Measure program achievement • Demonstrate accountability • Examine resource needs and effectiveness • Improve operations, obtain and give feedback • Influence policy • Expand voices
Evaluation Framework (CDC) I. Engage Stakeholders II. Describe Program III. Focus the Design IV. Gather Credible Evidence V. Justify Conclusions VI. Ensure Use and Share Lessons Learned
I. Stakeholders • People who have a “stake” in what will be learned from an evaluation and what will be done with the knowledge • They include: • People who manage or work in the program/organization • People who are served or affected by the program, or who work in partnership with the program • People who are in a position to do or to decide something about the program CDC, 1998
Stakeholders • Stakeholders’ information needs and intended uses serve to focus the evaluation • Variety of stakeholders may mean: • more than one focus (policy implications vs documentation of local activities) • varied levels of involvement
Stakeholders • Who are your stakeholders? • How do their needs and desires differ from one another?
II. Describe Program • Need • Expectations • Activities • Context
Expectations • Outcome Objectives • statement of the amount of change expected for a given problem/condition for a specified population within a given timeframe • Process Objectives • statement regarding the amount of change expected in the performance or utilization of interventions that are related to the outcome
III. Focus the Design • Questions to answer • Process to follow • Methods to use • Activities to develop and implement • Results to disseminate
Clarify • Individual, Systems, or Community Level • Individual: individually targeted services or programs, often for people at high risk • Systems: change organizations, policies, laws, or structures • Community: focus is on community norms, attitudes, beliefs, practices
IV. Gather Credible Evidence • Types of data • Demographic, health status, expenditures, quality of life, eligibility, utilization, capacity • Sources of data • Statistical reports, published studies, voluntary organizations, program reports, media articles, government reports, state surveys
Thinking about data • Match the data to the questions – what kinds of information would be worthwhile? • As much as possible, use data that are being created as a regular part of the program • Collect and analyze data from multiple perspectives • Keep available resources in mind
Thinking about data (continued) • Where might we find them? • How might we obtain them? • What types should we consider? • What do we do now that we have them?
Who can help us collect and make sense of data? • Community partners • Student participants • College administrative offices • Faculty colleagues (and their students) • Students who participated in previous programs • Campus service-learning centers
Indicators of Well-being: Dimensions to Consider (Cohen, MDPH)
• Assets
• Traditional: Social indicators, Quality of life, Self-reports of health
• Less Traditional: Resiliency, Satisfaction, Resources & Investment
• Deficits
• Traditional: Disease, Utilization of medical services
• Less Traditional: Gaps among groups; Education, Economics, Cultural, and Safety deficits
Indicators of Well-being (continued)
• Assets
• Traditional: Use of prenatal care, Self-reported health, Screening rates, % insured, Graduation rate
• Less Traditional: Quality-adjusted life years, Social networks, Rescue response time, Support for needle exchange, Volunteerism
• Deficits
• Traditional: Age-adjusted death rate, Hospitalizations, Smoking prevalence
• Less Traditional: Lack of support for arts/culture, Crimes per capita
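Several of the traditional deficit indicators above are computed measures rather than raw counts. As one illustration, an age-adjusted death rate weights each age group's death rate by a standard population's age distribution so that communities with different age structures can be compared. The sketch below shows that standard direct-standardization arithmetic; all rates and weights are hypothetical and are not drawn from the presentation.

```python
# Direct standardization: weight each age group's death rate by the share
# of a standard population in that age group, then sum the results.
# All figures below are hypothetical, for illustration only.

age_groups = ["<25", "25-44", "45-64", "65+"]

# Crude death rates per 100,000 in the community being evaluated (hypothetical)
community_rates = {"<25": 60.0, "25-44": 150.0, "45-64": 700.0, "65+": 4500.0}

# Standard population age distribution (proportions summing to 1.0; hypothetical)
standard_weights = {"<25": 0.34, "25-44": 0.27, "45-64": 0.24, "65+": 0.15}

# Age-adjusted rate = sum of (group rate x standard weight)
age_adjusted_rate = sum(
    community_rates[g] * standard_weights[g] for g in age_groups
)

print(f"Age-adjusted death rate: {age_adjusted_rate:.1f} per 100,000")
```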
Specific Data Collection Methods • Surveys • Interviews • Focus groups • Literature search • Structured observations • Critical events log • Institutional documentation
Now that we have the data… • Analyze • Quantitative (statistical software) • Qualitative (systematic review and assessment) • Synthesize information • Follow framework of concepts • Write reports • Disseminate
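For the quantitative step, even a short script can turn raw numbers into something reportable. The sketch below is a minimal example, assuming hypothetical pre/post survey scores from the same program participants, and uses a paired t-test to ask whether the average change is larger than chance; the scores, scale, and variable names are illustrative and not part of the original evaluation.

```python
# Minimal quantitative analysis sketch: compare hypothetical pre/post
# survey scores from the same participants with a paired t-test.
from statistics import mean
from scipy import stats  # widely used statistical library

# Hypothetical scores (e.g., 1-10 self-rated confidence), one pair per participant
pre_scores = [4, 5, 3, 6, 5, 4, 7, 5]
post_scores = [6, 7, 5, 7, 6, 6, 8, 7]

# Average change gives the headline number for a report
avg_change = mean(post_scores) - mean(pre_scores)

# Paired t-test: are the paired differences larger than chance variation?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"Average change: {avg_change:.2f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

The qualitative data (interview notes, focus group transcripts, reflections) still require systematic review and coding; the point of the sketch is only that the quantitative piece need not be elaborate to be useful.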
V. Justify Conclusions • Review findings • What do they mean? How significant are they? • How do the findings compare to the objectives of the program? • What claims or recommendations are indicated?
VI. Ensure Use and Share Lessons • Through deliberate planning, preparation, and follow-up • Collaborate with stakeholders for meaningful: • Communication of results (process and outcome) • Decisions based on results • New assessment plans emerging from results • Reflection on the assessment process
Challenges • Important things difficult to measure • Complexity • Measurement validity • Time • Proof of causation • Need to be sensitive to context • Resources
Challenges • What are the challenges you face?
Summary: Characteristics of Evaluation • Evolving process • Variety of approaches • More than collecting and analyzing data • Critical design issues • Reconciles competing expectations • Recognizes and engages stakeholders
References • Bell R, Furco A, Ammon M, Muller P, Sorgen V. Institutionalizing Service-Learning in Higher Education. Berkeley: University of California. 2000. • Centers for Disease Control and Prevention. Practical Evaluation of Public Health Programs. 1998. • Kramer M. Make It Last: The Institutionalization of Service-Learning in America. Washington, DC: Corporation for National Service. 2000. • Patton M. Utilization-Focused Evaluation. Sage Publications. 1997. • W.K. Kellogg Foundation. Evaluation Handbook. 1998.