
Evaluation as Continuous Improvement


Presentation Transcript


  1. Evaluation as Continuous Improvement. The Health Disparities Service-Learning Collaborative. Suzanne B. Cashman, February 27, 2007

  2. “Good evaluation” is nothing more than “good thinking.” It is the systematic collection of information about the activities, characteristics, and outcomes of programs, personnel, and products, used to reduce uncertainties, improve effectiveness, and inform decisions. (Patton, 1997)

  3. Evaluation as Assessment/Improvement • Mechanism to tell the story • Becomes less of a burdensome add-on • Yields useful lessons • For yourself • For others

  4. Why Evaluate? • Reduce uncertainties • Measure program achievement • Improve effectiveness • Demonstrate accountability • Make programmatic decisions • Build constituency • Influence policy

  5. Why are you engaged in evaluation?

  6. Comparison of Academic Research and Practical Evaluation

  7. Program Evaluation • Commitment to following the “rules” of social research • But more than the application of methods… • Also a political and managerial activity: input into the process of policy making and of allocating resources for planning, designing, implementing, and continuing programs

  8. Program Evaluation • Rooted in scientific methodology, but responsive to resource constraints, to the needs and purposes of stakeholders, and to the nature of the evaluation setting

  9. Key Questions • What is the aim of the assessment? • Who wants or needs the information? • What resources are available? • Who will conduct the assessment? • How can you ensure the results are used?

  10. Evaluation should: • Strengthen projects • Use multiple approaches • Address real issues • Create a participatory process • Allow for flexibility • Build capacity (W.K. Kellogg Foundation, 1998)

  11. Evaluation should tell us… • What has been done • How well it has been done • How much has been done • How effective the work/program has been

  12. Reasons to Evaluate • Measure program achievement • Demonstrate accountability • Examine resource needs and effectiveness • Improve operations, obtain and give feedback • Influence policy • Expand voices

  13. Evaluation Framework (CDC) I. Engage Stakeholders II. Describe Program III. Focus the Design IV. Gather Credible Evidence V. Justify Conclusions VI. Ensure Use and Share Lessons Learned

  14. I. Stakeholders • People who have a “stake” in what will be learned from an evaluation and what will be done with the knowledge • They include: • People who manage or work in the program/organization • People who are served or affected by the program, or who work in partnership with the program • People who are in a position to do or to decide something about the program (CDC, 1998)

  15. Stakeholders • Stakeholders’ information needs and intended uses serve to focus the evaluation • A variety of stakeholders may mean: • more than one focus (policy implications vs. documentation of local activities) • varied levels of involvement

  16. Stakeholders • Who are your stakeholders? • How do their needs and desires differ from one another?

  17. II. Describe Program • Need • Expectations • Activities • Context

  18. Expectations • Outcome Objectives • Statement of the amount of change expected in a given problem or condition, for a specified population, within a given timeframe • Process Objectives • Statement of the amount of change expected in the performance or utilization of the interventions that are related to the outcome
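
To make the distinction concrete: an outcome objective might read "reduce adult smoking prevalence in County X from 22% to 18% within three years," while a matching process objective might read "enroll 300 residents in cessation classes, up from 120, within one year." Below is a minimal Python sketch of tracking such objectives as structured data; all names and numbers are invented for illustration, not taken from the presentation.

```python
from dataclasses import dataclass

@dataclass
class Objective:
    """An evaluation objective: expected change, for a population, in a timeframe."""
    statement: str         # the objective in plain language
    baseline: float        # measured value at program start
    target: float          # value the program commits to reaching
    timeframe_months: int  # window in which the change is expected

    def percent_of_goal(self, current: float) -> float:
        """Share of the planned change achieved so far, as a percentage."""
        planned_change = self.target - self.baseline
        return 100.0 * (current - self.baseline) / planned_change

# Hypothetical outcome objective: change in the problem/condition itself
outcome = Objective(
    statement="Reduce adult smoking prevalence in County X from 22% to 18% in 3 years",
    baseline=22.0, target=18.0, timeframe_months=36,
)

# Hypothetical process objective: change in delivery/utilization of the intervention
process = Objective(
    statement="Enroll 300 residents in cessation classes (up from 120) in 1 year",
    baseline=120, target=300, timeframe_months=12,
)

print(f"outcome: {outcome.percent_of_goal(current=20.0):.0f}% of planned change")  # 50%
print(f"process: {process.percent_of_goal(current=210):.0f}% of planned change")   # 50%
```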

  19. III. Focus the Design • Questions to answer • Process to follow • Methods to use • Activities to develop and implement • Results to disseminate

  20. Clarify • Individual, Systems, or Community Level • Individual: individually targeted services or programs, often for people at high risk • Systems: change organizations, policies, laws, or structures • Community: focus is on community norms, attitudes, beliefs, practices

  21. IV. Gather Credible Evidence • Types of data • Demographic, health status, expenditures, quality of life, eligibility, utilization, capacity • Sources of data • Statistical reports, published studies, voluntary organizations, program reports, media articles, government reports, state surveys

  22. Thinking about data • Match the data to the questions – what kinds of information would be worthwhile? • As much as possible, use data that are being created as a regular part of the program • Collect and analyze data from multiple perspectives • Keep available resources in mind

  23. Thinking about data (continued) • Where might we find them? • How might we obtain them? • What types should we consider? • What do we do now that we have them?

  24. Who can help us collect and make sense of data? • Community partners • Student participants • College administrative offices • Faculty colleagues (and their students) • Students who participated in previous programs • Campus service-learning centers

  25. Indicators of Well-being: Dimensions to Consider (Cohen, MDPH)
    • Assets (traditional): social indicators, quality of life, self-reports of health
    • Assets (less traditional): resiliency, satisfaction, resources & investment
    • Deficits (traditional): disease, utilization of medical services
    • Deficits (less traditional): gaps among groups; education, economic, cultural, and safety deficits

  26. Indicators of Well-being (continued)
    • Assets (traditional): use of pre-natal care, self-reported health, screening rates, % insured, graduation rate
    • Assets (less traditional): quality-adjusted life years, social networks, rescue response time, support for needle exchange, volunteerism
    • Deficits (traditional): age-adjusted death rate, hospitalizations, smoking prevalence
    • Deficits (less traditional): lack of support for arts/culture, crimes per capita
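
One of the traditional deficit indicators above, the age-adjusted death rate, is conventionally computed by direct standardization: weight each age group's death rate by that group's share of a standard population, so that communities with different age structures can be compared fairly. A minimal Python sketch; every number below is invented for illustration.

```python
# Direct standardization: weight each age group's death rate by the
# standard population's share of that group. All numbers are made up.
age_groups  = ["<25", "25-44", "45-64", "65+"]
deaths      = [30,    90,     400,    2200]    # observed deaths per group
population  = [40000, 55000,  45000,  20000]   # community population per group
std_weights = [0.33,  0.27,   0.25,   0.15]    # standard population shares (sum to 1)

# Age-specific rates per 100,000, then their weighted sum
rates = [100_000 * d / p for d, p in zip(deaths, population)]
adjusted = sum(w * r for w, r in zip(std_weights, rates))
crude = 100_000 * sum(deaths) / sum(population)

for group, rate in zip(age_groups, rates):
    print(f"{group:>6}: {rate:.0f} per 100,000")
print(f"crude rate:        {crude:.0f} per 100,000")
print(f"age-adjusted rate: {adjusted:.0f} per 100,000")
```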

  27. Specific Data Collection Methods • Surveys • Interviews • Focus groups • Literature search • Structured observations • Critical events log • Institutional documentation

  28. Now that we have the data… • Analyze • Quantitative (statistical software) • Qualitative (systematic review and assessment) • Synthesize information • Follow framework of concepts • Write reports • Disseminate
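
For the quantitative step, even a short script goes a long way. A minimal sketch of one common analysis, assuming paired pre/post participant survey scores and that scipy is available; the scores are invented for illustration.

```python
from statistics import mean
from scipy import stats  # assumes scipy is installed

# Invented pre/post knowledge scores for the same eight participants
pre  = [12, 15, 11, 14, 10, 13, 16, 12]
post = [16, 18, 13, 17, 15, 14, 19, 15]

# Paired t-test: is the average change larger than chance alone would explain?
result = stats.ttest_rel(post, pre)
print(f"mean change: {mean(post) - mean(pre):+.1f} points")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```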

  29. V. Justify Conclusions • Review findings • What do they mean? How significant are they? • How do the findings compare to the objectives of the program? • What claims or recommendations are indicated?

  30. VI. Ensure Use and Share Lessons • Through deliberate planning, preparation, and follow-up • Collaborate with stakeholders for meaningful: • Communication of results (process and outcome) • Decisions based on results • New assessment plans emerging from results • Reflection on the assessment process

  31. Challenges • Important things are difficult to measure • Complexity • Measurement validity • Time • Proof of causation • Need to be sensitive to context • Resources

  32. Challenges • What are the challenges you face?

  33. Summary: Characteristics of Evaluation • Evolving process • Variety of approaches • More than collecting and analyzing data • Critical design issues • Reconciles competing expectations • Recognizes and engages stakeholders

  34. References • Bell R, Furco A, Ammon M, Muller P, Sorgen V. Institutionalizing Service-Learning in Higher Education. Berkeley: University of California. 2000. • Centers for Disease Control and Prevention. Practical Evaluation of Public Health Programs. 1998. • Kramer M. Make It Last: The Institutionalization of Service-Learning in America. Washington, DC: Corporation for National Service. 2000. • Patton M. Utilization-Focused Evaluation. Sage Publications. 1997. • W.K. Kellogg Foundation. Evaluation Handbook. 1998.
