Developing An Evaluation Plan For TB Control Programs Division of Tuberculosis Elimination National Center for HIV, STD, and TB Prevention Centers for Disease Control and Prevention
Developing An Evaluation Plan For TB Control Programs Reference: A Guide to Developing an Evaluation Plan
Why Develop an Evaluation Plan? • Provides a cohesive approach to conducting evaluation and using the results • Guides evaluation activities • Explains what, when, how, why, who • Documents the evaluation process for all stakeholders • Ensures implementation fidelity
Guide to Developing An Evaluation Plan • Document referenced throughout presentation • Provides a template and instructions to help TB program staff develop an evaluation plan • Steps to evaluation are explained in detail • Completing sections and tables will result in an evaluation plan
The CDC Program Evaluation Framework • Systematic method for evaluation • Based on research and experience • Flexible and adaptable • Promotes a participatory approach • Focuses on using evaluation findings
Sections of an Evaluation Plan • Introduction • Stakeholder Assessment • Step 1: Engage Stakeholders • Background and Description of the TB Program and Program Logic Model • Step 2: Describe the Program • Focus of the Evaluation • Step 3: Focus the Evaluation Design
Sections of an Evaluation Plan • Gathering Credible Evidence: Data Collection • Step 4: Gather Credible Evidence • Justifying Conclusions: Analysis and Interpretation • Step 5: Justify Conclusions • Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination • Step 6: Ensure Use and Share Lessons Learned
Introduction An introduction provides background information, identifies the purpose of the evaluation, and provides a roadmap of the plan. • Evaluation Goal • What is the purpose of the evaluation? • Evaluation Team • Who is your evaluation coordinator? • Who are the members of your evaluation team? Reference: Table 1 in the Evaluation Plan Guide
Stakeholder Assessment Stakeholders are individuals with vested interests in the success of the TB program. Involving stakeholders increases the credibility of the evaluation and ensures that findings are used as intended. • Who are the stakeholders in your TB program? • What are their interests in the evaluation? • What role do they play in the evaluation? • How do you plan to engage the stakeholders? Reference: Table 2 in the Evaluation Plan Guide
Background and Description of the TB Program The program description ensures that stakeholders have a shared understanding of the program and identifies any unfounded assumptions and gaps. • Need • What problem does your program address? • What are the causes and consequences of the problem? • What is the magnitude of the problem? • What changes or trends impact the problem?
Background and Description • Context • What environmental factors affect your program? • Target Population • Does your program target the TB concerns of one population? • Program Objectives • What objectives have been set for your program? • Stage of Development • Is this a new initiative, or is it well established?
Background and Description • Resources • What resources are available to conduct the program activities? • Activities • What are program staff doing to accomplish program objectives? • Outputs • What are the direct and immediate results of program activities (materials produced, services delivered, etc.)? • Outcomes • What are the intended effects of the program activities? Reference: Table 3 in the Evaluation Plan Guide
Program Logic Model A logic model is a graphic depiction of the program description. • Arrows describe the links between resources, activities, outputs, and outcomes • A logic model • Provides a sense of the scope of your program • Ensures that systematic decisions are made about what is to be measured • Helps to identify and organize indicators
Contact Investigation Logic Model
Goal: Prevent TB among contacts to cases (by finding and testing contacts for TB and LTBI, and then treating infected contacts to completion).
A. Inputs
• Adequate infrastructure: qualified, trained, and motivated staff; community and congregate-setting partnerships; policies, procedures, and guidelines; ongoing data collection, monitoring, and reporting systems; adequate physical, diagnostic, and treatment resources; linkages between jurisdictions; adequate data collection tools; partnerships with private providers
• Comprehensive interview tool
• Staff trained in interview techniques
• Legal mandate to collect contact information from congregate settings
B. Activities
• Interview/reinterview cases: build rapport; provide education; obtain information about the source case and contacts
• Locate and evaluate contacts: follow-up, education, examination and testing
• Offer treatment
• Treat contacts through case management (DOT/DOPT/incentives)
• Reporting
• Monitor: data collection, data management, data analysis, data dissemination
• Conduct periodic review of cases/contacts and progress toward contact treatment goals
C. Short-term Outcomes
• Cases identify contacts
• Contacts educated
• Contacts followed up
• Contacts evaluated
• Contacts start treatment
• Evidence-based decisions about continuation or termination of the contact investigation
D. Intermediate Outcomes
• Contacts complete appropriate treatment for active TB or LTBI
• Improved approaches for contact investigation
E. Long-term Outcomes
• Active TB cured in contacts; TB prevented in contacts with LTBI
• Reduced incidence and prevalence of TB
• TB eliminated
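To make the logic model concrete for planning purposes, the sketch below shows one possible way (not taken from the Guide) to encode the model's components and tie each indicator back to the component it measures. All structures and names here, including `logic_model` and `indicators`, are hypothetical illustrations.

```python
# Illustrative sketch only: encoding the contact investigation logic model
# as a data structure so each indicator can be traced to the model
# component it measures. All names are hypothetical.
logic_model = {
    "inputs": ["Adequate infrastructure", "Comprehensive interview tool",
               "Staff trained in interview techniques"],
    "activities": ["Interview/reinterview cases", "Locate and evaluate contacts",
                   "Offer treatment", "Treat contacts (DOT/DOPT/incentives)"],
    "short_term_outcomes": ["Cases identify contacts", "Contacts evaluated",
                            "Contacts start treatment"],
    "intermediate_outcomes": ["Contacts complete appropriate treatment"],
    "long_term_outcomes": ["TB prevented in contacts with LTBI",
                           "Reduced incidence and prevalence of TB"],
}

# Each evaluation indicator points at the model component it measures.
indicators = [
    {"indicator": "Proportion of contacts who complete treatment",
     "component": "intermediate_outcomes"},
]

for ind in indicators:
    assert ind["component"] in logic_model, f"Unknown component: {ind['component']}"
    print(f"{ind['indicator']} -> measures {ind['component']}")
```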
Focus of the Evaluation Since you cannot feasibly evaluate everything, you must focus the evaluation by prioritizing and selecting evaluation questions. • Stakeholder Needs • Who will use the evaluation findings? • How will the findings be used? • What do stakeholders need to learn/know from the evaluation?
Focus of the Evaluation • Process Evaluation • What resources were required? • What program activities were accomplished? • Were they implemented as planned? • Outcome Evaluation • Is the program producing the intended outcomes? • Is there progress toward program objectives and goals?
Focus of the Evaluation • Evaluation Questions • Based on the needs of your stakeholders • Address process and outcome • Assess Your Questions • Are the needed data feasible to collect? • Will the answers provide accurate results?
Focus of the Evaluation • Key Issues in Evaluation Design • Will you have a comparison or control group? • When will you collect data? • Will the data be collected retrospectively or prospectively? • What type of data do you need? • What data do you have already?
Focus of the Evaluation • Other Design Considerations • Standards for “good” evaluation • Timeliness • Stage of development • Data needed • Strengthen Your Design • Mix methods whenever possible • Use repeated measures • Triangulate
Gathering Credible Evidence: Data Collection Identify indicators, standards, and data sources to address evaluation questions. • Indicators • Visible, measurable signs of program performance • Reflect program objectives, logic model and evaluation questions • Program Benchmarks and Targets • Reasonable expectations of program performance • Benchmarks against which to measure performance Reference: Table 4 in your Evaluation Plan Guide
Gathering Credible Evidence: Data Collection Linking evaluation questions, indicators, and program benchmarks. Example from the Guide – Table 4.
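As a hedged illustration of how an indicator value is compared against a program benchmark, the sketch below computes a hypothetical treatment completion proportion against an invented target. None of the figures are actual program data or national objectives.

```python
# Hypothetical worked example: comparing an indicator value to a program
# target. The numbers below are invented for illustration.
contacts_starting_treatment = 120
contacts_completing_treatment = 84

indicator_value = contacts_completing_treatment / contacts_starting_treatment
target = 0.75  # hypothetical program target, not a national objective

print(f"Treatment completion: {indicator_value:.0%} (target {target:.0%})")
if indicator_value >= target:
    print("Target met")
else:
    shortfall = target - indicator_value
    print(f"Target not met; shortfall of {shortfall:.0%}")
```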
Gathering Credible Evidence: Data Collection • Data Collection • Where are the data? • What methods will be used to collect data? • How often will the data be collected? • Who will collect the data? • Tools for Data Collection • Collect only the information you need • Easy to administer and use Reference: Table 5 in your Evaluation Plan Guide
Gathering Credible Evidence: Data Collection Linking indicators to data sources and specifying your data collection plan. Example from the Guide – Table 5.
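One way a program might capture a Table 5-style data collection plan in a reviewable form is sketched below. The `CollectionPlanRow` structure and every entry in it are hypothetical placeholders, not part of the Guide.

```python
# Sketch of a Table 5-style data collection plan held as structured records,
# so the plan can be checked for completeness. All entries are hypothetical.
from dataclasses import dataclass

@dataclass
class CollectionPlanRow:
    indicator: str
    data_source: str
    method: str
    frequency: str
    responsible: str

plan = [
    CollectionPlanRow(
        indicator="Proportion of contacts evaluated",
        data_source="Contact investigation records",
        method="Record abstraction",
        frequency="Quarterly",
        responsible="TB program evaluator",
    ),
]

# A simple completeness check: every row must have every field filled in.
for row in plan:
    missing = [field for field, value in vars(row).items() if not value]
    assert not missing, f"Incomplete plan row: {missing}"
```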
Gathering Credible Evidence: Data Collection • Human Subjects Considerations • Evaluation Timeline • Ensures that all stakeholders are aware of what activities are occurring at any time • Helps to determine if your evaluation resources will be strained by too many activities happening at once • Data Management and Storage • Ensures confidentiality and data quality Reference: Table 6 in your Evaluation Plan Guide
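The small sketch below illustrates, under invented assumptions, how a team might scan a month-indexed timeline for periods where evaluation activities pile up and resources may be strained. The activities and dates are hypothetical.

```python
# Sketch of a timeline sanity check over a hypothetical month-indexed plan.
# Flags months where multiple evaluation activities overlap.
from collections import Counter

activities = {
    "Chart abstraction": range(1, 4),   # months 1-3 (hypothetical)
    "Staff interviews": range(2, 5),    # months 2-4
    "Data analysis": range(4, 7),       # months 4-6
}

load = Counter(month for months in activities.values() for month in months)
for month in sorted(load):
    flag = "  <- check staffing" if load[month] > 1 else ""
    print(f"Month {month}: {load[month]} concurrent activities{flag}")
```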
Justifying Conclusions:Analysis and Interpretation Once the data are collected, analysis and interpretation will help you understand what the findings mean for your program. • Analysis • What analysis techniques will you use for each data collection method? • Who is responsible for analysis? • Interpretation • What conclusions will you draw from your findings? • How will you involve stakeholders? Reference: Table 7 in your Evaluation Plan Guide
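To illustrate the analysis step, the sketch below summarizes invented record-level contact data into the kinds of proportions an evaluation team might interpret with stakeholders. It is a minimal sketch, not a prescribed analysis method from the Guide.

```python
# Illustrative analysis step: turning hypothetical record-level contact data
# into summary proportions for interpretation. Records are invented.
contacts = [
    {"id": 1, "evaluated": True,  "started_treatment": True,  "completed": True},
    {"id": 2, "evaluated": True,  "started_treatment": True,  "completed": False},
    {"id": 3, "evaluated": True,  "started_treatment": False, "completed": False},
    {"id": 4, "evaluated": False, "started_treatment": False, "completed": False},
]

def proportion(records, numerator, denominator=None):
    """Share of records meeting `numerator`, among those meeting `denominator`."""
    pool = [r for r in records if denominator is None or r[denominator]]
    return sum(r[numerator] for r in pool) / len(pool) if pool else 0.0

print(f"Evaluated: {proportion(contacts, 'evaluated'):.0%}")
print(f"Started treatment (of evaluated): "
      f"{proportion(contacts, 'started_treatment', 'evaluated'):.0%}")
print(f"Completed (of those starting): "
      f"{proportion(contacts, 'completed', 'started_treatment'):.0%}")
```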
Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination A plan for dissemination and use of the evaluation findings will avoid having evaluation reports “sit on the shelf.” • Dissemination • What medium will you use to disseminate findings? • Who is responsible for dissemination? • Use • How, where, and when will findings be used? • Who will act on the findings? Reference: Table 8 in your Evaluation Plan Guide
Tips for Evaluation Planning • Start small – focus on one initiative or program component to start with and limit the number of evaluation questions • Use what you already know about the program • Consider existing sources of data • Be realistic in your timeline and assessment of resources • Use the template and tables provided in the guide, adapt as needed • Seek help with your evaluation
Evaluation Resources Some Web-Based Resources • Centers for Disease Control and Prevention: http://www.cdc.gov/eval/ • W.K. Kellogg Foundation: http://www.wkkf.org/Publications/evalhdbk/ • University of Wisconsin Extension: http://www.uwex.edu/ces/pdante/evaluat.htm/ Selected Publications • Connell JP, Kubisch AC, Schorr LB, Weiss CH. New Approaches to Evaluating Community Initiatives. New York, NY: Aspen Institute, 1995. • Patton MQ. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications, 1997. • Rossi PH, Freeman HE, Lipsey MW. Evaluation: A Systematic Approach. Newbury Park, CA: Sage Publications, 1999. • Taylor-Powell E, Steele S, Douglas M. Planning a Program Evaluation. Madison, WI: University of Wisconsin Cooperative Extension, 1996.