Creating a Program Evaluation Plan and Utilizing Data to Drive Programs David B. Buller, Ph.D. Klein Buendel, Inc. January 24, 2007
What is Program Evaluation? • The ongoing systematic collection of information on the purpose, process and outcomes of a program. • Information from program evaluation is used to: • identify a health problem; • create or select a program to address the problem; • judge the effectiveness of the program; • make changes to the program to improve it; and • provide accountability for the program.
Important Considerations in Program Evaluation • Practicality • Feasibility • Ethics • Accuracy • Resource and time constraints • Political, organizational, and personal contexts
Surveillance • Continuous, routine collection of information from a population on a defined set of behaviors. • Examples: Behavioral Risk Factor Surveillance System (BRFSS); Youth Risk Behavior Survey (YRBS); Monitoring the Future Study; National College Health Assessment; Harvard College Alcohol Study.
Surveillance • Can be very useful for program planning: helps identify health risks and problems; describes the populations most at risk. • Potentially useful for evaluation of program outcomes: program objectives can target behaviors measured by surveillance systems. • Surveillance systems are often inflexible, making them less useful for evaluation of outcomes.
Phases of Program Evaluation • Program Planning • Program Evaluation
Steps in Program Planning • Document need for program and people in need. • Describe context and identify resources available for program. • Lay out and prioritize program goals. • Design activities for reaching and assisting people in need and fulfilling goals. • Engage important stakeholders.
Program Planning: Describing Need, Context & Resources • Profile college community. • Describe impact of risk behavior. • Identify present and past programs on campus addressing risk behavior.
Program Planning: Describing Need, Context & Resources • Prioritize college community needs. • Define target population: students, parents, college community, local community, society. • Document assets and barriers for program.
Program Planning: Describing Need, Context & Resources • Data sources: surveillance systems; formal policy documents; previous studies, reports, publicity materials, and meeting minutes; public records/campus records; “interviews” with campus and community stakeholders; “interviews” with campus and community members; and environmental scans of campus & community.
Program Planning: Describing Need, Context & Resources • Surveillance systems: Whatthehealth
Program Planning: Describing Need, Context & Resources • Surveillance systems: BRFSS; YRBS; PRAMS; Monitoring the Future survey; disease registries; vital statistics from state & county health departments; NHIS & HINTS; specialized disease surveys (ATS, YTS); National College Health Assessment; code enforcement/violations databases.
Program Planning: Describing Need, Context & Resources • Useful information: demographics; relevant organizations; media outlets (audience & usage); laws & regulations; prior/existing program materials; and health risk behavior (prevalence & patterns).
Program Planning: Describing Need, Context & Resources • Interpreting data: consider data quality (objective vs. subjective; time of collection; number and selection of respondents; phrasing of questions). • Summarize key points; don't get lost in details. • Obtain group input.
Program Planning: Setting Goals • Goals: the intended consequences of the program. • Set SMART goals: Specific; Measurable (observable); Achievable (experience & resources; chance of success); Relevant; Time-bound.
Program Planning: Setting Goals • Prioritize the order in which you will address goals. • State objectives to reach goals. • Process objectives (e.g., set up our display at all on-campus health events). • Outcome objectives: short-term: increase students' awareness of DUI risks; intermediate: increase use of designated drivers; long-term: decrease arrests of students for DUI.
Program Planning: Setting Goals • Select indicators to measure goals: specific, observable, and measurable characteristics, changes, or actions. • Examples: number of campus health events where the program has a display; number of students who attend university health events; number of students who report using a designated driver in a campus survey; number of students arrested for DUI by campus police.
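For a concrete sense of how such indicators can be tallied, here is a minimal Python sketch. It assumes a hypothetical survey export named campus_survey.csv with a made-up used_designated_driver column; adapt the names to your own data.

# Minimal sketch: count an outcome indicator from a campus survey export.
# File name and column name are hypothetical examples.
import csv

def count_designated_driver_users(path="campus_survey.csv"):
    """Return (students reporting designated-driver use, total respondents)."""
    users = 0
    total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if row.get("used_designated_driver", "").strip().lower() == "yes":
                users += 1
    return users, total

if __name__ == "__main__":
    users, total = count_designated_driver_users()
    print(f"{users} of {total} respondents report using a designated driver.")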
Program Planning: Specifying Activities • Define activities. • Link activities to each objective. • Describe how activities relate to one another. • Predict campus/community response to activities. • Be specific. Example: We will send a letter to the presidents of all fraternities and sororities that describes how to use designated drivers at their parties.
Program Planning: Specifying Activities • Select activities that: will accomplish objectives; will not have unintended negative side effects; complement existing programs; have been successful in the past; and can be measured and tracked.
Program Planning: Involving Stakeholders • Stakeholders are people who are invested in the program, who are interested and/or engaged in program activities, and who have a stake in the program's success or failure. • Stakeholders: improve program implementation; increase support for the program; and boost program credibility.
Program Planning: Involving Stakeholders • Examples of stakeholders: students & parents; college administrators & staff; campus organizations' staff and clients; community & business leaders; representatives of advocacy groups; national and state health agencies; critics; funders.
Steps in Program Evaluation • Monitor progress toward program goals. • Describe program activities. • Demonstrate that program components and the program as a whole are effective. • Make comparisons between groups of people in need. • Justify existing and future funding and resources. • Identify weaknesses, make changes, and improve program outcomes. • Ensure that effective programs are maintained and ineffective programs do not waste resources.
Program Evaluation: Planning • Keep program staff focused on evaluation. • Prepare stakeholders for results. • Be sure the evaluation is useful, feasible, ethical, and accurate.
Program Evaluation: Planning • Conduct a SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats). • Consider expertise and assign tasks to program staff. • Identify needed resources and expertise and take action to acquire them.
Program Evaluation: Planning • Identify existing data on indicators and develop affordable procedures for collecting indicators. • Be concerned with privacy. • Anticipate challenges to conducting the evaluation.
Program Evaluation: Monitoring Progress • Assign someone to take primary responsibility for program evaluation. • Establish clear, concrete, routine procedures to obtain information on indicators. Can data be gathered as planned? Is the information of sufficient quality? • Allow time at regular meetings to update staff on the progress of program evaluation.
Program Evaluation: Monitoring Progress • Provide preliminary results to program staff and stakeholders to maintain their interest and help them see value in evaluation activities. Are there program or evaluation problems that can be addressed immediately? How comfortable are we with findings as they emerge? How should we plan to communicate findings to stakeholders?
Program Evaluation: Describing Activities • Don't overlook measuring program activities. • Be sensitive: measures of activities can be threatening to program staff. • Description of activities is essential for determining which components work, which need modification, and which should be discarded.
Program Evaluation: Describing Activities • Record implementation by program staff: meeting notes, contact records, ratings of presentation quality, impressions of student reactions. • Measure exposure and use by students/administrators/community members: surveys of students, return postcards, website use. • Try to identify unanticipated modifications to program activities.
Program Evaluation: Demonstrating Success • The key to showing success is "change," and seeing change requires time. • Obtain information on outcome indicators before and after the program; the time period should be matched to objectives. • "Before" information often can be obtained during program planning. • Relying only on "after" information makes conclusions about success difficult.
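To make the before/after logic concrete, here is a short Python illustration with invented numbers for one indicator (the share of students reporting designated-driver use); it simply expresses the change from baseline to follow-up.

# Before/after change for a single outcome indicator (numbers are invented).
def percent_change(before, after):
    """Relative change from baseline to follow-up, as a percentage."""
    return (after - before) / before * 100

baseline = 0.22   # hypothetical: 22% report designated-driver use before the program
followup = 0.31   # hypothetical: 31% report designated-driver use after the program
print(f"Relative change: {percent_change(baseline, followup):.1f}%")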
Program Evaluation: Comparing Groups • Program evaluation can be improved by comparing groups within the target population. • Sometimes a program works better for one group than another (e.g., light rather than heavy drinkers). • Such comparisons can suggest program features in need of improvement and aid decisions about program maintenance.
Program Evaluation: Comparing Groups • Program evaluations that compare a group that received the program to a group that did not can provide stronger evidence of success. • Many things other than the program can change a single group of people. • Having a group that doesn't receive the program (called a "control" group by researchers) helps show that the program is successful apart from these other things. • Look for greater change in students receiving the program than in students not receiving it.
Program Evaluation: Comparing Groups • The group not receiving the program can be measured at the same time (best option). Example: deliver the designated-driver program to half of the dorms and measure students in all dorms. • Groups not receiving the program can also be measured before or after the program. Example: measure students in all dorms at the beginning and end of fall semester, then deliver the designated-driver program in the spring and measure students in all dorms at the end of spring semester.
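The slides do not prescribe a specific calculation, but one common way to summarize such a comparison is to subtract the change in the comparison dorms from the change in the program dorms (a difference-in-differences style estimate). The Python sketch below uses invented numbers purely as an illustration.

# Difference-in-differences illustration: program dorms vs. comparison dorms.
# All figures are hypothetical shares of students reporting designated-driver use.
program = {"before": 0.20, "after": 0.32}      # dorms that received the program
comparison = {"before": 0.21, "after": 0.24}   # dorms that did not

program_change = program["after"] - program["before"]
comparison_change = comparison["after"] - comparison["before"]
net_effect = program_change - comparison_change

print(f"Program dorms changed by {program_change:+.2f}")
print(f"Comparison dorms changed by {comparison_change:+.2f}")
print(f"Estimated net program effect: {net_effect:+.2f}")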
Program Evaluation: Justifying Support • Use findings of program evaluation to show the benefits to stakeholders on campus and in the community. • Make group presentations and consider one-on-one discussions to clarify outcomes and recommendations. • May need to educate stakeholders about the program and its evaluation.
Program Evaluation: Justifying Support • Keys to high-quality communication: summarize the program, goals, objectives & evaluation; present clear, succinct results; make recommendations for the program and discuss advantages and disadvantages; remove technical jargon; use a neutral tone & visual aids.
Program Evaluation: Re-assessing Program • Use findings to assess the successes and failures of the program. • Engage staff in critical appraisal of program processes and outcomes using evaluation data; this helps build enthusiasm and commitment. • Reconsider all aspects of program planning with reference to the findings.
Program Evaluation: Re-assessing Program • Don't be afraid to admit to mistakes. • Maintain components and programs that achieve objectives. • Discard or modify components and programs that don't achieve objectives. • Necessary changes should be apparent when careful program planning and evaluation have occurred.
Program Evaluation: Re-assessing Program • Don't rush to discredit or "cherry pick" the findings. • Start by accepting the findings and considering how you can improve the program. • Don't accept the findings you like and distrust the ones you don't like; if the evaluation is done well, you should place equal faith in all findings. • Apply the same scrutiny for evaluation problems to findings of success and findings of failure.
Summary • Program evaluation improves practice. • Program evaluation is proactive: do it now to improve your success; don't wait for a stakeholder to ask for it. • Program evaluation involves careful planning and ongoing information collection.
Summary • Program evaluation is affordable: there are stakeholders who can help; you can build in information collection; your campus has many existing resources. • Program evaluation has big payoffs: staff enthusiasm & commitment; continued funding and support; improved health on campus. • Program evaluation is for you.
Dave B. Buller, Ph.D. • Klein Buendel, Inc. • 303.565.4340 • dbuller@kleinbuendel.com