Program Evaluation Screencast
Prepared by Mary Secret
Based on materials from the following sources:
Babbie, E. (2014). The Practice of Social Research (14th ed.). Boston, MA: Thomson Wadsworth.
Corcoran, J., & Secret, M. (2013). Social Work Research Skills Workbook. New York: Oxford University Press.
Engel, R. J., & Schutt, R. K. (2013). The Practice of Research in Social Work (3rd ed.). Thousand Oaks, CA: Sage.
ECHO 360 links:
https://ess.echo360.vcu.edu:8443/ess/echo/presentation/fff3175a-9600-4bda-91fa-a21a1fe32758?ec=true
https://ess.echo360.vcu.edu:8443/ess/echo/presentation/fff3175a-9600-4bda-91fa-a21a1fe32758
What is the purpose of program evaluation? • To investigate social programs • To assess the effectiveness of social policies and programs
Program Evaluation Prologue • … is not a specific activity or method that you can point to or associate with any particular step of the research process • … encompasses all aspects of research processes and methods
A major, comprehensive program evaluation can: • include experimental and non-experimental research designs, • use both qualitative and quantitative approaches, • collect data from secondary data sources or from interviews with participants, • use standardized or non-standardized measurement instruments, • and include both probability and nonprobability samples. • Whatever the approach, it must adhere to standard research ethics.
Program evaluation … is distinguished from other types of social science research not by the design, method, or approach … but by the underlying intent, the purposes that guide the evaluation process.
Question FIRST!!! The specific methods depend on the evaluation question of interest about a specific program, policy, or intervention. Questions to be answered: • Is the program needed? Do a needs assessment. • How does the program operate? Do a formative or process evaluation. • What is the program’s impact? Do a summative or outcome evaluation. • How efficient is the program? Do a cost-benefit or a cost-effectiveness analysis (a simple worked example follows).
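To make the efficiency question concrete, a cost-effectiveness analysis divides total program cost by the units of outcome achieved; the figures below are purely hypothetical and are offered only as a sketch of the arithmetic:
Cost-effectiveness ratio = total program cost ÷ units of outcome achieved
Example: a program that costs $120,000 and helps 60 clients obtain stable housing has a ratio of $120,000 ÷ 60 = $2,000 per client housed.
A cost-benefit analysis goes one step further and expresses the outcomes themselves in dollar terms, so that benefits can be compared directly with costs.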
The language of evaluation: Fill in the blank • The impact of the program; the intended result; the response variable; the dependent variable → Outcomes • The services delivered or new products produced by the program process → Outputs • Resources, raw materials, clients, and staff that go into a program → Inputs • The population for whom the program is designed → Target population • Individuals and groups who have some basis of concern with the program, often setting the research agenda and controlling research findings → Stakeholders • Information about service delivery system outputs, outcomes, or operations that is available to any program stakeholders → Feedback
What is a needs assessment? Systematic research on the needs of a target population, conducted for program planning purposes, that obtains information from: • Key informants: expert opinions from individuals who have special knowledge about the needs and about existing services. • Rates under treatment: secondary analysis of existing statistics to estimate the need for services based on the number and characteristics of clients who are already being served. • Social indicators: existing statistics that reflect conditions of an entire population, e.g., census data, Kids Count data. (Rubin, A., & Babbie, E. (2007). Essential Research Methods for Social Work. Belmont, CA: Brooks/Cole.)
What is a process or formative evaluation? It asks how you know whether or not the service was delivered in the manner intended, i.e., according to protocol or an evidence-based practice model. It must measure (collect data on) the independent variable, the intervention: what services were actually delivered, e.g., number of counseling sessions, hours of training, number of meetings, etc.
What is an outcome evaluation? Also known as an impact evaluation or summative evaluation, it is evaluation research that examines the effectiveness of the treatment or other service. • The program is the independent variable (treatment). • Outcomes are the dependent variables. • Experimental design is the preferred method for maximizing internal validity because of • random assignment into an experimental group and a control/comparison group, and • manipulation of the independent variable.
A closer look at experimental designs: research design notation • R = random assignment • O = observation, data collection • X = intervention or treatment
Classic experimental design: controls for the selection bias, history, maturation, and statistical regression threats to internal validity (diagrammed below).
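Using the notation introduced above, the classic (pretest-posttest control group) design is commonly diagrammed as follows, where O1 is the pretest and O2 the posttest:
R   O1   X   O2   (experimental group)
R   O1        O2   (control group)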
Quasi-experimental design: less control over threats to internal validity, including the possibility of selection bias, because participants are not randomly assigned to groups (diagrammed below).
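A typical quasi-experimental (nonequivalent comparison group) design keeps the pretest-posttest structure but drops random assignment:
O1   X   O2   (treatment group)
O1        O2   (comparison group)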
Pretest/posttest design (pre-experimental): least control over threats to internal validity, e.g., history and maturation (diagrammed below).
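With no comparison group at all, the one-group pretest/posttest design reduces to a single line of notation:
O1   X   O2
Any change from O1 to O2 could therefore reflect history or maturation rather than the intervention itself.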
What about measurement? • Many different types of measurement tools are used, depending on the intent and type of evaluation research.
Using multiple measures and several data collection strategies to evaluate the FACT program: • measures of the independent variable, • measures for process/implementation evaluation (what services are being delivered, by whom, and how?), • measures for outcomes and causal mechanisms (does the program cause change? how does change happen?), • measures for inputs and program efficiency.
LOGIC MODEL • An evaluation generates many pieces of information that must be organized and then interpreted. • A logic model offers a way in which this information can be organized.
What is the logic model? A schematic representation of the various components that make up a social service program. Logic models may describe: • theory and its link to change (theory approach model), where attention is on how and why a program works; • outcomes (outcome approach model), where the focus is on connecting resources and activities to expected changes; • activities (activities approach model), describing what the program actually does.
Logic model, outcomes example: program inputs → program processes → short-term outcomes (measured by the research; this is where the causal mechanism is identified) → long-term outcomes (difficult to measure).
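As a purely hypothetical illustration of this outcomes-type logic model, consider a job-training program: • Inputs: funding, trainers, curriculum, and enrolled clients. • Processes: twelve weekly job-skills workshops plus individual coaching. • Outputs: number of workshops delivered and number of clients completing the program. • Short-term outcomes (measured by the research): improved job-search skills and interview performance. • Long-term outcomes (difficult to measure): stable employment and increased household income.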
What’s an ‘evaluability’ assessment? Newly emergent programs that are not fully operational are not ready for, and indeed can be tarnished by, a summative evaluation geared to assessing program outcomes. HOW SO? An evaluability assessment is a systematic process that helps identify whether program evaluation is justified, feasible, and likely to provide useful information*. • It determines whether a program is ready for evaluation, whether a process evaluation, an outcome evaluation, or both. • Is the program able to produce the information required for a process evaluation? At what stage of implementation is the program? • Can the program meet the other criteria for beginning an outcome evaluation? • It determines whether a program has the basic foundation for an evaluation to take place. * Evaluability Assessment: Examining the Readiness of a Program for Evaluation. Juvenile Justice Evaluation Center, Justice Research and Statistics Association, Program Evaluation Briefing Series #6, May 2003, p. 6. http://www.jrsa.org/pubs/juv-justice/evaluability-assessment.pdf
Evaluability of a program is based on: • AN ESTABLISHED PROGRAM, with measurable outcomes; defined service components; an established recruiting, enrollment, and participation process; a good understanding of the characteristics of the target population, program participants, and program environment; the ability to collect and maintain information; and adequate program size. • RESEARCH-SAVVY SERVICE DELIVERY STAFF, with problem-solving values and skills; prior experience with evaluation; confidence in the program; commitment to ‘new knowledge’; and openness to change.