Class Meeting #5 Program Evaluation Methods I: Needs Assessment, Formative Evaluation, Process Evaluation, and Single System Research Designs (SSRDs)
Types of Program Evaluations “The type of evaluation you undertake to improve your programs depends on what you want to learn about the program.” McNamara, C. (1998). “Basic Guide to Program Evaluation.” Downloaded 1/5/06 from http://www.managementhelp.org/evaluatn/fnl_eval.htm.
Many Possibilities • Needs assessments • Accreditation • Cost/benefit analysis • Effectiveness • Efficiency • Goal-based • Process • Outcomes
Definitions • Inputs: resources needed to run the program • Process: how the program is carried out • Outputs: units of service • Outcomes: impacts on the customers or clients McNamara, C. (1998). “Basic Guide to Program Evaluation.” Downloaded 1/5/06 from http://www.managementhelp.org/evaluatn/fnl_eval.htm.
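These four terms line up as a simple logic model. Below is a minimal Python sketch for a hypothetical after-school tutoring program; every name and figure is invented for illustration, not drawn from a real program.

# A minimal logic-model sketch for a hypothetical after-school tutoring
# program; every name and figure below is invented for illustration.
logic_model = {
    "inputs":   ["2 tutors", "classroom space", "$15,000 grant"],
    "process":  ["recruit students", "run 1-hour sessions twice a week"],
    "outputs":  ["120 tutoring sessions delivered", "30 students served"],
    "outcomes": ["higher reading scores", "improved school attendance"],
}

# Print the model one component per line, showing how outputs
# (units of service) differ from outcomes (impacts on clients).
for component, items in logic_model.items():
    print(f"{component:>8}: {', '.join(items)}")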
Definition Needs Assessment: Decision-aiding tools used for resource allocation, program planning, and program development, based on the assumption that planned programming can alleviate distress and aid growth. (McKillip)
Four Types of Need • Normative – defined by an expert • Felt – ascertained by asking clients • Expressed – demonstrated through an actual demand for services • Comparative – inferred from services used by populations with similar characteristics
Approaches • Secondary data analysis • Impressionistic approach • Nominal groups • Delphi technique • Focus groups • Surveys • Convergent analysis • Multimethod approach
Considerations • Budget • Resources available • Amount of time to complete the evaluation • One-time or continuing process • Amount of detail or information desired
Persons of Interest • Sponsor: the agency and/or individual that authorizes the evaluation and provides the necessary fiscal resources • Stakeholders: individuals and/or groups who have a direct interest in the program being evaluated and may be affected by it or by the evaluation results • Clients: individuals who receive services from the program • Audience: individuals, groups, and/or agencies who have an interest in the evaluation and receive its results
Definitions • Social indicators – variables that help gauge the extent of social problems • Ecological fallacy – erroneously drawing conclusions about individuals from aggregate, group-level data such as social indicators • Key informants – people who are knowledgeable about a given problem through training or work experience and are willing to share what they know
Formative Evaluation • Purpose is to adjust and enhance interventions • In-progress examination of the program or intervention • No specific methodology or procedures (flexible implementation)
What it involves... “Formative Evaluation typically involves gathering information during the early stages of your project or program, with a focus on finding out whether your efforts are unfolding as planned, uncovering any obstacles, barriers or unexpected opportunities that may have emerged, and identifying mid-course adjustments and corrections which can help insure the success of your work.” NWREL http://www.nwrel.org/evaluation/formative.shtml
Approaches • Compare the program to model standards or similar standards • Bring in an “expert” to observe and give suggestions • Form an ad hoc committee Example: a Google search (http://www.google.com/) for “standards for good child care programs” turns up model standards to compare against.
Possible Questions • How were program goals established? • What is the status of the program’s progress toward achieving the goals? • Will the goals be achieved according to the timelines specified in the operational plan? • Do personnel have adequate resources? • What changes are needed?
Process Evaluation • Purpose is to understand how a program works and possibly reveal its strengths and weaknesses • Very useful for programs that have been operating for a long time • Possible focuses: • Program description • Program monitoring • Quality assurance
A Rose by any other name... Process evaluations can go by numerous other names... http://www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf
Possible Questions • How are employees trained? • How do customers or clients come into the program? • What is the general process that customers or clients go through with the product or program? • What do customers or clients consider to be strengths of the program? • What typical complaints are heard from employees and/or customers?
Program Description • Clarify the purpose • Develop a data collection plan • Identify who will be interviewed • Develop instruments • Conduct interviews • Examine documents and records • Analyze and integrate information • Prepare the report
Program Monitoring • Start with the goals and objectives • Compare them to data that is routinely collected • Patterns of use • Client utilization
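As a sketch of what that comparison can look like once the data are in hand, the Python snippet below checks monthly client counts against an objective's utilization target; the target and counts are hypothetical.

# Compare routinely collected utilization data against an objective's
# target; the target and monthly counts are invented for the example.
TARGET_CLIENTS_PER_MONTH = 40  # taken from the (hypothetical) objective

monthly_clients = {"Jan": 35, "Feb": 42, "Mar": 28, "Apr": 44}

for month, served in monthly_clients.items():
    gap = served - TARGET_CLIENTS_PER_MONTH
    status = "target met" if gap >= 0 else f"short by {-gap}"
    print(f"{month}: {served} clients served ({status})")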
Management Information Systems • “The Old-Fashioned Way” (paper and pencil) • Electronic data management • Manual entry • Electronic “dumps” (Most school systems now have MISs and test scores are electronically “dumped” into the system.)
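To illustrate what an electronic “dump” amounts to, here is a minimal Python sketch that loads a few invented test-score records into a SQLite table and runs one monitoring query; the table layout and records are assumptions for the example, not any real MIS schema.

import sqlite3

# Hypothetical rows as they might arrive in an electronic "dump" from a
# testing vendor: (student_id, test_name, score). Invented for the example.
dump = [
    (1001, "Reading", 78),
    (1002, "Reading", 85),
    (1001, "Math", 62),
    (1002, "Math", 71),
]

conn = sqlite3.connect(":memory:")  # a real MIS would use a persistent database
conn.execute("CREATE TABLE scores (student_id INTEGER, test_name TEXT, score INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?, ?)", dump)

# A simple monitoring query: average score per test.
for test_name, avg in conn.execute(
        "SELECT test_name, AVG(score) FROM scores GROUP BY test_name"):
    print(f"{test_name}: average {avg:.1f}")
conn.close()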
Knowing What to Count! For guidance, we go first to the organization’s: • Mission statement • Goals • Objectives
Mission Statement An organization’s statement of purpose, which provides a common vision for all stakeholders and a point of reference for all major planning decisions. http://www.pgcps.org/docs/mission.pdf
Goals and Objectives • Goals: general statements that specify an organization’s direction based on values, ideals, political mandates, and program purpose. • Objectives: specific and precise statements that describe what is to be accomplished and by what date. Objectives have a single aim and end product.
Writing Objectives A well-written objective has three parts: • Verb • Specific target • Date Example: “Increase (verb) average monthly attendance by 10% (specific target) by June 30 (date).”
Measurability • Difficult to measure: help, assist, understand, know, realize, discover, improve • Measurable: increase, add, decrease, reduce, advertise, publicize, start, create
Quality Assurance • Determining compliance with standards • Most human services agencies and programs have standards • “Utilization Review” • Is this program evaluation? http://www.iso.org/
Sample Standards National Association for the Education of Young Children Accreditation Performance Criteria http://www.naeyc.org/accreditation/performance_criteria/
Parallel in School Systems • Special Education • Diagnostic standards • Provision of services standards • Reimbursement from Medicaid
Basic Steps for SSRDs • Assess student (client) behavior • Operationalize the behavior(s) • Develop measurement tools • Measure behavior(s) • Choose design • Collect data • Analyze data
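To make the “collect data” and “analyze data” steps concrete, here is a minimal Python sketch of a simple AB (baseline-intervention) design analyzed with a two-standard-deviation band, a common SSRD decision rule; the behavior counts are invented.

from statistics import mean, stdev

# Hypothetical daily counts of a target behavior (e.g., classroom
# call-outs); the A phase is baseline, the B phase is the intervention.
baseline = [9, 8, 10, 9, 11]
intervention = [7, 6, 5, 4, 4, 3]

# Two-standard-deviation band around the baseline mean: intervention
# points falling outside the band suggest a meaningful change.
m, sd = mean(baseline), stdev(baseline)
lower, upper = m - 2 * sd, m + 2 * sd

outside = [x for x in intervention if not (lower <= x <= upper)]
print(f"Baseline mean {m:.1f}; band [{lower:.1f}, {upper:.1f}]")
print(f"Intervention mean {mean(intervention):.1f}; "
      f"{len(outside)} of {len(intervention)} points fall outside the band")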
Operationalize = Choose Variables • There are variables that predict things we cannot control • There are variables that predict things we can control • There are variables that do not predict
Interventions should focus on variables that predict things we can control!