
Review: Alternative Approaches II

This review explores alternative evaluation approaches, guidelines for planning evaluations, and practical applications for your evaluation plan. It covers understanding reasons for initiating evaluations, different uses of evaluations, and conditions under which evaluation studies may be inappropriate. It also delves into determining appropriateness, methods for evaluation, and who should be involved in the evaluation process. The text emphasizes the importance of clarity in evaluation requests and responsibilities, identifying stakeholders, and prioritizing the study's intended uses. Various informational and noninformational uses of evaluations are discussed, along with cautionary considerations for evaluating programs. The review provides valuable insights into conducting evaluations effectively and ethically.


Presentation Transcript


  1. Review: Alternative Approaches II • What three approaches did we last cover? • Describe one benefit of each approach • Which approach focuses on the marginalized? • What were the five cautions the authors shared about the alternative approaches to evaluation?

  2. Guidelines for Planning Evaluations: Clarifying the Evaluation Request and Responsibilities Dr. Suzan Ayers Western Michigan University (courtesy of Dr. Mary Schutten)

  3. Individuals who Affect or are Affected by an Evaluation Study • Sponsor: authorizes the evaluation, provides resources for its conduct • Client: requests the evaluation • Stakeholders: those who have a stake in the program or in the evaluation’s results • Audiences: individuals, groups, agencies who have an interest in the evaluation and receive its results

  4. Understanding Reasons for Initiating Evaluation • Understanding the purpose of the evaluation is an important first step • Did a problem prompt the evaluation? • Did some stakeholder demand it? • Who has the need to know? • What does s/he want to know? Why? • How will s/he use the results?

  5. It is not uncommon for clients to be uninformed about evaluation procedures and to have given little thought to the ramifications • Frequently, the purpose is not clear until the evaluator has carefully read the relevant materials, observed the evaluation object, and interviewed stakeholders

  6. Practical Application to YOUR Plan: Questions to Begin 1) Why is this evaluation being requested? What questions will it answer? 2) To what use will the evaluation findings be put? By whom? What others should receive the information? 3) What is to be evaluated? What does it include? Exclude? During what time period? In what settings? Who will participate?

  7. 4) What are the essential program activities? How do they link with the goals and objectives? What is the program theory? 5) How much time and money are available for the evaluation? Who can help with it? Is any information needed immediately? 6) What is the political climate and context surrounding the evaluation? Will any political factors or forces interfere with gaining meaningful and fair information?

  8. Informational Uses of Evaluation • Needs Assessment • Determine whether sufficient need exists to initiate a program and describe the target audience • Assist in program planning by identifying potential program models • Monitoring/Process Study • Describe program implementation and whether changes from the initial model have occurred • Outcomes Study • Examine whether certain goals are being achieved at desired levels • Cost-Effectiveness Study • Judge overall program value & its relative cost:value ratio compared to competing programs
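  A hypothetical illustration of the cost:value comparison above (all numbers invented for the example, not drawn from the text):
  Program A: $60,000 / 300 successful outcomes = $200 per outcome
  Program B: $40,000 / 160 successful outcomes = $250 per outcome
  On this comparison, Program A is the more cost-effective choice despite its higher total cost.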

  9. Noninformational Uses • Postponement of a decision • Ducking responsibility [know decision already but need to make it look good] • Public Relations [justify the program] • Fulfilling grant requirements • Covert, nefarious, political uses of information: • Typically more common in federal/national evaluations

  10. Conditions under which evaluation studies are inappropriate • Evaluation would produce trivial information • Low-impact program, one-time effort • Evaluation results will not be used • Regardless of outcome, political appeal/public support… • Cannot yield useful, valid information (bad information is worse than none) • Well-intentioned efforts, “mission impossible” evaluations • Evaluation is premature for the stage of the program • A fitness program evaluation in the first 6 weeks will not yield meaningful information • Premature summative evaluations are the most insidious misuse of evaluation • Motives for the evaluation are improper • Ethical considerations, “hatchet jobs” (propriety: the evaluation respects the rights & dignity of data sources and helps organizations address all clients’ needs)

  11. Determining Appropriateness • Use a tool called evaluability assessment • Clarify the intended program model or theory • Examine the program implementation to determine whether it matches the program model and could achieve the program goals • Explore different evaluation approaches to match needs of stakeholders • Agree on evaluation priorities and intended uses of the study

  12. Methods • Create a working group to clarify the program model or theory and define information needs and evaluation expectations • Personal interviews with stakeholders • Reviews of existing program documentation • Site visits • Figure 10.1 (p. 186): checklist to determine when to conduct an evaluation

  13. Who will Evaluate? • External • Impartial, credible, expertise, fresh look • Participants may be more willing to reveal sensitive information to outsiders • More comfort presenting unpopular information, advocating changes, etc. • Internal • Knowledge of program, history, context, etc. • Familiarity with stakeholders • Can serve as advocates for use of findings • Quick start-up • Known quantity

  14. Combination • Internal evaluators collect contextual information • Internal evaluators collect data • External evaluator directs data collection and organizes the report • Internal evaluators remain to advocate and support after the external evaluator is gone

  15. Evaluator Qualifications/Skills • Does the evaluator have the ability to use the methodologies and techniques needed in the study? • …have the ability to help articulate the appropriate focus for the study? • …have the management skills to carry out the study? • …maintain proper ethical standards? • …communicate results to audiences so that they will be used?
