Johns Hopkins University

Presentation Transcript


  1. Johns Hopkins University Evaluation Overview

  2. Evaluation Overview
  Evaluation Leaders: Michael Scriven (Claremont Graduate University); David Fetterman (Stanford University Medical School)
  Definition: The identification, clarification, and application of defensible criteria to determine an evaluation object’s value (worth or merit) in relation to those criteria. Examining and judging to determine value…

  3. Evaluation Overview
  Evaluation uses inquiry and judgment methods, including:
  • Determining standards for judging quality and deciding whether those standards should be relative or absolute,
  • Collecting relevant information, and
  • Applying the standards to determine value, quality, utility, effectiveness, or significance.
  Evaluation leads to recommendations intended to optimize the evaluation object in relation to its intended purpose(s), or to help stakeholders determine whether the evaluation object is worthy of adoption, continuation, or expansion.

  4. Research vs. Evaluation

  5. Informal vs. Formal Evaluation
  We evaluate every day! Informal evaluations occur all the time; however, they often lack:
  • breadth and depth
  • systematic procedures
  • formally collected evidence
  Informal evaluations can be influenced by experience, instinct, generalization, and reasoning.

  6. Purposes of Evaluation
  According to Mark, Henry, and Julnes (1999), evaluation has four main purposes:
  • Assessment of merit and worth
  • Oversight and compliance
  • Program and organizational improvement
  • Knowledge development

  7. Formative vs. Summative Evaluation
  Formative: Provides information for program improvement.
  Summative: Provides information to serve decisions or assist judgments about a program’s adoption, continuation, or expansion.

  8. Activity: Formative vs. Summative Evaluation

  9. Internal vs. External Evaluation
  Internal (employees/members):
  • More familiar with organization and program history
  • Knows the decision-making style of the organization
  • Is present to remind others of results now and in the future
  • Can communicate technical results more frequently and clearly
  External (outside the organization):
  • Can bring greater credibility and perceived objectivity
  • Typically brings more breadth and depth of technical expertise
  • Has knowledge of how other similar organizations and programs work

  10. Discussion
  • Describe a situation in which an internal evaluator would be more appropriate than an external evaluator. What is the rationale for your choice?
  • Now describe a situation in which an external evaluator might be more appropriate.

  11. Trends in Program Evaluation
  • Increasing priority and legitimacy of internal evaluation
  • Expanded use of qualitative methods
  • Strong shift toward multiple and diverse methods in program evaluation (a mixed-methods approach)
  • Increased use of program evaluation by foundations and other agencies in the not-for-profit sector
  • Increased education and involvement of stakeholders in evaluation to empower them and increase buy-in
  • Advances in technology that make evaluation more accessible
  • Performance measurement in the federal government and not-for-profit organizations
  • An increase in standards-based assessment in education as a means of tracking performance
  • Growth of evaluation internationally

  12. Empowerment Evaluation
  • Critical Friend or Coach: be on their side, but politely pose questions to help refine and improve
  • Cycles of Reflection and Action: provide data to inform decision making, then act on it
  • Culture of Evidence: develop a pattern of data collection and documentation to support positions
  • Community of Learners: encourage peers to learn together
  • Reflective Practitioner: thoughtfully consider data to guide practical day-to-day activities

  13. Setting Goals: Creating a Road Map
  • Mission
  • Vision and Value Statements
  • Project Goals, Objectives, and Indicators of Success
  • Theory of Change
  • Logic Model

  14. Goals… MUST BE S.M.A.R.T. (Specific, Measurable, Achievable, Relevant, Time-bound)

  15. S.M.A.R.T. Goals Activity

  16. See you all next week!
