S519: Evaluation of Information Systems
Understanding Evaluation (Chapters 1-2)
“I don’t get it. Our first three quarters were excellent.”
Definition of Evaluation
• Systematic determination of the quality or value of something (Scriven, 1991)
• What can we evaluate?
  • Projects, programs, or organizations
  • Personnel or performance
  • Policies or strategies
  • Products or services
  • Processes or systems
  • Proposals, contract bids, or job applications
• Evaluation's lessons learned and methods are transdisciplinary
Terminology (Davidson, Glossary)
• Evaluand
  • That which is being evaluated (e.g., a program, policy, project, product, service, or organization)
  • In personnel evaluation, the term is evaluee
Issues of Evaluation
• Why do we conduct evaluation?
  • Fast failure (cf. Davidson, p. 17)
    • Reach failure before your competitors do
• Yet evaluation findings themselves often go unused:
  • "Lack of information does not appear to be the main problem. Rather, the problem seems to be that available information is not organized and communicated effectively" (GAO, 1995, p. 39, cited in Michael Quinn Patton, Utilization-Focused Evaluation, 2008)
Issues of Evaluation
• Evaluation is for:
  • Finding areas for improvement
  • Generating an assessment of overall quality
  • Any others?
• Answering the question of "merit" or "worth" (Scriven, 1991)
  • Merit is the "intrinsic" value of something = "quality"
  • Worth is the value of something to an individual, an organization, or an institution (contextualized merit) = "value"
• Most "big picture" evaluation questions are questions of value (worth) rather than pure merit. Can you give examples?
Choosing the right group
• Accountability evaluation
  • It is important to conduct an independent evaluation
  • i.e., nobody on the evaluation team should have a significant vested interest in whether the results are good or bad
• Organizational learning capability evaluation
  • It can (better) be a dependent evaluation
  • i.e., organizational staff, consultants, managers, customers, trainers, trainees, etc. can join
The steps involved (D-p4)
• Step 1: understanding the basics of evaluation (ch1)
• Step 2: defining the main purposes of the evaluation and the "big picture" questions that need answers (ch2)
• Step 3: identifying the evaluative criteria (ch3)
• Step 4: organizing the list of criteria and choosing sources of evidence (collecting data) (ch4)
The steps involved (D-p4)
• Step 5: analyzing data
  • Dealing with the causation issue (what causes what, and why), to avoid "subjectivity" (ch5+6)
  • Importance weighting: weighting the results (ch7)
  • Merit determination: how well your evaluand has done on the criteria (good? unacceptable?) (ch8)
  • Synthesis methodology: systematic methods for condensing evaluation findings (ch9), illustrated in the sketch below
  • Statistical analysis: Salkind (2007)
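To make Step 5's importance weighting, merit determination, and synthesis concrete, here is a minimal sketch in Python. The criteria, weights, scores, and grade cut-offs are invented for illustration; they are not values from Davidson (2005).

```python
# Minimal sketch of importance weighting, merit determination, and synthesis.
# All criteria, weights, scores, and cut-offs below are hypothetical examples.

criteria = {
    # criterion: (importance weight, score on a 0-100 scale)
    "usability":       (0.40, 82),
    "content quality": (0.35, 90),
    "cost":            (0.25, 65),
}

def synthesize(criteria):
    """Condense per-criterion scores into one weighted overall score."""
    total_weight = sum(w for w, _ in criteria.values())
    return sum(w * s for w, s in criteria.values()) / total_weight

def merit_label(score):
    """Merit determination: map a numeric score onto a qualitative grade."""
    if score >= 85:
        return "excellent"
    if score >= 70:
        return "good"
    if score >= 50:
        return "marginal"
    return "unacceptable"

overall = synthesize(criteria)
print(f"Overall score: {overall:.1f} -> {merit_label(overall)}")
```

Davidson's chapters 7-9 discuss principled ways to choose the weights and the grading bars; the sketch only shows the mechanics of condensing criterion scores into a single grade.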
The steps involved (D-p4)
• Step 6: results
  • Putting it all together: fitting the pieces into the KEC framework (ch10)
• Step 7: feedback
  • Meta-evaluation: how to figure out whether your evaluation is any good (ch11)
The steps involved
• Form a group and discuss the steps
  • List the most important steps
  • Is anything missing here?
• How do these steps fit into the KEC (Exhibit 1.2, p6-7)?
  • Select three important boxes from the KEC
The Key Evaluation Checklist (Davidson, 2005, p. 6-7)
I. Executive Summary
II. Preface
III. Methodology
1. Background & Context
2. Descriptions & Definitions
3. Consumers
4. Resources
5. Values
6. Process Evaluation
7. Outcome Evaluation
8 & 9. Comparative Cost-Effectiveness
10. Exportability
11. Overall Significance
12. Recommendations & Explanations
13. Responsibilities
14. Reporting & Follow-up
15. Meta-evaluation
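One way to keep a report on track against the checklist is to encode the KEC sections as a simple data structure. The sketch below is merely an illustrative progress tracker; the completion flags are hypothetical.

```python
# One possible encoding of the Key Evaluation Checklist (Davidson, 2005)
# as a drafting-progress tracker; the 'done' flag set below is hypothetical.

KEC_SECTIONS = [
    "I. Executive Summary", "II. Preface", "III. Methodology",
    "1. Background & Context", "2. Descriptions & Definitions",
    "3. Consumers", "4. Resources", "5. Values",
    "6. Process Evaluation", "7. Outcome Evaluation",
    "8 & 9. Comparative Cost-Effectiveness", "10. Exportability",
    "11. Overall Significance", "12. Recommendations & Explanations",
    "13. Responsibilities", "14. Reporting & Follow-up",
    "15. Meta-evaluation",
]

# Start with nothing drafted, then mark sections as they are completed.
progress = {section: False for section in KEC_SECTIONS}
progress["1. Background & Context"] = True  # example: first section drafted

remaining = [s for s, done in progress.items() if not done]
print(f"{len(remaining)} KEC sections still to draft")
```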
Step 1: Understand the basics of evaluation
• Identify the evaluand
• Background and context of the evaluand
  • Why did this program or product come into existence in the first place?
• Descriptions and definitions
  • Describe the evaluand in enough detail that virtually anyone can understand what it is and what it does
  • How: collect background information, pay a firsthand visit, or do a literature review
Are you ready for your first evaluation project?
• Some tips before you start:
  • Make sure that your evaluand is not difficult to access (e.g., because of geolocation, or inanimate objects)
  • Make your evaluand a clearly defined group (avoid abstract and complex systems)
  • Avoid political ramifications (e.g., assessing your boss's pet project or the university administration)
  • Avoid being involved in the evaluand yourself (e.g., assessing a class that you are teaching)
Some potential topics for first-timers
• A community health program
• A workplace wellness initiative
• A distance learning course
• A training program or workshop
• ...
• For more, see D-p10-11
• Form your group and choose your project
Standards for Evaluation (Patton, 1997, p. 17)
• Utility
  • To ensure that an evaluation will serve the practical information needs of intended users
• Feasibility
  • To ensure that an evaluation will be realistic, prudent, diplomatic, and frugal
• Propriety
  • To ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results (IRB: http://research.iu.edu/rschcomp/revlocation.html)
• Accuracy
  • To ensure that an evaluation will reveal and convey technically adequate information about the features that determine worth or merit
• http://www.wmich.edu/evalctr/jc/
Guiding Principles for Evaluators (American Evaluation Association, 1995)
• Systematic inquiry
• Competence
• Integrity/honesty
• Respect for people
• Responsibilities for general and public welfare
Step 1: Output report
• Output: a one- or two-page overview of the evaluand and findings
  • What is your evaluand?
  • Background and context of your evaluand
  • Description of your evaluand
  • Try to be as detailed as possible
Step 2: Defining the Purpose of the Evaluation (D-Ch2)
• Who asked for this evaluation, and why?
• What are the main evaluation questions?
• Who is the main audience?
Evaluation purposes
• A. What is (are) the main purpose(s) of the evaluation?
  • To determine the overall quality or value of something (summative evaluation, absolute merit)
    • e.g., decision making, funding allocation decisions, benchmarking products, using a tool, etc.
  • To find areas for improvement (formative evaluation, relative merit)
    • To help a new "thing" get started
    • To improve an existing "thing"
  • Both of the above
Big picture questions
• Respond to the following comment from a manager in an organization you work with:
  • "All of our evaluations are geared exclusively toward improving our programs, policies, and practices. We have no need for any kind of evaluation that looks at the overall quality or value of something."
Big picture questions
• B. What is (are) the big picture question(s) for which we need answers?
  • Absolute merit: Do we want to invest in this project?
  • Relative merit: How does this project compare with the other options? (ranking)
• Example: Table 2.2 (D-p19); see also the sketch below
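The absolute-versus-relative distinction can be made concrete with a toy comparison: an absolute-merit question asks whether each option clears a quality bar, while a relative-merit question ranks the options against each other. The project names and scores below are invented for illustration.

```python
# Hypothetical overall scores for three candidate projects (0-100 scale).
projects = {"Project A": 78, "Project B": 85, "Project C": 62}

# Absolute merit: does each project clear a fixed quality bar?
BAR = 70
for name, score in projects.items():
    verdict = "invest" if score >= BAR else "do not invest"
    print(f"{name}: {score} -> {verdict}")

# Relative merit: how do the projects compare with each other?
ranking = sorted(projects, key=projects.get, reverse=True)
print("Ranking:", " > ".join(ranking))
```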
Step 2: Output report
• Your Step 2 output report should answer the following questions:
  • Define the evaluation purpose:
    • Do you need to demonstrate to someone (or to yourself) the overall quality of something?
    • Or do you need to find areas for improvement?
    • Or do you need to do both?
  • Once you answer the above questions, figure out your big picture questions:
    • Is your evaluation related to the absolute merit of your evaluand?
    • Or to the relative merit of your evaluand?
Keep in mind
• Evaluation is mainly for decision-making purposes:
  • Finding out whether our evaluands are yielding benefits
  • Helping a new product or a new employee get up to speed
  • Knowing the competitors
  • Keeping up to date (avoiding the boiled-frog problem)
  • Avoiding reinventing the wheel (learning from failures)
Exercise
• Form a pair to discuss the exercises on p22 (E1 and E2)
• Form your project team and think about your project topic:
  • Website
  • Knowledge system
  • Online learning course
  • Social network system
  • Company
  • ...