
Intervention Evaluation




Presentation Transcript


  1. Intervention Evaluation Part 2

  2. Last Week • Why evaluate • Results-based approach • Purposes and uses of evaluation • Reviewed Kirkpatrick model

  3. This Week • More on Kirkpatrick model • Types of data collection • Types of data • Evaluation instruments • Tips for Survey/Questionnaire * Will vary from note-taking H.O.

  4. Warm Up Exercise F

  5. Levels of Evaluations • Donald Kirkpatrick – 4 Levels: I. Reaction – Were the participants pleased with the intervention? II. Learning – What did the participants learn? III. Behavior – Did the participants change their behavior based on what was learned? IV. Results – Did the change in behavior positively affect the organization?

  6. Exercise # 3 • What are some advantages and limitations in each of the four levels of Kirkpatrick’s evaluation model? (Use note-taking handout)

  7. Collect Post-Intervention Data (Handout 3) • Surveys • Questionnaires • On-the-job observation • Post-intervention interviews • Focus groups • Program assignments • Action plans • Performance contracts • Follow-up sessions • Performance monitoring

  8. Exercise # 4 • Using the 10 items in the Post-Intervention Data handout (3): each group should describe (fabricate) a situation where it might gather post-intervention data for a level 2/3/4 evaluation. Be prepared to explain your rationale.

  9. Hard Data Hard data can be grouped into four categories: 1. Output – of the work unit 2. Quality – how well produced or serviced 3. Cost – improvement in costs 4. Time – savings

  10. Soft Data • Work habits – absenteeism, tardiness, violations • Climate – number of grievances, complaints, job satisfaction • Satisfaction – Favorable reactions, employee loyalty, increased confidence

  11. Exercise # 5 • List some advantages and disadvantages / limitations when collecting hard and soft data. • (Use Notetaking H.O. p. 3/4)

  12. Evaluation Instruments Validity – does it measure what it is supposed to measure? • Content validity – how well does the instrument measure the content/objectives of the program • Construct validity – how well does it measure the construct (an abstract variable such as KSA) • Concurrent validity – how well does the instrument measure up against other instruments • Predictive validity – how well can it predict future behavior

  13. Intervention Evaluation Part 3

  14. Last Week • More on Kirkpatrick model • Types of data collection • Types of data

  15. This Week • Developing evaluation instruments • The survey process * New Handouts

  16. Exercise # 5 • List five things you would do to enhance the chances of getting a good number of returns for surveys/questionnaires.

  17. Survey Process – Tips • Communicate the purpose in advance • Signed introductory letter • Explain who'll see the data • Use anonymous input?

  18. More Tips • Keep it simple • Simplify the response process • Bubble format • SASE • Utilize local support

  19. More Tips • Consider incentives • Use follow-up reminders • Send a copy of the results to the participants

  20. Action Planning • Most common type of follow-up assignments. • Developed by participants. • Contains detailed steps to accomplish measurable objectives. • Shows what is to be done, by whom, when. • Must be monitored

  21. Action Plans • Communicate the action plan requirement early and explain its value (avoids resistance) • Describe the action-planning process at the beginning of the program (outline) • Teach the action-planning process • Allow time to develop the plan • Have the facilitator approve the action plans • Require participants to assign a monetary value for each improvement (helps ROI later)

  22. Action Plans • Ask participants to isolate the effects of the program • Ask participants to provide a confidence level for estimates • Require action plans to be presented to the groups by participants (peer review) if possible • Explain the follow-up mechanism • Collect action plans • Summarize the data and calculate ROI

  23. Converting Data to Monetary Benefits • Focus on a unit of measure • Determine the value of each unit • Calculate the change in performance • Determine an annual amount for the change • Calculate the total value of the improvement
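The five steps above can be sketched as a small calculation. All figures below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical worked example of the five data-conversion steps.
# The unit, unit value, and performance numbers are illustrative,
# not taken from the slides.

unit = "customer complaint resolved"       # 1. Focus on a unit of measure
value_per_unit = 75.00                     # 2. Determine the value of each unit ($)

before_per_week = 40                       # baseline performance
after_per_week = 55                        # post-intervention performance
change = after_per_week - before_per_week  # 3. Calculate the change in performance

annual_change = change * 52                # 4. Determine an annual amount for the change
annual_benefit = annual_change * value_per_unit  # 5. Total value of the improvement

print(f"Weekly change: {change} units")
print(f"Annual change: {annual_change} units")
print(f"Annual monetary benefit: ${annual_benefit:,.2f}")
```

The annualized benefit computed this way feeds directly into the ROI calculation later in the deck.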

  24. Ways to Put Value on Units • Cost of quality • Converting Employee time • Using Historical Costs • Using Internal and External Experts • External Databases • Estimates from the participants • Estimates from Supervisors • Estimates from Senior Managers • Using HRD staff estimates

  25. Credibility • Source of the Data • Source for the study • Motives of evaluators • Methodology of the study • Assumptions made in the analysis • Realism of the outcome data • Types of data • Scope of analysis

  26. Guidelines for Study • Credible and reliable sources for estimates • Present material in an unbiased, objective way • Fully explain methods (step by step) • Define assumptions and compare to other studies • Consider factoring or adjusting output values when they appear unrealistic • Use hard data whenever possible

  27. Identifying Intangible Measures (not based upon monetary values) • Employee satisfaction • Stress reduction • Employee turnover • Customer satisfaction, retention • Team effectiveness

  28. Determining Costs • Collect costs on every intervention • Costs will not be precise (hard to be perfect) • Be practical – work with the accounting department • Define which costs to collect, categories, sources • Computerize • Cost accumulation (track accounts) • Cost estimation (formulas – page 227) • Fully load with all costs possible – be truthful • Overhead, benefits, peripheral costs, etc.

  29. Data Analysis • Statistics (use a professional) • Use terms appropriately (i.e., significant difference) • Statistical deception (erroneous conclusions)

  30. Return on Investment • Compares costs to benefits • Complicated • Usually annualized • Business-case specific • Communicate the formula used

  31. Phillips ROI Framework I. Reaction and Planned Action – measures participants' reactions and plans to change II. Learning – measures KSA III. Job Applications – measures change of behavior on the job and specific use of the training material IV. Business Results – measures impact of the program V. Return on Investment – measures the monetary value of the results and costs for the program, usually expressed as a percentage

  32. Evaluation as a Customer Satisfaction Tool • Level 1 Reaction – Participants • Level 2 Learning – Participants • Level 3 Job Applications – Immediate Managers • Level 4 Business Impact – Immediate/Senior Managers • Level 5 Return on Investment – Senior Managers, Executives

  33. From Level 4 to Level 5 Requires Three Steps: 1. Level 4 data must be converted to monetary values 2. Cost of the intervention must be tabulated 3. Calculate the formula

  34. ROI Process Model • Collect Data • Isolate Effects of Training • Convert Data to Monetary Value • Tabulate Program Costs • Calculate the Return on Investment • Identify Intangible Benefits

  35. ROI Formula ROI (%) = (Net Program Benefits ÷ Program Costs) × 100

  36. Two Methods • 1. Cost/Benefit Ratio – An early model that compares the intervention's costs to its benefits in ratio form: for every one dollar invested in the intervention, X dollars in benefits were returned. • 2. ROI Formula – Uses net program benefits divided by costs, expressed as a percent.

  37. Cost/Benefit Ratio CBR = Program Benefits ÷ Program Costs

  38. ROI Formula ROI (%) = (Net Program Benefits ÷ Program Costs) × 100
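The two methods can be compared side by side in a short numeric sketch. The benefit and cost figures are hypothetical, chosen only to show how the formulas differ:

```python
# Hypothetical comparison of the Cost/Benefit Ratio and the ROI formula.
# Assumes $260,000 in annual program benefits against $80,000 in fully
# loaded program costs; both figures are illustrative.

program_benefits = 260_000.0
program_costs = 80_000.0

# 1. Cost/Benefit Ratio: total benefits divided by total costs
cbr = program_benefits / program_costs

# 2. ROI formula: NET benefits divided by costs, expressed as a percent
net_benefits = program_benefits - program_costs
roi_pct = (net_benefits / program_costs) * 100

print(f"CBR: {cbr:.2f} (every $1 invested returns ${cbr:.2f} in benefits)")
print(f"ROI: {roi_pct:.0f}%")
```

Note the difference the numerator makes: the CBR uses gross benefits (3.25:1 here), while the ROI formula subtracts costs first, giving 225% rather than 325%.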

  39. Cautions With Using ROI • Make sure a needs assessment has been completed • Include one or more strategies for isolating the effects of training • Use reliable, credible sources in making estimates • Be conservative when developing benefits and costs • Use caution when comparing the ROI in training and development with other financial returns • Involve management in developing the return • Approach sensitive and controversial issues carefully • Do not boast about a high return (internal politics)

  40. Implementation Issues • Identify an internal champion (cheerleader) • Develop an implementation leader • Assign responsibilities so everyone will know their assigned tasks and outcomes • Set targets (annual) • Develop a project plan, timetable • Revise/Develop Policies and Procedures (Page 367) • Assess the climate – gap analysis, SWOT, barriers

  41. Preparing Your Staff • Involve the staff in the process • Use evaluation data as a learning tool • Identify and remove obstacles (complexity, time, motivation, correct use of results)

  42. ROI Administration • Which programs to select? • Large target audiences • Important to corporate strategies • Expensive • High visibility • Comprehensive needs assessment

  43. ROI Administration • Reporting progress: status meetings (facilitated by expert) • Report progress • Add evaluation areas • Establish discussion groups • Train the Management Tool

  44. Timing of Evaluation • During the program • Time series – multiple measures • Post tests – timing

  45. Questionnaire Content Issues • Progress with objectives • Action plan status • Relevance of intervention • Use of program materials • Knowledge/skill application • Skill frequency • Changes in the work unit • Measurable improvements/accomplishments • Monetary impact • Confidence level • Improvement linked with the intervention • Investment perception • Linkage with output measures • Barriers • Enablers • Management support • Other solutions • Target audience recommendations • Suggestions for improvement
