PROGRAM EVALUATION Produced by Dr. James J. Kirk, Professor of HRD, Western Carolina University
What You Will Learn • The definition of “program evaluation” • Various terms used by program evaluators • Common reasons for conducting program evaluations • Selected types of evaluations
WHAT IS EVALUATION? Determining the worth of something (e.g., an employee orientation program).
WHAT CONSTITUTES A PROGRAM? Planned learning activities (e.g., an employee orientation program).
NEEDS ASSESSMENTS VS. TRAINING EVALUATIONS A needs assessment identifies what a program should address; a program evaluation determines the worth of the program itself.
FORMAL EVALUATION VS. INFORMAL EVALUATION
SUMMATIVE EVALUATION End Result
FORMATIVE EVALUATION Steps Taken To Achieve The Desired End Result
THE EVALUATION OBJECT That which is evaluated.
EVALUATION OBJECTS 1. Orientation Program 2. Management Training 3. Safety Program 4. Mentoring Program 5. Cross Training 6. Drug Education 7. Job Rotation 8. Team Building
DESCRIPTION OF OBJECTS 1. Who is involved? 2. Why does it exist? 3. What are its parts or functional elements? 4. When does it take place? 5. Where does it exist?
EXTERNAL EVALUATIONS Conducted by someone from outside the organization.
INTERNAL EVALUATIONS Conducted by someone from inside the organization.
GOAL-FREE EVALUATIONS Journalistic-style evaluations
SOME COMMON AUDIENCES • MANAGERS • OTHER PROFESSIONALS • PARTICIPANTS • TRAINERS • GOVT. OFFICIALS • PROGRAM PLANNERS • VENDORS
TYPES OF AUDIENCES • Participants • Stakeholders • Clients
AUDIENCE CHARACTERISTICS 1. Age, Sex, Race 2. Occupation 3. Education/Training Background 4. Values 5. Knowledge of Evaluation 6. Special Concerns 7. Special Interests 8. Hidden Agendas
EVALUATION CONSTRAINTS MONEY TIME EXPERTISE POLITICS
TIMES WHEN AN EVALUATION MAY NOT BE APPROPRIATE • During a crisis • During political unrest • Right after a program has begun • When the evaluation may cost more than the program
EVALUATION MYTHS 1. My CEO does not require evaluation, so why should I do it? 2. Measuring progress toward objectives is an adequate evaluation strategy. 3. There are too many variables affecting the behavior change for me to evaluate the impact of training.
EVALUATION MYTHS 4. I can’t measure the results of my training. 5. I don’t need to justify my existence; I have a proven track record. 6. I don’t know what information to collect. 7. Measurement is only effective in the production and financial areas.
EVALUATION MYTHS 8. If I can’t calculate the return on investment, then it is useless to evaluate the program (a simple ROI calculation is sketched below). 9. Evaluation will probably cost too much. 10. Evaluation will lead to criticism. 11. The emphasis should be the same in all organizations.
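Myth #8 is easier to dispel with a concrete illustration. Below is a minimal Python sketch of the standard ROI formula, (net benefits ÷ costs) × 100; the dollar figures and the function name are hypothetical, chosen for illustration rather than prescribed by this presentation.

```python
# Minimal sketch of the standard training ROI formula referenced in myth #8.
# The figures below are hypothetical, chosen only for illustration.

def training_roi(benefits: float, costs: float) -> float:
    """Return ROI as a percentage: (benefits - costs) / costs * 100."""
    return (benefits - costs) / costs * 100

# A program costing $40,000 that yields $50,000 in measured benefits
# returns 25% on the investment.
print(training_roi(benefits=50_000, costs=40_000))  # 25.0
```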
CLASSIFIED BY 1. Research design 2. Type of data 3. How data is collected 4. Who does the evaluating 5. Who uses the info 6. Type of questions asked 7. Scope of the evaluation
OBJECTIVES-ORIENTED APPROACH PROPONENTS Tyler, Provus, Metfessel & Michael, etc. PURPOSE Determine the extent to which objectives are achieved. DISTINGUISHING CHARACTERISTICS Specify measurable objectives & compare objectives with performance. PAST USES Curriculum development, needs assessment, etc. CONCEPTUAL CONTRIBUTIONS Pre-post performance measurements (sketched below). CRITERIA USED Measurable objectives; reliability and validity. BENEFITS Simple, sets objectives. LIMITATIONS Reductionistic, linear.
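As an illustration of the pre-post measurement at the heart of the objectives-oriented approach, here is a minimal Python sketch comparing scores against a stated objective; the scores and the 80-point threshold are hypothetical assumptions, not data from this presentation.

```python
# Hypothetical pre- and post-training test scores for five participants.
pre_scores = [55, 60, 48, 70, 62]
post_scores = [82, 88, 75, 91, 84]

# A stated, measurable objective: mean post-test score of at least 80.
OBJECTIVE = 80

mean_pre = sum(pre_scores) / len(pre_scores)
mean_post = sum(post_scores) / len(post_scores)

print(f"Mean pre-test score:  {mean_pre:.1f}")
print(f"Mean post-test score: {mean_post:.1f} (gain: {mean_post - mean_pre:.1f})")
print("Objective met" if mean_post >= OBJECTIVE else "Objective not met")
```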
MANAGEMENT-ORIENTED APPROACH PROPONENTS Stufflebeam, Alkin & Provus. PURPOSE Provides info for decision-making. DISTINGUISHING CHARACTERISTICS Evaluating all stages of program development. PAST USES Accountability, program planning. CONCEPTUAL CONTRIBUTIONS Identifies / evaluates needs and objectives. CRITERIA USED Utility, propriety, and technical soundness. BENEFITS Comprehensive and sensitive to leadership. LIMITATIONS Expensive and focuses on production.
CONSUMER-ORIENTED APPROACH PROPONENTS Scriven, Komoski. PURPOSE Provides info for educational purchases, etc. DISTINGUISHING CHARACTERISTICS Uses criterion checklists to analyze products, etc. PAST USES Consumer reports. CONCEPTUAL CONTRIBUTIONS Provides criteria for evaluating educational products. CRITERIA USED Objective criteria to infer conclusions / make recommendations. BENEFITS Provides info on cost, consumer needs & product developers. LIMITATIONS Cost, not open to cross-examination.
EXPERTISE-ORIENTED APPROACH PROPONENTS Eisner, Accreditation Groups. PURPOSE Professional judgments. DISTINGUISHING CHARACTERISTICS Judgment based upon individual knowledge and experience. PAST USES Self-study, accreditation, criticism. CONCEPTUAL CONTRIBUTIONS Legitimizes subjective criticism. CRITERIA USED Qualifications of the expert. BENEFITS Capitalizes on human expertise. LIMITATIONS Personal bias, overuse of intuition.
ADVERSARY-ORIENTED APPROACH PROPONENTS Wolf, Owens, Levine & Kourilsky. PURPOSE Expose program’s strengths / weaknesses. DISTINGUISHING CHARACTERISTICS Airs opposing viewpoints / public hearings. PAST USES Examines controversial programs / issues. CONCEPTUAL CONTRIBUTIONS Uses forensic / judicial public hearings, clarifies issues. CRITERIA USED Balance, open to public. BENEFITS Aims at closure / resolution, audience impact. LIMITATIONS Fallible arbiters / judges, cost, time involved.
NATURALISTIC OR PARTICIPANT-ORIENTED APPROACH PROPONENTS Stake, Patton, Guba & Lincoln, etc. PURPOSE Expose complexities of educational activity. DISTINGUISHING CHARACTERISTICS Multiple realities, inductive logic and discovery; ethnographies of operating programs. PAST USES Emergent evaluation designs, studying context. CONCEPTUAL CONTRIBUTIONS Criteria for judging naturalistic inquiry. CRITERIA USED Credibility. BENEFITS Focuses on describing, judging and understanding. LIMITATIONS Nondirective, atypical, may not reach closure.
RESULTS-ORIENTED APPROACH? The focus is on Kirkpatrick’s 4th level of evaluation: “results.”
SELECTION CRITERIA 1. Purpose of the evaluation 2. Expertise of the evaluator 3. Evaluation audience 4. Time 5. Money 6. Scope 7. Help available
HRD EVALUATION MODELS • Xerox • IBM • AT&T/Bell • Saratoga Inst. • CIPP • CIRO • Kirkpatrick
AT&T/BELL • Reaction outcomes • Capability outcomes • Applications outcomes • Worth outcomes
IBM • Reaction • Testing • Applications • Business
XEROX • Entry capability • End-of-course performance • Mastery job performance • Organizational performance
CIPP (Phi Delta Kappa) • Context evaluation • Input evaluation • Process Evaluation • Product Evaluation
SARATOGA INSTITUTE • Trainee satisfaction • Learning change • Behavior change • Organization change
CIRO (Warr, Bird, Rackham) • Context evaluation • Input evaluation • Reaction evaluation • Outcome evaluation
KIRKPATRICK • Reaction • Learning • Behavior • Results
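Since several of the models above (IBM, Saratoga Institute, CIRO) parallel Kirkpatrick’s four levels, a small sketch can make the mapping concrete. The sample questions and measures below are illustrative assumptions, not part of Kirkpatrick’s model or this presentation.

```python
# Kirkpatrick's four levels, paired with the question each level answers and
# a sample measure. The questions and measures are illustrative assumptions.
KIRKPATRICK_LEVELS = {
    1: ("Reaction", "Did participants like the program?", "end-of-course survey"),
    2: ("Learning", "Did participants gain knowledge or skills?", "pre/post test"),
    3: ("Behavior", "Do participants apply it on the job?", "supervisor observation"),
    4: ("Results", "Did the organization benefit?", "business metrics"),
}

for level, (name, question, measure) in KIRKPATRICK_LEVELS.items():
    print(f"Level {level} ({name}): {question} Sample measure: {measure}.")
```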
COMMON EVALUATION LEVELS • Participants’ Reactions • Participants’ Increased Knowledge