Developing Evaluation Instruments
Evaluations
• Formative – How are we doing?
• Summative – How did we do?
• Confirmative – How are we still doing?
Formative Evaluation
• Even excellent plans and concepts can be improved
• Gauges success while the program is being presented in the classroom
• Tells the instructor how well the instructional package is serving the objectives as instruction progresses
Formative Evaluation
A program of instruction is evaluated through:
• Test results
• Learner reactions/comments
• General observation of learners during class
• Subject-matter expert reviews
• Colleague suggestions
• Quality control of the development process
Formative Evaluation
Questions that may be asked:
• Is the level of learning acceptable compared to the objectives?
• Are learners using their knowledge to perform skills?
• Is the time required acceptable?
• Are the activities appropriate?
• Do the tests measure the objectives?
Summative Evaluation
• Measures the success of major outcomes
• Usually follows all instruction, projects, and testing in a program
Summative Evaluation
Measures:
• Learning efficiency (material mastered ÷ time; see the sketch below)
• Program cost (development and delivery)
• Continuing expenses
• Student comments/reactions/evaluations
• Long-term benefit of the program
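A minimal sketch of the learning-efficiency ratio named above; the function name and the example numbers are illustrative assumptions, not from the source.

def learning_efficiency(objectives_mastered: int, hours_of_instruction: float) -> float:
    """Objectives mastered per hour of instruction (material mastered / time)."""
    if hours_of_instruction <= 0:
        raise ValueError("instruction time must be positive")
    return objectives_mastered / hours_of_instruction

# Example: 12 objectives mastered in 8 hours of instruction -> 1.5 per hour.
print(learning_efficiency(12, 8.0))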
Confirmative Evaluation
• Although instruction may be effective at first, some problems appear only over time
• Tracks learner experiences over time to judge the program's continuing validity
• Relies on numerous data-collection instruments (interviews, performance assessments, etc.)
Confirmative Evaluation
Questions to investigate:
• Do learners continue to perform correctly over time?
• Do the materials still meet the original objectives?
• How have technology, resources, trends, and attitudes changed since inception?
• How can clients' needs best be met over time?
Confirmative Evaluation
• Are changes needed in the materials? What would they cost?
• If the instruction is not working as well as before:
  • Should the instruction continue as is?
  • Should it be revised?
  • Should it be terminated?
  • What might replace it?
Validity
• The test assesses what it is supposed to measure. For example:
  • Performance tests assess processes and outcomes related to skills or competencies
  • Course attitude surveys measure reactions to the course
Reliability
• The test produces consistent results whenever it is used
• More questions tied to specific objectives generally make a more reliable test (see the sketch below)
• Standardized delivery of the test
• Scoring methods – the less subjective, the better
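The relationship between test length and reliability is commonly estimated with the Spearman-Brown prophecy formula; the sketch below illustrates that standard formula, which is an addition here and not drawn from the source text.

def spearman_brown(reliability: float, length_factor: float) -> float:
    """Predicted reliability when a test is lengthened by length_factor
    with comparable items (Spearman-Brown prophecy formula)."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# Example: doubling a test with reliability 0.60 predicts roughly 0.75.
print(spearman_brown(0.60, 2.0))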
Pretesting
• Judges the learner's preparation to study the material (course or topic)
• Determines which competencies have already been mastered (see the sketch below)
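Pretest results are often used to exempt learners from competencies they have already mastered. A minimal sketch, assuming per-objective scores between 0 and 1 and an 80% mastery cutoff (both assumptions for illustration):

MASTERY_THRESHOLD = 0.8  # assumed cutoff, not from the source

def objectives_to_teach(pretest_scores: dict[str, float]) -> list[str]:
    """Return the objectives whose pretest score falls below mastery."""
    return [objective for objective, score in pretest_scores.items()
            if score < MASTERY_THRESHOLD]

# Hypothetical learner: already strong on terminology, weaker elsewhere.
scores = {"terminology": 0.9, "procedure": 0.55, "troubleshooting": 0.4}
print(objectives_to_teach(scores))  # -> ['procedure', 'troubleshooting']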
Testing for Prerequisites
• Standardized paper-and-pencil test
• Performance observations
• Questionnaire (covering learner background, training, and experiences)
• Review of past work experience
• Interviews with past supervisors/managers
Objective Tests
• Multiple choice
• True or false
• Matching items
Each item has one predetermined answer, so scoring is mechanical (see the sketch below).
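A minimal sketch of scoring objective items against a fixed answer key; the item IDs and answers are invented for illustration.

ANSWER_KEY = {"q1": "B", "q2": "True", "q3": "C"}  # hypothetical items

def score_objective_test(responses: dict[str, str]) -> float:
    """Fraction of items answered correctly against the fixed key."""
    correct = sum(responses.get(item) == answer
                  for item, answer in ANSWER_KEY.items())
    return correct / len(ANSWER_KEY)

# Two of three items correct -> about 0.67.
print(score_objective_test({"q1": "B", "q2": "False", "q3": "C"}))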
Constructed-Response Tests
• Require learners to plan answers and express them in their own words
• Short answer
• Essay questions
• Problem solving
Skills and Behavior
• Which actions are observed?
• Is the process, the product, or both assessed?
• What constraints or limitations apply?
• Are testing conditions simulated or realistic?
The Rubric
• General assessment of the overall product
• Specific elements earn points
• A more objective tool (see the sketch below)

From Designing Effective Instruction, 4th ed., Morrison, Ross, & Kemp, 2004.
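To make the point-based scoring concrete, here is a minimal rubric sketch; the criteria and point values are invented for illustration and are not from Morrison, Ross, & Kemp.

RUBRIC = {"organization": 10, "accuracy": 20, "completeness": 10}  # hypothetical

def score_with_rubric(awarded: dict[str, int]) -> int:
    """Sum awarded points, capping each criterion at its maximum."""
    return sum(min(awarded.get(criterion, 0), maximum)
               for criterion, maximum in RUBRIC.items())

# Example: 9 + 19 + 6 = 34 of a possible 40 points.
print(score_with_rubric({"organization": 9, "accuracy": 19, "completeness": 6}))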