Formative and Summative Evaluations: Instructional Design for Multimedia
Evaluation Phases (diagram): formative evaluation spans the Analysis, Design, and Development phases; summative evaluation follows Implementation.
Formative Evaluation • Occurs before implementation • Determines the weaknesses in the instruction so that revisions can be made • Makes instruction more effective and efficient
Formative Evaluation Is Especially Important When… • Designer is novice • Content area is new • Technology is new • Audience is unfamiliar • Task performance is critical • Accountability is high • Client requests/expects evaluation • Instruction will be disseminated widely • Opportunities for later revision are slim
Formative Evaluation Phases • Design Reviews • Expert Reviews • Learner Validation (One-to-One Evaluation, Small-Group Evaluation, Field Trials) • Ongoing Evaluation
Design Reviews • Should take place after each step of the design process • Goal Review • Review of Environment and Learner Analysis • Review of Task Analysis • Review of Assessment Specifications
Expert Reviews • Should take place when instructional materials are in draft form • Experts include: • Content Experts • Instructional Design Experts • Content-specific Educational Experts • Learner Experts
Expert Reviews: Content Experts • Subject matter experts (SMEs) review for accuracy and completeness • Is the content accurate and up-to-date? • Does the content present a consistent perspective? Example: Physics expert
Expert Reviews: Instructional Design Experts • Reviews for instructional strategy and theory • Are the instructional strategies consistent with principles of instructional theory? Example: Instructional Designer
Expert Reviews: Content-Specific Educational Expert • Reviews for pedagogical approach in content area • Is the pedagogical approach consistent with current instructional theory in the content area? Example: Science education specialist
Expert Reviews: Learner Expert • Reviews for appropriateness of vocabulary, examples, and illustrations • Are the examples, practice exercises, and feedback realistic and accurate? • Is the instruction appropriate for target learners? Example: 6th grade teacher
Expert Reviews: Process • Distribute draft material to experts • Collect comments and prioritize them into categories such as the following (a triage sketch follows this slide): • Critical: Revisions should be made immediately • Non-critical: Disregard or address at a later date • More Info: Find more data or information
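The prioritization step lends itself to a simple grouping pass. Below is a minimal Python sketch of one way to triage collected expert comments into the three categories above; the reviewer roles and issues are invented for illustration.

```python
from collections import defaultdict

# Hypothetical comments gathered from the expert reviewers.
comments = [
    {"reviewer": "SME", "issue": "Unit conversion in example 3 is wrong", "priority": "critical"},
    {"reviewer": "6th grade teacher", "issue": "Vocabulary too advanced in lesson 2", "priority": "critical"},
    {"reviewer": "Instructional designer", "issue": "Could use more practice items", "priority": "non-critical"},
    {"reviewer": "SME", "issue": "Verify the cited statistics are current", "priority": "more info"},
]

# Group comments by priority so critical revisions surface first.
triage = defaultdict(list)
for comment in comments:
    triage[comment["priority"]].append(comment)

for priority in ("critical", "non-critical", "more info"):
    print(f"{priority.upper()} ({len(triage[priority])} items)")
    for comment in triage[priority]:
        print(f"  [{comment['reviewer']}] {comment['issue']}")
```

Walking the categories in a fixed order keeps the revision meeting focused on critical items before anything else.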
Learner Validation • Try instruction with representative learners to see how well they learn and what problems arise as they engage with the instruction • One-to-One Evaluations • Small-Group Evaluation • Field Trials
Learner Validation: One-to-One Evaluation • Present materials to one learner at a time • Typical problems that might arise are: • Typographical errors • Unclear sentences • Poor or missing directions • Inappropriate examples • Unfamiliar vocabulary • Mislabeled pages or illustrations • Make revisions to instruction • Conduct more evaluations if necessary
Learner Validation: One-to-One Evaluation Process • Present materials to student • Watch student interact with material • Employ “Read-Think-Aloud” method • Continually query students about problems they face and what they are thinking • Assure student that problems in the instruction are not their fault • Tape record or take notes during session • Reward participation
Learner Validation: Small-Group Evaluation • Present materials to 8-12 learners • Administer a questionnaire to obtain general demographic data and attitudes or experiences • Problems that might arise are: • Learners' entry-level skills are higher or lower than anticipated • Course was too long or too short • Learners react negatively to the instruction • Make revisions to instruction • Conduct more evaluations if necessary
Learner Validation: Small-Group Evaluation Process • Conduct entry-level tests and pretests with students • Present instruction to students in a natural setting • Observe students interacting with materials • Take notes and/or videotape the session • Only intervene when instruction cannot proceed without assistance • Administer posttest • Administer attitudinal survey or discussion • Reward participation (a scoring sketch follows this slide)
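Because the small-group session yields matched pretest/posttest scores, a short script can flag two of the problems listed on the previous slide: weak overall gains and learners arriving with stronger entry-level skills than anticipated. A minimal sketch, assuming Python; the scores and the entry-skill cutoff are invented.

```python
from statistics import mean

# Hypothetical percentage scores for a small-group session (8-12 learners),
# matched by position: pretest[i] and posttest[i] belong to the same learner.
pretest  = [35, 42, 55, 40, 38, 60, 45, 50]
posttest = [70, 75, 80, 65, 72, 85, 68, 78]

ENTRY_SKILL_CEILING = 70  # assumed cutoff: a pretest this high suggests prior mastery

gains = [post - pre for pre, post in zip(pretest, posttest)]
print(f"Mean pretest:  {mean(pretest):.1f}")
print(f"Mean posttest: {mean(posttest):.1f}")
print(f"Mean gain:     {mean(gains):.1f}")

# Flag learners whose entry-level skills exceed expectations.
advanced = [i for i, pre in enumerate(pretest) if pre >= ENTRY_SKILL_CEILING]
if advanced:
    print(f"Learners above the entry-skill ceiling: {advanced}")
```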
Learner Validation: Field Trials Evaluation • Administer instruction to 30 or more students • Problems that might arise: • Instruction is not implemented as designed • Students' entry-level skills are higher or lower than anticipated • Assessments are too easy or too difficult • Course is too long or too short • Students react negatively to instruction • Make revisions • Conduct more field trials if necessary
Learner Validation: Field Trials Evaluation Process • Administer instruction to students in a normal setting, in various regions and with varying socioeconomic status • Collect and analyze data from pretests and posttests (an aggregation sketch follows this slide) • Conduct follow-up interviews if necessary • Administer a questionnaire to the instructors who deliver the training
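Field trials add a grouping dimension (region, socioeconomic status), so the pretest/posttest analysis is usually broken out by site. A minimal sketch with hypothetical regions, score pairs, and an assumed revision trigger:

```python
from statistics import mean

# Hypothetical field-trial results keyed by region; each entry is a list of
# (pretest, posttest) pairs for the participating students.
results = {
    "urban":    [(40, 75), (35, 70), (50, 80), (45, 72)],
    "suburban": [(42, 78), (38, 74), (55, 82)],
    "rural":    [(30, 55), (33, 58), (28, 52)],
}

MIN_ACCEPTABLE_GAIN = 25  # assumed threshold below which revision is considered

for region, pairs in results.items():
    gains = [post - pre for pre, post in pairs]
    avg = mean(gains)
    note = "" if avg >= MIN_ACCEPTABLE_GAIN else "  <- below target gain, investigate"
    print(f"{region:<9} mean gain {avg:5.1f}{note}")
```

A site that falls below the target may signal an implementation problem rather than a materials problem, which the instructor questionnaire can help distinguish.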
Formative Evaluation: Ongoing Evaluation • Continue to collect and analyze data • Collect all comments/changes made by teachers who deliver the instruction • Keep track of changes in learner population • Revise instruction or produce new material to accompany instruction as needed
Formative Evaluation Summary • Conduct design reviews after each stage of design, including goals, environment and learner analysis, task analysis, and assessment specifications • Conduct expert reviews with content, instructional design, content-specific educator, and learner experts • Conduct one-to-one evaluations with students • Conduct small-group evaluations with 8-12 students • Conduct field trials with 30 or more students • Conduct ongoing evaluations
Summative Evaluation • Occurs after implementation (after the program has completed a full cycle) • Determines the effectiveness, appeal, and efficiency of instruction • Assesses whether the instruction adequately solves the “problem” that was identified in the needs assessment
Summative Evaluation Phases • Determine Goals • Select Orientation • Select Design • Design/Select Evaluation Measures • Collect Data • Analyze Data • Report Results
Summative Evaluation: Determine Goals • Identify questions that should be answered as a result of the evaluation • Does implementation of the instruction solve the problem identified in the needs assessment? • Do the learners achieve the goals of the instruction? • How do the learners feel about the instruction? • What are the costs of the instruction, and what is the return on investment (ROI)? (an ROI sketch follows this slide) • How much time does it take for learners to complete the instruction? • Is the instruction implemented as designed? • What unexpected outcomes result from the instruction?
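The ROI question reduces to a simple ratio: net benefit divided by total cost. A minimal sketch with invented cost and benefit figures:

```python
# Hypothetical figures for one training cycle.
development_cost = 40_000  # design, authoring, media production
delivery_cost    = 10_000  # facilitation, facilities, learner time
benefit          = 85_000  # e.g., estimated savings from reduced error rates

total_cost = development_cost + delivery_cost
roi = (benefit - total_cost) / total_cost
print(f"ROI: {roi:.0%}")  # -> ROI: 70%
```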
Summative Evaluation: Determine Goals • Select indicators of success • If the program is successful, what will we observe in: • Instructional materials? • Learners' activities? • Teachers' knowledge, practice, and attitudes? • Learners' understanding, processes, skills, and attitudes?
Summative Evaluation: Select Orientation • Come to an agreement with client on most appropriate orientation of evaluation • Objectivism – Observation and quantitative data collected to determine the degree to which the goals of the instruction have been met • Subjectivism – Expert judgment and qualitative data not based on instructional goals
Summative Evaluation: Select Design of Evaluation • What data will be collected, when, and under what conditions? • Instruction, Posttest • Pretest, Instruction, Posttest • Pretest, Instruction, Posttest, Posttest, Posttest (a sketch after this slide writes these out as schedules)
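The three designs differ only in which measurement points they include; writing them out as schedules makes the trade-off visible, with the repeated posttests in the third design gauging retention. A minimal sketch; the design names and retention intervals are assumptions:

```python
# The three data-collection designs, written as measurement schedules.
designs = {
    "posttest-only":    ["instruction", "posttest"],
    "pretest-posttest": ["pretest", "instruction", "posttest"],
    "retention-series": ["pretest", "instruction", "posttest",
                         "posttest (+1 month)", "posttest (+3 months)"],
}

for name, schedule in designs.items():
    print(f"{name:<17} {' -> '.join(schedule)}")
```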
Summative Evaluation: Design or Select Evaluation Measures • Payoff Outcomes - Review statistics that may have changed after instruction was implemented • Learning Outcomes - Measure for an increase in test scores • Attitudes - Conduct interviews, questionnaires, and observations • Level of Implementation - Compare the design of the program to how it is implemented • Costs - Examine the costs to implement and sustain the program: personnel, facilities, equipment, and materials
Summative Evaluation: Collect Data • Devise a plan for the collection of data that includes a schedule of data collection periods
Summative Evaluation: Analyze Data • Analyze the data so that it is easy for the client to see how the instructional program affected the problem presented in the needs assessment (a summary sketch follows this slide)
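One way to make the effect legible to the client is to report the mean pretest-to-posttest gain alongside a standardized effect size. A minimal sketch with invented pooled scores:

```python
from statistics import mean, stdev

# Hypothetical matched pretest/posttest scores pooled across the evaluation.
pretest  = [40, 35, 50, 45, 42, 38, 55, 30]
posttest = [75, 70, 80, 72, 78, 74, 82, 55]

diffs = [post - pre for pre, post in zip(pretest, posttest)]
mean_gain = mean(diffs)
effect_size = mean_gain / stdev(diffs)  # standardized gain (Cohen's d for paired scores)

print(f"Mean gain:   {mean_gain:.1f} points")
print(f"Effect size: {effect_size:.2f}")
```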
Summative Evaluation: Report Results • Prepare a report of the summative evaluation findings that includes: • Summary • Background • Description of Evaluation Study • Results • Discussion • Conclusion and Recommendations
Summative Evaluation Summary • Determine the goals of the evaluation • Select an objective or subjective orientation • Select the design of the evaluation plan • Design or select evaluation measures • Collect the data • Analyze the data • Report the results