EDIT 6170 Instructional Design • June 27, 2003 • Dr. Lloyd Rieber, Michael Orey, Michele Estes, University of Georgia, Department of Instructional Technology • If you can hear me, click. If you cannot hear me, click and refer to the Online Survival Guide for help.
4 Topics • Mid-semester review • Brief Review of Key Topics • Team responsibilities • More examples of instructional development (if there is time at end) • Formative evaluation • Review • Revising instructional materials • Summative evaluation
HorizonLive is Available • Feel free to use your breakout room for team meetings. • You also have a WebCT chat room and bulletin board, and you are free to use phones and other meeting places. You do need to meet, and we provide several options for holding team meetings.
Updates & Reminders • Design Teams • WebCT topic areas created for each team • Weekend time on HL? • WWILD Team (submit 5, review 5) due today • Check out teams’ progress reports #2
Progress Report #3 • Due June 30 (Monday) • You need to have completed the first draft of your needs assessment, course design, & unit design • Provide the first draft of your progress report as a link to the Word document off of your team page • I assume someone on the team has the skill to upload the file and knows how to determine the URL.
Mid-Semester Course Review • In class: polls and discussion • Online: survey about team formation
Polls • Daily class organization • IDAs • Giving buddy feedback • Getting buddy feedback • The way teams were formed • Course support for team work • Prepared to do team project • Self-directed media-based learning experience • WWILD Team • HL Classroom • WebCT • Communication between instructor/students • Dick, Carey, & Carey textbook
Open Discussion • Daily class organization • IDAs • Giving buddy feedback • Getting buddy feedback • The way teams were formed • Course support for team work • Prepared to do team project • Self-directed media-based learning experience • WWILD Team • HL Classroom • WebCT • Communication between instructor/students • Dick, Carey, & Carey textbook • Other?
Things to keep | Ways to Improve • Daily class organization • IDAs • Giving buddy feedback • Getting buddy feedback • The way teams were formed • Course support for team work • Prepared to do team project • Self-directed media-based learning experience • WWILD Team • HL Classroom • WebCT • Communication between instructor/students • Dick, Carey, & Carey textbook • Other
Debriefing of… • WWILD Team • Remember, you don’t have to “reinvent the wheel”; it’s OK to use material from the Internet • Think of these as “learning objects” (very hot topic right now in eLearning circles) • Of course, be sure to adhere to copyright laws.
Brief Review… Designing and Conducting Formative Evaluations
[Dick & Carey's model, shown as a diagram: Assess need to identify goal(s) → Conduct instructional analysis / Analyze learners and contexts → Write performance objectives → Develop assessment instruments → Develop instructional strategy → Develop and select instructional materials → Design and conduct formative evaluation → Revise instruction (feedback loop); Design and conduct summative evaluation]
The Concepts of Formative Evaluation • Definition: the collection of data and information during the development of instruction that can be used to improve the effectiveness of the instruction. • Purpose: to obtain data that can be used to revise the instruction to make it more efficient and effective.
Formative Evaluation Helps to Answer the Following Questions • How effective is this instruction at this stage of development? • What has been learned? • How usable is the instruction? • How easy is it for students to use the media I've developed? • How motivational is the instruction? • In what ways can it be improved? • Improvement is the goal of formative evaluation. After all, your instruction is at a very "formative" period, is it not?
What Data Should I Collect? • Be very open to collecting any data that will help you answer the questions on the previous slide. • Don't be defensive as a designer – expect improvements to be needed. • The sooner you begin the evaluation process, the less costly the revisions will be. • Imagine trying to persuade the most skeptical person of your lesson's effectiveness. • Be your own worst critic.
Evaluation and Research Use Similar Methods • A variety of data: quantitative and qualitative • Triangulation: do all data point to the same interpretations? • Quantitative: based on numbers • Carefully designed instruments that can be scored • Qualitative: based on words • YOU are the instrument! • Careful observation • More focus on "why" questions
The Role of Subject-Matter, Learning, and Learner Specialists • It's important to have the instruction reviewed by specialists. • A subject-matter expert (SME) may be able to comment on the accuracy and currency of the instruction. • A learning specialist may be able to critique your instruction in light of what is known about enhancing that particular type of learning. • A learner specialist may be able to provide insights into the appropriateness of the material for the eventual performance context.
The Three Phases of Formative Evaluation • One-to-One Evaluation • Small-Group Evaluation • Field Trial
One-to-One Evaluation • Purpose: to identify and remove the most obvious errors in the instruction, and to obtain initial performance indications and reactions to the content by learners • Criteria: clarity, impact, feasibility
Small-Group Evaluation Purposes • To determine the effectiveness of changes made following the one-to-one evaluation. • To identify any remaining learning problems that learners may have. • To determine whether learners can use the instruction without interacting with the instructor.
Field Trial • Purpose: to determine whether the changes/revisions made in the instruction after the small-group stage were effective, and to see whether the instruction can be used in the context for which it was intended.
Constructing Good Assessments • Remember, a well-written objective ALMOST IS the assessment.
Assessing Physics Understanding Pretend there is no friction or gravity. If a ball is moving to the right and its acceleration is also to the right, which of the following is true? • The ball’s speed is not changing. • The ball’s speed is increasing. • The ball’s speed is decreasing. • The ball’s speed increases at first, and then decreases. • None of the above are true.
Assessing Physics Understanding If the speedometer needle of a car moved at a steady rate from the 30 mph mark to the 40 mph mark over a stretch of flat, straight road, which of the following is true? • Acceleration was nonzero in the opposite direction the car was moving. • Acceleration was 0. • Acceleration was nonzero in the direction the car was moving. • Acceleration was nonzero, but decreasing. • Acceleration was nonzero and increasing.
Assessing Physics Understanding • [Diagram: the ball's path with points labeled A through E] • Imagine that you threw a ball up into the air and it just left your hand at point A. Describe the motion of the ball and all the forces acting on it at each point.
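As a hedged aside on the first of these items (not part of the original slides): with no friction or gravity, a ball moving right whose acceleration also points right must speed up. The tiny constant-acceleration simulation below illustrates that; the initial speed, acceleration value, and step size are all made up for illustration.

```python
# Illustrative sketch only: 1-D constant-acceleration kinematics.
# Velocity and acceleration both point right (positive), so the
# speed |v| should grow at every step.

def simulate_speed(v0, accel, dt=0.1, steps=10):
    """Return the speed |v| after each step of constant acceleration."""
    v = v0
    speeds = []
    for _ in range(steps):
        v += accel * dt          # constant acceleration updates velocity
        speeds.append(abs(v))
    return speeds

speeds = simulate_speed(v0=2.0, accel=1.0)
# Speed is strictly increasing, consistent with the keyed answer
# "the ball's speed is increasing."
assert all(s2 > s1 for s1, s2 in zip(speeds, speeds[1:]))
```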
Formative vs. Summative Evaluation • The purpose of formative evaluation is to improve instruction by gathering data for revisions. • The purpose of summative evaluation is to prove the worth of the instruction, given that it will not be revised.
Questions? • Go ahead and enter question in message field, or… • Click and wait for my prompt to speak.
Responsibility of Each Team • Course design needs to be complete • Unit design: just choose one unit to design fully • Lesson design scope modification: how many lessons to design, develop, and field test? • Original: design as many lessons as there are team members • Revised to… • 5-6 team members: 3 lessons • 4 or fewer team members: 2 lessons
Responsibility of Each Team • Identify lesson objective(s). • Prepare assessment instruments. • Consider both quantitative and qualitative methods/instruments. • Check evaluation instruments for validity (i.e., are they congruent with objectives?) and reliability. • Consider both performance and motivation in your evaluation. • Be open to collecting any other data that will serve to improve your instruction (including observation and learner introspection). • Prepare each lesson using the Instructional Strategy Planning Guide as a job aid. • Each lesson must be evaluated with at least 3 students in the target audience. • Interpret your formative evaluation based on all assessment instruments and observations. • Report the results in your final report.
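The reliability check mentioned above can be estimated quantitatively. As a hedged illustration (not part of the course materials), KR-20 is one common internal-consistency estimate for assessments scored correct/incorrect; the score matrix below is entirely fabricated.

```python
# Illustrative sketch: Kuder-Richardson 20 reliability for a
# dichotomously scored assessment. Rows = learners, columns = items,
# 1 = correct, 0 = incorrect. The data are invented for this example.

def kr20(scores):
    """KR-20 for a list of per-learner 0/1 item-score lists."""
    n = len(scores)              # number of learners
    k = len(scores[0])           # number of items
    totals = [sum(row) for row in scores]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n   # population variance
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in scores) / n        # proportion correct on item j
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var)

scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(kr20(scores), 2))  # prints 0.8 for this fabricated data
```

Values near 1.0 suggest the items hang together; very low values suggest the instrument needs revision before you trust its scores.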
Questions? • Go ahead and enter question in message field, or… • Click and wait for my prompt to speak.
[Dick & Carey's model, shown as a diagram: Assess need to identify goal(s) → Conduct instructional analysis / Analyze learners and contexts → Write performance objectives → Develop assessment instruments → Develop instructional strategy → Develop and select instructional materials → Design and conduct formative evaluation → Revise instruction (feedback loop); Design and conduct summative evaluation]
This Time… • Summarizing and analyzing data obtained from formative evaluation • Revising materials
Two Basic Types of Revision • The changes that are made to the content of the materials • The changes that are related to the procedures employed in using the materials
Do We Need To Make Revisions? • The changes that are made to the content of the materials - NO • The changes that are related to the procedures employed in using the materials – YES, within practical limits
Kinds Of Data To Analyze • Learner characteristics • Entry behavior • Direct responses to the instruction • Learning time • Posttest performance • Responses to an attitude questionnaire • Comments made directly in the materials
Analyzing Data from One-to-One Trials The designer must look at the similarities and differences among the responses of the learners, and determine the best changes to make in the instruction.
Analyzing Data from One-to-One Trials Three Sources Of Suggestions For Changes • Learner suggestions • Learner performance • Your own reactions to the instruction
Analyzing Data from Small Groupand Field Trials The fundamental unit of analysis for all the assessments is the individual assessment item. Performance on each item must be scored as correct or incorrect.
Analyzing Data from Small Groupand Field Trials Methods For Summarizing Data • Item-by-objective performance • Graphing learners’ performance • Descriptive fashion
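The first of these methods, item-by-objective performance, can be sketched concretely. The snippet below is an illustrative assumption, not from the textbook: it takes item scores already marked correct (1) or incorrect (0), plus a mapping from items to objectives, and tabulates the percentage correct per objective. All learner names, item names, and scores are invented.

```python
# Hedged sketch of an item-by-objective performance summary.
# Input: per-learner item scores (1 = correct, 0 = incorrect) and a
# mapping from each item to the objective it assesses.

from collections import defaultdict

def item_by_objective(responses, item_to_objective):
    """Return the percentage of responses correct for each objective."""
    attempted = defaultdict(int)
    correct = defaultdict(int)
    for answers in responses.values():
        for item, score in answers.items():
            obj = item_to_objective[item]
            attempted[obj] += 1
            correct[obj] += score
    return {obj: 100 * correct[obj] / attempted[obj] for obj in attempted}

responses = {
    "learner1": {"q1": 1, "q2": 1, "q3": 0},
    "learner2": {"q1": 1, "q2": 0, "q3": 0},
    "learner3": {"q1": 1, "q2": 1, "q3": 1},
}
item_to_objective = {"q1": "objective1", "q2": "objective1", "q3": "objective2"}
summary = item_by_objective(responses, item_to_objective)
# objective1: 5 of 6 responses correct; objective2: 1 of 3
```

A table like this points revision effort at the objectives where most learners struggled, rather than at individual items in isolation.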
Analyzing Data from Small Group and Field Trials • Another Method For Summarizing Data • Comments can be captured in charts that list out the comments made by each learner
Analyzing Data from Small Group and Field Trials • Another Method For Summarizing Data • Assessment scores can be shown in charts or hierarchies that represent your individual objectives
Analyzing Data from Small Group and Field Trials • Another Method for Summarizing Data • Results from attitude surveys can be placed in an attitude table.
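An attitude table like the one mentioned above can be built mechanically from raw survey responses. The sketch below assumes a Likert scale (1 = strongly disagree … 5 = strongly agree); the question wording and ratings are fabricated for illustration.

```python
# Hedged sketch of an attitude table: for each survey question, report
# the mean rating and the count of responses at each scale point (1-5).
# Questions and ratings below are invented example data.

def attitude_table(responses):
    """responses: {question: [ratings]} -> {question: (mean, counts for 1..5)}."""
    table = {}
    for question, ratings in responses.items():
        counts = [ratings.count(value) for value in range(1, 6)]
        table[question] = (sum(ratings) / len(ratings), counts)
    return table

responses = {
    "The lesson held my interest": [4, 5, 3, 4],
    "The directions were clear": [2, 3, 3, 4],
}
for question, (mean, counts) in attitude_table(responses).items():
    print(f"{question}: mean={mean:.1f}, counts 1-5={counts}")
```

Reporting the full distribution alongside the mean matters: a mean of 3.0 from all-3s signals something different from a mean of 3.0 split between 1s and 5s.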
Revising Materials Use the data, your experience, and sound learning principles as the bases for your revision.
Revising Selected Materials • Omit portions of the instruction. • Include other available materials. • Simply develop supplementary instruction.