An introduction to intelligent interactive instructional systems Kurt VanLehn ASU
Outline Tutoring systems • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination Other interactive instructional systems
Intelligent “tutoring” system is a misnomer • Almost all are used as seatwork/homework coaches • The instructor still… • Lectures • Leads whole class, small group & lab activities • Assigns grades; defends grades • Can assign homework / seatwork problems • or delegate to the tutoring system • The instructor no longer… • Grades homework / seatwork • Tests? For-profit web-based homework grading services are growing rapidly
If students enter only the answer, call it answer-based tutoring. [Diagram: a geometry figure with angles 30°, 40°, 45°, x° and y°; prompt: "What is the value of x?"; the student enters only the answer.]
If students enter steps that derive the answer, call it step-based tutoring. [Same geometry figure; the student enters a sequence of steps leading to the answer.]
Def: Feedback is a comment on one of the student's steps. [Same figure; the tutor marks one step "OK" and comments on another: "Oops! Check your arithmetic."]
Feedback is often given as a hint sequence. [Same figure; first hint: "Oops! Check your arithmetic."]
Hints become more specific. [Next hint: "You seem to have made a sign error."]
Hints segue from commenting on the student's step to suggesting a better step. [Next hint: "Try taking a smaller step."]
…and become more specific. [Next hint: "Try doing just one arithmetic operation per step."]
Def: A bottom-out hint is the last hint, which tells the student what to enter. [Final hint: "Enter 70+y=180, and keep going from there."]
Def: A next-step help request is another way to start up a hint sequence. [Same figure; the student asks for help and the tutor replies: "Try doing just one arithmetic operation per step."]
Delayed (as opposed to immediate) feedback occurs when the solution is submitted. [Same figure; after submission the tutor marks some steps "OK" and comments on others: "Oops! Check your arithmetic." and "Can an angle measure be negative?"]
Both step-based tutors and answer-based tutors have a task loop • Tutor and/or student select a task • Tutor poses it to the student • Student does the task and submits an answer • If answer-based tutor, then work offline • If step-based tutor, then work online • The step-loop = Do step; get feedback/hints; repeat • Repeat
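The two nested loops above can be sketched in code. This is a toy illustration, not any real tutor's architecture; the names (`run_tutor`, `toy_student`) and the generator-based student are assumptions made for the sketch.

```python
# Minimal sketch of the task loop (outer) and step loop (inner).
# All names here are hypothetical stand-ins for tutor and student.

def run_tutor(tasks, solve_step):
    """Task loop: pose each task; step loop: take steps, give feedback."""
    log = []
    for task in tasks:                      # task loop
        steps = []
        for step in solve_step(task):       # step loop: do step, get feedback
            feedback = "OK" if step["correct"] else "Oops!"
            steps.append((step["entry"], feedback))
        log.append((task, steps))
    return log

# Toy student: takes two steps, the first one incorrect.
def toy_student(task):
    yield {"entry": "70+y=190", "correct": False}
    yield {"entry": "70+y=180", "correct": True}

log = run_tutor(["What is the value of x?"], toy_student)
```

An answer-based tutor would skip the inner loop entirely: the student works offline and only the final answer is submitted and checked.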
Technical terms/concepts (so far) • Answer-based tutoring system (= CAI, CBI, …) • Step-based tutoring system (= ITS, ICAI…) • Step • Next-step help • Feedback • Immediate • Delayed • Hint sequence • Bottom-out hint • Task loop • Step loop
Andes user interface. [Screenshot annotations: read a physics problem, draw vectors, type in equations, type in the answer.]
Andes feedback and hints. [Screenshot: "What should I do next?" and "What's wrong with that?" buttons; green means correct, red means incorrect; dialogue & hints.]
SQL-Tutor (Addison-Wesley). [Screenshot: the problem, a sequence of steps, a Submit! button, feedback, and the database that the problem refers to.]
Cognitive Algebra I Tutor (Carnegie Learning). [Screenshot with labeled steps: enter an equation, divide both sides, label a column, define an axis, fill in a cell, plot a point.]
AutoTutor. [Screenshot: the task and a tutorial dialogue.] Each tutor turn + student turn in the dialogue is a step; the student's input is the 2nd half of the step.
Introduction: Summary • Main ideas • Task loop over tasks • Step loop over steps of a task • Feedback can be immediate or delayed • But it focuses on steps • Hint sequence • Types of tutoring systems • Step-based tutors (ITS) – both loops • Answer-based tutors (CBT, CAI, etc) – task loop only
Initial framework • Step loop • User interface • Interpreting student actions • Suggesting good actions • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Initial framework • Step loop • User interface • Forms, with boxes to be filled • Dialogue • Simulation • Etc. • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Initial framework • Step loop • User interface • Interpreting student steps • Equations • Typed natural language • Actions in a simulation • Etc. • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Any correct path vs. shortest path to answer • Which steps can be skipped? • Recognize the student’s plan and suggest its next step? • Etc. • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Give a hint before the student attempts a step? • Immediate vs. delayed feedback? feedback on request? • How long a hint sequence? When to bottom out immediately? • Etc. • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Keeping the student in the "zone of proximal development" (ZPD) • Mastery learning: keep giving similar tasks until the student masters them • Choosing a task that suits the learner's style/attributes • Etc. • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Assessment vs. Evaluation • “Assessment” of students • What does the student know? • How motivated/interested is the student? • “Evaluation” of instructional treatments • Was the treatment implemented as intended? • Did it produce learning gains in most students? • Did it produce motivation gains in most students? • What is the time cost? Other costs?
Assessment consists of fitting a model to data about the student • Single factor model: a single number representing competence/knowledge • Probability of a correct answer on a test item = f(competence(student), difficulty(item)) • Knowledge component model: one number per knowledge component representing its mastery • Probability of a correct answer on a test item = f(mastery(KC1), mastery(KC2), mastery(KC3), …), where KC1…KCn are the ones applied in a correct solution
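The slide leaves f unspecified. A sketch of two common concrete choices follows; the Rasch (1PL) form for the single-factor model and the product form for the knowledge-component model are illustrative assumptions, not necessarily what any particular tutor uses.

```python
import math

# Single factor model: one common choice of f is the Rasch (1PL) model,
# where the probability depends only on competence minus difficulty.
def p_correct_single_factor(competence, difficulty):
    return 1.0 / (1.0 + math.exp(-(competence - difficulty)))

# Knowledge component model: a correct answer requires every KC applied
# in the solution, so one simple f multiplies the per-KC probabilities.
def p_correct_kc(masteries):
    """masteries: probability that each required KC is applied correctly."""
    p = 1.0
    for m in masteries:
        p *= m
    return p
```

For example, a student whose competence exactly matches the item's difficulty has a 0.5 chance of answering correctly under the Rasch form.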
Example: Answer-based assessment of algebraic equation solving skill • Test item: Solve 3+2x=10 for x • KC5: Subtract from both sides & simplify: 3+2x=10 → 2x=7 • KC8: Divide both sides & simplify: 2x=7 → x=3.5 • Single factor model • If the answer is correct, increment competence, else decrement • Knowledge component model • If the answer is correct, increment mastery of KC5 & KC8 • If the answer is incorrect, decrement mastery of KC5 & KC8 • The weakest one is most likely to be the failure, so decrement it more
Step-based assessment of algebraic equation solving skill • Solve 3+2x=10 for x • Step 1: 2x = 7 • Step 2: x = 3.5 • Single factor model: • Whenever a step is answered correctly without hints, increment competence, else decrement • Knowledge component model: • Whenever a step is answered correctly without hints, increment its KC's mastery, else decrement
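The increment/decrement bookkeeping on the two slides above can be sketched as follows. The update sizes (0.1, doubled for the weakest KC) are arbitrary illustrations, not values from any real system.

```python
# Sketch of knowledge-component mastery updates after one step or answer.
# mastery: dict KC -> number in [0, 1]; kcs: the KCs applied here.

def update_kc_masteries(mastery, kcs, correct, up=0.1, down=0.1):
    if correct:
        for kc in kcs:
            mastery[kc] = min(1.0, mastery[kc] + up)
    else:
        # The weakest KC is the most likely failure, so decrement it more.
        weakest = min(kcs, key=lambda kc: mastery[kc])
        for kc in kcs:
            penalty = 2 * down if kc == weakest else down
            mastery[kc] = max(0.0, mastery[kc] - penalty)
    return mastery
```

For the algebra example: after an incorrect answer to "Solve 3+2x=10", both KC5 and KC8 are decremented, with the weaker of the two penalized twice as much.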
Task selection uses assessments • Single factor model • Choose a task that is the right level of difficulty i.e.,in the ZPD (zone of proximal development) of the student • Knowledge component model • Choose a task whose solution uses mostly mastered KCs, and only a few KCs that need to be mastered
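The knowledge-component selection policy above can be sketched as picking the task closest to exercising a small number of unmastered KCs. The 0.95 mastery threshold, the target of one unmastered KC per task, and all names are assumptions made for illustration.

```python
# Sketch of KC-based task selection: prefer a task whose solution uses
# mostly mastered KCs and only a few KCs still to be mastered (the ZPD).

def pick_next_task(tasks, mastery, threshold=0.95, target_unmastered=1):
    """tasks: dict task -> list of KCs its solution applies."""
    def unmastered_count(task):
        return sum(1 for kc in tasks[task] if mastery.get(kc, 0.0) < threshold)
    return min(tasks, key=lambda t: abs(unmastered_count(t) - target_unmastered))

mastery = {"KC5": 0.99, "KC8": 0.2}
tasks = {"easy": ["KC5"],
         "right": ["KC5", "KC8"],
         "hard": ["KC8", "KC9", "KC10"]}
chosen = pick_next_task(tasks, mastery)
```

Here "easy" exercises nothing new, "hard" requires three unmastered KCs, and "right" stretches the student by exactly one KC, so it is chosen.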
Other assessment issues • Other decisions, besides task selection, that can use assessment? • Assessment of motivation or interest? • Assessment of learning styles? Disabilities? • Diagnosis of misconceptions? Bugs?
Should a "Skillometer" display knowledge component mastery to the student?
Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Authoring • Author creates new tasks • Author generates all solutions? • System generates all solutions? • Same taste as author? • Can author add new problem-solving knowledge? • Who can be an author? • Instructors? • Professional authors? • Knowledge engineers?
Software architecture & engineering • Client-server issues • Platform independence • Integration with learning management systems • E.g., Blackboard, WebAssign, many others • Cheating, privacy • Quality assurance • Software bugs • Content & pedagogy bugs
Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Types of evaluations • Analyses of expert human tutors • What do they do that the system should emulate? • Formative evaluation • What behaviors of the system need to be fixed? • Have students talk aloud, interviews; teachers… • Summative evaluation • Is the system more effective than what it replaces? • Two condition experiment: System vs. control/baseline • Pre-test and post-test (+ other assessments) • Hypothesis testing • Why is the system effective? • Multi-condition experiments: System ±feature(s)
Example: Summative evaluation of the Andes Physics tutor • University physics (mechanics) 1 semester • 2 Conditions: Homework done with… • Andes physics tutor • Pencil & paper • Same teachers (sometimes), text, exams, labs • Results (2000-2003) in terms of effect sizes • Experimenter’s post-test: d=1.2 • Final exam: d=0.3 • d = (mean_Andes_score – mean_control_score) ÷ pooled_standard_deviation
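The effect size d defined on the slide (difference of means divided by the pooled standard deviation, i.e., Cohen's d) can be computed from two lists of scores as follows; this is a generic sketch, not the Andes evaluation code.

```python
import statistics

# Cohen's d: (mean_treatment - mean_control) / pooled standard deviation,
# with the pooled SD weighted by each group's degrees of freedom.
def cohens_d(treatment_scores, control_scores):
    n1, n2 = len(treatment_scores), len(control_scores)
    v1 = statistics.variance(treatment_scores)   # sample variance (ddof=1)
    v2 = statistics.variance(control_scores)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment_scores)
            - statistics.mean(control_scores)) / pooled_sd
```

By the usual rules of thumb, the d=1.2 on the experimenter's post-test is a large effect, while d=0.3 on the final exam is a small-to-medium one.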
[Graph: open-response, problem-solving exam scores plotted against grade-point average, with separate lines for the Andes and Control conditions.]
An ideal tutoring system adapts to the student's needs. [Graph: assistance provided (low to high) vs. assistance needed; the ideal tutor matches assistance provided to assistance needed — too much assistance leaves students bored & irritated, too little leaves them struggling, and the match yields large learning gains.] Assistance provided = task selection, feedback, hints, user interface…
A not-so-good tutoring system helps only some students. [Same graph: the not-so-good tutor's assistance stays flat regardless of need, so students who need little are bored & irritated while students who need much are left struggling; only students in the middle get large learning gains.] Assistance provided = task selection, feedback, hints, user interface…
Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination
Dissemination = getting the system into widespread use • Routes • Post and hope • Open source • Commercialization • Issues • Instructor acceptance • Instructor training • Student acceptance • Marketing
Next Outline Tutoring systems • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination Other interactive instructional systems
Other intelligent interactive instructional systems • Teachable agent • Student deliberately teaches the system, which is then assessed (in public) • Learning companion • Student works while system encourages • Peer learner • Student and system work & learn together • To be discovered…