RELATE: REsearch in Learning, Assessing, and Tutoring Electronically
RELATE.mit.edu
Postdocs: Phil Dukes, Sofia Morote, Rasil Warnakulasooriya
PI: Dave Pritchard
$: MIT, NSF, DEP
Also known as EET and CyberTutor.MIT in publications. An expert system based on educational expertise, not AI. The most advanced tutorial and assessment system in the world. Made by Effective Educational Technologies, a Pritchard company.
Outline • Objective • Pedagogy that Works • Feedback – Closed Loop Education • Research from MIT • Revolution in Assessment • Closing Thoughts
Digital Education Future?! A Perspective: Two-Way Radio vs. Broadcast Radio
• Interactive Student vs. Passive Class
• Stylized (e.g. audio) vs. Uniform Style
• Next Day vs. Next Edition
• Coach vs. Teacher
• Authors/Researchers vs. Author
• Embedded Assessment vs. High-Stakes Tests
Why Homework??
Teachers' priorities:
1. Lectures
2. Exams
3. Notes and Demonstrations
4. Homework
Students spend the most time on, and learn the most from:
1. Homework
TWO-WAY LEARNING
One way: books, lectures, most WWW education.
Two way: Students, Teachers, Authors, and Researchers learn from each other through DATA, EXPERIMENT, ANALYSIS, and CONCLUSIONS.
SOCRATIC LEARNING
[Diagram: Teacher, Student, System, and Authors/Researchers interacting]
Outline • Objective • Pedagogy that Works • Feedback – Closed Loop Education • Research from MIT • Revolution in Assessment • Closing Thoughts
Pedagogy: Design Philosophy of myCyberTutor
• Emulate the interaction between a human tutor and a student. The tutor informs the teacher. The process informs the author.
Results:
• an effective interactive learning tool
• you can author, deliver, and improve content
• an expert program embodying your expertise
Pedagogy: Student-Centered Instruction
• Mastery Learning: the amount learned should be constant and the time allowed should vary
• Constructivist: allow students to construct knowledge in their own way
• Others: Socratic pedagogy and learning styles (to be implemented)
Pedagogy: Pedagogical Principles
• Actively engage the student
• Adapt the problem to less skillful students with hints
• Prompt feedback addresses wrong answers
• Mastery Learning: >90% get the solution
• Declarative and procedural knowledge are both important: hints and subproblems
• Solidify and extend the solution: followups
• Free-response answers reduce guessing
Results from MIT: Gain on the MIT Final Exam, December 2000 to May 2001
[Chart; p-values: 0.69, 0.69, 0.35, 0.010]
Results from MIT: Gain on the Force Concept Inventory (data: C. Ogilvie, 2000)
FCI gain = 0.41 for the course
[Chart; p-values: 0.854, 0.807, 0.198, 0.087, 0.015]
Outline • Objective • Pedagogy that Works • Feedback – Closed Loop Education • Research from MIT • Revolution in Assessment • Closing Thoughts
Feedback – Closed Loop Education: Two-Way Learning
myCyberTutor interactions:
• The typical student returns to the server 10 times during the course of each problem (cf. WebAssign: ~4 times per assignment)
• Students achieve the correct answer 90% of the time (cf. ~60% first-time right)
• Students comment on ~3% of all problems, more if the problem has flaws
Feedback – Closed Loop Education: Student Comments
Feedback – Closed Loop Education: Wrong Answers
Feedback – Closed Loop Education: Feedback to the Author Improves Problems
• <90% correct: need more hints
• Wrong answers: respond to common ones
• Comments: revise wording, remove confusion, revise the program
• Time: is this problem worthwhile?
Feedback Enables Revisions that Improve Problems

             % Answering  % Requesting  Avg Wrong     Avg Hints  Median
             Correctly    Solutions     Answers/part  /part      Minutes/part
Spring 2001  83.5%        12.3%         1.51          0.76       1.5
Spring 2002  91.6%        6.0%          0.89          0.83       1.5
Fall 2003    93.4%

Room for even more improvement!!
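Per-problem metrics like those in the table can be aggregated from the interaction logs the system records. A minimal sketch in Python, assuming a hypothetical log format (one record per student-part attempt); the field and function names are illustrative, not taken from myCyberTutor:

```python
# Sketch: aggregating closed-loop revision metrics from interaction logs.
# The record schema below is a hypothetical illustration, not the actual
# myCyberTutor log format.
from statistics import median

# One record per (student, problem part): outcome and effort counters.
log = [
    {"correct": True,  "wrong_answers": 2, "hints": 1, "minutes": 1.5},
    {"correct": True,  "wrong_answers": 0, "hints": 0, "minutes": 0.8},
    {"correct": False, "wrong_answers": 3, "hints": 2, "minutes": 4.0},
    {"correct": True,  "wrong_answers": 1, "hints": 1, "minutes": 1.5},
]

def revision_metrics(records):
    """Compute a subset of the per-part metrics reported in the table."""
    n = len(records)
    return {
        "percent_correct": 100.0 * sum(r["correct"] for r in records) / n,
        "avg_wrong_per_part": sum(r["wrong_answers"] for r in records) / n,
        "avg_hints_per_part": sum(r["hints"] for r in records) / n,
        "median_minutes_per_part": median(r["minutes"] for r in records),
    }
```

Tracking these numbers term over term is what lets the author see whether a revision (more hints, responses to common wrong answers) actually moved the needle.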
Outline • Objective • Pedagogy that Works • Feedback – Closed Loop Education • Research from MIT • Revolution in Assessment • Closing Thoughts
RELATE: REsearch in Learning, Assessing, and Tutoring Electronically
RELATE.mit.edu
Postdocs: Phil Dukes, Sofia Morote, Rasil Warnakulasooriya
Attractions for Researchers
• Data with resolution
• Capability for split-class assignments
• Your Data Package:
  • log of your students' interactions
  • anonymous student # vs. name & ID for your class
  • database of your class's assignments
• Plus – General Data Package:
  • performance data: your class vs. standard
  • SML skeleton of each problem (subparts, hints, etc.)
  • format key
Inductive vs. Deductive Instruction
Inductive: students learn by doing a problem –
• from the hints
• from the subproblems
• by figuring it out from feedback
Does the learning transfer to tutorial questions??
Deductive: students learn from a tutorial –
• from the learning goal & text
• from the hints
• from the self-assessment questions
Does the learning transfer to a related problem??
Tutorials
• Tutorial problems in Mastering Physics are carefully planned and sequenced instruction with SAQs (self-assessment questions)
• They are used as instructional material to impart principles in deductive learning
Pedagogy: Problems
• Problems require a student to apply an already familiar concept, formula, or procedure
• Socratic help is available, including an explanation of the concept, a needed formula, etc.
• Related Problems cover the same topic as the adjacent tutorial
Results from MIT – Deductive: Related-Problem Difficulty Reduced by Working the Tutorial First
• Torque: p = 0.01*
• Newton's 3rd Law: p = 0.03*
• Harmonic Oscillator: p = 0.06*
(* after working the tutorial)
Results from MIT – Inductive: Is Tutorial Difficulty Reduced by Working the Related Problem First?
[Chart: Torque, Newton's 3rd Law, Harmonic Oscillator]
Results from MIT – Conclusion: Deductive Works
• Interactive tutorials significantly increase performance on subsequent related problems (~25% less difficult)
• Students don't learn inductively from a multi-part example
• We recommend using online tutorials in the old-fashioned way: as preparation for subsequent deductive exercises
[Chart: improvement per unit time, tutorial-first vs. problem-first, for Torque, Newton III, and SHM]
Twice as much learning per unit time spent on the tutorial compared with time spent on the preparatory problem.
• A prior related problem reduces the hints requested on the related problem by ~12% (based on 6 problems)
• A prior tutorial reduces the hints requested on the related problem by ~19% (based on 5 problems)
[Chart; p-values ranging from p < 0.1 to p < 0.005]
Time to Completion
The real-time environment allows us to study how long it takes students to work problems, whether good students do problems more quickly or more slowly, etc. We have discovered that there are three groups of students in time:
When Students Finish: Three Distinct Groups
• Quick solvers: < 2.5 minutes
• Real-time solvers: 2.5 min – 2.2 hours
• Interrupted solvers: > 2.2 hours
Note that:
1. The quick solvers do not make mistakes or ask for hints
2. The real-time solvers make mistakes and ask for hints
3. The interrupted solvers make mistakes and ask for hints
4. Fewer real-time solvers in the prepared group ask for hints
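The three-group split above is just a pair of thresholds on time to completion. A minimal sketch using the cutoffs quoted on the slide (2.5 minutes and 2.2 hours); the function and group names are illustrative, not from the RELATE/myCyberTutor code:

```python
# Sketch of the three-group timing classification described above.
# Thresholds come from the slide; names are illustrative assumptions.

QUICK_CUTOFF_MIN = 2.5              # quick solvers finish in < 2.5 minutes
INTERRUPTED_CUTOFF_MIN = 2.2 * 60   # interrupted solvers take > 2.2 hours

def classify_solver(minutes_to_completion: float) -> str:
    """Assign a student to one of the three timing groups."""
    if minutes_to_completion < QUICK_CUTOFF_MIN:
        return "quick"
    if minutes_to_completion > INTERRUPTED_CUTOFF_MIN:
        return "interrupted"
    return "real-time"

# Example: tally a set of completion times (in minutes) for one problem.
times = [1.2, 8.0, 45.0, 300.0, 2.0, 15.5]
counts = {}
for t in times:
    group = classify_solver(t)
    counts[group] = counts.get(group, 0) + 1
```

The "fraction of real-time solvers" quoted on the following slides is then just `counts["real-time"]` divided by the number of students.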
Fraction-finished curves with hints & feedback
[Chart; annotations: 44%, 64%]
For 14 problems: fraction of real-time solvers = 65 ± 4%
Time-to-completion curves without hints & feedback
[Chart; annotations: 35%, 28%]
For 3 typical homework problems: fraction of real-time solvers = 29 ± 3%
Outline • Objective • Pedagogy that Works • Feedback – Closed Loop Education • Research from MIT • Revolution in Assessment • Closing Thoughts
Future – Assessment: Low-Error Embedded Assessment
Imagine that a rich ship-owner has hired Socrates to tutor his children. At the end of the month he desires to assess the amount they have learned. Would you advise him to:
a) administer a standardized hour-long test to the children?
b) ask Socrates how much they have learned?
myCyberTutor assessment has ~100 times less variance due to error than a good final exam! It gives ~6 times as reliable an assessment per unit of student time as a good final exam!
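One hedged way to read the "~100 times less variance" figure (this reasoning is not on the slide, so treat it as an assumption): if the embedded assessment pools $N$ roughly independent item-level measurements, each with error variance $\sigma^2$, then the variance of the pooled estimate is

```latex
\operatorname{Var}(\bar{x}) \;=\; \frac{\sigma^2}{N}
```

so $N \approx 100$ embedded observations would account for a ~100-fold reduction in error variance relative to a single exam-like measurement. The 6-fold reliability gain per unit of student time would then follow from those observations accruing during homework the student is doing anyway, rather than during dedicated exam time.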
Future – Assessment: Embedded Assessment – myCyberTutor vs. the Final Exam
myCyberTutor…
• …is 6 times more reliable per unit time
• …has ~100 times less error variance
Future – Assessment: Detailed Skill Profile
Future – Assessment: Predicting Final Exam Score
Implication: myCyberTutor can replace tests
myCyberTutor Assessment Implies:
1) More accurate assessment
2) Fine-grained assessment on subtopics
3) Immediate remediation: select the next problem
4) JITT (Just-in-Time Teaching) guide for the teacher
5) Learning vs. avoiding lost points
6) Predict test scores: eliminate tests
7) An incredible tool for education research
8) Replace high-stakes tests
Outline • Objective • Pedagogy that Works • Feedback – Closed Loop Education • Research from MIT • Revolution in Assessment • Closing Thoughts
What you can gain:
• write interactive problems
• educational research on them
• educational research in general
What we can accomplish together:
• partnerships in each – ideally all!!