Beyond Problem-solving: Student-adaptive Interactive Simulations for Math and Science • Cristina Conati • Department of Computer Science • University of British Columbia
Overview • Motivations • Challenges of devising student-adaptive simulations • Two examples of how we target these challenges • ACE: interactive simulation for mathematical functions • CSP Applet: interactive simulation for AI algorithm • Conclusions and Future work
Intelligent Tutoring Systems (ITS) • Create computer-based tools that support individual learners by autonomously and intelligently adapting to their specific needs • (Architecture diagram: Tutor + Student Model + Domain Model → Adaptive Interventions → Student)
ITS Achievements • In the last 20 years, there have been many successful initiatives in devising Intelligent Tutoring Systems (Woolf 2009, Building Intelligent Interactive Tutors, Morgan Kaufmann) • Mainly ITS that provide individualized support to problem solving through tutor-led interaction (coached problem solving) • Well-defined problem solutions => guidance on problem-solving steps • Clear definition of correctness => basis for feedback
Beyond Coached Problem Solving • Coached problem solving is a very important component of learning • Other forms of instruction, however, can help learners acquire the target skills and abilities • At different stages of the learning process • For learners with specific needs and preferences • Our Goal: Extend ITS to other learning activities that support student initiative and engagement: • Interactive Simulations • Educational Games
Overview • Motivations • Challenges of devising student-adaptive simulations • Two examples of how we target these challenges • ACE: interactive simulation for mathematical functions • CSP Applet: interactive simulation for AI algorithm • Conclusions and Future work
Challenges • Activities more open-ended and less well-defined than pure problem solving • No clear definition of correct/successful behavior • Different user states to be captured (meta-cognitive, affective) in order to provide good tutorial interventions • difficult to assess unobtrusively from interaction events • How to model what the student is doing? • How to provide feedback that fosters learning while maintaining student initiative and engagement?
Our Approach • Student models based on formal methods for probabilistic reasoning and machine learning • Increase information available to student model through innovative input devices: • e.g.eye-tracking and physiological sensors • Iterative model design and evaluation
Overview • Motivations • Challenges of devising student-adaptive simulations • Two examples of how we target these challenges • ACE: interactive simulation for mathematical functions • CSP Applet: interactive simulation for AI algorithm • Conclusions and Future work
ACE: Adaptive Coach for Exploration (Bunt, Conati, Huggett, Muldner, AIED 2001) • Activities organized into units to explore mathematical functions (e.g. input/output, equation/plot) • Probabilistic student model that captures student exploratory behavior and other relevant traits • Tutoring agent that generates tailored suggestions to improve student exploration/learning when necessary
Adaptive Coach for Exploration (EDM 2010) • (Screenshot of the ACE interface)
Adaptive Coach for Exploration • Sample coach hint: "Before you leave this exercise, why don't you try scaling the function by a large negative value? Think about how this will affect the plot."
ACE Student Model (Bunt and Conati 2002) • Iterative process of design and evaluation • Probabilistic model of how individual exploration actions influence exploration and understanding of exercises and concepts • e.g. (in Plot unit): positive/negative slope, positive/negative intercept, large/small and positive/negative exponents… • (Model structure: Individual Exploration Cases → Exploration Categories → Exploration of Exercises → Exploration of Units → Knowledge)
Modeling Student Exploration • Our first attempt (Bunt and Conati, 2002) • (Diagram: Interface Actions → Student Model → Learning) • Evidence: number and coverage of exploratory actions, e.g. positive/negative Y-intercept, odd/even and positive/negative exponent…
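The probabilistic link between exploration actions and learning can be illustrated with a single Bayesian update. This is a minimal sketch, not ACE's actual model: the probabilities and the single-concept structure are hypothetical illustration values (the real model is a larger network over exploration cases, exercises, and units).

```python
# Minimal sketch (NOT ACE's actual network): one Bayesian update of the
# belief that a student understands a concept, given one exploration action.
# All probability values below are hypothetical.

def update_understanding(prior, p_action_given_understood, p_action_given_not):
    """Posterior P(understood | action) via Bayes' rule."""
    num = p_action_given_understood * prior
    den = num + p_action_given_not * (1.0 - prior)
    return num / den

# A relevant exploration action (e.g. trying a negative slope) is assumed
# more likely if the student understands the concept being explored.
posterior = update_understanding(prior=0.5,
                                 p_action_given_understood=0.8,
                                 p_action_given_not=0.3)
print(round(posterior, 3))  # 0.4 / 0.55 ≈ 0.727
```

Chaining such updates over many actions is what lets the model accumulate evidence of effective exploration rather than judging any single action.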
Preliminary Evaluation • Quasi-experimental design with 13 participants using ACE (Bunt and Conati 2002) • The more exercises were effectively explored according to the student model, the more the students improved • The more hints students followed, the more they learned • Because the model only considers coverage of student actions, it can overestimate student exploration • Need to consider whether the student is reasoning about the effects of his/her actions, i.e. the self-explanation meta-cognitive skill
Revised User Model (Bunt, Muldner and Conati, ITS 2004; Merten and Conati, Knowledge-Based Systems 2007) • (Diagram: Interface Actions + input from eye-tracker → Student Model → Learning) • Evidence: number and coverage of student actions • Self-explanation of action outcomes • Time between actions • Gaze shifts in Plot unit
Results on Accuracy • We evaluated the complete model against • The original model with no self-explanation • A model that uses only time between actions as evidence of self-explanation
What’s Next (1) • Test adaptive interventions to trigger self-explanation (Conati 2011)
Discussion • ACE work provided evidence that • It is possible to track more "open-ended" student behaviors than structured problem solving • Eye-tracking can support the process • However, hand-coding the relevant behaviors, as we did for ACE (knowledge-based approach) • is time-consuming • likely to miss other, less intuitive patterns of interaction related to learning (or lack thereof)
Alternative Approach (Amershi and Conati 2009, Kardan and Conati 2011) • Behavior discovery via data mining • Pipeline: action logs and other data → vectors of interaction features (frequency of actions, latency between actions, …) → clustering (groups together students that have similar interaction behaviors) → association rule mining (extracts rules describing distinguishing patterns in each cluster) → interpretation in terms of learning, using experts and performance measure(s)
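The clustering step of this pipeline can be sketched in a few lines. This is a hedged illustration, not the published method: the papers' actual algorithm, distance measure, and features may differ, and the two-dimensional data points below are invented.

```python
# Sketch of the behavior-discovery clustering step: a plain 2-means loop
# over students' interaction feature vectors. Data and dimensionality are
# hypothetical; the real feature vectors are higher-dimensional.
import math

def kmeans2(points, iters=20):
    """Partition 2-D feature vectors into two clusters; return centroids and groups."""
    c0, c1 = points[0], points[-1]            # naive initialization
    g0, g1 = [], []
    for _ in range(iters):
        g0, g1 = [], []
        for p in points:
            (g0 if math.dist(p, c0) <= math.dist(p, c1) else g1).append(p)
        c0 = tuple(sum(x) / len(g0) for x in zip(*g0))
        c1 = tuple(sum(x) / len(g1) for x in zip(*g1))
    return c0, c1, g0, g1

# Each vector: (frequency of some action, mean pause after it) -- made up.
students = [(0.9, 5.0), (0.8, 4.5), (0.85, 5.5),   # frequent, reflective use
            (0.1, 0.5), (0.2, 0.8), (0.15, 0.6)]   # sparse, hasty use
c0, c1, g0, g1 = kmeans2(students)
print(len(g0), len(g1))  # 3 3
```

Once clusters are found, each is checked against an external performance measure (here, learning gain) to decide which cluster represents effective behavior.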
Overview • Motivations • Challenges of devising student-adaptive simulations • Two examples of how we target these challenges • ACE: interactive simulation for mathematical functions • CSP Applet: interactive simulation for AI algorithm • Conclusions and Future work
Tested with AISpace CSP applet • AISpace (Amershi et al., 2007) • Set of applets implementing interactive simulations of common Artificial Intelligence algorithms • Used regularly in our AI courses • Google "AISpace" if you want to try it out • Applet for Constraint Satisfaction Problems (CSP); visualizes the working of the AC-3 algorithm
AISpace CSP Applet • (Screenshot highlighting the Direct Arc Click action)
User Study (Kardan and Conati 2011) • 65 subjects • Read intro material on the AC-3 algorithm • Pre test • Use CSP applet on two problems • Post test • 13,078 actions • More than 17 hours of interaction
Behavior Discovery: Dataset (feature vectors → clustering → rule mining) • Features: frequency of use for each action; pause duration between actions (mean and SD) • 7 actions → 21 features • Performance measure for validation: learning gain from pre-test to post-test
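The "7 actions → 21 features" arithmetic (frequency plus pause mean and SD per action) can be made concrete with a small feature-extraction sketch. The log format and action names below are hypothetical, not the applet's actual logging schema.

```python
# Sketch of building the slide's feature vector from an action log:
# per-action frequency plus mean and SD of the pause before the next action
# (3 features per action, so 7 actions give 21 features).
from statistics import mean, pstdev

def feature_vector(log, action_names):
    """log: list of (timestamp_sec, action_name), sorted by time."""
    total = len(log)
    pauses = {a: [] for a in action_names}
    for (t, a), (t_next, _) in zip(log, log[1:]):
        pauses[a].append(t_next - t)        # pause until the next action
    feats = []
    for a in action_names:
        freq = sum(1 for _, act in log if act == a) / total
        ps = pauses[a]
        feats += [freq,
                  mean(ps) if ps else 0.0,
                  pstdev(ps) if len(ps) > 1 else 0.0]
    return feats

# Hypothetical 4-action log with two action types (timestamps in seconds).
log = [(0, "DirectArcClick"), (2, "Step"), (7, "DirectArcClick"), (8, "Step")]
print(feature_vector(log, ["DirectArcClick", "Step"]))
# [0.5, 1.5, 0.5, 0.5, 5.0, 0.0]
```

With the applet's 7 action types, the same function would return the 21-element vector the slide describes.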
Behavior Discovery: Clustering (feature vectors → clustering → rule mining) • Found 2 clusters • Statistically significant difference in learning gains (LG) • High Learners (HL) and Low Learners (LL) clusters
Behavior Discovery Usefulness: Sample Rules (feature vectors → clustering → rule mining) • HL members: use the Direct Arc Click action very frequently (R1) • LL members: use Direct Arc Click sparsely (R3) • Leave little time between a Direct Arc Click and the next action (R2)
Great, but what do we do with this? • We can use the learned clusters and rules to classify a new student based on her behaviors • Use detected behaviours for adaptive support • Promoting the behaviours conducive to learning • Discouraging/preventing detrimental behaviours
The User Modeling Framework • Offline behavior discovery: action logs and other data → vectors of interaction features → clustering → association rule mining • Online user classification: new user's actions → feature vector calculation → online classifier → adaptive interventions • Example rule use: If user is a LL and pauses very briefly after a Direct Arc Click (R2), then take action to slow her down • If user is a LL and uses Direct Arc Click very infrequently (R3), then prompt this action
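The online half of the framework can be sketched as nearest-centroid classification followed by rule firing. Everything numeric here is invented for illustration: the real classifier, centroids, thresholds, and intervention wording come from the discovered clusters and mined rules, not from these stand-ins.

```python
# Sketch of the online step: classify a new student's current feature vector
# by nearest cluster centroid, then fire intervention rules attached to that
# cluster. Centroids, thresholds, and messages are hypothetical.
import math

CENTROIDS = {"HL": (0.9, 5.0), "LL": (0.15, 0.6)}  # invented (freq, mean pause)

RULES = [  # (cluster, condition on (freq, pause), intervention) -- invented
    ("LL", lambda f, p: p < 1.0, "Slow down: reflect after each Direct Arc Click"),
    ("LL", lambda f, p: f < 0.3, "Try the Direct Arc Click action more often"),
]

def classify(vec):
    """Nearest-centroid cluster label for the current feature vector."""
    return min(CENTROIDS, key=lambda c: math.dist(vec, CENTROIDS[c]))

def interventions(vec):
    cluster = classify(vec)
    freq, pause = vec
    return [msg for c, cond, msg in RULES if c == cluster and cond(freq, pause)]

student = (0.2, 0.7)          # sparse clicking, very short pauses
print(classify(student))      # LL
for msg in interventions(student):
    print(msg)                # both LL rules fire for this student
```

Because interventions are keyed to mined rules rather than hand-crafted behaviors, new rules can be added without redesigning the classifier.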
Classifier Evaluation • Leave-one-out Cross Validation on dataset of 64 users • For each user u in dataset • Remove user u • do Behaviour Discovery on the remaining 63 • for each of u’s actions: • Calculate the feature vector uv • Classify uv • Compare with u’s original label
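The leave-one-out loop above translates directly into code. This skeleton is a sketch under stated assumptions: `discover_behaviors` and `classify` are trivial stubs standing in for the clustering/rule-mining and online-classification steps, and the per-action classification of the real evaluation is collapsed to one prediction per user.

```python
# Runnable skeleton of the leave-one-out cross-validation described above.
# The discovery and classification steps are stubs (majority-label "model"),
# so the loop runs end to end; the real system rebuilds clusters and rules.

def discover_behaviors(users):
    """Stub: the 'model' is just the majority label of the training users."""
    labels = [label for _, label in users]
    return round(sum(labels) / len(labels))

def classify(model, user_actions):
    return model  # stub classifier ignores the actions

def leave_one_out(dataset):
    correct = 0
    for i, (actions, label) in enumerate(dataset):
        training = dataset[:i] + dataset[i + 1:]   # remove user u
        model = discover_behaviors(training)       # discovery on the other users
        if classify(model, actions) == label:      # compare with u's label
            correct += 1
    return correct / len(dataset)

# Hypothetical 7-user dataset: 5 labeled 1 (e.g. HL), 2 labeled 0 (e.g. LL).
dataset = [([], 1)] * 5 + [([], 0)] * 2
print(leave_one_out(dataset))
```

Removing the user before rediscovering behaviors is the important part: it ensures the held-out user's own actions never influence the clusters used to classify them.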
Discussion • User modeling framework for open-ended and unstructured interactions • Relevant behaviours are discovered via data mining techniques instead of being hand-crafted • Very encouraging results with CSP applet • Detected clusters represent groups with different learning gains • Online classifier: good accuracy soon enough to generate adaptive interventions • These interventions can be derived from the generated rules
Current Work • Applying the discovered rules to generate the adaptive version of the CSP applet • Adding eye-tracking input to the dataset
Conclusions • Research on devising student-adaptive didactic support for exploratory activities beyond problem solving • Interactive simulations • Challenges in modeling interactions with no clear structure or definition of correctness • Student modeling approaches based on probabilistic techniques and unsupervised machine learning • very promising results • Shown how eye-tracking can help! • We are also exploring it in relation to assessing engagement and attention in educational games (Muir and Conati 2011)