Systematic Evaluation of the Effectiveness of TREs through Software Platform Development for Data Mining across Multiple Disciplines and Tracking Changes in Affective and Cognitive Growths
Eunice Jang, PhD
Maryam Wagner
Overall Goals
• Use measurement variables from TRE interventions to assess, track, and evaluate affective and cognitive changes attributable to those interventions.
• Investigate factors that affect the effectiveness of feedback in order to optimize the TRE feedback system and maximize the learning experience.
Research Questions
• Do cognitive and learning outcomes change during learning with TREs?
• Do affective processes lead to better learning outcomes, such as increased knowledge acquisition, better communication skills, more effective decision-making, more sophisticated self-regulation, and enhanced transfer of knowledge?
• Do the relationships between affect and learning differ based on domain, TRE, and type of learner?
• Which types of scaffolding (human, computer, or both) best guide students’ learning and engagement in TREs across disciplines?
BioWorld System Analysis Data Sources
• Pre-existing data: students’ verbal protocols as they engage with BioWorld
• Students’ questionnaire data
• Student background information
• Computer logs
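Before any quantitative analysis, these sources would need to be joined at the participant level. A minimal sketch follows; the file names, column names, and participant_id key are illustrative assumptions, not the actual BioWorld export format.

```python
# Hypothetical sketch: joining the data sources on a shared participant ID.
# File and column names are assumptions for illustration only.
import pandas as pd

background = pd.read_csv("student_background.csv")   # demographics, prior knowledge
questionnaire = pd.read_csv("questionnaire.csv")     # affective self-report items
logs = pd.read_csv("computer_logs.csv")              # time-stamped BioWorld actions

# Aggregate event-level logs to one row per participant,
# e.g. number of actions and total time on task.
log_summary = (
    logs.groupby("participant_id")
        .agg(n_actions=("action", "count"),
             time_on_task=("duration_sec", "sum"))
        .reset_index()
)

# One analysis table with one row per participant.
analysis = (
    background.merge(questionnaire, on="participant_id", how="left")
              .merge(log_summary, on="participant_id", how="left")
)
print(analysis.head())
```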
BioWorld System Analysis Approach
• Identify moderating variables from student background and computer logs
• Develop latent class profiles that characterize distinct student learning styles (see the sketch below)
• Examine how learners with different profiles respond to feedback
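The profiling step could be prototyped roughly as follows. This is a hedged sketch only: the slides call for latent class profiles, while the example substitutes a Gaussian mixture model as a convenient stand-in, and the per-student features are simulated placeholders rather than real log or questionnaire variables.

```python
# Sketch: deriving learner profiles from per-student features.
# A Gaussian mixture model stands in for latent class analysis here;
# the feature matrix is simulated, not BioWorld data.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical features: e.g. time on task, evidence checks,
# hypothesis revisions, help requests.
X = rng.normal(size=(120, 4))
X = StandardScaler().fit_transform(X)

# Fit mixtures with 1-5 components and choose the number of profiles by BIC.
models = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 6)}
best_k = min(models, key=lambda k: models[k].bic(X))
profiles = models[best_k].predict(X)   # profile membership per student

print(f"selected {best_k} profiles")
print(np.bincount(profiles))           # class sizes
```

In a full analysis, a dedicated latent class model with categorical indicators would likely replace this stand-in, and profile membership would then be crossed with the moderating variables and with students’ responses to feedback.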
BioWorld System Preliminary Analysis Think Aloud Data
Identify:
• Cognitive skills elicited by participants
• Self-regulatory learning strategies elicited
• Affective variables exhibited
• Participants’ goal-orientations (task-orientation)
Cognitive Skills
Types of knowledge and skills participants used during engagement with BioWorld. In general:
• As participants read and became familiar with the cases, they elicited lower-level cognitive knowledge and skills (e.g., understanding explicitly stated information)
• Participants exhibited higher-level cognitive knowledge and skills as they began to interact with other features of BioWorld in order to create and test their hypotheses (e.g., evaluating)
Examples of Knowledge and Skills Elicited
• Understanding Explicitly Stated Information
“Okay, so her daily activities are further disturbed by having to urinate more frequently.”
• Making Connections
“So I would have thought, like, maybe like, menopause, but she’s too young for that so no.”
• Predicting/Hypothesizing
“Okay so, she started taking her meds for high blood pressure… maybe she’s having some kind of reaction to it”
• Evaluating/Appraising
“Um, pre-test value 9.0, two hour post test value is 14 that seems high. That seems really high to me. Does it give you standard values? Oh okay, normal test should be 6. Oh cool, okay so she’s diabetic. Great. Great, great, great, perfect.”
Examples of Self-Regulated Learning Strategies
• Planning (setting goals, activating knowledge)
“I would say…just…want to look at my evidence”
• Monitoring (metacognition, monitoring cognition)
“Can diabetes make you lose weight? I really don’t know. Nausea, vomiting, abdominal pain, I don’t think so. Oh no, oh but I can use fatigue”
• Controlling (selecting cognitive strategies)
“That’s not the order, I’ll have to change that. Um, I can do that after.”
• Reflecting (making cognitive judgments)
“Oh, I forgot the antibodies, of course, although, ya, I came from Argentina, and in Argentina, we use the very um, cheap tests first, but I should have thought of that too, the antibodies. I knew that and I completely missed it…”
Response to Feedback
• Participants reacted differently to feedback and focused on different aspects:
• Highlighting whether the diagnosis was correct or incorrect
• Self-evaluating
• Accepting/non-questioning
• Feedback responses illustrated students’ self-efficacy
• For most participants, feedback provided an opportunity for self-reflection
“This is disappointing, only got 37%. Oh, I should’ve checked the ketones, that’s right, that’s bad.” (Evaluating self)
“Alright, okay so I got the two main things urinating, and thirsty. Okay but difficulty seeing, nausea, vomiting, abdominal pain, okay apparently that goes with diabetes, I didn’t know.” (Self-reflection)
“One should always rule out physiologic causes before making a psychological diagnosis, OK” (Accepting feedback)
“Oh that’s interesting, cause I was going to do it the other way, I was going to rule out the psychological before.” (Self-reflection)
“…she’s hypovolemic. The electrolytes, that’s a big one that I, I mean, no I didn’t know, but I was so sure of the diagnosis that I didn’t even go there.” (Self-efficacy)
Tracking students’ profiles as a function of feedback use
How do students’ profiles change over time? (see the sketch below)
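One simple way to begin quantifying this question, assuming profile labels are available at two or more time points, is a transition table. The sketch below uses simulated labels; in practice the labels and time points (e.g., before and after a block of feedback-rich cases) would come from the profile analysis outlined earlier.

```python
# Sketch: cross-tabulating profile membership at two time points.
# The labels are simulated placeholders, not project data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
t1 = rng.integers(0, 3, size=120)          # profile at time 1
stay = rng.random(120) < 0.7               # most students keep their profile
t2 = np.where(stay, t1, rng.integers(0, 3, size=120))

# Row-normalised table: P(profile at time 2 | profile at time 1).
transitions = pd.crosstab(pd.Series(t1, name="profile_t1"),
                          pd.Series(t2, name="profile_t2"),
                          normalize="index")
print(transitions.round(2))
```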
Thank you! eun.jang@utoronto.ca maryam.wagner@utoronto.ca