Learning Analytics: On the Way to Smart Education Dr. Griff Richards Athabasca University Moscow, 08 October 2012
Distance Learning • In Canada, distance learning is growing, especially in K-12 public schools: 15% of 2008 high school graduates in BC took at least one online course • Increase in “blended learning”: online activities added to face-to-face courses • Increasing concern for course quality and student retention [Map: Athabasca, Alberta, between 49°N and 60°N]
Learning Engagement The more a learner interacts with the content, and with their peers about the content, the more likely they are to internalize and remember it. (Richard Snow, 1980)
Learning Engagement Learning engagement promotes student retention and academic success. (George Kuh, 2001) • Engagement is “action based on internal motivation” • Engagement cannot be measured directly • We must look for external traces: actions that indicate interest and involvement (NSSE)
CAUTION: Engagement • Arum & Roksa (2011): students who studied alone had higher marks than students who studied in groups. • Engagement that is not task-focused is unlikely to improve learning.
Interaction Equivalency Theory (Garrison & Anderson, 1995) The quality of the interaction is more important than its source: course content, other learners, or the instructor.
4 places to improve learning 1. Clearer course designs (course content) 2. Collaborative learning activities (other learners) 3. Less lecturing, more mentoring (instructor) 4. More study skills
We have insufficient data about which instructional strategies actually work best.
Hybrid or Blended Learning? The use of online technologies to augment face-to-face delivery: -> replaces some F2F “face time” (forums save classroom space) -> uses the LMS to track assessments and assignments -> uses technology in class, e.g. pop quizzes, collaboration tools
Analytic Measures Engagement is inferred from activity. • Student interaction with online systems leaves an “exhaust trail” of activity • Learning analytics is not a statistical sample; it is all the data for all learners • Questions: What patterns should we look for? How do we interpret them?
Example: SNAPP • Takes LMS interaction data, e.g. Moodle discussion forums • Extracts a who-replied-to-whom edge list • Plots the interactions as a sociogram (“star map”)
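The steps above can be sketched in a few lines. This is not SNAPP's actual code; the data, function name, and message threshold are illustrative assumptions, showing only the core idea of counting messages per poster from reply pairs and flagging low engagement.

```python
# Minimal sketch (not SNAPP itself): count messages sent per forum
# participant from (author, replied_to) pairs and flag low engagement.
from collections import Counter

def engagement_report(replies, threshold=3):
    """Label each poster 'low' or 'engaged' by messages sent.

    A poster with `threshold` or fewer messages is flagged 'low',
    matching the "3 or fewer messages" cut-off used in the talk.
    """
    sent = Counter(author for author, _ in replies)
    return {who: ("low" if n <= threshold else "engaged", n)
            for who, n in sent.items()}

# Hypothetical pairs, as might be extracted from a Moodle forum
replies = [
    ("ana", "instructor"), ("ana", "ben"), ("ana", "cho"),
    ("ana", "instructor"), ("ben", "ana"), ("cho", "instructor"),
]
report = engagement_report(replies)
# ana sent 4 messages -> "engaged"; ben and cho sent 1 each -> "low"
```

A real pipeline would pull the pairs from the LMS database or logs; the flagged list is what a sociogram makes visible at a glance.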
SNAPP shot of a conference forum: students “engaged”
SNAPP shot of a conference forum: individuals with lower engagement (3 or fewer messages)
Does Activity = Engagement? Beer (2010) plotted total LMS interactions against academic grades. • Does activity equal engagement? • Is Blackboard more engaging than Moodle?
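The analysis behind a plot like Beer's can be sketched as a simple correlation between two columns. The data below are made up for illustration; only the method (Pearson's r between interaction counts and grades) reflects the slide.

```python
# Illustrative re-creation of the activity-vs-grades analysis:
# correlate total LMS interactions with final grades (fake data).
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

hits   = [12, 45, 88, 130, 210, 260]   # total LMS clicks (hypothetical)
grades = [48, 55, 62, 71, 78, 85]      # final grade, % (hypothetical)
r = pearson_r(hits, grades)
```

Even a strong r here only shows association, which is exactly the slide's caution: activity correlating with grades does not prove that activity *is* engagement.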
Limitations: Activity Outside the LMS • As learners become engaged, they move to more engaging channels: email, Elluminate, Skype • This activity is not tracked by the LMS, so no data are available.
Interpretation of Analytics • Data patterns require investigation • Quantitative data require interpretation --> make and test hypotheses --> create useful models • When we measure something, we risk changing it: if learners know we count hits, they may make meaningless hits to fool the system.
Analytics for Learners! • The same analytics should be able to provide easy-to-understand information dashboards for students.
Analytics for Learners: SIGNALS Arnold (2010) inserted a traffic light on each student’s course page to provide guidance on course success (green: A/B, yellow: C, red: D/F).
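A traffic-light signal of this kind reduces to a small rule over a few indicators. This is a hedged sketch, not Purdue's actual risk algorithm: the inputs and cut-offs are invented for illustration.

```python
# Sketch of a Course Signals-style traffic light. The indicators
# (current grade, weekly logins) and thresholds are illustrative
# assumptions, not the real Signals model.
def signal(grade_pct, logins_per_week):
    """Map simple risk indicators to a traffic-light colour."""
    if grade_pct >= 75 and logins_per_week >= 3:
        return "green"    # on track (roughly A/B)
    if grade_pct >= 60:
        return "yellow"   # some risk (roughly C)
    return "red"          # at risk (roughly D/F)
```

The value of the design is that the student sees one actionable cue early, rather than a raw activity log.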
Dashboard for Faculty Arnold (2010) reported a 14% shift from D’s to C’s and B’s. The same number of withdrawals, but W’s occurred earlier, before damaging students’ grade point averages. A 12% increase in students requesting assistance. (N = 220)
Mesmotsa dashboard for learners (Richards & Sehboub, 2008) Panels: “My webquest data” and “Data for my class”
How to start: Analytics for online & blended learning? • Measuring something is the first step: “Better to measure something than to measure nothing” (Scriven) • We need more data than just page hits: we also need to ask learners about their experience, what worked, and what needs improvement.
Dynamic Evaluation Model (Richards & Devries, 2011) Analytics at the activity level: a matrix crossing evaluation phases (Preparation, Conduct, Reflection) with perspectives (Design, Facilitation, Learning).
Dynamic Evaluation Model Timely feedback in each cell of the matrix (Preparation/Conduct/Reflection × Design/Facilitation/Learning) enables quick fixes.
If Analytics, Then What? • If analytics show students are failing, is there a moral obligation to help them? • If analytics show a course has weaknesses, is there a business obligation to fix it? • If analytics reveal weak instruction, who is responsible to intervene? • If analytics are inaccurate, who fixes them? • What about privacy & ethics? Who owns the data? Who has the right to see it?
The Analytics Box? (image: joannaparypinski.com)
Learning Analytics: On the Way to Smart(er) Education Dr. Griff Richards Athabasca University griff@sfu.ca Moscow, 08 October 2012