
Media & Learning Design (M&LD) Research & Evaluation


Presentation Transcript


  1. Media & Learning Design (M&LD) Research & Evaluation Presentation to M&LD Steering Committee By Christos Anagiotos (cxa5065@psu.edu) & Phil Tietjen (prt117@psu.edu) April 4, 2012

  2. OUR CHARGE • Review prior approaches to evaluation within M&LD • Incorporate evaluation into new M&LD projects more systematically • Investigate the state of the art in media & learning evaluation • Investigate what other universities are doing regarding the use of media in online courses.

  3. Evaluation elements • Learning outcomes • Learning experience • Usability

  4. Research: Media enhances learning • Retention • Transfer • Cognitive Flexibility

  5. Course Surveys

  6. Course Surveys • Criminal Justice • Public Administration

  7. Sample Questions • By watching the library learning tutorials, I learned things I previously did not know about the library • These tutorials will help make my research easier • Video cases allowed me to relate to the content • Video cases helped me in doing my assignments for the course
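
A minimal sketch of how responses to items like these could be summarized, assuming the survey exports to a table of 1-5 Likert scores; the column names are shorthand for the four items above and the data is made up, not the actual survey export:

```python
# Hypothetical sketch: summarizing Likert responses (1 = strongly
# disagree ... 5 = strongly agree) for the four survey items above.
import pandas as pd

responses = pd.DataFrame({
    "learned_new_library_skills": [4, 5, 3, 4, 5],
    "tutorials_ease_research":    [4, 4, 3, 5, 4],
    "video_cases_relatable":      [5, 4, 4, 3, 5],
    "video_cases_helped_work":    [3, 4, 4, 4, 5],
})

# Mean agreement and share of respondents who agree (4 or 5) per item.
summary = pd.DataFrame({
    "mean_score": responses.mean(),
    "pct_agree":  (responses >= 4).mean() * 100,
})
print(summary.round(1))
```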

  8. Focus Groups

  9. Focus Groups • World Campus orientation videos: focus groups with students • M&LD participants: focus groups with instructional designers

  10. Identified Problems

  11. Surveys: Problem = Low Response Rate

  12. Response = • Embedded Evaluation • Learning Analytics

  13. Learning Analytics

  14. Learning Analytics Learning analytics is the • measurement, • collection, • analysis, and • reporting of data about learners and their contexts, for purposes of • understanding and • optimizing learning and the environments in which it occurs

  15. Universities that are using Learning Analytics • University of Phoenix • Cabela's University • Baylor University • Sinclair Community College • University of Baltimore • Purdue University • Regis University (Library's Distance Learning Department) • Rutgers University-Newark (Law Library) • Khan Academy

  16. POTENTIAL OF LEARNING ANALYTICS A. Compare users (e.g., evaluation) B. Predict student performance (Predictive Analytics; see the sketch below) C. Understand students' needs D. Identify media flaws E. Personalization of educational material
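
Item B, predicting student performance, could be prototyped along these lines. This is a minimal sketch assuming scikit-learn is available; the engagement features and pass/fail labels are entirely illustrative, not fields World Campus currently exports:

```python
# Hypothetical sketch of "predict student performance" from engagement
# data. A real model would use ANGEL / media-server activity joined to
# actual course outcomes.
from sklearn.linear_model import LogisticRegression

# Each row: [logins_per_week, minutes_of_video_watched, items_downloaded]
X = [
    [1,  10, 2],
    [5, 120, 9],
    [2,  30, 3],
    [6, 200, 12],
    [0,   5, 1],
    [4,  90, 7],
]
y = [0, 1, 0, 1, 0, 1]  # 1 = passed the course, 0 = did not

model = LogisticRegression().fit(X, y)

# Flag a student whose predicted probability of passing is low.
new_student = [[1, 15, 2]]
prob_pass = model.predict_proba(new_student)[0][1]
print(f"Predicted probability of passing: {prob_pass:.2f}")
```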

  17. What data can we collect from current WC sources? • ANGEL • Outside ANGEL: Google Analytics, Flash Media Server

  18. What does ANGEL offer? • All data is connected to the student (PSU ID, IP address) • Individual analytics (very complicated to get group analytics) Examples: • Login time, logout time, time spent on each page, items downloaded
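
A sketch of what could be derived from those login/logout timestamps, for instance per-student time on site. The event format below is an assumption for illustration; ANGEL's actual export will differ:

```python
# Sketch: deriving per-student time-on-site from login/logout events.
# The (psu_id, action, timestamp) format is hypothetical.
from datetime import datetime
from collections import defaultdict

events = [
    ("abc123", "login",  "2012-03-01 09:00"),
    ("abc123", "logout", "2012-03-01 09:45"),
    ("xyz789", "login",  "2012-03-01 10:15"),
    ("xyz789", "logout", "2012-03-01 10:30"),
]

open_sessions = {}
minutes = defaultdict(float)
for psu_id, action, ts in events:
    t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    if action == "login":
        open_sessions[psu_id] = t
    elif action == "logout" and psu_id in open_sessions:
        minutes[psu_id] += (t - open_sessions.pop(psu_id)).total_seconds() / 60

for psu_id, m in minutes.items():
    print(f"{psu_id}: {m:.0f} minutes on site")
```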

  19. Google Analytics (Outside ANGEL) • Collects anonymous information about the user • Data is connected to the IP address • Data is NOT connected to the PSU ID • Records much more data than ANGEL • The data is presented in a more user-friendly way

  20. Flash Media Server (M&LD Videos, Outside ANGEL) • Collects anonymous information about the user • Data is connected to the IP address • Data is NOT connected to the PSU ID • We can currently measure: login/logout time, duration per visit and per visitor, streaming duration, play and pause hits
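
Those play/stop events are enough to compute per-video watch statistics. A minimal sketch, assuming the server's access logs have already been parsed into (video, event, playback-seconds) tuples; the log format and values are hypothetical:

```python
# Sketch: average watch time per video from play/stop event pairs.
from collections import defaultdict

events = [
    ("lecture1.flv", "play", 0),  ("lecture1.flv", "stop", 240),
    ("lecture1.flv", "play", 0),  ("lecture1.flv", "stop", 300),
    ("lecture2.flv", "play", 0),  ("lecture2.flv", "stop", 600),
]

watch_seconds = defaultdict(list)
last_play = {}
for video, event, t in events:
    if event == "play":
        last_play[video] = t
    elif event == "stop" and video in last_play:
        watch_seconds[video].append(t - last_play.pop(video))

for video, sessions in watch_seconds.items():
    avg = sum(sessions) / len(sessions)
    print(f"{video}: {len(sessions)} views, avg watch {avg/60:.1f} min")
```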

  21. How to make sense of the data collected? EXAMPLES

  22. Example 1, from the Flash Media Server: Course Ed. Leadership 802: • Average length of videos: 10 minutes • Average watch time: 4 minutes

  23. Example 1: Possible explanations • The content is not valuable or useful to the viewers • The user already got the information from other sources (readings, discussions, etc.) • Users are tired or bored after watching the same person talk for more than 4 minutes • The content may not be clear enough to the user

  24. Example 2 A video was watched 46 times by 12 users in 7 days. Possible explanations: • High relevance to the user (e.g., used for an assignment) • Entertaining • Confusing

  25. Comparison of data collected • ANGEL: Pros: data connected to the user's PSU ID. Cons: very limited amount of data; tough to use. • Google Analytics (Outside ANGEL): Pros: large amount of data and great detail. Cons: data not connected to individual users' PSU IDs. • Flash Media Server (Outside ANGEL): Pros: decent amount of data. Cons: data for the videos ONLY; data not connected to individual users' PSU IDs.

  26. Combining the data we already collect We can gather: • Data directly connected to each user (PSU ID) from the 3 sources • Group data • Data for every activity in the course website (a joining sketch follows)
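
The combining idea rests on the IP address as the common key: ANGEL ties a PSU ID to an IP, while Google Analytics and the Flash Media Server carry only the IP. A sketch of that join, with made-up values; note the join is imperfect (shared labs, DHCP reassignment):

```python
# Sketch: attributing anonymous media data to a PSU ID via the IP
# address observed in ANGEL logs. All values are illustrative.
angel = {  # IP address -> PSU ID, from ANGEL logs
    "128.118.1.10": "abc123",
    "128.118.1.11": "xyz789",
}
media = [  # (IP address, video, seconds watched), from FMS logs
    ("128.118.1.10", "lecture1.flv", 240),
    ("128.118.1.11", "lecture1.flv", 300),
    ("128.118.9.99", "lecture2.flv", 120),  # no ANGEL match
]

for ip, video, secs in media:
    psu_id = angel.get(ip, "unknown")
    print(f"{psu_id:8s} watched {video} for {secs}s (from {ip})")
```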

  27. OTHER FORMS OF DATA THAT WC DOES NOT COLLECT • Social Network Analysis (Student networks) • Record student screens

  28. 1. Social Network Analysis (Student Networks) • Students' social networks facilitate learning processes (Dawson, 2010). • These tools make learner networking visible • Ability to "see" (identify) students who are network-poor and apply interventions (see the sketch below)
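
A minimal sketch of making a learner network visible and surfacing the network-poor students, assuming the networkx library is available; the edges here are hypothetical "replied-to" pairs from a discussion forum:

```python
# Sketch: build a student interaction graph and rank students by
# number of connections; low-degree students are intervention
# candidates. Names and edges are made up.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("ann", "ben"), ("ann", "cat"), ("ben", "cat"),
    ("cat", "dan"), ("eve", "dan"),
])
G.add_node("fay")  # posted but received no replies

# "Network-poor" students appear first.
for student, degree in sorted(G.degree, key=lambda x: x[1]):
    print(f"{student}: {degree} connection(s)")
```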

  29. Visualization of Social Network Analysis

  30. 2. Record student screens, e.g., TeamViewer software

  31. What’s next in Learning Analytics?Personalization of educational material Knewton - Pearsons partnership (video): (Knewton Adaptive Learning Platform).

  32. Confidentiality issues • How much data do we collect? • Students' consent • Who has access to the data?

  33. Recommendations • Coordinate with IDs to implement regular evaluations • Establish regular meetings with IDs to discuss and analyze results • Develop internal visualization-reporting tools (a sketch follows this list) • Make the connection to student performance • Publicize our findings; let people outside PSU know what we are doing in M&LD.
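
For the visualization-reporting recommendation, a sketch of one possible internal report, assuming matplotlib is available: average watch time as a fraction of video length, per video. The course name and figures are illustrative; a real tool would read them from the combined analytics sources above:

```python
# Sketch: one chart an internal reporting tool might produce.
import matplotlib.pyplot as plt

videos = ["Unit 1", "Unit 2", "Unit 3", "Unit 4"]
watch_ratio = [0.40, 0.75, 0.55, 0.30]  # avg watch time / video length

plt.bar(videos, watch_ratio)
plt.axhline(0.5, linestyle="--", label="review threshold")
plt.ylabel("Average fraction of video watched")
plt.title("EDLDR 802 video engagement (illustrative data)")
plt.legend()
plt.savefig("video_engagement.png")
```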

  34. Some other ideas for evaluation

  35. Thank You! Questions?
