
An Empirical Study of In-Class Labs on Student Learning of Linear Data Structures

This empirical study compares in-class labs with active-learning lectures for improving student learning of, and engagement with, linear data structures. Results on learning outcomes were mixed, while engagement results were positive; future work includes replication and further analysis.



Presentation Transcript


  1. An Empirical Study of In-Class Labs on Student Learning of Linear Data Structures Sarah Heckman Teaching Associate Professor Department of Computer Science North Carolina State University ICER 2015

  2. Problem • CSC116: 7-8 sections, 33 students, 1 instructor, 2 TAs, Lecture/Lab • CSC216: 1-2 sections, 70-90 students, 1 instructor, 2-3 TAs, Lecture • CSC316: 1-2 sections, 70-90 students, 1 instructor, 2-3 TAs, Lecture • Retention! Transition! • For CSC216: In-Class Labs? Do Nothing? Lab?

  3. Research Goal • To increase student learning and engagement through in-class laboratories on linear data structures • Hypothesis: active learning practices that involve larger problems would increase student learning and engagement • In-class Labs > Pair & Share

  4. Research Questions • Do in-class laboratories on linear data structures increase student learning on linear data structures exam questions when compared to active-learning lectures? • Do in-class laboratories on linear data structures increase student engagement during class meetings when compared to active-learning lectures?

  5. Active Learning in CSC216 • “engaging the students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert” [Freeman, et al. 2014] • Control: Active Learning Lectures • 2-5 pair & share exercises per class • Submitted through Google forms • Treatment: In-class Labs • Lab activity for the entire lecture period • Pre-class videos introduced topic

  6. Study Participants • Students self-selected into sections during the standard registration period • Populations were similar as measured by a survey on tooling experience and self-efficacy

  7. Methods Replication Materials: http://people.engr.ncsu.edu/sesmith5/216-labs/csc216_labs.html • Quasi-experimental • Counter-balanced design • Learning measured through exams • Engagement measured through observations of class meetings • [Schedule figure: sections 001 and 002 alternated between array and linked treatments across the Lists and Iterators topics, with observed class meetings and Exam 1 marked]

  8. Student Learning – Exam 1 • Part 4: Method Tracing with ArrayLists • Part 5: Writing an ArrayList method
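For flavor, a minimal sketch of the kind of task Exam 1 Part 5 describes (the class and method names below are assumed for illustration, not taken from the actual exam): writing an insert method on an array-based list, including the element shifting and capacity growth students must trace and implement.

```java
// Illustrative sketch only: a simplified array-based list with an
// add-at-index method, the style of task described for Exam 1 Part 5.
class SimpleArrayList {
    private Object[] elements = new Object[10];
    private int size = 0;

    /** Inserts element at idx, shifting later elements one slot right. */
    public void add(int idx, Object element) {
        if (idx < 0 || idx > size) {
            throw new IndexOutOfBoundsException("idx: " + idx);
        }
        if (size == elements.length) { // grow the backing array when full
            Object[] bigger = new Object[elements.length * 2];
            System.arraycopy(elements, 0, bigger, 0, size);
            elements = bigger;
        }
        for (int i = size; i > idx; i--) { // shift right to open a gap
            elements[i] = elements[i - 1];
        }
        elements[idx] = element;
        size++;
    }

    /** Returns the element at idx, checking bounds first. */
    public Object get(int idx) {
        if (idx < 0 || idx >= size) {
            throw new IndexOutOfBoundsException("idx: " + idx);
        }
        return elements[idx];
    }

    public int size() { return size; }
}
```

Method-tracing questions (Part 4) would then ask students to step through calls like `add(1, "b")` and show the array contents after each shift.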

  9. Student Learning – Exam 2 • Part 3 – Linked Node Transformation • Part 5 – Writing a LinkedList Method
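A hypothetical version of a linked node transformation task of the kind Exam 2 names (again, names and details assumed, not the exam item itself): inserting a value into a chain of nodes by walking the chain and rewiring `next` references.

```java
// Illustrative sketch only: a minimal singly linked list whose add
// method performs the node rewiring that "linked node transformation"
// questions ask students to trace and write.
class SimpleLinkedList {
    private static class Node {
        int data;
        Node next;
        Node(int data, Node next) { this.data = data; this.next = next; }
    }

    private Node front;
    private int size;

    /** Inserts value at idx by walking idx-1 links and rewiring next. */
    public void add(int idx, int value) {
        if (idx < 0 || idx > size) {
            throw new IndexOutOfBoundsException("idx: " + idx);
        }
        if (idx == 0) {
            front = new Node(value, front); // new node becomes the front
        } else {
            Node current = front;
            for (int i = 0; i < idx - 1; i++) {
                current = current.next; // stop at the node before idx
            }
            current.next = new Node(value, current.next); // splice in
        }
        size++;
    }

    /** Returns the value at idx by walking idx links from the front. */
    public int get(int idx) {
        if (idx < 0 || idx >= size) {
            throw new IndexOutOfBoundsException("idx: " + idx);
        }
        Node current = front;
        for (int i = 0; i < idx; i++) {
            current = current.next;
        }
        return current.data;
    }

    public int size() { return size; }
}
```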

  10. Student Learning – Exam 3 • Comprehensive 3 hour final exam • Stack Using an ArrayList • Queue Using a LinkedList
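The two final-exam tasks can be sketched as thin adapters over the standard collections. This is an assumed illustration of the task style, not the exam's actual solutions: the key design choices are pushing/popping at the end of the ArrayList (O(1) amortized, no shifting) and enqueuing/dequeuing at opposite ends of the LinkedList (both O(1)).

```java
import java.util.ArrayList;
import java.util.LinkedList;

// Illustrative sketches only: a stack backed by an ArrayList and a
// queue backed by a LinkedList, the two ADT tasks named for Exam 3.
class AdtSketches {
    /** Stack: push and pop at the END of the list so nothing shifts. */
    static class ArrayStack<E> {
        private final ArrayList<E> data = new ArrayList<>();
        public void push(E e) { data.add(e); }
        public E pop() { return data.remove(data.size() - 1); }
        public boolean isEmpty() { return data.isEmpty(); }
    }

    /** Queue: enqueue at the back, dequeue at the front, both O(1). */
    static class LinkedQueue<E> {
        private final LinkedList<E> data = new LinkedList<>();
        public void enqueue(E e) { data.addLast(e); }
        public E dequeue() { return data.removeFirst(); }
        public boolean isEmpty() { return data.isEmpty(); }
    }
}
```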

  11. Student Engagement • Observations for ArrayList and LinkedList class meetings • Observers were graduate students and a colleague participating in a Teaching and Learning seminar • Counts of students off topic during lecture and exercise portions of the class • Some inconsistent use of the observation protocol

  12. Student Engagement

  13. Threats to Validity • External Validity • Two sections of the same course, taught by the same instructor, in the same semester, and at the same time of day • Replication needed in other contexts to generalize further • Could provide additional data points in future meta-analyses

  14. Threats to Validity • Internal Validity • Selection bias: students selected their own sections • An initial survey showed the groups were similar • Confounding factors • Materials shared between groups • Effect size – only 6 in-class labs • Differential attrition bias • Considered “soft-drops” in the study • Experimenter bias • Participant identities were not revealed until after the semester was over

  15. Threats to Validity • Construct Validity • Exams as measures of learning • Exams 1 and 2 were similar, but not identical, between sections • Exam 3 was common • Do exams really measure student learning? • Survey • Wording of the prior tool experience questions may have been confusing • The self-efficacy questions were not a validated instrument • Observation protocol as measure of engagement • Inconsistent use by observers

  16. Discussion • Did in-class labs increase student learning? • No, at least not as measured by exam questions • Both control and intervention were active learning • Maybe a simple active-learning intervention is enough • Comparisons with earlier semesters may show more • Did in-class labs increase student engagement? • Yes and no • The atmosphere in the classroom was fantastic • But many student questions were about technology rather than concepts • Completion – 72% of students earned a C or higher • Not reaching the higher levels of completion expected from the active-learning literature

  17. Future Work Replication Materials: http://people.engr.ncsu.edu/sesmith5/216-labs/csc216_labs.html • Additional Work on Fall 2014 Data • Compare results on final exam with previous courses • Incorporate analysis of other measures of learning – projects, exercises, etc. • Starting in Fall 2015 • Additional in-class labs → Lab-based course • Measure types of questions asked during in-class labs • Use labs as a way to encourage best practices (frequent commits to version control, TDD)

  18. Thank You! Questions? Comments? Concerns? Suggestions?
