An empirical study of in-class labs versus active-learning lectures for improving student learning and engagement with linear data structures. Results showed no measurable improvement in learning outcomes and mixed effects on engagement; future work includes replication and further analysis.
An Empirical Study of In-Class Labs on Student Learning of Linear Data Structures
Sarah Heckman, Teaching Associate Professor, Department of Computer Science, North Carolina State University
ICER 2015
Problem
• CSC116: 7-8 sections of 33 students; 1 instructor, 2 TAs; lecture/lab format
• CSC216: 1-2 sections of 70-90 students; 1 instructor, 2-3 TAs; lecture format
• CSC316: 1-2 sections of 70-90 students; 1 instructor, 2-3 TAs; lecture format
• Concerns: retention and the transition between courses
• Options for CSC216: in-class labs? a separate lab? do nothing?
Research Goal
• To increase student learning and engagement through in-class laboratories on linear data structures
• Hypothesis: active learning practices that involve larger problems will increase student learning and engagement
• In-class labs > pair & share
Research Questions
• Do in-class laboratories on linear data structures increase student learning, as measured by linear data structures exam questions, when compared to active-learning lectures?
• Do in-class laboratories on linear data structures increase student engagement when compared to active-learning lectures?
Active Learning in CSC216
• "Engaging the students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert" [Freeman et al. 2014]
• Control: active-learning lectures with 2-5 pair & share exercises per class, submitted through Google Forms
• Treatment: in-class labs filling the entire lecture period, with pre-class videos introducing the topic
Study Participants
• Students self-selected into sections during the standard registration period
• Populations were similar as measured by a survey on tool experience and self-efficacy
Methods
Replication materials: http://people.engr.ncsu.edu/sesmith5/216-labs/csc216_labs.html
• Quasi-experimental, counter-balanced design
• Learning measured through exams
• Engagement measured through observations of class meetings
[Figure: counter-balanced schedule in which sections 001 and 002 alternated between in-class labs and active-learning lectures across the array-based list, linked list, and iterator topics, with observed class meetings and Exam 1 interleaved.]
Student Learning – Exam 1
• Part 4: method tracing with ArrayLists
• Part 5: writing an ArrayList method
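The exam questions themselves are not reproduced in the slides; as a sketch of the kind of task Part 5 describes, here is a hypothetical array-backed list with an `add` method of the sort students might be asked to write. The `SimpleArrayList` class and its method names are illustrative assumptions, not taken from the actual CSC216 exam:

```java
// Hypothetical exam-style task: write a method for a custom array-backed list.
// Class and method names are illustrative, not from the actual exam.
public class SimpleArrayList {
    private Object[] data = new Object[10];
    private int size = 0;

    /** Adds element at the given index, shifting later elements right. */
    public void add(int index, Object element) {
        if (index < 0 || index > size) {
            throw new IndexOutOfBoundsException("index: " + index);
        }
        if (size == data.length) {
            // Grow the backing array when it is full.
            Object[] bigger = new Object[data.length * 2];
            System.arraycopy(data, 0, bigger, 0, size);
            data = bigger;
        }
        // Shift elements right to open a gap at index.
        for (int i = size; i > index; i--) {
            data[i] = data[i - 1];
        }
        data[index] = element;
        size++;
    }

    /** Returns the element at the given index. */
    public Object get(int index) {
        if (index < 0 || index >= size) {
            throw new IndexOutOfBoundsException("index: " + index);
        }
        return data[index];
    }

    public int size() {
        return size;
    }
}
```

A tracing question in the style of Part 4 might then ask students to step through a sequence of `add` calls and state the contents of the backing array.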
Student Learning – Exam 2
• Part 3: linked node transformation
• Part 5: writing a LinkedList method
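As an illustration of the linked-node transformations these parts cover, here is a hypothetical singly linked list with front insertion and removal; the `LinkedIntList` and `Node` names are assumptions for illustration, not identifiers from the course materials:

```java
// Hypothetical sketch of linked-node transformations of the kind an exam
// might test: inserting at and removing from the front of a singly linked
// list. Names are illustrative, not from the actual exam.
public class LinkedIntList {
    private static class Node {
        int value;
        Node next;

        Node(int value, Node next) {
            this.value = value;
            this.next = next;
        }
    }

    private Node front;
    private int size;

    /** Inserts value at the front: the new node's next is the old front. */
    public void addFront(int value) {
        front = new Node(value, front); // single-reference transformation
        size++;
    }

    /** Removes and returns the front value by bypassing the old front node. */
    public int removeFront() {
        if (front == null) {
            throw new IllegalStateException("list is empty");
        }
        int value = front.value;
        front = front.next; // old front node becomes unreachable
        size--;
        return value;
    }

    public int size() {
        return size;
    }
}
```

A transformation question typically asks students to draw the node and reference diagram before and after each pointer reassignment.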
Student Learning – Exam 3
• Comprehensive 3-hour final exam
• A stack using an ArrayList
• A queue using a LinkedList
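A minimal sketch of the two final-exam topics, assuming the standard delegation approach: a stack backed by `java.util.ArrayList` and a queue backed by `java.util.LinkedList`. The class names are illustrative, not from the exam:

```java
import java.util.ArrayList;
import java.util.LinkedList;

// Hypothetical sketches of the two final-exam topics: a stack delegating
// to an ArrayList and a queue delegating to a LinkedList.
public class ExamStructures {
    /** Stack backed by an ArrayList; push/pop at the END for O(1) amortized. */
    public static class ArrayListStack<E> {
        private final ArrayList<E> list = new ArrayList<>();

        public void push(E element) {
            list.add(element);
        }

        public E pop() {
            if (list.isEmpty()) {
                throw new IllegalStateException("stack is empty");
            }
            // Removing at the end avoids shifting the remaining elements.
            return list.remove(list.size() - 1);
        }

        public boolean isEmpty() {
            return list.isEmpty();
        }
    }

    /** Queue backed by a LinkedList; enqueue at the back, dequeue at the front. */
    public static class LinkedListQueue<E> {
        private final LinkedList<E> list = new LinkedList<>();

        public void enqueue(E element) {
            list.addLast(element);
        }

        public E dequeue() {
            if (list.isEmpty()) {
                throw new IllegalStateException("queue is empty");
            }
            return list.removeFirst(); // O(1) at the front of a linked list
        }

        public boolean isEmpty() {
            return list.isEmpty();
        }
    }
}
```

The pairing matters: an ArrayList is efficient at its end (stack), while a linked list is efficient at both ends (queue), which is presumably why the exam matches each ADT to that backing structure.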
Student Engagement
• Observations during the ArrayList and LinkedList class meetings
• Observers were graduate students and a colleague participating in a Teaching and Learning seminar
• Counts of students off topic during the lecture and exercise portions of class
• Some inconsistent use of the observation protocol
Student Engagement
[Chart slide: engagement observation results; figure not recoverable from the text.]
Threats to Validity
• External validity: two sections of the same course, taught by the same instructor, in the same semester, at the same time of day
• Replication is needed in other contexts to generalize further
• Could provide additional data points in future meta-analyses
Threats to Validity
• Internal validity
• Selection bias: students selected their own sections, but the initial survey showed the groups were similar
• Confounding factors: materials were shared between groups
• Effect size: only 6 in-class labs
• Differential attrition bias: "soft drops" were considered in the study
• Experimenter bias: participants were not revealed until after the semester was over
Threats to Validity
• Construct validity
• Exams as measures of learning: Exam 1 and Exam 2 were similar, but not identical, between sections; Exam 3 was common. Do the exams really measure student learning?
• Survey: wording on prior tool experience may be confusing; the self-efficacy questions were not a validated instrument
• Observation protocol as a measure of engagement: inconsistent use by observers
Discussion
• Did in-class labs increase student learning? No, at least not as measured by exam questions
• Both the control and the intervention were active learning; perhaps a simple active-learning intervention is enough
• Comparisons with earlier semesters may show more
• Did in-class labs increase student engagement? Yes and no
• The atmosphere in the classroom was fantastic, but many questions were about technology rather than concepts
• Completion: 72% of students earned a C or higher, short of the higher completion levels the active-learning literature would predict
Future Work
Replication materials: http://people.engr.ncsu.edu/sesmith5/216-labs/csc216_labs.html
• Additional work on Fall 2014 data: compare final exam results with previous courses; incorporate other measures of learning (projects, exercises, etc.)
• Starting in Fall 2015: additional in-class labs, moving toward a lab-based course; measure the types of questions asked during in-class labs; use labs to encourage best practices (frequent commits to version control, TDD)
Thank You! Questions? Comments? Concerns? Suggestions?