Student Response Systems for Quantitative Literacy: (A) Yes (B) No (C) Maybe Lisa M. Reilly Assistant Professor of Chemistry Adam C. Fletcher Assistant Professor of Mathematics Bethany College Bethany, WV
Outline • The Wired Generation • History of Student Response Systems • Research on Student Response Systems in STEM Areas • Types of Student Response Systems Available • Live Example Using Polleverywhere.com • Our Choice of System and Why • Formative Assessment Using Multiple Choice Questions • Formative Assessment Using Open-Ended Questions • Comparing Error Analysis of Both Assessment Tools • Current Data on Student Response Systems Improving Quantitative Literacy in a Liberal Arts Physical Science Course • Discussion • Acknowledgements
The Wired Generation • Data on the technology owned by students in CHEM 100 (Consumer Chemistry)
History of Student Response Systems • General Lecture • Traditional v. Interactive Engagement • Biology and Related Areas • Chemistry • Mathematics • Role of Peer Discussion
General Lecture [Calvin and Hobbes comic strip by Bill Watterson]
Traditional v. Interactive Engagement • Study of over 6,000 students in introductory physics courses using a variety of pedagogical methods • Survey of pre/post test data • Students in the interactive-engagement courses improved problem-solving skills beyond those in traditional lecture courses R. Hake, "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys. 66, 64–74 (1998).
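The metric behind Hake's comparison is not shown on the slide; as a brief reminder (our addition, consistent with the cited paper), courses are compared by the class-average normalized gain computed from pre/post test percentages:

```latex
% Average normalized gain (Hake, 1998): the fraction of the maximum
% possible pre-to-post improvement that the class actually achieved.
\langle g \rangle \;=\;
\frac{\langle S_{\text{post}} \rangle - \langle S_{\text{pre}} \rangle}
     {100\% - \langle S_{\text{pre}} \rangle}
```

Interactive-engagement courses in the survey showed substantially higher average gains than traditional lecture courses.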
Biology and Related Areas • References report only a slight to minimal effect on exam grades • Significant gains in student attitudes, attendance, and overall satisfaction with the course • Preszler, R.W.; Dawe, A.; Shuster, C.B.; Shuster, M. Assessment of the Effects of Student Response Systems on Student Learning and Attitudes over a Broad Range of Biology Courses. CBE—Life Sciences Education. 2007, 6, 29–41. • Caldwell, J.E. Clickers in the Large Classroom: Current Research and Best-Practice Tips. CBE—Life Sciences Education. 2007, 6, 9–20. • Gauci, S.A.; Dantas, A.M.; Williams, D.A.; Kemm, R.E. Promoting student-centered active learning in lectures with a personal response system. Adv. Physiol. Educ. 2009, 33(1), 60–71. From Caldwell: results of a student evaluation of a non-majors biology course that used SRS
Chemistry • Study revealed that showing the results while students take the assessment is not best practice • In comparing the SRS with WebCT questions, the differences may have existed because the WebCT questions were available for review online • Practice, reflection, and review are necessary for effective student learning • Reference: • Bunce, D.M.; VandenPlas, J.R.; Havanki, K.L. Comparing the Effectiveness on Student Achievement of a Student Response System versus Online WebCT Quizzes. J. Chem. Educ. 2006, 83(3), 488. • Additional: • MacArthur, J.R.; Jones, L.L. A Review of Literature Reports of Clickers Applicable to College Chemistry Classrooms. Chem. Educ. Res. Pract. 2008, 9, 187–195. • Asirvatham, M.R. Clickers in Action: Increasing Student Participation in General Chemistry. W.W. Norton, New York. 2010.
Mathematics and Computer Science • Students can use cell phones as SRS to • take vocabulary quizzes, • study geometry via geotagging • As in biology, computer science courses saw • no significant increase in grades, but • increases in both engagement and attitude • At Bethany, we have seen these trends mirrored in • online precalculus, • hybrid mathematical writing, and • online network architecture courses Kolb, Liz. Adventures with Cell Phones. ASCD, February 2011, 39–43. Martyn, Margie. Clickers in the Classroom. Educause Quarterly, 2007(2), 71–74. Fletcher, Adam. How Effective Are Online Courses? WVCTM Conference Presentation, March 2010.
Peer Discussion Smith, M.K.; Wood, W.B.; Krauter, K.; Knight, J.K. Combining Peer Discussion with Instructor Explanation Increases Student Learning from In-Class Concept Questions. CBE—Life Sciences Education. 2011, 10(1), 55–63.
Summary of Benefits • Instantaneous feedback to both you and your students about learning in the classroom • Keeps students engaged and attentive during lecture • Privacy of voting allows students to be honest about what they do and do not know • Increased class attendance • Some evidence of improved problem-solving skills
Types of Student Response Systems Available • Some of the more prominent SRS options: • Dyknow (http://www.dyknow.com/) • eInstruction (http://www.einstruction.com/) • iClicker (http://www.iclicker.com) • LectureTools (https://www.lecturetools.com/) • Netsupport (http://www.netsupportschool.com) • Polleverywhere.com (http://www.polleverywhere.com/) • Qwizdom (http://www.qwizdom.com) • Smart Technologies (http://smarttech.com) • TurningPoint (http://www.turningtechnologies.com) • Ubiquitous Presenter (http://up.ucsd.edu/) • Like other e-learning applications, this list is fluid and subject to change
Use of Polleverywhere.com • An example of its use will be conducted during the presentation.
Our Choice and Why • Smart Technologies (http://smarttech.com) • SMART Response XE interactive response system • Supports advanced math and science content by allowing symbolic manipulation as input to the system • Allows for a range of acceptable answers by permitting multiple representations of the correct answer (see the sketch below) • Integrated with the SMART Board systems and software purchased with a previous grant Pictures from: http://smarttech.com/us/Solutions/Education+Solutions/Products+for+education/Complementary+hardware+products/SMART+Response/SMART+Response+XE
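SMART's grading of symbolic input is proprietary, so the following is only a minimal Python/SymPy sketch of the general idea of accepting multiple equivalent representations of one correct answer. The function name `is_equivalent` and the reference expression (taken from the distribution exercise in the error-analysis slide) are illustrative, not part of the SMART software.

```python
# Minimal sketch: accept any algebraically equivalent form of a correct answer.
import sympy as sp

CORRECT = sp.sympify("24*x + 8")  # reference answer: 6(1+4x)+2 simplified

def is_equivalent(student_input: str) -> bool:
    """Return True if the student's expression simplifies to the reference answer."""
    try:
        student_expr = sp.sympify(student_input)
    except (sp.SympifyError, SyntaxError, TypeError):
        return False  # unparsable input is treated as incorrect
    # Equivalent expressions differ by zero after simplification.
    return sp.simplify(student_expr - CORRECT) == 0

print(is_equivalent("8 + 24*x"))      # True  (reordered terms)
print(is_equivalent("2*(12*x + 4)"))  # True  (factored form)
print(is_equivalent("30*x + 2"))      # False (a common error pattern)
```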
Formative Assessment Using Multiple Choice Questions • This will be conducted in the Smart Notebook Software Program.
Formative Assessment Using Open-Ended Questions • This will be conducted in the Smart Notebook Software Program.
Comparing Error Analysis of Both Assessment Tools • Reference: Ashlock, Robert. Error Patterns in Computation. 9th ed.; Pearson. • For multiple-choice • Attractive distractors must be presented • Time-consuming to create • Time-saving to analyze • For open-ended • Real-time errors can be made (and recorded!) • Is a wrong answer a sign of misunderstanding or a tech malfunction? • Instructor must analyze errors "on the fly" • Example prompts and error patterns (worked below): • 6(1+4x)+2 = 30x+2 • Domain of (9−x^2)^(1/2) vs. domain of (9+x^2)^(1/2) • Second-semester calculus: f(x) = 1/x; compute f′(x)
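A brief worked sketch of the three prompts above (the reading of 30x+2 as a recorded error pattern is ours, inferred from the slide's topic):

```latex
% Distribution: correct expansion vs. the recorded error pattern
6(1+4x)+2 \;=\; 6 + 24x + 2 \;=\; 24x + 8
\qquad (\text{error pattern: } 30x + 2)

% Domains: the sign under the radical matters
\operatorname{dom}\,\sqrt{9 - x^2} = [-3,\,3], \qquad
\operatorname{dom}\,\sqrt{9 + x^2} = (-\infty,\,\infty)

% Derivative: rewrite as a power, then differentiate
f(x) = \tfrac{1}{x} = x^{-1} \;\Longrightarrow\; f'(x) = -x^{-2} = -\tfrac{1}{x^2}
```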
Active Learning Exercises in a Liberal Arts Chemistry Course
Discussion • Questions for Initiating Discussion • Will I cover less material if I use clickers and peer instruction? • How long should conceptual questions take to answer (how complex should they be)? • How do I write good questions? • What “credit” should I give?
How Do We Make Good Clicker Questions? • Questions need to match the student learning objectives for the lecture • Key is to use effective "distractors" • Listen to students and how they think in order to find these (a sample question is sketched below) • Textbooks are now including some student response system questions • Talk to colleagues, especially when working to include interdisciplinary topics
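As an illustration, here is a hypothetical clicker question built from the distribution exercise in the error-analysis slide; option (B) is the error pattern recorded there, while (C) and (D) are invented distractors standing in for other plausible errors.

Simplify 6(1 + 4x) + 2. • (A) 24x + 8 [correct] • (B) 30x + 2 [combines 6·1 and 6·4 into one coefficient] • (C) 4x + 8 [fails to distribute the 6 over 4x] • (D) 24x + 2 [distributes to 4x only, drops 6·1]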
Tips for Using Clickers Based on the literature above and Doug Duncan, astronomer, University of Colorado, Clickers in the Classroom: How to Enhance Science Teaching Using Classroom Response Systems • Must have student "buy-in" • Observe other teachers • Practice with the system before going into the classroom • Plan for the unexpected as best you can: low batteries, missing student response systems • Start with a few questions and then increase • Vary complexity – questions should encourage discussion • Questions must be related to course objectives and exams • Plan time to discuss questions
Pros and Cons – Session Generated • PRO – Clickers • ANTI – Clickers
Questions? Questions Now Questions Then Ask Us How Or an Email Send!
Acknowledgements • ACA/Teagle Quantitative Literacy Grant • Bethany College • Students who participated in multiple sections • Students who were the test group for practice