This paper discusses the development and deployment of a web-based course evaluation system that aims to improve accountability and efficiency in gathering student feedback. It examines the limitations of the paper-based system it replaced, the goals and benefits of the web-based system, and the challenges faced in deploying it, including student participation issues and administration, faculty, and union concerns. Overall, the web-based system offers a more convenient and effective way of collecting and analyzing course evaluations.
Development and Deployment of a Web-Based Course Evaluation System Jesse Heines and David Martin Dept. of Computer Science, Univ. of Massachusetts Lowell Miami, Florida, May 26, 2005
The All-Important Subtitle Trying to satisfy ... • the Students • the Administration • the Faculty • and the Union (presented in a slightly different order from that listed in the paper)
Paper-Based System Reality • Distributed and filled out in classrooms • Thus, virtually all students present that day fill them out • However, absentees never fill them out
Paper-Based System Reality • Distributed and filled out in classrooms • Collected but not really analyzed • At best, Chairs “look them over” to get a “general feel” for students’ reactions • Professors simply don’t bother with them • lack of interest and/or perceived importance • simple inconvenience of having to go get them and wade through the raw forms
Paper-Based System Reality • Distributed and filled out in classrooms • Collected but not really analyzed • Lose valuable free-form student input because those comments are often ... • downright illegible • so poorly written that it’s simply too difficult to try to make sense of them
Paper-Based System Reality • Distributed and filled out in classrooms • Collected but not really analyzed • Lose valuable free-form student input • However, these comments have the greatest potential to provide real insight into the classroom experience
Paper-Based System Reality • Distributed and filled out in classrooms • Collected but not really analyzed • Lose valuable free-form student input • However, these comments have the greatest potential to provide real insight • Bottom Line #1: The paper-based system pays little more than lip service to the cry for accountability in college teaching
Paper-Based System Reality • Bottom Line #2: We’re all already being evaluated online whether we like it or not ...
Web-Based System Goals • Collect data in electronic format • Easier and faster to tabulate • More accurate analysis • Possibility of generating summary reports
Web-Based System Goals • Collect data in electronic format • Easier and faster to tabulate • More accurate analysis • Possibility of generating summary reports • Retrieve legible free-form responses • Allow all students to complete evaluations anytime, anywhere, at their leisure, and even if they miss the class in which the evaluations are distributed
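The paper does not include the tabulation code, but the goals above (faster tabulation, more accurate analysis, summary reports) are easy to sketch. In the hypothetical Python below, the response format, the 1-to-5 rating scale, and all function names are assumptions for illustration, not the authors' implementation.

    from collections import Counter
    from statistics import mean

    def summarize(responses):
        """Tabulate ratings by question; responses is a list of dicts like
        {"question": "Q1", "rating": 4} (an assumed format)."""
        by_question = {}
        for r in responses:
            by_question.setdefault(r["question"], []).append(r["rating"])
        report = {}
        for q, ratings in sorted(by_question.items()):
            report[q] = {
                "n": len(ratings),                       # how many students answered
                "mean": round(mean(ratings), 2),         # average on the assumed 1-5 scale
                "distribution": dict(Counter(ratings)),  # rating -> count
            }
        return report

    # Example: four responses across two questions.
    demo = [{"question": "Q1", "rating": 5}, {"question": "Q1", "rating": 4},
            {"question": "Q2", "rating": 3}, {"question": "Q2", "rating": 4}]
    print(summarize(demo))
    # {'Q1': {'n': 2, 'mean': 4.5, 'distribution': {5: 1, 4: 1}}, 'Q2': ...}

Once responses are in this form, a summary report for a Chair is a formatting exercise rather than a pile of paper forms to wade through.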
What We Thought If we build it, they will come ... but we were very wrong!
Student Issues • Maintain anonymity • Ease of use • Speed of use
Student Issues • Maintain anonymity • Ease of use • Speed of use We guessed wrong on the relative priorities of these issues.
Student Issues • Our main concern: • Prevent students from “stuffing the ballot box” • One Student = One Survey Submission
Student Issues • Our main concern: • Prevent students from “stuffing the ballot box” • One Student = One Survey Submission • Major concern that appeared after the system was deployed: • Simply getting students to participate • There appeared to be a great deal of apathy, particularly in non-technical courses
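The slides do not say how the system enforced "One Student = One Survey Submission" while keeping responses anonymous. A common approach is to record who has submitted separately from what was submitted, so the response table carries no student identity. The sketch below assumes that design; every name in it is hypothetical.

    import hashlib

    SUBMITTED = set()   # stand-in for a table of "who has submitted" markers
    RESPONSES = []      # stand-in for the anonymous response table

    def submission_key(student_id, survey_id):
        # One-way hash so even the marker table never stores raw student IDs.
        return hashlib.sha256(f"{student_id}:{survey_id}".encode()).hexdigest()

    def submit(student_id, survey_id, answers):
        key = submission_key(student_id, survey_id)
        if key in SUBMITTED:
            return False   # "ballot-box stuffing": reject the duplicate
        SUBMITTED.add(key)                                            # record WHO submitted...
        RESPONSES.append({"survey": survey_id, "answers": answers})   # ...but never here
        return True

    assert submit("s123", "CS101-F03", {"Q1": 5})
    assert not submit("s123", "CS101-F03", {"Q1": 1})   # second attempt rejected

A production system would want a keyed hash (e.g., HMAC with a server-side secret) rather than plain SHA-256, since student IDs are guessable and an unkeyed hash could be reversed by enumeration.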
Student Login Evolution Fall 2003
Administration Issues • System quality and integrity • “Buy in” from the deans • But the real issue was ... Dealing with the faculty union
Faculty Issue #1 • Control of which courses are evaluated • Contract wording: “The evaluation will be conducted in a single section of one course per semester. ... At the faculty member’s option, student evaluations may be conducted in additional sections or courses.”
Union Issue #1 • In 2004, all surveys were "turned on" by default, that is, they were all accessible to students on the Web • This was a breach of the contract clause stating that "evaluation will be conducted in a single section of one course" • In 2005, surveys are inaccessible by default, so use of the system became voluntary • As of May 20, 2005 (end of final exams), 95 professors (25% of the faculty) in 40 departments had made 244 course surveys accessible to students
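In software terms, the 2005 fix amounts to flipping a default and restricting who may change it. A minimal sketch, assuming a per-survey visibility flag owned by the faculty member (class and field names are invented, not the system's actual design):

    from dataclasses import dataclass

    @dataclass
    class Survey:
        course: str
        owner: str                 # the faculty member teaching the section
        accessible: bool = False   # the 2005 default: students cannot see it

        def open_to_students(self, requester):
            if requester != self.owner:
                raise PermissionError("only the faculty member may opt a survey in")
            self.accessible = True  # voluntary opt-in, per the contract clause

    s = Survey(course="CS101-S05", owner="prof_a")
    assert not s.accessible        # off by default, unlike Fall 2004
    s.open_to_students("prof_a")
    assert s.accessible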
Faculty Issue #2 • Control of what questions are asked • Contract wording: “Individual faculty members in conjunction with the Chairs/Heads and/or the personnel committees of academic departments will develop evaluation instruments which satisfy standards of reliability and validity.”
Union Issue #2 • In 2004, deans could set questions to be asked on all surveys for their college • This was a breach of the contract clause stating that faculty would develop questions "in conjunction with the Chairs/Heads and/or department personnel committees" • In 2005, all college-level questions were moved to the department level so that only Chairs can specify required questions • Deans now have essentially no access to the system unless they are teaching themselves or serving as acting chair of a department
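One way to read the 2005 change is as a role check on where required questions may be attached: at the department level, writable only by the Chair. The sketch below encodes that rule; the data structures and names are assumptions, not the system's actual design.

    DEPT_REQUIRED_QUESTIONS = {}   # dept -> list of required questions
    CHAIRS = {"cs": "chair_a"}     # hypothetical chair assignments

    def set_required_questions(user, dept, questions):
        # Only the department Chair may attach required questions;
        # deans no longer have a college-level hook into the surveys.
        if CHAIRS.get(dept) != user:
            raise PermissionError("only the department Chair may set required questions")
        DEPT_REQUIRED_QUESTIONS[dept] = list(questions)

    set_required_questions("chair_a", "cs", ["Was the workload reasonable?"])
    try:
        set_required_questions("dean_b", "cs", ["College-wide question"])
    except PermissionError as e:
        print(e)   # the dean's request is refused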
Faculty Issue #3 • Control of who sees the results • Contract wording: • “Student evaluations shall remain at the department level. At the faculty member’s option, the faculty member may submit student evaluations or a summary of their results for consideration by various promotion and tenure review committees. The faculty member shall become the sole custodian of these student evaluations at the end of every three academic years and shall have the exclusive authority and responsibility to maintain or destroy them.”
Union Issue #3 • Data were collected without faculty consent • This was a breach of the contract clause stating that "student evaluations shall remain at the department level" • All survey response data for the Fall 2004 semester were deleted on February 15, 2005, unless the faculty member explicitly asked that they be kept • What will happen with this semester's data has not yet been determined
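The February 15 deletion is essentially an opt-out retention policy: purge everything from the semester unless the owning faculty member asked to keep it. A hedged sketch, with an assumed record layout:

    from datetime import date

    def purge(responses, keep_requests, today):
        """Drop Fall 2004 responses unless their faculty owner asked to keep them."""
        if today < date(2005, 2, 15):
            return responses   # nothing happens before the deadline
        return [r for r in responses
                if r["semester"] != "Fall 2004" or r["owner"] in keep_requests]

    data = [{"semester": "Fall 2004",   "owner": "prof_a"},
            {"semester": "Fall 2004",   "owner": "prof_b"},
            {"semester": "Spring 2005", "owner": "prof_a"}]
    kept = purge(data, keep_requests={"prof_b"}, today=date(2005, 2, 16))
    print(kept)   # prof_b's Fall 2004 data survives, plus all Spring 2005 data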
Lessons Learned/Confirmed • No matter what you do, there will be those who object; you must remain open-minded and flexible • Practice good software engineering so that the software can be easily modified • It's really worth it to work with the many power factions to garner support • Every system needs a "champion" • Be prepared to spend a huge amount of time on system support
Thank You Jesse M. Heines, Ed.D. David M. Martin, Ph.D. Dept. of Computer Science Univ. of Massachusetts Lowell {heines,dm}@cs.uml.edu http://www.cs.uml.edu/{~heines,~dm}