Course Evaluations on the Web: Our Experiences Jacqueline Andrews, SUNY New Paltz • Donna Johnson, SUNY Ulster • Lisa Ostrouch, SUNY New Paltz • Julie Rao, SUNY Geneseo
Agenda • Overview of history of course evaluation • New Paltz transition • Evaluating online courses • Year 3 of being online • Questions & discussion welcome throughout
General History of Student Course Evaluations • Began in the 1920s at the University of Wisconsin • Since the 1960s, used by higher-education administrators in tenure and promotion decisions • Traditionally administered in class, on paper • Referred to by acronyms: SEI, SET, SOI, etc.
General History of Student Course Evaluations • Late 1990s: a few institutions tested online administration (ca. 2%) • The percentage of institutions implementing online systems is on the rise – the medium is the message
Research • The most common concern with online course evaluations is response rates • Though most research has shown lower response rates, much research suggests they can improve • In addition, some research suggests response rates are lower in only some courses
Research on Response Rates Factors that seem to affect response rates: • Technical difficulties • Access to open computers • Students' use of multiple e-mail addresses • When and how the availability of the course evaluation is announced • When and how the importance of the evaluations is addressed • Reminders • Incentives
Research on Response Rates • A study at Northern Arizona University showed that professors who posted information about the course evaluation on a class discussion board produced the best response rates. • Another NAU study found an average 32% increase in response rates when instructors followed these steps: 1) an announcement, including the evaluation's location, a few weeks before the end of class; 2) an explanation of how the evaluations are used; 3) one e-mail reminder following the initial announcement. • In addition, NAU switched from Evalajack to Survey Monkey.
Schools Currently Using the On-line Format • Brigham Young University has a site called OnSET, which is dedicated to information on on-line student evaluations. • Fabulous site: http://OnSET.byu.edu
Examples of Schools Using Online Format to Some Degree University of Idaho • University of Virginia • Northwestern University • Bates College • Yale • Clemson University • University of Cincinnati • UCLA • Columbia • Penn State • University of Michigan • Syracuse • Cornell University • North Carolina State • Ohio State • University of Delaware • University of Massachusetts • Lehigh University • Palm Beach Community College
Commercial Software • In-house programs or vendor products • BYU's OnSET site listed 10 commercial providers. • They include Evaluation Kit, OCE, Web eVal, and Class Climate (from Scantron), among others.
History of Course Evaluations at New Paltz • Fall 1969: 42 questions • 1972 to 1976: college-wide procedure • ETS handled the scanning and reports • 24 questions
History of Course Evaluations at New Paltz • 1990s: responsibility for scanning and administering reports switched to the Office of Institutional Research • Results were on carbon paper that needed to be separated • SEI desk staffed 7 am to 9 pm
History of Course Evaluations at New Paltz • In the early 1990s, a Task Force on Teaching was formed to examine and revise the course evaluation form • It recommended a form with 22 questions, still used today • In 2004, one survey was given to students and one to faculty regarding course evaluations
History of Course Evaluations at New Paltz • The Current Process • Labels are printed for each course • Packets (one per course/section) are made up • Packets are delivered to • Liberal Arts & Sciences – individual departments • Business – Dean's office • Engineering – Dean's office • Education – Dean's office • Fine & Performing Arts – Dean's office • Packets are returned to Institutional Research
History of Course Evaluations at New Paltz • The Current Process • Each packet is matched to a header sheet • Each packet is scanned • Scanned packets are uploaded • Reports are searched for trouble areas • "Cleaned" data is sent to Computer Services • Reports are generated • Packets are returned to faculty with an individual report summary and department summary • Chairs and deans receive a copy of each faculty report, summary, and department summary
Online Tests at SUNY New Paltz • Through Blackboard in 2007 • Through OCE in Spring and Summer of 2008
The Current Process: Pros & Cons
Online: Pros & Cons
New Paltz Experiment • SUNY New Paltz conducted two online pilots with the vendor OCE • Spring 2008: School of Business and School of Science and Engineering • Summer 2008: SEIs for all courses were conducted online
New Paltz Experiment • We compared the mean scores of the paper and online versions of the SEI to determine whether there were statistically significant differences between them. • We calculated a mean SEI score using all the questions on all the SEIs for each school. • We used ANOVA tests to compare means (see the sketch below).
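As a rough illustration of that comparison (not the actual New Paltz analysis code), a one-way ANOVA on per-respondent mean SEI scores might look like the sketch below; the file name, column names, and grouping are assumptions.

```python
# Sketch: one-way ANOVA comparing mean SEI scores by administration
# format, per school. The file name and column names are hypothetical.
import pandas as pd
from scipy import stats

# Each row is one completed SEI; 'mean_score' averages all 22 items.
sei = pd.read_csv("sei_scores.csv")  # columns: school, format, mean_score

for school, grp in sei.groupby("school"):
    paper = grp.loc[grp["format"] == "paper", "mean_score"]
    online = grp.loc[grp["format"] == "online", "mean_score"]
    f_stat, p_value = stats.f_oneway(paper, online)
    print(f"{school}: F = {f_stat:.2f}, p = {p_value:.3f}")
```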
New Paltz Experiment Results • The significance test results were inconsistent. • Several of the tests showed significant differences between the mean paper scores across years. • It is therefore unlikely that any statistically significant difference between online and paper SEI mean scores is due to the change in format. • These results are consistent with the current body of research on online SEIs.
Issues with going online at New Paltz! • Differing points of view: OIRP, faculty, faculty governance, deans, Provost, President • Hard for each group to see the others' point of view • Reducing the OIRP workload is not a driver for any of these groups except OIRP • The lack of consistent other means of evaluating teaching puts a heavy weight on the SEIs
Assumptions at New Paltz • Harder courses and tougher graders get lower SEI scores • Current way of doing it is perfect • Students will not go online to complete an SEI • SEIs are easy
The facts about SEIs • A one-semester analysis found no relationship between grades and SEIs • The current way is familiar, but it is methodologically suspect: SEI scores are so uniformly high that it is unlikely the questions are valid or reliable • Students will go online to do the SEIs if they think it is useful to do so • Here's that OIRP workload thing again: SEIs take up way too many hours! We handle more than 50,000 sheets of paper multiple times during the year. Surely there is something more useful we could be doing for the college.
More SEI facts • That workload thing – a 30% increase in student responses (i.e., pieces of paper) from fall 1998 to fall 2008
Faculty get • Immediate results • Flexibility in questions • Ability to add their own questions each semester • Comments in a file – no need to read handwriting • Access to their own data all the time • Their class time back
Faculty give up • Comfort zone with the present setting • Time to do things now unfamiliar: • Need to be involved in the process to secure a decent response rate • Active participation in analyzing the data
Students get • Ability to do an SEI on their own time • Use of a familiar medium – online; no more golf pencils • The class time back • Anonymous responses – no handwriting to be recognized
Students give up • The comfort of the familiar • A designated time for the SEI – they will have to use their own time
Possibilities for increasing response rates • Hard (hard to sell) ways: • Hold something of value, like grades • Faculty award something for completion (timing tricky) • Faculty put it on the syllabus • Faculty talk about it during the semester • Faculty state often how much they value student opinions
Possibilities for increasing response rates • Soft (maybe still hard to sell) ways: • Pop-ups – every time students log onto the site (intranet or Blackboard), there is a reminder • A direct route to the survey for those who have not completed it (intranet) • E-mail reminders from OIRP • E-mail reminders to faculty from OIRP • Incentives • Paper reminders
Where we hope New Paltz is going next: • In-house software • Offer faculty a choice • Not be constrained by the existing 22 questions • Weight the scale in favor of online • Hope to get to a tipping point wherein 95%+ are online
Where New Paltz is likely headed next: • Summer pilot (few have opted out) • Work out the kinks with the software • Work with the faculty governance system • If possible, test the Academic Affairs Committee questions • Revise the questions • Work with the faculty governance system again • Perhaps offer faculty a choice with the 22 questions in the fall • Perhaps offer faculty a choice in the spring with the new questions • Work with the new Provost • Work with the work group on evaluation of teaching to put SEIs in context • Try various ways of ramping up response rates • Get to a place wherein MOST of the SEIs are online
The SUNY Ulster Experience: Collecting Student Opinion Data In-House Using ANGEL
Where we started . . . • Tried using Microsoft SharePoint in Fall 2007 & Spring 2008 • Mailed logins & passwords • No portal • No standardized student e-mail accounts • No luck, and a possible new expense item
Then we tried . . . • ANGEL surveys through the course management system in Fall 2008 • Still mailing logins & passwords • Still no portal • No standardized e-mail accounts • No luck • BUT better controls and no additional costs
How we’ve changed & why • Switched to using e-mail contact in Spring 2009 • Now have a portal • All students now have SUNY Ulster e-mail & are enrolled in Angel classes • Reasonably good response rate - 55% • No added costs – all electronic
Student Evaluation of Instruction for Online Courses • Uses a 0–5 frequency scale • Example items: "The instructor . . ." • Is well organized • Enjoys teaching the course • Explains materials clearly • Is fair in dealing with students • Shows command of the subject matter • Is able to answer questions clearly & concisely
I am listed as the "instructor" of a series of "courses" that contain only one learning module: the Student Opinion Survey for each course scheduled to be evaluated that term. I reuse course shells from semester to semester.
Each course survey can have individual open and close dates attached to it, which allows a reasonable period of time for students to participate. I send students a separate e-mail letting them know when the survey is open and encouraging their participation in the process. Most of our students use ANGEL to access at least some of their course materials in their regular classes.
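Purely as an illustration of that notification step (the addresses, server name, and wording below are assumptions, not SUNY Ulster's actual setup), the open-date e-mail could be generated with a short script:

```python
# Sketch: e-mail students when a Student Opinion Survey opens.
# Server name, addresses, and dates are hypothetical placeholders.
import smtplib
from email.message import EmailMessage

def send_open_notice(recipients, course, open_date, close_date):
    msg = EmailMessage()
    msg["Subject"] = f"Student Opinion Survey now open: {course}"
    msg["From"] = "ir-office@example.edu"
    msg["Bcc"] = ", ".join(recipients)  # keep individual addresses hidden
    msg.set_content(
        f"The Student Opinion Survey for {course} is open from "
        f"{open_date} through {close_date}. Please log in to ANGEL "
        "and take a few minutes to share your feedback."
    )
    with smtplib.SMTP("smtp.example.edu") as server:
        server.send_message(msg)
```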
One thing that is invaluable is that I have the authority to create my own course roster. I use Banner to extract a class roster of IDs and names, and then batch-enroll the class into my Student Opinion class. My "class" then shows up automatically as one of the courses a student is enrolled in once I add them to the roster. A sketch of that roster step follows.
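A minimal sketch of that step, assuming the Banner extract arrives as a CSV and the batch-enroll input is also a CSV; the column layout and file names are hypothetical, not Banner's or ANGEL's documented formats.

```python
# Sketch: turn a Banner roster extract into a batch-enroll file for
# the Student Opinion "class". All column names are hypothetical.
import csv

def build_enroll_file(roster_path, enroll_path, section_id):
    with open(roster_path, newline="") as src, \
         open(enroll_path, "w", newline="") as dst:
        reader = csv.DictReader(src)   # e.g. columns: id, last, first
        writer = csv.writer(dst)
        for row in reader:
            # one enrollment record per student, in a "Student" role
            writer.writerow([row["id"], section_id, "Student"])

build_enroll_file("banner_roster.csv", "angel_enroll.csv", "SOS-FA09")
```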