Interpreting Student Evaluations
Heather McGovern, Fall 2011
Quick Advice
• Attend to trends; disregard outliers.
• Use whichever is higher, adjusted or raw scores.
• Resist interpreting numbers with too much precision.
• Attend to the context of the institution (level of class, race of the faculty member, etc.) and of the person’s teaching.
• Remember that students cannot provide any valid information on most of our aspects of excellence in teaching with our student rating instrument (or at all).
Myth: Being “similar” is bad
• It is “similar,” not “bad.” Because the scores are normed, the majority of faculty nationwide will fall into this band (see the sketch below).
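A minimal sketch of how a normed score maps to comparison bands like “similar.” IDEA reports converted scores on a scale with mean 50 and standard deviation 10, but the cut points below are illustrative assumptions, not official thresholds; check your own report for the exact bands.

```python
def score_band(converted_score: float) -> str:
    """Classify a normed converted score (mean 50, SD 10) into a
    comparison band. Cut points are illustrative assumptions, not
    IDEA's official thresholds."""
    if converted_score >= 63:
        return "much higher"
    if converted_score >= 56:
        return "higher"
    if converted_score >= 45:
        return "similar"  # the wide middle band most faculty land in
    if converted_score >= 38:
        return "lower"
    return "much lower"

# Example: a converted score of 52 sits near the mean, so it lands
# in "similar," as most scores do by construction of the norming.
print(score_band(52))  # similar
```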
FAQ: What does it mean when someone is adjusted down?
• It means that the five factors IDEA uses indicate that students in the class were predisposed to give higher ratings. This may be because they were unusually motivated to take the class, because they perceive themselves as unusually hard working, because the class is small, or because of a combination of these and other factors.
• What it does not mean: the teacher did something wrong.
• Basic guidance from IDEA: use the higher score when evaluating an individual faculty member, as sketched below.
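Since “use the higher score” is a simple rule, here is a minimal sketch of it, assuming you have both the raw and adjusted summary scores from a report (the names are hypothetical, not fields from an IDEA file):

```python
def score_for_evaluation(raw: float, adjusted: float) -> float:
    """Follow IDEA's basic guidance: when evaluating an individual
    faculty member, use whichever summary score is higher."""
    return max(raw, adjusted)

# Example: a teacher adjusted down from a 4.3 raw score to a 4.0
# adjusted score should still be evaluated on the 4.3.
print(score_for_evaluation(raw=4.3, adjusted=4.0))  # 4.3
```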
FAQ: Why do some people have to use the small class form?
• IDEA’s research indicates that fewer than 15 student responses lead to unreliable data. The union and administration agreed to move to the small class form for classes under 15 in order to avoid giving faculty what is essentially “junk” statistical data. IDEA reports the following median reliabilities:
10 raters: .69
15 raters: .78
20 raters: .83
30 raters: .88
40 raters: .91
• Reliability ratings below .70 are highly suspect; the sketch below shows one way to apply that cutoff.
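A sketch of how the .70 cutoff might be applied in practice, using the median reliabilities from this slide as a lookup table. The round-down rule for rater counts between tabulated values is my conservative assumption, not IDEA’s method.

```python
# Median reliability by number of raters, from the figures above.
MEDIAN_RELIABILITY = {10: 0.69, 15: 0.78, 20: 0.83, 30: 0.88, 40: 0.91}

def approximate_reliability(n_raters: int) -> float:
    """Return the median reliability for the largest tabulated rater
    count not exceeding n_raters (rounding down is conservative)."""
    eligible = [n for n in MEDIAN_RELIABILITY if n <= n_raters]
    if not eligible:
        return 0.0  # fewer than 10 raters: treat as unreliable
    return MEDIAN_RELIABILITY[max(eligible)]

def is_suspect(n_raters: int, cutoff: float = 0.70) -> bool:
    """Flag data whose estimated reliability falls below the cutoff."""
    return approximate_reliability(n_raters) < cutoff

print(is_suspect(12))  # True: median reliability is only .69
print(is_suspect(18))  # False: at least .78 with 15 or more raters
```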
FAQ: Why does page 3 not highlight an area in which faculty performed well?
• Because IDEA’s research has not found a correlation between that item and the objectives selected. Bottom line: good teachers do not have to use every pedagogical technique all the time (or ever), and you should treat IDEA’s guidance as informed advice, not a mandate.
Myth: Can low scores be caused by a wrong CIP code?
• Probably not, unless you are really talking about your disciplinary-comparison scores. (Who in this room even knows where to find those on the IDEA report?)
• That said, correcting a CIP code can provide better disciplinary comparison data, which you can then consider and point others to.