Gauging the Impact of Student Characteristics on Faculty Course Evaluations Laura Benson Marotta University at Albany – SUNY
Student Characteristics UAlbany course evaluations ask students about: • level of study • whether the course is required or an elective • major/minor/other • GPA • expected grade • gender
Student Characteristics 40% of the University at Albany course evaluation instrument measures student characteristics. Other institutions go further: • Average hours per week spent studying • Average hours per week spent seeking outside help • Level of interest in the subject before taking the course
Student Characteristics “Surveys are not ends unto themselves. They are tools that are used to help make decisions. You wouldn’t buy tools for a workbench unless you knew what you would use them for and unless you knew for sure that you were going to use them.” Linda A. Suskie (1992)
Problems • Students may view questions about them as intrusive or irrelevant • Gathering more data than we can analyze wastes good will and instructional time • Contributes to survey fatigue
Companion Survey • Departments do not assess student satisfaction by bubble sheets alone • Open-ended departmental course surveys ask students to describe themselves all over again
Student Characteristics The data warehouse has canned queries to report student demographics by course (sketched below)
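A minimal sketch of what such a canned query might look like in Python with pandas, assuming a hypothetical registration extract; the file and column names (student_id, course_id, gpa, gender) are illustrative, not UAlbany's actual schema:

```python
import pandas as pd

# Hypothetical registration extract; file and column names are illustrative.
registrations = pd.read_csv("registration_extract.csv")

# One row per course: head count plus simple demographic summaries.
demographics = registrations.groupby("course_id").agg(
    enrolled=("student_id", "nunique"),
    mean_gpa=("gpa", "mean"),
    pct_female=("gender", lambda g: (g == "F").mean()),
)
print(demographics.head())
```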
Opportunities • It is easier to establish a clear link between student characteristics and faculty evaluation outcomes if the characteristics are measured on the survey, rather than estimated from registration records after the fact
Opportunities • Sampling bias is a fact of life • Compare demographic information between the survey respondents and the class population (see the sketch below) • Are any subpopulations underrepresented?
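One way to make that comparison concrete is a goodness-of-fit test: do respondents break down by, say, level of study the way the full roster does? A sketch with invented counts (the subgroup labels and numbers are illustrative only):

```python
from scipy.stats import chisquare

# Illustrative counts: full class roster vs. survey respondents.
roster = {"freshman": 120, "sophomore": 90, "junior": 60, "senior": 30}
respondents = {"freshman": 70, "sophomore": 55, "junior": 40, "senior": 10}

observed = [respondents[k] for k in roster]
n_resp, n_roster = sum(observed), sum(roster.values())
# Expected counts if every subgroup responded at the same overall rate.
expected = [roster[k] * n_resp / n_roster for k in roster]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")
# A small p suggests some subpopulations are under-represented among respondents.
```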
Student Characteristics • Grade Inflation • Course Assessment • Non-response bias
Grade Inflation • “Evaluations depend solely on students, and grade inflation reflects faculty worried about the impact students may have on their careers.” Virginia Myers Kelly (2005)
Grade Inflation on your campus • Does “Expected Grade” predict the response to “Instructor, Overall”?
Grade Inflation Practice Data Set: • Undergraduate courses • Students taking the course for a grade (not pass/fail) • Limited to students who are passing
Undergraduate Survey Responses for Instructor, Overall
Grade Inflation • Karl Pearson published a model in 1900 that described experiments with mutually exclusive, categorical outcomes • Row-by-column (R × C) test of independence • SPSS output using this model is still labeled “Pearson Chi-Square”
Nonparametric Test • Assumptions for Chi-Square: “Nonparametric tests do not require assumptions about the shape of the underlying distribution…The expected frequencies for each category should be at least 1. No more than 20% of the categories should have expected frequencies of less than 5.” • SPSS Base User’s Guide 12.0, page 466; follows guidelines set by W. G. Cochran (1954).
Grade Inflation • Null hypothesis: “Instructor, Overall” is independent of “Expected Grade” • Alternative hypothesis: “Instructor, Overall” and “Expected Grade” are dependent
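Before the results, a minimal sketch of running the same R × C test of independence outside SPSS, in Python with scipy; the data file and column names are hypothetical stand-ins for the evaluation extract:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical extract: one row per survey response.
responses = pd.read_csv("evaluations.csv")

# Rows: Expected Grade (A-D); columns: "Instructor, Overall" Likert ratings.
table = pd.crosstab(responses["expected_grade"], responses["instructor_overall"])

stat, p, dof, expected = chi2_contingency(table)
print(f"Pearson chi-square = {stat:.2f}, df = {dof}, p = {p:.4g}")

# Cochran's guideline from the previous slide: every expected count >= 1,
# and no more than 20% of cells below 5.
n_small = (expected < 5).sum()
print(f"{n_small} of {expected.size} cells have expected count < 5 "
      f"(minimum expected count: {expected.min():.2f})")
```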
Grade Inflation – RESULTS Chi-Square Tests [SPSS output table] (a) 0 cells (.0%) have expected count less than 5. The minimum expected count is 9.58.
Grade Inflation • INTERPRETATION • Faculty ratings on the Likert scale vary depending on the students’ expected grade • Instructors have reason to expect lower student satisfaction if they assign lower grades.
Grade Inflation Clear progression in students rating instructors as “Poor”: • Expecting a D: 32/291 = 11% • Expecting a C: 196/2989 = 6% • Expecting a B: 371/11066 = 3% • Expecting an A: 197/9830 = 2%
Grade Inflation Instructors Rated as “Excellent” • Expecting a B: 4765/11066 = 43% • Expecting an A: 5774/9830 = 59%
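The percentages on these two slides are simple row proportions: a rating's count divided by the number of students expecting each grade. A quick check of the deck's own figures:

```python
# Counts taken directly from the two slides above.
ratios = {
    "Poor, expecting D": (32, 291),
    "Poor, expecting C": (196, 2989),
    "Poor, expecting B": (371, 11066),
    "Poor, expecting A": (197, 9830),
    "Excellent, expecting B": (4765, 11066),
    "Excellent, expecting A": (5774, 9830),
}
for label, (count, total) in ratios.items():
    print(f"{label}: {count}/{total} = {count / total:.1%}")
```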
Policy Implications • Faculty evaluations should be considered in conjunction with grade distributions • If your institution wants to follow Harvard and fight grade inflation by setting a cap on “A” grades in undergraduate courses, expect lower student satisfaction ratings • “Expected Grade” should be included during a survey redesign
Course Assessment • A 1-credit, lower-division general education course in Information Science • Gap between satisfaction with the instructors and satisfaction with the course
Course Assessment • The first step to solving the problem is to confirm that student satisfaction with the general education course in Information Science is different from the other lower-level undergraduate courses
Course Assessment • Exploring these data did not solve the curriculum coordinator’s original problem, but it did help focus our questions in designing a follow-up study.
Course Assessment The following semester the instructors handed out a two-question survey on the first day of class: • Why did you take this class? • Are you a freshman, sophomore, junior, senior, or other?
Course Assessment • Three readers scored the open-ended responses (inter-rater reliability; see the sketch below) • Scored categories: • General Education • Need 1 Credit • Subject Matter • Other
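With three readers, one common way to quantify agreement is Fleiss' kappa; a minimal sketch with invented ratings (category indices map to the four scored categories above):

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Category indices: 0 = General Education, 1 = Need 1 Credit,
# 2 = Subject Matter, 3 = Other. Ratings below are illustrative.
# One row per student response, one column per reader.
ratings = np.array([
    [0, 0, 0],
    [1, 1, 3],
    [2, 2, 2],
    [1, 1, 1],
    [3, 2, 3],
])

table, _ = aggregate_raters(ratings)  # rows: responses; cols: category counts
print(f"Fleiss' kappa = {fleiss_kappa(table):.2f}")
```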
Course Assessment • Results • ~1/3 of seniors interested in the subject • ~4/5 of seniors needed 1 credit • Grads self-selecting for remediation
Course Assessment Policy Implications • Examine other opportunities for upperclassmen to earn 1 credit • Make seniors jump through hoops to get into this course
Student Characteristics Conclusion Institutional researchers use student characteristics on faculty evaluations to: • Track trends like grade inflation • Conduct ad hoc analyses • Estimate sampling bias
Student Characteristics Conclusion: • It is only wise to gather as much data as we will use.