
Gauging the Impact of Student Characteristics on Faculty Course Evaluations



Presentation Transcript


  1. Gauging the Impact of Student Characteristics on Faculty Course Evaluations Laura Benson Marotta University at Albany – SUNY

  2. Student Characteristics UAlbany course evaluations ask students for: • level of study • whether the course is required or an elective • major/minor/other • GPA • expected grade • gender

  3. Student Characteristics 40% of the University at Albany course evaluation instrument measures student characteristics Other institutions go further • Average hours per week studying • Average hours per week seeking outside help • Level of interest in the subject before taking the course

  4. Student Characteristics “Surveys are not ends unto themselves. They are tools that are used to help make decisions. You wouldn’t buy tools for a workbench unless you knew what you would use them for and unless you knew for sure that you were going to use them.” Linda A. Suskie (1992)

  5. Problems • Students may view questions about them as intrusive or irrelevant • Gathering more data than we can analyze wastes good will and instructional time • Contributes to survey fatigue

  6. Companion Survey • Departments do not assess student satisfaction by bubble sheets alone • Open-ended departmental course surveys ask students to describe themselves all over again

  7. Student Characteristics Data warehouse has canned queries to report student demographics by course

  8. Opportunities • It is easier to establish a clear link between student characteristics and faculty evaluation outcomes if the characteristics are measured on the survey, rather than estimated from registration records after the fact

  9. Opportunities • Sampling bias is a fact of life • Comparing demographic information between the survey respondents and the class population • Underrepresented subpopulations?
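The respondent-versus-population comparison on this slide can be sketched as a chi-square goodness-of-fit check. All counts and shares below are invented for illustration; the real numbers would come from the survey file and the registrar's records.

```python
# Hypothetical respondent counts and class-population shares by level of
# study; a large chi-square statistic flags a respondent mix that differs
# from the enrolled population (possible underrepresentation).
population_share = {"freshman": 0.40, "sophomore": 0.30,
                    "junior": 0.20, "senior": 0.10}
respondents = {"freshman": 150, "sophomore": 120, "junior": 90, "senior": 40}

n = sum(respondents.values())
chi_sq = sum((respondents[k] - population_share[k] * n) ** 2
             / (population_share[k] * n) for k in population_share)

# Critical value for df = 3 at alpha = 0.05 is 7.815
print(f"chi-square = {chi_sq:.3f}")
if chi_sq > 7.815:
    print("Respondent mix differs significantly from the class population.")
```

When the statistic exceeds the critical value, the subgroup with the largest gap between observed and expected counts is a candidate underrepresented subpopulation.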

  10. Student Characteristics • Grade Inflation • Course Assessment • Non-response bias

  11. Grade Inflation • “Evaluations depend solely on students, and grade inflation reflects faculty worried about the impact students may have on their careers.” Virginia Myers Kelly (2005)

  12. Grade Inflation on your campus • Does Expected Grade predict the response to Instructor, Overall?

  13. Grade Inflation Practice Data Set: • Undergraduate Courses • Students receiving a letter grade (not pass/fail) • Limited to students who are passing

  14. Undergraduate Survey Responses for Instructor, Overall

  15. Grade Inflation • Karl Pearson published a model in 1900 that described experiments with mutually exclusive, categorical outcomes • Row by Column test of independence • SPSS output using this model is still labeled “Pearson’s Chi Square”

  16. Nonparametric Test • Assumptions for Chi-Square: “Nonparametric tests do not require assumptions about the shape of the underlying distribution…The expected frequencies for each category should be at least 1. No more than 20% of the categories should have expected frequencies of less than 5.” • SPSS Base User’s Guide 12.0 page 466; follows guidelines set by W. G. Cochran (1954).
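The quoted guideline can be checked directly from the table of counts. A minimal sketch, using a made-up 2×3 contingency table (not the UAlbany data):

```python
# Expected counts under independence: row total * column total / grand total.
table = [[10, 40, 5],
         [20, 80, 10]]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

expected = [[r * c / grand for c in col_totals] for r in row_totals]
cells = [e for row in expected for e in row]

# Cochran's guidelines: every expected count >= 1, and no more than
# 20% of cells with an expected count below 5.
ok = min(cells) >= 1 and sum(e < 5 for e in cells) / len(cells) <= 0.20
print("Chi-square assumptions satisfied:", ok)
```

If the check fails, the usual remedy is to merge sparse categories (for example, collapsing rarely chosen response options) before running the test.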

  17. Row by Column Test of Independence

  18. Grade Inflation • Null hypothesis: “Instructor, Overall” is independent of “Expected Grade” • Alternative hypothesis: “Instructor, Overall” and “Expected Grade” are dependent

  19. Grade Inflation – RESULTS • Chi-Square Tests: 0 cells (0.0%) have expected count less than 5. The minimum expected count is 9.58.

  20. Grade Inflation • INTERPRETATION • Faculty ratings on the Likert scale vary depending on the students’ expected grade • Instructors have reason to expect lower student satisfaction if they assign lower grades.

  21. Grade Inflation Clear progression in students rating instructors as “Poor”: • Expecting a D: 32/291 = 11% • Expecting a C: 196/2989 = 6% • Expecting a B: 371/11066 = 3% • Expecting an A: 197/9830 = 2%

  22. Grade Inflation Instructors Rated as “Excellent” • Expecting a B: 4765/11066 = 43% • Expecting an A: 5774/9830 = 59%
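The “Poor” counts and class totals from the progression above can be arranged into a 4×2 table (rated “Poor” vs. not, by expected grade) and run through Pearson's chi-square, here with SciPy rather than SPSS. The deck does not show this exact table, so treat it as a reconstruction:

```python
from scipy.stats import chi2_contingency

#            Poor  Not Poor      (counts from the "Poor" progression)
table = [[    32,    259],   # expecting a D (n = 291)
         [   196,   2793],   # expecting a C (n = 2989)
         [   371,  10695],   # expecting a B (n = 11066)
         [   197,   9633]]   # expecting an A (n = 9830)

stat, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {stat:.1f}, df = {dof}, p = {p:.3g}")
# The minimum expected count lands near 9.58 (the D row), consistent with
# the RESULTS slide, and the tiny p-value rejects independence.
```

Note that SciPy applies the Yates continuity correction only to 2×2 tables, so this 4×2 table is tested uncorrected, matching the plain Pearson statistic.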

  23. Policy Implications • Faculty evaluations should be considered in conjunction with grade distributions • If your institution wants to follow Harvard and fight grade inflation by setting a cap on “A” grades in undergraduate courses, expect lower student satisfaction ratings • “Expected Grade” should be included during a survey redesign

  24. Course Assessment • 1 credit lower-division general education course in Information Science • Gap between satisfaction with Instructors and satisfaction with course

  25. Course Assessment • The first step to solving the problem is to confirm that student satisfaction with the general education course in Information Science is different from the other lower-level undergraduate courses

  26. Student Characteristics

  27. Course Assessment

  28. Course Assessment • Exploring these data did not solve the curriculum coordinator’s original problem, but it did help focus our questions in designing a follow-up study.

  29. Course Assessment The following semester the instructors handed out a two-question survey on the first day of class: • Why did you take this class? • Are you a freshman, sophomore, junior, senior, or other?

  30. Course Assessment • 3 Readers Scored Open-Ended Responses (Reliability) • Scored categories: • General Education • Need 1 Credit • Subject Matter • Other
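Reader agreement on those scored categories can be quantified. The deck does not say which reliability statistic was used; a minimal sketch of Cohen's kappa for one pair of readers, with invented labels:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters' category labels."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    chance = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (observed - chance) / (1 - chance)

# Hypothetical codes from two of the three readers
reader1 = ["GenEd", "Credit", "Subject", "GenEd", "Other", "Credit"]
reader2 = ["GenEd", "Credit", "Subject", "Credit", "Other", "Credit"]
print(f"kappa = {cohens_kappa(reader1, reader2):.2f}")
```

With three readers, the same check can be run for each pair, or replaced by a multi-rater statistic such as Fleiss' kappa.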

  31. Course Evaluation • Results • ~ 1/3 Seniors interested in subject • ~ 4/5 Seniors needed 1 credit • Grads self-selecting for remediation

  32. Course Evaluation Policy Implications • Examine other opportunities for upperclassmen to earn 1 credit • Make Seniors jump through hoops to get into this course

  33. Student Characteristics Conclusion Institutional Researchers use student characteristics on faculty evaluations to: • Track trends like grade inflation • Conduct ad hoc analyses • Estimate Sample Bias

  34. Student Characteristics Conclusion: • Gather only as much data as we will actually use.

  35. Questions?
