
A presentation by: The University Student Evaluation of Teaching Task Force August, 2014


Presentation Transcript


  1. A presentation by: The University Student Evaluation of Teaching Task Force August, 2014

  2. Take a moment… Reflect upon feedback you have received from your student ratings of your teaching. • Were the comments/ratings useful? • Did they help you improve? • Were they demoralizing?

  3. Comments From Student Ratings… • The entire structure of this class is messed up! The 10 minute quizzes were bull***t. She taught this class like it was a SWS class but its not SWS. Therefore she needs to change that. This final exam is bull***t. We have never been tested on the material yet. • Does not tolerate logical discussion from students. • He just rubbed me the wrong way.

  4. Comments Continued… • Best teacher at Grand Valley; he should get a big raise. • She was cool. I’d like to set her up with my dad.

  5. Student Ratings of Teaching at GVSU • GVSU’s primary identity is that of a teaching institution, and we use student ratings for important personnel and salary decisions. • Student ratings of teaching performance should be a tool that gives faculty useful information to help them succeed. • Yet we are not measuring student perceptions of teaching effectiveness in a consistent, empirically validated manner.

  6. Concerns with Student Rating Instruments at GVSU • 25 different instruments that measure 180 constructs/competencies are used across the university. • Most instruments are homemade and have not been tested for validity or reliability.

  7. Concerns With Interpreting Student Ratings of Teaching at GVSU • Student ratings of teaching are a subjective and perceptual measure, but they are often treated as a precise and objective one.

  8. Interpretation Concerns continued… • Some units place too much weight on small differences in rating scores. For example, there is not much empirical difference between scores of 4.1 and 4.3, but such gaps are often treated as meaningful. • Student comments are sometimes used haphazardly and inconsistently. Interpreting rating form comments requires specific questions and a validated coding system.
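To make the point about small score differences concrete, here is a minimal sketch of the sampling noise in a single section's mean rating, assuming a hypothetical section of 30 respondents and an individual-rating standard deviation of 0.9 on a 1-5 scale (illustrative values, not GVSU data).

```python
# Hypothetical illustration (not GVSU data): how much sampling noise is in a
# class-mean rating? Assume a section of n = 30 respondents on a 1-5 scale
# with an assumed item standard deviation of about 0.9.
import math

n = 30    # assumed number of student respondents in a section
sd = 0.9  # assumed standard deviation of individual ratings (1-5 scale)

se = sd / math.sqrt(n)   # standard error of the section mean
margin_95 = 1.96 * se    # approximate 95% margin of error

print(f"standard error of the mean: {se:.2f}")
print(f"approximate 95% margin of error: +/- {margin_95:.2f}")
# With these assumptions the margin is roughly +/- 0.32, so section means of
# 4.1 and 4.3 fall well within each other's uncertainty and should not be
# treated as a meaningful difference.
```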

  9. Concerns With the Use of Student Rating Data at GVSU • Student ratings of teaching are not a complete measure of teaching effectiveness, and they are sometimes given too much importance in the instructor evaluation process. • There is no consensus across the university about the weight that should be assigned to student ratings.

  10. Use Concerns continued… • The outputs of student ratings are used inconsistently across the university. Some units use only scores, some use only comments, and some use both. Some units have no specific guidelines about what should be used.

  11. Past Work on Student Rating Systems • A 2007 task force studied the feasibility of a university-wide student rating system. It concluded that there was a lack of understanding at GVSU regarding the purposes and use of student rating systems, and that faculty and administrators needed to be educated before a university-wide system could be adopted. • From 2010 to 2013, the Pew FTLC Advisory Committee conducted an extensive investigation of available student rating instruments. It recommended that GVSU adopt a standardized instrument and noted that faculty held many incorrect notions about student ratings of teaching. • ECS held a number of town hall meetings during AY 2013-2014 to give faculty an opportunity to learn about and discuss the issue.

  12. ECS/UAS Motion On January 31, 2014, the UAS approved the following motion: • The university should adopt a standardized measure of student evaluations of faculty teaching that meets contemporary standards for reliability and validity in psychometric measurement. The measure should yield both quantitative and qualitative results. The university should also adopt a standardized platform for administration of the measure (e.g., online software). The measure and platform should be used in all units and colleges. Units and colleges may employ additional instruments separately from the university standard.

  13. Current Task Force Formed In April 2014, ECS appointed a task force charged with recommending: • A common student evaluation form • Guidelines concerning the use of the form • An implementation procedure. Its report to ECS is due in December 2014.

  14. Task Force Members • The task force is composed of people with expertise in tests and measurement, statistics, technology, performance rating, and teaching scholarship. • You can find their names on the USETI website.

  15. Research about Student Ratings of Teaching • Student ratings are one of the most studied aspects of college teaching; there are currently over 3,000 citations. • We relied heavily on reviews of the literature for research information. • Links to this material are posted on the USETI website.

  16. Research About Consistency • Starting at mid-career, student ratings for an instructor in a particular course remain fairly stable over time if a valid instrument is used. • Individual student ratings of an instructor remain the same five years after graduation.

  17. Research About Validity… • Student ratings do measure learning – in multi-section courses that use a common final exam, students of teachers with higher ratings do better on the exam. • Students do separate the teacher from the course when rating.

  18. Factors That Are Associated With Lower Numerical Ratings • Teachers of math and science courses are rated lower than teachers of humanities courses. • Teachers of lower-level courses are rated lower than teachers of higher-level courses. • Teachers of required classes are rated lower than teachers of elective classes.

  19. Factors That Are Not Consistently or Strongly Related to Ratings • Gender of student or instructor • Time of day class is taught • How many majors are in the course • The scholarly output of the instructor

  20. Factors Not Consistently or Strongly Related continued… • Assigning higher grades does not result in better evaluations. Hundreds of studies have examined this issue; the average correlation across studies is zero. Note: Course difficulty actually has a POSITIVE relationship with ratings.

  21. What Can Be Learned From Student Ratings • Student ratings can serve as a useful global indicator of teaching effectiveness. • Students are qualified to rate how much they learned in a course. • Students are qualified to rate observable teacher behaviors, such as clarity, fairness, organization, presentation skills, etc.

  22. The Limits of Student Ratings • Student ratings cannot precisely or completely appraise teacher effectiveness. • Student ratings of teaching inherently contain error; they are a form of performance appraisal, and as such, are susceptible to all the known rater errors. • Students cannot accurately evaluate all important aspects of teaching. For example, students are not qualified to evaluate instructor knowledge or depth of subject coverage.

  23. What’s Next… • As charged by ECS, by the end of December, 2014, the task force will: • Recommend an instrument for adoption by all units at GVSU. Only instruments with established reliability and validity are being considered. • Recommend a method of delivery. • Recommend parameters addressing the use of student ratings. • Recommend an implementation plan.

  24. What’s Next continued: • ECS and UAS will consider, discuss, and vote on these recommendations.

  25. Available Resources • The task force has constructed a website about student ratings of teaching and about its own work: • gvsu.edu/useti • You can also contact Ed Aboufadel, Chair of USETI.
