End-of-Term Course Evaluation SETE


Presentation Transcript


  1. End-of-Term Course Evaluation SETE Fall 2013 Campus Wide Trial

  2. Response Rate • 128,747 surveys scheduled • 46,093 surveys completed
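  (Taken together, these figures correspond to an overall response rate of roughly 36%: 46,093 of 128,747 scheduled surveys completed, or about 35.8%.)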

  3. Response Rates by College

  4. Benefits • Green system • Single system to facilitate Annual Review and P&T • Consistent questionnaire • Uses up-to-date information from the PeopleSoft Schedule of Classes (SoC) • Integration with FAAR • Centralized reporting • Aggregate reports for units • Individual reports for faculty • SETE score

  5.–9. Faculty Reports

  10. Campus Wide Trial Challenges • Planning and implementing simultaneously • Decentralized institution • Dynamically dated courses • Sequentially taught courses • Inconsistency in SoC Primary Instructor

  11. Decisions Made on the Fly • Overlay the new system on the old • Open and close dates • Roster view available after 5 responses • Responses with fewer than 5 available after the close

  12. Lessons Learned • Send more than three personalized notifications • Make the end date clear • Support faculty in encouraging student responses

  13. Lessons Learned • Centralize mapping of evaluations • Continuous communication with students • Encourage Smarter Services to: • Develop normed national data • Expand ways to analyze the results

  14. Resources • Tutorials • http://www.smartersurveys.com/default/index.cfm/client-resources/support-documents/

  15. Gathering Feedback • Comments • Please send additional comments to Denise Helm at Denise.helm@nau.edu

  16. Special Thank You • The Implementation Team • ITS, ELC, Student Support Desk, Office of the Registrar
