
Guidelines for Good Evaluation Practice




Presentation Transcript


  1. Guidelines for Good Evaluation Practice • Michael Theall, Youngstown State University • Trav Johnson, Brigham Young University • Robin Zuniga, TLT Group

  2. The Personnel Evaluation Standards: The Joint Committee on Standards for Educational Evaluation • The Evaluation Standards facilitate the responsible conduct and use of personnel evaluations in educational settings. • These standards provide a framework for designing, conducting, and judging evaluation systems.

  3. Organizations Supporting the Development and Use of the Evaluation Standards • Partial list: • American Evaluation Association (AEA) • American Educational Research Association (AERA) • American Psychological Association (APA) • American Counseling Association (ACA) • Canadian Evaluation Society (CES) • Canadian Society for the Study of Education (CSSE) • Consortium for Research on Educational Accountability and Teacher Evaluation (CREATE) • National Council on Measurement in Education (NCME) • National Education Association (NEA)

  4. Four General Areas of Personnel Evaluation Standards • Propriety Standards • Utility Standards • Feasibility Standards • Accuracy Standards

  5. Propriety Standards • Evaluations will be conducted: • Legally • Ethically • With due regard for the welfare of those evaluated and others involved in the evaluation

  6. Utility Standards • Evaluations should be: • Informative • Timely • Influential

  7. Feasibility Standards • Evaluation systems should be: • As easy to implement as possible • Efficient in their use of time and resources • Adequately funded • Viable from a political standpoint

  8. Accuracy Standards • Evaluations should produce sound information about an educator’s performance. Sound information leads to: • Valid interpretations • Justifiable conclusions • Appropriate follow-up

  9. Evaluation Standards and Online Student Ratings • Online ratings present some unique opportunities and challenges in relation to the standards. • Discussion of online ratings and the evaluation standards: • Clark, S. J., Reiner, C. M., & Johnson, T. D. (2005). Online course ratings and the personnel evaluation standards. In D. D. Williams, M. Hricko, & S. L. Howell (Eds.), Online Assessment, Measurement, and Evaluation: Emerging Practices, Volume III (pp. 61-75). Hershey, PA: Idea Group Publishing.

  10. Personnel Evaluation Standards • First Edition: The Joint Committee on Standards for Educational Evaluation. (1988). The Personnel Evaluation Standards. Newbury Park, CA: Sage Publications. • Second Edition: Field trial version available at: http://jc.wmich.edu/PersStds2005/

  11. GUIDELINES FOR IMPROVED EVALUATION & DEVELOPMENT

  12. APPLIED MODEL

  13. 8 Steps to Better Evaluation (from Arreola, 2000) • Determine the faculty role model TEACH – RESEARCH – SERVE – ADMINISTER • Determine the range of role model values MAX–MIN % WEIGHT ASSIGNED TO EACH ROLE • Define the roles NEEDED SKILLS – EXPECTATIONS – LIMITS • Determine the component weights ASSIGN % VALUES TO TOTAL 100%

  14. 8 Steps to Better Evaluation (from Arreola, 2000) • Determine sources of information STUDENTS – PEERS – ADMINISTRATORS – OTHERS • Determine source weights ASSIGN % TO EACH SOURCE • Determine general data collection tools/process CHOOSE A TOOL FOR EACH SOURCE • Select/design instruments/protocols/reports/etc. SURVEYS – DATA – OBSERVATIONS – PORTFOLIO ITEMS (a worked sketch of the full weighting model follows)
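Once the weights are set, Arreola's model reduces to simple weighted arithmetic: each role's rating is a source-weighted average, and the composite is a role-weighted sum. The sketch below illustrates that computation; the role names, sources, weights, and 0-4 ratings are hypothetical examples, not Arreola's published figures.

```python
# Minimal sketch of Arreola's weighted composite (illustrative numbers only).

# Step 4: role weights, assigned to total 100% (here, 1.0).
role_weights = {"teaching": 0.50, "research": 0.25,
                "service": 0.15, "administration": 0.10}

# Steps 5-6: source weights per role; each row must also total 1.0.
source_weights = {
    "teaching":       {"students": 0.5, "peers": 0.3, "administrators": 0.2},
    "research":       {"peers": 0.7, "administrators": 0.3},
    "service":        {"peers": 0.4, "administrators": 0.6},
    "administration": {"administrators": 1.0},
}

# Steps 7-8 would supply these ratings (e.g., on a 0-4 scale) from
# surveys, observations, and portfolio items.
ratings = {
    "teaching":       {"students": 3.6, "peers": 3.2, "administrators": 3.4},
    "research":       {"peers": 2.8, "administrators": 3.0},
    "service":        {"peers": 3.5, "administrators": 3.3},
    "administration": {"administrators": 3.1},
}

def composite_score(role_weights, source_weights, ratings):
    """Role-weighted sum of each role's source-weighted rating."""
    total = 0.0
    for role, w_role in role_weights.items():
        role_score = sum(w_src * ratings[role][src]
                         for src, w_src in source_weights[role].items())
        total += w_role * role_score
    return total

print(f"Composite rating: {composite_score(role_weights, source_weights, ratings):.2f}")
```

The source_weights table is, in effect, one way to lay out the "Source – Impact Matrix" of the next slide: rows are roles, columns are information sources, and each cell is the weight that source carries for that role.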

  15. “Source – Impact Matrix”

  16. 8 Steps to Better Development (Theall, 2005) • Determine needs and associated development functions MATCH NEEDS WITH FUNCTIONS • Determine the principal clients of development services FACULTY – ADMINISTRATION – DEPTS. – STUDENTS? • Determine the configuration and location of development programs STRUCTURE – LEADERSHIP – REPORTING LINE • Determine the allocation of development resources % OF RESOURCES TO EACH FUNCTION

  17. 8 Steps to Better Development (Theall, 2005) • Determine the intended impact of development programs DIRECT/INDIRECT – MATCH WITH MISSION/FUNCTIONS • Determine the connections to other campus programs EVALUATION – ASSESSMENT – ACCREDITATION EFFORTS • Establish leadership in faculty development EXPERTISE – FACULTY INVOLVEMENT – ADMIN. SUPPORT • Create & implement programs COST-EFFECTIVE – MANAGEABLE – SUSTAINABLE – ASSESSED

  18. FUNCTION – CLIENT MATRIX
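The matrix on this slide is an image that did not survive in the transcript. As a stand-in, here is a hypothetical function–client matrix in the same Python style, crossing development functions with the clients they serve and checking that the step-4 resource allocation totals 100%; all function names, clients, and percentages are illustrative, not Theall's.

```python
# Hypothetical function-client matrix: which development functions serve
# which clients, plus the share of resources allocated to each function.
functions = {
    #  function              (clients served,                  % of resources)
    "instructional design": ({"faculty", "departments"},       0.35),
    "evaluation support":   ({"faculty", "administration"},    0.25),
    "technology training":  ({"faculty", "students"},          0.25),
    "grant consulting":     ({"faculty"},                      0.15),
}

# Step-4 sanity check: allocations should account for all resources.
assert abs(sum(share for _, share in functions.values()) - 1.0) < 1e-9

for function, (clients, share) in functions.items():
    print(f"{function:22s} {share:>5.0%}  serves: {', '.join(sorted(clients))}")
```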

  19. Guidelines #1 • Establish a public purpose for evaluation, identifying ratings uses and users as well as evaluation criteria, process, and procedures. • Include all stakeholders in decisions; keep individual and institutional needs in mind; get consensus.

  20. Guidelines #2 • Include resources for needed infrastructure and for support of teaching and teachers. • Build a real "system" for evaluation, not a haphazard and unsystematic process. • Establish clear lines of responsibility/reporting for those who administer the system, a legally defensible process, and a system for grievances.

  21. Guidelines #3 • Use, adapt, or develop instrumentation suited to institutional/individual needs; use multiple sources of information from several situations. • Ensure maximum student participation and response rates. • Build a ratings database and validate the instruments used (a minimal schema sketch follows).
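The "ratings database" in Guideline #3 can start very small. Below is a minimal sketch using SQLite; the table and column names are assumptions, not a prescribed design. Storing one row per student response to each item lets ratings be pooled across instruments, courses, and terms for validation work (reliability checks, norm comparisons) while keeping responses anonymous.

```python
import sqlite3

# Minimal ratings-database sketch (schema and names are assumptions).
conn = sqlite3.connect("ratings.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS course_section (
    section_id   INTEGER PRIMARY KEY,
    course_code  TEXT NOT NULL,
    term         TEXT NOT NULL,
    instructor   TEXT NOT NULL,
    enrollment   INTEGER NOT NULL  -- needed to compute response rates
);
CREATE TABLE IF NOT EXISTS item (
    item_id      INTEGER PRIMARY KEY,
    instrument   TEXT NOT NULL,
    text         TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS response (
    section_id   INTEGER REFERENCES course_section(section_id),
    item_id      INTEGER REFERENCES item(item_id),
    rating       INTEGER CHECK (rating BETWEEN 1 AND 5)
);
""")

# Example validation query: per-section mean rating and response count
# for one item, the kind of summary used to norm and validate instruments.
rows = conn.execute("""
    SELECT cs.course_code, cs.term, AVG(r.rating), COUNT(*)
    FROM response r JOIN course_section cs USING (section_id)
    WHERE r.item_id = ?
    GROUP BY cs.section_id
""", (1,)).fetchall()
conn.close()
```

Note that the response table deliberately carries no student identifier, consistent with the confidentiality concerns raised later in the deck.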

  22. Guidelines #4 • Produce reports that can be easily and accurately understood. • Educate the users of ratings results to avoid misuse and misinterpretation. • Keep formative evaluation confidential and separate from summative decision making.

  23. Guidelines #5 • In summative decisions, compare teachers on the basis of data from similar situations. • Consider the appropriate use of evaluation data for assessment and other purposes. • Invest in the evaluation system and evaluate it regularly. • Seek expert, outside assistance when necessary or appropriate.

  24. Using the Move to Online Course Evaluations to Build a Culture of Reflective Practice

  25. BeTA Sponsors and Partners • FIPSE – Fund for the Improvement of Postsecondary Education • Teaching, Learning, and Technology (TLT) Group • Mt. Royal College • Washington State University

  26. Pilot Institutions • St. Edward’s University • Virginia Commonwealth University • Purdue University • Embry-Riddle Aeronautical University • Johnson C. Smith University • Valencia Community College • Santa Ana College • Ohio University • Kent State University • University of Southern Indiana • Northern Arizona University • Washington State University • Mt. Royal College

  27. BeTA Project Goals • Build a Culture of Assessment • Community Definition of Good Teaching and Learning • Students as responsible evaluators of their own learning • Focus on formative uses of evaluation • Multi-tiered Feedback Process • Move Evaluation into the 21st Century

  28. The BeTA Project • Develop a Culture of Assessment • Improve Student Feedback • Improve Efficiency of Evaluation Systems/Processes • Shift the focus of evaluation questions • Encourage the Scholarship of Teaching • Improve Teaching and Learning through Assessment

  29. BeTA Project Tasks • Visible Knowledge Mapping • Student Engagement • Improving interpretation and use of results • Penguin software development

  30. Lessons Learned 1. Throw out the old way of doing things! Shift from the paper-survey model to an e-commerce model. 2. Communication and politics are crucial! Anticipate and respond effectively to resistance to both online distribution and changes in questions. 3. Are you ready? Make sure communication, logistical, and political support are in place before you move online or rewrite your instrument.

  31. Lessons Learned: Challenges • Logistics • Distribution of instructions • Easier/shorter instructions • Flat customer service model needed (one-stop help) • Need for new survey system (Skylight release summer 2006) • Politics • Fear of change in questions • Resistance to online survey dissemination • Response rates • Requires more student responsibility • Student perceptions of confidentiality • Student perceptions of “no effect”

  32. Lessons Learned: Successes/Improvements • Questions focused on the Learning Environment • Confidentiality of Data • Legible/Direct Student Comments • Student Satisfaction • Faster Turnaround of Reports • Better Data for Accreditation • Becoming positioned for the 21st Century

  33. Conclusions • Change requires: • A change in process = an opportunity for change in structure/content • A strong and broad coalition • Support from the CAO or another academic officer • Engagement of students and faculty • Support of IT and Administrative Services • A shift from old ways to an e-commerce/e-research model
