
Outcomes Assessment



Presentation Transcript


  1. Thursday November 7, 2013 Outcomes Assessment

  2. Before We Start… CE Documentation Process • Attendance Sheets 1 hour after beginning of session Completion of session • Certificates Emailed to participants

  3. Board of Directors

  4. Board of Directors

  5. The JRCERT is pleased to announce the appointment of Tricia D. Leggett, D.H.Ed., R.T.(R)(QM), our newest board member, effective April 2014. Tricia was nominated by the ASRT and is the Director of Institutional Effectiveness at Zane State College. She is the former radiography program director. Board of Directors

  6. Executive Staff Leslie F. Winter, M.S., RT(R) Chief Executive Officer Jay Hicks, M.S.R.S., R.T.(R) Associate Director Kelly A. Ebert, M.P.A., R.T.(T) Assistant Director

  7. Thomas A. Brown, M.A.Ed., R.T.(R) Barbara A. Burnham, B.S., R.T.(R), FASRT, FAHRA Traci B. Lang, M.S.R.S., R.T.(R)(T) Stuart Frew, M.S., R.T.(R) Accreditation Specialists

  8. ASSESSMENT

  9. Assessment: • Is an ongoing process aimed at understanding and improving student learning • Is a systematic process of gathering and analyzing evidence to determine students’ performance compared to expectations in relation to a program’s mission and goals

  10. Assessment is Dynamic!

  11. Programmatic Assessment Programmatic assessment explains the overall learning, growth, and development of groups of students as a result of their collective educational experiences, rather than just documenting the achievement of individual students.

  12. Purpose of Assessment is: • to improve teaching and learning • to ensure program improvement • to facilitate accountability • to identify program strengths

  13. Assessment Involves: • Making your expectations explicit and public • Systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards • Using the resulting information to document, explain, and improve performance

  14. Types of Assessment
  Student Learning (what students will do or achieve): • Knowledge • Skills • Attitudes
  Program Effectiveness (what the program will do or achieve): • Certification Pass Rate • Program Completion • Graduate Satisfaction • Job Placement • Employer Satisfaction

  15. Types of Assessment
  Formative Assessment: • Gathering of information during the progression of a program • Allows for student improvement prior to program completion
  Summative Assessment: • Gathering of information at the conclusion of a program

  16. Mission Statement The program at ABC is an integral part of the School of Allied Health Professions and shares its values. The program serves as a national leader in the education of students in the radiation sciences and provides learning opportunities that are innovative and educationally sound. In addition to exhibiting technical competence and the judicious use of ionizing radiation, graduates provide high quality patient care and leadership in their respective area of professional practice. Consideration is given to the effective use of unique resources and facilities. Strong linkages with clinical affiliates and their staff are vital to our success. Faculty and staff work in a cooperative spirit in an environment conducive to inquisitiveness and independent learning to help a diverse student body develop to its fullest potential. The faculty is committed to the concept of lifelong learning and promotes standards of clinical practice that will serve students throughout their professional careers.

  17. Mission Statement The mission of the ABC program is to produce competent entry-level radiation therapists.

  18. Goals • Broad statements of student achievement that are consistent with the mission of the program • Should address all learners and reflect clinical competence, critical thinking, communication skills, and professionalism Goals

  19. Goals should not: • Contain assessment tools • Contain increases in achievement • Contain program achievements Goals

  20. Goals? • The program will prepare graduates to function as entry-level ___. • The faculty will assure that the JRCERT accreditation requirements are followed. • Students will accurately evaluate images for diagnostic quality. • 85% of students will practice age-appropriate patient care on the mock patient care practicum. Goals

  21. Better Goals • Students will be clinically competent. • Students will demonstrate communication skills. • Students will develop critical thinking skills. • Students will model professionalism. Goals

  22. STUDENT LEARNING OUTCOMES

  23. Student Learning Outcomes Specific, Measurable, Attainable, Realistic, Targeted (SMART)

  24. Student Learning Outcomes Students will ______ (action verb) ______ (something). No more than 8-9 SLOs are needed. SLOs

  25. BLOOM’S TAXONOMY http://www.odu.edu/educ/roverbau/Bloom/blooms_taxonomy.htm

  26. BLOOM’S TAXONOMY

  27. Outcomes? • Students will be clinically competent. • Students will complete 10 competencies with a grade ≥75% in RAD 227. • Graduates will be prepared to evaluate, critique, and interpret images for proper evaluation criteria and quality. • Students will demonstrate ability to operate tube locks. SLOs

  28. Better Outcomes • Students will select appropriate technical factors. • Students will be able to execute a qualitative research study. • Students will be able to adjust technique in non-routine situations. • Students will be able to formulate a strategy for overcoming clinical weaknesses. SLOs

  29. Selecting the BEST Assessment Tools

  30. What do we want to learn from Assessment? • How well the students are learning The right tools can help you determine how well the students are learning

  31. From (unconsciously): making the program look good on paper. To: how can I find opportunities and problems so that I can improve the program? Adding value. Shift the Focus of Assessment

  32. How many tools? • We recommend: • 2-3 Student Learning Outcomes per the four required goals • 2 tools per outcome
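As a quick arithmetic sketch (not part of the original slides), the recommendation above multiplies out as follows, assuming exactly the four required goals:

```python
# Sketch of the recommended counts above (assumptions: exactly four
# required goals, 2-3 SLOs per goal, and 2 tools per SLO).
goals = 4
min_slos_per_goal, max_slos_per_goal = 2, 3
tools_per_slo = 2

min_slos = goals * min_slos_per_goal      # 8 SLOs
max_slos = goals * max_slos_per_goal      # 12 SLOs
min_tools = min_slos * tools_per_slo      # 16 measurements
max_tools = max_slos * tools_per_slo      # 24 measurements

print(f"SLOs: {min_slos}-{max_slos}, assessment tools: {min_tools}-{max_tools}")
```

Staying toward the low end of that range is consistent with the earlier guidance that no more than 8-9 SLOs are needed.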

  33. Assessment Methods The most important criterion when selecting an assessment method is whether it will provide useful information - information that indicates whether students are learning and developing in ways faculty have agreed are important. (Palomba & Banta 2000)

  34. Types of Assessment Methods
  Direct Assessment Methods: • Demonstrate learning • Performance learning allows students to demonstrate their skills through activities
  Indirect Assessment Methods: • Provide reflection about learning

  35. Commonly Used Tools • Portfolios • Clinical Competency Forms • Clinical Evaluation Forms • Unit Tests • Final Exams • Surveys • Laboratory Competency Forms • Many, many more

  36. What should measurement tools do for you? • MUST measure the outcome, only the outcome and nothing but the outcome • Should represent your students’ achievements as accurately as possible. • Should assess not only whether the students are learning but how well.

  37. Important for Measurement Tools • Use the best tools available • Tools need to correlate to the Student Learning Outcome • We recommend using two tools • They should predict how well the student will do with future measurement, OR • Validate another tool.

  38. How to Select the Best Tools? • Identify the tools that you believe best measure the outcome. • Use tools that measure all students equally. • Choose the best tool for formative and summative assessment. • “If I could only have one of these tools, which one would give me the most valuable information?”

  39. Things to Consider About Current Tools • Perhaps they were established for purposes of individual student evaluation • They may not be well suited to measuring overall student learning • Whether or not they are used for assessment, they will continue to be used for their intended purpose • If you hate them, change them

  40. Common Tool Mistakes • Course grades as tools • Group assessment tools • Looking at the right tool at the wrong time • Pass/Fail, Yes/No • Too many tools or use of random sampling

  41. Benchmarks A standard of performance against which achieved outcomes are compared in relation to expectations.

  42. Benchmarks • Standard of Performance • Realistic yet Attainable • External or Internal • No Double Quantifiers (Qualifiers) • Rating Scale

  43. Common Benchmark Mistakes • 70% of students will score 80% or better (a double quantifier) • Class will average ≥80% when you are only using a part of the tool • If the part being used is on a Likert scale, the measurement must be on the same scale.
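To make the double-quantifier point concrete, here is an illustrative sketch (the scores are invented, not from the presentation) showing how the two benchmark styles can disagree on the same data:

```python
# Hypothetical scores (percent) for one class on a single assessment tool.
scores = [92, 85, 78, 88, 70, 95, 82, 60, 90, 74]

# Double-quantifier benchmark: "70% of students will score 80% or better".
share_at_cutoff = sum(s >= 80 for s in scores) / len(scores)
double_quantifier_met = share_at_cutoff >= 0.70   # 6 of 10 students -> not met

# Single-quantifier benchmark: "the class will average >= 80%".
class_average = sum(scores) / len(scores)          # 81.4
average_met = class_average >= 80                  # met

print(f"double quantifier met: {double_quantifier_met}, "
      f"average: {class_average:.1f}, average met: {average_met}")
```

With these numbers the class average passes while the double-quantifier benchmark fails, which illustrates why a single, clearly stated quantifier is easier to interpret and act on.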

  44. Establish a Timeframe and Identify the Individual Responsible

  45. When is the right time? • Assess for formative and summative data • Would you measure in the first week? • When do you have an expectation of the students? • When does the program consider them graduates of the program?

  46. Identify who is responsible • Who will actually complete the tool? • This should involve a number of people – not just the program director.

  47. Benchmarks (Examples)

  48. REPORTING and ANALYSIS
