
Measuring Outcomes


Presentation Transcript


  1. Measuring Outcomes Geoffrey T. Miller, Associate Director, Research and Curriculum Development, Division of Prehospital and Emergency Healthcare, Gordon Center for Research in Medical Education, University of Miami Miller School of Medicine

  2. Session aims • Discuss the importance of outcomes evaluation and challenges to traditional assessments • Discuss the importance of validity, reliability and feasibility as they relate to assessment • Discuss types of assessments and their application in healthcare education

  3. A little terminology… • Assessment and evaluation are often used interchangeably • However, for our purposes… • Assessment = learner outcomes • Evaluation = course/program outcomes

  4. Why is assessment important?

  5. Because… assessment: • “Drives learning” • Allows measurement of individual and programmatic progress • Is fundamental to outcomes- or competency-based education • Assures the public that providers are competent • Credentialing, privileging, licensure, board certification: high stakes for practitioner and patient/society • All involve assessment of competence

  6. Formula for the effective use of simulation: Training Resources × Trained Educators × Curricular Institutionalization = Effective Simulation-based Healthcare Education. Issenberg SB. The Scope of Simulation-based Healthcare Education. Simulation in Healthcare. 2006.

  7. Formula for effective outcomes measurement: Defined Outcomes × Instruments & Trained Evaluators × Appropriate Simulator = Effective Outcomes Measurement
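Read literally, both slides describe a multiplicative relationship: the factors compound, and if any one factor is absent (zero), the product is zero. A minimal LaTeX rendering of the two formulas as stated on the slides:

```latex
% requires \usepackage{amsmath}
\begin{align*}
\text{Effective Sim-based Education}    &= \text{Training Resources} \times \text{Trained Educators} \times \text{Curricular Institutionalization} \\
\text{Effective Outcomes Measurement}   &= \text{Defined Outcomes} \times \text{Instruments \& Trained Evaluators} \times \text{Appropriate Simulator}
\end{align*}
```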

  8. What are some challenges to traditional methods of assessment for healthcare providers?

  9. Challenges in traditional assessments • Ethical issues: “using” real patients (substitutes) • Invasive procedures (patient safety) • Sensitive tasks (cultural concerns, patient modesty) • Problems using cadaveric tissue models • Animal welfare issues

  10. Challenges in traditional assessments • Real patients for evaluation of physical exam skills • Feasibility issues for large-scale examinations • Standardization and perceived-fairness issues in high-stakes settings • Standardized patients (SPs) improve reliability, but validity issues exist: they cannot mimic many physical findings

  11. Challenges in traditional assessments • Wide range of clinical problems, including rare and critical events • Availability • Cost • Reliability, validity, feasibility

  12. Developing outcome measurements

  13. “Any road will get you there, when you don’t know where you are going”

  14. Curriculum development • Analysis • Define expected outcomes • Design • Development • Implementation • Evaluation

  15. Defining outcomes • Learners are more likely to achieve competency and mastery of skills if the outcomes are well defined and appropriate for the level of skill training • Define clear benchmarks for learners to achieve • Plain goals with tangible, measurable objectives • Start with the end goal and the assessment metrics in mind; the content will then begin to develop itself

  16. Curriculum/assessment process: a continuous cycle of Define Outcomes → Curricular Development → Teaching and Learning → Assessment and Evaluation, with ± refinement feeding back into the outcomes

  17. Use of assessments in healthcare simulation: a cycle of Information, Demonstration, Practice, Feedback, Remediation, Measurement, and Diagnosis. Rosen MA, et al. Measuring Team Performance in Simulation-Based Training: Adopting Best Practices for Healthcare. Simulation in Healthcare. 2008;3:33–41.

  18. Preparing assessments • What should be assessed? • Every aspect of the curriculum considered essential and/or given significant designated teaching time • Should be consistent with learning outcomes established as the competencies students should master/perform at a given phase of study

  19. Blueprinting

  20. Clinical competence and performance • “Competent performance” requires acquisition of basic knowledge, skills & attitudes • Competence = application of specific KSAs (“Can they do it?”) • Performance = “translation of competence into action” (“Do they do it?”)

  21. Possible outcome competencies • Patient care • Medical knowledge • Practice-based learning and improvement • Interpersonal and communication skills • Professionalism • Systems-based practice (each spanning knowledge, skills, and attitudes)

  22. Knowledge competencies • Cognitive (factual) knowledge, in increasing depth: Recall • Comprehension • Application • Analysis • Synthesis • Evaluation

  23. Skill competencies • Communication • Physical exam • Procedures • Informatics • Self-learning • Time management • Problem solving

  24. Attitude competencies • Behavior • Teamwork • Professionalism • Key personal qualities • Motivation

  25. Continuous process: knowledge, skills, and attitudes develop together in a continuous cycle

  26. Relating Miller’s pyramid of competence to learning and assessment

  27. Miller’s Pyramid of Competence, from base to apex: Knows → Knows How → Shows → Does. Miller GE. The Assessment of Clinical Skills/Competence/Performance. Academic Medicine. 1990;65(9):S63–S67.

  28. Teaching and Learning - “Knows” • Learning opportunities: • Reading / independent study • Lecture • Computer-based • Colleagues / peers

  29. Assessment of “Knows”: factual tests

  30. The Tools of “Knows” • Multiple choice questions (MCQs) • Short answer • True / false • Extended matching • Constructed response questions

  31. Example - factual MCQ. FACT: “Wheezes are continuous, musical, whistling sounds during difficult breathing such as in asthma, croup and other respiratory disorders.” The learning opportunity supplies information input (facts); the assessment elicits factual output (answers). Q. Whistling sounds associated with an asthmatic patient are called: A. Rales B. Rhonchi C. Wheezes D. Vesicular (Answer: C)

  32. Example - video-based MCQ (computer-based model). Choose the best description of the patient’s finding: A. Myoclonus B. Partial seizure C. Tic D. Fasciculations E. Tremor [The original slide includes a video clip of the patient]
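Items like those in slides 31-32 lend themselves to computer-based delivery and automatic scoring. As an illustration only (the `McqItem` structure, the one-item bank, and the scoring function are hypothetical, not from the presentation), a minimal sketch in Python:

```python
from dataclasses import dataclass

@dataclass
class McqItem:
    """One multiple-choice item; 'answer' holds the keyed letter."""
    stem: str
    options: dict[str, str]  # letter -> option text
    answer: str              # correct letter, e.g. "C"

# Hypothetical one-item bank modeled on slide 31's wheezes question.
bank = [
    McqItem(
        stem="Whistling sounds associated with an asthmatic patient are called:",
        options={"A": "Rales", "B": "Rhonchi", "C": "Wheezes", "D": "Vesicular"},
        answer="C",
    ),
]

def score(responses: dict[int, str]) -> float:
    """Fraction of bank items where the learner chose the keyed answer."""
    correct = sum(1 for i, item in enumerate(bank) if responses.get(i) == item.answer)
    return correct / len(bank)

print(score({0: "C"}))  # 1.0 -- the keyed answer was chosen
```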

  33. Teaching and Learning - “Knows How” • Learning opportunities: • Problem-based exercises • Tabletop exercises • Direct observation • Mentors

  34. Assessment of “Knows How”: clinical context-based tests

  35. The Tools of “Knows How” • Multiple-choice question • Essay • Short answer • Oral interview

  36. Example - clinical context MCQ. A 64-year-old man with no past medical history presents with 1 week of intermittent headache, double vision, and a dilated right pupil. Which of the following is most likely the patient’s problem? A. Migraine B. Myasthenia gravis C. Multiple sclerosis D. Ischemic stroke E. Cerebral aneurysm

  37. Teaching and Learning - “Shows” • Learning opportunities: • Skill-based exercises • Repetitive practice • Small group • Role playing

  38. Assessment of “Shows”: performance assessment

  39. The Tools of “Shows” • Objective Structured Clinical Examination (OSCE) • Standardized Patient-based

  40. Variables in clinical assessment: the examiner, the student, and the patient each introduce variability into the assessment. Control as many variables as possible.
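One way to check how well examiner variability has been controlled (my addition, not from the slides) is to have two examiners rate the same encounters and compute a chance-corrected agreement statistic such as Cohen's kappa. A self-contained sketch with made-up pass/fail ratings:

```python
from collections import Counter

def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Chance-corrected agreement between two raters on the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if raters were independent, from marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical pass/fail ratings of 10 OSCE encounters by two examiners.
examiner_1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
examiner_2 = ["pass", "pass", "fail", "fail", "fail", "pass", "pass", "fail", "pass", "fail"]

print(round(cohens_kappa(examiner_1, examiner_2), 2))  # 0.57: moderate agreement
```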

  41. Teaching and Learning - “Does” • Learning opportunity: • Experience

  42. Assessment of “Does”: performance assessment

  43. The Tools of “Does” • Undercover / Stealth / Incognito Standardized Patient-based • Video • Portfolio • Service ratings (customer satisfaction)

  44. Influences on clinical performance. The Cambridge Model delineates performance from competence: performance (“does”) is competence filtered through individual-related and system-related influences. Rethans JJ, et al. The relationship between competence and performance: implications for assessing practice performance. Medical Education. 2002;36:901–909.

  45. Assessment types • Choose the appropriate assessment method: • Formative • Summative • Self • Peer

  46. Assessment • Formative assessment • Lower stakes • One of several, administered over the course or program • May be evaluative, diagnostic, or prescriptive • Often results in remediation or progression to the next level • Summative assessment • Higher stakes • Generally at the end of a course or program • Primary purpose is performance measurement • Often results in a “go / no-go” outcome
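To make the distinction concrete, here is a toy sketch (entirely my own; the checklist items and the 80% cut score are hypothetical): the same checklist data yields targeted feedback in a formative run and a go/no-go decision in a summative run.

```python
CUT_SCORE = 0.8  # hypothetical pass mark for the summative case

def evaluate(checklist: dict[str, bool], summative: bool) -> str:
    """Formative runs return targeted feedback; summative runs return go/no-go."""
    score = sum(checklist.values()) / len(checklist)
    if summative:
        return "GO" if score >= CUT_SCORE else "NO-GO"
    missed = [step for step, done in checklist.items() if not done]
    return f"Score {score:.0%}; remediate: {', '.join(missed) or 'nothing'}"

# Made-up airway-management checklist from one simulated encounter.
airway_station = {"positions head": True, "inserts OPA": True,
                  "ventilates at correct rate": False, "checks chest rise": True}

print(evaluate(airway_station, summative=False))  # Score 75%; remediate: ventilates at correct rate
print(evaluate(airway_station, summative=True))   # NO-GO (below the 80% cut)
```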

  47. Formative assessment example

  48. Assessments - self • Encourages responsibility for the learning process and fosters skill in judging whether work is of an acceptable standard, which improves performance • Most forms of assessment can be adapted to a self-assessment format (MCQs, OSCEs, and short answers) • Students must be aware of the standards required for competent performance

  49. Individual self-learning and assessment

  50. Assessments - peer • Enables learners to hone their ability to work with others and their professional insight • Enables faculty to obtain a view of students they do not see • An important part of peer assessment is for students to justify the marks they award to others • Justification can also be used as a component when faculty evaluate attitudes and professionalism
