“Show Me How to Get Past MCQs: Emerging Opportunities in Measurement”
Carol O’Byrne, PEBC; Karen S. Flint and Jaime Walla, AMP; Drs. Frank Hideg, Paul Townsend, & Mark Christensen, NBCE; Alison Cooper, CAPR; Lila Quero-Munoz, Consultant
Presented at the 2004 CLEAR Annual Conference, September 30 – October 2, Kansas City, Missouri
Goals
• Gain an overview of performance assessment
• Observe and try out electronic & standardized patient simulations
• Consider exam development, implementation and administration issues
• Consider validity questions & research needs
• Create computer-administered & standardized patient simulations with scoring rubrics
• Set passing standards
Part 1 - Presentations
Introduction to performance assessment
• Purposes and objectives
• Models
• Issues, successes and challenges
15-minute presentations
• Four models, including their unique aspects, with two participatory demonstrations
• Developmental and ongoing validity issues and research studies
Part 2 - Break-out Sessions
• Identify the steps in developing and implementing a new performance assessment, and develop a new station
• Create a new electronic simulation and set passing standards
• Create a new standardized patient simulation and scoring rubrics
• Participate in a standard-setting exercise using the ‘Competence Standard Setting Method’ and, all the while, ask the ‘hard questions’
Performance Assessment - WHY?
To assess important problem-solving, critical thinking, communication, hands-on and other complex skills that:
• Impact clients' safety and welfare if not performed adequately, and
• Are difficult to assess in a multiple-choice question format
HOW?
• ‘Pot luck’ direct observation (e.g., medical rounds, clerkships and internships)
• Semi-structured assessments (e.g., orals and Patient Management Problems)
• Objective, Structured Clinical Examinations (OSCEs), combining standardized client interactions with other formats
• Other standardized simulations (e.g., airline pilots' simulators)
• Electronic simulations (e.g., real estate, respiratory care, architecture)
Does it really work? Links in the Chain of Evidence to Support the Validity of Examination Results:
• Job Analysis
• Test Specifications
• Item Writing
• Examination Construction
• Standard Setting
• Test Administration
• Scoring
• Reporting Test Results
PEBC Qualifying Examination
• Based on national competencies
• Two parts: MCE & OSCE; must pass both to be eligible for pharmacist licensure in Canada
• Offered spring and fall in multiple locations; 1400+ candidates/year; $1350 CDN
• 15-station OSCE: 12 client interactions (SP or SHP) + 3 non-client stations; 7-minute stations
• One expert examiner per station; checklist to document performance; holistic ratings to score the exam
• Standard setting
• Reports – results and feedback
Competencies Assessed by PEBC’s MCE and OSCE
Comparing PEBC’s OSCE (PS04) and MCE (QS04) Scores
Comparing PEBC’s OSCE and MCE Scores
Holistic Rating Scales
(1) COMMUNICATION Skills
• Rapport
• Organization
• Verbal and nonverbal expression
(2) Problem-solving OUTCOME
• Information processing
• Decision making
• Follow-up
(3) Overall PERFORMANCE
• Communication & Outcome
• Thoroughness (checklist)
• Accuracy (misinformation)
• Risk
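To make the structure concrete, here is a minimal Python sketch of how one examiner's three holistic ratings per station might be recorded and rolled up into an exam score. The field names and the averaging rule are illustrative assumptions only, not PEBC's published scoring model.

```python
# A minimal sketch, with hypothetical field names and an assumed
# roll-up rule; PEBC's actual scoring model is not reproduced here.
from dataclasses import dataclass
from statistics import mean

@dataclass
class StationRatings:
    communication: float        # rapport, organization, verbal/nonverbal expression
    outcome: float              # information processing, decision making, follow-up
    overall_performance: float  # integrates communication, outcome, thoroughness, accuracy, risk

def exam_score(stations):
    """Illustrative roll-up: average the overall performance rating across stations."""
    return mean(s.overall_performance for s in stations)

stations = [StationRatings(3.0, 3.5, 3.2), StationRatings(2.5, 2.0, 2.2)]
print(f"{exam_score(stations):.2f}")  # 2.70
```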
Validity – an ascent from Practice Analysis to Test Results
Job/practice analysis
• Who/what contexts?
• How?
Test specifications & sampling
• Which competencies?
• Which tasks/scenarios?
• Other parameters?
Item writing and review
• Who and how?
Scoring
• Analytic (checklists) and/or holistic (scales)?
Validity – an ascent from Practice Analysis to Test Results
Detect and minimize unwanted variability, e.g.:
• Items/tasks – does the mix matter?
• Practice effect – how can we avoid it?
• Presentation/administration – what is the impact of different SPs, computers, materials/equipment?
• Scores – how do we know how accurate and dependable they are? What can we do to improve accuracy?
Set defensible pass-fail standards
• How should we do this when different standard-setting methods yield different standards?
• How do we know if the standard is appropriate?
Report results
• Are they clear? Interpreted correctly?
• Are they defensible?
Validity – flying high
Evidence
• Strong links from job analysis to interpretation of test results
• Relates to performance in training & other tests
Reliable, generalizable & dependable
• Scores
• Pass-fail standards & outcomes
Feasible
• Large & small scale programs
• Economic, human, physical, technological resources
Ongoing research
Wild Life
Candidate diversity
• Language
• Training
• Format familiarity, e.g., computer skills
• Accommodations
Logistics
• Technological requirements
• Replications (fatigue, attention span)
Security
“Computer-Based Simulations”
Karen S. Flint, Director, Internal Development & Systems Integration, Applied Measurement Professionals, Inc.
Evolution of Simulation Exam Format
• AMP’s parent company, NBRC, provided oral exams from 1961 to 1978
• An alternative was sought due to:
• The limited number of candidates who could be tested at each administration
• The cost to candidates who had to travel to a test location
• Concern about potential oral examiner bias
Evolution of Simulation Exam Format
• Printed simulation exam format introduced in 1978 using latent-image technology
• Latent-image format used by NBRC from 1978 to 1999
• NBRC decided to convert all exams to computer-based testing
• Proprietary software, developed by AMP to administer simulation exams in a comparable format via computer, was introduced in 2000
• Both latent-image test booklets & the computerized format are currently in use
How Simulation Exams Differ from MCQs
• Provide accurate assessment of higher-order thinking related to a content area of interest (testing more than just recall)
• Challenge test takers beyond the complexity of MCQs
• Allow test takers to assess their skills against test content drawn from realistic situations or clinical events
Sample relationship between multiple-choice and simulation scores assessing similar content
Simulation Utility
• Continuing competency examinations
• Self-assessment/practice examinations
• High-stakes examinations
• Psychometric characteristics comparable to other assessment methodologies (that is, good reliability and validity)
Professions Using This Simulation Format
• Advanced-Level Respiratory Therapists
• Advanced-Level Dietitians
• Lighting Design Professionals
• Orthotist/Prosthetist Professionals
• Health System Case Management Professionals (beginning 2005)
• Real Estate Professionals
Candidate fees range from $200 to $525 for a full-length certification/licensure simulation exam
Structure of Simulations
• Opening Scenario
• Information Gathering (IG) Sections
• Decision Making (DM) Sections
• Single or multiple DM sections
• All choices are weighted (+3 to –3)
• Passing scores relate to content experts' judgment of ‘minimal competence’
Simulation Development
(Graphic depiction of path through a simulation problem)
IG Section Details
• IG section: a section in which test takers choose information that will best help them understand a presenting problem or situation
• Facilitative options may receive scores of +3, +2, or +1
• Uninformative, wasteful, unnecessarily invasive, or potentially illegal options may receive scores of –1, –2, or –3
• Test takers who select undesirable options accumulate negative section points
IG Section Details
• IG Section Minimum Pass Level (MPL)
• Among all options with positive scores in a section, some should be designated as REQUIRED for minimally competent practice
• The sum of the points for all REQUIRED options in a section equals the section MPL
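As a concrete illustration of the IG scoring rules above, the following Python sketch computes a section score and its MPL. The option names, weights, and REQUIRED designations are invented for illustration; they are not actual AMP exam content.

```python
# A minimal sketch of IG-section scoring, assuming hypothetical option
# names, weights, and REQUIRED designations.

ig_weights = {
    "ask_onset_of_symptoms": +3,            # facilitative, REQUIRED
    "review_current_medications": +2,       # facilitative, REQUIRED
    "ask_about_allergies": +1,              # facilitative
    "order_unrelated_imaging": -2,          # wasteful / unnecessarily invasive
    "release_record_without_consent": -3,   # potentially illegal
}
ig_required = {"ask_onset_of_symptoms", "review_current_medications"}

def ig_section_score(selected):
    """Sum the weights of every option the test taker selected."""
    return sum(ig_weights[option] for option in selected)

def ig_section_mpl():
    """MPL = sum of the weights of all REQUIRED options in the section."""
    return sum(ig_weights[option] for option in ig_required)

chosen = {"ask_onset_of_symptoms", "ask_about_allergies", "order_unrelated_imaging"}
print(ig_section_score(chosen), ig_section_mpl())  # 2 vs 5: below the section MPL
```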
DM Section Details
• DM section: a section of typically 4-6 options in which the test taker must make a decision about how to handle the presenting situation
• Facilitative options may receive scores of +3, +2, or +1
• Harmful or potentially illegal options may receive scores of –1, –2, or –3
• Test takers who select undesirable options accumulate negative section points and are directed to select another option
DM Section Details
• DM Section Minimum Pass Level (MPL)
• A DM section may contain two correct choices, but one must be designated as REQUIRED for minimally competent practice
• The point value of the REQUIRED option in the section equals the MPL
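The DM rules differ from IG in one respect: after an undesirable pick, the test taker keeps the negative points and is directed to choose again. A minimal sketch of that flow, again with invented option names and weights:

```python
# A minimal sketch of DM-section scoring under the rules above. A
# harmful or potentially illegal pick still counts against the section,
# and the test taker must select again; a positive choice ends the section.

dm_weights = {
    "refer_for_emergency_care": +3,        # best choice, REQUIRED
    "schedule_routine_follow_up": +1,      # acceptable, not required
    "prescribe_contraindicated_drug": -3,  # harmful
    "dismiss_patient_concerns": -2,        # harmful
}
DM_SECTION_MPL = dm_weights["refer_for_emergency_care"]  # REQUIRED option value = MPL

def dm_section_score(selections):
    """Accumulate points over successive selections until a positive
    (facilitative) option closes out the section."""
    total = 0
    for option in selections:
        total += dm_weights[option]
        if dm_weights[option] > 0:
            break  # a facilitative choice ends the section
    return total

# A harmful first pick followed by the REQUIRED option nets 0, below the MPL of +3.
print(dm_section_score(["prescribe_contraindicated_drug", "refer_for_emergency_care"]))
```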
Minimum Passing Level
• DM MPL: the sum of all DM section MPLs
• IG MPL: the sum of all IG section MPLs
• Overall simulation problem MPL: candidates must achieve the MPL in both Information Gathering and Decision Making
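Putting the pieces together, here is a minimal sketch of the overall pass rule as described on this slide: IG and DM totals are compared to their respective MPL sums, and both must be met, so surplus IG points cannot offset a DM shortfall.

```python
# A minimal sketch of the overall pass decision: aggregate IG and DM
# section scores and MPLs separately; both aggregate MPLs must be met.

def passes_simulation(ig_scores, ig_mpls, dm_scores, dm_mpls):
    """True only if both aggregate MPLs are met or exceeded."""
    return sum(ig_scores) >= sum(ig_mpls) and sum(dm_scores) >= sum(dm_mpls)

# Example: IG 16 vs MPL 14 passes, but DM 5 vs MPL 6 fails, so overall fail.
print(passes_simulation([9, 7], [8, 6], [2, 3], [3, 3]))  # False
```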
Simulation Exam Development
• 8 to 10 simulation problems per examination
• Each problem assesses a different situation typically encountered on the job
Let’s Attempt A Computerized Simulation Problem!!!
Karen S. Flint, Director, Internal Development & Systems Integration
Applied Measurement Professionals, Inc.
8310 Nieman Road, Lenexa, KS 66214
913.541.0400 (Fax: 913.541.0156)
KFlint@goAMP.com | www.goAMP.com
“Practical Testing”
Dr. Frank Hideg, DC; Dr. Mark Christensen, PhD; Dr. Paul Townsend, DC
NBCE History
• The National Board of Chiropractic Examiners was founded in 1963
• The first NBCE exams were administered in 1965
• Prior to 1965, chiropractors were required to take chiropractic state boards and medical state basic science boards for licensure
NBCE Battery of Pre-licensure Examinations
• Part I – Basic Sciences Examinations
• Part II – Clinical Sciences Examinations
• Part III – Written Clinical Competency
• Part IV – Practical Examination for Licensure
Hierarchy of Clinical Skills (from highest to foundational)
• DO – practice
• SHOW HOW – Part IV
• KNOW HOW – Part III
• KNOWLEDGE – Parts I & II
NBCE Practical Examination Content Areas
• Diagnostic Imaging
• Chiropractic Technique
• Chiropractic Case Management
Content Weighting
• Chiropractic Technique (TEC): 17%
• Diagnostic Imaging (DIM): 16%
• Case Management (CAM): 67%
Diagnostic Imaging
• 10 four-minute stations
• Candidate identifies radiological signs on plain-film x-rays
• Candidate determines the most likely diagnoses
• Candidate makes the most appropriate initial case management decisions
Chiropractic Technique
• 5 five-minute stations
• Candidate demonstrates two adjusting techniques per station:
• Cervical spine
• Thoracic spine
• Lumbar spine
• Sacroiliac articulations
• Extremity articulations
Chiropractic Case Management
• 10 five-minute patient encounter stations
• 10 linked post-encounter probe (PEP) stations
• Candidate performs focused case histories
• Candidate performs focused physical examinations
• Candidate evaluates the patient clinical database
• Candidate makes differential diagnoses
• Candidate makes initial case management decisions
Key Features of NBCE Practical Examination
• Use of standardized patients
• Use of OSCE format and protocols
Case History Stations
Successful candidates:
• Use an organized approach while obtaining case history information
• Communicate effectively with patients
• Respect patient dignity
• Elicit adequate historical information
Perform a Focused Case History
Post-Encounter Probe Station
Part IV Candidate Numbers
Part IV State Acceptance