The Third International Conference on Medical Education in the Sudan. Standard setting for clinical assessments. Katharine Boursicot, BSc, MBBS, MRCOG, MAHPE Reader in Medical Education Deputy Head of the Centre for Medical and Healthcare Education St George’s, University of London.
WHAT are we testing in clinical assessments? • Clinical competence • What is it?
A popular modern model: elements of competence • Knowledge • factual • applied: clinical reasoning • Skills • communication • clinical • Attitudes • professional behaviour Tomorrow’s Doctors, GMC 2003
Another popular medical model of competence: Miller’s pyramid • Does • Shows how • Knows how • Knows • The upper levels (Does, Shows how) relate to behaviour (skills/attitudes); the lower levels (Knows how, Knows) relate to cognition (knowledge) • Professional authenticity increases towards the top of the pyramid Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
Assessment of competence • A review of developments over the last 40 years
Knows • 1960: The National Board of Medical Examiners in the USA introduced the MCQ • MCQs conquered the world • Dissatisfaction due to the limitations of MCQs
Knows how • 1965: Introduction of the PMP (Patient Management Problem)
Patient Management Problem (figure: a clinical scenario branching into successive candidate actions at each step)
Knows how • 1965: Introduction of the PMP (Patient Management Problem) • Well-constructed SBA-format MCQs can test the application of knowledge very effectively
Shows how • 1975: Introduction of the Objective Structured Clinical Examination (OSCE) • OSCEs are conquering the world
Does • >2000: emerging new methods • WBAs – Workplace-Based Assessments • Mini Clinical Evaluation Exercise (mini-CEX) • Direct Observation of Practical Procedures (DOP) • OSATS • Masked standardised patients • Video assessment • Patient reports • Peer reports • Clinical work samples • ………
Mini CEX (Norcini, 1995) • Short observation (15-20 minutes) and evaluation of clinical performance in practice using generic evaluation forms completed by different examiners (cf. http://www.abim.org/minicex/)
WBAs – Workplace-Based Assessments • All based on the principle of an assessor observing a student/trainee in a workplace or practice setting
Past 40 years: climbing the pyramid..... • Does – performance assessment in vivo: mini-CEX, DOP, OSATS, ….. • Shows how – performance assessment in vitro: OSCEs • Knows how – (clinical) context-based tests: SBA, EMQ, MEQ….. • Knows – factual tests: SBA-type MCQs…..
Standard setting – why bother? • To assure standards • At graduation from medical school • For licensing • For a postgraduate (membership) degree • For progression from one grade to the next • For recertification
At graduation from medical school • To award a medical degree to students who meet the University’s standards (University interest) • To distinguish between the competent and the insufficiently competent (Public interest) • To certify that graduates are suitable for provisional registration (Regulatory/licensing body interest) • To ensure graduates are fit to undertake F1 posts (employer interest)
Definition of Standards • A standard is a statement about whether an examination performance is good enough for a particular purpose • a particular score that serves as the boundary between passing and failing • the numerical answer to the question “How much is enough?”
Standard setting All methods described in the literature are based on ways of translating expert (clinical) judgement into a score
‘Classical’ standard setting methods • For written test items: • Angoff’s method • Ebel’s method • For OSCEs: • Borderline group method • Regression-based method
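Angoff’s method asks each judge to estimate the probability that a borderline candidate would answer each item correctly; the passing score is the average, across judges, of the summed estimates. A minimal sketch in Python (the function name and the numbers are illustrative, not from the talk):

```python
def angoff_passing_score(judgements):
    """judgements: one list per judge of per-item probabilities (0-1)
    that a borderline candidate answers the item correctly."""
    # Each judge's implied passing score is the sum of their item estimates;
    # the panel's passing score is the mean across judges.
    per_judge_totals = [sum(items) for items in judgements]
    return sum(per_judge_totals) / len(per_judge_totals)

# Three hypothetical judges rating a 5-item test (illustrative numbers only)
judges = [
    [0.8, 0.6, 0.7, 0.5, 0.9],
    [0.7, 0.6, 0.6, 0.4, 0.8],
    [0.9, 0.7, 0.8, 0.5, 0.9],
]
print(angoff_passing_score(judges))  # passing score out of 5 items
```

The same panel-of-experts principle underlies all the methods on this slide: expert judgement is converted into a numerical cut score.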
Performance-based standard setting methods • Borderline group method • Contrasting groups method • Regression-based standard method Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L, van der Vleuten C. Comparison of a rational and an empirical standard setting procedure for an OSCE. Medical Education 2003; 37(2): 132. Kaufman DM, Mann KV, Muijtjens AMM, van der Vleuten CPM. A comparison of standard-setting procedures for an OSCE in undergraduate medical education. Acad Med 2000; 75: 267-271.
The examiner’s role in standard setting • Uses the examiner’s clinical expertise to judge the candidate’s performance • Examiner allocates a global judgement based on the candidate’s performance at that station • Remember the level of the examination • Global ratings: Pass / Borderline / Fail
Borderline Group Method (figure: a station checklist with item scores and a total; the overall test score distribution; and the score distribution of candidates given a Borderline global rating, from which the passing score is derived)
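In the borderline group method, the passing score for a station is typically taken as the mean checklist score of the candidates whom examiners rated Borderline. A minimal sketch, assuming that convention, with illustrative data:

```python
def borderline_group_cut(scores, ratings):
    """scores: checklist totals; ratings: parallel list of
    'pass' / 'borderline' / 'fail' global judgements."""
    # The cut score is the mean checklist score of the borderline group.
    borderline = [s for s, r in zip(scores, ratings) if r == "borderline"]
    return sum(borderline) / len(borderline)

# Hypothetical station data (illustrative only)
scores  = [18, 12, 15, 9, 14, 16, 11]
ratings = ["pass", "fail", "borderline", "fail",
           "borderline", "pass", "borderline"]
print(borderline_group_cut(scores, ratings))  # mean of 15, 14 and 11
```

Note that this needs enough candidates rated Borderline at each station for the mean to be stable.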
Contrasting groups method (figure: a station checklist; candidates’ test scores split into Pass and Fail distributions by the examiners’ global ratings, with the passing score set where the two distributions overlap)
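In the contrasting groups method, examiners sort candidates into passing and failing groups and the cut score is placed where the two score distributions cross. One simple way to approximate that intersection, assumed in the sketch below, is to choose the score that minimises misclassification; the data are illustrative:

```python
def contrasting_groups_cut(pass_scores, fail_scores):
    """Place the cut where the pass and fail score distributions cross,
    approximated as the candidate cut with fewest misclassifications."""
    candidates = sorted(set(pass_scores) | set(fail_scores))
    best_cut, fewest_errors = None, None
    for cut in candidates:
        # Errors: passers who would fall below the cut,
        # plus failers who would reach it.
        errors = (sum(s < cut for s in pass_scores)
                  + sum(s >= cut for s in fail_scores))
        if fewest_errors is None or errors < fewest_errors:
            best_cut, fewest_errors = cut, errors
    return best_cut

# Hypothetical groups formed by examiners' global judgements
passers = [14, 15, 16, 17, 18, 19]
failers = [9, 10, 11, 12, 13, 14]
print(contrasting_groups_cut(passers, failers))
```

In practice the intersection is often read off smoothed distributions, and the cut can be shifted to weight false passes and false fails differently.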
Regression based standard (figure: station checklist scores regressed on the examiner’s overall rating, where 1 = Clear fail, 2 = Borderline, 3 = Clear pass, 4 = Excellent, 5 = Outstanding; X = passing score, read off the regression line at the Borderline rating)
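The regression-based standard fits a regression of checklist score on the examiner’s overall rating and reads off the predicted checklist score at the Borderline rating (2). A plain least-squares sketch, with illustrative data:

```python
def regression_cut(ratings, scores, borderline_rating=2):
    """Regress checklist scores on global ratings (1 = clear fail ...
    5 = outstanding) and predict the score at the borderline rating."""
    n = len(ratings)
    mean_x = sum(ratings) / n
    mean_y = sum(scores) / n
    # Ordinary least-squares slope and intercept.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
             / sum((x - mean_x) ** 2 for x in ratings))
    intercept = mean_y - slope * mean_x
    return intercept + slope * borderline_rating

# Hypothetical station: one (rating, checklist score) pair per candidate
ratings = [1, 2, 2, 3, 3, 4, 5]
scores  = [8, 11, 12, 14, 15, 17, 20]
print(regression_cut(ratings, scores))  # predicted score at rating 2
```

Unlike the borderline group method, this uses every candidate’s data, not just the borderline group, which makes the cut score more stable at small stations.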
Work Based Assessment tools • There is no gold-standard standard-setting method!
Standard setting • Standards are based on informed judgements about examinees’ performances against a social or educational construct, e.g. • competent practitioner • suitable level of specialist knowledge/skills
Standard setting for Work Based Assessment tools • Based on descriptors for a particular level of training • Information gathering relying on descriptive and qualitative judgemental information • Descriptors agreed by consensus/panel of clinical experts • Purpose of WBA tools: formative rather than summative: feedback
Feedback • Giving feedback to enhance learning involves some form of judgement by the feedback giver on the knowledge and performance of the recipient • It is a very powerful tool!
WBAs and feedback • Underlying principle of WBA tools is FEEDBACK from • Teacher/supervisor • Peers/team members • Other professionals • Patients
Conclusions • It’s not easy to set standards for Work Based Assessments (in the ‘classic’ sense) • Expert professional judgement is required • Wide sampling from different sources: range of tools, contexts, cases and assessors • Feedback to the trainee