Training the OSCE Examiners Katharine Boursicot Trudie Roberts
Programme • Principles of OSCEs for examiners • Video marking • Marking live stations • Strategies for enhancing examiner participation in training
Academic principles of OSCEs • The basics • What is an OSCE? • More academic detail • Why use OSCEs? • The role of examiners • Examiners in OSCEs
The basics • For examiners who don’t know about OSCEs • A brief reminder for those who are familiar with OSCEs
What is an OSCE? • Objective • Structured • Clinical • Examination
OSCE test design [Diagram: candidates rotate through a circuit of stations]
OSCEs - Objective • All the candidates are presented with the same test
OSCEs - Structured • The marking scheme for each station is structured • Specific skill modalities are tested at each station • History taking • Explanation • Clinical examination • Procedures
OSCEs - Clinical Examination • A test of the performance of clinical skills, not a test of knowledge • The candidates have to demonstrate their skills
More academic detail • Why use OSCEs in clinical assessment? • Improved reliability • A fairer test of candidates' clinical abilities
Why use OSCEs in clinical assessment? • Careful specification of content • Observation of wide sample of activities • Structured interaction between examiner and student • Structured marking schedule • Each student has to perform the same tasks
Characteristics of assessment instruments • Utility = reliability × validity × educational impact × acceptability × feasibility Reference: Van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Advances in Health Science Education 1996; 1: 41-67.
Test characteristics • Reliability of a test/ measure • reproducibility of scores across raters, questions, cases, occasions • capability of differentiating consistently between good and poor students
[Diagram: repeated test samples drawn from the domain of interest]
Reliability • Competencies are highly domain-specific • Broad sampling is required to obtain adequate reliability • across content, i.e. a range of cases/situations • across other potential sources of error variance, e.g. testing time, examiners, patients, settings, facilities
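Reliability of this kind can be quantified; one common index of internal consistency across stations is Cronbach's alpha. A minimal sketch with entirely made-up candidate and station scores (for illustration only, not from the source):

```python
# Cronbach's alpha across OSCE stations (hypothetical scores).
# rows = candidates, columns = stations; each score is out of 10
from statistics import pvariance

scores = [
    [8, 7, 9, 8],
    [5, 6, 4, 5],
    [9, 8, 9, 9],
    [4, 5, 3, 4],
    [7, 6, 7, 8],
]

k = len(scores[0])                                  # number of stations
station_vars = [pvariance(col) for col in zip(*scores)]   # per-station variance
total_var = pvariance([sum(row) for row in scores])       # variance of total scores

# alpha = k/(k-1) * (1 - sum of station variances / variance of totals)
alpha = k / (k - 1) * (1 - sum(station_vars) / total_var)
print(round(alpha, 2))
```

Broader sampling (more stations, more cases) generally raises alpha, which is why OSCEs use many short stations rather than one or two long cases.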
Test characteristics • Validity of a test/measure • the test measures the characteristic (e.g. knowledge, skills) that it is intended to measure
Model of competence (Miller's pyramid) • Does • Shows how • Knows how • Knows • The upper levels (Does, Shows how) concern behaviour ~ skills/attitudes; the lower levels (Knows how, Knows) concern cognition ~ knowledge • Professional authenticity increases towards the top Reference: Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
Validity of testing formats • Does - professional practice assessment • Shows how - performance assessment: OSCEs, long/short cases, OSLERs, etc. • Knows how - problem-solving assessment: EMQs, SEQs • Knows - knowledge assessment: MCQs
Test characteristics: Educational impact • The relationship between assessment and learning [Diagram: curriculum, assessment, teacher and student all interact]
Test characteristics • Feasibility • cost • human resource • physical resources
Test characteristics • Acceptability • tolerable effort • reasonable cost • acceptable to: doctors, licensing bodies, employers, patients/consumer groups, students, faculty
The role of examiners in OSCEs • General • Types of stations • Standard setting • Practice at marking
The role of examiners in OSCEs • To observe the performance of the student at a particular task • To score according to the marking schedule • To contribute to the good conduct of the examination
The role of examiners in OSCEs • It is NOT to: • Conduct a viva voce • Re-write the station • Interfere with the simulated patient's role • Design their own marking scheme • Teach
Types of OSCE stations • History taking • Explanation • Clinical examination • Procedures
Communication skills • Stations involving patients, simulated patients or volunteers • Content vs process, i.e. what the candidate says vs how the candidate says it
Clinical skills • Stations with people: professional behaviour is expected • Stations with manikins: describe actions to the examiner
The examiner’s role in standard setting • Use your clinical expertise to judge the candidate’s performance • Allocate a global judgement on the candidate’s performance at that station • Remember the level of the examination
Global scoring Excellent pass Very good pass Clear pass Borderline Clear fail
Borderline method [Diagram: a station checklist with a total score; the test score distribution across all candidates; examiners also give each candidate a global rating of pass, borderline or fail; the score distribution of the borderline group determines the passing score]
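In the borderline group method, the passing score for a station is typically taken as the mean checklist score of the candidates the examiner rated "borderline". A minimal sketch with hypothetical candidate data (scores and ratings are invented for illustration):

```python
# Borderline group method for setting a station's passing score.
from statistics import mean

# Each candidate: (checklist score out of 20, examiner's global rating)
candidates = [
    (18, "clear pass"),
    (15, "clear pass"),
    (11, "borderline"),
    (10, "borderline"),
    (12, "borderline"),
    (6,  "clear fail"),
]

# Passing score = mean checklist score of the "borderline" group
borderline_scores = [s for s, rating in candidates if rating == "borderline"]
passing_score = mean(borderline_scores)
print(passing_score)  # 11
```

Note that the method needs enough borderline candidates per station for the mean to be stable, which is one reason the regression-based variant on the next slide is often preferred.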
Regression-based standard [Diagram: a station checklist with a total score, plus an overall global rating for each candidate (1 = clear fail, 2 = borderline, 3 = clear pass, 4 = very good pass, 5 = excellent pass); checklist scores are regressed on the global ratings, and the passing score is read off the regression line at the borderline rating]
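The regression-based (borderline regression) standard fits a line of checklist score against the 1-5 global rating across all candidates, then takes the predicted score at the "borderline" rating (2) as the passing score. A sketch with invented data, using ordinary least squares from the standard library only:

```python
# Borderline regression standard setting (hypothetical candidate data).
ratings = [1, 2, 2, 3, 3, 4, 5]        # global ratings: 1 = clear fail ... 5 = excellent pass
scores  = [5, 9, 11, 13, 14, 17, 19]   # checklist scores out of 20

# Ordinary least squares fit: score = intercept + slope * rating
n = len(ratings)
mean_x = sum(ratings) / n
mean_y = sum(scores) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
         / sum((x - mean_x) ** 2 for x in ratings))
intercept = mean_y - slope * mean_x

BORDERLINE_RATING = 2
passing_score = intercept + slope * BORDERLINE_RATING
print(round(passing_score, 2))
```

Because every candidate's rating contributes to the fitted line, this variant uses the whole cohort rather than only the (possibly small) borderline group.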
Practice at marking • Videos • Live stations • Mini-OSCE
Strategies for enhancing examiner participation • CME • Job plan/ part of contract • Specific allocation of SIFT • Experience for post-graduate examinations • Payment