nMRCGP Workplace-based Assessment, April 2007
Workplace-based assessment “The evaluation of a doctor’s progress over time in their performance in those areas of professional practice best tested in the workplace.”
Some principles of assessment • Validity • Reliability • Educational impact • Acceptability • Feasibility
Lessons learnt • Need to build up a whole picture • No single method is a panacea • Content important, not format • Less structure, more sampling • Focus on a programme of assessment, not individual methods. Van der Vleuten C & Schuwirth L. Assessing professional competence: from methods to programmes. Medical Education 2005; 39: 309-317.
Why workplace-based assessment? • Tests something important and different from other components • Reconnects assessment with learning • Has high educational impact • Valid and reliable • In keeping with PMETB guidance
The WPBA framework • Competency-based • Developmental • Evidential • Locally assessed progress monitoring • Triangulated • Nested within an “e-portfolio” • Applies over the entire training envelope
Competency-based • 12 competency areas • Best tested in the workplace setting • Developmental progression for each competency area • Competency demonstrated “when ready”
The 12 competency areas 1. Communication and consulting skills 2. Practising holistically 3. Data gathering and interpretation 4. Making a diagnosis/making decisions 5. Clinical management 6. Managing complexity and promoting health 7. Primary care administration and IMT 8. Working with colleagues and in teams 9. Community orientation 10. Maintaining performance, learning and teaching 11. Maintaining an ethical approach to practice 12. Fitness to practise
Developmental progression • Insufficient evidence → Needs further development → Competent → Excellent
Evidential • Notion of multiple sampling • From multiple perspectives • Tool-box of “approved” methods • Sufficiency of evidence defined
Local assessment of progress • Assessed by clinical supervisor in hospital or general practice setting • Regular reviews at 6-month intervals by trainer/educational supervisor • Review all the assessment information gathered • Judge progress against competency areas • Provide developmental feedback
Gathering the evidence about the learner’s developmental progress
Evidence from • Specified RCGP tools and… • Naturally occurring information
Specified tools • CBD (case-based discussion) • COT (consultation observation tool) • PSQ (patient satisfaction questionnaire) • MSF (multi-source feedback) • mini-CEX (clinical evaluation exercise) • DOPS (direct observation of procedural skills)
Case-based discussion • Structured oral interview • Designed to assess professional judgement • Across a range of competency areas • Starting point is the written record of cases selected by the trainee • Will be used in general practice and hospital settings
COT • Tool to assess consultation skills • Based on MRCGP consulting skills criteria • Can be assessed using video or direct observation in the general practice setting
Mini-CEX • Used instead of COT in hospital settings
DOPS • For assessing relevant technical skills during GP training: • Cervical cytology • Complex or intimate examinations (e.g. rectal, pelvic, breast) • Minor surgical skills • Similar to F2 DOPS
MSF • Assessment of clinical ability and professional behaviour • ST1: rated by 5 clinical colleagues on 2 occasions • ST3: rated by 5 clinical and 5 non-clinical colleagues on 2 occasions • Simple web-based tool • Is able to discriminate between doctors BUT • Needs skill of trainer in giving feedback
PSQ • Measures consultation and relational empathy (CARE) • 30 consecutive consultations in GP setting • Can differentiate between doctors BUT • Needs skill of trainer in giving feedback
Naturally occurring evidence • From direct observation during training • “tagged” against appropriate competency headings • Other practice-based activities • Clinical supervisor’s reports (CSR)
Workplace-based assessment: ST1
First interim review (after 6 months), based on evidence: • 3 x mini-CEX or COT* • 3 x CBD • 1 x MSF • DOPS** • Clinical supervisor's report**
Second interim review (after a further 6 months), based on evidence: • 3 x mini-CEX or COT* • 3 x CBD • 1 x MSF • 1 x PSQ* • DOPS** • Clinical supervisor's report**
Evidence then goes to the Deanery panel.
* if GP post ** if appropriate
Workplace-based assessment: ST2
First interim review (after 6 months), based on evidence: • 3 x mini-CEX or COT* • 3 x CBD • DOPS** • Clinical supervisor's report**
Second interim review (after a further 6 months), based on evidence: • 3 x mini-CEX or COT* • 3 x CBD • DOPS** • Clinical supervisor's report**
Evidence then goes to the Deanery panel.
* if GP post ** if appropriate
Workplace-based assessment: ST3
Interim review (after 6 months), based on evidence: • 6 x COT • 6 x CBD • 1 x MSF • DOPS** • Clinical supervisor's report**
Final review (month 34), based on evidence: • 6 x COT • 6 x CBD • 1 x MSF • DOPS** • PSQ
Evidence then goes to the Deanery panel.
* if hospital post ** if appropriate
Deanery Panels • Chaired by nominee of GP Director • ‘Gold Guide’ compliant • Membership: • Nominated Chair • Experienced GP trainer/Programme Director • RCGP assessor • Lay member
ST1/ST2 • Deanery panel review of e-portfolios of all trainees • Face-to-face review with trainees if a problem with progress is identified • Annual assessment outcome (AAO) report generated • Logged within e-portfolio
The Final Judgement • Recommendation of satisfactory completion of WPBA by educational supervisor to Deanery Director • Based on attainment of competence in each of 12 competency areas • Deanery panel review of e-portfolio +/- face to face interview • e-portfolio “sign off” or recommendation for further training
Quality control • Deanery: specification, training, calibration, moderation • National: sampling of ETR, verification and audit, appeals process