Faculty development: Workplace-based assessments
Aims • Rehearse the purpose of WBA • Review the tools • Discuss the feedback process and its potential benefits • Outline the requirements for WBAs • Allow opportunity for practice and discussion
Faculty development strategy • Produce materials • Train the trainers • Cascade training • Faculty development is the single most important way to improve training
Faculty observation / rating skills • Ratings based mostly on perceived knowledge and personality • Little evidence of direct observation • Tendency to score high • Poor inter-rater reliability (Gray, Thompson, Haber, Grant, etc.)
Approaches to assessor training • Behavioral Observation Training – training to observe and look for actions • Performance Dimension Training – familiarise faculty with the elements of competency and agree definitions • Frame of Reference Training – having defined the performance dimensions, agree those clinical situations which might discriminate between different levels of trainee • Direct Observation of Competence Training – a combination of the above using live practice
Training by direct observation • Shown to increase trainers' satisfaction with the process • Increased comfort in observation • Changed rating behaviour at 8 months • Increased accuracy in identifying unsatisfactory performance (Holmboe)
Your experiences • When was WBA good? • For you • For the trainee • What made it good? • What training have you already had? • What are the outstanding issues you want addressed?
Purpose of Assessment? • To aid learning through constructive feedback: • Assessment for Learning (formative) • Done frequently e.g. WBA • To check knowledge or skill has been learned: • Assessment of Learning (summative) • Done infrequently e.g. Exams
Who is interested in the results? • To distinguish between the competent and the insufficiently competent (Public/NHS interest) • To ensure trainees are fit to progress with their training (Deanery/Faculty interest) • To ensure that trainees are performing satisfactorily within the precepts of the licensing body (GMC interest) • To show achievement (Trainee interest)
[Figure: Miller's pyramid mapped to assessment methods – Knows: MCQ; Knows how: SAQ/viva; Shows how: OSCE; Does: WBA. The College of Emergency Medicine]
In other words… Assessment can be effective in improving performance if: • It looks at the whole of professional behaviour • It looks at performance rather than competence • It encourages practice (to make perfect) • It focuses on continuous improvement, not a final snapshot
The subtext of WBA • Encourages reflection by the trainee • Should support trainees' personal development plans – and therefore help avoid exam disaster • Should enhance trainer-trainee contact • Should allow trainers to calibrate themselves • May provide CPD for the assessor
Essential elements of beneficial WBAs • Culture of support – it is not an exam • Recorded, and that record reviewed • Encouraging – identifying learning opportunities • Includes constructive, thoughtful feedback • There is an expectation of completion • Standardisation and trajectories – low scores are the norm at the beginning of training • Explicit end points and standards • Open, transparent "scoring" • No retrospective recording, and the trainee must have been present
Invisible elements of practice that influence the outcomes of assessment • The significance of context • The doctor's professional values • The kinds of knowledge used • The clinical/critical thinking • The professional judgement exercised • The therapeutic relationship developed • The learning by reflection
The tools • Mini-CEX • DOPS • CbD • MSF • ACAT-EM • Leadership tool (Lynsey Flowerdew)
[Figure: What does CbD assess?]
Assessment theory • Assessment drives learning • Formative vs summative • Tools fit for purpose: • Reliable • Valid • Feasible • Educational • Acceptable • Assessment reflects the syllabus • Structure is reliable – i.e. multiple sampling (fatigue)
Formative • Term coined in 1967 • A bi-directional process • Allows you to modify your training • Allows you to set trainee goals • Furnishes the trainee with lifelong skills • Motivates trainees to learn • Assessment for learning
Summative • Reflects learning up to that point • Provides information on the ‘product’ • Needs well designed evaluation tools • Needs to have objectivity • Needs to have multiple components • Assessment of learning
Reliability of individual assessment • Familiarity with the tool • Observational skills • Knowledge of standard
Validity of assessments • Content • Face? – Hawthorne effect • Predictive? – too early to tell
Feasibility • Core trainees – 1 per 2.5 weeks (effectively 1 per week once holidays etc. are accounted for) • Higher trainees – 1 per 3 weeks • Feasibility therefore depends on your total number of trainees (see the worked example below)
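A rough worked example using the rates above (the headcounts are hypothetical, not from the slides): a department with 10 core trainees and 6 higher trainees would need to deliver about 10 ÷ 2.5 + 6 ÷ 3 = 4 + 2 = 6 WBAs every week, shared across the available assessors – which is why the total number of trainees, not the per-trainee rate, drives feasibility.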
Mini-CEX • Purpose – the clinical encounter; focuses on clinical skills: history taking, examination and analysis of information to reach a diagnosis • Reliability – shown to correlate with performance in senior doctors in the USA • Problems – standards, commitment to use, conflict between summative and formative use, repetitive, rarely translates into an action plan • Solutions – enhanced feedback, use it to plan development, use it to enhance contact
DOPS • Purpose – to identify problems with procedural skills • Reliability – variable • Problems – tick-box exercise, undervalued (most people rated "ok"), not enough procedures to go round, no allowance for the complexity of the procedure • Solutions – experts to do this assessment, define the standards more clearly
CbD • Purpose – to explore the thinking behind decisions; to review the content of a case and its outcomes, safety, use of investigations etc. • Reliability – good at identifying cognitive problems • Problems – depends on the trainer: requires preparation and lateral thinking; can be reduced to a tick-box exercise • Solutions – understand its purpose
MSF • Purpose – to "hold a mirror up"; provides the trainee with an opportunity to focus on behaviour • Reliability – strongest evidence base, most valued; shown to predict those with problems and failing trainees • Problems – lack of participation, difficulty feeding back • Solutions – practice, culture
ACAT-EM • Purpose – to collect ongoing evidence of interactions with many people; an opportunity for feedback on non-technical skills (NTS) • Reliability – unknown • Problems – time-consuming, uncertainty over how to conduct it, standard not clear • Solutions – training, practice, feedback
Feedback • "Feedback is frequently reported to be too general or too late to be helpful" (Westberg et al. 2001) • 8% of Mini-CEX assessments resulted in documented constructive feedback (Holmboe 2004)
Why don't assessors give feedback? • Focus on the assessment process • Lack of skill/training • Documentation design • Lack of understanding of the role of feedback (Holmboe 2004)
What is 'good' feedback? • Specific • Timely • Balanced • Based on observed facts • Non-judgemental • Promotes reflection • Results in an action plan (Norcini and Burch 2007)
Feedback • Feed forward • A process of reflection • Pendleton's rules – reflect on what went well, then on what could be done better (trainee first, then assessor) • Use descriptors/domains • Construct an action plan • Return to the evidence after a period of time – check that actions have been completed
Challenges • Time pressure • Space/privacy • Breaking bad news • Trainee defensiveness • Trainer ambivalence • Halo effect
Summary • Multiple tools – you must be familiar with them • Feedback and action plans are crucial • WBA is assessment for learning • Demands time and energy but is rewarding (and an eye-opener)