Identifying and Supporting Doctors in Difficulty – Bringing Research Into Practice Dr Mumtaz Patel and Dr Joanne Rowell Postgraduate Associate Deans, HEE NW
Outline of Session • Update on current research in this area and its impact on practice (role of current and new assessment tools) • Understanding the reasons why trainees get into difficulty • Focus on how trainers can identify and manage struggling trainees earlier • Discuss strategies for optimising educational supervision (improving the quality of the ESR and "holistic educational supervision")
Doctors in Difficulty (DiD) • Internationally, up to 10% of trainees fail to meet the standards of their training programmes (Tabby, 2011) • Nationally, 2-6% of all doctors experience difficulty that raises concern about their performance (NCAS, 2006); estimates vary (NACT, 2008) • Doctors in the early stages of training are more likely to get into difficulty (Brennan, 2010)
Common Presentations (Firth-Cozens, 2004)
Potential triggers of concern • Patterns of repetitive behaviour (rather than one-off incidents) • Sudden, out-of-character behaviour • Sickness • Serious one-offs that are rationalised by the trainee, e.g. a small lie
Early Signs and Identification (Paice, 2004)
Previous DiD Research • Being a "good doctor" means more than technical and clinical competence, skills and knowledge (Cox, 2006) • Doctors can face difficulty from a variety of sources that influence their performance (Patterson, 2013) • Traditional and summative assessments of competence do not reliably predict performance (Rethans, 2002) • Many DiD have behavioural and professionalism issues, which are difficult to assess (Wilkinson, 2009) • This raises the question: how good are WPBA at predicting DiD?
WPBA and DiD • Miller's pyramid for assessment (axis: professional authenticity) • WPBA – value in predicting doctors in difficulty?
Previous DiD Research (2) • DiD behave differently from their peers (Barrow, 2005) • Key features described by Paice (2004): - lack of insight - rejection of constructive criticism - defensiveness and rigidity, with poor tolerance - avoidance of challenging situations • WPBA by their nature require trainees to proactively select challenging situations and to be open to criticism • This raises the question of whether DiD use WPBA differently
Previous WPBA in DiD Research • Mitchell (2011): retrospective observational study of 1646 Foundation trainees (92 DiD) - overall lower mean CBD scores in DiD - mean WPBA score had very weak predictive value • Mitchell (2013): 76,115 WPBA in 1900 Foundation trainees - 3 major themes in how DiD choose: low-difficulty assessments (CBD and mini-CEX), familiar procedures (DOPS), and assessor profession and seniority (nurse and non-clinical)
Changes to WPBA – 2012 • In 2012, SLEs replaced traditional WPBA • SLEs use the established set of WPBA (mini-CEX, CBD, ACAT) with greater emphasis on constructive feedback and action plans captured as free text • Key elements of SLEs: - incorporate trainee reflection and structured feedback to drive learning - reduce the overall number of assessments - identify training issues earlier
SLEs and DiD • Miller's triangle for assessment • Supervised Learning Events (SLEs) – the new WPBA introduced in 2012, intended to identify training issues early • Value in predicting doctors in difficulty?
Study Aim • Explore the value of Supervised Learning Events in predicting Doctors in Difficulty
Methods • 1. Retrospective case-control study examining foundation trainee portfolios • 2. Qualitative interview-based study exploring senior educators' views on the value of the newer WPBA in predicting DiD
Methods (1) – Retrospective case-control study • Subjects: NW Foundation School trainees (2012–13 cohort, n=1086) • Cases: all DiD (n=71); controls from the same cohort (n=142) • Data: anonymised e-portfolios – SLEs (mini-CEX, CBD, DOPS, DCT), TAB and ESR • Free text assessed qualitatively and coded blindly using GMC Good Medical Practice (GMP) domains
Results: Prevalence of DiD 71 DiD from 1086 FY trainees = 6.5%
Results – Binary logistic regression • Dependent variable: difficulty status • Single covariate: cumulative scores from GMC GMP domains • Sensitivity and specificity reported for TAB, ESR and mini-CEX
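For readers who want to see what an analysis of this kind looks like in practice, the following is a minimal sketch (not the study's actual code or data): a binary logistic regression of difficulty status on a single cumulative GMP domain score, using statsmodels in Python with simulated, purely illustrative scores for 71 DiD and 142 controls.

```python
# Minimal, illustrative sketch (not the study's actual analysis or data):
# binary logistic regression of difficulty status on a cumulative GMC GMP
# domain score, with sensitivity/specificity at a 0.5 probability cut-off.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=0)

# Simulated scores for 71 DiD (1) and 142 controls (0); distributions are invented
y = np.array([1] * 71 + [0] * 142)
score = np.where(y == 1,
                 rng.normal(4.0, 1.5, y.size),   # DiD assumed to attract more concern codes
                 rng.normal(2.0, 1.5, y.size))   # controls assumed to attract fewer

X = sm.add_constant(score)                       # intercept + single covariate
model = sm.Logit(y, X).fit(disp=False)

pred = (model.predict(X) >= 0.5).astype(int)     # classify at p >= 0.5
tp = int(np.sum((pred == 1) & (y == 1)))
tn = int(np.sum((pred == 0) & (y == 0)))
fp = int(np.sum((pred == 1) & (y == 0)))
fn = int(np.sum((pred == 0) & (y == 1)))

print(f"Sensitivity: {tp / (tp + fn):.2f}")
print(f"Specificity: {tn / (tn + fp):.2f}")
```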
Results – Association Analyses (Fisher's exact test) • Predicted DiD estimated from the cumulative GMC GMP domain scores for each SLE • Association with actual difficulty status tested separately for TAB, ESR and mini-CEX
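A similar hedged sketch for the association analysis: a 2×2 table of predicted versus actual DiD tested with Fisher's exact test. The counts below are invented for illustration (they only match the study's group sizes of 71 DiD and 142 controls), so the odds ratio and p-value mean nothing beyond demonstrating the method.

```python
# Illustrative sketch only: Fisher's exact test of the association between
# predicted DiD (from cumulative GMP domain scores on one SLE type) and
# actual difficulty status. Counts are made up; row totals are simply chosen
# to match the study's 71 DiD and 142 controls.
from scipy.stats import fisher_exact

#                        actual DiD   actual control
table = [[40, 15],     # predicted DiD
         [31, 127]]    # predicted not DiD

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Odds ratio: {odds_ratio:.2f}, p = {p_value:.4g}")
```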
Conclusion • TAB/MSF is the only SLE useful in predicting DiD • The Educational Supervisor Report plays a pivotal role in predicting and evaluating DiD • SLEs have the potential to predict DiD but are not used to their full potential, with a lack of constructive (especially negative) feedback • M Patel, S Agius, J Wilkinson, L Patel, P Baker, Med Educ 2016; 50: 746–756
Study 2… • Factors affecting whether WPBA predict DiD • Understand and explore the reasons behind this
Methods (2) • Qualitative interview-based study involving senior staff from Health Education England North-West (HEE NW) actively involved in postgraduate medical education (n=15). • Semi-structured interviews conducted until data saturation achieved using Grounded Theory principles. • Thematic analysis done using the Framework method.
Activity • On a scale of 1–10, how effective do you think the current and new WPBA are at: a) evaluating clinical competence b) evaluating performance c) evaluating professionalism d) predicting doctors in difficulty • Which WPBA/SLE do you think are better or worse? • What factors affect whether WPBA predict DiD?
Results – Thematic analysis • Types of WPBA: TAB/MSF useful and specific but not sensitive; the rest of limited value • Content and quality of WPBA: poor, owing to lack of detail and discriminatory value • Assessor critique: time constraints affect trainer engagement and WPBA completion; quality of feedback often poor, especially lacking negative feedback • Trainee critique: trainee bias in choice of assessors/cases; trainees lack understanding of WPBA and use them incorrectly
Results – Narratives… Theme 1 • MSF/TAB gives a good perception of the trainee in the holistic sense and picks up behavioural and professionalism issues very well (P6) • MSF encompasses the breadth of consultant opinion and contextualises individual areas to give an overall opinion of a doctor's performance (P15) • SLEs are similar to older WPBA but less of a tick-box exercise, with more free text (P4) • SLEs… they have just changed the name with a few more free-text boxes but are useless if they are not completed properly (P11)
Results – Narratives… Theme 2 • WPBA are as good as the information you put in… the process is good but they are not used as they should be (P11) • Feedback given to trainees is really important but is often poor, overwhelmingly positive and not constructive (P3) • Failing the failures rarely happens… failures need to be discussed openly as a profession (P14)
Results – Narratives… Theme 3 • Many trainers are not engaging with WPBA and still see it as a tick-box exercise … WPBA need to be ubiquitous and accepted by all (P2) • Less time pressure will improve willingness to engage with the process and improve completion of the WPBA (P9) • Hawk versus dove effect with some assessors being very harsh with their marking and others quite lenient (P15)
Results – Narratives… Theme 4 • Trainees can be a little contrived as they can choose consultants who are sympathetic and less likely to give them negative feedback (P4) • Trainees could see the first 5 patients in clinic or on a ward round and then the WPBA are done… this would make the assessment more valid and less artificial (P13) • Trainees select assessors who will be kind to them and behave with them in such a way as to create the impression they want assessors to see (P14)
Recommendations… What next? • Greater use of TAB with more structured feedback • Improve the quality of the ESR with better triangulation of information from WPBA • Improve training of trainers and trainees to enhance engagement and improve the quality of WPBA and ESR completion and feedback
Subsequent Research • Assessment of the quality of ESRs has now been initiated during ARCPs using a standardised framework • Following individual formative feedback to educational supervisors, we have shown significant improvement in the quality of successive ESRs: - more detailed reports synthesising evidence from a number of sources - more constructive feedback to trainees, with clear suggestions of learning outcomes to be achieved and incorporated into the trainee's PDP - an increase in 'excellent' gradings in reports
ESR Feedback – Trainer perceptions • Qualitative assessment of the feedback from educational supervisors was overwhelmingly positive • The structured form and individual formative feedback were seen as very helpful • Many commented that just knowing which domains needed to be addressed and completed was really useful • No negative comments about the structured form or the feedback received, other than one supervisor who mentioned that time constraints affected the quality of their ESR
Conclusion • A simple structured form to assess the quality of ESRs during ARCPs can: - provide useful formative feedback to educational supervisors - significantly improve the quality of successive ESRs • Recommendations include: - rolling this process out across all medical specialties - extending it to larger programmes such as CMT and Foundation
Qualitative Feedback – SLEs • Qualitative assessment of the feedback from trainers and trainees was positive • They welcomed the feedback and found it useful • Useful to know which domains were being assessed and how to improve quality • However, many trainees felt SLEs were not much different from traditional WPBA: - still seen as a tick-box exercise - not done in a timely fashion - poor-quality feedback that is not very formative
Steps to take this forward… This work has now been rolled out: • Regionally, through the School of Medicine at HEE NW • Nationally, through the Renal SAC and the JRCPTB via External Advisor training • Working to roll out to Foundation through Horus • Trainee feedback on the quality of ESRs added in 2016 • Assessment of the quality of SLEs started in 2016 • Workshops set up and delivered to trainers and trainees at induction and meetings to improve engagement and the quality of completion of ESRs/WPBA/SLEs
DEMEC 2015 • My work was presented alongside other DiD work • East Midlands Group – Sathya Naidoo: - showed WPBA (ESR, MSF, PSQ) reliably predicted poor performance (exams, additional training time at ARCP) in GP trainees - developed an assessment tool (matrix) to identify struggling trainees early - intervention with a targeted questionnaire to identify which factors are contributing to their difficulty - providing support early to reduce longer-term problems and improve outcomes
Developing an Assessment Tool to Identify DiD (collaboration with the East Midlands Group) • Our study has shown TAB and ESR to be strongly predictive of DiD; mini-CEX weakly positive • Although the data are qualitative, they can be scored numerically (0, 1, 2 for no, some or major concern) • Easily retrievable from Horus • FPAT selection scores – quantitative scores from the application process – can be evaluated for predictive value • These elements can be factored into a "risk" assessment tool for predicting DiD (see the sketch below)
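To make the idea concrete, here is a hedged sketch of how such a tool might combine the 0/1/2 concern codes with a selection score. It is illustrative only: the weights, threshold, field names and the handling of the FPAT-style score are all assumptions, not the tool under development.

```python
# Hypothetical sketch of a DiD "risk" score combining 0/1/2 concern codes
# from WPBA evidence with a selection score; the weights and threshold are
# illustrative only, not the validated tool described in the slides.
from dataclasses import dataclass

@dataclass
class TraineeEvidence:
    tab_concern: int        # 0 = no, 1 = some, 2 = major concern from TAB/MSF
    esr_concern: int        # 0/1/2 concern coded from the ESR free text
    mini_cex_concern: int   # 0/1/2 concern coded from mini-CEX free text
    selection_score: float  # FPAT-style application score (higher = better; assumed scale)

def risk_score(e: TraineeEvidence) -> float:
    """Weighted sum: stronger predictors (TAB, ESR) weighted more heavily."""
    wpba_component = 3 * e.tab_concern + 3 * e.esr_concern + 1 * e.mini_cex_concern
    selection_component = max(0.0, 80 - e.selection_score) / 10  # low scores add risk
    return wpba_component + selection_component

def flag_for_early_support(e: TraineeEvidence, threshold: float = 4.0) -> bool:
    """Flag trainees whose combined score exceeds an (illustrative) threshold."""
    return risk_score(e) >= threshold

# Example usage with made-up values
trainee = TraineeEvidence(tab_concern=1, esr_concern=1, mini_cex_concern=0,
                          selection_score=72.5)
print(risk_score(trainee), flag_for_early_support(trainee))
```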
Current Research in Progress – Doctors and Dentists Review Group (DDRG) Study, HEE NW • Analysing the DDRG/Foundation DiD cohort to look at patterns of referral, the process of management and the outcome of each case, and to see whether the process does what it is intended to do • Assessing individual cases to evaluate whether earlier intervention could have altered the outcome • Evaluating the support provided to trainees referred to the DDRG/Foundation DiD process and the trainees' perception of the support received • Collaborative work with the GMC – longer-term outcomes for DiD
Current Research in Progress • SLE feedback – comparing foundation trainees' perceptions of feedback with the actual feedback given (MSc project): - foundation trainees from HEE NW (75-80) - 4 centres – questionnaire-based study and focus groups on educational supervision, use of portfolios and SLE feedback - anonymised SLEs from trainee portfolios used to evaluate the actual feedback • Trainee perceptions of the value of WPBA in predicting DiD (Medical Education Fellow project): - higher specialty trainees in HEE NW (80) - questionnaire-based study and focus groups - assesses trainees' perceptions of how struggling trainees can be identified and supported earlier
Collaborative Research • JRCPTB – Clinical Lead for Quality: - recently analysed 6 key quality datasets for 29 medical specialties and 3 subspecialties - analysed equality and diversity data for over 15,000 trainees across all 9 protected characteristics - compared outcomes (recruitment, ARCP, MRCP, PYA outcomes, New Consultant Survey, GMC NTS) by specialty and deanery - mapped across the GMC themes for standards of medical education - to be published Summer/Autumn 2017 • GMC – Differential Attainment: - understanding the barriers to progression - educational environment and culture are a huge issue - importance of a more holistic approach to educational supervision • M Patel, S Agius, Med Educ 2017; 51: 342–350
GMC Fair Training Pathways for All • Being 'different' from the majority was perceived to be one of the most significant risks to progression – for example, inexperience with UK systems and/or cultural norms; this was more significant for IMGs than for BME UK graduates • Being able to personally influence change affects perceptions of how difficult or easy it is to tackle risk; easier to tackle at a personal or organisational level than at a regional or national level • Perceived barriers to challenging risks included a lack of knowledge or understanding about differential attainment, a lack of available information or evidence about good practice, and sensitivities about race • Attitudes towards asking for and accepting support are changing; a range of interventions and good practice is already in place across postgraduate education and employers, including training for trainers and examiners, training and support for trainees, transparency around data, engagement with stakeholders, and designing recruitment and assessments to minimise bias • http://www.gmc-uk.org/about/research/30991.asp
Summary… • A brief overview of some of the evidence around identifying and supporting trainees early • A flavour of some of the previous and current research work in this area and how it can be applied in practice • Identifying trainees at risk earlier, understanding the reasons why they are struggling, and providing earlier interventions and support will ultimately help improve their outcomes