Awarding global scores in OSCEs: evaluation of an online examiner training resource
Dr Gerry Gormley, Clare Thomson, Dr Kieran McGlade and Eamonn O’Hagan
Centre for Medical Education
Background
• OSCEs are commonly used in assessing aspects of clinical competency
• The borderline regression (BLR) method is widely used for standard setting in OSCEs
• Assessors award a ‘global score’ on a candidate’s performance
• The global score is then regressed against the total item score for each station (a minimal sketch of this step follows below)
• The process is dependent on assessor performance
• Need for improved assessor training and maintenance of skills (Pell et al., Med Teach 2010)
• Training has an impact on the marks awarded by assessors (Pell et al., J Res Meth in Education 2008)
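The sketch below illustrates the regression step described above. It is a minimal, hypothetical example, assuming a 0–4 global-score scale with 2 as the “borderline” grade; the function name and station data are invented for illustration and are not taken from the study.

```python
# Minimal sketch of the borderline regression (BLR) standard-setting method.
# Assumption: global scores run 0-4 and a grade of 2 means "borderline";
# the station data below are invented for illustration only.
import numpy as np

def blr_pass_mark(global_scores, checklist_totals, borderline_grade=2):
    """Regress checklist totals on global scores for one station and
    return the checklist score predicted at the borderline grade."""
    slope, intercept = np.polyfit(global_scores, checklist_totals, deg=1)
    return slope * borderline_grade + intercept

# One global score and one checklist total per candidate at a single station
global_scores    = np.array([0, 1, 1, 2, 2, 2, 3, 3, 4, 4])
checklist_totals = np.array([8, 11, 12, 14, 15, 13, 18, 19, 22, 23])

print(f"Station pass mark: {blr_pass_mark(global_scores, checklist_totals):.1f}")
```

Because the pass mark is read off the fitted line at the borderline grade, systematic differences in how assessors award global scores shift the pass mark, which is why assessor performance and training matter to this method.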
• Increasing and competing demands on NHS staff acting as OSCE assessors
• An e-training resource was developed at QUB for OSCE assessor training, covering:
• Explanation of the philosophy and process of OSCEs
• Examiner conduct and role
• Practice in marking OSCEs
• Practice in awarding global scores and comparing responses with peers
• Open access to doctors (requires GMC / IMC numbers)
• OSCE videos scripted and acted to represent a range of student ability
Figure: Example of global scores awarded by users for the Final MB OSCE videos (y-axis: percentage of users)
Aim
To evaluate the perceived impact of an e-training resource on assessors’ ability to award global scores in OSCEs
Users of the resource were asked to complete an evaluation questionnaire:
• User details: speciality; years since qualification
• *Perceived level of: overall usefulness of the training resource; specific usefulness of comparing global scores / free-text comments with peers
• *Perceived impact on confidence in awarding global scores in OSCEs: pre and post levels of confidence in awarding global scores
• Preference regarding online vs face-to-face training in awarding global scores
• Free-text comments on what users gained from the resource and the perceived impact on their future practice
*5-point Likert scale (Strongly agree – Strongly disagree)
• Simple descriptive statistics used to analyse responses
• Paired-sample t-test to compare pre- and post-confidence scores (illustrated below)
• Thematic analysis of free-text comments
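A minimal sketch of the paired-sample t-test step, assuming confidence was rated on a 5-point scale; the ratings and variable names below are placeholders, not the study’s data.

```python
# Minimal sketch of a paired-sample t-test comparing pre- and post-training
# confidence ratings. The ratings below are invented placeholders.
from scipy import stats

pre_confidence  = [2, 3, 2, 3, 1, 2, 3, 2, 2, 3]   # before using the resource
post_confidence = [4, 4, 3, 4, 3, 4, 4, 3, 4, 4]   # after using the resource

t_stat, p_value = stats.ttest_rel(post_confidence, pre_confidence)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```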
Results
• Analysis performed after the first 100 users (i.e. February – April 2011)
• 75% of users had examined in OSCEs before using the resource
Figure: Reported speciality background of users (y-axis: percentage of users)
Figure: Reported duration since qualification of users (x-axis: years since qualification; y-axis: percentage of users)
Figure: Reported level of confidence in awarding global scores before and after using the e-training resource (y-axis: percentage of users; P<0.001)
Figure: Usefulness of comparing (i) global scores and (ii) free-text comments with fellow users (y-axis: percentage of users)
Figure: Preference regarding location of training in awarding global scores (y-axis: percentage of users)
• 364 videos were observed (and global scores awarded)
• After awarding a global score and comparing it with peers, users were asked: “In light of comparing your judgement with that of your fellow users, would you now consider changing your global score?”
• For 13% (47/364) of all videos observed, users reported YES:
• 11 would upgrade the global score awarded
• 36 would downgrade the global score awarded
Thematic analysis of 100 free-text statements regarding: “After using this e-training resource, what have I learned about awarding global scores in OSCEs?”
Thematic analysis of 100 free-text statements regarding: “After using this e-training resource, how will my practice change in the future when awarding global scores in OSCEs?”
Conclusions
• The e-training resource appears to be valued by new and established OSCE assessors, particularly the opportunity to benchmark global-score decisions with peers
• Perceived positive impact on assessors’ confidence in awarding global scores
• Strong preference for online compared to face-to-face training
• For many, the resource reinforced that their current practice in awarding global scores is in keeping with that of their peers
• For others, it brought to their attention: the degree of assessor variance; the need to be more diligent when assessing in OSCEs; and the importance of considering candidates’ level of studies
• The impact on actual assessor variance in OSCEs remains to be seen
Acknowledgements
• Funding and support: The Higher Education Academy Subject Centre for Medicine, Dentistry & Veterinary Medicine
• Website development: Clare Thomson and Eamonn O’Hagan
• Audio-visual Dept (QUB): Amanda McKittrick and Stephen Mullan
• ‘The actors’: Dr Phil Cooke, Dr Caroline Gormley, Dr Catherine Coyle, Andy Crooks, Frank Crummey, Joanna Fyffe
• Statistics: Dr Jenny Johnston