1. General Surgical Pathology Competency using Virtual Slides Leslie Bruch, M.D.
University of Iowa
2. Background Need exists for competency assessment in pathology training
ACGME mandated
Evaluate the effectiveness of our program
Relative lack of structured objective measures of competency during pathology training
The ASCP RISE is one of the only measures currently available
3. Background Common methods of evaluation such as global evaluations have limitations
Often subjective
Reliability often poor
Literature review for pathology
Few examples of performance-based competency tools in either AP or CP
Few examples of tools using virtual slides
4. Literature Review Surgical Pathology Competency Evaluation
Longitudinal Case-Based Evaluation of Diagnostic Competency Among Pathology Residents
(Ducatman and Ducatman. Arch Pathol Lab Med. 2006;130:188-193)
Describes a method for assessing resident competency in diagnostic surgical pathology over the course of training using routine cases
More objective than traditional evaluations
Increased systems accountability
Fosters earlier identification & remediation of deficiencies
5. Literature Review Use of Virtual Slides as compared to glass
95-100% concordance in diagnostic accuracy in validation studies done to date
Examples of studies:
Development and evaluation of the virtual pathology slide: a new tool in telepathology
(Costello SS et al. J Med Internet Res. 2003;5:e11)
Compared diagnostic accuracy and acceptability between virtual & glass slide diagnoses among pathologists and pathology trainees
User satisfaction relatively high
Ease of use and confidence evaluated
Virtual slides can be used to make correct diagnoses
6. Literature Review Use of whole slide imaging in surgical pathology quality assurance: design and pilot validation
(Ho J, Parwani AV, Jukic DM, Yagi Y, Anthony L, Gilbertson JR. Hum Pathol. 2006;37:322-331)
Compared glass slides with whole slide imaging for QA review of 24 GU cases (391 slides)
No difference in perceived case complexity
No difference in diagnostic confidence
Identical number of discrepant cases using virtual & glass slides for the review
Virtual slide evaluation took longer
Image quality affected accuracy on virtual slide interpretation in one case
7. Literature Review Assessment using Virtual Slides
Virtual microscopy for learning and assessment in pathology
(Kumar RK, Velan GM, Korell SO, Kandara MJ, Dee FR, Wakefield D. J Pathol. 2004;204:613-618)
Describe the implementation of teaching microscopic pathology with virtual slides and their use in summative assessment for medical students
Good acceptance by students & faculty
Allowed different teaching format (pairs of students)
Student scores on glass vs virtual slides indistinguishable
Some practical technology issues arose in administering a secure test to a large group at one time
8. Methods Premise:
The ability to recognize and interpret findings on a slide to arrive at a diagnosis (i.e., morphologic skill) is one of the most important skills of an anatomic pathologist
Goal of Study:
Develop a way to objectively measure morphologic skill (performance-based assessment) and its application within a clinical simulation
9. Methods This project is educational research supported by a grant from the National Library of Medicine
Our goal was to develop and evaluate a tool for use in our program
We have not developed this for sale, but would be interested in potential collaborations
10. Methods Rationale for use of virtual slides rather than glass slides:
Efficiency
Ease of administration
One slide, one scan
Accessibility
Portable (anywhere there are computers)
Resident exposure to virtual slides is important now that the ABP uses virtual slides
11. Methods Exam Format
20 questions using 20 virtual slides
Provide only limited clinical information to decrease cueing from clinical history
Each question has 10 foils (one item is sketched after this slide)
Questions and foils were created with help from surgical pathologists
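A minimal sketch of how one exam item could be represented as a Perl data structure; the field names, URL, and clinical history below are hypothetical placeholders, not the actual exam content or schema:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# One exam item: a virtual slide, a deliberately brief clinical history,
# and 10 foils (candidate diagnoses), exactly one of which is correct.
# All values here are illustrative placeholders.
my %item = (
    slide_url => 'https://slides.example.edu/case01',  # virtual slide viewer link
    history   => 'Adult with a neck mass',             # limited on purpose
    foils     => [
        'Diagnosis A',    # correct answer (index 0)
        'Diagnosis B',
        # ... eight more plausible foils written with SP faculty input
    ],
    answer_index => 0,    # index of the correct foil
);
```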
12. Methods Criteria for slide selection
Range of difficulty
Aim for 1st year residents to get 40% correct and senior residents to get 90% correct
Slides (and list of potential diagnoses) reviewed by several SP faculty
13. Methods Resources
Technology
Virtual slides created with an Aperio scanner and delivered to the program using the MicroBrightField viewer
Other technology/vendors could be utilized
Labor
Programming of the test interface done in Perl against a MySQL database (sketched after this slide)
Slide selection and foil creation done by surgical pathology faculty
Support
Grant from National Library of Medicine
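A minimal sketch of what a Perl-scripted MySQL test interface could look like, using the standard DBI module; the table names, column names, and credentials are assumptions for illustration, since the talk does not describe the actual schema:

```perl
#!/usr/bin/perl
# Sketch of a Perl/DBI interface to a MySQL exam database.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect(
    'DBI:mysql:database=vs_exam;host=localhost',
    'exam_user', 'secret',
    { RaiseError => 1, AutoCommit => 1 },
);

# Fetch the 20 items in presentation order.
my $items = $dbh->selectall_arrayref(
    'SELECT item_id, slide_url, history FROM items ORDER BY item_id',
    { Slice => {} },
);

for my $item (@$items) {
    # Each item carries 10 foils (candidate diagnoses).
    my $foils = $dbh->selectall_arrayref(
        'SELECT foil_id, diagnosis FROM foils WHERE item_id = ?',
        { Slice => {} }, $item->{item_id},
    );
    # ...render the slide link, brief history, and foils in the web form
}

# Record one resident's answer to one item.
sub record_response {
    my ($resident_id, $item_id, $foil_id) = @_;
    $dbh->do(
        'INSERT INTO responses (resident_id, item_id, foil_id) VALUES (?, ?, ?)',
        undef, $resident_id, $item_id, $foil_id,
    );
}
```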
14. Methods Exam Administration
Residents were given a 10-day window in which to take the exam (required)
Exam given in the Pathology Learning Center (activated only on computers there; one possible mechanism is sketched below)
Sign-in included a confidentiality statement agreeing not to discuss the exam with peers
Completed in one sitting (1 to 1½ hours)
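The talk states only that the exam was activated on Learning Center computers, without saying how. One plausible mechanism is an IP allowlist check in the CGI script; the sketch below is an assumption, with made-up addresses:

```perl
#!/usr/bin/perl
# Hypothetical sketch: restrict the exam to Learning Center machines
# via an IP allowlist. Mechanism and addresses are assumptions, not
# the implementation described in the talk.
use strict;
use warnings;

my %allowed_ip = map { $_ => 1 } qw(
    10.0.42.11 10.0.42.12 10.0.42.13
);

my $client_ip = $ENV{REMOTE_ADDR} // '';   # set by the web server for CGI scripts
unless ($allowed_ip{$client_ip}) {
    print "Content-Type: text/plain\n\n";
    print "This exam may only be taken in the Pathology Learning Center.\n";
    exit;
}
```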
15. Methods Study Questions:
1) Can we create a reliable virtual slide exam with only ~20 items that can be given in 1 to 1½ hours?
2) Is a virtual slide exam, as we've constructed it, a valid measure of the morphologic skill needed for competency in clinical practice?
Statistical Evaluation
Item analysis: difficulty and discrimination (standard formulas follow this list)
Exam statistics
Cronbach coefficient alpha (reliability)
Correlation with months of SP training
Comparison with resident RISE SP scores
Comparison with Global ratings for surgical pathology
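For reference, these are the standard psychometric definitions behind the item-analysis and reliability figures reported in the results; they are textbook formulas, not taken from the talk itself:

```latex
% Item difficulty for item i: proportion of examinees answering correctly
p_i = \frac{n_{\text{correct},\,i}}{N}

% Item discrimination: point-biserial correlation between the 0/1 item
% score X_i and the examinee's total score T
r_{pb,\,i} = \operatorname{corr}(X_i, T)

% Cronbach's coefficient alpha for k items, with item-score variances
% \sigma_i^2 and total-score variance \sigma_T^2
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_T^2}\right)
```

With k = 20 items, the reported alpha of .84 is high for such a short instrument.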
16. Methods
Exam Demonstration
17. Results
18. Virtual Slide Exam Results Item Analysis (20 items)
Mean difficulty = .65 (range .17–.91)
Mean discrimination = .43 (range −.20 to .70)
Exam reliability (Cronbach's α) = .84
Correlation with months of SP experience: r = .81, significant at the .0001 level
20. Virtual Exam vs RISE SP RISE reliability = .94 (entire exam)
RISE SP reliability (calculated for UIHC residents) = .75
Correlation of RISE with months of training = .74
Correlation between Virtual Exam and RISE SP component = .70
23. Global Ratings Our global ratings form uses a 5 point scale for each ACGME competency
(1=unacceptable, 2=needs improvement in many areas, 3=needs improvement in some areas, 4=meets expectations, 5=outstanding)
The scores for Diagnostic and patient care activities were averaged for each resident
Median resident rating = 4.25
Correlation with Virtual Exam score = .23
26. Conclusions Our 20-question SP competency exam using virtual slides has excellent reliability
Correlation with RISE scores is moderately good
Correlation with our global ratings is poor
Feasible to administer; well-accepted by residents
There may be a unique aspect of competency that this exam measures that is not measured by the RISE or Global evaluation
27. Discussion What is that unique aspect that this exam measures?
Is this a good performance-based assessment tool?
How important is it to measure (or try to measure) morphologic skill? Does a good eye matter?
Where to go from here?
28. Acknowledgements Department of Pathology
Fred Dee, M.D.
Tom Haugen, M.D., Ph.D.
Tim Leaven, M.A.
Surgical Pathology Faculty
Pathology Residents
Office of Consultation and Research in Medical Education
Clare Kreiter, Ph.D.