Standardizing Learner Surveys Across the Enterprise • Francis Kwakwa, MA, Radiological Society of North America • Valerie Smothers, MA, MedBiquitous
Disclosure We have no financial relationships to disclose.
Objectives At the completion of this session, you will be able to: • Adopt strategies to improve the collection of consistent evaluation data from learners • Adopt strategies to improve the analysis of evaluation data across the CME enterprise
Overview • Challenges in analyzing learner surveys • MedBiquitous and MEMS • RSNA’s Implementation of a Standardized Survey • Results of RSNA course evaluation • Challenges faced by RSNA • Key strategies for improving data collection and analysis
Challenges in Analyzing Learner Surveys • Most of us use surveys • Surveys often differ based on activity • Survey data may be in different systems or formats • The result: it’s hard to analyze results across activities
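To make the format problem concrete, here is a minimal sketch assuming two hypothetical export formats: one activity delivers CSV ratings on a 1-5 scale, another delivers JSON with text labels. The file layouts and field names are invented for illustration; the point is that responses must be mapped onto one scale before any cross-activity comparison is possible.

```python
import csv
import json

def load_course_a(path):
    """Course A (hypothetical) exports CSV with a 1-5 Likert rating per response."""
    with open(path, newline="") as f:
        return [int(row["rating"]) for row in csv.DictReader(f)]

def load_course_b(path):
    """Course B (hypothetical) exports JSON with text labels instead of numbers."""
    scale = {"Disagree": 1, "Neutral": 3, "Agree": 5}  # map labels onto the 1-5 scale
    with open(path) as f:
        return [scale[response["answer"]] for response in json.load(f)]

# Usage, once both exports are in hand -- the pooled scores can then be
# compared across activities because they share one scale:
# all_scores = load_course_a("course_a.csv") + load_course_b("course_b.json")
```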
RSNA • Radiological Society of North America • “to promote and develop the highest standards of radiology and related sciences through education and research” • Over 40,000 members • Online and in-person CME activities • Member of MedBiquitous • Francis Kwakwa, Chair of the MedBiquitous Metrics Working Group
MedBiquitous • Technology standards developer for healthcare education • ANSI Accredited • Develops open XML standards • 60 members (societies, universities, government, industry) • 7 working groups
The Focus on Metrics • “Without the creation of a standard data set for reporting CME program outcomes … it is difficult to obtain consistent metrics of those outcomes. And if you can’t measure it, you can’t improve it.” --Ross Martin, MD, Director, Healthcare Informatics Group, Pfizer • Medical Education Metrics (MEMS)
Another Perspective • “I need this to better understand how my program as a whole is doing.” Nancy Davis, American Academy of Family Physicians
The MedBiquitous Metrics Working Group • Mission: to develop XML standards … for the exchange of aggregate evaluation data and other key metrics for health professions education • Originally a subcommittee of the Education Working Group • Became a working group in April 2005 • “We’re all using the same measuring stick…” --Francis
Who is Involved? • Francis Kwakwa, RSNA, Chair • Linda Casebeer, Outcomes Inc. • Nancy Davis, AAFP • Michael Fordis, Baylor College of Medicine • Stuart Gilman, Department of Veterans Affairs • Edward Kennedy, ACCME* • Jack Kues, University of Cincinnati • Tao Le, Johns Hopkins University • Ross Martin, Pfizer • Jackie Mayhew, Pfizer • Mellie Pouwels, RSNA • Andy Rabin, CE City • Donna Schoonover, Department of Veterans Affairs (* Invited experts)
What’s in MEMS • Participation Metrics • how many participants • Learner Demographics • profession, specialty • Activity Description • name, type • Participant Activity Evaluation • survey results • Other types of evaluation metrics planned for future versions
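As a rough illustration of the four categories above, the sketch below assembles them into a single XML report using Python's standard library. The element names are invented for illustration only; they are not the normative MEMS schema, which is available from MedBiquitous (see the links on the next slide).

```python
import xml.etree.ElementTree as ET

# Illustrative element names only; consult the MEMS specification for the real schema.
report = ET.Element("MetricsReport")

activity = ET.SubElement(report, "ActivityDescription")        # name, type
ET.SubElement(activity, "Name").text = "RSP-2904"
ET.SubElement(activity, "Type").text = "Refresher Course"

participation = ET.SubElement(report, "ParticipationMetrics")  # how many participants
ET.SubElement(participation, "NumberOfParticipants").text = "43"

demographics = ET.SubElement(report, "LearnerDemographics")    # profession, specialty
ET.SubElement(demographics, "Profession").text = "Physician"
ET.SubElement(demographics, "Specialty").text = "Radiology"

evaluation = ET.SubElement(report, "ParticipantActivityEvaluation")  # survey results
item = ET.SubElement(evaluation, "Item")
ET.SubElement(item, "Question").text = "The course achieved its learning objectives"
ET.SubElement(item, "MeanRating").text = "0.0"  # placeholder aggregate, not a real result

print(ET.tostring(report, encoding="unicode"))
```

Because every provider emits the same structure, reports from different organizations and systems can be pooled without per-source conversion code.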
For more information: • Metrics Working Group Page: http://www.medbiq.org/working_groups/metrics/index.html • MedBiquitous Website: http://www.medbiq.org
Discussion • Describe the learner surveys that you are using and how they differ from or are similar to the survey described. What are the benefits or drawbacks of using a standardized survey?
RSNA’s Project… • Adoption of MEMS survey instrument coincided with implementation of a new Learning Management System • Currently MEMS is used to evaluate over 300 online courses
RSNA’s Project… Types of online courses using MEMS • Cases of the Day (COD) • RadioGraphics CME Tests/Education Exhibits (EE) • Refresher Courses (RSP)
Results of RSNA Course Evaluation [response-distribution charts not reproduced; each slide charted one survey item for one sample course]
Sample courses: COD-45 (N = 24), EE-355 (N = 32), RSP-2904 (N = 43)
Survey items charted:
• The course achieved its learning objectives (shown for RSP-2904 only)
• The course was relevant to my clinical learning needs
• The course was relevant to my personal learning needs
• The online method of instruction was conducive to learning
• I plan to change my practice based on what I learned in the course
• The faculty provided sufficient evidence to support the content presented
• Was the course free of commercial bias towards a particular product or company?
• Did the course present a balanced view of clinical options?
Group Discussion • What challenges to survey data collection and analysis have you faced?
Challenges Faced by RSNA • Survey is optional; little data available for some courses • Little variation in the data • Some disconnect with educators on how the data is used • Difficult to get data out of the LMS • Surveys for live events are not included
Key Strategies • Data Collection • Common core set of survey questions • Common format for evaluation data • Data Analysis • Compare within type and modality • Compare across type and modality • Look for trends and variation • Look for red flags
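A minimal sketch of these analysis strategies, assuming evaluation records have already been exported into one common format (course, course type, survey item, mean score on a 1-5 scale). The records and the red-flag threshold below are illustrative, not actual RSNA data.

```python
from collections import defaultdict
from statistics import mean, pstdev

# Illustrative records in a shared format; scores are NOT actual RSNA results.
records = [
    {"course": "COD-45",   "type": "Case of the Day",   "item": "relevance", "score": 4.2},
    {"course": "EE-355",   "type": "Education Exhibit", "item": "relevance", "score": 4.4},
    {"course": "RSP-2904", "type": "Refresher Course",  "item": "relevance", "score": 3.1},
]

# Compare within and across type: average each survey item per course type.
by_type = defaultdict(list)
for r in records:
    by_type[(r["type"], r["item"])].append(r["score"])
for (course_type, item), scores in sorted(by_type.items()):
    print(f"{course_type} / {item}: mean {mean(scores):.2f}")

# Look for red flags: a course scoring well below the overall mean for an item.
by_item = defaultdict(list)
for r in records:
    by_item[r["item"]].append(r["score"])
for r in records:
    scores = by_item[r["item"]]
    if len(scores) > 1 and r["score"] < mean(scores) - pstdev(scores):
        print(f"red flag: {r['course']} on '{r['item']}' ({r['score']})")
```

The same grouping can be rerun period over period to surface trends, and the flag threshold tightened or loosened depending on how much variation a program normally shows.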
An Added Benefit • Assists with program analysis and improvement required by the ACCME • “The provider gathers data or information and conducts a program-based analysis on the degree to which the CME mission of the provider has been met through the conduct of CME activities/educational interventions.”--ACCME Updated Accreditation Criteria, September 2006