
Standardizing Learner Surveys Across the Enterprise


Presentation Transcript


  1. Standardizing Learner Surveys Across the Enterprise Francis Kwakwa, MA, Radiological Society of North America Valerie Smothers, MA, MedBiquitous

  2. Disclosure We have no financial relationships to disclose.

  3. Objectives At the completion of this session, you will be able to: • Adopt strategies to improve the collection of consistent evaluation data from learners • Adopt strategies to improve the analysis of evaluation data across the CME enterprise

  4. Overview • Challenges in analyzing learner surveys • MedBiquitous and MEMS • RSNA’s Implementation of a Standardized Survey • Results of RSNA course evaluation • Challenges faced by RSNA • Key strategies for improving data collection and analysis

  5. Challenges in Analyzing Learner Surveys • Most of us use surveys • Surveys often differ based on activity • Survey data may be in different systems or formats • The result: it’s hard to analyze results across activities

  6. RSNA • Radiological Society of North America • “to promote and develop the highest standards of radiology and related sciences through education and research” • Over 40,000 members • Online and in-person CME activities • Member of MedBiquitous • Francis Kwakwa, Chair of the MedBiquitous Metrics Working Group

  7. MedBiquitous • Technology standards developer for healthcare education • ANSI Accredited • Develops open XML standards • 60 members (societies, universities, government, industry) • 7 working groups

  8. The Focus on Metrics • “Without the creation of a standard data set for reporting CME program outcomes … it is difficult to obtain consistent metrics of those outcomes. And if you can’t measure it, you can’t improve it.” (Ross Martin, MD, Director, Healthcare Informatics Group, Pfizer) • Medical Education Metrics – MEMS

  9. Another Perspective • “I need this to better understand how my program as a whole is doing.” (Nancy Davis, American Academy of Family Physicians)

  10. The MedBiquitous Metrics Working Group • Mission: to develop XML standards … for the exchange of aggregate evaluation data and other key metrics for health professions education • Originally a subcommittee of the Education Working Group • Became a working group in April 2005 • “We’re all using the same measuring stick…” (Francis Kwakwa)

  11. Who is Involved? • Francis Kwakwa, RSNA, Chair • Linda Casebeer, Outcomes Inc. • Nancy Davis, AAFP • Michael Fordis, Baylor College of Medicine • Stuart Gilman, Department of Veterans Affairs • Edward Kennedy, ACCME* • Jack Kues, University of Cincinnati • Tao Le, Johns Hopkins University • Ross Martin, Pfizer • Jackie Mayhew, Pfizer • Mellie Pouwels, RSNA • Andy Rabin, CE City • Donna Schoonover, Department of Veterans Affairs (* Invited experts)

  12. What’s in MEMS • Participation Metrics (how many participants) • Learner Demographics (profession, specialty) • Activity Description (name, type) • Participant Activity Evaluation (survey results) • Other types of evaluation metrics planned for future versions
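
To make those four sections concrete, here is a minimal sketch of a standardized evaluation report. The element names and values are hypothetical illustrations only, not the actual MEMS schema.

    import xml.etree.ElementTree as ET

    # Hypothetical report structure -- element names and values are
    # illustrative only, not taken from the MEMS specification.
    sample = """
    <EvaluationReport>
      <ParticipationMetrics totalParticipants="100"/>
      <LearnerDemographics profession="Physician" specialty="Radiology"/>
      <ActivityDescription name="Example Online Course" type="Enduring Material"/>
      <ParticipantActivityEvaluation>
        <Item text="The course achieved its learning objectives" mean="4.5"/>
      </ParticipantActivityEvaluation>
    </EvaluationReport>
    """

    # Walk the four top-level sections described on the slide.
    report = ET.fromstring(sample)
    for section in report:
        print(section.tag, section.attrib)

Because every provider's report carries the same sections in the same format, reports from different activities (or different organizations) can be aggregated without custom mapping.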

  13. For more information: • Metrics Working Group Page: http://www.medbiq.org/working_groups/metrics/index.html • MedBiquitous Website: http://www.medbiq.org

  14. Discussion • Describe the learner surveys that you are using and how they differ from or are similar to the survey described. What are the benefits or drawbacks of using a standardized survey?

  15. RSNA’s Project… • Adoption of MEMS survey instrument coincided with implementation of a new Learning Management System • Currently MEMS is used to evaluate over 300 online courses

  16. RSNA’s Project… Types of online courses using MEMS • Cases of the Day (COD) • RadioGraphics CME Tests/Education Exhibits (EE) • Refresher Courses (RSP)

  17. Results…

  18. COD-45 (N = 24) The course achieved its learning objectives

  19. EE-355 (N = 32) The course achieved its learning objectives

  20. RSP-2904 (N = 43) The course achieved its learning objectives

  21. COD-45 (N = 24) The course was relevant to my clinical learning needs

  22. EE-355 (N = 32) The course was relevant to my clinical learning needs

  23. RSP-2904 (N = 43) The course was relevant to my clinical learning needs

  24. COD-45 (N = 24) The course was relevant to my personal learning needs

  25. EE-355 (N = 32) The course was relevant to my personal learning needs

  26. RSP-2904 (N = 43) The course was relevant to my personal learning needs

  27. COD-45 (N = 24) The online method of instruction was conducive to learning

  28. EE-355 (N = 32) The online method of instruction was conducive to learning

  29. RSP-2904 (N = 43) The online method of instruction was conducive to learning

  30. COD-45 (N = 24) The course validated my current practice

  31. EE-355 (N = 32) The course validated my current practice

  32. RSP-2904 (N = 43) The course validated my current practice

  33. COD-45 (N = 24) I plan to change my practice based on what I learned in the course

  34. EE-355 (N = 32) I plan to change my practice based on what I learned in the course

  35. RSP-2904 (N = 43) I plan to change my practice based on what I learned in the course

  36. COD-45 (N = 24) The faculty provided sufficient evidence to support the content presented

  37. EE-355 (N = 32) The faculty provided sufficient evidence to support the content presented

  38. RSP-2904 (N = 43) The faculty provided sufficient evidence to support the content presented

  39. COD-45 (N = 24) Was the course free of commercial bias towards a particular product or company?

  40. EE-355 (N = 32) Was the course free of commercial bias towards a particular product or company?

  41. RSP-2904 (N = 43) Was the course free of commercial bias towards a particular product or company?

  42. COD-45 (N = 24) Did the course present a balanced view of clinical options?

  43. EE-355 (N = 32) Did the course present a balanced view of clinical options?

  44. RSP-2904 (N = 43) Did the course present a balanced view of clinical options?

  45. Group Discussion • What challenges to survey data collection and analysis have you faced?

  46. Challenges Faced by RSNA • Survey is optional; little data available for some courses • Little variation in the data • Some disconnect with educators on how the data is used • Difficult to get data out of the LMS • Surveys for live events are not included

  47. Key Strategies • Data Collection: common core set of survey questions; common format for evaluation data • Data Analysis: compare within type and modality; compare across type and modality; look for trends and variation; look for red flags
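
As a minimal sketch of the analysis side, assuming evaluation responses have been exported into one common tabular format (pandas is used here; the course names, column names, and ratings are synthetic examples, not actual RSNA results), comparing ratings within and across course types and flagging low scores might look like this:

    import pandas as pd

    # Synthetic example rows in a common format -- not actual RSNA results.
    responses = pd.DataFrame([
        {"course": "Course-A", "type": "COD", "item": "objectives_met", "rating": 4},
        {"course": "Course-B", "type": "EE",  "item": "objectives_met", "rating": 5},
        {"course": "Course-C", "type": "RSP", "item": "objectives_met", "rating": 3},
        {"course": "Course-A", "type": "COD", "item": "clinical_relevance", "rating": 5},
    ])

    # Compare within type: mean rating per course for each survey item.
    within = responses.groupby(["type", "course", "item"])["rating"].mean()

    # Compare across types: mean rating per course type for each survey item.
    across = responses.groupby(["type", "item"])["rating"].mean()

    # Red flags: course/item combinations averaging below a chosen threshold.
    red_flags = within[within < 3.5]

    print(within, across, red_flags, sep="\n\n")

Because every course uses the same core questions, the same few lines of analysis apply to all activity types, which is what makes trends and outliers visible across the program.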

  48. An Added Benefit • Assists with program analysis and improvement required by the ACCME • “The provider gathers data or information and conducts a program-based analysis on the degree to which the CME mission of the provider has been met through the conduct of CME activities/educational interventions.”--ACCME Updated Accreditation Criteria, September 2006
