CCC Development: Lessons Learned from a Beta-Test Site
Emily Colson, MD and Michael Geurin, MD
Montana Family Medicine Residency
UW FMRN Webinar, Jan. 8, 2014
CCC Composition
• Who? Must have 3 program faculty members
• Who else? Each has something to offer:
  • Advisors
  • Behavioral health faculty
  • Non-physician clinician(s)
  • Program Director (PD) / Associate Program Director (APD)
  • Chief resident(s)
• Workload and time considerations – PEC, Progress Committee, outside commitments?
Advisor Role
• Active or passive?
  • Presenting the resident's performance to the CCC (may have the most critical information, but hard to schedule), OR
  • Submitting a pre-meeting report and receiving post-meeting recommendations
• If the advisor is a permanent member of the CCC, should that member recuse himself/herself from the final determination of the SAE? – We aren't sure yet
• Related considerations: mentorship, bias (+/-), remediation
Program Director Role
• Member of CCC, or separate? "The Decider"
• Flow of information:
  • Provide information to CCC prior to Semi-Annual Evaluation (SAE) meeting?
  • Review of CCC's SAE prior to final transmission to ACGME?
    • With or without modification?
Other CCC Responsibilities?
• Potential locally added responsibilities:
  • Evaluation development – Natural extension
  • Remediation plans – Stoplight approach, leaving details to Advisors/Progress Committee
  • Advising Program Evaluation Committee – Identifying programmatic holes
  • Curriculum development – A bridge too far?
Evaluation Tool Development
• WHY? Do your existing evaluations give you the data the CCC needs to assign sub-competency levels? The beta test showed us that ours did not
• 22 sub-competencies × 5 levels × multiple milestones per level = a lot of data-driven decision-making (a rough count follows after this slide)
• General vs. specific
• Who are the evaluators?
  • The RRC requires 360° evaluation (including staff and patients) for the ICS competency, and 360° input may be helpful for all competencies
  • Milestone language needed to be "translated" for non-core faculty evaluators
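To make that scale concrete, here is a rough back-of-the-envelope count using only figures that appear elsewhere in this deck (22 sub-competencies, 5 levels each, 24 residents, semi-annual reporting); the number of milestone descriptors per level varies and is not counted here.

```latex
% Rough scale of the CCC's rating task (figures from this deck):
22 \times 5 = 110 \ \text{milestone-level cells to consider per resident per SAE}
\qquad
22 \times 24 \times 2 = 1056 \ \text{sub-competency levels assigned per year}
```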
MFMR CCC Products
• Revised milestone-based evaluation tools for every learning experience
• Milestones-based SAE report to ACGME (22 data points)
  • PD to receive Red/Yellow/Green at least
• SAE report "with feedback" to resident
  • This is a "program" requirement separate from the CCC description [V.A.2.b).(4)]
  • CCC will prepare, and will funnel to the resident through the advisor
• (Future) Semi-annual report to PD/Program Evaluation Committee about general trends in resident performance that might require program modification
Considerations for SAE Meetings (Overview)
• Who needs to be there?
  • Quorum, all, key people for each resident?
• How will the CCC prepare?
• How will the data be aggregated?
• When should the meetings be held, and how should you go about discussing the residents?
Who needs to be there?
• Large enough to have a wide range of input, small enough to be efficient and keep all members engaged
• Program size will influence this
Preparation for Meetings
• Option A: Pre-meeting review of portfolio
  • By advisor? By a CCC member?
  • Complete a draft SAE, or just review portfolio contents?
  • More efficient use of group time
  • Preparation time may not be protected
  • Risk of bias by the preparer
  • Preparer must not be absent from the CCC session
• VS.
• Option B: No preparation outside of CCC meeting time
  • All CCC members see the information for the first time
  • Fill out the SAE together
  • All CCC group members see all of the data
  • Group attendance can be flexible
  • More time required for the CCC to meet about each resident
  • Risk of a hurried process, less reflection
SAE Time Needs: Our Experience
• MFMR was a beta-test site for the milestones, and simulated SAEs of 2 residents from each class
• Each CCC member reviewed evaluations for 1 or 2 residents prior to the CCC meeting
• Time required per resident averaged 2.5 hours
  • Roughly 1.5 hours of prep time before the meeting
    • Prep time could be reduced if the data aggregation tool works well
  • Roughly 1 hour of discussion time during the meeting
    • This 1 hour did include one member drafting the narrative feedback and the other members agreeing on the wording
  • Residents in difficulty will require more time
• We have 24 residents….
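As a rough projection from the beta-test averages above (assuming the ~1.5 h prep and ~1 h discussion per resident hold for routine residents), the full program looks like this; note that prep time is per preparer, while meeting time is committee time, so person-hours would be higher when all five members attend.

```latex
% Projected workload per SAE cycle, from the beta-test averages:
24 \ \text{residents} \times 2.5 \ \text{h/resident} = 60 \ \text{h per cycle}
\quad (\approx 36 \ \text{h preparation} + 24 \ \text{h meeting time})
% Two cycles per year:
60 \ \text{h} \times 2 = 120 \ \text{h per year}
```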
Data Aggregation
• If using New Innovations, "Direct Responses" do not aggregate with "Indirect Responses" in the NI Milestones Tool (M-Tool)
• Also, comments from evaluations do not aggregate in the M-Tool
  • Can aggregate comments in the Portfolio tool
  • Comment fields must be identically named across evaluations in order to aggregate in the Portfolio
• NI does not offer a single tool that will pull all evaluation data and comments together for the CCC!
NI M-Tool: Direct Responses
• More robust graphing of aggregate responses
• No comment fields in Direct Responses
NI M-Tool: Indirect Responses
• Less robust graphing of aggregate responses
• Any associated comments not visible in NI M-Tool (but can be seen in Portfolio)
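Because NI does not pull all evaluation data and comments into one place, a program may end up pooling exported responses itself before the CCC meets. The sketch below is purely hypothetical: it is not NI's API or export format, and the file name (evals.csv) and column names (resident, subcompetency, field_name, comment) are assumptions. It only illustrates why comment fields must be identically named for aggregation to work.

```python
# Hypothetical sketch: pooling exported evaluation comments for CCC review.
# Assumes a generic CSV export (evals.csv) with columns:
#   resident, subcompetency, field_name, comment
# This is NOT New Innovations' API or export format -- just an illustration
# of why comment fields must share the same name to aggregate cleanly.
import csv
from collections import defaultdict

def aggregate_comments(csv_path):
    """Group non-empty comments by (resident, field_name) across evaluations."""
    pooled = defaultdict(list)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            comment = row["comment"].strip()
            if comment:
                # Rows only land in the same bucket when field_name matches
                # exactly -- differently named comment fields stay separate.
                pooled[(row["resident"], row["field_name"])].append(comment)
    return pooled

if __name__ == "__main__":
    for (resident, field), comments in aggregate_comments("evals.csv").items():
        print(f"{resident} / {field}: {len(comments)} comments")
```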
Structure of SAE Meetings
• Rolling SAE meetings, vs. doing all SAEs in November-December and May-June?
• Group residents by class?
  • Potentially easier to identify superior or deficient performance
• If advisors have a role in the SAE meeting, group residents by advisor?
  • Easier to coordinate the schedule with each advisor
• Does the whole CCC need to be present for all SAE meetings?
  • Quorum? Vacations? Illness?
Timing of Meetings
• Marathon or piecemeal?
• Relative to SAE due date?
• Availability of members?
• Availability of advisor (if presenting)?
• Timing of feedback to other interested parties:
  • Residents
  • Advisors
  • Remediation/Progress committees
Timing of Meetings (continued)
• MFMR is planning for semi-annual marathons
• 24 residents = 6 half-day sessions twice a year (rather than 24 "lunch meetings" every 6 months)
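A quick sanity check of that plan against the discussion-time estimate from the beta test (~1 hour per resident); pre-meeting preparation by the assigned CCC members is in addition to this block.

```latex
% Residents per session and discussion time per half-day session:
24 \ \text{residents} \div 6 \ \text{sessions} = 4 \ \text{residents per session}
\qquad 4 \times 1 \ \text{h} \approx 4 \ \text{h of discussion, a realistic half-day}
```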
MFMR SAE Plan
• Who needs to be there? Quorum, all, key people for each resident?
  • All 5 members scheduled; no advisor presentations
• How will the CCC prepare?
  • CCC members will prepare specific residents
• How will the data be aggregated?
  • As much as possible in NI
  • As much as possible as Direct Responses (some creative paper evals for non-core faculty will be entered into the system as Direct Responses)
• When should the meetings be held, and how should you go about discussing the residents?
  • 6 half-days over 3 weeks just before the SAE is due – each class in 2 half-days
Results to Resident, PD, ACGME
(flow diagram)
What We Think Works Well
• CCC identifying specific areas of concern for residents in difficulty
• Using the SAE to assist advisor-advisee plans (e.g., next steps, remediation plans)
• Using the CCC to advise the PEC/PD about program-level needs
• Having the support of the PD/administration for adequate resources
• Having an IT champion involved in data management
• Having a behaviorist as part of the CCC
• Reviewing residents grouped by year of training
What We Think Doesn’t Work Well • Sifting through piles of paper evaluations • Asking community attendings, nurses, patients to use raw milestones language • Tendency toward gestalt over specific data-driven conclusions • Faculty not sharing frames of reference (hawks vs. doves) • Tendency to under-rate high performers “to leave room” for future improvement • Tendency to over-rate low performers to avoid difficult discussions, hurt feelings, etc. • Inadequate time for completion of evaluation/feedback (in culture of multi-tasking, this may be first thing to be dropped) • Trying to run the SAE meetings cold without someone prepping each resident first (superficial assessment risk) • Using existing evaluation tools and then having CCC translate into Milestones
Lessons Learned
• Lesson #1: There is not a lot of guidance from the RRC and ACGME
• Lesson #2: We have to do something, and there is a lot to do
• Lesson #3: If we have to do it, we might as well get as much collateral out of the CCC's efforts as we can
• Lesson #4: Define your CCC's role at the outset – it touches on everything, so it's easy to get carried away
• Lesson #5: We need specific data from evaluations, not gestalt
Lessons Learned (continued)
• Lesson #6: Make a map for your evaluations
• Lesson #7: Not everyone can be an expert on the Milestones
• Lesson #8: Having the presenter prepare a draft of the SAE is more efficient, but anchors the results
• Lesson #9: Build a coherent system of evaluation that can be aggregated
• Lesson #10: Doing SAEs for a whole day straight is exhausting
Lingering Questions
• Who is educating the residents about the milestones/SAEs, and how do we do this?
• Who is in charge of the ongoing faculty development that is needed?
  • Frame-of-reference training needs maintenance
• Who decides promotion and graduation?
  • What if the CCC and PD disagree about level of competency?
• Who determines specific remediation plans?