"Performance Reports for Failing Candidates"
Carol O'Byrne, Pharmacy Examining Board of Canada
Presented at the 2005 CLEAR Annual Conference, September 15-17, Phoenix, Arizona
What failing candidates want to know
• How close was I to passing?
• What did I do wrong? What did I miss?
• How many such errors and omissions lead to a failing result?
• In which area(s) do I need to improve?
• What does PEBC expect in these areas?
• Why am I expected to perform at a higher level than what I see some pharmacists doing?
PEBC rationale for providing feedback to candidates
• Supports PEBC's mandate: to certify candidates who demonstrate that they have the knowledge, skills, abilities and attitudes required for practice
• Increases candidates' awareness of practice requirements
• Supports the cooperative but arm's-length relationship between credentialing bodies and training bodies
Rationale… Benefits all parties:
• Assists candidates in recognizing and addressing their weaknesses
• Improves the efficiency of PEBC processes and lessens the potential threat to exam security by reducing the number of retakes
• Benefits the profession and the public by supporting further development of the qualifications of those preparing to enter practice
• Addresses workforce needs – guides remediation and bridging efforts, facilitating earlier entry to the profession for those who may not yet have received adequate training
Why only to failing candidates?
• No demand from passing candidates
• Resource issues
  • Issuance of reports
• Failing candidates often retake the exam without appropriate preparation and clog the system
PEBC Qualifying Examination
• Based on national competencies and standards
• Offered in English and French
• PEBC certification required for licensure in 9 of 10 provinces
• Mobility enabled by mutual recognition (if PEBC certified)
• Part I (MCQ) – 200 scored items
• Part II (OSCE) – 15 scored stations
  • 12 SP/HP interactions + 3 non-client stations
  • 7 minutes per station
  • 1 assessor per station
  • 2 sets of scores per station
    • Analytical checklist
    • Holistic scales
Competencies assessed
[competency table shown as an image in the original slides]
Test format
• Interactive client stations
  • Standardized patients
  • Standardized health professionals
• Non-client stations
  • Technical, e.g.:
    • Screening prescriptions for appropriateness
    • Checking dispensed prescriptions
  • Written short answer, e.g.:
    • Responding to drug information requests – evaluating and interpreting drug information from several (possibly conflicting) sources
    • Medication management – reviewing patient data and recommending therapeutic options, along with a rationale
Assessor scoring sheet – ratings
Three 4-point scales:
• Communications – generic scale
  • Rapport
  • Organization and flexibility (adapting to the client/situation)
  • Verbal and nonverbal skills (including language proficiency)
• Outcome (problem solving) – station-specific scale
  • Based on critical checklist items
• Overall Performance – inclusive, global scale
  • Communications and outcome
  • Process quality and thoroughness (critical and noncritical items)
  • Accuracy (vs. misinformation)
  • Risk (occurrence, degree)
Assessor scoring sheet – checklist
• 'Critical' items
  • essential to solve the problem and meet the station objective(s)
  • each linked to a competency assessed in the station
• 'Noncritical' items
  • represent good practice and contribute to effective outcome(s)
  • each linked to a competency
• Risk and misinformation
• Unique response (UR) – for scoring and QA purposes
• Comment boxes – to record evidence supporting scores (used for QA purposes)
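To make the checklist-plus-scales structure concrete, here is a minimal Python sketch of how one station's scoring sheet might be represented. The class and field names are illustrative assumptions, not PEBC's actual data model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChecklistItem:
    """One observable action on a station checklist."""
    text: str
    competency: str        # each item is linked to one competency
    critical: bool         # True = essential to solve the problem and meet the objective
    performed: bool = False

@dataclass
class StationSheet:
    """Assessor scoring sheet for a single OSCE station (illustrative)."""
    station_id: str
    checklist: List[ChecklistItem] = field(default_factory=list)
    # Three 4-point holistic scales; Communications stays None on non-client stations.
    communications: Optional[int] = None
    outcome: int = 1
    performance: int = 1
    risk_events: int = 0       # tallied occurrences of risk
    misinformation: int = 0    # tallied occurrences of misinformation
    comments: str = ""         # evidence recorded to support scores (QA)
```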
Scoring the examination
Analytical scores
• Each checklist item relates to one competency
• Competency sub-score = percent of the items linked to that competency to which the candidate responds
• Frequency of risk and misinformation tabulated
Holistic scores
• Each scale scored 1 to 4 points
• 12 points per client station (Comm, Outc, Perf) x 12 stations
• 8 points per non-client station (Outc, Perf) x 3 stations
• Raw score = sum of holistic scale scores across all stations
• A holistic cut score is set for each scale in each station
• Cut score = sum of holistic cut scores across all stations
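A sketch of this scoring arithmetic, building on the hypothetical StationSheet structure above; the function names are illustrative, not PEBC's implementation.

```python
def holistic_raw_score(stations):
    """Raw score = sum of holistic scale scores over all 15 stations:
    Comm + Outc + Perf on client stations (max 12),
    Outc + Perf on non-client stations (max 8)."""
    total = 0
    for s in stations:
        total += s.outcome + s.performance
        if s.communications is not None:   # client stations only
            total += s.communications
    return total

def holistic_cut_score(station_cuts):
    """Cut score = sum of the per-scale holistic cut scores set for each
    station. `station_cuts` is a list of dicts, e.g. {"outcome": 2, "performance": 3}."""
    return sum(sum(cuts.values()) for cuts in station_cuts)

def competency_subscores(stations):
    """Competency sub-score = percent of the checklist items linked to a
    competency to which the candidate responded, across all stations."""
    performed, totals = {}, {}
    for s in stations:
        for item in s.checklist:
            totals[item.competency] = totals.get(item.competency, 0) + 1
            if item.performed:
                performed[item.competency] = performed.get(item.competency, 0) + 1
    return {c: 100.0 * performed.get(c, 0) / n for c, n in totals.items()}

# A candidate clears the holistic standard when the raw score reaches the
# summed cut score: holistic_raw_score(stations) >= holistic_cut_score(cuts)
```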
Mean scores & alphas

Holistic scales         Coefficient
• Communications          .83    12 stns – competency 4
• Outcome                 .66    15 stns – all competencies
• Performance             .73    15 stns – all competencies

Analytical scores       n items   Coefficient
• Pharm care              107       .80
• Ethics                    7       .43
• Drug information          8       .24
• Communications           14       .33
• Drug distribution         8       .57
• Management                8       .55
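The "alphas" in the slide title suggest these coefficients are internal-consistency reliabilities (Cronbach's alpha). For readers who want to see how such a figure is produced, a minimal computation of the standard formula; the example ratings are toy values, not PEBC data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (candidates x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy example (fabricated ratings, not PEBC data): 4 candidates x 3 stations
ratings = np.array([[3, 4, 3],
                    [2, 2, 3],
                    [4, 4, 4],
                    [1, 2, 2]])
print(round(cronbach_alpha(ratings), 2))   # 0.93
```

Note how, in the table above, sub-scores built from only 7-8 items (Ethics, Drug information) show lower coefficients than the 107-item Pharmaceutical care score; the next slide takes up why.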
Factors affecting competency sub-score reliabilities
• Candidate variability (or lack thereof)
• Number and context of stations in which the competency was assessed
• Number of noncritical vs. critical items (i.e., how important their performance is to the task at hand)
Reports to candidates
• Results: pass-fail status (all candidates)
• Feedback (for failing candidates, on request):
  • Individual score breakdown
    • by major skill – mean Communications, Outcome and Performance ratings, aggregated across all stations
    • by competency – mean percent scores, aggregated across all stations in which the competency was assessed
    • by critical incident – frequency of risk and misinformation
  • Comparative data
    • 'reference group' mean scores and frequencies, for comparison with a stable population
    • to show where performance needs to improve
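As an illustration of how the score breakdown and reference-group comparison described above could be assembled, a minimal sketch; the dictionary keys and the layout are assumptions, not the actual PEBC report format.

```python
def feedback_report(candidate: dict, reference: dict) -> str:
    """Assemble a failing candidate's score breakdown (illustrative layout).

    `candidate` and `reference` share the same structure; the reference
    group supplies mean scores from a stable population for comparison.
    """
    lines = ["Major skill (mean rating, 1-4)    you    ref"]
    for skill in ("Communications", "Outcome", "Performance"):
        lines.append(f"  {skill:<28}{candidate['skills'][skill]:>5.2f}  "
                     f"{reference['skills'][skill]:>5.2f}")
    lines.append("Competency (mean % score)         you    ref")
    for comp, pct in candidate["competencies"].items():
        lines.append(f"  {comp:<28}{pct:>5.1f}  "
                     f"{reference['competencies'][comp]:>5.1f}")
    lines.append(f"Risk/misinformation incidents: {candidate['incidents']} "
                 f"(ref mean: {reference['incidents']})")
    return "\n".join(lines)
```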
Assessor scoring sheet
[sample scoring sheet shown as an image in the original slides]
OSCE feedback report
[sample feedback report shown as an image in the original slides]
Candidate findings
• Most candidates understand the information provided but want more content-specific guidance (where they went wrong)
• Some do not accept the exam results and feedback, and may request hand-scoring
• Failing candidates generally score low in Communications (rating scale and competency 4) and/or Pharmaceutical Care (competency 1 – clinical role)
• Many failing candidates lack clinical training in Canada (or the US), though many have some technical training/experience (e.g., as a pharmacy technician)
Are we really helping candidates?
• Anecdotally, yes – some do not know where to start or what to focus on
• Skills scores and competency sub-scores are consistent enough to be meaningful in the more heavily weighted areas
• All candidates who fail show weaknesses in one or more of these areas (low scores relative to the reference group)
What questions do (can) we answer?
• What area(s) do I need to improve?
• What does PEBC expect in these areas?
• What did I do wrong? What did I miss?
• Why am I expected to perform at a higher level than what I see some pharmacists doing?
• How many errors and omissions lead to a failing result?
• How close was I to passing?
What other strategies are (or may be) helpful?
• Provide information about training and/or remedial resources, e.g.:
  • Clear descriptions, including visual exemplars, of good practice in each competency area
  • Recognized training programs and resources
  • Practice exams (e.g., 'mock OSCEs') for format familiarization
• Provide general tips, e.g.:
  • Typical performance errors/deficits in each competency
  • Competency-related descriptions of candidates who are clearly qualified, borderline qualified and unqualified
Contact information
Carol O'Byrne
Pharmacy Examining Board of Canada
415 Yonge Street, Suite 601
Toronto, ON M5B
T: 416-979-2431, ext. 226
Email: obyrnec@pebc.ca
Website: www.pebc.ca