Faculty Development WORKPLACE-BASED ASSESSMENT
COURSE OBJECTIVES By the end of the course participants will have: • Described the educational principles that underpin workplace-based assessment (WPBA) • Discussed key issues in the implementation of WPBA • Compared types of assessment in common use • Considered the use of WPBAs to develop proficiency in trainee practice • Explored ways to use WPBAs as developmental opportunities and to aid reflection
WORKPLACE-BASED ASSESSMENTS • Experiences – Of using WPBAs • Challenges – What are the limitations and challenges of the current system of WPBA? • Constraints – What prevents learners and trainers from maximising effectiveness? • Opportunities – What educational opportunities arise from their use?
TRAINEE-REPORTED CHALLENGES • Lack of clear role/responsibility • Knowing what is expected of them • Competing demands on time • Limited opportunity to be observed or to receive feedback • Unclear about immediate relevance of WPBAs
LEARNING IN POSTGRADUATE TRAINING Traditional approaches (immersion): • Professional knowledge – cases, reasoning • Professional skills – history/exam • Technical skills – procedures • Professional socialisation – attitudes and behaviours • Continuing professional development WPBA tools: • Case-based discussion (CBD) • Mini clinical evaluation exercise (mini-CEX) • Directly observed procedural skills (DOPs) • Multi-source feedback (MSF)
WHY DO WE ASSESS? • Patient safety and standards of care • To ensure we are training correctly • To develop trainees
WHAT IS ASSESSMENT? • ‘A systematic procedure for measuring a trainee’s progress or level of achievement against defined criteria to make a judgement about a trainee’ (GMC 2010) • A check of the learning that has taken place • ‘About getting to know our students and the quality of their learning’ (Rowntree 1977)
TYPES OF ASSESSMENT • Formative – Aids learning through constructive feedback • Summative – Determines levels of competence for progression • Appraisal – Formal review of progress
AUTHENTICITY OF CLINICAL ASSESSMENT – MILLER’S PYRAMID (Miller 1990) • Does (behaviour) • Shows how (behaviour) • Knows how (cognition) • Knows (cognition) Professional authenticity increases towards the top of the pyramid
MILLER’S PYRAMID – 1990 • Does – actual performance assessment (WPBA) • Shows how – procedural competence assessment (OSCE), simulation • Knows how – (clinical) context-based tests, MCQ, essays • Knows – factual tests, MCQ, essays
WHAT DO WE ASSESS? • Clinical knowledge and skills • Practical skills • Interpersonal skills and judgement • Professional behaviours Assessed using: case-based discussion (CBD), direct observation of procedures (DOPs), direct observations of non-clinical skills (DONCS) and 360º appraisal (360)
WHAT DO YOU WANT FROM ASSESSMENT? • Clear purpose • Fair • Clearly related to the learning that has taken place • Tests what should be assessed rather than what is easy to assess • Helps to improve performance if formative, or reliably distinguishes pass from fail if summative • Multi-faceted • Reliable
WORKPLACE-BASED ASSESSMENT • Assessment for learning (formative) • Assessment of learning (summative) • Learning is at its most powerful when it is authentic (in the workplace) • Valid but not always reliable: the assessor’s judgement is subjective rather than objective • Reliability improves when each assessment is one of many • Learning by doing, reviewing and reflecting
FORMATIVE AND SUMMATIVE • WPBAs: CEX, DOPs, PBAs, OSATs, CBDs, PATs, TABs, MSFs • Reports from educational supervisors • ARCP
FORMATIVE AND SUMMATIVE • Early, mid and late WPBAs spread across the training post, leading to the ARCP
TRAINEES SHOULD BE ‘SAFER’ • Spread assessments throughout the job • As many assessors as possible • Feedback as well as scores • Evidence it all (follow-up actions) • Reflect on what they do
WORKPLACE-BASED ASSESSMENT The conscious competence matrix (axes: consciousness and competence): • Unconscious incompetence (UC/IC) • Conscious incompetence (C/IC) • Conscious competence (C/C) • Unconscious competence (UC/C)
WORKPLACE-BASED ASSESSMENT Feedback questions mapped onto the conscious competence matrix: • 1. What do you think you did well? (C/C) • 2. I think you did well at… (UC/C) • 3. What could you improve? (C/IC) • 4. I think you could improve… (UC/IC)
FEEDBACK ‘Giving feedback is not just to provide a judgement or evaluation. It is to provide [develop] insight. Without insight into their own limitations, trainees cannot process or resolve difficulties’ (King 1999)
FEEDBACK • Allows an individual to identify what they have done/are able to do effectively • Gives suggestions about alternative approaches to a task to improve effectiveness • Allows the learner to identify ongoing learning needs • Both challenges and supports the subject
GIVING FEEDBACK – REFLECTION • How do you think it went? (insight check) • What went well? • Examples of the good • What could be improved/how? • ‘I noticed…’ • ‘If you were doing it again…’ (ask/suggest) • Describe the gap between current and desired performance • Agree a plan and how to get there
CLINICAL SKILLS • What kinds of clinical skills do trainees need to develop? • Where do they have a chance to do this? • When in your working week can you offer the opportunity to develop trainees’ clinical skills? • How can this be recorded and used to develop trainees?
MINI-CEX – ALL IN A DAY’S WORK Each of the following areas is rated on a six-point scale: 1–2 below expectations, 3 borderline, 4 meets expectations, 5–6 above expectations • History taking • Physical examination • Communication • Clinical judgement • Professionalism • Organisation and efficiency • Overall clinical care
MULTI-SOURCE FEEDBACK • Trainee-centred: • ‘How do you think you are settling in?’ • ‘What sort of feedback do you think you have had?’ • ‘What do you think about what I have said?’ • Balanced: • Strengths from MSF/last post before concerns • Provides possible explanation for poor comments • Personally values confidence • Seeks specifics: • ‘Why do you feel we don’t value confidence?’ • Asks for reasons for the comments • ‘Can you think of a situation where a clinical decision you made…’
MULTI-SOURCE FEEDBACK • Clarifies difficult areas: • ‘Do you think some people may not feel valued as part of the team?’ • ‘I think it’s an important thing for you to do’ (ask others) • Action plan: • Reflective case-based discussion next week • Action plan • Perspective/Honesty: • Re the issues • Re the possible outcome
PRACTICAL • In pairs: • One assumes role of supervisor • One assumes role of trainee • Using the data for Dr M and Dr K, feed back the results of the MSF to one another • Two sessions of 10 minutes each
CASE-BASED DISCUSSION • What benefits are offered by case-based discussion (CBD)? • Where can CBDs take place? • When can they happen? • Who can be involved?
CASE-BASED DISCUSSION – TYPES • Short case/long case/viva • Knowledge-based • Management of patient • Multi-disciplinary team • Decision making • Ethical • Reflection • Developmental change
CASE-BASED DISCUSSION – AREAS • Medical record keeping • Clinical assessment • Investigation and referrals • Treatment • Follow-up and future planning • Professionalism • Overall clinical care
CASE-BASED DISCUSSION – SUMMARY • Summative and formative components • Based on what has happened, not what would happen • Explores reasoning • Questioning to ‘dig deep’ • Promotes learning and new insights if used well • Just ticks the boxes if done badly
TAKING IT BACK TO PRACTICE • What have you learned today about WPBAs – key messages? • Where, when and how will you be involved in WPBAs? • In what ways could you plan more effective ‘supervised learning experiences’ for trainees with identified learning needs? • How will you encourage documentation of learning?
RECORDING LEARNING • Reflective writing • Log book • Portfolio • Evidence of: teaching, presentations, observation notes, peer discussions, journal clubs, e-learning, multi-disciplinary teams, leadership, etc.