Evaluating Clinical Simulations Pamela R. Jeffries, DNS, RN, FAAN Johns Hopkins University School of Nursing Mississippi – February 25, 2011
Objectives of the Session: The participant will be able to: • Discuss the evaluation process using simulations • Describe strategies using simulations as an evaluation tool in courses or programs • Describe different components that need to be assessed when using simulations
WHAT IS EVALUATION? • Feedback • Coaching • Assigning Grades • Judgmental: objective or subjective • Form of quality improvement • Assessment
ASSESSMENT • What it is: “Systematic collection, review and use of information about educational programs undertaken for the purpose of improving student learning and development.” (Palomba and Banta, 1999) • Focus: development and improvement rather than judgment and grading
WHY EVALUATE? • Determine learning outcomes and program goals achieved • Give feedback while learning • Improve effectiveness of teaching and learning • Attain performance standards • Assure patient safety
WHEN TO EVALUATE? • Frequently, when: • Learning is complex • There is a high risk of failure • The consequences of error would be serious • When outcomes are critical: • To ensure that learners are prepared for clinical practice • When findings can be used to alter the direction of the project • At the end of the module, activity, or project, to be certain learning outcomes were achieved
EVALUATION PROCESS • Is… a systematic effort involving identifying what, who, when, and why, and then gathering, analyzing, and interpreting the data • Concludes when… the findings of the evaluation are reported and used
EVALUATION PROCESS • Judgment: the relative worth or value of something • Focus: specifically on the student learner in e-learning • Throughout the educational process: interwoven throughout the learning process, usually in the form of feedback
EVALUATION INSTRUMENTS • Norm-referenced: • Focus: how learners rank in comparison to each other • Outcome: the evaluation is interpreted to determine who has the most knowledge, best skill performance, etc., and who has the least
Evaluation Instruments • Criterion-referenced: • Focus: the learner’s ability to attain the objectives or competencies that have been specified • Purpose: to determine who has achieved the outcomes and who has not • Outcome: the educator specifies the outcome standard, and the learner is then evaluated to determine whether the standard is met (the two interpretations are contrasted in the sketch below)
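To make the distinction concrete, here is a minimal sketch in Python; the learner names and the cutoff of 80 are hypothetical, purely for illustration:

```python
# Hypothetical scores; names and the cutoff are illustrative only.
scores = {"Ana": 92, "Ben": 74, "Cam": 85, "Dee": 81}

# Norm-referenced: interpret by ranking learners against each other
ranking = sorted(scores, key=scores.get, reverse=True)
print("Ranked (most to least):", ranking)   # ['Ana', 'Cam', 'Dee', 'Ben']

# Criterion-referenced: interpret each score against a specified standard
CUTOFF = 80  # hypothetical outcome standard set by the educator
for learner, score in scores.items():
    status = "met standard" if score >= CUTOFF else "did not meet standard"
    print(learner, status)
```

The same scores yield two different kinds of conclusions: a ranking of who did best, versus a pass/fail decision against a fixed standard.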
PRINCIPLES OF EVALUATING ADULT LEARNERS • Involve in planning • Capable of self-evaluation • Motivated to achieve • Deal with real world problems and need real world evaluation • Like variety in teaching, learning, and evaluation
PRINCIPLES OF EVALUATING ADULT LEARNERS • Respond to feedback (evaluation) • Need frequent and informative feedback • Are self-directed • Learn from each other and can evaluate each other • Respond to choices…provide options for evaluation
GUIDELINES FOR SELECTING EVALUATION STRATEGIES • Appropriate for the domain of learning • The learner should have the opportunity to practice in the way he/she will be evaluated • Used to provide multiple ways of assessing learning • Ease of development/use on the part of the educator • Evaluation instrument should be valid and reliable
Steps to Take When Evaluating • Identify the purpose of the evaluation • Determine a time frame • Identify when to evaluate • Develop the evaluation plan • Select the instrument(s) • Collect the data • Interpret the data
Evaluations and Simulations Four areas of Evaluation: • Evaluating the simulation itself • Evaluating the Implementation phase • Evaluating student learning outcomes • Using simulation as an evaluation tool
Evaluating the simulation (design) • Good simulation design and development are needed to obtain the desired outcomes • The Simulation Design Scale (SDS) provides a measure of the importance of each design feature
Data Results • Simulation Design Scale (SDS) items analyzed using factor analyses with varimax rotation for each scale • Cronbach’s alphas calculated on each subscale and the overall scale to assess instrument reliability
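For reference, Cronbach’s alpha (the reliability coefficient named above) for a scale of k items is computed as:

```latex
% k = number of items, \sigma^2_{Y_i} = variance of item i,
% \sigma^2_X = variance of the total scale score
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)
```

Values near 1 indicate that the items vary together, i.e., high internal consistency of the scale.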
Incorporating the Educational Practices into the Simulation • Design and implement the simulation with the educational practices in mind: • Active learning • Collaboration • Diverse ways of learning • High expectations
Data Results • Educational Practices within a Simulation Scale (EPSS) items analyzed using factor analyses with varimax rotation for each scale • Cronbach’s alphas calculated on each subscale and the overall scale to assess instrument reliability
Educational Practices Scale • Factor analyses with varimax rotation on the scale items revealed 4 factors (see the sketch below)
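A minimal sketch of this kind of instrument analysis, assuming item-level Likert responses in a pandas DataFrame; the file name and the choice of the factor_analyzer and pingouin libraries are my assumptions, not part of the original study:

```python
import pandas as pd
import pingouin as pg
from factor_analyzer import FactorAnalyzer

# Hypothetical data: one row per student, one column per EPSS item
items = pd.read_csv("epss_items.csv")

# Exploratory factor analysis with varimax rotation, retaining 4 factors
fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(items)
print(fa.loadings_)  # item loadings on each of the 4 factors

# Cronbach's alpha for the overall scale; repeat on each subscale's columns
alpha, ci = pg.cronbach_alpha(data=items)
print(f"Overall Cronbach's alpha = {alpha:.2f} (95% CI {ci})")
```

Items that load strongly on a common factor define a subscale, and alpha is then checked on each subscale as well as the overall instrument, mirroring the analysis described above.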
Instrument Testing • Testing of the two instruments, the Simulation Design Scale and the Educational Practices Scale, continued in this phase • Reliabilities for both scales were found to be good • It was important to have measures to assess the quality of the designed simulations being used
Evaluating Student Learning Outcomes • Simulation technologies used to measure both process and outcomes range from case studies to standardized patients (as in an OSCE), task trainers, and high-fidelity mannequins
Example of a Clinical Competency Measure for a Curricular Thread (J. Cleary – Ridgewater, MN)
Evaluating outcomes • Formative evaluation measures: simulation is used by the learner/faculty to mark progress toward a goal • Summative evaluation measures: include determining course competencies, licensing and certification examinations, and employment decisions
Exemplars of student outcome measures used today • Knowledge • Skill performance • Learner satisfaction • Critical thinking • Self-confidence • Skill proficiency • Teamwork/collaboration • Problem-solving skills
Using simulations as the Evaluation Tool • When skill sets, clinical reasoning, and selected clinical competencies need to be measured, simulations can be used as the mechanism to do this.
Simulations to Evaluate • Set up a simulation as an evaluation activity • Issues to address: • Make sure the student is aware it is an evaluation • Describe the evaluation metrics • Make sure the scoring is objective (see the checklist sketch below)
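One way to keep scoring objective is a weighted checklist of observable behaviors; a minimal sketch, with criteria and weights that are illustrative rather than taken from the presentation:

```python
# Hypothetical weighted checklist for objective scoring of a simulation;
# the criteria and weights are illustrative only.
checklist = {
    "Performs hand hygiene": 1,
    "Verifies patient identity": 2,
    "Completes focused assessment": 3,
    "Communicates findings using SBAR": 2,
}

def percent_score(observed: set[str]) -> float:
    """Percent of the weighted criteria the student demonstrated."""
    earned = sum(w for item, w in checklist.items() if item in observed)
    return 100 * earned / sum(checklist.values())

# Example: the student demonstrated two of the four criteria
print(percent_score({"Performs hand hygiene", "Verifies patient identity"}))  # 37.5
```

Because every rater scores the same observable behaviors against the same weights, two raters watching the same performance should arrive at the same score.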
Ways to Use Simulations as an Evaluation Metric • As an Objective Structured Clinical Exam (OSCE) • Simulated patients are portrayed by hired actors • Students are immersed in specific scenarios
Ways to Use Simulations as an Evaluation Metric • Set up a simulation to measure concepts such as teamwork, patient safety competencies, ACLS protocol adherence, communication skills, or any selected intervention
Ways to Use Simulations as an Evaluation Metric • Simulations used for evaluation can take the form of computer-based learning, with scores and metrics that demonstrate knowledge, skill competency, and proficiency
Examples of products that can be used for evaluation purposes: • MicroSim DVD – scoring mechanism • ACLS computer-based software scenario packages – scoring • Cath simulators – scoring devices, standardized scales, benchmarks • CPR models/mannequins – programming and scoring
Summary • Simulations require evaluation of many variables, including the simulation design, the implementation process, and learning outcomes • In addition, simulations can serve as the mechanism to evaluate students
Nursing Implications • During course development and curriculum planning in nursing, decide what purpose the simulation encounters serve, and evaluate to make sure that purpose and its outcomes are being achieved • More evidence/documentation is needed that simulations improve clinical performance, critical thinking, and diagnostic reasoning
Conclusion “How to tell students what to look for without telling them what to see is the dilemma of teaching.” Lascelles Abercrombie
Any Questions? pjeffri2@son.jhmi.edu