PERFORMANCE ASSESSMENT
DEFINITION • Performance assessment is a type of assessment that requires students to actually perform, demonstrate, construct, and/or develop a product or a solution under defined conditions and standards.
Levels of Performance • Skill Acquisition • Skill Competency • Skill Proficiency
NATURE OF PA • Miller's pyramid: KNOWS → KNOWS HOW → SHOWS HOW → DOES. The lower levels reflect competence; the upper levels reflect performance.
DIFFERENCES BETWEEN TA AND PA
TRADITIONAL ASSESSMENT: • Curriculum drives assessment • Contrived • Selection of response • Recall/recognition • Teacher structured • Indirect evidence
PERFORMANCE ASSESSMENT: • Assessment drives curriculum • Real life • Performance of task • Application • Student structured • Direct evidence
CHARACTERISTICS OF PA • HIGHER ORDER THINKING • AUTHENTICITY • INTEGRATIVE • PROCESS AND PRODUCT • DEPTH IN PLACE OF BREADTH
ELEMENTS OF PA • A PERFORMANCE TASK • FORMAT • PREDETERMINED SCORING SYSTEM
STEPS IN CONDUCTING PA • Define the purpose • Choose the activity • Define the criteria • Create performance rubrics • Assess the performance • Provide feedback or remedial teaching
USES OF PA • DIAGNOSTIC • INSTRUCTIONAL • MONITORING
TYPES OF PA • ORAL INTERVIEWS • PROJECTS • EXPERIMENTS/DEMONSTRATIONS • PAPER AND PENCIL TESTS • PORTFOLIO • OSCE • OSATS • DOPS
RUBRICS • A rubric is a scoring tool that is used to evaluate student work or performance • Can be used to evaluate AND educate • It is a quality continuum
Checklist Vs Rubric • Checklists do not have judgment of quality. • Checklists can only be used when “present or absent” is a sufficient criterion for quality. • Rubrics include descriptors for each targeted criterion. • Rubrics provide a scale which differentiates among the descriptors.
COMPONENTS OF A RUBRIC • TASK DESCRIPTION: e.g., valuing human beings in professional nursing practice
STANDARDS OF EXCELLENCE • Degrees of quality • Usually an even number of levels • Expressed in language or numbers • May be weighted. E.g., novice, advanced beginner, competent, proficient, expert; or dependent, novice, assisted, supervised, self-directed; or 0 1 2 3 4 5
CRITERIA • The specific areas for assessment • Focus areas for instruction • Clear and relevant • Age appropriate • Form and function represented. E.g., enhances the dignity, individuality and self-esteem; provides nursing care in a safe environment
INDICATORS • Descriptors of the level of performance for each criterion • Clear, observable language • Clear to the learner • Examples for learners. E.g., informs and educates individuals about their rights; seeks consent of individuals after giving adequate and factual information
STEPS IN DEVELOPMENT OF RUBRIC • Reflecting • Listing • Grouping and labeling • Application
Rubric Development: Reflecting • Reflect on the outcomes of the activity • How does the activity relate to what you want students to learn? • What skills, knowledge, or attitudes will students need to develop to accomplish the activity well? • What evidence would students need to provide to complete the rubric? • What are your highest/lowest expectations?
Rubric Development: Listing • List the outcomes that this activity should foster • This is a brainstorm, you can evaluate the items later
Rubric Development: Grouping and Labeling Dimensions • Group similar or related performance outcomes to create dimensions • Develop a label for each dimension
Rubric Development: Grouping and Labeling Levels • Then, draft a description of each level of performance you expect for each group of outcomes • Start with the highest level, then the lowest level, and then the middle level(s) • Label the levels for your scale
Rubric Components: Scale Levels • Exemplary, proficient, marginal, unacceptable • Distinguished, proficient, intermediate, novice • Accomplished, average, developing, beginning • Excellent, good, developing • 1, 2, 3, …
Rubric Development: Application • Transfer the list and groupings to the grid • Revise as needed
Types of rubric • Holistic: Views the product or performance as a whole; describes characteristics of different levels of performance. Criteria are summarized for each score level. • Analytic: Separate facets of performance are defined, independently valued, and scored. Facets are scored separately.
3 - Excellent Researcher
• included 10-12 sources
• no apparent historical inaccuracies
• can easily tell which sources information was drawn from
• all relevant information is included
2 - Good Researcher
• included 5-9 sources
• few historical inaccuracies
• can tell with difficulty where information came from
• bibliography contains most relevant information
1 - Poor Researcher
• included 1-4 sources
• many historical inaccuracies
• cannot tell from which source information came
• bibliography contains very little information
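A holistic rubric like the "Researcher" example above can be encoded as data and used to assign a single overall level. The sketch below is illustrative only: the level names come from the slide, but the function and the shortcut of leveling on source count alone are assumptions for demonstration.

```python
# Hypothetical sketch: the three-level holistic "Researcher" rubric as data.
# A holistic rubric yields ONE overall level; here, for brevity, the level is
# chosen from a single criterion (source count) using the slide's thresholds.

RUBRIC_LABELS = {
    3: "Excellent Researcher",   # 10-12 sources, no apparent inaccuracies, ...
    2: "Good Researcher",        # 5-9 sources, few inaccuracies, ...
    1: "Poor Researcher",        # 1-4 sources, many inaccuracies, ...
}

def level_for_sources(n_sources: int) -> int:
    """Map a source count to a holistic rubric level (1, 2, or 3)."""
    if n_sources >= 10:
        return 3
    if n_sources >= 5:
        return 2
    return 1

print(level_for_sources(11), "->", RUBRIC_LABELS[level_for_sources(11)])
```

An analytic version would instead score each facet (sources, accuracy, attribution, completeness) separately and report all four scores.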
Variations on a Theme • Use check boxes for elements of levels to speed the process • Circle applicable elements in the description • Rubric with check boxes • Rubric with circled elements • Scoring rubric
How do rubrics help? • A feedback system for students to judge a product or performance • A feedback tool for teachers to provide clear, focused coaching to the learner • A system that promotes consistent and meaningful feedback over time
Issues in rubrics • Special populations. • Applications for teaching “criteria”. • Developmental rubrics. • First and second draft. • Consistency across grades/departments. • Changing tasks. • Weighting for grades. • Report cards.
Tips for good rubrics • Use as many generalized rubrics as possible. • If using pre-designed rubrics carefully consider quality and appropriateness for your project. • Aim for concise, clear, jargon-free language • Limit the number of criteria, but • Separate key criteria. • Use key, teachable criteria. • Use concrete versus abstract and positives rather than negatives • Use measurable criteria
Tips for good rubrics • Aim for an even number of levels • Create continuum between least and most • Define poles and work inward • List skills and traits consistently across levels • Include students in creating or adapting rubrics • Consider using “I” in the descriptors
OSCE • Objective • Structured • Clinical • Examination
Objective: without bias; the use of checklists and the training of examiners ensure objectivity. Structured: organized in a standardized way. Clinical: the examination covers the clinical work of a health worker. Examination: an examination that declares those who are competent to handle patients.
WHAT IS AN OSCE? • An evaluation tool that allows people to be observed performing in many different clinical situations
OSCE AND PERFORMANCE (KNOWS → KNOWS HOW → SHOWS HOW → DOES) • The OSCE combines: • Multiple observations • Standardization of content • A range of difficulty
Miller's pyramid (Miller, 1990): • DOES: skills & attitudes in practice • SHOWS HOW: professional authenticity • KNOWS HOW / KNOWS: knowledge
Goal of an OSCE • Help in the learning process of students • Produce competent health workers • Research and staff development
SKILLS ASSESSED IN OSCE • interpersonal and communication skills • history-taking skills, physical examination of specific body systems, mental health assessment • clinical decision making, clinical problem-solving skills • interpretation of clinical findings and investigations • management of a clinical situation, including treatment and referral • patient education, health promotion, acting safely and appropriately in an urgent clinical situation
Station • The area where the skill is demonstrated by the candidate • The observer checks (✓) against the skill shown in the checklist
OSCE test design: stations arranged as a carousel
IN → Station 1 → Station 2 → Station 3 → Station 4 → Station 5 → Station 6 → Station 7 → OUT
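The carousel works because every candidate advances one station per time slot, so no two candidates ever occupy the same station. This is a minimal illustrative sketch (not from the slides): station numbers and the modular-rotation rule are assumptions used to show the scheduling idea.

```python
# Illustrative sketch of a 7-station OSCE carousel: a candidate who starts at
# station index s (0-based) occupies station ((s + t) % 7) + 1 at time slot t.
# Rotating every candidate by one station per slot guarantees no collisions.

N_STATIONS = 7

def station_at(start_index: int, slot: int, n: int = N_STATIONS) -> int:
    """Return the 1-based station a candidate occupies at a given time slot."""
    return (start_index + slot) % n + 1

# With seven candidates each starting at a different station, every slot
# fills all seven stations exactly once:
for slot in range(N_STATIONS):
    occupied = [station_at(c, slot) for c in range(N_STATIONS)]
    assert sorted(occupied) == list(range(1, N_STATIONS + 1))  # no clashes
```

After seven slots, every candidate has visited every station once, which is the point of the carousel layout.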
Observer sheet
Weight of skills • A mark is awarded for every correct maneuver. However, some skills (or parts of skills) must be given more weight because of their importance.
Scoring the skill • Tick the behavior if adequate performance is observed • Count the total score as indicated • Divide the total score obtained by the total possible score and multiply by 100
Practical organization of an OSCE depends on: • The number of students to be assessed • The available staff members • The available workspace, accommodation & equipment
Organisation of OSCE • Step 1 (months ahead): initiation and basic preparation • Step 2 (weeks ahead): the week- and day-before tasks • Step 3 (days): the day-itself tasks
Step 1 (months ahead) • Set the date(s) of the OSCE • Staffing: observers • Students: the class of test subjects • Facilities: rooms, manikins, attributes
Step 2 (weeks ahead) • Identify the stations: number of stations, skill in each station • Design the observer sheet • Agree on grading: pass mark, how many failures allowed? • Secretarial tasks: announcements, roster design, task distribution
Step 3 (days) • Are all students involved properly informed? • Are all classes, practicals, etc. suspended? • Is the administration fully informed? • Are all OSCE staff members committed to their tasks?
THE DAY BEFORE THE OSCE • Organise the set of the OSCE: station rooms, requisites, manikins, waiting rooms • Security of the content of the stations • Final rehearsal & briefing of observers • The programme is distributed in minute detail
Typically… • 5 minutes per station is most common (range 3-20 minutes) • A minimum of 18-20 stations, about 2 hours, for adequate reliability • Written answer sheets, or observed performance assessed using checklists • A mix of station types/competences tested • The examination hall is often a hospital ward • The atmosphere is active and busy