
Assessment 101


Presentation Transcript


  1. Assessment 101 Zubair Amin MD MHPE

  2. “Assessment Drives Student Learning.” George E. Miller, 1919–1998

  3. “Assessment drives learning in at least four ways: its content, its format, its timing and any subsequent feedback given to the examinee.” van der Vleuten, C. (1996) The Assessment of Professional Competence: Developments, Research and Practical Implications, Advances in Health Sciences Education, 1, pp. 41–67.

  4. “The ‘law’ of educational cause and effect states that: for every evaluative action, there is an equal (or greater) (and sometimes opposite) educational reaction.” Schuwirth, L.W.T. (2001) General Concerns About Assessment. Web address: www.fdg.unimaas.nl/educ/lambert/ubc.

  5. Evolution of Medical Students (website by NUS students: http://medicus.tk). “Assessment drives learning in the direction you wish.”

  6. Linking Learning, Assessment, and Feedback

  7. Assessment. Learning, teaching and assessment must not be viewed as isolated concepts: in the ideal scenario, effective teaching, effective learning and effective assessment are all parts of the same educational process. The value of changing assessment to reflect what needs to be learned is evident, since students learn what they know they will be tested on. Doyle, W. (1983) Academic Work, Review of Educational Research, 52, pp. 159–199.

  8. We should assess what we teach and teach what we assess.

  9. Assessment Fundamentals • Why do we assess? • What should we assess? • When should we assess? • How should we assess?

  10. Why Do We Assess? • Determine whether learning outcomes are met • Support students’ learning • Certification and competency judgment • Teaching program development and implementation • Accountability • Understanding the learning process

  11. Assessment Serves Multiple Stakeholders • Students • Teachers • Departments, faculties, universities, administrators • Public, governmental agencies • Stakeholders’ interests in assessment are not necessarily aligned.

  12. Stakeholders’ Priorities (diagram): students; teachers; faculty and university; public and government.

  13. What Should We Assess?

  14. Miller’s pyramid of clinical competence, from base to apex: Knows, Knows how, Shows how, Does. The lower levels assess cognition (knowledge); the upper levels assess behaviour (performance), with professional authenticity increasing toward the apex. Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63–S67.

  15. What Should We Assess? Mapping Miller’s levels to assessment methods: Does → performance assessment in vivo; Shows how → performance assessment in vitro; Knows how → context-based tests; Knows → factual tests.

  16. When Should We Assess? Mastery is often pictured as an ‘all or none’ state reached by the date of examination, but in reality it is not.

  17. Continuum of Performance • ‘Learning Curve’

  18. An examination that attempts to test students’ mastery at a single point in time is less preferable than one that tests mastery over a span of time.

  19. How Should We Assess? • Utility of Assessment Instruments: • Validity • Reliability • Educational Impact • Cost • Acceptability Utility = validity × reliability × educational impact × cost effectiveness × acceptability
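A note on the formula: van der Vleuten’s utility model is multiplicative, so a weakness in any one component drags the whole utility down, and a zero in any component zeroes it out. A minimal sketch of that idea in Python (the 0–1 scoring scale and the example scores are illustrative assumptions, not from the slide):

```python
def utility(validity, reliability, educational_impact,
            cost_effectiveness, acceptability):
    """Van der Vleuten's conceptual utility model: the product of the
    five components. Scores are assumed to lie in [0, 1]; a zero on
    any single component makes the instrument useless overall."""
    return (validity * reliability * educational_impact
            * cost_effectiveness * acceptability)

# Hypothetical scores for a traditional long-case oral examination
print(round(utility(validity=0.8, reliability=0.4, educational_impact=0.7,
                    cost_effectiveness=0.6, acceptability=0.9), 2))  # 0.12
```

The multiplication, rather than addition, is the point: no amount of reliability can compensate for an instrument that measures the wrong thing.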

  20. Validity • Validity: Ability of the assessment instrument to test what it is supposed to test. • Example: The course aims to determine whether the students are able to communicate effectively. • What assessment instrument would you choose for the given purpose?

  21. Content validity: the ability of the assessment instrument to sample representative content of the course. (Diagram: course content versus assessment sampling.)
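One common way to operationalize content validity is an assessment blueprint that allocates exam items in proportion to each topic’s weight in the course, so the paper samples the content representatively. A minimal sketch (the topics, weights, and item count are hypothetical):

```python
# Hypothetical course blueprint: topic -> share of teaching time
blueprint = {"cardiology": 0.30, "respiratory": 0.25,
             "gastroenterology": 0.25, "endocrinology": 0.20}

def allocate_items(blueprint, total_items):
    """Distribute exam items across topics in proportion to their
    blueprint weights. Simple rounding can leave the total off by an
    item; a real blueprint would reconcile the remainder by hand."""
    return {topic: round(weight * total_items)
            for topic, weight in blueprint.items()}

print(allocate_items(blueprint, total_items=60))
# {'cardiology': 18, 'respiratory': 15, 'gastroenterology': 15, 'endocrinology': 12}
```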

  22. Reliability • Reliability refers to the consistency of test scores; the concept is linked to specific types of consistency: • over time • between different examiners • under different testing conditions • Instruments for student assessment need high reliability to ensure transparency and fairness.
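Of the consistencies listed above, agreement between different examiners is the easiest to illustrate. A minimal sketch computing Cohen’s kappa, a chance-corrected agreement statistic, for two examiners rating the same students (the ratings are hypothetical):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two examiners' ratings:
    kappa = (observed agreement - expected agreement) / (1 - expected)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    marg_a, marg_b = Counter(rater_a), Counter(rater_b)
    expected = sum(marg_a[c] * marg_b[c]
                   for c in set(marg_a) | set(marg_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical pass/borderline/fail ratings for ten students
examiner_1 = ["pass", "pass", "fail", "pass", "borderline",
              "pass", "fail", "pass", "pass", "borderline"]
examiner_2 = ["pass", "fail", "fail", "pass", "borderline",
              "pass", "fail", "borderline", "pass", "pass"]
print(f"kappa = {cohens_kappa(examiner_1, examiner_2):.2f}")  # kappa = 0.50
```

A kappa near 1 indicates agreement far beyond chance; a value this low would usually prompt examiner training or a clearer rating rubric.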

  23.–25. (Diagrams: examiner and question.)

  26. Educational impact

  27. There is probably more bad practice and ignorance of significant issues in the area of assessment than in any other aspect of higher education. Boud, 1995

  28. … this would not be so bad if it were not for the fact that the effects of bad practice are far more potent than they are for any aspect of teaching. Students can, with difficulty, escape from the effects of poor teaching; they cannot (by definition, if they want to graduate) escape from the effects of poor assessment. Boud, 1995

  29. Assessment is a moral activity. What we choose to assess and how shows quite starkly what we value. Knight, 1995

  30. Criteria for good assessment: • Assessments need to be reproducible (reliable), valid, feasible, fair, and beneficial to learning; • the content and form of assessments need to be aligned with their purpose and desired outcomes; • student performance is case- or content-specific, and broad sampling is needed to achieve an accurate representation of ability (multiple biopsies); • systematically derived pass–fail scores and the overall reliability of an assessment are important; and • assessments need to be constructed according to clearly defined standards and derived using systematic and credible methods. Norcini, J., et al. (2011) Criteria for Good Assessment, Medical Teacher.

  31. Backbone of Assessment • Select a few assessment tools for most assessment purposes • High quality, high psychometric value, and relatively easy to administer • Look at clinical competence as a whole (i.e., at the programmatic level) • Use assessment to create and support learning
