Assessment 101
Zubair Amin, MD, MHPE
“Assessment Drives Student Learning.” George E. Miller (1919-1998)
“Assessment drives learning in at least four ways: its content, its format, its timing and any subsequent feedback given to the examinee.” van der Vleuten, C. (1996) The Assessment of Professional Competence: Developments, Research and Practical Implications, Advances in Health Sciences Education, 1, pp. 41–67.
“The ‘law’ of educational cause and effect states that: for every evaluative action, there is an equal (or greater) (and sometimes opposite) educational reaction.” Schuwirth, L.W.T. (2001) General Concerns About Assessment. Web address: www.fdg.unimaas.nl/educ/lambert/ubc.
Evolution of Medical Students (website by NUS students: http://medicus.tk). “Assessment drives learning in the direction you wish.”
Assessment
Learning, teaching and assessment must not be viewed as isolated concepts. In the ideal scenario, effective teaching, effective learning and effective assessment are all part of the same educational process. The value of changing assessment to reflect what needs to be learned is evident, since students learn what they know they will be tested on. Doyle, W. (1983) Academic Work, Review of Educational Research, 52, pp. 159–199.
We should assess what we teach and teach what we assess.
Assessment Fundamentals
• Why do we assess?
• What should we assess?
• When should we assess?
• How should we assess?
Why Do We Assess?
• Determine whether learning outcomes are met
• Support of students’ learning
• Certification and competency judgment
• Teaching program development and implementation
• Accountability
• Understanding the learning process
Assessment Serves Multiple Stakeholders
• Students
• Teachers
• Department, Faculty, University, Administrators
• Public, Governmental Agencies
• Stakeholders’ interests in assessment are not necessarily aligned.
[Diagram: Stakeholders’ Priorities, spanning students, teachers, faculty and university, and the public and government]
[Miller’s pyramid: Knows, Knows how, Shows how, Does. The lower levels (Knows, Knows how) reflect cognition and knowledge; the upper levels (Shows how, Does) reflect performance and professional authenticity.] Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63–S67.
What Should We Assess?
[Miller’s pyramid mapped to assessment formats: Does: performance assessment in vivo; Shows how: performance assessment in vitro; Knows how: context-based tests; Knows: factual tests]
When Should We Assess?
Concept of mastery: an ‘all or none’ state? Not really.
[Figure label: Date of Examination]
Continuum of Performance • ‘Learning Curve’
An examination that attempts to test students’ mastery at a given point in time is less preferable than one that tests mastery over a span of time.
How Should We Assess?
• Utility of Assessment Instruments:
  • Validity
  • Reliability
  • Educational Impact
  • Cost
  • Acceptability
Utility = validity × reliability × educational impact × cost effectiveness × acceptability
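A brief aside (not on the original slide): written as van der Vleuten’s multiplicative utility model, the line above is

\[ U = V \times R \times E \times C \times A \]

where U is utility, V validity, R reliability, E educational impact, C cost effectiveness and A acceptability. Because the factors multiply, an instrument that scores close to zero on any single criterion has close to zero overall utility, however strong the others are; in van der Vleuten’s formulation each factor can also carry a weight that depends on the context of the assessment.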
Validity
• Validity: the ability of the assessment instrument to test what it is supposed to test.
• Example: the course aims to determine whether the students are able to communicate effectively.
• What assessment instrument would you choose for the given purpose?
Content validity: the ability of the assessment instrument to sample representative content of the course.
[Diagram: overlap between course content and assessment]
Reliability
• Reliability refers to the consistency of test scores; the concept is linked to specific types of consistency:
  • Over time
  • Between different examiners
  • Across different testing conditions
• Instruments for student assessment need high reliability to ensure transparency and fairness.
[Diagrams: the examiner and the question as sources of variability in test scores]
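To make “consistency between different examiners” concrete, here is a minimal illustrative sketch (not from the slides; the scores and the choice of index are assumptions) that estimates inter-rater consistency as the correlation between two examiners’ marks for the same students:

from statistics import correlation  # requires Python 3.10+

# Hypothetical marks awarded by two examiners to the same eight students
examiner_a = [72, 65, 80, 58, 90, 77, 63, 85]
examiner_b = [70, 68, 78, 55, 92, 75, 60, 88]

# Pearson correlation as a simple index of inter-rater consistency:
# values close to 1.0 mean the two examiners rank and score students similarly.
r = correlation(examiner_a, examiner_b)
print(f"Inter-rater correlation: {r:.2f}")

In practice, assessment programmes use more robust indices (for example generalisability coefficients, or Cohen’s kappa for categorical judgements), but the underlying idea is the same: quantify how much of a score reflects the student rather than the examiner or the question.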
There is probably more bad practice and ignorance of significant issues in the area of assessment than in any other aspect of higher education. Boud, 1995
… this would not be so bad if it were not for the fact that the effects of bad practice are far more potent than they are for any aspect of teaching. Students can, with difficulty, escape from the effects of poor teaching; they cannot (by definition, if they want to graduate) escape from the effects of poor assessment. Boud, 1995
Assessment is a moral activity. What we choose to assess and how shows quite starkly what we value. Knight, 1995
Criteria for Good Assessment (Norcini et al., 2011, Med Teach)
• Assessments need to be reproducible (reliable), valid, feasible, fair, and beneficial to learning;
• Content and form of assessments need to be aligned with their purpose and desired outcomes;
• Student performance is case or content specific, and broad sampling is needed to achieve an accurate representation of ability (multiple biopsies);
• Systematically derived pass-fail scores and the overall reliability of an assessment are important; and
• Assessments need to be constructed according to clearly defined standards and derived using systematic and credible methods.
Backbone of Assessment
• Select a few assessment tools for most assessment purposes
• High quality, high psychometric value, and relatively easy to administer
• Look at clinical competency as a whole (i.e., at the programmatic level)
• Use assessment to create and support learning