Assessment Design and E-learning
Geoffrey Crisp, ALTC National Teaching Fellow
Director, Centre for Learning and Professional Development, University of Adelaide
Assessment 2.0 examples from ALTC Fellowship
http://www.transformingassessment.com
4/09/2014
Outline of presentation
• introduction to learning and assessment
• e-assessment design
• evaluating your assessment questions
• future assessment tasks
Typical learning and assessment today?
http://www.flickr.com/photos/cristic/359572656/
http://www.pharmtox.utoronto.ca/Assets/Photos/pcl473+classroom+editted.JPG
New learning environments for students and teachers
“Big problem” learning and assessment
http://www.nasaimages.org/luna/servlet/detail/nasaNAS~9~9~58363~162207:
Assessment tasks should be worth doing
• if students can answer questions by copying from the web, they are being set the wrong questions
• if students can answer questions by using Google, they are being set the wrong questions
• if students can answer questions by guessing, they are being set the wrong questions
“Why the hell am I doing this course?”
Students’ perception of learning strategies?
http://blog.oregonlive.com/pdxgreen/2008/02/chimp.jpg
Operationalising assessment for current and future learning
• authentic tasks
• life-long learning
• authentic tools
• meaningful feedback
• self-review and critique
• standards
• learning-oriented assessment
What role for (virtual) learning management systems?
http://4.bp.blogspot.com/_hBiBaUg_1rA/SJTQE0ymK7I/AAAAAAAABxI/OIKhMiaaQ2E/s400/confusing_signs2.jpg
http://www.teach-ict.com/ecdl/module_1/workbook15/miniweb/images/stressed.jpg
Question types - LMS
Effective assessment design
Graham Gibbs, David Nicol and David Boud
Nicol, D. (2007). E-assessment by design: using multiple-choice tests to good effect. Journal of Further and Higher Education, 31(1), 53–64.
JISC - Reports and papers on (e)-assessment
http://www.jisc.ac.uk/whatwedo/programmes/elearning/assessment/digiassess.aspx
Report on Summative E-Assessment Quality (REAQ)
MCQ (selected response) - effort is front-loaded:
• starting with easier questions and making later questions more difficult
• checking assessments with subject matter experts and high performers
• identifying ‘weak’ questions and improving or eliminating them
http://www.jisc.ac.uk/media/documents/projects/reaqfinalreport.pdf
Report on Summative E-Assessment Quality (REAQ)
• reviewing question content to ensure syllabus coverage
• assisting academics who may have limited experience of psychometrics
• attending to security
• using accessibility guidelines
http://www.jisc.ac.uk/media/documents/projects/reaqfinalreport.pdf
Why do we assess students?
• it encourages (current and future) learning
• it provides feedback on learning to both the student and the teacher
• it documents competency and skill development
• it allows students to be graded or ranked
• it validates certification and licence procedures for professional practice
• it allows benchmarks to be established for standards
In some cases this hasn’t changed much for decades.
Types of assessment responses
• convergent type, in which one ‘correct’ response is expected, and divergent type, in which the response depends on opinion or analysis
• assessment requiring convergent responses has its origins in mastery-learning models and involves assessment of the learner by the master-teacher
• assessment requiring divergent responses is associated with a constructivist view of learning, where the teacher and learner engage collaboratively within Vygotsky’s zone of proximal development
You need to think about:
• whether a norm-referenced or criterion-referenced assessment scheme is more appropriate for the particular learning outcomes
• whether the process of solving a problem and the product of solving a problem are both assessed, and what the relative weighting is for the two components
• whether constructed or selected responses are appropriate
Why use e-assessment for selected responses?
• flexibility in test delivery
• providing timely feedback
• easy reporting and analysis of student responses
• construction of questions is straightforward, but designing good assessment items is difficult
• can reduce overall workload for academics, but effort is front-loaded
Design for MCQ exams for summative use
• use of question banks
• security
• guessing
Certainty-Based Marking (CBM)
Tony Gardner-Medwin - Physiology (NPP), UCL
• CBM rewards thinking: identification of uncertainty or of justification
• highlights misconceptions - negative marks hurt!
• engages students more
• enhances reliability & validity
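The incentive structure behind CBM can be sketched in a few lines. The mark table below follows the scale commonly quoted for Gardner-Medwin's scheme (confidence 1/2/3 earns 1/2/3 marks when correct, but 0/-2/-6 when wrong); treat the exact numbers and the function name as illustrative rather than definitive:

```python
# Illustrative certainty-based marking (CBM) scheme.
# High confidence only pays off when the answer is justified: a confidently
# wrong answer is penalised far more heavily than a cautious guess.

CORRECT_MARKS = {1: 1, 2: 2, 3: 3}   # marks for a correct answer at each confidence
WRONG_MARKS = {1: 0, 2: -2, 3: -6}   # penalties for an incorrect answer

def cbm_mark(correct: bool, confidence: int) -> int:
    """Return the CBM mark for one answer at confidence 1 (low) to 3 (high)."""
    if confidence not in (1, 2, 3):
        raise ValueError("confidence must be 1, 2 or 3")
    return CORRECT_MARKS[confidence] if correct else WRONG_MARKS[confidence]

print(cbm_mark(True, 3))   # 3  - confident and right
print(cbm_mark(False, 3))  # -6 - confident and wrong: a misconception hurts
print(cbm_mark(False, 1))  # 0  - a hedged wrong answer costs nothing
```

Under this scoring it is never rational to bluff: reporting genuine uncertainty maximises the expected mark, which is why CBM can surface misconceptions that plain right/wrong marking hides.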
[Diagram: isolated ‘nuggets of knowledge’ versus networks of understanding, built from evidence, inference, choice and confidence (degree of belief)]
Thinking about uncertainty and justification stimulates understanding: confidence-based marking places greater demands on justification. To understand is to link correctly the facts that bear on an issue.
Report on Summative E-Assessment Quality (REAQ)
The design principles for preparing quality assessment tasks in higher education have been well documented (Biggs, 2002; Bull & McKenna, 2003; Case & Swanson, 2001; Dunn, Morgan, Parry & O'Reilly, 2004; James, McInnis & Devlin, 2002; McAlpine, 2002a; PASS-IT). There is also an extensive body of work on validity and reliability testing for assessments, and numerous readily available descriptions for academics of how to apply both psychometric principles and statistical analyses based on probability theories, in the form of Classical Test Theory and Item Response Theory, particularly the Rasch model (Baker, 2001; Downing, 2003; McAlpine, 2002b; Wright, 1977).
Report on Summative E-Assessment Quality (REAQ)
Thus, there is no shortage of literature examples for academics to follow on preparing and analysing selected-response questions; academics and academic developers should be in a position to continuously improve the quality of assessment tasks and student learning outcomes. However, the literature evidence that academics and academic developers generally use these readily available tools and theories is sparse (Knight, 2006).
Analysing student responses
Crisp, G. T. & Palmer, E. J. (2007). Engaging academics with a simplified analysis of their multiple-choice question (MCQ) assessment results. Journal of University Teaching & Learning Practice, 4(2), Article 4.
Survey of academics
Academics were familiar with common statistical terms such as mean, median, standard deviation and percentiles. Some were familiar with the different types of terms used to describe validity, but very few were aware of the formal psychometric approaches associated with Classical Test Theory, the Rasch model or Item Response Theory.
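For academics meeting the Rasch model for the first time, it helps to see that it reduces to a single logistic formula: the probability of a correct response depends only on the difference between a student's ability and the item's difficulty (both on the same logit scale). A minimal sketch, with variable names of my own choosing:

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Rasch model: P(correct) = 1 / (1 + exp(-(ability - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A student whose ability exactly matches the item difficulty has a 50% chance:
print(rasch_probability(0.0, 0.0))                # 0.5
# Ability two logits above the difficulty makes success much more likely:
print(round(rasch_probability(2.0, 0.0), 3))      # 0.881
```

Fitting the model means estimating the ability and difficulty parameters from a response matrix, but the formula itself is all that is needed to read Rasch output: items and students sit on one common scale.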
Score distribution: 50-60%
Facility Index - Classical Test Theory: 0.3 to 0.8 (the facility index is the proportion of students who answer an item correctly)
How effective are distracters? Benchmark: 20-30% for each distracter
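The facility-index and distracter benchmarks above are mechanical enough to compute directly from raw responses. A minimal sketch, where the response data and the function name are invented for illustration; each entry is one student's chosen option for a single item:

```python
# Simple classical item analysis for one MCQ item:
# facility index = fraction of students choosing the keyed option,
# plus the share of responses drawn by each option (to spot dead distracters).
from collections import Counter

def analyse_item(responses, correct_option):
    """Return (facility index, share of responses per option) for one item."""
    n = len(responses)
    counts = Counter(responses)
    facility = counts[correct_option] / n
    shares = {option: count / n for option, count in counts.items()}
    return facility, shares

responses = ["A", "A", "B", "C", "A", "D", "B", "A", "A", "C"]  # 10 students
facility, shares = analyse_item(responses, "A")
print(f"facility index: {facility:.2f}")  # 0.50 - inside the 0.3-0.8 band
for option in sorted(shares):
    print(f"option {option}: {shares[option]:.0%}")
```

An option that almost nobody selects is doing no work as a distracter and is a candidate for rewriting; a facility index outside the 0.3-0.8 band flags an item that is too hard or too easy to discriminate.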
Process of problem solving - IMMEX http://www.immex.ucla.edu
IMMEX output
Kong, S. C., Ogata, H., Arnseth, H. C., Chan, C. K. K., Hirashima, T., Klett, F., Lee, J. H. M., Liu, C. C., Looi, C. K., Milrad, M., Mitrovic, A., Nakabayashi, K., Wong, S. L., Yang, S. J. H. (eds.) (2009). Proceedings of the 17th International Conference on Computers in Education.
Role Plays
http://www.ucalgary.ca/fp/MGST609/simulation.htm
http://www.roleplaysim.org/papers/default.asp?Topic=toc9
What Happens in a Role Play?
Adopt a role → issues & problems occur → interaction & debate → reflection & learning
Scenario-based learning
http://www.pblinteractive.org
Future assessments?
• Will we see universal development of immersive and authentic learning and assessment environments?
• Will assessments measure approaches to problem solving and student responses in terms of efficiency, ethical considerations and the involvement of others?
• Will teachers be able to construct future assessments or will this be a specialty activity?