Piloting CAA: All Aboard Gavin Sim & Phil Holifield
Overview • Introduction: CAA at UCLAN • Key Challenges • Staff Uptake • Framework • Staff Development • Students • Other Stakeholders • Conclusions and Discussions
Introduction • Teaching and Learning strategy incorporated e-learning, mainly content development • First summative CAA test run in WebCT, summer 2003 • TRIADS and Questionmark evaluated for the pilot study • Questionmark adopted: felt easier for staff to develop their own questions
Introduction • Technical infrastructure analysed; key concerns: • Scalability – expansion over time • Connectivity – internal and external colleges • Bandwidth – 10 Mbps available, sufficient for multimedia • Dedicated server purchased to host Questionmark, Internet Information Server & SQL Server • Integration with other systems a concern, but addressed later • Piloting within the Department of Computing
Key Challenges • Encouraging staff uptake • Staff development • Stakeholder acceptance, e.g. management • CAA's perceived ability to test a range of cognitive skills • Practical issues – labs
Methodology • Staff questionnaire: n=34, response rate 64% • Views in relation to CAA, support and training • Framework developed based on Bloom's Taxonomy; applied with 6 staff across 8 modules • Student questionnaire: n=86, response rate 94% • Acceptance of the technique • Question styles • Language used • Usability
Staff Uptake • Computing encompasses a range of subjects, from the technical (networking) to the subjective (HCI) • CAA may lend itself more readily to assessment in specific disciplines • Questionnaire revealed only five members of staff had used CAA, three actively using it • To encourage uptake, CAA is being incorporated into the department's strategy: formative CAA for all level 1 modules, with summative use optional
Staff Uptake • Five staff now using CAA within the department • Questionnaire revealed: • 91% use CAA for formative assessment • 56% for summative assessment • The difference could be attributed to the level the lecturer teaches
Framework • Analysing the structure of a module to identify how CAA could be incorporated • [Diagram: Learning Outcomes, other assessment formats, Bloom's Taxonomy, CAA, Syllabus]
Framework • Number of Learning Outcomes at each level of Bloom's Taxonomy [chart]
Framework • Variation in the number of Learning Outcomes per module, from 3 to 8 • Level 1 modules sit at a lower cognitive level • Level 2 module CO2601 (Technical Solutions and Business) requires students to demonstrate similar ability to that found on CO3707 • Next step is to identify elements of the syllabus and their relationship to the Learning Outcomes • This prevents unrelated content being integrated into the exam
Framework • Example for CO3707: identify the parts of the syllabus that relate to the learning outcomes [table]
Framework • Number of syllabus elements at each level of Bloom’s Taxonomy
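A minimal sketch of the framework's mapping step as described on these slides: link each learning outcome to a Bloom level and to the syllabus elements it assesses, then flag syllabus elements with no related outcome so they are kept out of the CAA test. The module code, outcome labels and syllabus elements below are illustrative placeholders, not data from the pilot.

    # Sketch (illustrative only): map learning outcomes to Bloom levels and syllabus elements
    BLOOM_LEVELS = ["Knowledge", "Comprehension", "Application",
                    "Analysis", "Synthesis", "Evaluation"]

    # learning outcome -> (Bloom level, syllabus elements it assesses); hypothetical content
    outcomes = {
        "LO1": ("Comprehension", ["HTML basics", "HTTP overview"]),
        "LO2": ("Application",   ["Form handling", "Session state"]),
        "LO3": ("Analysis",      ["Usability evaluation"]),
    }

    syllabus = ["HTML basics", "HTTP overview", "Form handling",
                "Session state", "Usability evaluation", "History of the web"]

    # Count outcomes at each Bloom level (the kind of chart shown on the earlier slide)
    per_level = {level: 0 for level in BLOOM_LEVELS}
    for level, _ in outcomes.values():
        per_level[level] += 1
    print(per_level)

    # Syllabus elements not linked to any outcome should not appear in the exam
    covered = {element for _, elements in outcomes.values() for element in elements}
    unrelated = [element for element in syllabus if element not in covered]
    print("Exclude from exam:", unrelated)   # -> ['History of the web']

In practice the mapping would be drawn from the module descriptor rather than hard-coded, but the check at the end captures the slide's point about preventing unrelated content entering the test.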
Framework • Will be used on the MSc Web Development module • The module is assessed entirely by coursework • Formative test in the first semester • Enables students to gain early feedback • Lecturer obtains an early indication of their progress • The framework shows how staff can integrate CAA into modules, but further development is necessary
Staff Development • Staff were asked 'Would you be prepared to input the questions into the software yourself?' • 80% Yes • May not reflect the attitude of staff in other departments
Staff Development • 74% of lecturers felt they needed support in question design • LDU organised staff development in CAA: 'An introduction to Computer Assisted Assessment' • CIF bid for funding to pay a developer to work with staff on developing multimedia questions • 81% felt more time is required to write questions • Question banks and experience should reduce this time • 61% of lecturers felt help with invigilation is essential
Staff Development • Informal focus groups • Discuss problems and share experiences • How to accommodate students with special needs • Invigilation issues • Risk issues, e.g. server failure • Without this, students' experience may differ from module to module
Students • Attitude measured through a series of questionnaires • Students asked 'Would you find this format of assessment an acceptable replacement for part of your final exam?' • 5-point Likert scale, Strongly Disagree=0, Strongly Agree=4 • Mean=2.9, SD=0.9, 99% confidence interval ±0.26 • Indicates a reasonable level of support
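A quick check of the reported interval, assuming the 81 completed questionnaires mentioned on a later slide and a normal approximation; this is a verification sketch, not a calculation taken from the paper.

    # Verify the reported 99% confidence interval (assumes n=81 respondents)
    import math

    n, sd = 81, 0.9          # respondents, standard deviation of Likert scores
    z_99 = 2.576             # z value for a two-sided 99% interval
    margin = z_99 * sd / math.sqrt(n)
    print(round(margin, 2))  # -> 0.26, matching the slide

The same calculation with SD=0.987 gives roughly ±0.28, which matches the interval quoted for the stress question on the next slide, suggesting that figure is also a 99% interval.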
Students • Research into computer anxiety and CAA (Liu et al. 2001; Zakrzewski & Steven 2000) • Concern: students had no prior experience of Questionmark • 'This format of assessment is more stressful than a paper based test' • Mean=0.99, SD=0.987, confidence interval ±0.28 • Comments: 'I prefer completing a test in this way as it is less intimidating' • 'As a computer geek I feel more at ease in front of a computer.' (final exam)
Students • 'Did you have any difficulties accessing the test?' • 14% Yes • The majority of problems involved copying a password containing white space from an email • The software could trim white space • Authentication could instead be achieved through an LDAP process
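A minimal illustration of the trimming fix suggested on the slide; this is not Questionmark's code, just a sketch of stripping stray whitespace from a password pasted out of an email before it is checked.

    # Sketch: normalise a pasted password before authentication
    def normalise_pasted_password(raw: str) -> str:
        # remove leading/trailing spaces, tabs and newlines picked up while copying
        return raw.strip()

    assert normalise_pasted_password("  s3cretPass \n") == "s3cretPass"

An LDAP-backed login, as mooted on the slide, would remove the emailed password entirely and with it the copy-and-paste problem.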
Students • Questionmark used question-by-question delivery with standard templates • The suitability of a number of templates was questioned, e.g. scrolling, navigation • Idea: build a template bank
Students • Series of questions relating to the interface
Students • 81 students completed the questionnaire, 30 provided qualitative feedback • A facility to go directly back to the previous question was requested (11 times) • The 'Proceed' button was felt to be inappropriately placed near the main navigation • These features will be incorporated into the forthcoming test and further analysis will be conducted
Other Stakeholders • Information Systems Services and management informed through a steering committee • Responsible for reporting the findings of the evaluation with a view to institution-wide deployment • Without the support of management, additional resources will not be made available
Conclusions and Discussions • Scepticism about the appropriateness of CAA at levels 3 and 4 for summative assessment • The framework showed how it may be incorporated; further research required • Adopting CAA into the department's strategy increased uptake, but staff development is necessary • Students responded positively to the experience • The logging-in process could be improved • A comparison of WebCT and Questionmark is planned