Improving student success through the implementation of weekly, student-unique CAA tutorial sheets. Mark Russell & Peter Bullen, University of Hertfordshire
Why listen to me? • This CAA implementation has made a difference! • Student attendance is up. • Exam performance is up. • Retention of students is likely to be up. • This project is transportable to other areas.
Background. • First-year module in Fluid Mechanics and Thermodynamics. • ~150 students. • 4 teaching staff on the team. • Variety of teaching and learning settings. • Explicit use of the University's MLE (StudyNET).
So why bother? • Poor exam performance. Possible causes include: • The language of the subject is probably new to many students. • There is a need for some mathematical ability. • Attendance at lectures and tutorials is patchy. • The exam performance is particularly concerning given the development of structured learning materials and a real desire to support the students using StudyNET. • We believed we were doing our bit!
The perceived problem. • Students did not always expend the required effort. • Tutorial questions remained unanswered. • Revision time became the primary learning time! • Did the assessment process actively support the learning?
What do we want to do? • Actively encourage (force!) students to engage with all the materials. Why? • Consolidates learning. • Helps apply the new language. • The maths is problem-oriented. • Develops confidence in the students' ability. • Forces a structured study pattern.
How did we do it? • Developed an integrated, summative assessment regime. Its key features were: • Weekly. • Student-unique. • Forcing.
The integrated assessment regime included • A student-unique, Weekly Assessed Tutorial Sheet (WATS). • An evolving automated marking sheet. • A manual nudge to non-participants. • A full worked solution (uploaded to StudyNET after student submissions). • A report on the group's weekly performance. • Some generic notes on issues observed.
How did we do it? • ‘Word’ sets the generic question. • ‘Excel’ creates randomised numeric data. • ‘Mail merge’ combines the generic question with the randomised numeric data to create a student-unique WATS. • StudyNET is used to deliver the questions to the students. (An illustrative sketch of the data-generation step follows below.)
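The data-generation step above used Excel and a Word mail merge rather than program code. Purely as an illustrative stand-in, written in C++ (the language the authors later used for the submission program), the sketch below shows how per-student randomised parameters for a generic question could be produced and written to a CSV file ready for a mail merge; the student IDs, parameter ranges and file name are hypothetical.

// Illustrative stand-in for the Excel step: one row of randomised question
// data per student, ready to mail-merge into the generic WATS question.
// Student IDs, parameter ranges and the file name are invented examples.
#include <fstream>
#include <random>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> studentIds = {"S001", "S002", "S003"}; // normally read from the class register
    std::mt19937 rng(2004);                                         // fixed seed so the sheet can be regenerated
    std::uniform_real_distribution<double> flow(0.5, 2.0);          // volumetric flow rate, m^3/s
    std::uniform_int_distribution<int> temperature(20, 80);         // water temperature, degrees C

    std::ofstream out("wats_week1_data.csv");
    out << "StudentID,FlowRate_m3s,Temperature_C\n";
    for (const auto& id : studentIds) {
        // Each student receives their own numbers, so a shared final answer is useless.
        out << id << ',' << flow(rng) << ',' << temperature(rng) << '\n';
    }
    return 0;
}

Because every student works from different numbers, the method, not the final answer, becomes the only thing worth sharing.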
What did it look like? • 11 WATS were developed. • Each WATS had parent and child questions. • Each WATS had to be completed within one week. • The students had to submit a hard copy of their results sheet. • A supplementary Excel marking sheet was developed to help the MANUAL marking (a sketch of the marking rule follows below). • Marking and analysis of the group's performance was prompt: within one day of submission.
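The marking aid itself was an Excel sheet; the sketch below is only a hypothetical illustration of the kind of rule such a sheet encodes: recompute the expected answer from a student's unique data and accept the submission if it lies within a small tolerance. The formula (a simple mass flow rate) and the 2% tolerance are invented for the example.

// Hypothetical marking rule: recompute the expected answer from the student's
// unique data and accept submissions within a relative tolerance.
#include <cmath>
#include <iostream>

// Invented parent question: mass flow rate = density * volumetric flow rate.
double expectedAnswer(double density, double flowRate) {
    return density * flowRate;
}

bool markAnswer(double submitted, double expected, double relTol = 0.02) {
    return std::fabs(submitted - expected) <= relTol * std::fabs(expected);
}

int main() {
    double expected  = expectedAnswer(998.0, 1.25); // this student's unique data
    double submitted = 1245.0;                      // value copied from their results sheet
    std::cout << (markAnswer(submitted, expected) ? "correct" : "incorrect") << '\n';
    return 0;
}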
As time went on … • WATS 9–11 inclusive had an automated student submission sheet. • Unfortunately, the timing of the development and server security issues did not allow widespread deployment; we had to settle for loading it on one PC only. • Students were issued passwords and were only allowed one submission per WATS (sketched below). A plea to their good nature not to go looking for things to delete also helped!
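The single-submission rule was enforced through the Excel submission sheet and passwords; as a hypothetical sketch only, the fragment below shows how a "one submission per student per WATS" gate could be expressed in code by remembering each (student, WATS number) pair the first time it is seen.

// Hypothetical sketch of the "one submission per student per WATS" rule:
// record each (student, WATS number) pair and reject any repeat attempt.
#include <iostream>
#include <set>
#include <string>
#include <utility>

class SubmissionGate {
    std::set<std::pair<std::string, int>> seen_;
public:
    // Returns true only for the first submission from this student for this WATS.
    bool accept(const std::string& studentId, int watsNumber) {
        return seen_.insert({studentId, watsNumber}).second;
    }
};

int main() {
    SubmissionGate gate;
    std::cout << gate.accept("S001", 9) << '\n'; // 1: first attempt accepted
    std::cout << gate.accept("S001", 9) << '\n'; // 0: repeat attempt rejected
    return 0;
}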
Why did we choose this approach? • It did not suffer the more obvious potential drawbacks of an MCQ CAA approach, i.e.: • Answers to Q1 are not all the same! • Fairness of the test / equality of questions in a question bank. • Does not tie students to a PC. • Does not inadvertently create bias. • Does not inadvertently give hints. • A ‘chance’ answer is not an issue.
Key benefits. • Stops solution sharing at source. • Students could still help each other. • Students get the feedback they so often like. • Forces a structured study pattern. • ‘Not cool to be studious’ is not an issue with this approach. • Attempts to engage with everybody. • Allows students to see where they went wrong.
Critical success factors. • Attendance at lectures and tutorials was improved. • More tutorial questions were tackled by the students. • But what about exam performance ?
Can WATS predict exam grade? 52% of the students had a difference of 15 percentage points or less between their WATS mark and their exam grade (the calculation is illustrated below).
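For clarity, the figure quoted is simply the share of students whose overall WATS mark and exam mark differ by at most 15 percentage points. The sketch below shows that calculation on made-up marks, not the actual cohort data.

// Illustrative calculation only: fraction of students whose WATS mark and
// exam mark differ by 15 percentage points or less. Marks are made up.
#include <cmath>
#include <iostream>
#include <utility>
#include <vector>

int main() {
    std::vector<std::pair<double, double>> marks = { // {WATS %, exam %}, hypothetical
        {72, 65}, {40, 58}, {85, 74}, {55, 30}, {60, 61}
    };
    int close = 0;
    for (const auto& [wats, exam] : marks)
        if (std::fabs(wats - exam) <= 15.0) ++close;
    std::cout << 100.0 * close / marks.size() << "% within 15 points\n";
    return 0;
}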
Where next? • Build on the lessons learned, with greater emphasis on automating more of the processes. • There now exists a one-stop C++ program for entering the WATS submissions.
More where next? • More analysis of results. • Incorporate automated nudges. • Provides additional student care and individualised contact. • Consider adopting a competence pass threshold. • May help close the learning cycle. • Provide student-unique additional study material. • Match material to individual weaknesses (an illustrative sketch follows below).
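As a rough illustration of the last two ideas, the sketch below applies an invented competence threshold per topic and suggests extra material for any topic a student falls below; the topics, threshold and material names are all hypothetical.

// Hypothetical sketch: apply a competence threshold per topic and suggest
// extra study material for any topic below it.
#include <iostream>
#include <map>
#include <string>

int main() {
    const double passThreshold = 60.0;             // invented competence level, %
    std::map<std::string, double> topicScores = {  // one student's WATS topic scores
        {"Continuity", 75.0}, {"Bernoulli", 45.0}, {"First law", 52.0}
    };
    std::map<std::string, std::string> extraMaterial = {
        {"Continuity", "worksheet_continuity.pdf"},
        {"Bernoulli",  "worksheet_bernoulli.pdf"},
        {"First law",  "worksheet_first_law.pdf"}
    };
    for (const auto& [topic, score] : topicScores)
        if (score < passThreshold)
            std::cout << topic << ": below threshold, suggest "
                      << extraMaterial[topic] << '\n';
    return 0;
}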
Why automate? • Reduce staff time. • Marking rules can be set up once and applied consistently and fairly to every student. • Will help with the move towards a competence pass structure. • Allows implementation without becoming too time consuming. The approach is already likely to be borrowed by an electrical science module.
Conclusions. • The WATS has improved exam performance. • The WATS has improved attendance. • The WATS will help with student retention. • This WATS approach would not have been feasible without exploiting computers.
Conclusions. • The students liked the whole experience. • We will be looking to export this approach to other modules. • There are still some outstanding issues to investigate. • The application of CAA to this module has been a remarkable success.
Acknowledgements. • The authors wish to thank LTSN Engineering for their support of this work.