Personalisation of retention support and the use of MIS systems and data. Jonathan Staal – Student Academic Support Co-ordinator, University of Abertay Dundee.
Background. • Established July 2004, partly in response to a significant student retention / progression problem. • Ostensibly provides study-skills-related support, but with an underlying focus on students’ broader meta-cognitive development. • Approach set by the Widening Participation Strategy, centring on: • the HEFCE student lifecycle model. • the USEM model of graduate qualities.
Implementation. Four strands of delivery, from the ubiquitous to the permeable: • Individual support: self-selected; appointments, drop-ins, email, MSN. • Targeted outreach to students: driven by MIS and service data; email, mailshots. • Visible presence around campus: posters, leaflets, flyers, floor stickers, blog, database. • Teaching classes in collaboration with tutors: commissioned by teaching staff, developed jointly, team teaching.
Repetition & waste. • We no longer have a single student community. • Our students are not simply students. • Students’ engagement with the university is neither homogeneous nor consistent. • We have to consider a multiplicity of lifecycles that are not necessarily linear or synchronised. • These are the main reasons not to use multiple regression analysis of student demographics to target support. • How do we meet such a range of needs when we are not resourced to be all things to all people at all times?
Broad targeting. • Institution-level data: • Targeting students by course, module, fees status, date of registration, level of study and (re)assessment status. • Using that information to provide ongoing support to a range of groups throughout the year.
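A minimal sketch of how such broad targeting might be scripted against an MIS extract; the file name, field names and cutoff date below are illustrative assumptions, not Abertay’s actual schema.

import csv
from datetime import date

# Hypothetical cutoff: students registering after the first week of term
# are treated as "late registrants" who may have missed induction.
LATE_REGISTRATION_CUTOFF = date(2007, 9, 28)

def load_students(path):
    # One row per student in a hypothetical MIS extract, with columns such as
    # course, module, fee_status, registration_date, level, reassessment_status.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def late_registrants(students):
    return [s for s in students
            if date.fromisoformat(s["registration_date"]) > LATE_REGISTRATION_CUTOFF]

def reassessment_students(students):
    return [s for s in students if s["reassessment_status"] == "Y"]

if __name__ == "__main__":
    students = load_students("mis_extract.csv")   # illustrative file name
    print("Late registrants:", len(late_registrants(students)))
    print("Reassessment students:", len(reassessment_students(students)))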
General examples. • Deferred entrants. • Mature entrants. • Direct entrants. • Late registrants. • Service clients. • Summer school / bridging programme participants. • Reassessment students. • Students carrying a module. • Students repeating year. • Withdrawn students.
Specific example. Timeline of articulation support from Highers / HNQ (Terms 1–3) into Degree (Term 1): visits to colleges by course teams; advisors’ visits to colleges; visits by students to the university; Bridging Programme; drop-ins and appointments; Summer School; induction guide; induction week; skills classes.
Narrower targeting. • Service-level data: • Monitoring clients by year group, course, module, problem, time of visit. • Using that information to schedule the following year’s programme of activities and to identify potential problem groups.
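A possible sketch of the service-level side: tallying last year’s visit log by month and problem type so next year’s programme can be timetabled where demand actually occurred. The log format and field names are assumed for illustration.

import csv
from collections import Counter

def plan_next_year(visit_log_path):
    # Count service visits by (month, problem) to guide scheduling of
    # next year's classes, drop-ins and mailshots.
    counts = Counter()
    with open(visit_log_path, newline="") as f:
        for row in csv.DictReader(f):
            month = row["visit_date"][:7]        # e.g. "2007-11"
            counts[(month, row["problem"])] += 1
    return counts.most_common(10)

if __name__ == "__main__":
    for (month, problem), n in plan_next_year("service_visits.csv"):
        print(f"{month}: {n} visits about '{problem}'")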
Next steps: personalisation. • Working efficiently within resource constraints (especially time) to provide individualised support. • Exploiting ICT, but not necessarily ICT-driven. • Enhanced pro-active targeting of students, using data on individual performance at unit-of-assessment level and specially designed survey tools.
Personalisation. • Based on detailed knowledge of students’ strengths and weaknesses. • Mixed-ability support of students’ competence and confidence. • Choice of routes through learning. • Support built around students, rather than students opting in to standard arrangements. • Role for technology.
Modes of personalisation. • Bespoke service: • Individually tailored provision. • Mass customisation: • Students offered a choice from a selection of standard components. • Mass personalisation: • Students’ active participation in choosing and creating content.
Induction quiz. • Guidance at the point of entry. • 8 questions, with 4 different feedback messages for each question. • 4,096 permutations based on students’ selections. • A range of resources recommended in the feedback messages, based on students’ choices. • Captured response data used to target further support information, emailed according to a schedule through semester 1.
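One way the quiz logic could be wired up is sketched below; the question bank, answer keys and follow-up rule are hypothetical stand-ins for the real survey tool.

# Hypothetical question bank: each of the eight questions has four answers,
# each mapped to a feedback message and the resources recommended with it.
QUIZ = {
    "time_management": {
        "a": ("You seem confident planning your time.", []),
        "b": ("A weekly planner may help.", ["planner template"]),
        "c": ("The time-management class is recommended.", ["skills class"]),
        "d": ("Consider booking a one-to-one appointment.", ["appointments"]),
    },
    # ...the remaining seven questions follow the same shape.
}

def feedback_for(responses):
    # Build the personalised feedback page from {question: chosen answer}.
    messages, resources = [], []
    for question, answer in responses.items():
        message, recommended = QUIZ[question][answer]
        messages.append(message)
        resources.extend(recommended)
    return messages, resources

def schedule_followups(responses, student_email):
    # Capture responses so further support can be emailed through semester 1;
    # the rule that answers "c" or "d" trigger a follow-up is an assumption.
    return [{"to": student_email, "week": week, "topic": question}
            for week, (question, answer) in enumerate(responses.items(), start=2)
            if answer in ("c", "d")]

print(feedback_for({"time_management": "b"}))
print(schedule_followups({"time_management": "c"}, "student@example.ac.uk"))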
Individual targeting. • Moving away from service / institution-level data. • Moving to data on students’ actual performance. • Reactive use: • Individuals’ module performance and progression. • Pro-active use: • Individuals’ module performance. • Previous cohort’s module performance.
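A sketch of the pro-active use of a previous cohort’s results: flagging current students on modules where many of last year’s students failed a unit of assessment. The threshold and data shapes are assumptions for illustration.

FAIL_RATE_THRESHOLD = 0.30   # assumed threshold, for illustration only

def risky_modules(previous_cohort_results):
    # previous_cohort_results: {module_code: (unit_fails, cohort_size)}.
    return {code for code, (fails, total) in previous_cohort_results.items()
            if total and fails / total >= FAIL_RATE_THRESHOLD}

def students_to_contact(current_registrations, previous_cohort_results):
    # current_registrations: iterable of (student_id, module_code) pairs.
    flagged = risky_modules(previous_cohort_results)
    return sorted({sid for sid, code in current_registrations if code in flagged})

# Module "AB101" had a 40% unit-failure rate last year, so its current
# students are contacted before the equivalent assessment point this year.
print(students_to_contact([("s1", "AB101"), ("s2", "AB202")],
                          {"AB101": (40, 100), "AB202": (10, 100)}))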
Assessment data. • From module descriptors: identifying the types of assessment and their scheduling within each module. • From student assessment records: identifying failed units of assessment (the type of assessment and the point in the semester), as well as failed modules. • Individualised letters to students and staff, based on the above datasets, generated and emailed automatically.
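The letter-generation step might look roughly like this; the template wording, sender address and record fields are placeholders, not the service’s actual mail-merge.

import smtplib
from email.message import EmailMessage
from string import Template

# Placeholder letter text; the real wording belongs to the service.
LETTER = Template(
    "Dear $name,\n\n"
    "Our records show that you were not successful in the $assessment_type "
    "assessment for $module. Reassessment support sessions run in the week "
    "beginning $support_week; drop-ins and appointments are also available.\n"
)

def send_letters(records, smtp_host="localhost"):
    # records: failed units of assessment joined to module-descriptor details,
    # each with name, email, module, assessment_type and support_week fields.
    with smtplib.SMTP(smtp_host) as smtp:
        for r in records:
            msg = EmailMessage()
            msg["From"] = "study.support@university.example"   # placeholder sender
            msg["To"] = r["email"]
            msg["Subject"] = f"Reassessment support for {r['module']}"
            msg.set_content(LETTER.substitute(r))
            smtp.send_message(msg)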
Summary of actions. • University-wide trends: • Posters. • Broad student category: • Articulation support. • Individual self-assessment: • Interactive survey tools. • Module performance: • Reassessment guide. • Service-user data: • Service schedule. • Unit of assessment data / tools: • Individual reassessment advice.
What next? • ‘Open-source’ materials, inviting student input to content: an explicit acknowledgement that we don’t hold ‘ultimate truths’. • Interactive multimedia support resources. • Your examples?