Enhancing the student experience: taking a whole university approach to improving satisfaction (and NSS outcomes)
Professor Sally Brown
The problems experienced in a large metropolitan university
• Downward-drifting NSS scores from a previously middling position (e.g. scores on Bank 2, Assessment and Feedback, fell from 70 in 2005 to 56 in 2009), together with other indicators of low student satisfaction;
• Although there was praise in some areas, a QAA institutional audit had recorded one area of limited confidence in the quality assurance processes of the university;
• Low morale across the university affecting staff, managers and governors;
• Considerable financial challenges.
Particular problems highlighted by the NSS
• High levels of dissatisfaction with the quality of assessment, and particularly with the speed and depth of feedback;
• Perceived lack of enthusiasm for teaching by academic staff;
• Concerns about cancelled classes and slow responses to queries, particularly to emails;
• Students lacked a clear understanding of what they could expect from the university and what was expected of them;
• Students' perception that they rarely received a response telling them what had been done as a result of issues identified in previous years' negative comments, particularly in module evaluations.
Setting changes in place
Marshall and Massy argue that when leading in turbulent times, the first step to be taken is to: "create a sense of urgency about the crisis. While it's easy to scare people, the aim is to at the same time present a plan about how, by doing things differently, the university can break the momentum taking it in the wrong direction and work its way out of the problem. The key is to create a sense of urgency without instilling a feeling of hopelessness" (p. 68)
The ground rush effect?
"Although changes may seem to come upon us without warning, experience shows this is rarely the case. Unfortunately we often disregard or misinterpret the signals of change. We tend to spend our time on issues we perceive to be most important right now; we fail to scan our surroundings for changes that are in the early stages of development. The flood of problems that forces us into crisis management makes concern for emerging issues appear to be a luxury. It is not. It is a necessity." (Renfro & Morrison 1983)
Gaining ownership of the process
Jethro Newton (2003), writing about implementing an institution-wide learning and teaching strategy in a not-dissimilar university, suggests: "The more strategy in this area comes to be received as being prepared to meet external requirements, the less it will gain the acceptance necessary for implementation."
Professor Sir David Watson (in Brown and Denton 2010) argues that 'Institutional strategic choice and decision making should ideally come from all members of the university community, having, of course, consulted appropriately outside'.
So what did we do?
• The new CEO indicated that improving student satisfaction was a top priority and gave authority to the implementation of university-wide approaches;
• Staff across the university started working in concert to address the issues and to remove inconsistencies of practice between faculties;
• An NSS steering group was established (co-chaired by the PVC Academic) with representation from all faculties, Registry, Estates, the Library, IT services and, importantly, students;
• Progress was reviewed regularly at Academic and Faculty Boards.
Change agents
• Staff across the university;
• Teacher Fellows;
• Associate Deans (Assessment, Learning and Teaching);
• Senior managers including the PVC (Academic);
• The NSS Steering Group, Faculty Boards, Quality Enhancement Committee and Academic Board;
• The Students' Union.
Changing the culture
• The major change brought about was a shift in orientation across the university towards satisfying students.
• The institution took external advice from PVCs who had achieved considerable improvements in their own universities' NSS scores. They emphasised that short-term reactive activities and concentration on the survey alone would not bring about improvements; instead they argued for a long-term view involving changes in attitude as well as practical changes.
• The seriousness of the situation the university faced meant that there was a clearly expressed will across the organisation to bring about deep-seated changes, carried forward consistently, if the effect were to be lasting.
Building on positive outcomes
• An area of provision that scored consistently well in many of the surveys was the Library. The NSS steering group was keen to capitalise on this positive experience and sought guidance from Library staff on which kinds of actions were leading to high levels of satisfaction. Where possible, positive practice was transferred to other areas of provision.
• The institution emulated across the university the Library's successful "You said: we did" campaign to let students know what action had been taken in response to their suggestions.
• In the lead-up to the NSS in 2010, this campaign was widely used across the university: electronic notice boards and posters highlighted positive changes that had been made, in addition to clear follow-up information provided in course committees and institution-wide student forums.
Avoiding questionnaire fatigue
• Following concerns that students were being asked too frequently to complete surveys, a collective decision was made at senior management level that no written surveys other than the NSS would be used with final-year students, except those required by Professional and Subject Bodies (some of whom expected module evaluations to be carried out each semester).
• The Students' Union agreed to exempt final-year students from its Annual Student Survey. Where course teams were keen to ensure they received feedback on the first module of the final year, they were advised to use other informal means of collecting data, such as open meetings, or to evaluate the two final-year modules together at the end of the year.
Zero tolerance on cancellation of classes
• Although in 2008-09 relatively few classes were cancelled, senior managers agreed that in 2009-10 the university would ensure that classes were postponed or rescheduled rather than cancelled on those occasions where it was not possible for them to take place at the original time.
• The university investigated and piloted a system to communicate with students about deferred classes and other important matters by text as well as by other means, since very high proportions of students carry mobile phones with them.
• This system was due to be rolled out fully in the academic year 2010-11.
Mutual expectations document
• One of the faculties had piloted a very successful document setting out exactly what students could expect from the university (for example, in relation to timely provision of timetables and notification when classes had to be deferred) and what would be expected from them reciprocally, for example that they would attend and participate in classes, be punctual and submit work to deadlines.
• After discussion, all faculties agreed to use this as a template for their own faculty mutual expectations document for that year, although this was being reviewed in the light of potential sector guidance from the UK charter group and also in light of what came out of the Browne Review.
• The terms 'student charter', 'student contract' and 'compact' were deliberately avoided, as the documents were not intended to be regarded as legally enforceable agreements.
Recruiting the right staff for teaching
• The institution focused more closely on ensuring that applicants for teaching posts could demonstrate the capacity to teach effectively or, for those new to higher education, the potential to do so.
• The university strategy states: "We are committed to employing enthusiastic and inspiring academic and support staff who embrace opportunities for professional development to enhance our students' experiences."
• A group comprising Teacher Fellows and colleagues from HR worked together to design a range of means of interrogating this capability in the application and interview process, for example by asking applicants to comment on a video of a teaching session, to draft some curriculum materials or to assess some assignments.
• The approach to be used is still under consideration, but it was agreed that interview panels for such posts would include experts in teaching, for example Teacher Fellows or Associate Deans (ALT).
Training for new academic staff
• The university already had a policy that all staff teaching full time, or for at least 50% of the week, should take the Postgraduate Certificate in Higher Education Teaching and Learning. This was reinforced strongly, and we asked AD (ALT)s to monitor take-up, with no exemptions.
• The PGCHE course had as its first component in September a three-day block entitled 'In at the deep end', designed to give first-level guidance to staff facing their first classes and assessment tasks; this was offered additionally in January and June to ensure no staff member had to wait too long for this support.
• The three-day block was opened up to people teaching less than 50% of their time and was mandated for research students who taught, even for only a few hours.
• The guidance booklet accompanying the three-day block was widely distributed to sessional staff who taught only occasionally and for whom the three-day block was inappropriate.
Peer observation of teaching
• We implemented fully a previously agreed peer observation of teaching system, designed to encourage conversations among teachers about good classroom practice.
• The expectation that all teaching staff observe and are observed teaching each semester was strongly reinforced by faculties, and compliance was monitored by the NSS steering group.
• Over the course of a year, the proportion of teaching staff who met university requirements in relation to peer observation of teaching rose from around 20% to at least 75%.
• A guidance booklet (Race et al 2009) was provided to all academic staff, outlining a range of approaches and templates that could be used.
• Formal records of the conversations were not required; however, during annual PDRs, teachers were invited to discuss what they had learned so that this could feed into staff development.
Feedback and assessment: the major challenges
• This was the area that students highlighted most strongly as a matter of dissatisfaction, in common with trends nationally and internationally.
• Overall (with the honourable exception of some programmes), students were unhappy about the quality and nature of feedback and particularly about what they saw as unacceptably slow return of work.
• The university already had a policy of returning assessed work with comments within three weeks, but this was not universally achieved.
• Over the year in question, extensive work was undertaken within faculties to convince staff involved in assessing student work that feedback needs to be timely, detailed and formative. It was recognised that this is difficult to achieve, especially where staff are marking high numbers of scripts.
Our approach to improvement was informed by, inter alia, Nicol's work on good feedback practice, which:
1. Helps clarify what good performance is (goals, criteria, expected standards);
2. Facilitates the development of self-assessment (reflection) in learning;
3. Delivers high-quality information to students about their learning;
4. Encourages teacher and peer dialogue around learning;
5. Encourages positive motivational beliefs and self-esteem;
6. Provides opportunities to close the gap between current and desired performance;
7. Provides information to teachers that can be used to help shape the teaching.
(Nicol and Macfarlane-Dick 2006)
The Assessment for Learning movement was also considered:
1. Tasks should be challenging, demanding higher-order learning and integration of knowledge learned both in the university and in other contexts;
2. Learning and assessment should be integrated; assessment should not come at the end of learning but should be part of the learning process;
3. Students are involved in self-assessment and reflection on their learning; they are involved in judging performance;
4. Assessment should encourage metacognition, promoting thinking about the learning process, not just the learning outcomes;
5. Assessment should have a formative function, providing 'feedforward' for future learning which can be acted upon. There should be opportunity and a safe context for students to expose problems with their study and get help, and an opportunity for dialogue about students' work;
Assessment for Learning 2
6. Assessment expectations should be made visible to students as far as possible;
7. Tasks should involve the active engagement of students, developing the capacity to find things out for themselves and learn independently;
8. Tasks should be authentic: worthwhile, relevant and offering students some level of control over their work;
9. Tasks should be fit for purpose and align with important learning outcomes;
10. Assessment should be used to evaluate teaching as well as student learning.
(Assessment Reform Group 1999)
Assessment for Learning: see http://www.northumbria.ac.uk/sd/central/ar/academy/cetl_afl/
• Emphasises authenticity and complexity in the content and methods of assessment rather than reproduction of knowledge and reductive measurement;
• Uses high-stakes summative assessment rigorously but sparingly rather than as the main driver for learning;
• Offers students extensive opportunities to engage in the kinds of tasks that develop and demonstrate their learning, thus building their confidence and capabilities before they are summatively assessed;
• Is rich in feedback derived from formal mechanisms, e.g. tutor comments on assignments, student self-review logs;
• Is rich in informal feedback, e.g. peer review of draft writing and collaborative project work, which provides students with a continuous flow of feedback on 'how they are doing';
• Develops students' abilities to direct their own learning and to evaluate their own progress and attainments.
Improvements we made to feedback on assessed work
• Workshops led by Teacher Fellows in faculties and subject groups on how to give feedback effectively and efficiently.
• The production of a booklet on using assessment to support student learning by an international pedagogic expert (Gibbs 2010), who worked with staff of the university to elicit and include local examples of good practice. All teaching staff received a copy, and conversations on issues raised within it were encouraged within faculties and course teams.
• Publications for students on assessment and how to make the best use of feedback were produced and distributed widely. One of these publications was linked to a National Teaching Fellowship-funded project on assessment, while others were more generic.
Further actions to improve feedback
• A JISC-funded project on creating digital audio files to give feedback orally, which could be emailed to students, was led by Bob Rotheram, Reader within the Assessment, Learning and Teaching team, who widely disseminated the project outcomes demonstrating that students take feedback seriously and use it on multiple occasions when it is provided in this form (Rotheram 2009).
• International experts on oral assessment and formative feedback undertook short residencies in the university and led workshops and seminars on improving assessment. These experts included Royce Sadler, David Boud and Gordon Joughin.
• Associate Deans (ALT) in each faculty led very rigorous monitoring of return rates for assessed work and reported these to the NSS steering group, as well as developing some good practice proformas.
Making the changes
• In addition to the feedback improvements above, discussions were held with union representatives during which issues concerning workload were raised, with resultant action by HR staff in relation to deployment.
Actions in parallel to improve quality assurance
• An extremely thorough review of processes and practices across the university was undertaken, including a complete revision of academic regulations to remove inconsistencies and areas of potential ambiguity.
• This was supportive of efforts to improve the student experience, since it ensured that information to students was more transparent and accessible than previously.
• In particular, assessment regulations were simplified in places, and consistent documentation in relation to mitigating circumstances and appeals was made available via the web for students to scrutinise as necessary.
Preparations for the NSS process in 2010
• The NSS projects team worked hard to ensure high and early rates of response, to achieve the most representative sample possible, and to monitor and follow up areas with low return rates.
• The UK NSS has rigorous guidelines to discourage rogue HEIs from putting undue pressure on students to respond positively to the survey ("Would you want to get a degree from a university with low ratings?" being the kind of approach that is frowned upon), so endeavours necessarily focused on ensuring participation in the survey rather than guiding the kinds of responses students gave.
• The university chose as the opening date for the survey the earliest date by which it could be confident that students would have returned following the inter-semester break.
• Incentives, including data sticks and credits for university copiers, were offered to students on receipt of the email from the NSS thanking them for participating.
And the results in 2010?
• The results of the 2010 NSS were significantly improved on the previous year.
• On Question 22, 'Overall, I am satisfied with the quality of the course', there was an 8% improvement on the previous year's score, which brought the institution better into line with its benchmark institutions, with five courses scoring 100% on this question.
• On Bank 2 (Assessment and Feedback), scores rose from a low of 56 in 2009 to 62 in 2010, against a national average of 62; there was particular satisfaction in seeing a better score for Question 7, 'Feedback on my work has been prompt', with an improvement from 2009 to 2010 of 11%.
• Similarly, scores for Question 15, 'The course is well organised and is running smoothly', rose by 12%.
Free response comments included:
• "Third years tutors have been really great, a lot more support given to us than in previous years";
• "In the past couple of years I don't think the tutors have been very communicative, but this year has greatly improved";
• "The course is most definitely improving";
• "The course has improved drastically and become much more organised";
• "There have been some positive changes due to the course leader taking on board things that students aren't happy with".
But no room for complacency
• Considerable further work still to do to keep up the momentum;
• Some of the improved scores still lag behind benchmark HEIs;
• A new Student Experience Steering Group has been established, chaired by the incoming DVC (Student Experience), which will have responsibility for determining university priorities in response to the results of the NSS and also in relation to other forms of student feedback collected throughout the year.
Conclusions
• The changes made to improve the student experience are important in their own right, not just as a response to poor NSS scores;
• It is possible to make significant improvements, but it needs a strategic approach, ideally at institutional level;
• Strategic approaches aren't worth a fig if individual staff don't embrace the need to improve student satisfaction;
• Doing the same things that have always been done, in the same way they have always been done, is doomed to failure.
References 1
Assessment Reform Group (1999) Assessment for Learning: Beyond the Black Box. Cambridge, UK: University of Cambridge School of Education.
Brown, S. and Denton, S. (2010) 'Leading the university beyond bureaucracy' in Denton, S. and Brown, S. (eds.) A Practical Guide to University and College Management. New York and London: Routledge.
Browne, J. (2010) Securing a Sustainable Future for Higher Education. www.independent.gov.uk/browne-report
Gibbs, G. (2010) Using Assessment to Support Student Learning. Leeds: Leeds Metropolitan University.
Jones, J. (2010) 'Building pedagogic excellence: learning and teaching fellowships within communities of practice at the University of Brighton', Innovations in Education and Teaching International, 47(3), 271-282. Abingdon: Routledge/Taylor and Francis.
Marshall, P. and Massy, W. (2010) 'Managing in turbulent times' in Forum for the Future of Higher Education: Papers from the 2009 Aspen Symposium. Cambridge, MA: Massachusetts Institute of Technology.
Newton, J. (2003) 'Implementing an institution-wide learning and teaching strategy: lessons in managing change', Studies in Higher Education, 28(4).
Nicol, D. J. and Macfarlane-Dick, D. (2006) 'Formative assessment and self-regulated learning: a model and seven principles of good feedback practice', Studies in Higher Education, 31(2), 199-218.
References 2
Pascale, R. (1990) Managing on the Edge. New York: Touchstone.
Race, P. et al. (2009) Using Peer Observation to Enhance Teaching. Leeds: Leeds Metropolitan Press.
Renfro, W. L. and Morrison, J. L. (1983) 'Anticipating and managing change in educational organisations', Educational Leadership. Association for Supervision and Curriculum Development.
Robertson, C., Robins, A. and Cox, R. (2009) 'Co-constructing an academic community ethos: challenging culture and managing change in higher education, a case study undertaken over two years', Management in Education, 23(1). London: Sage.
Roxa, T. and Martensson, K. (2009) 'Significant conversations and significant networks: exploring the backstage of the teaching arena', Studies in Higher Education, 34(5), 547-559.
Rust, C., Price, M. and O'Donovan, B. (2003) 'Improving students' learning by developing their understanding of assessment criteria and processes', Assessment and Evaluation in Higher Education, 28(2), 147-164.
Wisker, G. and Constable, J. (2005) Fellowship and Communities of Practice. SEDA / Anglia Ruskin University, UK.