ARTICULATING AND COMPARING STANDARDS THROUGH BENCHMARKING OF ASSESSMENT • Deakin University Ms Heather Sainsbury • University of Tasmania Dr Sara Booth • University of Wollongong Ms Anne Melano | Ms Lynn Woodley
Dr Sara Booth University of Tasmania
CONTESTED SPACE: STANDARDS AND BENCHMARKING
• Argument 1 – Implicit: standards in universities are self-monitoring and self-regulating; explicit standards would mean uniformity – one size fits all, a national curriculum
• Argument 2 – Explicit: explicit standards mean diversity, substance, accountability and transparency; they are a basis for comparison and collaboration
• Universities need to become more explicit in comparing standards. To do this: make explicit the definition of standards used, and the definition of benchmarking used
• 5 sets of sector standards (DEEWR & TEQSA): Provider Registration, Provider Category, Qualification (AQF), Information, Teaching and Learning, Research
• Sets of academic standards – a contested space, including professional standards (e.g. teaching standards), quality assurance, minimum threshold (what is achieved), aspirational and student achievement standards (Carmichael, 2010)
• TEQSA's discussion paper on Teaching and Learning Standards (July 2011): learning/teaching standards, the role of TEQSA, the role of universities
• The definition of benchmarking varies across the sector
BENCHMARKING AS A PROCESS FOR IMPROVEMENT THROUGH COMPARISON OF STANDARDS Jackson and Lund (2000, cited in Stella & Woodhouse, 2007, p. 14) define benchmarking as ‘first and foremost, a learning process structured so as to enable those engaging in the process to compare their services/activities/products in order to identify their comparative strengths and weaknesses as a basis for self improvement and/or self regulation’. Agreed points of comparison – Deakin, UOW, UTAS • Three Cycle 1 AUQA audits specified more benchmarking • Comparable institutions – age, structure, regional presence, disciplines • Benchmarking awareness and confidence at a similar level
UNIVERSITIES ARE AT DIFFERENT STAGES OF DEVELOPMENT TOWARDS BENCHMARKING
Ms Heather Sainsbury Deakin University
ASSESSMENT BENCHMARKING –CASE STUDY OF A SUCCESSFUL PARTNERSHIP Planning • Establishing the benchmarking partnership • Agreement on area and scope • Planning for success Implementation • Communicating with faculties • Streamlining the process • Putting it together
THE BENCHMARKING PARTNERSHIP Success factors • Shared understanding of benchmarking goals • High level of trust • Willingness to share information and discuss successes and failures • Similar enough to offer transferable strategies • Comparable commitment • Sustained commitment • The more partners there are, the harder it gets • Communication and flexibility are the keys to success
AGREEMENT ON AREA AND SCOPE What to benchmark? • Catalyst for assessment project – 2009 AUQF in Alice Springs • Paper by Linda Davies (Griffith Uni) on ALTC Teaching Quality Indicators Project – external reference point • Shared commitment to review assessment practice in the lead up to our respective AUQA audits in 2011 • Potential to deliver significant benefits to all three universities • Support from relevant Executive and other leaders critical
AGREEMENT ON AREA AND SCOPE Agreement on scope • Careful scoping through a collaborative process involving senior academic and quality leaders from each university • Time period • Coverage – undergraduate but excluding Honours • Focus on standards – assessment design not covered • Agreement on data to be shared • Make sure that you are talking about the same thing – different terminology is a potential barrier • Take the time to get it right…
AGREEMENT ON AREA AND SCOPE Agreement on scope • Keep sight of the main objective
PLANNING FOR SUCCESS Agreement on methodology • Derived from existing successful methodology - ACODE Benchmarking Framework (2007) • Self-review by each partner • Peer review • Action plans (shared) • Adapted indicators and measures developed through TQIP project • Tested against literature on good practice, expert reviewers and academic leaders at each university • Agreement reached on: • Performance indicators • Good practice statements • Performance measures • Trigger questions
PLANNING FOR SUCCESS Agreement on performance indicators and measures PI #1: Assessment purposes, processes and expected standards of performance are clearly communicated and supported by timely advice and feedback to students Good Practice Statement: Students receive clear and timely information on the aims and details of assessment tasks; marking and grading practices; expected standards of achievement; and requirements for academic integrity. They are provided with timely feedback on their performance and supported in making improvements. Performance measures: 1.1 Expectations are clearly communicated 1.2 Advice and feedback are provided Trigger questions under each measure
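To make the template hierarchy on this slide concrete – each indicator carries a good practice statement, performance measures and trigger questions, with a rating agreed per measure – here is a minimal sketch of that structure in Python. It is purely illustrative: the class and field names are assumptions made for the sketch, and the project's actual templates were documents, not software.

```python
from dataclasses import dataclass, field

# Illustrative only: the project's templates were documents, not code.
# All class and field names below are assumptions made for this sketch.

@dataclass
class Measure:
    code: str                              # e.g. "1.1"
    description: str                       # e.g. "Expectations are clearly communicated"
    trigger_questions: list[str] = field(default_factory=list)
    evidence: list[str] = field(default_factory=list)  # each rating must be evidenced
    rating: int | None = None              # agreed by the group, not averaged

@dataclass
class PerformanceIndicator:
    number: int
    statement: str                         # the indicator itself
    good_practice: str                     # the agreed good practice statement
    measures: list[Measure] = field(default_factory=list)

# PI #1 from the slide, encoded in this structure:
pi1 = PerformanceIndicator(
    number=1,
    statement=("Assessment purposes, processes and expected standards of "
               "performance are clearly communicated and supported by "
               "timely advice and feedback to students"),
    good_practice=("Students receive clear and timely information on the aims "
                   "and details of assessment tasks; marking and grading "
                   "practices; expected standards of achievement; and "
                   "requirements for academic integrity."),
    measures=[
        Measure("1.1", "Expectations are clearly communicated"),
        Measure("1.2", "Advice and feedback are provided"),
    ],
)
```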
PLANNING FOR SUCCESS Agreement on self-review templates
PLANNING FOR SUCCESS Agreement on timelines • Build in flexibility for partners to move at slightly different speeds at different times, while still all meeting critical common dates: • Finalising templates • Completion of self-reviews and sharing of self-review reports • Peer review workshops • Contributions to shared reports • Accommodate internal deadlines of partners wherever possible (key committee dates, AUQA deadlines)
IMPLEMENTATION Communicate with faculties • Prepare a communication plan • Consider the culture – e.g. UOW is very consultative, with very engaged faculty T&L chairs • Hold a high-level briefing – establishes importance, brings faculty leaders together • Hold informal one-on-one meetings – answer questions and address concerns • Don't rush – do invite comments on documents and processes – this builds ownership • Send out updates as the project progresses • Thank and acknowledge along the way
IMPLEMENTATION Provide support • Appoint a project coordinator • Encourage faculties to identify a person to support the faculty leader • Offer funding or admin assistance if possible • Provide a clear guide to the process • Provide data packs • Offer draft emails, information sheets etc. that faculties can send to staff • Attend faculty self-reviews – helpful, as questions of interpretation do arise
IMPLEMENTATION Streamline the process • Faculties are time poor – risk of backlash if time contributed is not rewarded by benefits • Clear, realistic timeline and expectations • ONE self-review meeting in each faculty – if you put together the right people, most questions can be answered • ONE template to work through – all questions clearly set out • Simple rating scale • As much as possible of the template completed in that meeting • A rating on each measure MUST be agreed by the group – otherwise there is no clear result • A similarly streamlined process for institutional reviews and for the peer review across the three universities
IMPLEMENTATION But it does need rigour… • Question design based on: • Griffith ALTC project, additional work by Boud, advice from Joughin, testing in a faculty • Evidence: • has to be provided to support each rationale/rating • collecting this is a major effort by faculty leaders and their admin assistants • survey conducted at UTAS – valuable, and can be done centrally • all evidence checked centrally
IMPLEMENTATION Sharing • At each level, encourage the conversations – these can be just as important as the project outcomes. Good practice sharing, questioning and problem solving occur naturally – let them • Faculties aren't mediaeval castles – encourage interaction • UOW – each faculty leader sat in on another's self-review • Deakin – four Associate Deans (T&L), very collegial • Avoid the ‘black hole of benchmarking’: reward evidence-gathering by selecting and disseminating good practice
IMPLEMENTATION Putting it together – the institutional self-review • Faculty reports combined into an institutional report • All leaders brought together • Agreement on institutional rating, good practice and gaps/issues • Discussion of each measure with top issues agreed – these form the basis of an action plan for the future
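As a sketch of the roll-up step described on this slide – faculty reports combined, then an institutional rating agreed in discussion – the snippet below shows one way divergent faculty ratings could be surfaced for that discussion. It assumes the hypothetical structure sketched earlier; the helper name and faculty ratings are invented for illustration, and in the project itself this step was a facilitated meeting, not a computation.

```python
def flag_for_discussion(faculty_ratings: dict[str, int]) -> bool:
    """Flag a measure when faculty self-review ratings diverge
    by more than one point on the simple rating scale."""
    ratings = list(faculty_ratings.values())
    return max(ratings) - min(ratings) > 1

# Hypothetical ratings for measure 1.1 (faculty names invented for the sketch)
measure_1_1 = {"Arts": 3, "Science": 4, "Business": 2, "Law": 4}

if flag_for_discussion(measure_1_1):
    print("Measure 1.1: ratings diverge – discuss before agreeing "
          "an institutional rating")
```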
IMPLEMENTATION Putting it together – the three-university peer review • Face-to-face if possible • Selection of leaders brought together • Icebreakers, time to mingle • Template provided to work through – each institution's results and ratings on each measure • Review of institutional ratings • Discussion of good practice and gaps/issues • Expect surprises! You may be doing better than you think… • OR your ‘best practice’ may be just ‘ho-hum, that's what everyone is doing’!
KEY OUTCOMES THE PROCESS Using and sharpening the tools: • What works and what doesn't • The broad indicators of the Griffith TQIP project (Davies, 2009) • The ACODE Benchmarking Framework • Templates – the Pollard Rating Index: "No but yeah but no but yeah but no but..." • Killing two birds: making the most of the project • Benchmarking logistics: checking the steps and the flight plan • Escaping the black hole – the action plan • Becoming a toolmaker
KEY OUTCOMES THE PROCESS Collegial partnerships • Institutional: self-review activity; cross-faculty bonds • Cross-university: co-ordinators, executive and academic staff • A mutual learning process for all involved
KEY OUTCOMES THE TOPIC Assessment – Standards at work: • The academic standards trinity: Learning Outcomes, Assessment, Graduate Qualities • An "academic" exercise in definition or a "real world" definition – how do academics set, monitor and review standards? • Uniformity vs Quality and Good Practice
KEY OUTCOMES Assessment – Good Practice and Quality Improvement: • Insights and ideas from the practices of others • Good practice and areas for improvement for each faculty and each university What we do well: • For example: Deakin – Online Unit Guide; UTAS – criterion-referenced assessment (CRA) supported by faculty champions; UOW – educative focus of the Academic Integrity Policy What we needed to do better: • Connecting learning outcomes, Graduate Attributes/Qualities and Assessment (the crux of academic standards) • Staff development (incl. sessional staff) • Marking practices for group work • Use of best practice models • Benchmarking at the course/program level (Oliver, 2009)
KEY OUTCOMES ‘first and foremost, a learning process structured so as to enable those engaging in the process to compare their services/activities/products in order to identify their comparative strengths and weaknesses as a basis for self improvement and/or self regulation’. Did we achieve the Project Aims? • Compare processes within faculties, across each university and across the three universities. • Compare the effectiveness of Academic Boards/Senates in performing their role in policy and standards, across the three universities. • Identify good practice and areas where improvements can be made for the benefit of students and staff at each university. • Develop and share knowledge and experience between the three benchmarking partners about the process of benchmarking. Your rating? "No but yeah but no but yeah but no but..."