
Assessment and feedback: less is more?


Presentation Transcript


  1. Assessment and feedback: less is more? Peter Hartley, Professor of Education Development, University of Bradford p.hartley@bradford.ac.uk http://www.brad.ac.uk/educational-development/aboutus/team/Full_details_27414_en.php National Teaching Fellow Visiting Professor, Edge Hill University

  2. Elluminate protocols • Use of mics/headsets. • Please use the text box for links and questions/comments. • Use the smiley to indicate ok. • Please raise your hand if you want to make an audio comment. • We’ll try polling later this session. • This session is recorded.

  3. This session • Why bother? What’s the problem? • Four potential solutions: • Developing strategic approaches. • Changing processes and systems. • Reconsidering feedback. • Taking advantage of new technology.

  4. And the details • Developing more strategic approaches. • Programme-based assessment (PASS project). • Comparing assessment environments (TESTA project). • Applications and implications, e.g. work at Brunel. • Changing processes and systems. • Examples from the Curriculum Design Programme. • Reconsidering feedback. • Taking advantage of new technology. • Audio feedback. • Video and audio combinations. • Clickers and response systems. • Adaptive systems. • Integrating systems and mobile applications.

  5. A word on ‘costs’ • Should not be short-term ‘penny-pinching’ • Should be: • Making better use of limited staff time. • Helping students to perform to their capacity. (NB impact on retention) • Increasing focus on ‘assessing the important’.

  6. Assessment: multi-purpose and multi-audience

  7. Issues in assessment: what’s bothering you?

  8. Issues in assessment: what’s bothering you? • Managing large classes • Return on time and effort by staff? • Are we using the right mix of methods? • Student anxiety • Getting students to act on feedback • Reliable technology? • Do staff understand the technology available? • Fit for purpose? • Can online assessment deal with complex tasks? • Is our feedback timely? • How do we reliably assess open-ended tasks? • Balancing freedom and control • Aligning assessment and teaching

  9. Strategy

  10. Programme-based assessment: PASS • NTFS group project over 3 years: • Two years of development and investigation and one year of implementation. • Consortium: • Led by Bradford; • 2 CETLs – ASKE and AfL. • Plus Exeter, Plymouth and Leeds Met. • Plus critical friends.

  11. What are we investigating? How to design an effective, efficient, inclusive and sustainable assessment strategy that delivers the key course/programme outcomes.

  12. Why investigate this? • Consider the elements buried in the project aim:

  13. Learning from interesting friends

  14. TESTA project • NTFS group project with 4 partners: ‘aims to improve the quality of student learning through addressing programme-level assessment.’ • Starting from an audit of current practice on nine programmes: • surveyed students using focus groups and the AEQ (Assessment Experience Questionnaire – Graham Gibbs et al.) • also using a tool to identify programme-level ‘assessment environments’ (Gibbs)

  15. Consistent practice? ‘Characterising programme-level assessment environments that support learning’ by Graham Gibbs and Harriet Dunbar-Goddet. Published in: Assessment & Evaluation in Higher Education, Volume 34, Issue 4, August 2009, pages 481-489

  16. The ideal number of assessment methods? • 2 • 6 • 10 • 15

  17. Total number of words in assessment feedback? • 1,000 • 3,000 • 5,000 • 10,000

  18. Data from TESTA

  19. The need for strategy • An example finding from Gibbs • ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’ • And what did make a difference?

  20. The need for strategy • An example finding from Gibbs • ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’ • And what did make a difference? • Formative-only assessment; • More oral feedback; • Students ‘came to understand standards through many cycles of practice and feedback’.

  21. Also look out for: NB Event on June 22.

  22. PASS outputs to date • General literature review. • Students’ view of assessment strategies. • Assessment issues. • Medical school case study. • Inclusive assessment. • Survey of staff attitudes.

  23. Outputs in progress • Assessment types at professional level. • Survey of practice across the UK & international perspective. • Further case studies. • ‘Manifesto’/position paper.

  24. Issues to disentangle include: • Defining PBA. • Assessment environments & impact. • Effective regulatory frameworks. • Staff perspectives and workload. • Student perceptions and expectations. • How to develop an effective strategic approach. • Grading and credit (and the ‘best’ relationships between them).

  25. Defining assessment: a challenge • Programme outcomes “need to be assessed in complex, multidimensional student performances” (Rogers, Mentkowski, & Reisetter Hart, 2006, p. 498). • How do students define/perceive their performance? • e.g. what makes you a ‘First Class Engineer’?

  26. Student perceptions and concerns • perceptions of ‘the course’ are variable; • assessment experienced as ‘fragmented’. BUT • anxieties re move to more integrated assessment – perceived risk in terms of performance; • concerns about feedback and timing.

  27. Searching for types

  28. Searching for types

  29. An example: Peninsula Medical School • Case study already available. • Includes: • four assessment modules that run through the 5-year undergraduate medical programme and are not linked directly to specific areas of teaching. • focus on high-quality learning (Mattick and Knight, 2007).

  30. Further case studies being explored • Brunel • New regulations which separate study and assessment blocks. • Liverpool Hope • New regulations which ‘abandon modules’ in all undergraduate programmes. • ‘Key Honours Assessment’.

  31. Brunel: the regs • 120 credits per year of study. • Course/programme can include a mix of study, assessment and modular blocks. • Option blocks must be modular. • Blocks must be in multiples of 5 credits. • Maximum assessment block is 40 credits.

  32. Examples from Brunel • Biomedical Sciences • Study and assessment blocks in all years. • Cut assessment load by 2/3rds; generated more time for class contact. • Synoptic exam in all three years.

  33. Examples from Brunel • Biomedical Sciences • Study and assessment blocks in all years. • Cut assessment load by 2/3rds; generated more time for class contact. • Synoptic exam in all three years. • Mathematics • Conventional modules in final year only. • Improved understanding and ‘carry-over’ of ‘the basics’ into year 2.

  34. And finally on PASS … • Visit the web site: • www.pass.brad.ac.uk • Contact us at: • pass@bradford.ac.uk

  35. Processes and systems

  36. Examples from the Curriculum Design Prog. • eBioLabs (Bristol) • By combining interactive media with formative self-evaluation assessments, students learn the methods and techniques they will use in the lab without risking valuable time, equipment or materials. Because students first experiment online, there is a reduced chance of cognitive overload during the practical, and they are more able to concentrate on the wider aims of the experiment rather than blindly following the lab instructions. • Because eBioLabs includes tools that automatically mark student assignments, and tools that allow academics to easily track student attendance and achievement, the marking and administrative burden associated with running practicals is very significantly reduced.

  37. Curriculum Design • CASCADE project (Oxford) • Online assessment submission • ‘Students can now submit assignments much more easily at any time from anywhere in the world. It is also possible to predict significant efficiencies in assignment handling time for the Registry staff who deal with student submissions for approximately 260 course assignments across 48 course cohorts a year: a saving of 30 minutes or more per assignment soon cumulates into savings in the order of half a day per week. Other advantages of the new online system are the reduction in paper handling and photocopying, as well as better auditing and control. Reduction in paper storage is a further advantage, both in terms of less physical space being required and also in terms of less staff time being required to retrieve data from the archive.’

  38. Curriculum Design • ESCAPE project (Hertfordshire) • Effectiveness vs efficiency. (watch the video)

  39. Feedback

  40. The importance of feedback: an example to start … ‘59% Excellent.’ • This was the only tutor comment on a student assignment. How do you think the student reacted and felt?

  41. Assessment is a problem: feedback is part of it. • See the PASS Project Issues Paper • Please comment/feedback and use. • http://www.pebblepad.co.uk/bradford/viewasset.aspx?oid=260486&type=file • Would highlight: • Assessment ‘drives and channels’. • What/why are we measuring: the ‘slowly learnt’ problem. • Limitations of grading (e.g. marks are not numbers). • Implications for course structures/regulations.

  42. Assessment tools: the meaning of feedback • Can we not ‘recapture’ the ‘original’ meaning of feedback: enabling self-correcting behaviour towards a known goal? • This means rediscovering the ‘feedback loop’ whereby information must be ‘fed back’ so that it: • relates to the goal. • is received. • is correctly interpreted. • enables corrective action. cf. the work of Royce Sadler in Higher Education, e.g. http://www.northumbria.ac.uk/sd/central/ar/academy/cetl_afl/earli2010/themes/rsadler/

  43. New technology

  44. Example 1: audio • The ASEL project • led by Bradford with Kingston as partner. • various uses of audio, including feedback, in different disciplines. • Noted: • Technology is now easy and accessible. • Positive student reactions. • Different tutor styles and approaches. • A different form of communication? • Serendipity – e.g. feedback stimulated podcasts.

  45. ASEL main conclusions • … audio is a powerful tool, providing opportunities for personalising learning, promoting greater student engagement, and encouraging creativity. • In introducing audio into their practice, lecturers were required to rethink their pedagogical approaches and learning design, adopting new and innovative ways to enable students to be more actively involved in the learning process. • It allowed lecturers to provide more personal and richer feedback to students, and increased the level of interaction and dialogue amongst students and between students and lecturers. • (Stewart and Dearnley)

  46. Example 2: audio and video • Growing number of examples. • ALT/Epigeum Awards 2010: see the ALT Open Access Repository • See the winning entry by Read and Brown from Southampton: • Organic Chemistry. • Use of tablets to show solutions and working. Focus on self-assessment.

  47. Example 3: clickers are coming • Student Response Systems at the moment? • They work … they can change staff and student behaviour and performance. But • they can be cumbersome and fiddly. • setup time. • they need strong commitment and support (e.g. see experience at Exeter Business School).

  48. Example 3: clickers are coming • Student Response Systems in the future? • They will radically change staff and student behaviour. • They will be flexible and easy to use. • They will be on the student’s own device!

  49. Example 4: adaptive systems • PBL with consequences – you get immediate feedback on the consequences of your decisions. • e.g. the G4 project at St George’s • http://www.generation4.co.uk/ • Their Ethics simulation – iEthics • Adaptive assessment • e.g. the work of Trevor Barker • http://www.heacademy.ac.uk/contacts/detail/ntfs/2008/barker_trevor_2008wd
