Explore challenges and convergent theories in assessment, and consider pragmatic ideas such as exemplars, rubrics and authentic assessment. Address student use of feedback, academic integrity, inclusivity and actionable approaches to feedback.
Making sense of assessment complexity
Dr Lydia Arnold EdD, PFHEA, NTF
@HarperEdDev | lydiaarnold.net | larnold@harper-adams.ac.uk
About me: Educational Developer, Head of eLearning & Principal Lecturer, Harper Adams University.
Key roles: curriculum, eLearning, PgC, internal assessment and feedback consultant, staff development, action research supervisor.
Realist (or so I hope!)
Read more at lydiaarnold.net | Follow @HarperEdDev
This session:
• Share challenges around assessment
• Explore some convergent theories that might guide us
• Offer some pragmatic ideas: exemplars, rubrics, authentic assessment
• Seed ideas for action
Context
• Context matters (Arnold, 2017)
• The staff view is underrepresented (Arnold, 2017; Evans, 2013; Norton, Floyd and Norton, 2019; Bearman et al., 2016)
• Others share the concerns we hold
• Contention is growing around some of the beliefs that have been almost ubiquitously central to the anglo-assessment landscape (see, for example, Carson, 2019)
• More research is STILL needed …
Considerations when forming assessments (all present challenges):
• Learning outcomes alignment
• Fairness
• Level
• Reliability
• Inclusivity
• National and institutional requirements
• Opportunities for feedback
• Authenticity
• Academic integrity
• Validity
• Quality assurance factors
• Manageability
• Verification/accountability
• Cultural relevance
• Discipline traditions
• Variety
• Preparatory support
• Others?
Student use of feedback
They just don't use it!
"We as a department do spend a lot of time writing feedback. But what's the point if they're not going to use it, they moan that they don't get it."
"He's not even bothered reading his feedback and it's probably the most important thing as lecturers we can give … So why aren't they using that feedback? Why don't [they] see it as being very valuable?"
Maybe consider: legibility, volume, tone, lack of curriculum links, focus, alignment to criteria, accessibility, consistency, timeliness.
Clarity
"First person is OK, and they really should discuss how they collated their data. I don't just want results. And those who are critical of the literature will get the best grades."
"I want my students to include at least ten references and they should write in third person … oh, and they must include a summary in their report."
• Polanyi (widely cited) highlighted the need for articulated understanding back in 1958!
• O'Donovan, Price & Rust identified tacit knowledge as a problem in 2004
• It is still an issue today … see, for example, Carless & Boud (2018)
"Students are strategic as never before" (Gibbs, 2019)
Should we enable students to be strategic? Is it OK?
"What else would you have done in your research project, Adam?"
Level
"I never know if the assessments I am setting are at the right level. How do you tell?"
"I looked at the standards document and thought, my goodness, we are nowhere near the right level. Or is it just the way I am reading it? I spoke to Mark and he said what I was doing is fine. I am so confused."
See Brown (2014) for more on staff perspectives on levelness.
Academic integrity (plagiarism, essay mills, etc.)
50,000 cases in 3 years: roughly 17,000 per year = 0.7% (Mostrous & Kenber, in King, 2019)
"Hardly the student cheating crisis that the media headline purported" (King, 2019)
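A rough check of the arithmetic behind these figures, assuming (this is not stated on the slide) that the 0.7% refers to a UK higher education student population of roughly 2.3 million at the time:

$$ 50{,}000 \div 3 \approx 16{,}700 \text{ cases per year}, \qquad 16{,}700 \div 2{,}300{,}000 \approx 0.7\% $$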
Inclusivity and accessibility
• Othering (BTEC/A-level; disability; mature; culture; preference)
• Reasonable adjustments
• Morals and metrics
• Intersectionality
• Context-specific othering
Created from Thomas & May (2010)
Feedback paradigm shifts?
Reimann, Sadler and Sambell (2019):
• Disconnect between theory and practice
• Mismatch between dialogic and traditional approaches
Winstone & Carless (2019):
• Timed to enable application to new tasks (it's not just about turnaround times – consider timeliness)
• Actionable, implementable comments (not comment overload)
• Moving from delivering feedback to students seeking guidance (feedback as process and dialogue, not product)
Calls for interconnectivity of assessment (see Reimann, Sadler and Sambell, 2019; Winstone and Carless, 2019*)
Hovering feedback … is it even feedback at all?
Examples of interconnected feedback
• Coordinate assessments of the same type (lab reports … 1 x peer-assessed to calibrate judgement, then 4 x equally weighted tutor-marked)
• Feedback on planning stages (e.g. me-maps)
• Two-stage assessments (a presentation of reading to inform an essay)
• Feedback on drafts (check out Screencast-O-Matic) – note parity issues and self-preservation; consider no feedback at the end
• Feedback 1, 2 or 3!
• Building up feedback, section by section
• Structured exercises (Jessop (2019) suggests a word cloud or active collation)
• Exam practice in the same format (MCQs will be of limited use for extended essay exams)
• Portfolio tasks (patchwork and multimedia; see Arnold, Thompson and Williams, 2009)
• Don't forget peers! Peer house groups – is this a culture of feedback?
• Formal exercises
Maximise / Minimise
"It is not until students apply criteria and standards to judge their own work … that their own work will improve" (Gibbs, 2019, p. 27).
"Providing feedback rendered the students more active and involved in their learning, enhanced their responsibility and commitment to the task" (Ion, Sánchez Martí and Agud Morell, 2019).
Exemplars
Examples of assessment used to facilitate the development of student performance.
Concerns
• I worry students will just copy
• Demotivating for weaker students
• Students will have their creativity stifled
• I'm unsure about what's OK to share and what consent is needed
• I don't have an example of 'this' assignment
• Don't work for 'science' subjects
Quotes paraphrased from Pittson (2018)
Benefits
• Clarify assessment requirements (Orsmond, Merry and Reiling, 2002; Kean, 2012)
• Surface tacit assumptions about assessment (Chong, 2019; Scoles, Huxham and McArthur, 2014)
• Increase students' self-direction (Grainger, Heck and Carey, 2018)
• Prompt activities that generate feedback (Orsmond, Merry and Reiling, 2002)
• Bring about greater confidence in standards (Bamber, 2014)
• Pro-actively provide feedback in an efficient and timely manner (Scoles, Huxham and McArthur, 2014)
• Improve dialogue around assessment (Carless and Chan, 2017)
Also: reduce anxiety; reduce housekeeping; help students understand assessment; help students get started; visualise different levels; encourage and inspire.
Rubrics
Who uses a rubric? Many meanings: "my kind of rubric" (take care in discussions).
According to Popham (1997, cited in Dawson, 2017), rubrics must include:
• Evaluative criteria (the stuff that you measure)
• Quality definitions (what you need to do to achieve a certain outcome)
• A scoring strategy
Is that the house rubric, or would Sir like something more bespoke? (drawing on and adding to Dawson, 2017)
For a substantial review of rubric design, see Dawson (2017).
Bringing rubrics to life (Arnold & Headley, 2019, forthcoming)
To make the theory work …
• Standardisation
• Teams for assessment
• Cross-team working
• Engage new staff
• Original source frameworks
• Iterative development
• Take a longitudinal view
• Analysis of grades and questions
• Explore surrounding levels
• Specific CPD
• Exemplars, rubrics and socialisation
• Engagement
• Feedback and self-feedback
• Level
• Academic integrity (-ish)
• Time pressure on teaching staff
• Clarity
More to do … engagement, motivation and integrity
Authentic assessment
Simplistically: providing assessment opportunities which are like tasks in the 'real world'.
Higher cognitive demand, associated with problem solving, creation, synthesis and complexity.
A spectrum of opportunities for industry or community engagement
• Jessop (2019) – the power of audience
• Ultraversity – 'The Exhibition'
• The Food Spectacular (0.34)
Benefits
• Assessment FOR learning
• Motivating for students
• Higher-order cognitive skills
• Develops a wide range of skills
• Interesting and rewarding for staff
• Helps the defence of HE against essay mills
• Can feed into real-world challenges when paired with industry
Challenges
• Risk of a utilitarian narrative
• Time in preparation
• Not all students align with the career area
• Support needs to be aligned
• More challenging in non-vocational courses
For an exploration of authentic assessment I'd recommend Villarroel et al. (2018).
Examples
• Video commissioned project
• Wildlife surveys
• Client consultation role play
• Social media strategy for a local charity or start-up business
Case study:
• Rationale
• Real clients
• Manageability
• Tech help
• Pedagogy of uncertainty
• Criteria with headroom
• https://www.youtube.com/watch?v=fcFJ19LyMVI&list=PLRKWnZs4l1bDuAvAE0UwmzeOL6EdZ06l8&index=4
• https://www.youtube.com/watch?v=JgrRCiW_OgU&t=4s
Can exams be authentic too?
• Viva
• Triple jump
• Open book (reduces anxiety)
• Simulation
"1st year vivas give students an opportunity to excel at a different kind of assessment. I find they often reflect on their learning and development more maturely and with richer examples and ideas when they can do this verbally, face-to-face."
A note on universality
• Good design works for all students, not just sub-groups
• Move away from a medical model towards an inclusive community
• Assessment should lead students to learn
• Assessment should be an integrated part of the learning process
• Assessment must be supported by a dialogic and transparent culture amongst staff and students
• Assessment should be varied, but purposefully coordinated
• Assessments should be linked to enable feedback to be usefully actioned
• Assessment should provide a fair opportunity for success to all who enter higher education
• Assessment should be authentic
• Assessment should include a degree of choice and personalisation
• Assessment should enable students to be more successful and creative than we can imagine; it should not be limiting (it should provide scope for students to excel and delight)
• Assessment must be manageable for students and staff
Concluding thought
Assessment is highly complex. We can ease the stress, improve engagement, resolve many of the impasses, and make the process more enjoyable and valuable by reframing assessment and feedback as a social process, a means of binding the scholarly community, and as integral to learning rather than a bolt-on. Exemplars, rubrics and authentic assessment, along with communities of practice, can help with this.
References
Arnold, L. (2017) 'The craft of feedback in a complex ecosystem', The Higher Education Journal of Learning and Teaching, 1(1). Available at: http://hejlt.org/article/the-craft-of-feedback-in-a-complex-ecosystem/.
Arnold, L., Thompson, K. and Williams, T. (2009) 'Advancing the patchwork text: the development of patchwork media approaches', The International Journal of Learning, 16(5), pp. 151–166.
Bamber, M. (2014) 'The impact on stakeholder confidence of increased transparency in the examination assessment process', Assessment & Evaluation in Higher Education, 40(4), pp. 471–487.
Bearman, M. et al. (2016) 'Support for assessment practice: developing the Assessment Design Decisions Framework', Teaching in Higher Education, 21(5), pp. 545–556. doi: 10.1080/13562517.2016.1160217.
Brown, S. (2014) 'What are the perceived differences between assessing at Masters level and undergraduate level assessment? Some findings from an NTFS-funded project', Innovations in Education and Teaching International, 51(3), pp. 265–276. doi: 10.1080/14703297.2013.796713.
Brown, S. (2019) 'Using assessment and feedback to empower students and enhance their learning', in Bryan, C. and Clegg, K. (eds) Innovative Assessment in Higher Education. 2nd edn. Oxford: Routledge, pp. 50–63.
Carless, D. and Boud, D. (2018) 'The development of student feedback literacy: enabling uptake of feedback', Assessment & Evaluation in Higher Education, 43(8), pp. 1315–1325. doi: 10.1080/02602938.2018.1463354.
Carless, D. and Chan, K. K. H. (2017) 'Managing dialogic use of exemplars', Assessment & Evaluation in Higher Education, 42(6), pp. 930–941.
Carson, J. T. (2019) 'Blueprints of distress? Why quality assurance frameworks and disciplinary education cannot sustain a 21st-century education', Teaching in Higher Education, pp. 1–10. doi: 10.1080/13562517.2019.1602762.
Chong, S. W. (2019) 'The use of exemplars in English writing classrooms: from theory to practice', Assessment & Evaluation in Higher Education, 44(5), pp. 748–763.
Dawson, P. (2017) 'Assessment rubrics: towards clearer and more replicable design, research and practice', Assessment & Evaluation in Higher Education, 42(3), pp. 347–360. doi: 10.1080/02602938.2015.1111294.
Gibbs, G. (2019) 'How assessment frames learning', in Bryan, C. and Clegg, K. (eds) Innovative Assessment in Higher Education. 2nd edn. Oxford: Routledge, pp. 22–35.
Grainger, P. R., Heck, D. and Carey, M. D. (2018) 'Are assessment exemplars perceived to support self-regulated learning in teacher education?', Frontiers in Education.
Ion, G., Sánchez Martí, A. and Agud Morell, I. (2019) 'Giving or receiving feedback: which is more beneficial to students' learning?', Assessment & Evaluation in Higher Education, 44(1), pp. 124–138. doi: 10.1080/02602938.2018.1484881.
Jessop, T. (2019) 'Changing the narrative: a programme approach to assessment through TESTA', in Bryan, C. and Clegg, K. (eds) Innovative Assessment in Higher Education. 2nd edn. Oxford: Routledge, pp. 36–49.
Kean, J. (2012) 'Show AND tell: using peer assessment and exemplars to help students understand quality in assessment', Practitioner Research in Higher Education, 6(2), pp. 83–94.
King, H. (2019) 'Stepping back to move forward: the wider context of assessment in higher education', in Bryan, C. and Clegg, K. (eds) Innovative Assessment in Higher Education. 2nd edn. Oxford: Routledge.
Norton, L., Floyd, S. and Norton, B. (2019) 'Lecturers' views of assessment design, marking and feedback in higher education: a case for professionalisation?', Assessment & Evaluation in Higher Education, pp. 1–13. doi: 10.1080/02602938.2019.1592110.
O'Donovan, B., Price, M. and Rust, C. (2004) 'Know what I mean? Enhancing student understanding of assessment standards and criteria', Teaching in Higher Education, 9(3), pp. 325–335. doi: 10.1080/1356251042000216642.
Orsmond, P., Merry, S. and Reiling, K. (2002) 'The use of exemplars and formative feedback when using student derived marking criteria in peer and self-assessment', Assessment & Evaluation in Higher Education, 27(4), pp. 309–323.
Pittson, H. (2018) Action Research. Submitted to Harper Adams University in part fulfilment of a Postgraduate Certificate.
Reimann, N., Sadler, I. and Sambell, K. (2019) 'What's in a word? Practices associated with "feedforward" in higher education', Assessment & Evaluation in Higher Education. doi: 10.1080/02602938.2019.1600655.
Scoles, J., Huxham, M. and McArthur, J. (2014) 'No longer exempt from good practice: using exemplars to close the feedback gap for exams', Assessment & Evaluation in Higher Education, 38(6), pp. 631–645.
Thomas, L. and May, H. (2010) Inclusive learning and teaching in higher education. York: HEA.
Villarroel, V. et al. (2018) 'Authentic assessment: creating a blueprint for course design', Assessment & Evaluation in Higher Education, 43(5), pp. 840–854. doi: 10.1080/02602938.2017.1412396.
Winstone, N. and Carless, D. (2019, forthcoming)* Designing Effective Feedback Processes in Higher Education: A Learning-Focused Approach. London: Routledge.
Notes: * Extracts from forthcoming book shared on Twitter via conference slides
Sources of images under Creative Commons
• Man juggling: https://www.publicdomainpictures.net/en/view-image.php?image=232490&picture=man-juggling-balls
• Elephant: http://lh5.ggpht.com/_jhfurX4cT_4/Sjwrot_ia0I/AAAAAAAABNQ/f4cpwWJ7z4c/s1600-h/How-to-eat-elephant%5B3%5D
• Essay mill headlines: https://www.theguardian.com/education/2019/mar/20/essay-mills-prey-on-vulnerable-students-lets-stamp-them-out
• Clip art teachers: http://clipartmag.com/universities-clipart
Acknowledgements:
• Helen Pittson (HAU Exemplars CoP Lead)
• Jane Headley (HAU Exemplars CoP Founder)
• Jayne Powles (lab report example)
• Helen Morrell (patchwork text)
• Emma Tappin (originator of student video projects – authentic assessment examples)
• And the HAU Exemplars Community of Practice