From St George’s to GAME (Advanced Assessment Course)

Presentation Transcript


  1. From St George’s to GAME (Advanced Assessment Course): Bringing ideas and tools home. Josephine Boland, 4th December 2012

  2. 1. Reliability is a necessary but insufficient condition for validity (get the validity right first)
     2. Embedding feedback in a programmatic approach to assessment

  3. Reliability: assessment which provides a consistent, reliable measurement of the student’s performance. The same result would be obtained if the assessment were carried out on the:
     - same student at a different time
     - same student by another assessor
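To make the idea of inter-assessor consistency concrete, here is a minimal sketch (not part of the presentation; the data and function name are hypothetical) of Cohen’s kappa, a standard chance-corrected index of agreement between two assessors scoring the same students.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two assessors scoring the same students."""
    n = len(rater_a)
    # Observed agreement: proportion of students both assessors scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each assessor's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pass/fail judgements on the same ten students by two assessors.
assessor_1 = ["pass", "pass", "fail", "pass", "pass", "fail", "pass", "pass", "fail", "pass"]
assessor_2 = ["pass", "pass", "fail", "pass", "fail", "fail", "pass", "pass", "fail", "pass"]
print(f"kappa = {cohens_kappa(assessor_1, assessor_2):.2f}")  # ≈ 0.78 for this made-up data
```

A kappa near 1 suggests another assessor would give the same student the same result; values near 0 suggest the assessment is not reliable across assessors.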

  4. Validity: a test is valid if it measures what it purports to measure (Kelley, 1927). Aligned to learning outcomes.
     • Construct validity
     • Criterion validity
     • Content validity
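Criterion validity, one of the types listed above, is commonly estimated by correlating scores on the assessment under scrutiny with an established external criterion. The sketch below is illustrative only (the score lists are invented) and relies on statistics.correlation from Python 3.10+.

```python
from statistics import correlation  # Pearson correlation coefficient, Python 3.10+

# Hypothetical scores: a new assessment versus an established end-of-year
# exam used as the external criterion measure.
new_assessment = [62, 71, 55, 80, 68, 74, 59, 85]
criterion_exam = [60, 75, 50, 82, 65, 70, 58, 88]

r = correlation(new_assessment, criterion_exam)
print(f"criterion validity (Pearson r) = {r:.2f}")
```

A high correlation is evidence, not proof, that the new assessment measures what the criterion measures; construct and content validity still require separate arguments.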

  5. Reliability is a necessary but insufficient condition for validity. Striking the right balance? Getting the validity right first!

  6. Reliability is a necessary but insufficient condition for validity. Striking the right balance? Getting the validity right first! A plug for Curriculum Mapping!

  7. 2. Embedding feedback in a programmatic approach to assessment

  8. A programmatic approach to assessment planning
     • Multiple methods
     • Assessment of learning + assessment for learning
     • Continuum of stakes (low/high)
     • Explicit performance standards
     • Frequent formative, information-rich feedback
     • Student centred, longitudinal, holistic and self-directed
     • Planned programmatically
     Adapted from Dennefer, E. (2012) Programmatic Assessment: Theory and Practice. Presentation to the Advanced Assessment Course, St George’s, London, November 2012.

  9. Student attitudes to formative assessment
     “We’re so stressed about exams … I love writing case reports but if I’ve a choice between studying for my exam and writing a beautiful case report with no marks … It’s unfortunate that I think like that but …” 5th year medical student (NUIG)
     Boland, J. (2012) Quality Issues in Assessment. Presentation to INMED, UL, November 2012.

  10. Planning programmatically • See handout

  11. Types of feedback / Ways of giving feedback
     • Tutor feedback, peer feedback
     • Qualitative/quantitative
     • Written, oral, face-to-face, electronic
     • Individualised, generic (whole group)
     • Solution/correct answer (with reasons why)
     • Comments on performance
     • Numeric score per criterion
     • Others?

  12. Identifying the best opportunities for formative feedback • See handout

  13. 1. Reliability is a necessary but insufficient condition for validity (get the validity right first)
      2. Embedding feedback in a programmatic approach to assessment
