  1. “Well nobody reads Learning Outcomes do they?” – An evaluation of CAA and its feedback on directed student learning Sophie Brettell, Justin Durham, Steve McHanwell – University of Newcastle upon Tyne

  2. The Context • Dental School at the University of Newcastle • Prides itself both on the quality of its teaching and its willingness to try new techniques in an attempt to improve teaching further • Currently 70 students per year • Intake increases to 100 from next year

  3. The Issue • Dental students taught anatomy of head and neck in year 1 • Need to apply this basic scientific knowledge to clinical situations • Apparent that students have an appreciation of anatomy and understand its importance, but find it difficult to apply their knowledge to clinical problems as they arise • GDC (General Dental Council) encouraging vertical integration of basic and clinical sciences in all Dental Schools

  4. The Project purpose • To develop and evaluate the use of six clinically relevant anatomy online tutorials for final-year students • Not compulsory or summatively assessed in this first year of development, but aim for embedding in the curriculum if it works • To provide a potential paradigm for development of similar tutorials in other areas of dentistry and other professional subject areas (e.g. Medicine, Speech and Language Sciences, Law) • Interdisciplinary project team providing institutional demonstration that it can work… • Dental Surgeon • Anatomist • Learning technologist(ish)

  5. Tutorial structure • Kept simple: text and images, supported by timetabled virtual Q&A sessions • Each contained two assessments using EMI (extended matching item)-type questions • Pre-test consisted of pure anatomy questions • Intention was to remind students of the vocabulary and language used in anatomy and to trigger recollection of information learnt in early years • Post-test consisted of applied questions • Intention was to test the ability of students to apply theoretical knowledge to clinical scenarios
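
The slides do not show how an EMI item was implemented, but the format itself is standard: a theme, a single shared option list, and one or more stems answered from that list. Below is a minimal sketch in Python; the class, the marking helper and the example item are all hypothetical illustrations, not the project's actual code.

from dataclasses import dataclass

@dataclass
class EMIQuestion:
    theme: str          # e.g. the anatomical topic the option list covers
    options: list[str]  # one shared option list for all stems on the theme
    stem: str           # recall prompt (pre-test) or clinical vignette (post-test)
    answer: int         # index into options
    feedback: str       # detailed explanation, identical for right and wrong answers

def mark(question: EMIQuestion, chosen: int) -> tuple[bool, str]:
    """Score one response; the feedback returned does not depend on correctness."""
    return chosen == question.answer, question.feedback

# Hypothetical pre-test item (pure anatomy recall)
q = EMIQuestion(
    theme="Innervation of the tongue",
    options=["Facial nerve (VII)", "Glossopharyngeal nerve (IX)",
             "Hypoglossal nerve (XII)", "Lingual nerve (V3)"],
    stem="Which nerve provides motor supply to the intrinsic muscles of the tongue?",
    answer=2,
    feedback="The hypoglossal nerve (XII) supplies the intrinsic and most extrinsic "
             "tongue muscles; taste and general sensation travel in VII, IX and V3.",
)
correct, explanation = mark(q, chosen=2)

Attaching the feedback to the question rather than to the response is what makes it the same for correct and incorrect answers, as the next slide describes.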

  6. Feedback to students • Detailed feedback provided for each question (the same for correct and incorrect answers) • Used a draft version of the ‘Checklist for effective feedback’ produced by the FDTL4 OLAAF project • Focussed on the task, not the learner • Included a detailed explanation of the correct answer

  7. Availability of tutorials • Tutorials released over an eight-week period during the first semester • Students informed electronically and in lectures as each tutorial was released • Students at liberty to work through the tutorials and take the assessments as many times as they wished.

  8. Evaluation methodology • Used a priority sequence model • Two phases – qualitative followed by quantitative • Qualitative – semi-structured interviews with a purposive sample of students, using a discussion guide • Purposive sample included • completers, partial completers and non-completers • Students who did better than, the same as, or worse than their cohort in the pre-test, and students who didn’t attempt the pre-test • Students who met the same criteria in the post-test • Students who had a positive or negative pre/post-test comparison

  9. Evaluation methodology 2 • Qualitative evaluation cont… • Discussion guide set out areas to be covered in interviews • Assessment-related areas included use of feedback, focussing effect of tests, patterns of test usage • Interviews carried out until data saturation reached (n=13) • Transcripts analysed and common concerns/areas for further questioning identified

  10. Evaluation methodology 3 • Quantitative evaluation consisted of a questionnaire to the whole cohort, delivered through the institutional VLE • 70% completion rate for this questionnaire • Interviewed students also asked to complete an ‘Approaches and Study Skills Inventory for Students’ (ASSIST) questionnaire

  11. Other elements measured • Usage stats – top level only • Assessment success – one time only

  12. Evaluation questions • Did attitude to study have an effect on students’ likelihood of completing all the materials (and doing well)? • Did test scores have an effect on students’ attitude to completing? • What did students think of the assessment and feedback materials, and how did they use them? • Would students continue to use the materials as summative assessments approached?

  13. Some resulting themes • Unease at using computers expressed in many interviews • Learning outcomes not directing student efforts in the tutorials – students used the pre-tests instead • Feedback aided understanding and direction of learning • Tutorials highlighted how much students had forgotten

  14. Attitudes to study vs success • Analysed in subsets of Completers, Partial Completers and Non-Completers • Initially it appeared that there was a connection between attitude and completion • None of these differences statistically significant using one-way ANOVA • Could be due to the small sample size • Could be due to the nature of the programme: it attracts highly motivated students and offers a significant reward on successful completion
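
For concreteness, this is the shape of the comparison the slide describes: a one-way ANOVA on an ASSIST subscale score across the three completion subsets. A sketch in Python using scipy; the scores below are invented placeholders, not the study’s data.

from scipy.stats import f_oneway

# Hypothetical ASSIST 'deep approach' subscale scores per completion subset
completers = [62, 58, 71, 66, 60]
partial_completers = [55, 63, 59, 61]
non_completers = [57, 52, 64, 56]

f_stat, p_value = f_oneway(completers, partial_completers, non_completers)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

With groups this small (the interview sample was n=13), the test has little power, which is consistent with the slide’s note that the small sample could explain the non-significant result.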

  15. Test scores vs completion • The tools available proved inadequate, both for tracking student accesses and for tracking scores when an assessment was used more than once.
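
In effect, what was missing was an attempt-level record rather than a single per-student score. A minimal sketch of the kind of log that would make repeated attempts trackable; the record layout and function are hypothetical, not features of the VLE used.

from collections import defaultdict

def first_and_best_scores(rows):
    """Summarise per-attempt rows so first-attempt and best scores stay distinguishable."""
    attempts = defaultdict(list)
    for r in rows:
        attempts[(r["student"], r["assessment"])].append((r["attempt"], r["score"]))
    summary = {}
    for key, scored in attempts.items():
        scored.sort()  # order by attempt number
        summary[key] = {
            "first": scored[0][1],
            "best": max(s for _, s in scored),
            "n_attempts": len(scored),
        }
    return summary

# One row per (student, assessment, attempt); hypothetical data
rows = [
    {"student": "s01", "assessment": "pretest-3", "attempt": 1, "score": 7},
    {"student": "s01", "assessment": "pretest-3", "attempt": 2, "score": 11},
]
print(first_and_best_scores(rows))
# {('s01', 'pretest-3'): {'first': 7, 'best': 11, 'n_attempts': 2}}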

  16. What did the students think of the materials and how did they use them? • Pre-tests “allowed me to open right parts of my brain and say right this is what I’m actually going to be learning” • Focussing effect of pre-tests unexpected • Generally liked the assessments – wanted more questions • Liked the feedback: “it tells you what you need to know and points you in the right direction”

  17. Would the students continue using the materials as assessments approached? • The last tutorial ran in February of this year • Students have accessed the tutorials over 100 times since the end of the project

  18. What next? • Total redesign of tutorial texts, images and tests (ensuring that pre-tests cover all elements listed in the learning outcomes), taking into account all that was learnt from the evaluation process • Use with a much wider range of students • Internally with 3rd, 4th and 5th year students and with 2nd year GPT dentists • Use with international PG students at one Russell Group institution • Use with one, possibly two, years of UG students at a second Russell Group institution • Integration into the UG curriculum at Newcastle should take place in 05-06 • Further evaluation, including using the Attitudes to Study questionnaire (or similar) with all students, not just interviewees • Any suggestions for other improvements to the evaluation methodology gratefully received.
