Teaching HTA in the classroom and by e-learning Dr Catherine Meads Senior Lecturer in Health Technology Assessment c.meads@qmul.ac.uk
A bit about me... • Senior lecturer in Health Technology Assessment • Director of Master’s Degree in HTA • Developed distance learning MSc module in Public Health, Epidemiology and Statistics • Worked with NICE when at University of Birmingham
A bit about the talk... • Three topics • Did our courses result in more publications of systematic reviews? • What knowledge and skills should students of HTA acquire? • What we learned from developing an e-learning course
Advantages and disadvantages • A clinician would have background knowledge but possibly: • a relatively established view about the evidence • lack of systematic review skills • An HTA specialist may produce a report to a very high standard but possibly: • miss vital biomedical facts • not answer a clinically relevant question • Alternatively the HTA and clinical specialists could work together
My (humble) expert opinion • Almost all HTAs are embedded in a clinical subject area • Lack of biomedical input can easily result in a clinically irrelevant report • When completing HTAs, is it more appropriate for a clinician to learn sufficient HTA skills or for an HTA specialist to learn sufficient clinical skills?
A clinical perspective - should students of HTA have biomedical knowledge of their objects of evaluation?
What does the evidence suggest? • RCT? • Cohort study (prospective/retrospective)? • Case control study? • Case series? • Qualitative? • Systematic search for evidence?
A small (underpowered) study • Past students of MSc EBH&HTA • Categorised into clinician and non-clinician (before looking at the marks!) • Dissertation marks for each student (all dissertations are full systematic reviews) • Analysed using SPSS (t-test)
The results • Mean (SD, range) dissertation mark • for non-clinicians = 65% (9%, 50-72%) (n=13) • for clinicians = 60% (8%, 51-83%) (n=6) • t-test: p=0.26 • So there was a trend towards clinicians scoring slightly lower than non-clinicians, but the difference was neither statistically significant nor educationally important
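The analysis above was run in SPSS. Purely as an illustration, the minimal sketch below shows how an equivalent unpaired (independent-samples) t-test could be run in Python with scipy; the mark values are hypothetical and are not the study data.

# Illustration only: unpaired (independent-samples) t-test of dissertation marks.
# The original analysis was run in SPSS; these mark values are hypothetical.
from scipy import stats

non_clinician_marks = [50, 55, 58, 60, 62, 63, 65, 66, 68, 70, 71, 72, 72]  # n=13, hypothetical
clinician_marks = [51, 55, 57, 58, 60, 83]                                  # n=6, hypothetical

# Two-sample t-test comparing the group means
t_stat, p_value = stats.ttest_ind(non_clinician_marks, clinician_marks)

print(f"Non-clinician mean: {sum(non_clinician_marks) / len(non_clinician_marks):.1f}%")
print(f"Clinician mean:     {sum(clinician_marks) / len(clinician_marks):.1f}%")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

With such small, unequal groups a non-significant p-value is unsurprising, which is why the slide describes the study as underpowered.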
Conclusions • In my humble opinion, HTA is a multidisciplinary activity • Any good systematic review needs a variety of skills • Clinical knowledge of the subject area is vital • But systematic review skills are also vital • So a team approach is best
Did our systematic review and health technology assessment courses result in more peer-reviewed publications of research synthesis? Catherine Meads and Amanda Burls Department of Public Health and Epidemiology, University of Birmingham, Edgbaston, Birmingham, B15 2TT, GB c.a.meads@bham.ac.uk a.j.burls@bham.ac.uk
Aim of project To evaluate whether courses on research synthesis run by the West Midlands Health Technology Assessment Collaboration (WMHTAC) at the University of Birmingham led to subsequent publication of systematic reviews or health technology assessments
Three courses • How to do a Systematic Review • From 1998 to 2001 • MSc in Evidence Based Healthcare and Health Technology Assessment • From 2001 onwards • Single module from the MSc, run as a short course, called Methodological Basis of HTA • From 2001 onwards
Methods • All participants identified through student records and data on file • Name, topic title and other details on Excel spreadsheet • Contacted by letter, telephone or in person to find current job and whether their topic was ever published • Names searched in Medline and Embase for relevant publications
Discussion and conclusions • Very useful evaluation • Participants on the ‘How to do a Systematic Review’ course had more publications than those on the other two courses • Possibly because of increased individual attention and orientation towards practical skills
Developing an e-learning course • Existing courses: • Masters in Public Health (MPH) • Masters in Health Technology Assessment • Needed some capacity development…. !?!
Health Information, Epidemiology and Statistics • Shared module between Masters courses • Also very popular course for CPD • Large module over 3 weeks (30 credits) • Substantial amount of teaching involved • Small department with few teachers capable of teaching at a sufficiently high level • Limited classroom size
The structure became: • 24 weeks of teaching material • Delivered via the WebCT online learning platform • Initial paper-based mailout to get students started • Each week has a study guide plus any of the following: • Associated Excel/StatsDirect files • Critical appraisal activities • Web-based activities • Self-assessment test
But • Enormous volume of work • Teaching staff were reluctant to write the study guides and other teaching materials • Not sure how to do it • Little time to do it in • Not sure if they were ‘doing it right’ • I ended up doing far more than originally planned
Please help!
Running the course • Student support was rather shaky • Very good IT officer, but • Admin support kept leaving mid-course • I provided most of the academic student support • I was also managing the IT officer and admin support • Many of the original study guide writers left, so I had to find new tutors and support them • (in addition to the anticipated development work on the next module(s) to be converted)
Another difficulty • Same assessment as the classroom version • Two assignments • Health information assignment • Critical appraisal assignment • A 3-hour exam • Not the usual assessment types found in distance learning courses
Student numbers • 8 students in the pilot year: 3 took all assessments, 2 passed • 17 students in the first full year: 12 took all assessments, 8 passed, and two are now doing the full MPH course • 23 students the next year (but some difficulty with enrolment) • Course closed after the third year
Discussion • Very useful hard-outcome evaluation of the course, because it had the same assessment criteria as the classroom variant (same pass rate of 2/3) • Difficulty of running both courses in tandem • Same taught components, but the classroom version was altered much more quickly and the distance version lagged behind • Recognition of the distance version as equivalent was lacking • Lack of people to act as e-tutors, plus lack of time
Discussion cont. • Participants found the course not sufficiently interactive • Probably because of the lack of overt individual attention • But there was a major drive to increase numbers • Even after the course closed, management continued for several years to request that this course be developed
What would I do differently next time? • Insist that development work is separate from running the course • Insist that the classroom teachers develop and renew distance teaching materials at the same time as classroom materials • Ensure classroom module co-ordinators are sufficiently aware of the time commitment needed • Buy in outside help wherever possible