This presentation discusses the reasons for evaluating health informatics projects, the problems faced during evaluation, and the objective and subjective models of evaluation. It explores the perspectives of stakeholders and the effects of evaluation on the structure, processes, and outcomes of healthcare. The complexity of evaluating the combination of medicine, health care, and information systems is highlighted, along with the challenges of evaluating the impact of information systems on patient care. Various evaluation methodologies and study features are also discussed, including measurement studies, demonstration studies, descriptive studies, comparative studies, and correlational studies.
Evaluating health informatics projects • Reasons for and problems of evaluation • Objective model • Subjective model
Definitions • Evaluate – to determine the value of (Chambers) • To examine and judge carefully (Dictionary.com)
Reasons for evaluation (Friedman/Wyatt) • Promotional – • encouraging people to use systems • Scholarly – • study of the impact etc. of HI systems • Pragmatic (practical) – • finding out what is good and bad, improving future systems • Ethical – • like any medical intervention, a system should be safe and effective • Legal – • as for the ethical reason; also to inform users so they know when and when not to use a system
Perspectives • Stakeholders • Developers • Users • Patients • Managers • Sources of funding
Effects • Structure – environment, staff, money • Processes – diagnosis, investigation, treatments • Outcomes – success of treatment, survival, continuing health
Complexity • Combination of • Medicine & health care • Information systems / IT • Evaluation methodology • Each of these is a huge area • Arguably IT is the simplest, or at least the most structured
Of Medicine • Extremely large and growing area of knowledge • Complex structure • Equipment, staff, regulation • Processes • Treatments etc. • Outcomes • Long term, difficult to measure • Knock-on effects of innovation • Effect of IT particularly hard to measure
Of Information systems • Difficult to fully test • Combinatorial explosion • Multi-function • Has a range of effects • System itself vs. impact on health care
Of Evaluation • Have to measure impact • This means impact on people – difficult to study • Need patients and staff to perform tests • There may not be enough willing participants • Questions range from • ‘Does it work?’ to • ‘Does it help patients?’
Evaluation • In theory, • study situation before & after • In practice, • don’t know what changes would have occurred without innovation • don’t know what interesting questions will arise during study
Tips • Tailor study to problem • Not research – specific to this project • Collect useful data • Data which inform final decision • Look for side-effects • Effects not related to intended purpose • Formative & summative • Study during & after development
Tips (continued) • In vitro vs. in vivo • Evaluate on-site & off-site • Don’t accept developer’s view • Take account of environment – context • Let questions appear during study • Be prepared to use a range of methods
What can be studied • Need for resource • What does it give us that we didn’t have before • Development process • What methods do developers use to design their solution? • Structure of resource • What does the program & spec look like? • Functions of resource • How well does it work? • Impact • How does it affect HCPs and patients?
Study features • Focus • As previous slide • Setting • Laboratory or hospital • Data • Real or simulated • Users • Developers, evaluators, end-users • Decisions • None, simulated or real
Types of study • Need validation • Design validation • Structure validation • Laboratory function • Field function • Laboratory user impact • Field user impact • Clinical impact
Objectivist or quantitative approach • Can measure things objectively and without affecting thing being measured • What to measure can be agreed rationally • Can use numerical data • Draw definite conclusions
Objectivist approaches • Comparison-based • Like a randomised clinical trial • Objectives-based • Does it do what the designers said it would? • Decision facilitation • Answers questions posed by managers • Goal-free approach • Evaluators not aware of project goals
Methods • Measurement • Demonstration studies • Descriptive • Comparative • Correlational • Statistical analysis
Measurement studies • Terminology for measurements • Object e.g. patient • Object class e.g. patient group • Attribute e.g. temperature • Instrument e.g. thermometer • Observation e.g. temperature at one time • Validation – calibration of thermometer
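The measurement-study terminology above can be sketched as a small data model. This is an illustrative sketch only; the class and field names are assumptions, not taken from the source.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical mapping of the measurement-study terminology to code:
# an Observation records one attribute of one object, captured by a
# named instrument at a point in time.
@dataclass
class Observation:
    object_id: str      # the object, e.g. a patient identifier
    object_class: str   # the object class, e.g. a patient group
    attribute: str      # the attribute, e.g. "temperature"
    instrument: str     # the instrument, e.g. a calibrated thermometer
    value: float        # the measured value
    taken_at: datetime  # when the observation was made

obs = Observation("patient-42", "inoculated group", "temperature",
                  "thermometer-T100", 37.8, datetime(2024, 1, 5, 9, 30))
print(obs.attribute, obs.value)  # temperature 37.8
```

Validation (calibrating the thermometer) would then be a check on the instrument, independent of any one observation.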
Demonstration studies • Demonstrate effect • ‘Do patients who have been inoculated have a higher temperature?’ • Object -> subject (patient) • Attribute -> variable (temperature)
Descriptive • ‘The patients in this study have a rather high temperature’. • Mean, standard deviation etc.
Comparative • ‘The patients in this study have a higher temperature than a control group’ • Controlled environment (usually) • T-test etc.
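A comparative study like this typically uses a two-sample t-test. Below is a minimal sketch of the t statistic itself (Welch's form, which does not assume equal variances), with made-up data; a full test would also compute a p-value from the t distribution.

```python
import math
import statistics

# Illustrative data only: temperatures (degrees C) in a study group
# and a control group.
study   = [38.1, 38.4, 37.9, 38.3, 38.0]
control = [37.2, 37.5, 37.1, 37.4, 37.3]

def welch_t(a, b):
    # t = (mean_a - mean_b) / sqrt(var_a/n_a + var_b/n_b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(
        va / len(a) + vb / len(b))

print(f"t = {welch_t(study, control):.2f}")  # t = 7.20
```

In practice a library such as scipy (`scipy.stats.ttest_ind`) would be used rather than hand-rolled arithmetic.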
Correlational • ‘We are seeing more patients with fever since we introduced inoculation’ • Live situation • Could still be a t-test • Trying to associate one factor with another in a real situation
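Associating one factor with another in a real situation is often expressed as a correlation coefficient. A minimal sketch of Pearson's r with made-up monthly counts (the data and variable names are illustrative assumptions only; correlation of course does not establish causation):

```python
import math
import statistics

# Illustrative data only: monthly inoculations given vs. fever cases seen.
inoculations = [10, 25, 40, 55, 70, 85]
fever_cases  = [ 3,  6,  9, 11, 15, 17]

def pearson_r(x, y):
    # r = covariance(x, y) / (spread of x * spread of y)
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(f"r = {pearson_r(inoculations, fever_cases):.3f}")
```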
Subjectivist or qualitative • Observations depend on observer • Observations only meaningful in context • Different points of view may be valid • Descriptions as valuable as numbers • Discussion of results
Subjectivist approaches • Quasi-legal • Cf ethical debate • Art criticism • Expert review • Professional review • Site visit • Responsive/illuminative • Immersion in environment • Questions evolve over time
Qualitative approach • Attempts to understand why as well as measure differences, e.g. • Is the system working as intended? • How can it be improved? • Does it make a difference? • Are the differences beneficial? • Are the effects those expected?
Stages in qualitative study • Negotiation of ground rules • Immersion into environment • Initial data collection to focus questions • Iteration • Report and feedback • Final report
Methods in qualitative study • Observation • Interviews • Document analysis • Others, e.g. structured questionnaires
Mixed study • Can combine qualitative and quantitative approaches