Evaluation of information systems Nicolette de Keizer Dept Medical Informatics AMC - University of Amsterdam
Outline • Significance of evaluation • Process of evaluation • Evaluation questions • Methods – study design, data collection • Triangulation
IT in health care: possible benefits • Reduction of medication error by CPOE • Better treatment / diagnostics by decision-support systems • Increase quality of documentation • Reduce costs by telemedical applications
Unintended Consequences of Information Technologies • Aim • Determine the effect on mortality of introducing CPOE into a Pittsburgh children's hospital • Methods • Demographic, clinical and mortality data collected on all children transported to a hospital where CPOE was implemented institution-wide in 6 days. Trends for 13 months before and 5 months after implementation compared. • Results • Mortality rate increased from 2.80% (39 of 1394) to 6.57% (36 of 548) • After adjustment for other covariates, CPOE was independently associated with increased odds of mortality (odds ratio 3.28, 95% C.I. 1.94 – 5.55)
Unintended Consequences of Information Technologies • Conclusion • When implementing CPOE systems, institutions should continue to evaluate mortality effects, in addition to medication error rates • Importance • Received disproportionate media attention due to reactionary message • Follow-on study in Seattle, using same vendor system, also published in Pediatrics, showed no increase in mortality
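The crude odds ratio implied by the raw mortality counts above (39 of 1394 before CPOE, 36 of 548 after) can be reconstructed with a short calculation. A minimal Python sketch; note the study's reported OR of 3.28 is adjusted for covariates, so the crude value computed here differs:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table with a 95% Wald confidence interval.
    a = events in exposed group,   b = non-events in exposed group,
    c = events in unexposed group, d = non-events in unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Counts from the Pittsburgh CPOE study:
# after CPOE: 36 deaths of 548 children; before: 39 deaths of 1394
or_, lo, hi = odds_ratio(36, 548 - 36, 39, 1394 - 39)
print(f"crude OR = {or_:.2f} (95% CI {lo:.2f} - {hi:.2f})")
```

This yields a crude OR of about 2.4, lower than the adjusted 3.28, illustrating why covariate adjustment matters when before/after populations differ.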
Negative effects of CPOE: Example 2 • Brigham and Women's Hospital, Boston, introduced a CPOE system that allows physicians to order medication online (no longer on paper). • After implementation, the rate of intercepted Adverse Drug Events (ADE) doubled! • Reason: The system made it easy to order much too large doses of potassium chloride without clearly indicating that they should be given in divided doses. • Bates et al. The impact of computerized physician order entry on medication error prevention. JAMIA 1999, 6(4), 313-21.
Unintended Consequences of Information Technologies • Reference • Linder et al., Arch Intern Med. 2007 Jul 9;167(13):1400-5. [Brigham & Women’s Hospital] • Aim • Assess effects of Electronic Health Records on quality of care delivered in ambulatory settings • Methods • Retrospective, cross-sectional analysis of 17 quality measures from 2003-2004 National Ambulatory Medical Care Survey, correlated with use of EHRs.
Unintended Consequences of Information Technologies • Results • EHRs used in 18% of 1.8 billion visits • For 14 of 17 quality measures, the fraction of visits where recommended best practice occurred was no different in EHR settings than in manual-records settings. • 2 better with EHR: avoiding benzodiazepines in depression, avoiding routine urinalysis • 1 worse with EHR: prescribing statins for hypercholesterolemia (33% vs. 47%, p=0.01) • Conclusion • As implemented, EHRs not associated with better quality ambulatory care
Unintended Consequences of Information Technologies • Reference • Linder et al., Arch Intern Med. 2007 Jul 9;167(13):1400-5. • Importance • Received disproportionate media attention due to reactionary message • Lost in the media hype: Less than 40% of EHR implementations have all elements important for effects on quality (e-prescribing, test ordering, results, clinical notes, decision support). • Best performance regardless of infrastructure was suboptimal (< 50% adherence to best practice).
Other examples • London Ambulance Dispatch System collapsed due to inadequate testing. Thousands of emergency calls were answered late or not at all. http://www.cs.ucl.ac.uk/staff/A.Finkelstein/las.html • The malfunction of Therac-25, a medical linear accelerator, caused the death of three patients in the late 1980s. http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_1.html • University hospital stops introduction of Order Entry System due to user boycott. Costs: up to 34 million dollars • Ornstein C (2003) Los Angeles Times, Jan 22nd, 2003
Insufficiently designed, badly integrated, or wrongly used IT systems can lead to user frustration and errors. "Bad health informatics can kill" Ammenwerth E, Shaw NT. Bad health informatics can kill - is evaluation the answer? Methods of Information in Medicine 2005;44:1-3. Examples: http://iig.umit.at/efmi/ -> Bad Health Informatics can Kill
Need for Evaluation • IT systems can have a large impact on quality of care • IT systems are costly • IT systems can fail or be sub-optimally designed • Evaluation is a way to provide better IT systems
Systematic Evaluation of IT is essential • Formative (constructive): evaluation looking forward • Summative: evaluation looking backward
Evaluation: Definition (1/2) • Evaluation is • the act of measuring or exploring properties • of a health information system (in planning, in development, in implementation, or in operation), • the result of which informs a decision to be made concerning that system in a specific context. • Ammenwerth E, Brender J, Nykänen P, Prokosch U, Rigby M, Talmon J. Visions and strategies to improve evaluations of health information systems - reflections and lessons based on the HIS-EVAL workshop in Innsbruck. Int J Med Inform 2004 Jun 30;73(6):479-91.
Evaluated types of information systems 1982-2002 (n = 1,035)
Annually published papers in PubMed on IT evaluation in health care (PubMed analysis): between 0.6% and 1.0% of all papers. Ammenwerth, de Keizer (2004)
Evaluation in Informatics is Notoriously Difficult • We live in a pluralistic world. There will be many points of view on need, benefit, quality. • Real people using deployed technology: things can go wrong or right for very complicated reasons • Intersection of 3 domains where progress is very rapid: • Work in the domain (health care, bio-science) • Information technology • Evaluation methods • Sometimes they don’t really want to know...
Outline • Significance of evaluation • Process of evaluation • Evaluation questions • Methods – study design, data collection • Triangulation • Publication bias
The General Process of Evaluation: Negotiation → "Contract" → Questions → Investigation → Report → Decisions
Negotiation and Contract • Identify the primary audience(s) and interact with them
Roles in Evaluation: The Playing Field • Development team: director, staff, funder • Evaluation team: director, staff, funder • Resource users and their clients; peers, relations of clients • Development director's superiors • Public interest groups and professional societies • Those who use similar resources
Negotiation and Contract • Identify the primary audience(s) and interact with them • Set general goals and purposes of the study • Identify, in general, the methods to be used • Identify permissions, accesses, confidentiality issues and other key administrative aspects of the study • Describe the result reporting process • Reflect this in a written agreement
Questions • Specific questions derived from the general goals • Maximum 5-10 • They do not have to be stated as hypotheses • Depending on the methods used, the questions can change over the course of the study
Investigation • Choose data collection methods • There are two major families of investigational approaches: objectivist and subjectivist • Although some studies use both families, typically you will choose one or the other
Report • Process of communicating findings: reporting is often done in stages • It doesn't have to be a written document exclusively • Targeted at the audience(s), in language they can understand • The report must conform to the evaluation agreement
Subjectivistic vs. objectivistic • Subjectivistic (interpretative, explorative) approaches: generate hypotheses; open and broad; detect relationships; inductive; focus on qualitative methods • Objectivistic (positivistic, explanative) approaches: test hypotheses; focused and exact; prove relationships; deductive; focus on quantitative methods
Group work • Judge abstracts 1 and 2: • Subjectivist or objectivist study?
Outline • Significance of evaluation • Process of evaluation • Evaluation questions • Methods – study design, data collection • Triangulation
Evaluation and IT life cycle • Information needs • Resource impact: effects, acceptance, costs, benefits, … • Software engineering: verification, validation, usability, stability, …
Group work • What kind of aspects do you want to evaluate if your hospital implements a nursing documentation system or a DSS application? • Physician • Nurse • Manager • Developer
Measures for IT evaluation studies 1. Static IT attributes (hardware and software quality); static user attributes (computer knowledge) 2. Quality of interaction between IT and user (e.g. usage patterns, user satisfaction, data quality) 3. Effects of IT on process quality of care (efficiency, appropriateness, organisational aspects) 4. Effects of IT on outcome quality of care (quality of care, costs of care, patient satisfaction)
Evaluated aspects in evaluation studies 1982-2002 (n = 983) Ammenwerth, de Keizer (2004)
Why formulate questions? • Crystallize thinking of evaluators and key "stakeholders" • There is a need to focus and prioritize • It converts broad aims into specific questions that can potentially be answered
Further benefits of identifying questions • Stakeholders can see where their concerns are being addressed • The choice of methods follows from the question • A list discourages evaluators from focusing only on questions amenable to their preferred methods
Evaluation question: Recommendations • Evaluation is part of the whole IT life cycle • Any evaluation must have a clear evaluation question (there is no "global" or "absolute" evaluation). • The evaluation questions should be decided by the stakeholders. Take the time to elaborate clear and agreed evaluation questions!
Outline • Significance of evaluation • Process of evaluation • Evaluation questions • Methods – data collection & study design • Triangulation
Evaluation question → Evaluation methods → Data → Answer to evaluation question
Evaluation generates data to answer questions • 85% of care plans are incomplete. • Mean time for data entry: 3.5 min. • "It does not work!" • Which types of data are represented? • "I like it" • Costs of 3,500 Euro per workstation. • Nurse tries to enter password several times. • 4 medication errors per day. • Physician curses at the computer. • Mean user satisfaction: 1.9 (of 5 max.)
Data • Quantitative data: numbers (count, measure, weigh, …). Positive attributes: generates exact results, easy to work with, easier to aggregate • Qualitative data: text, videos, … (describe, observe, …). Positive attributes: rich in content, needs less standardization, needs no large numbers
Quantitative vs. qualitative methods in evaluation studies 1982- 2002 (n = 983) Ammenwerth, de Keizer (2004)
Example 1a: Does a documentation system reduce time? Introduction of nursing documentation system → reduces time effort for documentation. Hypothesis: causal relationship. Quantitative RCT study
Example 1b: What are the effects of a documentation system? Reduce time effort for documentation • Improve quality of documentation • Improve communication in team • Improve transparency of nursing care • Improve IT skills • Improve communication with physicians • Improve quality of nursing care. Qualitative ethnographic study
Example 2a: Which factors determine user satisfaction with a nursing documentation system? Attitude towards nursing process Performance and Stability of the system Age of nurse Computer experience (in years) Quality of support Attitude towards computers in nursing Quality of training Qualitative interview study
Example 2b: Does age or quality of training determine user satisfaction with a nursing documentation system? Quality of training → user satisfaction (hypothesis: relationship). Age of nurse → user satisfaction (hypothesis: relationship). Design: e.g. RCT, observational. Quantitative study
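A first quantitative look at a hypothesized relationship like "quality of training → user satisfaction" is a correlation coefficient. A minimal sketch with purely illustrative numbers (the variable names and data are hypothetical, not from any study):

```python
import math
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical survey data: hours of training per nurse
# vs. satisfaction score (1-5 scale)
training = [2, 4, 1, 6, 3, 5, 2, 7]
satisfaction = [2.5, 3.5, 2.0, 4.5, 3.0, 4.0, 3.0, 4.5]
print(f"r = {pearson_r(training, satisfaction):.2f}")
```

A correlation alone does not establish the causal direction; that is why the slide distinguishes the hypothesis (relationship) from the design (RCT vs. observational).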
Kinds of evaluation study • Evaluation studies: objectivist studies, subjectivist studies
Kinds of evaluation study • Evaluation studies: objectivist studies, subjectivist studies • Objectivist studies: measurement studies (reliability studies, validity studies) and demonstration studies (descriptive studies, correlational studies, comparative studies)
Descriptive studies Aim: to describe something Example: how often do doctors use a CPOE? Methods: survey, log file, observation, case note audit… Variables: single variable of interest – the “dependent” variable (usage rate) Analysis: simple descriptive statistics – mean & SD; median & inter-quartile range…
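The simple descriptive statistics the slide lists can be computed directly from log-file data. A minimal sketch using Python's standard library; the usage numbers are hypothetical:

```python
import statistics

# Hypothetical log-file data: CPOE orders entered per doctor per week
usage = [12, 0, 7, 25, 3, 14, 9, 1, 18, 6]

mean = statistics.fmean(usage)
sd = statistics.stdev(usage)                  # sample standard deviation
median = statistics.median(usage)
q1, _, q3 = statistics.quantiles(usage, n=4)  # quartiles -> inter-quartile range
print(f"mean={mean:.1f} SD={sd:.1f} median={median} IQR={q1}-{q3}")
```

Reporting both mean/SD and median/IQR is useful because usage data like this is often skewed (a few heavy users, many occasional ones).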
Correlational studies Aim: to correlate something with something else Example: is CPOE use associated with fewer calls from pharmacy to the department? Methods: survey, log file, observation, case note audit… Variables: "dependent" variable (calls) + independent variables (usage rate, age…) Analysis: univariate or multivariate regression
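A univariate regression of the kind described above can be sketched in a few lines. The data below is hypothetical (illustrative CPOE usage rates vs. pharmacy calls, not from any study):

```python
def linreg(xs, ys):
    """Ordinary least-squares fit: y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical: weekly CPOE usage rate (%) vs. pharmacy calls per week
usage = [10, 30, 50, 70, 90]
calls = [42, 36, 30, 22, 15]
slope, intercept = linreg(usage, calls)
print(f"calls = {slope:.2f} * usage + {intercept:.2f}")
```

A negative slope would support the hypothesized association; a multivariate regression would additionally adjust for independent variables such as department size or staff age.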