
  1. Guidelines for Good Evaluation Practices in Health Informatics - A shared networked initiative of EFMI and IMIA. Coordinators: Pirkko Nykänen, Tampere University; Jytte Brender, Aalborg University

  2. Motivation • Need for good practice guidance for evaluation • No single, global methodology exists • Good practice guidelines should serve as: • A framework to design evaluation studies, select methodologies and conduct studies • Support for health professionals and users to understand evaluation and contribute to evaluation studies

  3. Objectives • Generic practical guidelines that provide evaluators, users and health professionals with a structured, scientifically justified, comprehensive and understandable set of rules for good practice • To design and carry out evaluation studies in the health informatics domain

  4. Objectives – practically • Guidelines: • Criteria and aspects, in both quantitative and qualitative terms, to consider at each evaluation stage • How to carry out evaluation of a specific aspect or criterion at a specific stage • How to design and manage the evaluation study • Guidelines from all stakeholders' viewpoints: third-party evaluators, users, health professionals, managers, decision makers, health economists, ...

  5. How the GEP-HI guidelines have been developed • Existing guideline development principles • Critical study, assessment and review of existing literature and material on evaluation studies, methodologies, reported evaluation experiences, guidelines for good clinical practice, codes of ethics and good implementation practices • Spiral, consensus-making approach: • Coordinators - editorial team - external experts via the HISEVAL website / mailing list • Revised versions - comments - input - revisions ... • Presentations / workshops at MIE and MEDINFO conferences

  6. First GEP-HI guidelines flowchart (C Weßel, 2006-12-06, suggestion for a flowchart): study exploration, first study design, selection of methods, study design (including goal and mandate), study implementation and publication run under project controlling and risk management; the team works through iterative discussion and consensus, and the buyer and participants receive internal reports, interim reports and the final study report.

  7. GEP-HI guidelines flowchart: study exploration, first study design, operationalisation of methods, detailed planning, evaluation implementation and finalisation of the evaluation study, all under continuous project control and risk management; the funder/client, project management, the evaluation consultant, the evaluation team and the object organisation interact through the information need, the agreement, the design outline, the methods and the plans & procedures.

  8. GEP-HI guidelines • Study exploration: the starting question • First study design: preliminary design • Operationalisation of methods: methodological approach, methodical aspects • Detailed study plan: detailed planning • Evaluation study implementation: actual accomplishment of the planned evaluation study • Project control and risk management: good project management practices • Finalisation of the evaluation study: accounting, archiving and reporting of evaluation study results

  9. Status now and progress • The guidelines paper has been accessible on the iig.umit.at/efmi website since 2006 • Comments and input have been received • The final revision is ongoing • The final version will be available by the end of 2009 • The comments and input received will be incorporated and the guidelines finalised for publication

  10. Contact • Feedback and input: Pirkko.Nykanen@uta.fi, Jytte.Brender@v-chi.dk • Paper accessible at: iig.umit.at/efmi • Editorial team: Elske Ammenwerth, Nicolette de Keizer, Jan Talmon, Michael Rigby, Jytte Brender, Pirkko Nykänen

  11. A walk-through of the key guidelines issues. Pirkko Nykänen, Tampere University; Jytte Brender, Aalborg University

  12. GEP-HI guidelines flowchart (repeat of slide 7): the six phases from study exploration to finalisation of the evaluation study, run under continuous project control and risk management by the funder/client, project management, the evaluation consultant, the evaluation team and the object organisation.

  13. Guidelines summary • Study exploration • First study design • Operationalisation of methods • Detailed study plan and project plan • Evaluation study implementation • Project control and risk management (5.5) • Finalisation of the evaluation study
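The phases above form a gated sequence: each phase ends with a formal acceptance before the next one starts (see item 1.13 below), while project control and risk management run alongside. As a purely illustrative sketch, the phase order could be modelled as a small workflow object; the names (EvaluationStudy, approve_phase, ...) are hypothetical and not defined by GEP-HI.

```python
# Illustrative sketch only: GEP-HI phases modelled as a gated, sequential workflow.
# All identifiers here are hypothetical, not prescribed by the guidelines.
from dataclasses import dataclass, field

GEP_HI_PHASES = [
    "Study exploration",
    "First study design",
    "Operationalisation of methods",
    "Detailed study plan and project plan",
    "Evaluation study implementation",
    "Finalisation of the evaluation study",
]

@dataclass
class EvaluationStudy:
    name: str
    approved: set = field(default_factory=set)  # phases with a formal accept to proceed

    def approve_phase(self, phase: str) -> None:
        """Record the formal acceptance that closes a phase (cf. item 1.13)."""
        idx = GEP_HI_PHASES.index(phase)
        # A phase can only be closed if every earlier phase has already been approved.
        if any(p not in self.approved for p in GEP_HI_PHASES[:idx]):
            raise ValueError(f"Cannot approve '{phase}': an earlier phase is still open.")
        self.approved.add(phase)

    def current_phase(self) -> str:
        """Project control and risk management accompany whichever phase is open."""
        for p in GEP_HI_PHASES:
            if p not in self.approved:
                return p
        return "Completed"

study = EvaluationStudy("Example evaluation study")
study.approve_phase("Study exploration")
print(study.current_phase())  # -> First study design
```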

  14. Phase 1: Study exploration - focuses on the starting question of the evaluation study
1.1 The information need
1.2 Primary audience
1.3 Identification of the buyer / sponsor / study funding party
1.4 The context of the evaluation study
1.5 A first identification of stakeholders
1.6 A first identification of (external) consultants
1.7 A first sketch of the setting
1.8 First exploration of the evaluation methods to be used
1.9 Exploring the restrictions on study execution and publication
1.10 Budget
1.11 Ethical, moral and legal issues
1.12 Result of Study Exploration = an outline, draft design of the study
1.13 Formal accept to proceed to the next phase

  15. Phase 2: First study design - focuses on the preliminary design of the evaluation study
2.1 Elaboration of the rationale for the study
2.2 Key evaluation issues/questions
2.3 Budget
2.4 Establishment of the design team
2.5 Stakeholder analysis / social network analysis
2.6 Study constraints
2.7 Methods
2.8 Organisational setting, the study context
2.9 Technical setting, the study context
2.10 Participants from the organisational setting
2.11 Material and immaterial resources
2.12 Time and timing
2.13 Risk analysis
2.14 Ethical, moral and legal issues
2.15 Strategy for reporting and dissemination of results
2.16 Result of First Study Design = a more detailed evaluation study plan

  16. Phase 3: Operationalisation of methods - focuses on making the design and evaluation methods concrete and compliant with the organisational setting and the information need, while taking into account the known pitfalls and perils
3.1 Study type
3.2 Approach
3.3 Assumptions
3.4 Pitfalls and perils
3.5 Skills
3.6 Frame of reference
3.7 Timing
3.8 Justification
3.9 Usability studies
3.10 Outcome measures
3.11 Quality control on data (measures)
3.12 Participants
3.13 Study flow
3.14 Result of Operationalisation of Methods = decisions on the methodological and methodical approach
3.15 Ethical, moral and legal issues = required approvals and ethics committee permissions achievable and prepared

  17. Phase 4: Detailed study plan and project plan - focuses on providing plans, prescriptions and procedures detailed to the level necessary for the specific study
4.1 Project management
4.2 Evaluation activity mapping
4.3 Quality management
4.4 Risk management
4.5 Communication strategy
4.6 Recruitment of necessary additional staff
4.7 Result of Detailed Study Plan and Project Plan = practical plans with incorporated tools; upon approval of the plans the evaluation study can now be started
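Item 4.2, evaluation activity mapping, can be thought of as a table that links each key evaluation question (item 2.2) to the chosen method, its outcome measures (item 3.10), the responsible team member and its place in the project plan. The sketch below is a minimal, hypothetical illustration of such a mapping; the field names and the example entry are assumptions, not part of GEP-HI.

```python
# Hypothetical sketch of an evaluation activity mapping (cf. item 4.2).
# Field names and the example row are illustrative only.
from dataclasses import dataclass

@dataclass
class EvaluationActivity:
    question: str            # key evaluation question (phase 2, item 2.2)
    method: str              # method chosen during operationalisation (phase 3)
    outcome_measures: list   # what will actually be measured (item 3.10)
    responsible: str         # team member accountable for this activity
    planned_period: str      # timing taken from the detailed project plan

activities = [
    EvaluationActivity(
        question="Does the system reduce documentation time?",
        method="Before/after time-motion study",
        outcome_measures=["minutes per patient record"],
        responsible="Evaluation consultant",
        planned_period="Months 3-5",
    ),
]

for a in activities:
    print(f"{a.question} -> {a.method} ({', '.join(a.outcome_measures)}); "
          f"owner: {a.responsible}; {a.planned_period}")
```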

  18. Phase 5: Evaluation study implementation - focuses on activities related to the actual accomplishment of the designed evaluation study
5.1 Establishment of the frame of reference
5.2 Observation of changes
5.3 Quality control of findings
5.4 Interpretation of observations
5.5 Continuous project management, quality management and risk management
5.6 Regular reports
5.7 Final result of Evaluation Study Implementation = the final evaluation report summarising the study plan and results, and presenting conclusions

  19. Phase 5.5: Project controlling and risk management - focuses on good project management practices and risk management specifically for an evaluation study • Dedicated project management tools • Quality management in relation to the objectives and milestones of the study -> develop a quality management plan • Risk management: • prospectively during planning • retrospectively as risks are realised
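One lightweight way to support both the prospective and the retrospective side of risk management mentioned above is a simple risk register that is drafted during planning and updated as risks do or do not materialise. The sketch below is an assumption-laden illustration only; GEP-HI does not mandate any particular tool or field set, and the example risk is invented.

```python
# Illustrative risk register: prospective entries during planning,
# retrospective status updates once the study is under way.
# Structure and field names are assumptions, not part of GEP-HI.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Risk:
    description: str
    likelihood: str                  # e.g. "low" / "medium" / "high"
    impact: str                      # e.g. "low" / "medium" / "high"
    mitigation: str
    realised: Optional[bool] = None  # None = not yet known (prospective view)
    lesson_learned: str = ""         # filled in retrospectively

register = [
    Risk("Key clinical users unavailable for interviews",
         likelihood="medium", impact="high",
         mitigation="Agree interview slots with department heads in the project plan"),
]

# Retrospective update once the study is running:
register[0].realised = False
register[0].lesson_learned = "Early scheduling with department heads was sufficient."

open_risks = [r for r in register if r.realised is None]
print(f"{len(open_risks)} risk(s) still open")
```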

  20. Phase 6: Finalisation of the evaluation study - focuses on accounting, archiving of materials, and reporting of the evaluation study in terms of the STARE-HI guidelines
6.1 Accounting
6.2 Reports and publications
6.3 Archiving
6.4 Reporting guidelines
6.5 Reporting scope
6.6 The reporting message
6.7 Authorship
6.8 Ethical and moral aspects
6.9 Preparation of reports / publications - follow the STARE-HI guidelines (Statement on reporting of evaluation studies in health informatics, IJMI 78, 2009, 1-9)
6.10 Usability documentation in the context of EU regulations and standards

  21. Generalisability of the guidelines • Designed for all types of evaluation studies; also applicable to the planning of usability evaluation studies • The guidelines are for: • planning an evaluation study • carrying out an evaluation study • The guidelines are NOT for: • choosing which methods to use • how to use the methods • Handbooks are available for that, see e.g. Brender 2006
