Soft Testing at Mayo Clinic
Mark Welter, Software Testing Manager, Department of Information Technology


Presentation Transcript


  1. Soft Testing at Mayo Clinic Mark Welter, Software Testing Manager, Department of Information Technology

  2. Mayo Clinic • 4,729 physicians and scientists and 58,405 allied health staff • Campuses in Rochester, MN; Jacksonville, FL; and Phoenix/Scottsdale, AZ • Serve more than 60 communities through Mayo Clinic Health System • Extends knowledge and expertise to physicians and providers through Mayo Clinic Care Network • Collectively, care for more than 1.3 million patients each year

  3. Department of Information Technology • Staff of 1700+ • Serves all Mayo Clinic entities at all locations • 57,000+ workstations • 7800+ servers

  4. Division of Laboratory Pathology and Extramural Applications (LPEA) – Software Quality Assurance • SQA Staff of 51 • 3 Software Quality Analysts • 14 Software Testers • 7 Laboratory System Interface Testers • 5 Test Automation Developers • 22 Managed Service Testers • Supports all Department of Laboratory Medicine and Pathology applications (215+) • Also leads and supports testing in other departments through the Mayo IT Testing Center of Excellence

  5. Mayo / SCC Development Partnership • Current model • Implement multiple releases a year • Have averaged over 100 code deliveries into our Soft environments per year for the past 4 years • SCR / customization intensive • Work with Soft as they have developed new modules • Manage 10 Soft environments at Mayo and now have one environment at Soft • New model • Transition to a maintenance client • Two planned GA releases a year • Limited SCRs/customizations

  6. Verification vs. Validation • At Mayo we break testing down into two basic concepts: • Verification: Did we build the system correctly? (per defined specifications) • Validation: Did we build the correct system? (per user expectations)

  7. Terminology: Verification / Validation • Verification: Are we building the product right? Does the software meet specifications… • Validation: Are we building the right product? Can the staff use the system as built…
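
A minimal sketch of the distinction in Python (the Order fields, the turnaround "spec," and all names are invented for illustration): verification asserts the code against a written specification, while validation would instead walk lab staff through a real workflow on the built system.

```python
import datetime
from dataclasses import dataclass

@dataclass
class Order:
    received_at: datetime.datetime
    resulted_at: datetime.datetime
    turnaround_minutes: float

def verify_turnaround(order: Order) -> None:
    # Verification: does the build meet the defined specification?
    # Assumed spec: turnaround = minutes between specimen receipt and result.
    expected = (order.resulted_at - order.received_at).total_seconds() / 60
    assert order.turnaround_minutes == expected

o = Order(
    received_at=datetime.datetime(2024, 1, 1, 8, 0),
    resulted_at=datetime.datetime(2024, 1, 1, 8, 45),
    turnaround_minutes=45.0,
)
verify_turnaround(o)  # passes: built to spec -- but this says nothing about
                      # usability, which validation (e.g. UAT) has to answer
```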

  8. Types of Testing

  9. Multi-faceted Approach • Exercise the functions (business rules & triggers) • Simulate interfaces [data transfers] (modules, systems, instruments) • Simulate traffic loads • Compare table settings • Review deliverables (patient reports, management reports, labels) • Observe user / system interactions • Evaluating health: software V&V activities, combined with training & competency testing, together assist us in achieving system confidence.

  10. Risk Based Testing It’s naïve to think that you can test everything. • There is not just one path for generating results • Different permutations, instruments, sites, EMRs, etc. • There is more than one way that a test can be resulted • Each of these variations evokes different flows and user processes • Development / set-up is done simultaneously (code, test definition, instruments, interfaces, process) • At Mayo, no application is an island – there are coordinated changes that must occur across applications • Both system and human behavior are altered with volume • Change Control: knowledge of changes and impact analysis is crucial
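
One way to make that concrete is a simple risk score per candidate test (likelihood of failure times impact) and testing from the top of the list down. The weights and items below are illustrative, not Mayo's actual model.

```python
from dataclasses import dataclass

@dataclass
class TestCandidate:
    name: str
    failure_likelihood: int  # 1 (stable, unchanged) .. 5 (new/complex code path)
    impact: int              # 1 (cosmetic) .. 5 (affects patient results)

    @property
    def risk(self) -> int:
        return self.failure_likelihood * self.impact

candidates = [
    TestCandidate("instrument interface result flow", 4, 5),
    TestCandidate("patient report rendering", 3, 5),
    TestCandidate("label reprint dialog", 2, 2),
]

# Execute the riskiest tests first; cut from the bottom when time runs out.
for c in sorted(candidates, key=lambda c: c.risk, reverse=True):
    print(f"{c.risk:2d}  {c.name}")
```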

  11. User Acceptance Testing (UAT) • Why do User Acceptance Testing? • UAT captures whether the product/system meets the end user’s needs prior to putting new code into production • End users find different defects than testers do • History has proven that failure to perform UAT frequently results in significant and often urgent issues in production • Who performs UAT? • Information Management Techs in each lab • Super Users from each lab

  12. User Acceptance Testing (UAT) – When? • When is UAT performed? • Before go-live • In the test environment, on the new code version • Coordinated with software and interface verification • Coordinated with new test and instrument verification • UAT needs to be as ‘real’ as possible to catch as many issues as possible prior to go-live • Tools • All labs document their UAT in a Test Management Tool • Allows for standardization, monitoring of progress, trending, logging of issues, status reporting

  13. User Acceptance Testing (UAT) - Workflows • Workflows • End users exercise their pre-analytical, analytical, and post-analytical workflows for UAT • Includes connected instruments • Focus on the core analytic processes first to ensure that those workflows still work as expected with the new code version • Choose permutations to exercise for your high-volume and/or complex analytic workflows
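
One simple way to choose those permutations is by production order volume, so the combinations of test code, instrument, and site that labs run most are exercised first. The data below is invented for illustration.

```python
from collections import Counter

# Hypothetical (test_code, instrument, site) tuples from recent production orders.
recent_orders = [
    ("CBC", "heme-analyzer-1", "Rochester"),
    ("CBC", "heme-analyzer-1", "Rochester"),
    ("CBC", "heme-analyzer-2", "Jacksonville"),
    ("GLU", "chem-analyzer-1", "Phoenix"),
    ("PT",  "coag-analyzer-1", "Rochester"),
]

volume = Counter(recent_orders)

# Exercise the highest-volume permutations first during UAT.
for permutation, count in volume.most_common(3):
    print(count, permutation)
```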

  14. Testing Process

  15. Prepare Test Plan • A Test Plan document describes the scope, approach, resources and schedule of intended testing activities. • Test Scope • Test Approach • Project Risks & Assumptions • Test Environment • Test Controls • Planned Test Suites / Test Cases • Development / Test Procedures • Test Plan Approvals
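
Since the plan's sections are fixed, a small completeness check can gate the plan before it goes out for approval. A sketch, with section keys assumed from the list above:

```python
REQUIRED_SECTIONS = [
    "test_scope", "test_approach", "project_risks_and_assumptions",
    "test_environment", "test_controls", "planned_test_suites_and_cases",
    "development_and_test_procedures", "test_plan_approvals",
]

def missing_sections(plan: dict) -> list:
    """Return required test-plan sections that are absent or still empty."""
    return [s for s in REQUIRED_SECTIONS if not plan.get(s)]

draft = {
    "test_scope": "New code version, core analytic workflows",
    "test_approach": "Risk-based; see risk assessment",
}
print(missing_sections(draft))  # sections still to be written before approval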

  16. Test Case Development • A Test Case is a set of execution preconditions (setup), inputs and expected outcomes developed for a particular objective which traces back to the requirement(s) • Contains the list of actions and expected results to verify if a function is working as designed
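
A minimal sketch of that structure as a record type (field names and example values are illustrative); the requirement IDs carry the traceability back to the requirement(s).

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str
    objective: str
    requirement_ids: list  # traces back to the requirement(s)
    preconditions: list    # execution setup
    steps: list            # (action, expected result) pairs

tc = TestCase(
    case_id="TC-0042",
    objective="STAT priority is flagged on the worklist",
    requirement_ids=["REQ-117"],
    preconditions=["Test environment loaded with the new code version"],
    steps=[("Place an order with STAT priority",
            "Order appears on the worklist flagged STAT")],
)
```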

  17. Perform Software Testing • The records kept during test execution are vital for supporting system V&V • Test records must be sufficient to allow for test reconstruction and must identify: • Tester • Test date • Test environment • Test steps • Test data exercised • Acceptance criteria • Test results (pass/fail) • Defect/bug disposition
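
A record type capturing exactly those fields keeps every run reconstructable; a sketch with illustrative names:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TestExecutionRecord:
    tester: str
    test_date: date
    environment: str
    steps_executed: list      # test steps as actually run
    test_data: dict           # test data exercised
    acceptance_criteria: str
    result: str               # "pass" or "fail"
    defect_ids: list          # defect/bug disposition
```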

  18. Create Test Summary Report • The Test Summary Report summarizes the V&V activity and is the basis for submission for approval to implement • The report includes: • Summary statement of the testing outcomes • Any testing variances from the plan • Traceability Matrix • All open release defects (includes deficiencies that have been accepted "as is" or with a workaround) • Test Metrics
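
Given execution records and test cases shaped like the sketches above, the report's figures fall out of two small rollups (illustrative code; a requirement missing from the matrix is a coverage gap):

```python
from collections import defaultdict

def summarize(records):
    """Pass/fail counts and open defects for the Test Summary Report."""
    passed = sum(1 for r in records if r.result == "pass")
    open_defects = sorted({d for r in records for d in r.defect_ids})
    return {
        "executed": len(records),
        "passed": passed,
        "failed": len(records) - passed,
        "open_defects": open_defects,
    }

def traceability_matrix(test_cases):
    """Map each requirement ID to the test cases that cover it."""
    matrix = defaultdict(list)
    for tc in test_cases:
        for req in tc.requirement_ids:
            matrix[req].append(tc.case_id)
    return dict(matrix)
```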

  19. Release Readiness Review • Flow: Test Plan and Test Summary → Record Center review & approval (in TFS) → Release Readiness Review / Release Readiness Board (confirm content & obtain stakeholder approvals) → Implementation; high-risk changes also go through the Change Approval Board (CAB) before implementation • Participants: Release Manager, Development Representative, Test Representative, Deployment Manager, Business Representative, Support Representative • Support structure: define Release Managers and applications; Release Managers are defined as an RRB

  20. Takeaways: How might this apply to you? • Evaluate minimum coverage based on Risk • Risk based testing is the real answer • Not enough resources or time to test “everything” • Areas to consider: • File build verification (Mandatory) • Regulatory Compliance Validation (Patient reports, RD, Billing) (Mandatory) • End-to-End (Strongly recommended) • User Acceptance Testing (Strongly recommended)

  21. Questions?

  22. Testing Activities Performed by SCC Soft Computer • Kenton Smith, Director, Quality Management, SCC Soft Computer

  23. FDA Establishment Registration

  24. FDA Website Listing (FURLS) of SCC Medical Devices

  25. FDA – Verification vs. Validation • From 21 CFR 820.30(f) & (g): • Design verification. Each manufacturer shall establish and maintain procedures for verifying the device design. Design verification shall confirm that the design output meets the design input requirements… • Design validation. … Design validation shall ensure that devices conform to defined user needs and intended uses and shall include testing of production units under actual or simulated use conditions…

  26. V & V at SCC • Verification: • Primarily performed by product development • Meets functional requirements • Focused on code • Validation: • Performed independently of product development • Meets end user requirements • Focused on functionality • Both are included in the test plan • Both are required for release of software

  27. Where is Validation Performed? • By SCC’s QC Team – Current Model • Team composed of staff with both lab and I.T. backgrounds • Test cases derived from verification test cases (redundant?) • Little involvement early in development • Compliant from a regulatory perspective • As effective as they could be?

  28. Reset of SCC QC Team • Goals: • Transition from QC Team to Customer Validation Team • Advocates for customers • Involved early in development • Risk based approach • More robust regression testing • Create unique test cases based upon best practice workflow

  29. Reset of SCC QC Team • Goals: • Learn from post live feedback • Gather data from post live tasks • Analyze data to see if we had the opportunity for prevention

  30. Reset of SCC QC Team • Goals: • More closely mimic end user hardware • Meet or exceed hardware specifications • Test using thick and thin clients • Allows for better performance testing

  31. Challenges • What is best practice workflow? • Different configurations • Thousands of hosparams • Numerous different workflows • Millions of lines of code Conclusion: IT’S NOT POSSIBLE TO TEST EVERY FACET OF SOFTWARE

  32. Does the FDA have any ideas? • From General Principles of Software Validation; Final Guidance for Industry and FDA Staff (2002): • “…the complexity of most software prevents it from being exhaustively tested. Software testing is a necessary activity. However, in most cases software testing by itself is not sufficient to establish confidence that the software is fit for its intended use…”

  33. Then What Do We Do? • Back to the FDA Guidance Document: • “…In order to establish that confidence, software developers should use a mixture of methods and techniques to prevent software errors and to detect software errors that do occur…”

  34. Dream! • Mission: To develop a culture of defect prevention throughout the whole software development life cycle • Vision: SCC, ISD and STS will avoid defects in SCC software to deliver the best product possible, creating user and customer loyalty • Values: • Forethought • Client Focus • Analytical thinking • Absence of bias • Influence throughout the Business

  35. Dream! • Defect Regression Avoidance Management • Phase I: Retrospective Review – Complete • Heavy Data Mining • Identify Root Causes of Released Defects • Phase II: Define Corrective Actions – In Progress • Execute the corrective actions to resolve the issue(s) and to prevent recurrence

  36. End User Validation • Test Plan • Same hardware, configuration, and connectivity • Include performance testing • Validate to intended use

  37. Questions?

  38. References • U.S. Food and Drug Administration (January 11, 2002). General Principles of Software Validation; Final Guidance for Industry and FDA Staff. Retrieved from https://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/ucm085371.pdf • 21 CFR 820

  39. Kenton Smith • kentons@softcomputer.com • 727-789-0100 Ext. 4901
