11:45 – 12:30 • From IHE Profiles to conformance testing, closing the implementation gap • Helping the implementers, testing tools, connectathons • 12:30 – 13:30 Lunch Break • 13:30 – 15:00 • How to use IHE resources: hands-on experience • Technical Frameworks: navigating, Q&A • Test tools: finding, using, configuring • Participating in the testing process
IHE Resources • Eric Poiseau, INRIA, IHE Europe technical manager • Charles Parisot, GE, IHE Europe
Connectathon • Started in 1998 in Chicago at the RSNA headquarters • Europe started in 2001 • Japan in 2003 • China and Australia are now also in the process
Charenton-le-Pont 2001 • 11 companies • 18 systems • 40 m² • 30 participants
Paris 2002 • 33 companies • 57 systems • 130 m² • 100 participants
Aachen 2003 • 43 companies • 74 systems • 350 m² • 135 participants
Padova 2004 • 46 companies • 78 systems • 600 m² • 180 participants
Noordwijkerhout 2005 • 75 companies • 99 systems • 800 m² • 250 participants
Barcelona 2006 • 67 companies • 117 systems • 1500 m² • +250 participants
Berlin 2007 • Companies • systems • 1500 m² • +300 participants
Oxford 2008 • 83 companies • 112 systems • 1500 m² • 300 participants
C.A.T Participation in Europe (chart of participation growth across the successive venues: Paris, Paris, Aachen, Padova, Noordwijkerhout, Barcelona, Oxford, Berlin)
Purpose • Test the implementation of the integration profiles within products • Verify that the vendors did a good job • Verify that what the committees invented makes sense! • Verify that the text is clear enough • Verify that the committee did not miss anything • Build a community of …
From the vendor perspective • Unique opportunity for vendors to test their implementations of the IHE integration profiles • Controlled environment • Customer is not present! • Not a clinical production environment • Specialists available • From SDOs • From peer companies • Bugs are identified and most of the time fixed! • Connectathon Result Matrix • http://sumo.irisa.fr/con_result
But… • Testing is sub-optimal • Only a subset of all possible tests is performed • A system successful at the connectathon is not guaranteed to be error-free! • We do not do certification!
From the IHE perspective • Feedback from the vendor community • Did the committee do a good job? • Did the developed integration profile respond to a demand from the vendors?
European C.A.T • We have now reached our cruising speed • NA and EU C.A.T are very alike • C.A.T used as an IHE promotion tool • Workshops in parallel with the C.A.T • Berlin: ITEG • Oxford • Vienna
The IHE testing process (diagram): IHE Technical Framework (profile specifications) → vendors implement profile actors → in-house testing with the testing tools developed by the sponsors' project management team → Connectathon, where the project management team approves the test logs → product + integration statement → demonstrations at sponsor exhibits and deployed systems → users and testing results
Pre-connectathon • Registration • See what can be tested • Exchange of configuration parameters • IP addresses • AE Title • Assigning authorities • OID • Certificates • Affinity domain specification
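The registration data is exchanged through the connectathon management tools; the record below is only a sketch of what such a configuration entry might contain, with hypothetical field names and values.

```python
# Hypothetical example of the configuration parameters a participant
# might publish before the connectathon; field names are illustrative.
from dataclasses import dataclass


@dataclass
class SystemConfiguration:
    """Network and identity parameters shared with testing peers."""
    system_name: str
    ip_address: str
    dicom_ae_title: str            # DICOM Application Entity title
    assigning_authority_oid: str   # OID used in patient/order identifiers
    tls_certificate_file: str      # certificate for secure connections
    affinity_domain: str = ""      # XDS affinity domain, if applicable


# Example record for a fictitious archive system
config = SystemConfiguration(
    system_name="ACME_PACS",
    ip_address="10.0.12.34",
    dicom_ae_title="ACME_ARCHIVE",
    assigning_authority_oid="1.2.3.4.5.6.7",
    tls_certificate_file="acme_pacs.pem",
    affinity_domain="CONNECTATHON_2009",
)
print(config)
```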
Pre-connectathon • MESA testing • In-house testing for vendors to get ready • Vendors return logs • Upon log return, participation in the C.A.T is accepted
Connectathon Testing • 3 types of tests to be performed • No peer tests • Peer to peer tests • Workflow tests
No Peer Tests • Calibration tests (CPI): • Screen calibration • Printer calibration • Scrutiny tests • Verify that the objects created are « valid » • Provide peers with samples
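To make the scrutiny idea concrete, here is a minimal sketch of an object check, assuming the pydicom library is available; the file name and the list of required attributes are illustrative, not the actual connectathon scrutiny criteria.

```python
# Minimal sketch of a scrutiny-style check: verify that a DICOM object
# created by a system carries a few basic attributes. Assumes pydicom is
# installed; the attribute list is illustrative, not the real criteria.
import pydicom

REQUIRED_ATTRIBUTES = [
    "SOPClassUID",
    "SOPInstanceUID",
    "StudyInstanceUID",
    "PatientID",
]


def scrutinize(path: str) -> list[str]:
    """Return the required attributes missing from the object."""
    dataset = pydicom.dcmread(path)
    return [name for name in REQUIRED_ATTRIBUTES if name not in dataset]


if __name__ == "__main__":
    missing = scrutinize("sample_object.dcm")  # hypothetical sample file
    if missing:
        print("Object is not valid, missing:", ", ".join(missing))
    else:
        print("Object carries all required attributes")
```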
Peer To Peer Tests (P2P) • Test subsections of a workflow between 2 vendors • Preparation for the workflow tests • Vendors choose when to run them • Vendors select their peers • Not to be run with other systems from the same company
Workflow Tests • Test an entire workflow that may combine more than one integration profile • We have a schedule; vendors need to be ready at the time of the test • We have a list of difficulties to check • Some tests can run in 15 minutes • Some will require more than an hour • No second-chance tests
5 days • Monday morning till 11 am: set-up time • Till Friday noon: free peer to peer and no peer testing • From Wednesday till Friday noon: directed workflow testing
Monitors • Volunteers • Independent from vendors • Standards specialists • Verify tests • Act as moderators between vendors
Results • Failures are not reported • To be successful, each peer to peer test needs to be verified with at least 3 peers • There are some exceptions • A vendor may fail for one actor but pass for the others
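A rough sketch of that "verified with at least 3 peers" rule, with an invented result layout (the real connectathon result matrix and its exception handling are not reproduced here):

```python
# Sketch of the "verified with at least 3 distinct peers" criterion.
# The result records and the pass threshold handling are illustrative
# assumptions, not the actual connectathon rules engine.
from collections import defaultdict

# (system, actor, profile, test_id, peer_system) for each verified test run
verified_runs = [
    ("ACME_PACS", "Image Manager", "SWF", "TEST_101", "BETA_RIS"),
    ("ACME_PACS", "Image Manager", "SWF", "TEST_101", "GAMMA_MOD"),
    ("ACME_PACS", "Image Manager", "SWF", "TEST_101", "DELTA_RIS"),
    ("ACME_PACS", "Image Archive", "SWF", "TEST_205", "BETA_RIS"),
]

MIN_PEERS = 3

peers = defaultdict(set)
for system, actor, profile, test_id, peer in verified_runs:
    peers[(system, actor, profile, test_id)].add(peer)

for key, peer_set in peers.items():
    status = "PASS" if len(peer_set) >= MIN_PEERS else "incomplete"
    print(key, "->", status, f"({len(peer_set)} peers)")
```

Here the first actor/test combination counts 3 distinct peers and passes, while the second has only 1 and remains incomplete.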
Connectathon Results • IHE does not report failures • Public results only at the company level • IHE will never tell you what system participated in the connectathon • Vendors have access to their own test results
Connectathon Results Browser
What does it mean? • The company was successful at the connectathon for the actor/integration profile combination • Results do not guarantee product conformity • That is the role of the « IHE integration statements »
IHE Integration Statement
Participation Fees • First system € 2750 • Other systems € 2850 • Per domain € 750 • Covers: • Infrastructure: room, power, monitors, internet… • Lunch and coffee breaks for 2 engineers during 5 days
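A worked example of how these fees might add up for one company, under the assumption that the per-domain fee is charged for each additional domain per system (this interpretation is an assumption, not stated in the slide):

```python
# Illustrative fee calculation under assumed rules: first system 2750 EUR,
# each additional system 2850 EUR, plus 750 EUR per additional domain per
# system. The interpretation of "per domain" is an assumption.
FIRST_SYSTEM = 2750
OTHER_SYSTEM = 2850
PER_EXTRA_DOMAIN = 750


def participation_fee(domains_per_system: list[int]) -> int:
    """Total fee for one company; one list entry per registered system."""
    total = 0
    for index, domains in enumerate(domains_per_system):
        total += FIRST_SYSTEM if index == 0 else OTHER_SYSTEM
        total += PER_EXTRA_DOMAIN * max(domains - 1, 0)
    return total


# Example: two systems, tested in 2 domains and 1 domain respectively
print(participation_fee([2, 1]))  # 2750 + 750 + 2850 = 6350 EUR
```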
Next Connectathon • Where: Remise, Vienna, Austria • http://www.koop-kundenweb.at/remise/ • When: Monday 20th April to Friday 24th April 2009 • Registration: November 1st – January 7th 2009 • Announcement to be released soon
C.A.T: Conclusion • It's not a certification process • Unique opportunity for vendors to test and discuss • Seems to be useful, as shown by the increased participation over the years • Sure, it needs improvement… • … but we are working on it
Before we start • Impossible to test everything • What we do not test • Design • Performance (load) • What we are looking for • Interoperability • Conformity
Conformance / Interoperability (diagram): conformance testing checks Implementation A and Implementation B individually against the specifications/standards; interoperability testing checks Vendor A's implementation against Vendor B's implementation
Conformance Testing (1/2) • Is unit testing • Tests a single 'part' of a device • Tests against well-specified requirements • For conformance to the requirements of the specified and referenced standards • Usually limited to one requirement per test • Tests at a 'low' level • At the protocol (message/behaviour) level • Requires a test system (and executable test cases) • Can be expensive; tests are performed under ideal conditions
Conformance Testing (2/2) • High control and observability • Means we can explicitly test error behaviour • Can provoke and test non-normal (but legitimate) scenarios • Can be extended to include robustness tests • Can be automated, and tests are repeatable • Conformance testing is DEEP and NARROW • Thorough and accurate but limited in scope • Gives a high level of confidence that key components of a device or system are working as they were specified and designed to do
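As an illustration of "one requirement per test, at the message level", here is a toy executable test case; the HL7 v2-style sample message and the requirement are invented for illustration and are not taken from any IHE test plan.

```python
# Toy conformance test case: one requirement, tested at the message level.
# The sample message and the requirement itself are invented for
# illustration; real test cases come from a dedicated test system.
import unittest

SAMPLE_MESSAGE = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "200901011200||ADT^A01|MSG00001|P|2.3.1"
)


def message_type(message: str) -> str:
    """Return MSH-9 (message type) from a pipe-delimited HL7 v2 message."""
    msh_fields = message.split("\r")[0].split("|")
    # MSH-1 is the field separator itself, so MSH-9 sits at index 8
    # of the split MSH segment.
    return msh_fields[8]


class TestMessageTypeRequirement(unittest.TestCase):
    def test_message_type_is_populated(self):
        """Requirement under test: MSH-9 shall be present and non-empty."""
        self.assertTrue(message_type(SAMPLE_MESSAGE))


if __name__ == "__main__":
    unittest.main()
```

A real test system would contain many such narrowly scoped cases, one per requirement, which is what makes conformance testing deep and narrow.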
Limitations of Conformance Testing • Does not prove end-to-end functionality (interoperability) between communicating systems • Conformance-tested implementations may still not interoperate • This is often a specification problem rather than a testing problem! Minimum requirements or profiles are needed • Does not test a complete system • Tests individual system components, not the whole • A system is often greater than the sum of its parts! • Does not test functionality • Does not test the user's 'perception' of the system • Standardised conformance tests do not include proprietary 'aspects' • Though this may well be done by a manufacturer with its own conformance tests for proprietary requirements