Reuse Versus Reinvention ** Mary Ann Malloy, PhD mmalloy@mitre.org LFM 2008 Conference ** How Will Formal Methods Deal with Composable Systems? Composable C2 needs Formal Methods!
As a public interest company, MITRE works in partnership with the U.S. government to address issues of critical national importance.
Disclaimers • The views, opinions, and conclusions expressed here are those of the presenter and should not be construed as an official position of MITRE or the United States Department of Defense (DoD). • All information presented here is UNCLASSIFIED, is technically accurate, contains no critical military technology, and is not subject to export controls.
What you WILL NOT take away today • A “shrink-wrapped” solution to DoD’s emergent testing challenges.
What you WILL take away today • An understanding of what “service-orientation” and “composability” mean. • Insight regarding how DoD is trying to build composable systems, but may not be testing them appropriately nor learning from the testing it does do. • Ideas for formal methods / testing investigation paths that may improve the state of composable systems verification & testing.
Overview • Background • Examples • Recent Work • Challenges • Summary
Who uses online services? • BANKING • DIRECTIONS • TRAVEL • eCOMMERCE • NEWS & INFORMATION Answer: YOU do! … and DoD wants to!
What is a service? “A mechanism to enable access to one or more capabilities, where the access is provided using a prescribed interface and is exercised consistent with constraints and policies as specified by the service description.” – DoD Net-Centric Services Strategy, May 2007 • Characteristics of services • Modular and composable, much like Lego building blocks • Network-accessible • Reusable • Standards-based • Distributed capabilities
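To make the definition concrete, here is a minimal sketch of how a service description might capture the three elements the definition names: a capability, a prescribed interface, and the constraints and policies that govern access. The class, field names, and the weather example are hypothetical, not drawn from the strategy document.

```python
from dataclasses import dataclass, field

@dataclass
class ServiceDescription:
    """Minimal model of the definition above: a capability, a prescribed
    interface, and the constraints/policies governing access."""
    capability: str                                 # what the service provides
    interface: dict                                 # operation -> (input, output)
    policies: list = field(default_factory=list)    # access constraints

# Hypothetical example: a weather-observation service
weather = ServiceDescription(
    capability="Provide current weather observations for a location",
    interface={"getObservation": ("location_code: str", "observation: dict")},
    policies=["authenticated callers only", "respond within 5 seconds"],
)

print(weather.capability)
for op, (inp, out) in weather.interface.items():
    print(f"  {op}: {inp} -> {out}")
```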
What DoD sees… and wants! [Screenshot of a web application: worldwide threats and incidents (airport, chemical, bridge, railway, bombs, etc.), sortable by type of incident, date, location, etc., with links to related news stories, a searchable database, and a listing of bomb-related events between 14 Feb 08 and 15 Feb 08.]
What is “service-oriented architecture”? “A paradigm for organizing and utilizing distributed capabilities that may be under the control of different ownership domains.” – OASIS Reference Model for Service-Oriented Architecture October 2006 • An architectural style based on flexibly linked software components that leverage web standards and services • NOT a product • NOT a bunch of web services
What is composability? • A design principle dealing with interrelated components that do not make assumptions about the compositions that may employ them; they are “fit for the unforeseen.” – proposed definition • Composable solutions – the desired end-state of a full-scale SOA implementation – are the direction in which DoD, federal stakeholders, and commercial enterprises are evolving their automation assets. • What is a composable system? One that consists of recombinant atomic behaviors (components) selected and assembled to satisfy specific [new] processing requirements. • NOTE: Composability is meaningful at many layers of abstraction.
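A minimal sketch of the design principle, using invented components: the composable version receives everything it needs through its interface, while the non-composable version quietly assumes something about the composition that hosts it (a shared global), making it unfit for the unforeseen.

```python
# Hypothetical illustration of the composability principle.

# NOT composable: silently depends on a global set up by one particular composition.
CURRENT_MISSION = {"units": "metric"}          # hidden, out-of-band state

def format_distance_bad(km: float) -> str:
    if CURRENT_MISSION["units"] == "metric":   # hidden assumption about the host
        return f"{km:.1f} km"
    return f"{km * 0.621:.1f} mi"

# Composable: every assumption is surfaced as an explicit parameter,
# so any composition can supply it without prior coordination.
def format_distance_good(km: float, units: str = "metric") -> str:
    if units == "metric":
        return f"{km:.1f} km"
    return f"{km * 0.621:.1f} mi"

print(format_distance_good(12.4))               # "12.4 km"
print(format_distance_good(12.4, "imperial"))   # "7.7 mi"
```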
Testing principles • Testing the composition • Validate that the chosen composition of individual elements performs the desired functions. • The “composition layer” is an additional one that must be tested separately. • A composition can be VALID yet still not do anything USEFUL with respect to the relevant CONTEXT. • Testing for composability • Ensure individual processing elements do not make undue assumptions about the composition. • Code analysis or inspection for “hidden assumptions” or “out of band” dependencies.
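A hedged sketch of the two activities, with hypothetical components: a unit-style check that a chosen composition actually performs the desired end-to-end function, and a crude source inspection that flags likely “out of band” dependencies (here, environment-variable or global access).

```python
import inspect

# --- Hypothetical components and their composition ---
def geocode(place: str) -> tuple:
    return (38.9, -77.0)                 # stubbed lookup

def threats_near(coords: tuple) -> list:
    return ["bridge incident"]           # stubbed query

def compose(place: str) -> list:
    """The composition under test: geocode, then query threats."""
    return threats_near(geocode(place))

# 1) Testing the composition: does the assembled whole perform the desired function?
def test_composition():
    assert compose("Washington, DC") == ["bridge incident"]

# 2) Testing for composability: crude inspection of a component's source for
#    out-of-band dependencies such as environment variables or module globals.
def hidden_dependency_suspects(func) -> list:
    source = inspect.getsource(func)
    return [marker for marker in ("os.environ", "global ") if marker in source]

test_composition()
print(hidden_dependency_suspects(threats_near))   # [] -> no obvious hidden inputs
```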
Typical DoD approach to testing compositions [Flow diagram: START HERE at Capability Need, which drives Service Needed, which drives Data Needed, which drives the Community Information Exchange Vocabulary; that in turn enables Service Implementations, which enable Capability Delivery, which enables Capability Demonstration – DECLARE SUCCESS!]
Better DoD example: Net-Centric Diplomacy ** • “Few realize the complexity that must be taken into account when attempting to quantitatively measure performance and reliability when dealing with web services.” – Derik Pack, SPAWAR System Center, 2005 ** see http://www.dtic.mil/ndia/2005systems/thursday/pack2.pdf • General findings from the testing of the NCD initiative of Horizontal Fusion: • Many different types of interrelated testing are needed. • Exhaustive testing is impossible, but testing must still be iterative, and it is time consuming! • Operationally specific test cases are needed. • Performance testing must focus on service dependencies vice the user interface. • The number of requests that will cause a web service to fail is far lower than for a web server.
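One of the findings above – focus performance testing on service dependencies rather than the user interface – can be illustrated with a small, hypothetical timing harness that attributes latency to each downstream service call instead of to the page as a whole. The handler and its two stubbed dependencies are invented for illustration.

```python
import time
from collections import defaultdict
from contextlib import contextmanager

timings = defaultdict(list)

@contextmanager
def timed(dependency: str):
    """Record wall-clock time spent in one downstream service call."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[dependency].append(time.perf_counter() - start)

# Hypothetical request handler with two service dependencies (stubbed with sleeps).
def handle_request():
    with timed("geocoder"):
        time.sleep(0.02)
    with timed("threat-db"):
        time.sleep(0.05)

for _ in range(10):
    handle_request()

for dep, samples in timings.items():
    print(f"{dep}: mean {sum(samples) / len(samples) * 1000:.1f} ms over {len(samples)} calls")
```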
Better DoD example: Net-Centric Diplomacy, concluded • Testing was conducted until “error thresholds” were reached: • Round-trip time (90 sec) • Error rate (15%) • Specific findings • A mean of 3.06 connections per second could be achieved. • WSDLs define interfaces, but not valid service use. • Is this practicable across all of DoD? DoD may need to stand up multiple access points for heavily used services / compositions, and the “sweet spot” will likely differ in times of war vice times of peace.
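A minimal sketch of threshold-driven load testing in the spirit of the NCD numbers above. The 90-second round-trip and 15% error thresholds come from the slide; the simulated service call, pacing, and warm-up count are placeholders rather than the actual NCD test harness.

```python
import random
import time

ROUND_TRIP_LIMIT_S = 90.0     # round-trip threshold from the NCD testing
ERROR_RATE_LIMIT = 0.15       # 15% error threshold from the NCD testing

def call_service() -> tuple:
    """Stand-in for one web-service request; returns (elapsed_seconds, ok)."""
    elapsed = random.uniform(0.05, 0.5)          # simulated round trip
    time.sleep(elapsed / 100)                    # keep the demo fast
    return elapsed, random.random() > 0.05       # ~5% simulated failures

def run_until_threshold(max_requests: int = 500) -> None:
    errors, round_trips = 0, []
    start = time.perf_counter()
    for i in range(1, max_requests + 1):
        elapsed, ok = call_service()
        round_trips.append(elapsed)
        errors += 0 if ok else 1
        if elapsed > ROUND_TRIP_LIMIT_S or (i >= 20 and errors / i > ERROR_RATE_LIMIT):
            break                                # an error threshold was reached
    wall = time.perf_counter() - start
    print(f"requests: {i}, error rate: {errors / i:.1%}, "
          f"mean round trip: {sum(round_trips) / i:.2f} s, "
          f"achieved {i / wall:.2f} connections per second")

run_until_threshold()
```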
What DoD must create to “get there…” • Loosely coupled, relevant, “right-sized” services that can be leveraged across continuously changing processes. • New governance that can deal with complex management of distributed, loosely coupled, dynamically composable services. • A better understanding of maintenance implications: • How long does it take? How will other components or clients be impacted? • Components with low or unknown MTBF should be highly accessible and easily replaceable…can this be automated? • Rapidly deployable, virtual, continuous test environment • Examples provided on the next two slides …
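A hedged sketch of the “can this be automated?” question raised in the maintenance bullet above: estimate MTBF from observed operating history and flag components whose estimate is low or whose history is too thin to trust. The component names, thresholds, and figures are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ComponentHealth:
    name: str
    uptime_hours: float
    failures: int

def mtbf_lower_bound(c: ComponentHealth) -> float:
    """Conservative MTBF estimate; with no failures yet, total uptime is the bound."""
    return c.uptime_hours / max(c.failures, 1)

def needs_replacement_plan(c: ComponentHealth,
                           mtbf_floor: float = 100.0,
                           min_history_hours: float = 50.0) -> bool:
    """Flag components whose MTBF is low, or unknown for lack of operating history."""
    return c.uptime_hours < min_history_hours or mtbf_lower_bound(c) < mtbf_floor

fleet = [ComponentHealth("geocoder", 400, 1),     # MTBF ~400 h -> fine
         ComponentHealth("threat-db", 30, 0),     # too little history -> unknown
         ComponentHealth("notifier", 200, 5)]     # MTBF ~40 h -> low

for c in fleet:
    verdict = "plan for replacement" if needs_replacement_plan(c) else "ok"
    print(f"{c.name}: {verdict}")
```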
…something like ELBA? ** 1) Developers provide design-level specifications of model and policy documents (as input to Cauldron) and a test plan (XTBL). 2) Cauldron creates a provisioning and deployment plan for the application. 3) Mulini generates a staging plan from the input components referred to from the XTBL. 4) Deployment tools deploy the application and monitoring tools to the staging environment. 5) The staging is executed. 6) Data from the monitoring tools is gathered for analysis. 7) After analysis, developers adjust deployment specifications, or possibly even policies, and repeat the process. ** see http://www.static.cc.gatech.edu/systems/projects/Elba/pub/200604_NOMS_mulini.pdf
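The seven steps above amount to an iterate-until-satisfied staging loop. Below is a hedged sketch of that control flow only; every function is a hypothetical placeholder and does not correspond to an actual Cauldron or Mulini interface, and the metrics and acceptance limits are fabricated.

```python
# Hypothetical skeleton of the ELBA staging cycle described above.

def provision_plan(model_docs, policy_docs):         # step 2 (Cauldron's role)
    return {"deployment_plan": (model_docs, policy_docs)}

def generate_staging_plan(plan, test_plan_xtbl):     # step 3 (Mulini's role)
    return {"staging_plan": (plan, test_plan_xtbl)}

def deploy(staging_plan):                            # step 4
    return "staging-environment-handle"

def execute_and_monitor(environment):                # steps 5-6
    return {"mean_response_s": 1.4, "error_rate": 0.02}   # fabricated sample metrics

def acceptable(metrics):                             # the decision behind step 7
    return metrics["error_rate"] < 0.05 and metrics["mean_response_s"] < 2.0

def adjust(model_docs, policy_docs, metrics):        # step 7: developers revise inputs
    return model_docs, policy_docs

model, policy, xtbl = "model.xml", "policy.xml", "testplan.xtbl"   # step 1 inputs
for iteration in range(3):
    metrics = execute_and_monitor(
        deploy(generate_staging_plan(provision_plan(model, policy), xtbl)))
    if acceptable(metrics):
        print(f"iteration {iteration}: staging results acceptable")
        break
    model, policy = adjust(model, policy, metrics)
```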
…or STARSHIP? A key component of the Electronic Proving Ground Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) tool kit for live distributed test environments. Provides a “threads-based” composable environment to plan, generate planning documents, verify configuration, initialize, execute, synchronize, monitor, control, and report the status of a sequence of activities. Freely available and customizable to any problem domain; complexity may be a barrier. POC: Ms. Janet McDonald (520) 538-3575 (DSN 879) Janet.McDonald@us.army.mil PM ITTS IMO – Ft. Huachuca
What MITRE is doing • c2c-composable-c2-list and Community Share site • Composable C2 is a “Grand Challenge Problem” within the 2009 MITRE Innovation Program (MIP): • How to build reconfigurable components that can be mashed together in an agile fashion • Visualization and Analysis • Info Sharing • Interoperability and Integration • Resource Management to enable composability (of people, organizations, networks, sensors, platforms…) • Acquisition and Systems Engineering • Collaborative and Distributed C2 • Example proposal: Web Service Process Optimization – “Our hypothesis is that web service optimization can be realized through machine learning techniques and statistical methods. In this research we intend to find a computational solution to the problem of creating and maintaining web service processes.”
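As one illustration of the “statistical methods” angle in the example proposal (not MITRE’s actual approach), here is a minimal epsilon-greedy sketch that learns which of several interchangeable service implementations to route requests to, based on observed latency. The implementations and their latency distributions are simulated.

```python
import random
from collections import defaultdict

# Hypothetical interchangeable implementations of the same service.
def impl_a():
    return random.gauss(0.30, 0.05)      # simulated latency in seconds
def impl_b():
    return random.gauss(0.20, 0.05)
def impl_c():
    return random.gauss(0.40, 0.10)

candidates = {"A": impl_a, "B": impl_b, "C": impl_c}
observed = defaultdict(list)
EPSILON = 0.1                             # exploration rate

def pick():
    """Epsilon-greedy choice: mostly the fastest-so-far, occasionally explore."""
    untried = [k for k in candidates if not observed[k]]
    if untried or random.random() < EPSILON:
        return random.choice(untried or list(candidates))
    return min(observed, key=lambda k: sum(observed[k]) / len(observed[k]))

for _ in range(200):
    name = pick()
    observed[name].append(candidates[name]())   # record the measured latency

best = min(observed, key=lambda k: sum(observed[k]) / len(observed[k]))
print("route future requests to implementation", best)
```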
What MITRE is doing, concluded • A series of brainstorming sessions on Composable C2 • “static” vs. “dynamic” composability vis-à-vis legacy systems • Do services derived from a “proven capability” have lower or non-existent testing requirements? • Resources for Early and Agile Testing • Recently showed how low-cost simulation games can create a simple, “good-enough” simulation capability to evaluate new concepts early in development and expose the most challenging issues. • REACT “Online”: a composed testing environment for composed solutions! A loosely coupled simulation capability delivering dynamic flexibility for “quick look” experiments.
Practical challenges • Can we “right-size” testing as “fit-for-composition?” • Is composability binary? A sliding scale? When is it [not] OK to use “lower-rated” components? • Can we characterize the right amount of testing based on the anticipated longevity of the composition? Other factors? • What metadata must be exposed to assess contextual validity of components in composition? • Should WSDL be enriched? Supplemented? • Can what constitutes valid compositions be expressed as rules? How narrowly / broadly? • What thresholds / metrics are required? Nice to have? • Performance thresholds? Ongoing component health? • Can we “borrow” ideas from other composability abstractions for applicability here?
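One hedged sketch of the “valid compositions expressed as rules” question above: each component exposes a small metadata record, and a checker evaluates a candidate pipeline against a few illustrative rules (output/input types must chain; classification level must not drop downstream). The metadata fields, classification scale, rules, and components are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ComponentMeta:
    name: str
    consumes: str          # input data type
    produces: str          # output data type
    classification: int    # 0 = UNCLASS ... 3 = TS (invented scale)

RULES = [
    ("types must chain", lambda up, down: up.produces == down.consumes),
    ("classification must not drop downstream",
     lambda up, down: down.classification >= up.classification),
]

def validate_pipeline(components):
    """Return the list of rule violations for a linear composition."""
    violations = []
    for up, down in zip(components, components[1:]):
        for rule_name, rule in RULES:
            if not rule(up, down):
                violations.append(f"{up.name} -> {down.name}: {rule_name}")
    return violations

pipeline = [ComponentMeta("collector", "raw_report", "track", 2),
            ComponentMeta("fuser", "track", "picture", 2),
            ComponentMeta("publisher", "picture", "web_page", 1)]

print(validate_pipeline(pipeline) or "composition is valid under these rules")
```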
Levels of composability testing? • For a composition that will only be used a few times, can we tolerate higher risk? • Can we “rate” the composability of components? [Chart relating risk (e.g., loss of life) to the level of testing, with “testing as-is for composable C2” marked.]
“Pressure-points” for formalisms • How can the lessons-learned from the past inform the way ahead for extending formal methods to testing & verification of composable systems? • Can we derive principles to compose systems in methodical, rather than ad-hoc ways, that will produce more satisfactory results? • How can we handle partial and incremental specifications? • How can we cope when building a composition with parts that make incompatible assumptions about their mutual interactions? • What kinds of automated checking and analysis can we support?
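A minimal assume/guarantee sketch of the “automated checking” question above, with invented component specifications: each component states what it guarantees and what it assumes of its peers, and the checker reports assumptions that no peer guarantees, i.e., incompatible assumptions about mutual interactions.

```python
# Hypothetical assume/guarantee compatibility check over component specs.
specs = {
    "tracker":   {"assumes": {"time_sync_gps"},               "guarantees": {"tracks_10hz"}},
    "fuser":     {"assumes": {"tracks_10hz"},                 "guarantees": {"fused_picture"}},
    "displayer": {"assumes": {"fused_picture", "time_sync_ntp"}, "guarantees": set()},
}

def unmet_assumptions(specs):
    """Assumptions made by some component that no other component guarantees."""
    all_guarantees = set().union(*(s["guarantees"] for s in specs.values()))
    return {name: sorted(s["assumes"] - all_guarantees)
            for name, s in specs.items() if s["assumes"] - all_guarantees}

print(unmet_assumptions(specs))
# -> {'tracker': ['time_sync_gps'], 'displayer': ['time_sync_ntp']}
# i.e., the parts make timing assumptions that nothing in the composition discharges.
```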
Take-away points • DoD will continue to deploy composed solutions to realize its SOA vision. • Current testing focuses more on the level of service provided and less on how reliably the capability is delivered or whether it actually meets the need. • Different levels of testing are probably appropriate for different contexts (“static” versus “dynamic,” use frequency, loss-of-life consequences). • Automated environments are needed to test composed solutions targeted for rapid deployment.
Pointers to more information • www.thedacs.com • Data & Analysis Center for Software: a repository of documents and tools in research areas including testing and reuse • www.peostri.army.mil/PRODUCTS/STARSHIP/ • Starship II homepage • Search for the latest results on: “composability,” “web service testing,” “composable systems,” “composable C2,” “testing composable”
ACRONYMS • C2 = Command & Control • C4ISR = Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance • DoD = Department of Defense • MTBF = mean time between failures • NCD = Net-Centric Diplomacy • PEO STRI = Program Executive Office for Simulation, Training & Instrumentation • SOA = service-oriented architecture • WSDL = Web Services Description Language
Observations about compositions • Technical realm: Solutions are built from primitive and composite components and connectors. Components and connectors can be described by interface and protocol specifications. Common patterns provide abstractions we may be able to exploit in design, development, analysis, and testing. • Semantic realm: To deliver meaningful capability, the components must be composable with regard to their underlying ideas.
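A hedged sketch of the “interface and protocol specifications” observation, with invented names: an interface spec as typed operations, a protocol spec as the legal operation orderings, and a simple trace checker over both.

```python
# Hypothetical component spec: an interface (operations and their types)
# plus a protocol (which operation orderings are legal).
INTERFACE = {
    "open":  {"in": "SessionRequest", "out": "SessionId"},
    "query": {"in": "QueryMsg",       "out": "ResultSet"},
    "close": {"in": "SessionId",      "out": "Ack"},
}

PROTOCOL = {                  # legal next operations after each operation
    None:    {"open"},        # None = initial state
    "open":  {"query", "close"},
    "query": {"query", "close"},
    "close": set(),
}

def trace_conforms(trace):
    """Check a sequence of operation names against the interface and protocol specs."""
    previous = None
    for op in trace:
        if op not in INTERFACE:
            return False, f"unknown operation {op!r}"
        if op not in PROTOCOL[previous]:
            return False, f"{op!r} not allowed after {previous!r}"
        previous = op
    return True, "trace conforms"

print(trace_conforms(["open", "query", "query", "close"]))   # (True, 'trace conforms')
print(trace_conforms(["query", "close"]))                    # (False, "'query' not allowed after None")
```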
“Lines of Evolution” vision for DoD systems 1) A single system with a non-flexible hierarchical structure 2) A system consisting of several independently functional but integrated components 3) Reusable, “mobile” services 4) A capability is realized through a pre-defined orchestration of services 5) Services are orchestrated on an ad-hoc basis to deliver a capability…and then disappear
Another view: stages of SOA adoption (DoD is lurking around here) • Business Process Understanding: How is the work done? • IT Assessment: What IT assets exist supporting the business process? • SOA Design/Determination: What should be a service? • SOA Enablement (Java EE, .NET, federated data services): How will application and data services be developed and deployed? • Process Orchestration/Composition: How will business processes and rules be developed and deployed? • Infrastructure (ESB, Registry, Management) and Governance: How will services, applications, and people interact and communicate? ** Mark Driver, Optimizing Open Source and SOA Strategies, Gartner Application Architecture, Development & Integration Summit 2007, http://www.gartner.com/it/page.jsp?id=506878&tab=overview