Open Architecture CONTRACTS PERSPECTIVE Marcia Rutledge, Contracts Directorate Branch Head, ISR/Comms, Space and Naval Warfare Systems Command
AGENDA • Why does open architecture make sense from a contracts perspective? • Is OA working? • Is it a good deal for industry?
Open Architecture Policies, Laws, and Instructions • Reduced DoD budgets • Ashton Carter's Better Buying Power Initiative • National Defense Authorization Act for Fiscal Year 2010, Section 804 • SECNAVINST 5000.2E
Why Does Open Architecture Make Sense from a Contracts Perspective? • Contracts becomes an implementation arm • OA is one of our tools to procure more with less • OA allows us to develop • Competitive strategies to achieve savings • Strategies to reduce vendor lock • Data management approaches • A method to control costs and reduce redundancies • e.g., allows contracting of software independent of hardware • Allows incremental delivery of new capabilities • Enhances interoperability
Is OA Working? • OA tenets are going forward • Several contracts let across PEO C4I • Open Architecture is the premise for the Joint Tactical Radio System (JTRS) program • Decoupling of software from hardware will lead to increased competition for software applications • Three C4I examples
Is Open Architecture Working? • New development • Global Positioning System-Based Positioning, Navigation, and Timing Service (GPNTS) • Reduced footprint and increased capability/technology insertion • Applications transformed to support Service Oriented Architecture • Reduced sustainment costs • Reusing existing software and refactoring it into services • Consolidated Afloat Networks and Enterprise Services (CANES) – Afloat Core Services (ACS) • Shared infrastructure services • Existing programs • Global Command and Control System – Maritime (GCCS-M) • Increased competition to achieve better pricing and products • Exercising our data rights and breaking vendor lock • Agile process for incremental delivery of increased capability • Reduced life-cycle costs (maintenance)
GPNTS MOSA/OA Statement of Work (SOW) • Key OA Tenets: • The Contractor shall develop and maintain an Open Architecture that incorporates appropriate considerations for interoperability, supportability, composeability, technology insertion, vendor independence, reusability, scalability, ability to upgrade, and long-term supportability. • The architectural approach shall provide a viable technology insertion methodology and refresh strategy that supports application of a MOSA and is responsive to changes driven by mission requirements and new technologies. • The Contractor shall develop an open architecture that supports a Modular Open System Approach (MOSA), and deliver an Open System Management Plan. • The Contractor shall provide an orderly, planned approach to address migration of proprietary or closed software components or interfaces to a modular design when technological advances are available.
New Development: CANES • Competitive down-select contracting strategy • OA considered in each competitor's Allocated Baseline Decision Analysis and Preliminary Design Review (PDR) • The contractor shall provide an Allocated Baseline Decision Analysis (ABDA) as part of the PDR. The contractor shall ensure the ABDA documents the contractor's technical and non-technical rationale and reasoning on how and why they chose to allocate functions to physical CIs from FBL. The ABDA shall describe all requirements as stipulated in the FS, with emphasis on system hosting; scalability and modularity; minimization of SWAP; system security and requirements necessary to achieve an ATO; minimization of manpower and training; minimization of system variation; and use of Modular Open Systems Architecture/Open Architecture (MOSA/OA). • Afloat Core Services (ACS) • Multiple applications hosted on shared infrastructure • Cost savings in Certification and Accreditation • Cost avoidance in engineering integration costs • Reduced footprint • Only building it once
Existing Programs: GCCS-M • GCCS-M competitive strategy • Reusing existing software to build future capabilities • Enforcing data rights • Managing software using a software repository • Using a collaborative approach to build software • Use commercial best-practice software engineering, standards, and processes • Explicit governance and contractual guidance • Automated testing, code scanning, and report generation • Software quality is a key enabler of life-cycle cost reduction
GCCS-M Evaluation Criteria • GCCS-M incremental development was openly competed using a series of Command and Control Multiple Award Contracts (MACs) • Excerpt from Section M, Evaluation Factors for Award • FACTOR 1 – Technical Approach. Subfactor 1 is equal in importance to Subfactors 2 and 3 combined. Subfactor 2 is more important than Subfactor 3. Subfactors will be evaluated based on the evaluation criteria described below. • SUBFACTOR 1 – Software Development Methodology • SUBFACTOR 2 – Project Resources • SUBFACTOR 3 – Software Quality – Offerors will be evaluated on the extent to which the Software Quality proposal addresses the content described in Section L, effectively integrates information described in the Software Development Kit (See Attachment No. 4) and SPAWAR's Software Quality Assurance Plan (See Attachment No. 8).
DoD Development Issues with Software Acquisition • The 1:10:100 ratio forms a basic model to estimate ROI • Defects have a multiplicative negative effect on cost • High sustainment cost – finding issues too late • Poor Government–industry relationship • Institutional knowledge lock • Agile methods leave fewer defects (higher ROI) Boehm, B. W. (1981). Software Engineering Economics. Englewood Cliffs, NJ: Prentice-Hall.
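The 1:10:100 ratio above says a defect costs roughly 1x to fix at design time, 10x in development/test, and 100x in sustainment. A minimal sketch of that cost model (the defect counts and unit cost below are illustrative assumptions, not figures from the slide):

```python
# Sketch of the 1:10:100 defect-cost model: the relative cost of fixing
# a defect grows by an order of magnitude at each later life-cycle phase.
RATIO = {"design": 1, "test": 10, "sustainment": 100}

def defect_cost(counts, unit_cost=1_000):
    """Total fix cost given defect counts per phase (unit_cost is notional)."""
    return sum(RATIO[phase] * n * unit_cost for phase, n in counts.items())

# Same 30 defects; the only difference is when they are found.
early = defect_cost({"design": 25, "test": 5, "sustainment": 0})
late = defect_cost({"design": 0, "test": 5, "sustainment": 25})

print(early)  # 75000
print(late)   # 2550000: finding issues late dominates sustainment cost
```

Under these assumptions the late-discovery scenario costs over thirty times as much, which is the "finding issues too late" point above.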
Is OA a Good Deal for Industry? • Industry may be able to increase its market share through competition as we move away from vendor lock situations • Increased small business participation • Increased industrial business base • Stimulus for more innovative technology • Reuse can assist in the development of market strategies • Future market strategies • Cloud computing • Contract incentives
Potential Industry Incentives • Decoupling of hardware from software • e.g., CANES contracting strategy • Incorporate published standards that have been used by industry • Ease of technology insertion • Quality • Meet or exceed established quality metrics • For software, issues are worked upfront versus correcting STRs later • No cut-and-paste errors, memory leaks, etc. • Assessing whether an artifact is a candidate for reuse across the DoD enterprise • Companies can identify reuse artifacts through the use of award-term options, or • Award of enterprise-requirement IDIQ contracts where the vendor can increase its market share • Substantial savings to the government and the company
Summary • The Future of OA • Expect to see more • Looking for Lessons Learned • Open Door Policy
Aligns with New IT Acquisition Cycle and OSD Efficiency Initiatives • … and supports new DoD Efficiency Initiatives to: • "move defense enterprise toward more efficient, effective and cost-conscious way of doing business" • 4 Tracks • Shifting overhead costs to force structure and future modernization accounts • Inviting experts to suggest ways the Department can be more efficient • Conducting front-end assessment to inform the FY 2012 budget request • Reducing excess and duplication across the defense enterprise HR 2647, National Defense Authorization Act for Fiscal Year 2010, Section 804: "The Secretary of Defense shall develop and implement a new acquisition process for information technology systems. The acquisition process developed and implemented pursuant to this subsection shall, to the extent determined appropriate by the Secretary — . . . be based on the recommendations in chapter 6 of the March 2009 report of the Defense Science Board Task Force on Department of Defense Policies and Procedures for the Acquisition of Information Technology; and . . . be designed to include — • early and continual involvement of the user; • multiple, rapidly executed increments or releases of capability; • early, successive prototyping to support an evolutionary approach; and • a modular, open-systems approach."
New Development: GPNTS Shipboard Migration Path • [Diagram: migration path from today's large-SWAP configuration through Increment 1 to Increment 2, with applications moving onto SOA transport over the ship's networks] • Today: • Design based on evolved user requirements • Multiple configurations • Non-SAASM GPS receiver • Large footprint • Applications hosted by NAVSSI • Many point-to-point and application-specific interfaces • Hardware-dependent software • GPNTS Increment 1: • SAASM GPS receiver • Non-real-time applications can be hosted by a CCE • Time & frequency management in real time • Publish services for non-real-time applications • GPNTS Increment 2: • Modernized (M-code) GPS receiver • JPALS support • Real-time applications can be hosted by a CCE
GPNTS OA Evaluation Criteria • From Section L – Instructions to Offerors (information the offeror was asked to provide to support the evaluation of OA as described in Section M) • The Offeror shall describe how its system design (hardware & software) employs open architecture tenets and a modular, standards-based open systems approach to satisfy the GPNTS TRD requirements. The Offeror shall describe how its proposed design incorporates the open architecture design tenets of interoperability, extensibility, maintainability, and composeability. The Offeror shall describe how interfaces will be selected from existing open, de facto, proprietary, or Government standards with emphasis on maximizing system-level or enterprise-level (where applicable) interoperability. The Offeror shall describe how its selection of interfaces will maximize the ability of the system to readily accommodate technology insertion (both hardware and software) and facilitate the reuse of alternative or reusable modular system elements. If the Offeror proposes to reuse software as part of its system design, the Offeror shall provide the rationale for which software was selected for reuse. • The Offeror shall describe how its system design minimizes reliance on proprietary, vendor-unique, or closed elements. The Offeror shall justify any use of proprietary, vendor-unique, or closed components (software and/or hardware) and interfaces. The justification shall include documentation of the decision leading to selection of specific COTS products (e.g., test results, architectural suitability). The Offeror shall define its process for identifying and justifying proprietary, vendor-unique, or closed interfaces, code modules, hardware, firmware, or software to be used.
When interfaces, hardware, firmware, or modules that are proprietary or vendor unique are required, the Offeror shall explain how those proprietary elements do not preclude or hinder Government’s desire to: • Enable Government to separately purchase its own equipment and assemble GPNTS configurations in Government labs; • Enable third party Government or contractor teams to integrate additional hosted applications; • Enable third party Government or contractor teams to do LRIP/Full Rate Production installations; • Enable third party Government or contractor teams to perform ISEA functions; • Enable hand off of design documents, install documents, and software to the LRIP/Full Rate Production contractor for production; and • Enable hand off of design documents and software to the LRIP/Full Rate Production contractor for modifying and extending GPNTS design and GPNTS software. • The Offeror shall describe how its system design incorporates Non-Developmental Items and COTS items to meet the GPNTS performance requirements and provide the rationale for selection of these items, to include the results of any trade-off analyses or studies.
GCCS-M Evaluation Criteria • From Section L – Instructions to Offerors • Technical Approach: • Software Development Methodology • Project Resources • Software Quality • (1) Subfactor (1) – Software Development Methodology. • Within the Technical Approach document, the Offeror shall describe its proposed Software Development Methodology, referencing the Software Development Kit (See Attachment No. 4), SPAWAR's Delivery and Acceptance Process (See Attachment No. 5), and PMW 150's Integrated Master Schedule (See Attachment No. 6) where the Offeror deems appropriate. … The Software Development Methodology shall contain the following content: • System overview. Briefly describe the general nature of the system and software. • Software development process. Describe the software development process to be used. The planning shall identify planned builds, if applicable, their objectives, and the software development activities to be performed in each build. • Software development methods. Describe or reference the software development methods to be used. Include descriptions of the manual and automated tools and procedures to be used in support of these methods. Reference may be made to other paragraphs in this proposal if the methods are better described in context with the activities to which they will be applied. • Incorporating reusable software products. Describe the approach to be followed for identifying, evaluating, and incorporating reusable software products, including the scope of the search for such products and the criteria to be used for their evaluation. Candidate or selected reusable software products known at the time this plan is prepared or updated shall be identified and described, together with benefits, drawbacks, and restrictions, as applicable, associated with their use. If the Government approves any requested reuse software and/or documentation, it will be provided on an as-is basis. • SLOC Estimate. 
Offeror shall estimate the number of newly developed source lines of code (SLOC) and any assumptions used to establish the estimate. • Plans for performing detailed software development activities. Describe the proposed software unit testing, integration testing, and computer software configuration item (CSCI) testing. If different builds or different software on the project require different planning, these differences shall be noted in the paragraphs. The discussion of each activity shall include the approach (methods/procedures/tools) to be applied to: 1) the analysis or other technical tasks involved; 2) the recording of results; and 3) the preparation of associated deliverables. The discussion shall also identify applicable risks and uncertainties, and the plans for dealing with them. Reference may be made to other proposed sections if applicable methods are described there. • Schedule. Describe the schedule(s) identifying the proposed CWBS activities by calendar date in each build and showing initiation of each activity, availability of draft and final deliverables and other milestones, and completion of each activity. • Activity Network. Describe an activity network, depicting sequential relationships and dependencies among activities and identifying those activities that impose the greatest time restrictions on the project.
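The SLOC estimate requested above can later be checked against delivered code with a simple counter. This is a hedged sketch (it assumes Python-style `#` comments; a real program would use a language-aware counting tool):

```python
# Minimal SLOC counter: counts non-blank, non-comment lines in every
# source file under a directory tree.
from pathlib import Path

def count_sloc(root, suffix=".py"):
    """Count source lines of code in all files matching `suffix` under root."""
    total = 0
    for path in Path(root).rglob(f"*{suffix}"):
        for line in path.read_text(errors="ignore").splitlines():
            stripped = line.strip()
            if stripped and not stripped.startswith("#"):
                total += 1  # count only lines that carry code
    return total
```

Comparing this count per delivery against the proposal's SLOC estimate gives a cheap sanity check on how much new code was actually developed.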
GCCS-M Evaluation Criteria • (2) Subfactor (2) – Project Resources. • Within the Technical Approach, the Offeror shall describe its proposed Project Resources, referencing the Software Development Kit (See Attachment No. 4), SPAWAR's Global Work Breakdown Structure (See Attachment No. 7), and PMW 150's Integrated Master Schedule for the Project (See Attachment No. 6) where the Offeror deems appropriate. • (3) Subfactor (3) – Software Quality. • Within the Technical Approach, the Offeror shall describe its Software Quality, referencing the Software Development Kit (See Attachment No. 4) and SPAWAR's Software Quality Assurance Plan (See Attachment No. 8) where the Offeror deems appropriate. The Software Quality description shall contain the following content: • Access for Government review. Describe the approach to be followed for providing the Government or its authorized representative access to developer and subcontractor facilities for review of software products and activities. • Software product evaluation. Describe the approach to be followed for software product evaluation, including: in-process and final software product evaluations; software product evaluation records and items to be recorded; and independence in software product evaluation. If different builds or different software on the project require different planning, these differences shall be noted. The presentation of this activity shall include the approach (methods/procedures/tools) to be applied to: 1) the analysis or other technical tasks involved; 2) the recording of results; and 3) the preparation of associated deliverables. The discussion shall also identify applicable risks/uncertainties and plans for dealing with them. • Software quality assurance. Describe the approach to be followed for software quality assurance, including: software quality assurance evaluations; and software quality assurance records and items to be recorded. 
If different builds or different software on the project require different planning, these differences shall be noted. The presentation of this activity shall include the approach (methods/procedures/tools) to be applied to: 1) the analysis or other technical tasks involved; 2) the recording of results; and 3) the preparation of associated deliverables. The discussion shall also identify applicable risks/uncertainties and plans for dealing with them.
Four Pillars • RITE addresses findings and facilitates development and distribution of Navy C2 systems: • Check – software development • Stabilize – the current build • Influence – the final product delivery • "Manage as close to the source as possible"
RITE in Action • Contracts (asking for what we want) • Boilerplate CDRLs include language about GPR • Adding expectation of quality to contracting language • Template SOWs created • Processes (receive and verify ALL of what was paid for) • Source code analysis tools used to provide better cost estimates at the source-code level • Internal inspection of code reveals the state of the system at a software engineering level: software complexity, dependencies, and coverage • With automated test tools, able to reduce the time required to run a large number of test cases and increase the number of test events completed in less time • Infrastructure (ensure we can duplicate what we paid for) • Centralized our CM repository, thereby decentralizing development vendor teams • Allows government to review the current state of development • Internal inspection reveals and enforces configuration management best practices • Organization • Transformation of workforce and software-intensive training • Government able to respond quicker and with authority
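The automated-test point above (running a large number of test cases unattended) can be sketched with data-driven cases in `unittest`. The function under test and the case table are hypothetical placeholders, not RITE artifacts:

```python
# Data-driven test sketch: adding a test event is adding a row to CASES,
# and the whole table runs in one automated pass.
import unittest

def classify_track(speed_knots):
    """Hypothetical system-under-test: classify a contact by speed."""
    return "air" if speed_knots > 100 else "surface"

class TrackClassificationTests(unittest.TestCase):
    CASES = [(350, "air"), (600, "air"), (12, "surface"), (30, "surface")]

    def test_all_cases(self):
        for speed, expected in self.CASES:
            with self.subTest(speed=speed):  # each row reported separately
                self.assertEqual(classify_track(speed), expected)

if __name__ == "__main__":
    # exit=False so the run can be embedded in a larger automated pass
    unittest.main(argv=["rite-sketch"], exit=False, verbosity=2)
```

Because the cases are data, the same table can feed report generation, which is how automated tooling turns "more test events in less time" into a repeatable process rather than a manual effort.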
RITE Accomplishments • CONOPS and SOPs • Contract language • CDRLs, DIDs, SOW language, guidance, PWSs • Automated test and source code analysis tools • Assessing project complexity/costs • Source code profiling • Improved metrics evaluation criteria – contractor performance assessment • Information repository and configuration management • Supports distributed development • Government more involved in the development process • Acceptance process and checklists • Requires independently buildable source code with deliveries • Web site for submitting and sharing source and object-level code • NIPR/SIPR sites • Workforce reshaping underway using updated position descriptions
Duplicate Code Analysis Report aids in STR propagation detection • In this example, if an STR was fixed in one file, there are seven other places where the same code occurs and the fix must be propagated • Copy and paste can be a haven for recurring STRs in other places
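A duplicate-code report like the one described above can be approximated by recording fixed-size windows of normalized lines and flagging any window seen in more than one place. This is a sketch under those assumptions (the window size and normalization are illustrative choices, not the actual RITE tooling):

```python
# Sketch of a duplicate-code detector: any window of `window` consecutive
# (whitespace-normalized) lines that appears at more than one location
# is a copy-and-paste candidate for STR propagation.
from collections import defaultdict

def find_duplicates(files, window=4):
    """Map each repeated code window to the (file, start line) spots it occurs."""
    seen = defaultdict(list)
    for name, text in files.items():
        lines = [ln.strip() for ln in text.splitlines()]
        for i in range(len(lines) - window + 1):
            chunk = tuple(lines[i:i + window])
            if any(chunk):  # skip all-blank windows
                seen[chunk].append((name, i + 1))
    return {k: v for k, v in seen.items() if len(v) > 1}
```

Every location sharing a window is a candidate copy, so when an STR fix lands at one location, the report lists the other places where the same fix should be checked and propagated.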
RITE Current Toolsets • Dependency analysis • Lattix – http://www.lattix.com/ • Static analysis • FindBugs – http://findbugs.sourceforge.net • PMD – http://pmd.sourceforge.net • Klocwork – http://klocwork.com • Coverity – http://coverity.com • Dynamic analysis • Rational PurifyPlus suite • Automated functional testing • AutoIt – http://www.autoitscript.com/ • HP QuickTest – http://en.wikipedia.org/wiki/HP_QuickTest_Professional
MOSA/OA Statement of Work (SOW) Requirement • 4.1.1.3 Modular Open Systems Approach/Open System Architecture • The Contractor shall develop and maintain an Open Architecture that incorporates appropriate considerations for interoperability, supportability, composeability, technology insertion, vendor independence, reusability, scalability, upgradeability, and long-term supportability as required by the 23 DEC 2005 Office of the Chief of Naval Operations (OPNAV N6/7) requirements letter. • The Contractor shall develop an open architecture that supports a Modular Open System Approach (MOSA), and deliver an Open System Management Plan. The open architecture shall support a layered and modular implementation approach, which maximizes the use of available COTS technology (e.g., hardware, operating systems, software, and middleware) and Government-Off-The-Shelf (GOTS), where development is required for building systems. MOSA and analysis of long-term supportability, interoperability, and growth for future modifications shall be major factors in the Contractor's final integration approach. All of the system components shall facilitate future upgrades and permit incremental technology insertion to allow for incorporation of additional or higher performance elements with minimal impact on the existing systems. • The architectural approach shall provide a viable technology insertion methodology and refresh strategy that supports application of a MOSA and is responsive to changes driven by mission requirements and new technologies. • The Contractor shall develop a detailed open architecture modular design and integration that takes into consideration system interoperability, intra-operability, upgradeability, reconfigurability, transportability, software standards, interface standards, long-term supportability, sources of supply and/or repair, business strategies, and other entities that affect application of a MOSA. 
• For those portions of software that are driven to proprietary and/or closed system architectures by mission-specific requirements, software partitioning or other design features to mitigate the system-level impacts shall be provided to and approved by the Government. • The Contractor shall provide an orderly, planned approach to address migration of proprietary or closed software components or interfaces to a modular design when technological advances are available. The Contractor's modular design and integration shall preclude long-term dependence on closed or proprietary interface standards, technologies, products, or architectures. Secure or classified data systems shall also conform to the modular design approach as much as practicable. The design shall provide sufficient growth and open interface standards to allow future reconfiguration and addition of new capabilities without large-scale redesign of the system.
GPNTS OA Evaluation Criteria • Excerpt from Section M, Evaluation Factors for Award • The Government will evaluate the extent to which the Offeror's system design (hardware and software) addresses open architecture tenets and employs a modular, standards-based open systems approach, as well as the extent to which the Offeror's design incorporates the design tenets of interoperability, extensibility, maintainability, and composeability. The Government will evaluate the Offeror's approach for selecting and adhering to the standards it selected for inclusion in its system design. The Government will evaluate the extent to which the interfaces selected will accommodate technology insertion (hardware and software) and facilitate the reuse of alternative or reusable modular system elements. • If the Offeror proposes to reuse existing software as part of its design, the Government will evaluate the rationale for the software that was selected for reuse. • The Government will evaluate the extent to which the Offeror's system design minimizes reliance on proprietary, vendor-unique, or closed software and/or hardware elements and the extent to which the use of proprietary, vendor-unique, or closed elements hinders the Government's ability to do the following (see list below). In the event that proprietary, vendor-unique, or closed elements are included in the Offeror's system design, the Government will evaluate the Offeror's justification for selection of such components. The Offeror will receive favorable consideration for proposing a system design that minimizes reliance on proprietary, vendor-unique, or closed elements. 
• Enable Government to separately purchase its own equipment and assemble GPNTS configurations in Government labs; • Enable third party Government or contractor teams to integrate additional hosted applications; • Enable third party Government or contractor teams to do LRIP/Full Rate Production installations; • Enable third party Government or contractor teams to perform ISEA functions; • Enable hand off of design documents, install documents, and software to the LRIP/Full Rate Production contractor for production; and • Enable hand off of design documents and software to the LRIP/Full Rate Production contractor for modifying and extending GPNTS design and GPNTS software. • The Government will evaluate the extent to which the Offeror’s system design incorporates Non-Developmental Items and COTS items to meet the GPNTS performance requirements.