NDIA Software Industry Experts Panel
Paul R. Croll, Chair
NDIA Systems Engineering Division
Who We Are
• The NDIA Software Industry Experts Panel acts as a "voice of industry" in matters relating to DoD software issues
• The Panel helps identify and resolve software acquisition and development issues facing the industry and its Government customer base
• The Panel may also, from time to time, identify for investigation certain technologies or practices that promise to improve industry responsiveness to DoD needs
• Members
  • Paul Croll, CSC, Chair
  • JoAn Ferguson, General Dynamics
  • Gary Hafen, Lockheed Martin
  • Blake Ireland, Raytheon
  • Al Mink, SRA International
  • Ken Nidiffer, SEI
  • Shawn Rahmani, Boeing
  • Rick Selby, Northrop Grumman
What We Do
• Investigate, analyze, and develop recommendations concerning software issues, in response to NDIA and Government requests
• Provide industry comments on Government positions, initiatives, or work products
• Develop industry white papers and position papers
• Reach out to relevant stakeholders through NDIA conferences and other venues as appropriate
NDIA Top Software Issues Workshop, 24-25 August 2006
• Identify the top 5 software engineering problems or issues prevalent within the defense industry
• Document issues
  • Description and current state
  • Rationale and software impacts
• Develop recommendations (short term and long term)
• Generate task report
• Submit to OSD
Top Software Issues
• The impact of requirements on software is not consistently quantified and managed in development or sustainment
• Fundamental systems engineering decisions are made without full participation of software engineering
• Software life-cycle planning and management by acquirers and suppliers is ineffective
• The quantity and quality of software engineering expertise is insufficient to meet the demands of government and the defense industry
• Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems
• There is a failure to assure correct, predictable, safe, and secure execution of complex software in distributed environments
• Inadequate attention is given to total life-cycle issues for COTS/NDI impacts on life-cycle cost and risk
Defense Software Strategy Summit, 18-19 October 2006
• Keynote Address: the Honorable Dr. James I. Finley, Deputy Under Secretary of Defense (Acquisition and Technology)
• Program Executive Officer and Service/Defense Agency panels on software-related acquisition issues and initiatives
• Plenary session topics: NDIA Top Software Issues, Software Industrial Base Study, and Software Producibility
• Workshops
  • Software Acquisition and Sustainment
    • Mr. Mike Nicol, Air Force Aeronautical Systems Center
    • Mr. Lawrence T. Osiecki, US Army Armament Software Engineering Center
  • Policy
    • Mr. Jim Clausen, DoD CIO, Office of Commercial IT Policy
    • Col Peter Sefcik, Jr., USAF, Chief, Air Force Engineering Policy and Guidance Team
    • Lt Col Mark Wilson, SAF/AQR Systems & Software Engineering
  • Human Capital
    • Dr. Kenneth E. Nidiffer, Fellow, Systems and Software Consortium
    • Mr. George Prosnik, Defense Acquisition University E&T Center
  • Software Engineering Practices
    • Mr. Grady Campbell, Software Engineering Institute
    • Mr. Paul R. Croll, CSC, Industry Co-Chair, NDIA Software Committee
DoD Software Summit Issues
• Software Acquisition and Sustainment
  • Software issues not addressed early in the life cycle
  • Software requirements not well defined at program start
  • Management has limited visibility into software development processes and status
  • Risk areas: single-point failures not adequately addressed, e.g., single software providers, incomplete data rights, key personnel stability, life-cycle support of COTS
  • Acquirers do not adequately address software sustainment and the total life cycle early in the program
  • Some agencies contract before engineering is complete, prior to system design and development
• Policy
  • PMs need assistance with software policy and analysis
  • Arbitrary separation of weapon and information technology software policies
  • Policy implementation guidance and follow-up monitoring is limited
  • The Department needs a software group with strong expertise to oversee and implement policy
  • Need capability to share policy and guidance information
• Software Engineering
  • Weak linkage between software requirements and capabilities/portfolios
  • System development methods do not properly leverage software's ability to rapidly field new capability
  • Systems and software engineering life cycles not always consistent or harmonized
  • Software considerations not consistently addressed in architectures
  • Inadequate software estimating methods, e.g., for COTS/NDI; best practices not applied
• Human Capital
  • Experienced systems and software engineers seem missing from key DoD leadership positions
  • Shortage of highly experienced software managers, architects, domain and technical experts
  • Eroding depth and breadth of experience for personnel in DoD
  • Young people may consider systems and software engineering a career dead end
  • Emerging skill set may be needed for future complex DoD systems, e.g., systems of systems
• The Summit reaffirmed the NDIA Top Software Issues
Software Issues/Gaps Workshop Findings
Primary Software Focus Areas*
• Software Acquisition Management
  • Requirements – GAP
  • Prog Spt – O, All
  • Contract Language – A, M, N
  • SW Estimation – GAP
  • Lifecycle Policy – AF
  • Risk Identification – GAP
• SW & SE Integration
  • SE/SW Process Int – O
  • SW Council – N
  • SW Dev Plan – N
  • SW in SEP – N
  • SW in Tech Reviews – N
  • SW Quality Attributes – GAP
  • Standards – O, N
  • DAG Ch 4/7 – O, AF
• Software Development Techniques
  • Agile – O, SEI
  • Architecture – A, SEI
  • COTS – SEI
  • Open Source – AF
  • Sustainment – GAP
  • SW Interoperability – GAP
  • SW Test – GAP
• Human Capital
  • Education Sources – N, A
  • Leadership Training – A, SEI
  • SETA Quals – GAP
  • SW Human Cap Strategy – GAP
  • Industrial Base – O
  • University Curriculum – O
  • Workforce Survey – AF
• Knowledge Sharing
  • DAU Software ACC – DAU
  • Best Practices Clearinghouse – DAU, O
  • SW Inventory – L&MR
  • Lifecycle Guides – M, N
  • Root Cause Analysis – O
  • Local Knowledge Portals – N
• Data and Metrics
  • SW Metrics – A, O
  • SW Cost – O
  • SW EVM – DCMA
  • SW Estimation – GAP
Ongoing initiative owners: O – OSD/SSA; A – Army; N – Navy; AF – Air Force; M – MDA; SEI; DCMA; DAU; L&MR; GAP – no activity
*Based on NDIA Top SW Issues, OSD Program Support Reviews, and DoD Software Summit findings
Source: Kristen Baldwin, Deputy Director, Software Engineering and System Assurance, OUSD(AT&L), April 18, 2007
DoD Software Gaps
• Estimation
• Risk Identification
• Sustainment
• Interoperability
• Test
• Requirements
• Quality Attributes
• SETA Qualifications
• Human Capital Strategy
Source: Kristen Baldwin, Deputy Director, Software Engineering and System Assurance, OUSD(AT&L), April 18, 2007
Software Industry Experts Panel Action Plan
• Software Points of Influence list
  • Bi-directional commitment
• Software interested parties list
  • Information awareness
• Supporting resolution of identified DoD Software Gaps
  • Human Capital, Requirements
  • Risk, Quality Attributes
  • Test, Estimation
  • Sustainment
  • Interoperability, SETA
Overall Workshop Objectives
• Three workshop panels
  • Software Requirements
  • Software Risk and Estimation
  • Software Quality Attributes
• Objectives for each panel
  • Define a specific plan to crystallize concrete progress within the next 6-18 months
  • Define work products and a plan to develop them over 6-, 12-, and 18-month periods
  • Identify stakeholders relevant to each of these work products
Software In Acquisition Workshop
• Attendance
  • 100+ attendees
  • Services, Agencies, Industry, Academia, FFRDCs, NASA
• Workshop topics
  • Software Requirements
  • Software Estimation / Software Risk
  • Software Quality Attributes
Requirements Workshop Recommendations
• Define an effective "software portfolio" management framework
• Protect the continuity of systems/software and requirements engineering throughout the software life cycle
• Implement the techniques we know will work and identify any shortcomings
• Find ways to leverage the malleability of software
  • Software has the ability to adapt to changing requirements
• Change our perspective from "sustainment" to "continuous evolution"
• Establish a research program
Software Estimation/Risk Recommendations
• Establish Work Breakdown Structure guidance to better highlight software engineering activity
• Develop and evolve an integrated software data repository and related tools
• Conduct root cause analysis studies to understand the problems in software estimation and the use of estimates in the acquisition process
• Develop and implement an incremental acquisition approach (as well as the overall acquisition framework) that accommodates the uncertainty associated with early software estimates and allows for adjustment and refinement over time (a parametric estimation sketch follows this list)
• Establish policy, related guidance, and recommended implementation approaches for software data collection and analysis across all DoD acquisition programs
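To make the estimation discussion concrete, here is a minimal sketch of a parametric effort estimate in the style of COCOMO II, which was developed at USC's Center for Systems and Software Engineering (the group named later in this deck). The coefficient defaults follow the published COCOMO II.2000 model; the size, scale factor, and cost driver values are hypothetical inputs for illustration, not calibrated program data.

```python
# Minimal COCOMO II-style parametric effort estimate (illustrative only).
# A and B are the published COCOMO II.2000 defaults; the scale factor and
# effort multiplier inputs below are made-up example ratings.

def estimate_effort_pm(ksloc, scale_factors, effort_multipliers,
                       a=2.94, b=0.91):
    """Return estimated effort in person-months.

    ksloc              -- estimated size in thousands of source lines of code
    scale_factors      -- the five COCOMO II scale factor ratings
    effort_multipliers -- cost driver ratings (nominal = 1.0)
    """
    exponent = b + 0.01 * sum(scale_factors)  # diseconomy of scale
    product = 1.0
    for em in effort_multipliers:
        product *= em
    return a * (ksloc ** exponent) * product

# Hypothetical program: 250 KSLOC with mostly nominal ratings.
effort = estimate_effort_pm(
    ksloc=250,
    scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],  # example ratings
    effort_multipliers=[1.10, 0.95, 1.0],          # example cost drivers
)
print(f"Estimated effort: {effort:.0f} person-months")
```

A sketch like this also illustrates why early estimates are uncertain: small changes in the size or rating inputs move the output substantially, which is exactly the uncertainty the incremental acquisition recommendation is meant to accommodate.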
Software Quality Attribute Priority Recommendations
• Develop engineering guidance on quantitatively identifying, predicting, evaluating, verifying, and validating quality attributes
  • Address tie-in to KPPs and TPMs (a small worked example follows this list)
  • Identify methods for predicting quality attribute outcomes for the delivered system, throughout the life cycle
• Improve OSD/Service-level acquisition policy regarding quality attributes
  • Identify benefits of addressing software quality attributes as part of an acquisition risk reduction strategy
  • Address gaps in SEP, TEMP, JCIDS, DAG, and RFP language
  • Define expectations for quality attribute review during acquisition milestone reviews (e.g., PDR)
• Develop a taxonomy of software quality attributes and how they are related
• Develop Program Manager guidance introducing software architectural evaluation of quality attributes
• Develop a collaboration site for collecting data, sharing work products, and facilitating ongoing discussion
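As one illustration of treating a quality attribute quantitatively and tying it to a TPM, the sketch below computes steady-state availability from MTBF and MTTR and checks it against a threshold. The attribute choice, threshold value, and field data here are assumptions for illustration, not workshop outputs.

```python
# Illustrative sketch: quantifying one quality attribute (steady-state
# availability) so it can be tracked as a Technical Performance Measure.
# The threshold and observed reliability data below are hypothetical.

def steady_state_availability(mtbf_hours, mttr_hours):
    """Classic availability formula: A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

TPM_THRESHOLD = 0.999  # hypothetical KPP-derived availability requirement

observed = steady_state_availability(mtbf_hours=2000, mttr_hours=1.5)
status = "meets" if observed >= TPM_THRESHOLD else "misses"
print(f"Availability {observed:.5f} {status} the {TPM_THRESHOLD} threshold")
```

Expressing an attribute this way gives milestone reviewers a number to track over the life cycle rather than a qualitative claim, which is the intent behind tying quality attributes to KPPs and TPMs.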
Software In Acquisition Spring Workshop 2008
• The purpose of this workshop was to be a "touch point" for the actions that resulted from the SSA annual software workshop in October of the previous year
• Review of issues and recommendations in each of the areas covered at the October workshop
  • Requirements
  • Risk/Cost
  • Quality Attributes
• Three working groups
  • Software Requirements
  • Software Risk and Estimation
  • Software Quality Attributes
• Objectives for each working group
  • Task statements
  • Deliverables
  • Schedule
Outcome
• Knit together selected "big ideas" into a unified proposed initiative:
  • Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality
Task Definition
• Task 1: Conduct surveys and interviews of leading software professionals (government, industry, academia) to gather ideas, assess impacts, and sense expectations for Competitive Prototyping
• Task 2: Provide amplification of the Competitive Prototyping memo for integrated SE/SW, including where in the life cycle there are opportunities for Competitive Prototyping and how they can be contractually achieved
• Task 3: Identify first adopters of Competitive Prototyping; facilitate and gather insights on effective usage, including collecting and analyzing data
• Task 4: Develop guidance for RFP authors on early selection and application of integrated SE/SW quality systems for Competitive Prototyping
• Task 5: Develop a Software Engineering Handbook for Competitive Prototyping, including material explicitly targeted to different audiences (acquirer, supplier, etc.)
• Task 6: Develop training assets (materials, competencies, skill sets, etc.) that capture best-of-class ideas and practices for Competitive Prototyping
Competitive Prototyping Survey
• Purpose: "To gather recommendations, assess impacts, and sense expectations for Competitive Prototyping" from key members of government, industry, and academia
• Conducted by the Center for Systems and Software Engineering (CSSE) at the University of Southern California (USC)
• The domain of interest for the survey and interviews comprises projects considered large-scale "software-intensive systems" (SiS)
  • Systems for which software is a principal and defining component
  • Systems where software likely represents the key source of technological and programmatic risk in development
  • Projects whose software development costs are valued at $100 million or higher
OUSD(AT&L)/SSE-USC/CSSE CP Workshop
• OUSD(AT&L)/SSE-USC/CSSE Workshop on Integrating Systems and Software Engineering under Competitive Prototyping with the Incremental Commitment Model
• Washington, DC, July 14-17, 2008
• Discussion of the way forward from the survey through the remainder of the tasks
For More Information . . .
Paul R. Croll
CSC
17021 Combs Drive
King George, VA 22485-5824
Phone: +1 540.644.6224
Fax: +1 540.663.0276
E-mail: pcroll@csc.com