
A Practical Definition of “Software Quality” (Predictable and Measurable)






Presentation Transcript


1. A Practical Definition of “Software Quality” (Predictable and Measurable)
• Low Defect Potentials (< 1 per function point)
• High Defect Removal (> 95%)
• Unambiguous, Stable Requirements (< 2.5% change)
• Explicit Requirements Achieved (> 97.5% achieved)
• High User Satisfaction Ratings (> 90% “Excellent”), rated across: Installation, Ease of Learning, Ease of Use, Functionality, Compatibility, Error Handling, User Information (screens, manuals, etc.), Customer Support, and Defect Repairs

“Software developers and acquirers at firms that GAO visited use three fundamental management strategies to ensure the delivery of high-quality products on time and within budget: working in an evolutionary environment, following disciplined development processes, and collecting and analyzing meaningful metrics to measure progress.” (“Stronger Management Practices are Needed to Improve DoD’s Software-Intensive Weapon Acquisitions,” GAO Report, March 2004)
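The two headline thresholds above are simple ratios. The sketch below shows one way to compute them, assuming you already track total defect counts and function points; the project numbers are invented for illustration.

```python
# A minimal sketch of the slide's quality thresholds. The counts below are
# hypothetical; how defects are counted and attributed is program-specific.

def defect_potential(total_defects: int, function_points: int) -> float:
    """Defects from all sources per delivered function point."""
    return total_defects / function_points

def defect_removal_efficiency(removed_pre_release: int,
                              found_post_release: int) -> float:
    """Fraction of total known defects removed before release."""
    return removed_pre_release / (removed_pre_release + found_post_release)

# Hypothetical project: 1,000 function points, 900 defects found during
# development, 40 more reported after fielding.
potential = defect_potential(900 + 40, 1000)   # 0.94 defects per FP
dre = defect_removal_efficiency(900, 40)       # ~95.7%

print(f"Defect potential: {potential:.2f} per FP (target < 1)")
print(f"Defect removal:   {dre:.1%} (target > 95%)")
```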

2. Perspectives on Software Quality
THE ESSENCE: Does it do what it is supposed to?
[Quality-factor diagram: QUALITY branches into SUITABILITY (Efficiency, Completeness, Reliability, Accuracy, Usability, Consistency) and MAINTAINABILITY (Understandability, Modifiability, Testability, Portability)]

“Software problems and defects are among the few direct measurements of software processes and products. Problem and defect measurements also are the basis for quantifying several significant software quality attributes, factors, and criteria--reliability, correctness, completeness, efficiency, and usability among them.” (Software Quality Measurement, SEI-92-TR-22)
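For reporting purposes, the factor-and-criteria grouping from the diagram can be encoded directly. A minimal sketch, assuming a plain dictionary is enough for your tooling; the layout is an assumption, only the grouping comes from the slide.

```python
# The slide's quality-factor tree as a simple lookup structure.
QUALITY_MODEL = {
    "SUITABILITY": [   # "Does it do what it is supposed to?"
        "Efficiency", "Completeness", "Reliability",
        "Accuracy", "Usability", "Consistency",
    ],
    "MAINTAINABILITY": [
        "Understandability", "Modifiability", "Testability", "Portability",
    ],
}

for factor, criteria in QUALITY_MODEL.items():
    print(f"{factor}: {', '.join(criteria)}")
```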

3. The Software Quality Metrics Methodology
1) Establish Requirements
   a) Identify the list of possible quality requirements
   b) Determine the list of quality requirements
   c) Assign a direct metric to each quality requirement
2) Identify Metrics
   a) Apply the software quality metrics framework
   b) Perform a cost-benefit analysis
   c) Gain commitment to the metrics
3) Implement Metrics
   a) Define the data collection procedures
   b) Prototype the measurement process
   c) Collect the data and compute the metric values
4) Analyze Results
   a) Interpret the results
   b) Identify software quality
   c) Make software quality predictions
   d) Ensure compliance with requirements
5) Validate Metrics
   a) Apply the validation methodology
   b) Apply the validity criteria
   c) Apply the validation procedures
   d) Document results
(IEEE Standard for a Software Quality Metrics Methodology, IEEE Std 1061-1998)
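A minimal sketch of how steps 1c, 3c, and 4d fit together, assuming a simple requirement-to-metric mapping; the dataclass, example metrics, and targets are illustrative assumptions, not prescribed by IEEE Std 1061-1998 itself.

```python
from dataclasses import dataclass

@dataclass
class QualityRequirement:
    name: str                      # quality requirement (step 1b)
    direct_metric: str             # direct metric assigned to it (step 1c)
    target: float                  # required value, checked in step 4d
    measured: float | None = None  # filled in during data collection (step 3c)

    def compliant(self) -> bool:
        """Step 4d: ensure compliance with the requirement."""
        return self.measured is not None and self.measured >= self.target

requirements = [
    QualityRequirement("Reliability", "mean time to failure (hours)", 500.0),
    QualityRequirement("Usability", "tasks completed without help (%)", 90.0),
]

requirements[0].measured = 620.0  # hypothetical collected value
for r in requirements:
    status = "compliant" if r.compliant() else "not met / no data"
    print(f"{r.name} ({r.direct_metric}): {status}")
```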

4. Goal-Question-Metric (GQM) Measurement Methodology SOFTWARE ACQUISITION GOLD PRACTICE™
The Goal-Question-Metric (GQM) measurement methodology is a way to implement goal-driven measurement:
STEP 1: Identify goals
STEP 2: Identify the questions that need to be asked if the goal is to be achieved
STEP 3: Identify an indicator to display the answers to the questions from STEP 2
STEP 4: Identify measures that can satisfy the questions (PSM method)
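The four steps produce a goal-to-question-to-metric tree. A minimal sketch of that structure follows; the goal, questions, and metrics are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str                                          # STEP 2
    metrics: list[str] = field(default_factory=list)   # STEP 4

@dataclass
class Goal:
    statement: str                                     # STEP 1
    questions: list[Question] = field(default_factory=list)

goal = Goal(
    "Reduce post-release defects in the next increment",
    questions=[
        Question("How many defects escape our reviews and tests?",
                 metrics=["defects found in operation",
                          "defect removal efficiency"]),
        Question("Where are defects introduced?",
                 metrics=["defects by phase of origin"]),
    ],
)

# STEP 3 (the indicator) would be a chart or table built from these metrics.
for q in goal.questions:
    print(f"Q: {q.text} -> metrics: {q.metrics}")
```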

5. GOAL-DRIVEN MEASUREMENT (GDM) SOFTWARE ACQUISITION GOLD PRACTICE™

6. Key Practices of the GQM Approach SOFTWARE ACQUISITION GOLD PRACTICE™
• Get the right people (at all levels of the development organization) involved in the GQM process
• Set explicit measurement goals and state them explicitly
• Thoroughly plan the measurement program and document it (explicit and operational definitions)
• Don’t create false measurement goals
• Acquire implicit quality models from the team
• Consider context
• Derive appropriate metrics
• Stay focused on goals when analyzing data

7. Key Practices of the GQM Approach (Cont’d) SOFTWARE ACQUISITION GOLD PRACTICE™
• Let the data be interpreted by the people involved
• Integrate the measurement activities with regular project activities
• Do not use measurement data for purposes other than those agreed on
• Secure management commitment to support measurement results
• Establish an infrastructure to support the measurement program
• Ensure that measurement is viewed as a tool, not the end goal
• Get training in GQM before going forward
Optional: Goal-Question-Metric Exercise (Link)
DACS Gold Practices Website: https://www.goldpractices.com/practices/gqm/

8. Example: Typical Costs of Software Fixes
*Once a system is fielded, post-deployment software support (PDSS) costs are typically 50-70% of total system lifecycle costs. On a program with a $100M lifecycle cost, for example, that is $50-70M spent after fielding.

9. Other Software Quality and Capability Initiatives
• Practical Software and Systems Measurement (PSM)
  - Best practices from within the software/system acquisition and engineering communities
  - Goal is to provide project managers with the information needed to meet cost, schedule, and technical objectives on programs
• Control Objectives for Information and related Technology (COBIT)
  - Provides good practices across a domain and process framework
  - Practices designed to help optimize IT-enabled investments, ensure service delivery, and provide a measure against which to judge when things go wrong
• Information Technology Infrastructure Library (ITIL)
  - Provides international best practices for IT service management
  - Consists of a series of books giving guidance on the provision of quality IT services and on the accommodation and environmental facilities needed to support IT
• SPICE (Software Process Improvement and Capability Determination, ISO/IEC 15504)
  - An international standard for software process assessment
  - Derived from the process lifecycle standard ISO/IEC 12207 and from the ideas of maturity models such as Bootstrap, Trillium, and the CMM

10. Schedule and Progress
Categories: Milestone completion • Work unit progress • Incremental capability
Prospective measures:
• Requirements traced • Requirements tested • Requirements status
• Problem reports opened • Problem reports closed
• Reviews completed
• Change requests opened • Change requests resolved
• Units designed • Units coded • Units integrated
• Test cases attempted • Test cases passed
• Action items opened • Action items completed
• Components integrated • Functionality integrated
Selecting measures follows the Goal → Question → Metric chain; a small example of using these counts follows.
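A sketch of turning a few of these raw counts into progress indicators, assuming the counts are cumulative; the numbers and the choice of ratios are illustrative, and which indicators matter is program-specific.

```python
# Hypothetical cumulative counts pulled from a program's tracking system.
counts = {
    "requirements_traced": 420, "requirements_tested": 378,
    "test_cases_attempted": 250, "test_cases_passed": 231,
    "problem_reports_opened": 96, "problem_reports_closed": 88,
}

# Simple progress ratios derived from the prospective measures above.
req_tested = counts["requirements_tested"] / counts["requirements_traced"]
test_pass = counts["test_cases_passed"] / counts["test_cases_attempted"]
# Opened minus closed gives the currently unresolved problem reports,
# assuming both counts are cumulative since program start.
pr_open = counts["problem_reports_opened"] - counts["problem_reports_closed"]

print(f"Requirements tested:  {req_tested:.0%}")
print(f"Test cases passing:   {test_pass:.0%}")
print(f"Open problem reports: {pr_open}")
```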

11. References
• The Data and Analysis Center for Software (DACS): https://www.thedacs.com/
• Practical Software and Systems Measurement (PSM): http://www.psmsc.com
• Software Engineering Information Repository (SEIR): https://seir.sei.cmu.edu/seir/
• Software Program Manager’s Network (SPMN): http://www.spmn.com/lessons.html
• Software Engineering Institute (SEI), Carnegie Mellon: http://www.sei.cmu.edu/
• DoD Information Technology Standards Registry (DISR Online): https://disronline.disa.mil/a/public/index.jsp
• Best Practices Clearinghouse: https://acc.dau.mil/sam
