Software Quality Engineering Roadmap
Business Objectives • Maximize Revenue • Minimize Costs
Maximize Revenue • Improve market share: • Improve product quality • Increase customer satisfaction • Better marketing • Improve cycle time • Increase product value: • Increase product quality • Define value proposition and match product capabilities and quality to customer needs
Improve Market Share • Improve product quality • Increase customer satisfaction • Identify and address key customer concerns • Continuous improvement process • Practices for measuring and improving satisfaction • Expectation management practices • Better marketing • Match product to marketing needs • Improved requirements practices • Practices for measuring and improving marketing effectiveness • Improve cycle time • Cycle time measurement and continuous improvement • Increase product value • Define value proposition and match product capabilities and quality to customer needs
Minimize Costs – Improve Productivity • Productivity measurement and continuous improvement • Practices for improving productivity: • Tools usage • Technology selection • Reuse and common architecture development • Streamlining and automation • Workflow, templates, tools • Task elimination • Developer capability improvement through training
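Productivity measurement, the first bullet above, can be made concrete as size delivered per unit of effort, tracked per release. A minimal sketch, with illustrative release figures (not from the roadmap):

```python
# Productivity as delivered KLOC per person-month, tracked per release.
# The release data below is hypothetical, for illustration only.
def productivity(size_kloc, effort_person_months):
    """Delivered KLOC per person-month for one release."""
    return size_kloc / effort_person_months

releases = [("R1", 40, 80), ("R2", 46, 85), ("R3", 55, 90)]
trend = [round(productivity(size, effort), 2) for _, size, effort in releases]
# A rising trend is the continuous-improvement signal the slide asks for.
```

Size could equally be function points; the point is a consistent unit so the trend line is comparable across releases.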
Improve Project Management – Activity Management • Practices: planning, estimation, progress tracking • Metrics: estimation accuracy, progress metrics • Estimation practices: • Maintain database of past projects • Estimate based on history of comparable projects • COCOMO (Constructive Cost Model) provides a questionnaire and formula for estimation based on organizational and project profile
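The basic COCOMO formula mentioned above estimates effort as a power function of code size, effort = a × KLOC^b person-months, with published coefficients per project mode. A sketch using the standard basic-model coefficients and an illustrative project size:

```python
# Basic COCOMO: effort (person-months) = a * KLOC ** b.
# Coefficients are the published basic-model values per project mode.
COCOMO_BASIC = {
    "organic":       (2.4, 1.05),  # small teams, familiar problem
    "semi-detached": (3.0, 1.12),  # mixed experience
    "embedded":      (3.6, 1.20),  # tight hardware/operational constraints
}

def estimate_effort(kloc, mode="organic"):
    a, b = COCOMO_BASIC[mode]
    return a * kloc ** b

# Illustrative: a 32 KLOC organic project -> roughly 91 person-months.
effort = estimate_effort(32, "organic")
```

The full model adjusts this base estimate with cost drivers taken from the organizational and project profile the slide refers to.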
Improve Project Management – Risk Management • Identify and track risks • Metrics examples: • Percentage of unidentified high-impact risks among those that actually occurred • Percentage of unmitigated high-impact risks among those that actually occurred • Percentage of effort spent on risk management and mitigation
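The first two risk metrics above can be computed from a log of high-impact risks that actually occurred, each tagged with whether it had been identified and mitigated beforehand. A sketch with hypothetical risk records:

```python
# Hypothetical log of high-impact risks that actually occurred.
occurred_high_impact = [
    {"name": "key developer left",  "identified": True,  "mitigated": True},
    {"name": "vendor API change",   "identified": True,  "mitigated": False},
    {"name": "data-center outage",  "identified": False, "mitigated": False},
]

def pct(predicate, risks):
    """Percentage of risks satisfying the predicate."""
    return 100.0 * sum(1 for r in risks if predicate(r)) / len(risks)

unidentified_pct = pct(lambda r: not r["identified"], occurred_high_impact)
unmitigated_pct = pct(lambda r: not r["mitigated"], occurred_high_impact)
```

A high unidentified percentage points at weak risk identification; a high unmitigated percentage at weak follow-through on risks that were seen coming.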
Improve Project Management – People Management • Practices to measure and improve employee satisfaction • Teaming practices and effectiveness: • Metrics: Turnover %
Improve Project Management – In-Process Metrics • Slippage, progress • Inspection effectiveness (evolve as project proceeds) • Defect rates • Test progress • Reliability growth curves
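One common form of the reliability growth curve named above is the Goel-Okumoto model, m(t) = a(1 − e^(−bt)): the expected cumulative failures m(t) rise during testing and flatten toward the total-defect estimate a. A sketch with illustrative parameters (the model choice and numbers are assumptions, not prescribed by the slide):

```python
import math

# Goel-Okumoto reliability growth: expected cumulative failures by test time t.
# a = estimated total defects, b = detection rate; both illustrative here.
def expected_failures(t, a=100.0, b=0.05):
    return a * (1.0 - math.exp(-b * t))

early = expected_failures(10)   # steep part of the curve
late = expected_failures(100)   # flattening toward a: testing is saturating
```

When the observed curve flattens well below the target reliability, that is an in-process signal to act before release, which is the point of tracking it during the project.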
Product Quality - Features • Requirements quality & effectiveness • Requirements prioritization & release planning
Product Quality - Performance • Defining performance objectives • Practices: designing for performance, performance analysis, performance testing • Primarily direct measurements rather than derived metrics
Product Quality - Dependability • Reliability engineering: objective setting, practices and metrics • Availability: objective setting, practices and metrics • Security: • Practices for identifying threats (defining objectives) • Security engineering practices • Security testing: threat injection • Safety (similar to security and reliability)
Product Quality - Usability • Specifying usability objectives • Usability engineering practices • Usability measurements: surveys, average learning time, keystroke counts for frequent operations, etc. • Usability metrics: SUMI (Software Usability Measurement Inventory)
Product Quality - Evolvability • Evolvability goals • Practices for improving evolvability
Development Effectiveness • Requirements • Design • Testing & Inspections • Configuration Management • Quality Engineering
Development Effectiveness - Requirements • Requirements effectiveness practices • Elicitation practices, prototyping, requirements analysis practices, traceability • Requirements metrics • Activity metrics, also requirements defect injection rate
Development Effectiveness - Design • Practices for design effectiveness • Modularity practices, design patterns for achieving various quality attributes, design diagramming, design analysis techniques • Design metrics • Activity metrics, injection rates, product quality metrics, reuse percentage
Development Effectiveness – Testing & Inspections • Practices for testing effectiveness • Multistage testing, test strategy & planning, test case identification through equivalence classes, automation of test generation, randomized parameters, automated testing • Reliability engineering • Test metrics • DRE (defect removal efficiency) charts, defect rates and densities, reliability … • Inspection practices and metrics
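Two of the test metrics above are simple ratios: defect removal efficiency (DRE) is the percentage of total defects caught before release, and defect density is defects per KLOC. A sketch with illustrative counts:

```python
# DRE: percentage of all known defects removed before release.
def defect_removal_efficiency(found_before_release, found_after_release):
    total = found_before_release + found_after_release
    return 100.0 * found_before_release / total

# Defect density: defects per thousand lines of code.
def defect_density(defects, kloc):
    return defects / kloc

# Illustrative counts: 450 defects caught in testing/inspection,
# 50 escaped to the field, on a 125 KLOC product.
dre = defect_removal_efficiency(450, 50)
density = defect_density(500, 125)
```

Charting DRE per release, and density per module, turns these into the trend and hotspot views the slide's "DRE charts" bullet implies.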
Development Effectiveness – Configuration Management • Configuration management practices • Metrics for configuration management effectiveness • Configuration management defect injection rate
Development Effectiveness – Quality Engineering Practices • Processes • Process Improvement / Continuous Improvement • Assessments
Quality Engineering Process • Initial process definition and tailoring • Process compliance metrics (certifications, assessments) • Process improvement • Improve effectiveness of practices • Improve breadth of process & practices (increased “maturity level”, CMMI-style) • Minimize volume of process, maximize effectiveness (agile development) • Define set of processes that address the most frequent causes of defects and achieve more of the business objectives: maximizing cost-benefit ratio for process practices • Requires deep understanding of process/quality relationships, high developer maturity and competence, and extensive customization to situational needs • A stage of evolution beyond simply adding process breadth and doing improvement.
Continuous Improvement • Quality tools and causal analysis, defect elimination • Identify problem area • Use fishbone diagrams to brainstorm and capture possible causes • Use the results to design a data collection scheme and gather data about the actual causes of the problem. This is an extremely important step: the fishbone itself is just a set of guesses and does not tell you which causes need addressing • Use Pareto charts to identify the most significant causes of the problem • Use defect elimination techniques (checklists, templates, etc.) to try to prevent the problem in the future • Repeat continuously or until the objective is achieved • Effectiveness measured through trend lines for various process metrics
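The Pareto step above reduces to: rank the measured causes by frequency and keep the "vital few" that account for roughly 80% of occurrences. A sketch with hypothetical cause counts from the data-collection step:

```python
# Hypothetical defect-cause counts gathered after the fishbone step.
cause_counts = {
    "unclear requirements": 42,
    "interface mismatch":   25,
    "missing null checks":  18,
    "config errors":         9,
    "other":                 6,
}

# Rank causes by frequency (the bars of a Pareto chart).
ranked = sorted(cause_counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(cause_counts.values())

# Keep the smallest prefix of causes covering at least 80% of defects.
vital_few, cumulative = [], 0
for cause, count in ranked:
    vital_few.append(cause)
    cumulative += count
    if cumulative / total >= 0.80:
        break
```

The defect-elimination techniques (checklists, templates) are then aimed only at the causes in `vital_few`, which is where the cost-benefit ratio is highest.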
Metrics Concepts • Defining Metrics • GQM paradigm • Metrics must have significant value: High impact (value provided to user of metric) compared to effort spent in collecting data. • Meaningfulness of metrics • Reliability • Validity • Importance of interpretation • Gathering Measurements
Goal-Question-Metric (GQM) Paradigm • What are the objectives for the metric? • Should identify the purpose of the metric – who wants to do what with it? • Includes “viewpoint”: whose viewpoint are we interested in for this? E.g. developer, project manager, organization manager, quality engineer, external assessor, customer. • What specific questions do we want answered? • Identify specific aspects of the problem that we care about • Includes operational definitions of the quantities we want to measure • Define measurements and metrics • What data will need to be gathered? Who will gather it and how? • What metric will be computed from the measurements? • Metrics must provide a basis for action • Metrics must have an interpretation: given a value, how will this be interpreted? • Metrics must have significant value: high impact (value provided to user of metric) compared to effort spent in collecting data • Meaningfulness of metrics • Reliability • Validity
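A GQM tree can be written down as plain data, which makes the goal-to-question-to-metric traceability explicit. A sketch with a hypothetical goal (estimation accuracy from the project manager's viewpoint, echoing the activity-management slide); the specific questions and metrics are illustrative:

```python
# One hypothetical GQM tree: goal -> questions -> metrics.
gqm = {
    "goal": "Improve estimation accuracy, from the project manager's viewpoint",
    "questions": {
        "How far off are our effort estimates?": [
            "relative error = (actual - estimated) / estimated",
        ],
        "Is accuracy improving across releases?": [
            "trend of mean relative error per release",
        ],
    },
}

# Every metric traces to a question, every question to the goal, so each
# number has an owner, a purpose, and a basis for interpretation.
all_metrics = [m for metrics in gqm["questions"].values() for m in metrics]
```

Writing the tree out this way also exposes metrics with no question behind them, which are exactly the low-value measurements the slide warns against collecting.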
Importance of Interpretation • Understanding the intrinsic limitations of the metric • Understanding the limitations of the data collection process • Understanding situational factors that provide context to the numbers
Gathering Measurements • Minimizing measurement effort: • Effort spent on providing data is “wasted” effort from the project perspective – it needs to be minimized! • Often, the person who has to put in the effort is different from the one who gets immediate value! The higher the effort, the less the motivation for the provider to supply good data • Need for tools: • Tools improve accuracy of data, especially if collection is automated • May sometimes need to provide for manual adjustments to tool-generated data • Tools can automate processing of data and generate useful views (graphs) directly • Motivating measurement: • Set a high bar on value: only ask for measurements that provide considerable value • Make value visible to people providing measurements (clear how the gathered data will be used) • Try to provide value to those providing measurements: • e.g. customer concerns actually get addressed • e.g. developers get useful feedback that helps them avoid problems and reduce effort
Metrics Perspective • Metrics are a small part of the overall quality engineering process. • Define objectives for metrics. • Metrics interpretation is key. • Actions and improvements based on metrics interpretation are the primary value achieved in the use of metrics.