Software Measurement Activities: Software Measurement Framework
SEG3202, N. El Kadri
Software Measurement Activities
• Cost and effort estimation models and measures
• Productivity models and measures
• Data collection
• Quality models and measures
• Reliability models
• Performance evaluation and models
• Structural and complexity metrics
• Capability maturity assessment
• Management by metrics
• Evaluation of methods and tools
Cost and effort estimation
• Managers must plan projects by predicting the necessary cost and effort and assigning resources appropriately.
• Doing this accurately has become one of the "holy grail" quests of software engineering.
• Numerous measurement-based models for software cost and effort estimation have been proposed and used.
• Examples: Boehm's COCOMO model, Putnam's SLIM model, and Albrecht's function points model.
Simple COCOMO Model: Effort Prediction
• Effort = a(size)^b
• Effort is measured in person-months; size is the predicted size, in thousands of lines of code (KLOC).
• a, b are constants depending on the type of system:
  • "organic": a = 2.4, b = 1.05
  • "semi-detached": a = 3.0, b = 1.12
  • "embedded": a = 3.6, b = 1.2
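As a worked example, here is a minimal sketch of the effort equation above; the constants are the ones from the slide, and the 32 KLOC input is an illustrative value, not from the slide.

```python
# Basic COCOMO: Effort (person-months) = a * (size in KLOC)^b,
# with a and b chosen by system type as on the slide.
COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(size_kloc: float, mode: str) -> float:
    """Predicted effort in person-months for a predicted size in KLOC."""
    a, b = COCOMO_CONSTANTS[mode]
    return a * size_kloc ** b

# Illustrative example: a predicted 32 KLOC embedded system.
print(round(cocomo_effort(32.0, "embedded"), 1))  # -> 230.4 person-months
```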
Albrecht's Function Points
[Figure: example system, a spelling checker interacting with its users and a dictionary]
• Count the number of:
  • External inputs
  • External outputs
  • External inquiries
  • External files
  • Internal files
• giving each a "weighting factor".
• The Unadjusted Function Count (UFC) is the sum of all these weighted scores.
• To get the Adjusted Function Count (FP), multiply by a Technical Complexity Factor (TCF): FP = UFC * TCF
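The sketch below computes FP = UFC * TCF. The slide gives no weighting factors, so the weights here are the commonly cited "average complexity" values from the IFPUG tables, and the spelling-checker counts and TCF value are hypothetical.

```python
# Illustrative weights per item type (assumption: the common IFPUG
# average-complexity values; the slide itself gives no weights).
WEIGHTS = {
    "external_inputs":    4,
    "external_outputs":   5,
    "external_inquiries": 4,
    "external_files":     7,
    "internal_files":     10,
}

def function_points(counts: dict, tcf: float) -> float:
    """FP = UFC * TCF, with UFC the sum of weighted item counts."""
    ufc = sum(WEIGHTS[kind] * n for kind, n in counts.items())
    return ufc * tcf

# Hypothetical counts for the spelling-checker example, TCF = 0.95.
counts = {"external_inputs": 2, "external_outputs": 3,
          "external_inquiries": 2, "external_files": 1,
          "internal_files": 1}
print(function_points(counts, 0.95))  # UFC = 48, FP = 45.6
```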
Productivity models and measures
• Traditional model: simply divide size (LOC) by effort (person-months).
• [Figure: a productivity model that decomposes productivity into measurable attributes]
• This model gives a significantly more comprehensive view of productivity than the traditional one.
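For concreteness, a one-function sketch of the traditional measure; the numbers are illustrative.

```python
def productivity(size_loc: int, effort_person_months: float) -> float:
    """Traditional productivity: size (LOC) divided by effort."""
    return size_loc / effort_person_months

# Illustrative: 16,000 LOC delivered for 40 person-months of effort.
print(productivity(16_000, 40.0))  # -> 400.0 LOC per person-month
```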
Data Collection
• Effective use of measurement depends on careful data collection.
• Ensure that measures are defined unambiguously, that collection is consistent and complete, and that data integrity is not at risk.
• This requires carefully planned data collection, as well as thorough analysis and reporting of the results.
• Example: failure data collection
  1) Time of failure
  2) Time interval between failures
  3) Cumulative failures up to a given time
  4) Failures experienced in a time interval
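A minimal sketch of a record type supporting the four failure measures above; the class and field names are illustrative, not from the slide.

```python
from bisect import bisect_right
from dataclasses import dataclass

@dataclass
class FailureLog:
    failure_times: list  # (1) times of failure, sorted, in execution hours

    def interfailure_times(self):
        """(2) time intervals between consecutive failures."""
        t = self.failure_times
        return [b - a for a, b in zip(t, t[1:])]

    def cumulative_failures(self, t):
        """(3) number of failures observed up to time t."""
        return bisect_right(self.failure_times, t)

    def failures_in_interval(self, t1, t2):
        """(4) number of failures experienced in the interval (t1, t2]."""
        return self.cumulative_failures(t2) - self.cumulative_failures(t1)

log = FailureLog([2.0, 5.5, 6.0, 11.0])
print(log.interfailure_times())             # [3.5, 0.5, 5.0]
print(log.cumulative_failures(6.0))         # 3
print(log.failures_in_interval(5.0, 12.0))  # 3
```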
Quality Models
• Models of quality for various views of software quality are constructed in a tree-like fashion.
• The tree describes the pertinent relationships between factors and their dependent criteria, so we can measure the factors in terms of the dependent criteria measures.
• The upper branches hold important high-level quality factors of software products, such as reliability and usability, that we would like to quantify.
• Each quality factor is composed of lower-level criteria, such as modularity and data commonality.
• The criteria are easier to understand and measure than the factors; thus, actual measures (metrics) are proposed for the criteria.
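An illustrative fragment of such a tree as a nested data structure, with factors at the top, criteria beneath them, and candidate metrics at the leaves; the specific criteria and metrics shown are examples, not taken from the slide.

```python
# factor -> criteria -> metrics (all entries illustrative)
quality_tree = {
    "reliability": {
        "maturity":       ["defect density", "mean time between failures"],
        "recoverability": ["mean time to restore"],
    },
    "usability": {
        "learnability": ["time to learn key tasks"],
        "operability":  ["user error rate per session"],
    },
}

def metrics_for(factor: str) -> list:
    """All metrics that (indirectly) measure a high-level factor."""
    return [m for metrics in quality_tree[factor].values() for m in metrics]

print(metrics_for("reliability"))
```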
ISO 9126 Quality Model
• Factors → Criteria → Metrics (see ISO 9126-2, ISO 9126-3)
Reliability Models
• Most quality models include reliability as one of their component factors.
• Software reliability modeling is applicable during the implementation phase of software QA.
• Models are based on observing and recording information about software failures during test or operation.
Reliability Models
• Plot the change of failure intensity against time.
• The most famous reliability models are the basic exponential model and the logarithmic Poisson model:
  • the basic exponential model assumes finite failures in infinite time;
  • the logarithmic Poisson model assumes infinite failures.
• Automated tools such as CASRE are available.
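A sketch of the two failure-intensity curves, using Musa's basic execution-time formulation for the exponential model and the Musa-Okumoto formulation for the logarithmic Poisson model; the parameter values are illustrative assumptions, not from the slide.

```python
import math

def basic_intensity(t, lambda0, nu0):
    """Basic exponential model: finite failures (nu0 in total), so the
    failure intensity decays exponentially toward zero."""
    return lambda0 * math.exp(-lambda0 * t / nu0)

def log_poisson_intensity(t, lambda0, theta):
    """Logarithmic Poisson model: infinite failures, so the intensity
    decays only hyperbolically and never reaches zero."""
    return lambda0 / (lambda0 * theta * t + 1.0)

# Illustrative parameters: initial intensity 20 failures/hour, 100 total
# expected failures (basic model), decay parameter 0.025 (log Poisson).
for t in (0, 10, 100):
    print(t, round(basic_intensity(t, 20.0, 100.0), 3),
             round(log_poisson_intensity(t, 20.0, 0.025), 3))
```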
Performance Evaluation and Models
• Performance models include externally observable system performance characteristics, such as response times and completion rates.
• Performance modeling is part of the implementation and maintenance phases of software QA.
• Performance specialists also investigate:
  • the efficiency of algorithms, as embodied in computational and algorithmic complexity [Harel 1992]
  • the inherent complexity of problems, measured in terms of the efficiency of an optimal solution.
Structural and complexity metrics
• We measure structural attributes of representations of the software that are available before implementation:
  • Control-flow structure
  • Data-flow structure
  • Data structure
  • Information-flow attributes
• Complexity metrics (1979~):
  • Halstead's "Software Science" metrics
  • McCabe's "Cyclomatic Complexity" metric (McCabe 1989): the number of independent paths through a program's execution
• Influenced by:
  • growing acceptance of structured programming
  • notions of cognitive complexity
McCabe's Cyclomatic Complexity
• If G is the control flowgraph of program P, and G has e edges and n nodes, then v(P) = e - n + 2.
• v(P) is the number of linearly independent paths in G.
• Example: for a flowgraph with e = 16 and n = 13, v(P) = 5.
• More simply, if d is the number of decision nodes in G, then v(P) = d + 1.
• McCabe proposed v(P) < 10 for each module P.
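Both formulations from the slide in a quick sketch, checked against the slide's example figures:

```python
def cyclomatic_from_graph(edges: int, nodes: int) -> int:
    """v(P) = e - n + 2, from the control flowgraph."""
    return edges - nodes + 2

def cyclomatic_from_decisions(decisions: int) -> int:
    """v(P) = d + 1, from the number of decision nodes."""
    return decisions + 1

# The slide's example: e = 16, n = 13 gives v(P) = 5,
# which matches a flowgraph with 4 decision nodes.
assert cyclomatic_from_graph(16, 13) == 5
assert cyclomatic_from_decisions(4) == 5
```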
Management by metrics
• Estimate project elements such as cost, schedules, and staffing profiles.
• Track project results against planning estimates.
• Validate the organizational models as the basis for improving future estimates.
Measurement for Guiding Management: Example
• Assume that an organization's goal is to decrease the error rate in delivered software while maintaining (or possibly improving) its level of productivity.
• Further assume that the organization has decided to change the process by introducing the Cleanroom method.
• The Software Engineering Laboratory (SEL) assessed the impact of introducing the Cleanroom method.
• The results of the experiment appear to provide preliminary evidence of the expected improvement in reliability following introduction of the Cleanroom method, and may also indicate an improvement in productivity.
Evaluation of methods and tools
• Efficiency of methods (1991~)
• Efficiency and reliability of tools
• Certification testing of acquired tools and components
Capability Maturity Assessment
• US Software Engineering Institute (SEI) model (1989): grading on a five-level scale.
• ISO 9001: Quality systems: models for quality assurance in design/development, production, installation, and servicing (1991).
• ISO 9000-3: Guidelines for the application of ISO 9001 to the development, supply, and maintenance of software (1991).
Capability Maturity Model (CMM)
5. Optimizing (continuously improving process): Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.
4. Managed (predictable process): Detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled.
3. Defined (standard, consistent process): The software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization. All projects use an approved, tailored version of the organization's standard software process for developing and maintaining software.
2. Repeatable (disciplined process): Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.
1. Initial: The software process is characterized as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort.
Software Measurement Program
• Measurement is the mechanism for providing feedback on software quality.
• A measurement program without a clear purpose will result in frustration, waste, annoyance, and confusion.
• To be successful, a measurement program must be viewed as one tool in the quest for the improved engineering of software.
SE Standards
• ISO/IEC 9126: Software product evaluation: quality characteristics and guidelines for their use
• ISO/IEC 15939:2002: Software measurement process
• ISO 9000: standards used to regulate internal quality and to assure the quality of suppliers; measurement is part of ISO 9000
• IEEE 1061: Software Quality Metrics Methodology
• IEEE 1045: Software Productivity Metrics
Clarification: Metrics vs. Measures vs. Measurements
• Metrics are commonly accepted scales that define measurable attributes of entities, their units, and their scopes.
• A measure is a relation between an attribute and a measurement scale.
• In the literature, "measurements", "measures", and "metrics" are often used as synonyms.
Rigorous Measurement Framework
• Measurement = data collection + context
• Data collection requires knowing:
  • why you are collecting the data
  • how you plan to use the data
  • the purpose or destination of the data (e.g., improving the quality of your software from some perspective)
  • the trade-off between costs and benefits
How to Build a Valid Measurement Context
• Points of view on software development:
  • Strategic: long-term performance of the organization
  • Tactical: short-term performance of an individual process
  • Technical: details of products and processes that influence the development processes and products
• Classes of software development objects:
  • Products
  • Processes
  • Resources
• Measurement context = selected point(s) of view + selected object(s)
Views of Measurement: Strategic View
• The organization's goals are stated in measurable terms.
• Measures of products, projects, and resources are summarized as means or medians, with some indication of variability:
  • Unit cost (labor hours / size)
  • Defect rate (delivered defects / size)
  • Cycle time (project days / size)
• The strategic view tracks trends in these summary statistics.
• Strategic data is used to determine whether, and how well, those goals are being met.
• Primary user of strategic measurement data: the strategic manager.
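A sketch of the three summary statistics listed above, computed per project and then summarized with a median; the project data and field layout are illustrative.

```python
from statistics import median

# (labor_hours, size_kloc, delivered_defects, project_days), illustrative
projects = [
    (4000, 20, 14, 180),
    (2500, 10, 11, 120),
    (9000, 45, 30, 300),
]

unit_costs   = [h / s for h, s, _, _ in projects]  # labor hours / KLOC
defect_rates = [d / s for _, s, d, _ in projects]  # delivered defects / KLOC
cycle_times  = [t / s for _, s, _, t in projects]  # project days / KLOC

# The strategic view tracks trends in these medians over time.
print(median(unit_costs), median(defect_rates), median(cycle_times))
```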
Views of Measurement: Tactical View
• Concerned with the performance of an individual project.
• Measurement data is used to:
  • compare actual results to target (estimated or planned) results, noting and investigating any variances (e.g., the defect discovery rate during inspection or testing activities)
  • predict values of certain indirect project measures (e.g., using project size to predict cost and schedule)
• Primary user of tactical measurement data: the project manager.
Views of Measurement: Tactical View
[Figure: how the project manager uses tactical measurement data]
Views of Measurement: Technical View
• Physically, all measurement takes place at the technical level.
• All measures used at the strategic and tactical levels are built from fundamental technical measures.
• Strategic and tactical users of measurement data depend on technical users to supply the data.
• Technical measures focus on a set of internal attributes of a single product or process and are highly dependent on the technology in the product.
• Primary user of technical measurement data: the software engineer.
Objects of Measurement
• The first obligation of a measurement effort is to identify the objects to be measured:
  • Processes, products, resources
• We measure attributes of those objects:
  • Internal: measured purely in terms of the process, project, product, or resource itself
  • External: can be measured only with respect to how the process, project, product, or resource relates to its environment
Objects of Measurement: Process
• Processes are measured by comparing instance measurements to each other over time.
• Direct internal process measures include, for example, the duration of the process, the effort it consumes, and the number of incidents of a specified type (such as defects found) arising from it.
Objects of Measurement: Process
• Indirect internal process measures
• External process measures:
  • Productivity: the units of product produced per unit of input
  • Stability of the process
  • Variation: the extent to which instances of the process differ from each other
Objects of Measurement: Resources
• Resources are those objects that serve as input to the processes:
  • people, tools, materials, methods, time, money, training
• Internal attribute measures: cost, capability, constraints on use
• External attribute measures: performance, productivity