VARIABILITY AND REPRODUCIBILITY IN SOFTWARE ENGINEERING: A STUDY OF FOUR COMPANIES THAT DEVELOPED THE SAME SYSTEM Bente C.D. Anda, Dag I.K. Sjøberg, Audris Mockus
Reproducibility: closely related to Repeatability and Replicability • The closeness of agreement between independent results obtained with the same method on identical test material, but in different laboratories, with different apparatus, conditions and operators, and/or at different intervals of time.
Definitions • Repeatability is a measure of the variability of a single response within a single laboratory. • Reproducibility measures the variability of the same response across different laboratories. • Replication means that other researchers, in other settings and with different samples, attempt to reproduce the research as closely as possible. • Reliability is the stability of a set of observations generated by an indicator under a fixed set of conditions, regardless of who collects the observations or when or where they are collected.
Key Idea To investigate the extent to which key dimensions of software projects and products are reproduced.
Measuring Reproducibility • Along which dimensions should reproducibility be measured? • What is the best way to measure them? • What values indicate low, medium and high reproducibility? Coefficient of variation: cv = stddev/mean; Reproducibility = 1/cv
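A minimal sketch of how these two measures could be computed, assuming hypothetical per-company cost figures; the numbers below are illustrative only, not data from the study.

import statistics

def reproducibility(values):
    # Coefficient of variation: cv = stddev / mean.
    # Reproducibility is defined as 1 / cv, so lower variation
    # across providers means higher reproducibility.
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)  # sample standard deviation
    cv = stdev / mean
    return cv, 1 / cv

# Hypothetical costs (person-hours) for four companies
costs = [950, 1100, 1020, 980]
cv, rep = reproducibility(costs)
print(f"cv = {cv:.2f}, reproducibility = {rep:.1f}")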
Selection of projects • 81 Norwegian and international software consultancy companies were invited, selected on experience with requirements specification and Java programming and on a minimum company size • 35 companies provided bids • 4 companies were selected, each to develop the system independently
Project success criteria • Contractor-related costs • Actual lead time • Schedule overrun
Software Product Quality • Functionality • Reliability • Usability • Efficiency • Maintainability • Portability
Reproducibility is indicated • If a set of software providers produce similar project and product outcomes. • If a random set of providers produce the same results. • If dissimilar providers produce similar results. • If different types of providers produce different results that can be predicted from differences among the providers.
Contributions • Methodological: a measurement framework covering the entire cycle of a project, with multifaceted observations. • Scientific: a better understanding of reproducibility in software production, which should provide a basis for SE methods, tools and studies. • Practical: descriptions of detailed scenarios of what can be expected in a software contracting context, and of how to use software bids to anticipate project outcomes.
Validity Construct Validity • How accurately the values of the variables are measured • How different values are aggregated to measure the dimensions of software development • How well those dimensions represent the constructs they are supposed to operationalize
Validity Internal Validity • Variation in skills • Development tools External Validity • No previous business or personal relationships • External consultant • Higher reproducibility for usability
Related Work • 9 companies developed the same system to investigate three development platforms • COCOMO: a cost-estimation model • Analogy-based estimation: uses the costs of similar completed projects
Future work • Involve more companies and investigate different contexts of software production. • Study the maintenance phase as well, which would complete the picture of the entire life cycle. • Outsource participation in studies to professionals in countries with lower costs.
Conclusions • A deep and detailed understanding of the relationships is absent. • Most studies investigate only isolated aspects of software production, which does not capture how software development works in real practice. • Considerable effort was put into the conceptual definitions and operationalization of fundamental SE terms.