Towards an Automatic Approach for Quality Improvement in Object-Oriented Design
Dr. Radu Marinescu
Problem Statement
• Numerous large-scale OO systems with signs of "rotting design"
  • monolithic, inflexible, fragile, etc.
• Discarding these systems is not an option!
  • high business value and large scale
  • reuse and maintenance necessary
• Design Flaws – an invariant issue in software
  • time pressure during development
  • permanent change of requirements
How to correlate external signs of poor quality with the occurrence of concrete structural design flaws?
Research Goal
Develop methods, techniques and tools that provide a proper mapping between external quality attributes and the internal, structural characteristics of object-oriented design.
The central focus is to support the process of quality assessment and improvement for existing object-oriented systems.
Design Flaws ...
• Exclusive focus on structural flaws
• Design fragments
  • i.e. methods, classes, modules (subsystems)
• Need criteria for high-quality design
  • design rules, principles, heuristics, etc.
  • also negative rules (e.g. "bad smells")
… are structural characteristics of design fragments that express deviations from a given set of criteria typifying a high-quality design.
Problem Detection
• The process of identifying the parts of a software system affected by a particular design flaw
• It's not easy!
  • manual and empirical
  • time-expensive and non-scalable
  • hard to quantify design rules and principles...
• "Measuring" the Design
  • map source-code entities to numerical values
  • used as quality indicators
Idea: use metrics to detect design problems!
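The "measuring" idea above, mapping source-code entities to numerical values, can be sketched in a few lines. This is a minimal illustration (not the tooling from the slides, which targets Java and C++): it uses Python's own `ast` module to map each class to one simple size metric, its number of methods; the sample class `Order` is invented for the example.

```python
import ast

# A tiny invented code sample to measure.
SOURCE = """
class Order:
    def total(self): ...
    def add_item(self, item): ...
    def remove_item(self, item): ...
"""

def methods_per_class(source: str) -> dict[str, int]:
    """Map each class (entity) to a metric value: its number of methods."""
    tree = ast.parse(source)
    return {
        node.name: sum(isinstance(n, ast.FunctionDef) for n in node.body)
        for node in ast.walk(tree)
        if isinstance(node, ast.ClassDef)
    }

print(methods_per_class(SOURCE))  # entity-value pairs: {'Order': 3}
```

The resulting entity-value pairs are exactly the raw material the slides call quality indicators; on their own, though, such numbers are only symptoms, which is the gap the next slides address.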
Problems with Software Measurement
• Definitions of Metrics
  • mapping attributes of a software product to numeric values [Fent97]
  • imprecise, confusing, or conflicting definitions
• Interpretation Models
  • the interpretation level is too fine-grained to lead to design decisions
  • metric values are like symptoms: they indicate an abnormality, but can't indicate the cause
  • this reduces the relevance of measurement results
There is a large gap between what we do measure and what we should measure!
Define a mechanism for higher-level interpretation of metrics!
Detection Strategy
• A generic means for defining metrics-based design rules
  • use metrics together!
• Based on mechanisms of filtering and composition
The measurable expression of a rule, by which design fragments that conform to the rule can be identified in the source code.
Anatomy of a Detection Strategy
• Metrics
  • measure internal characteristics
  • yield entity-value pairs
• Filtering Mechanism
  • statistical functions that return a subset of a data set
  • semantical filters (e.g. HigherThan or BottomValues)
  • statistical filters (e.g. BoxPlots)
• Composition Operators
  • articulate the composition of a detection rule, i.e. compose metrics in a rule
  • three operators used: and, or, butnotin
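These three ingredients can be combined into a concrete strategy. Below is a minimal Python sketch of a God Class detection strategy built from the metrics the deck later names (WMC, TCC, AOFD/ATFD), semantical filters, and the `and` composition operator realized as set intersection. The class names, metric values, and thresholds are invented for illustration; real thresholds would come from the tuning work mentioned under Perspectives.

```python
from dataclasses import dataclass

@dataclass
class ClassMetrics:
    name: str
    wmc: int    # Weighted Method Count
    atfd: int   # Access To Foreign Data
    tcc: float  # Tight Class Cohesion (0..1)

# Semantical filters: keep the entities whose metric passes a threshold.
def higher_than(entities, metric, threshold):
    return {e.name for e in entities if getattr(e, metric) > threshold}

def lower_than(entities, metric, threshold):
    return {e.name for e in entities if getattr(e, metric) < threshold}

def god_class(entities):
    """A rule composed with 'and' (set intersection); thresholds are illustrative."""
    return (higher_than(entities, "atfd", 4)
            & higher_than(entities, "wmc", 46)
            & lower_than(entities, "tcc", 1 / 3))

classes = [
    ClassMetrics("ReportManager", wmc=70, atfd=12, tcc=0.1),  # suspect
    ClassMetrics("Money", wmc=8, atfd=0, tcc=0.9),            # clean
]
print(god_class(classes))  # {'ReportManager'}
```

The `or` and `butnotin` operators map onto set union and set difference in the same way, which is what makes the filter/composition design easy to mechanize.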
Quantified Design Flaws
• Around 20 design flaws quantified as detection strategies (DS)
• Different abstraction levels
  • from method-level flaws to the omission of design patterns
• Different literature sources
• Relevant design flaws
Process of Design Inspection (tool pipeline):
Sources (Java, C++) → parsing → Meta-Model → used by Metrics (1..n) of a Detection Strategy (*.sod) → executed with PRODEOOS (using definitions of statistical outliers) → List of Candidates (1..m) → manual inspection
The Unified Meta-Model
• Design information needed in the detection strategies
  • i.e. used by the metrics
• Declarations
  • classes, variables and methods
• Inheritance relations
• Cross-referencing information
  • variable accesses
  • method invocations
• Contains information about packages
  • for C++: the directory structure
  • necessary for the strategies at the subsystem level
• For Java and C++
  • TableGen and MeMoJ-Tables
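The information listed above (declarations, inheritance, accesses, invocations, packages) suggests a simple entity model. The following is a hypothetical Python sketch of such a meta-model's shape, not the actual TableGen/MeMoJ-Tables representation; all field and class names are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Method:
    name: str
    invokes: list[str] = field(default_factory=list)   # method invocations
    accesses: list[str] = field(default_factory=list)  # variable accesses

@dataclass
class Clazz:
    name: str
    package: str                                        # for C++: from the directory structure
    superclasses: list[str] = field(default_factory=list)  # inheritance relations
    variables: list[str] = field(default_factory=list)     # declared attributes
    methods: list[Method] = field(default_factory=list)    # declared methods

# A tiny invented instance: one class with one method accessing one attribute.
order = Clazz("Order", package="shop",
              variables=["items"],
              methods=[Method("total", accesses=["items"])])
print(order.package, order.methods[0].accesses)
```

Metrics then become plain queries over such records (e.g. cohesion from the `accesses` lists), which is why one unified model can serve both Java and C++ front ends.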
What did we gain so far …
• Detection Strategy
  • a proper mechanism for higher-level interpretation of measurements
  • a methodology for quantifying design-related rules
• Quantified violations of design principles, rules and heuristics
  • around 20 strategies for design flaws
  • from several literature sources
  • different abstraction levels: from method to subsystem level
… and what is left?
• Build a bridge between quality attributes and design problems
• Prove that the approach is scalable and accurate
A Classical Quality Model
• Design principles and rules are implicit → hard to construct
• Metrics are too fine-grained → hard to interpret
Factor-Strategy Quality Model
[Diagram: quality factors "communicate with" detection strategies, which are built from metrics: Data Classes (NOPA, NOAM), God Classes (TCC, WMC, AOFD), Int.Seg. Principle (CIW, COC), Lack of Bridge (AUF, NOD, LR)]
• Quality decomposed in Factors
• Principles, rules and heuristics quantified in Detection Strategies
The Case-Study
• A re-engineered case-study
  • two versions of an industrial business application
  • second version re-engineered for maintainability purposes
• Relevance of the case-study
  • size (~100 KLOC) → proper to evaluate the scalability of the approach
  • before-after reengineering scenario → evaluate the accuracy of the DS
  • clear reengineering goal → evaluate the relevance of FS models
• Size characteristics
Automatic Evaluation Method
• Assumption 1: all major design problems in SV1 were eliminated during the reengineering process → not present in SV2
• Assumption 2: the maintainability level in SV2 is higher (better) than in SV1
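Assumption 1 directly yields an automatic accuracy measure: a suspect flagged in SV1 that no longer appears in SV2 counts as a correct detection, while one that persists counts against the strategy. The sketch below is one simplified reading of that idea (the slides do not give the exact formula); the suspect names are invented.

```python
def accuracy_rate(suspects_v1: set[str], suspects_v2: set[str]) -> float:
    """Fraction of SV1 suspects that disappeared after reengineering.

    Under Assumption 1, a flaw flagged in SV1 but absent from SV2 is
    treated as a confirmed detection; one still reported in SV2 is
    treated as a likely false positive. (Illustrative simplification.)
    """
    if not suspects_v1:
        return 0.0
    confirmed = suspects_v1 - suspects_v2
    return len(confirmed) / len(suspects_v1)

# Four suspects in SV1; one of them ('D') survives into SV2.
print(accuracy_rate({"A", "B", "C", "D"}, {"D", "E"}))  # 0.75
```

Applied per strategy, such a ratio is the kind of number the next slide reports (accuracy rates between 50% and 81%).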
Automatic Evaluation of Strategies — Results
• Accuracy rate: between 50% and 81%
• Average accuracy rate: 64.5%
• Average number of suspects: ~15% of entities reported as suspect
Evaluation of the FS Quality Model
• Defined a Factor-Strategy quality model for maintainability
• Evaluation mechanisms
  • Score = number of suspects
  • Qualifier = "grade" on a 1-10 scale, based on the Score
• Compared results between the two versions
  • based on Assumption 2
  • expected improvement in SV2
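The Score-to-Qualifier step can be sketched as follows. The slides only say the grade is on a 1-10 scale and based on the Score, so the linear scheme below (fewer suspects relative to system size means a better grade) is a hypothetical mapping, not the one actually used.

```python
def qualifier(score: int, total_entities: int) -> int:
    """Map a strategy's Score (number of suspects) to a 1-10 grade.

    Illustrative linear scheme: the grade falls as the fraction of
    suspect entities rises, and never drops below 1.
    """
    if total_entities == 0:
        return 10  # nothing to flag
    ratio = min(score / total_entities, 1.0)
    return max(1, round(10 * (1 - ratio)))

# Invented numbers: 0, 40, and 200 suspects out of 200 entities.
print(qualifier(0, 200), qualifier(40, 200), qualifier(200, 200))  # 10 8 1
```

Comparing such grades between SV1 and SV2 is what operationalizes Assumption 2: a reengineered-for-maintainability system should score visibly better.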
Why is the FS Model Better?
• See the problems, not the numbers!
  • problems expressed in terms of the design rules that are violated
  • interpretation is much easier
• Easier to improve the quality of the design
  • understand the causes of problems
  • easier to correlate with "correction strategies"
Summary
• A mechanism for quantifying design-related rules
  • the "Detection Strategy" concept
• A suite of detection strategies for finding design flaws
  • quantification of well-known design flaws and "smells"
• A novel concept of quality model
  • based on detection strategies
  • the bridge between qualitative and quantitative statements
• Strong tool support for the entire approach
  • high degree of automation and scalability
• Concrete evaluation of the concepts, methods and techniques
Perspectives
• Refinement
  • the issue of threshold values [Diploma Thesis in progress]: define a tuning machine
  • unify the means of expression [Master Thesis in progress]: the SAIL language
• Applicability
  • bridge the gap between automated problem detection and problem correction [PhD in progress]
  • integrate the techniques in the development process
• Migration
  • adapt the approach to emerging technologies (e.g. EJB)
  • … and programming paradigms (e.g. AOP)