
Dn-based Design Quality Comparison of Industrial Java Applications

Alexander Serebrenik, Serguei Roubtsov, and Mark van den Brand


Presentation Transcript


  1. Alexander Serebrenik, Serguei Roubtsov, and Mark van den Brand Dn-based Design Quality Comparison of Industrial Java Applications

  2. What makes a good software architecture? Among other things… it should be easy to build additions and make changes. Why? Maintenance typically accounts for 75% or more of the total software workload, and software evolves: an architecture that is not flexible enough causes early system decay, as significant changes become costly.

  3. Goals of good architectural design • Flexible design: more abstract classes and fewer dependencies between packages • Make the software easier to change where we want to change it; minimize the changes we have to make

  4. Abstractness/stability balance • Stable packages: do not depend upon outside classes, have many dependents, and should be extensible via inheritance (abstract) • Unstable packages: depend upon many outside classes, have no dependents, and should not be extensible via inheritance (concrete) • Stability is related to the amount of work required to make a change [Martin, 2000]
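
To make the balance concrete, here is a minimal Java sketch; the package and class names are invented for illustration and are not taken from the talk. A stable package exposes only an abstraction, while an unstable package holds the concrete implementation.

// File: reporting/core/ReportRenderer.java
// A stable package: many clients depend on it and it depends on nothing,
// so changes here are expensive -- it should stay abstract.
package reporting.core;

public interface ReportRenderer {
    String render(String reportData);
}

// File: reporting/pdf/PdfRenderer.java
// An unstable package: it depends on reporting.core and nothing depends on it,
// so it is cheap to change and may safely stay concrete.
package reporting.pdf;

import reporting.core.ReportRenderer;

public class PdfRenderer implements ReportRenderer {
    @Override
    public String render(String reportData) {
        return "%PDF-1.4 " + reportData; // placeholder rendering logic
    }
}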

  5. What does balance mean? A good real-life package must be unstable enough to be easily modified, yet generic enough to adapt to evolving requirements with no or only minimal modifications. Hence: contradictory criteria.

  6. How to measure Instability? Ca – afferent coupling measures incoming dependencies Ce – efferent coupling measures outgoing dependencies Instability = Ce/(Ce+Ca)
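
As a minimal sketch (not code from the paper), the instability of a package can be computed directly from its coupling counts; the class and method names below are invented for illustration.

// InstabilityMetric.java -- illustrative helper, assuming Ca and Ce have
// already been counted for a package by some dependency analyser.
public final class InstabilityMetric {

    /** Instability = Ce / (Ce + Ca): 0 = maximally stable, 1 = maximally unstable. */
    public static double instability(int ca, int ce) {
        if (ca + ce == 0) {
            return 0.0; // isolated package with no couplings; a convention, not part of the formula
        }
        return (double) ce / (ce + ca);
    }

    public static void main(String[] args) {
        // Hypothetical package with 1 incoming and 3 outgoing dependencies.
        System.out.println(instability(1, 3)); // prints 0.75
    }
}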

  7. Dn – Distance from the main sequence • Abstractness = #AbstrClasses / #Classes • Instability = Ce / (Ce + Ca) • Dn = | Abstractness + Instability – 1 | [R. Martin, 1994] [Chart: Abstractness vs. Instability plane; the main sequence runs from (0, 1) to (1, 0), with the zone of pain near (0, 0) and the zone of uselessness near (1, 1)]
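
A worked example with made-up numbers (not from the slides) shows how the two metrics combine into Dn: take a package with 10 classes, 2 of them abstract, Ce = 3 and Ca = 1.

\[
A = \frac{2}{10} = 0.2, \qquad
I = \frac{C_e}{C_e + C_a} = \frac{3}{3 + 1} = 0.75,
\]
\[
D_n = \lvert A + I - 1 \rvert = \lvert 0.2 + 0.75 - 1 \rvert = 0.05,
\]

so this hypothetical package lies close to the main sequence.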

  8. Maintainability of software architecture in numbers: Dn?

  9. Instability: what does “depend” mean? Instability = Ce/(Ce+Ca), but for the same example Ce comes out as 4, 3, or 1, depending on the definition used ([Martin 1994], [JDepend], [Martin 2000]). Still: what about the entire architecture?

  10. Two flavors of architecture assessment • Averages: benchmarking for Java OSS • Distributions: expectation for threshold-exceeding values • Applied to industrial practice

  11. Benchmarks? • 21 Java OSS • Different domains • EJB frameworks, entertainment, web-app development tools, machine learning, code analysis, … • Different ages (2001 - 2008) • Size: ≥ 30 original packages • Development status: focus on Stable/Mature • Also include alpha, beta and inactive

  12. Average Dn • But… the average is not all! [Chart, scale 0.00–1.00: benchmark averages lie within [μ − 2σ; μ + 2σ], roughly 0.15–0.25; the Dresden OCL Toolkit scores 0.32, which exceeds μ + 4σ]

  13. How are the Dn-values distributed? Exponential distribution?

  14. Exponential distribution? • Exponential distribution: support [0; ∞) • Dn, however, has support [0; 1] rather than [0; ∞): hence, we normalize • And use maximum-likelihood fitting to find λ
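
The formulas on this slide did not survive the transcript; a plausible reconstruction, assuming the standard truncation of the exponential density to [0; 1], is:

\[
g(x) = \frac{\lambda e^{-\lambda x}}{1 - e^{-\lambda}}, \qquad x \in [0, 1],
\]

so that g integrates to 1 on [0, 1]. The maximum-likelihood estimate of λ for observed Dn values x_1, …, x_m then maximizes

\[
\ell(\lambda) = \sum_{i=1}^{m} \log g(x_i)
             = m \log \frac{\lambda}{1 - e^{-\lambda}} - \lambda \sum_{i=1}^{m} x_i .
\]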

  15. Benchmarking λ • Why is λ interesting? • Higher λ means sharper peaks, thinner tails, smaller averages • As λ increases, Dn values decrease
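
As a reminder of why a larger λ has these effects (standard facts about the exponential distribution, not taken from the slides):

\[
X \sim \mathrm{Exp}(\lambda): \quad \mathbb{E}[X] = \frac{1}{\lambda}, \qquad
\mathrm{Var}(X) = \frac{1}{\lambda^{2}},
\]

and for the version truncated to [0, 1] as above,

\[
\mathbb{E}[X] = \frac{1}{\lambda} - \frac{1}{e^{\lambda} - 1},
\]

both decreasing in λ, so a higher fitted λ corresponds to Dn values concentrated near 0.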

  16. Estimate excessively high values! • How many packages exceed a threshold z? Estimate P(Dn ≥ z) [Chart: fitted Dn distribution with the tail beyond threshold z marked]
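
Under the truncated-exponential model assumed above, the threshold-exceedance probability has a closed form (a derivation added here for clarity, not an equation shown on the slide):

\[
P(D_n \ge z) = \int_{z}^{1} g(x)\,dx
             = \frac{e^{-\lambda z} - e^{-\lambda}}{1 - e^{-\lambda}},
\qquad 0 \le z \le 1,
\]

and multiplying by the number of packages gives the expected number of threshold-exceeding packages.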

  17. Dn ≥ 0.6 • Dresden OCL Toolkit: 23.7% of packages have Dn ≥ 0.6 [Chart: fitted distribution with P(Dn ≥ 0.6) marked]

  18. Dresden OCL Toolkit: Why? • Started in 1998. • BUT: • We are looking at the Eclipse version! • Version 1.0 – June 2008 • Version 1.1 – December 2008 • Has yet to mature…

  19. Can we compare proprietary systems using Dn? Case study: • System A and System B support loan and lease approval business processes • Both systems employ a three-tier enterprise architecture: • System A uses the IBM Websphere application server • System B uses a custom-made business-logic layer implemented on the Tomcat web server • System A: 249 non-third-party packages • System B: 284 non-third-party packages

  20. Average Dn [Chart, scale 0.00–1.00: System A scores 0.186; System B scores 0.337, which exceeds μ + 4σ of the benchmarks]

  21. What about distributions? [Chart: percentage of packages beyond a threshold vs. the Dn threshold value, for an average OSS, System A, and System B]

  22. Independent assessments • The dependencies between packages must not form cycles [Martin, 2000] [Diagram: cyclic dependency between packages A, B, and C] • Cyclically dependent packages A, B, and C have to be released and maintained together • JDepend reports the number of cyclic dependencies: System A – 1, System B – 23
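
To illustrate the kind of cycle such a tool flags, here is a minimal invented example; the packages orders and billing and their classes are hypothetical and not taken from either system.

// File: orders/OrderService.java
package orders;

import billing.Invoice;                // orders -> billing

public class OrderService {
    public Invoice checkout(String orderId) {
        return new Invoice(orderId);
    }
}

// File: billing/Invoice.java
package billing;

import orders.OrderService;            // billing -> orders: closes the cycle

public class Invoice {
    private final String orderId;

    public Invoice(String orderId) {
        this.orderId = orderId;
    }

    public void refundVia(OrderService service) {
        // referring back to orders from billing means the two packages
        // can only be released and maintained together
    }
}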

  23. Layering [Diagrams: layered package structure of System B and System A, with upcoming dependencies marked]

  24. Chidamber and Kemerer OO metrics (* the lower, the better) [Chart:] System A (white bars) has a higher percentage of low-WMC packages than System B (blue bars); the same holds for LCOM.
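
For readers unfamiliar with these metrics: WMC (Weighted Methods per Class), in its simplest weighting, counts a class's methods, and LCOM (Lack of Cohesion Of Methods) measures how little the methods share the class's fields. A small invented Java example (not from either system) shows the symptom a high LCOM value flags.

// CustomerAccount.java -- invented example; the two method groups touch
// disjoint sets of fields, which is exactly what a high LCOM value signals.
public class CustomerAccount {
    // group 1: contact data
    private String email;
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }

    // group 2: balance handling; shares no fields with group 1
    private long balanceCents;
    public void deposit(long cents) { balanceCents += cents; }
    public long getBalanceCents() { return balanceCents; }
}
// Splitting the class into, say, ContactInfo and Balance would lower LCOM and
// keep the per-class method count (and hence WMC) small.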

  25. Conclusions • Java OSS benchmarks for average Dn • g(x) – a statistical model of the Dn distribution • Expectation for threshold-exceeding values • Applicable to other metrics as well! • Practical feasibility of Dn-based assessment of industrial applications
