
An Approach to Measure Java Code Quality in Reuse Environment



Presentation Transcript


  1. An Approach to Measure Java Code Quality in Reuse Environment Aline Timóteo Advisor: Silvio Meira UFPE – Federal University of Pernambuco alt.timoteo@gmail.com

  2. Summary • Motivation • Background • Metrics • An Approach to Measure Java Code Quality • Main Contributions • Status

  3. Motivation

  4. Motivation • Reuse benefits • Productivity • Cost • Quality • Reuse is a competitive advantage! • Reuse environment [Frakes, 1994] • Process • Metrics • Tools • Repository • Search engine • Domain tools • …

  5. Problem • Component repositories promote reuse success [Griss, 1994] • Must artifact quality be assured by the organization that maintains the repository? [Seacord, 1999] • How can the reuse of low-quality artifacts be minimized?

  6. Background

  7. Metrics • “Software metrics is a method to quantify attributes in software processes, products and projects” [Daskalantonakis, 1992] • Metrics timeline • Age 1: before 1991, when the main focus was on metrics based on code complexity • Age 2: after 1992, when the main focus was on metrics based on Object Oriented (OO) concepts

  8. Metrics Timeline (figure): Age 1, Complexity; Age 2, Object Oriented

  9. Most Referenced Metrics • LOC • Cyclomatic Complexity [McCabe, 1976] • Chidamber and Kemerer Metrics [Chidamber, 1994] • Lorenz and Kidd Metrics [Lorenz, 1994] • MOOD Metrics [Brito, 1994]

  10. Problems related to Metrics [Ince, 1988 and Briand, 2002] • Metrics Validation • Theoretical Validation • Measurement goal • Experimental hypothesis • Environment or context • Empirical validation • Metrics Automation • Different set of metrics implemented • Bad documentation • Quality attributes x Metrics

  11. An Approach to Measure Java Code Quality

  12. An Approach to Measure Java Code Quality • Quality Attributes x Metrics • Metrics Selection and Specification • Quality Attributes measurement

  13. Quality in a Reuse Environment [Etzkorn, 2001] • ISO 9126

  14. Quality Attributes x Metrics

  15. Metrics Selection • Applicable for Java • Empirical Validation • Theoretical Validation • Acceptance

  16. Metrics Selection • McCabe metric [McCabe, 1976] • Theoretical validation, grounded in graph theory • Technology independence • Empirical validation • Acceptance [Refactorit, 2001; Metrics, 2005; JHawk, 2007]
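In essence, McCabe's measure is the number of linearly independent paths through a method's control-flow graph, which equals the number of decision points plus one. As a rough illustration (not the thesis tool), the sketch below approximates it lexically over whitespace-separated Java source; the class name and the decision-token list are assumptions:

```java
import java.util.List;

// A minimal sketch of McCabe's cyclomatic complexity: count decision
// points lexically and add 1 for the single linear path. Real tools
// build the control-flow graph from a parsed AST instead.
public class CyclomaticSketch {
    // Simplified set of decision-point tokens (an assumption of this sketch)
    private static final List<String> DECISIONS =
            List.of("if", "for", "while", "case", "catch", "&&", "||", "?");

    static int complexity(String source) {
        int count = 1; // one linear path through the method
        for (String token : source.split("[\\s(){};]+")) {
            if (DECISIONS.contains(token)) {
                count++; // each decision point adds one independent path
            }
        }
        return count;
    }

    public static void main(String[] args) {
        String method = "int abs(int x) { if (x < 0) return -x; return x; }";
        System.out.println(complexity(method)); // one 'if' -> CC = 2
    }
}
```

Because the count is purely lexical, the sketch works on any language with similar keywords, which mirrors the "technology independence" point above.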

  17. Metrics Selection • CK metrics [Chidamber, 1994] • Theoretical validation • Developed in an OO context • Empirical validation [Briand, 1994; Chidamber, 1998; Tang, 1999] • Acceptance [Refactorit, 2001; Metrics, 2005; JHawk, 2007]
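Two of the CK metrics are easy to illustrate. Assuming unit method weights, WMC degenerates to a method count, and DIT is the length of the superclass chain. The reflection-based sketch below shows both; note this is only an illustration, since the approach described here analyzes Java source code, not loaded classes:

```java
// Illustrative sketch of two CK metrics via reflection (an assumption
// of this sketch; the measurement approach itself works on source code).
public class CkSketch {
    // WMC with all method weights = 1 degenerates to a method count
    static int wmc(Class<?> c) {
        return c.getDeclaredMethods().length;
    }

    // DIT: depth of the inheritance tree, counting superclasses up to Object
    static int dit(Class<?> c) {
        int depth = 0;
        for (Class<?> s = c.getSuperclass(); s != null; s = s.getSuperclass()) {
            depth++;
        }
        return depth;
    }

    public static void main(String[] args) {
        System.out.println(wmc(String.class) > 0); // String declares methods
        System.out.println(dit(Integer.class));    // Integer -> Number -> Object: 2
    }
}
```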

  18. Metrics Specification

  19. Quality Attributes Measurement (QAM) • QAM = the number of metrics that have an allowable value • Heuristic: the attribute is acceptable when QAM >= (number of metrics) / 2 • Example: 2.5 <= QAM <= 5, so Max Testability = 5 and Min Testability = 2.5
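The heuristic above can be sketched as a count of metrics whose measured values fall within their allowable limits; the metric names and thresholds below are illustrative assumptions, not values from the thesis:

```java
import java.util.Map;

// Sketch of the QAM heuristic: QAM counts how many of a class's metric
// values fall inside their allowable thresholds; the quality attribute is
// acceptable when QAM >= (number of metrics) / 2.
public class QamSketch {
    static double qam(Map<String, Double> values, Map<String, Double> maxAllowed) {
        double score = 0.0;
        for (Map.Entry<String, Double> e : values.entrySet()) {
            Double limit = maxAllowed.get(e.getKey());
            if (limit != null && e.getValue() <= limit) {
                score += 1.0; // this metric is within its allowable value
            }
        }
        return score;
    }

    static boolean acceptable(double qam, int metricCount) {
        return qam >= metricCount / 2.0; // heuristic from the approach
    }

    public static void main(String[] args) {
        // Illustrative thresholds for testability (assumed, not from the thesis)
        Map<String, Double> limits = Map.of("WMC", 20.0, "RFC", 50.0,
                "CBO", 5.0, "CC", 10.0, "LCOM", 1.0);
        Map<String, Double> measured = Map.of("WMC", 12.0, "RFC", 30.0,
                "CBO", 7.0, "CC", 4.0, "LCOM", 0.5);
        double q = qam(measured, limits);
        // CBO exceeds its limit, so QAM = 4 of 5; 4 >= 2.5 -> acceptable
        System.out.println(q + " acceptable=" + acceptable(q, limits.size()));
    }
}
```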

  20. Approach Automation

  21. Approach Automation
  Analyzability QAM = 3.0 (RFC: 2.0, WMC: 1.0, CC: 0.0)
  Changeability QAM = 3.0 (CBO: 3.0, RFC: 2.0, WMC: 1.0)
  public class Client implements Runnable, CommandListener {
      /**
       * Start the client thread
       */
      public void start() {
          Thread t = new Thread(this);
          t.start();
      }
  }

  22. Experiment

  23. Experiment • Main question: is the quality of the retrieved components better? • Compare B.A.R.T. search results • Results before introducing the filter • Results after introducing the filter • Apply a questionnaire to customers

  24. Main Contributions • Introduce quality analysis in a repository • Reduce the propagation of code problems • Higher reliability

  25. Current Stage

  26. References
  • [Frakes, 1994] W. B. Frakes and S. Isoda, "Success Factors of Systematic Software Reuse," IEEE Software, vol. 11, pp. 14-19, 1994.
  • [Griss, 1994] M. L. Griss, "Software Reuse Experience at Hewlett-Packard," 16th International Conference on Software Engineering (ICSE), Sorrento, Italy, 1994.
  • [Garcia, 2006] V. C. Garcia, D. Lucrédio, F. A. Durão, E. C. R. Santos, E. S. Almeida, R. P. M. Fortes, and S. R. L. Meira, "From Specification to Experimentation: A Software Component Search Engine Architecture," 9th International Symposium on Component-Based Software Engineering (CBSE 2006), Mälardalen University, Västerås, Sweden, 2006.
  • [Etzkorn, 2001] L. H. Etzkorn, W. E. Hughes Jr., C. G. Davis, "Automated Reusability Quality Analysis of OO Legacy Software," Information & Software Technology, vol. 43, no. 5, pp. 295-308, 2001.
  • [Daskalantonakis, 1992] M. K. Daskalantonakis, "A Practical View of Software Measurement and Implementation Experiences Within Motorola," IEEE Transactions on Software Engineering, vol. 18, pp. 998-1010, 1992.
  • [McCabe, 1976] T. J. McCabe, "A Complexity Measure," IEEE Transactions on Software Engineering, vol. SE-2, pp. 308-320, 1976.
  • [Chidamber, 1994] S. R. Chidamber, C. F. Kemerer, "A Metrics Suite for Object Oriented Design," IEEE Transactions on Software Engineering, vol. 20, pp. 476-493, 1994.
  • [Lorenz, 1994] M. Lorenz, J. Kidd, "Object-Oriented Software Metrics: A Practical Guide," Englewood Cliffs, New Jersey, 1994.
  • [Brito, 1994] A. F. Brito, R. Carapuça, "Object-Oriented Software Engineering: Measuring and Controlling the Development Process," 4th International Conference on Software Quality, USA, 1994.
  • [Ince, 1988] D. C. Ince, M. J. Sheppard, "System Design Metrics: A Review and Perspective," Second IEE/BCS Conference, Liverpool, UK, pp. 23-27, 1988.
  • [Briand, 2002] L. C. Briand, S. Morasca, V. R. Basili, "An Operational Process for Goal-Driven Definition of Measures," IEEE Transactions on Software Engineering, vol. 28, pp. 1106-1125, 2002.
  • [Morasca, 1989] S. Morasca, L. C. Briand, V. R. Basili, E. J. Weyuker, M. V. Zelkowitz, B. Kitchenham, S. Lawrence Pfleeger, N. Fenton, "Towards a Framework for Software Measurement Validation," IEEE Transactions on Software Engineering, vol. 23, pp. 187-189, 1995.
  • [Seacord, 1999] R. C. Seacord, "Software Engineering Component Repositories," Technical Report, Software Engineering Institute (SEI), 1999.

  27. • [Refactorit, 2001] RefactorIt tool, online, last update: 01/2008, available: http://www.aqris.com/display/ap/RefactorIt
  • [JDepend, 2005] JDepend tool, online, last update: 03/2006, available: http://www.clarkware.com/software/JDepend.html
  • [Metrics, 2005] Metrics Eclipse plugin, online, last update: 07/2005, available: http://sourceforge.net/projects/metrics
  • [JHawk, 2007] JHawk Eclipse plugin, online, last update: 03/2007, available: http://www.virtualmachinery.com/jhawkprod.htm

  28. Aline Timóteo UFPE – Federal University of Pernambuco alt.timoteo@gmail.com
