Survey of Tools to Support Safe Adaptation with Validation. Alain Esteva-Ramirez School of Computing and Information Sciences Florida International University. Bárbara Morales-Quiñones Department of Computer Engineering University of Puerto Rico-Mayaguez. REU Summer Program. 06/26/2007.
Outline • Introduction • Testing Autonomic Systems • Motivation • Background • Safe Adaptation by Zhang et al. • Tool Classification & Criteria • Survey of Tools • Selection of Tools • Questions?
Introduction: Testing Autonomic Systems (1) • Autonomic Computing • Automates low-level tasks/actions • Behavior is specified as high-level policies • Provides self-management features • Testing Autonomic Systems • Requires testing prior to initial deployment • Requires runtime testing, since structure and behavior can change at runtime • The pioneers of autonomic computing identified validation as one of the grand challenges of autonomic computing.
Introduction: Testing Autonomic Systems (2) • Two approaches developed by King et al. [1] • Replication with Validation • Only feasible when managed resources can be replicated • Requires the system to create and/or maintain copies of the managed resource for validation purposes • Changes are implemented and validated on the copies • Safe Adaptation with Validation • Validates changes resulting from self-management as part of a safe adaptation process • Can be used when duplicating managed resources is too expensive, impractical, or impossible • Occurs directly on the managed resource, during execution. [1] Towards Self-Testing in Autonomic Computing Systems
Motivation • This survey represents preliminary work for Testing Autonomic Computing Systems During Safe Adaptation [3]. • Motivation stems from the need to test autonomic computing systems at runtime (i.e., to avoid the high cost of system failures). • Since the strategy is based on safe adaptation, an investigation of tools can be useful for building dependable adaptive (and autonomic) systems. • Many new tools/plugins have emerged from integrated development platforms and open-source communities.
Background Safe Adaptation by Zhang et al. (1) • Safe adaptation • Developed by Zhang et al. (WADS 2004) • Directed towards using a disciplined approach to building adaptive systems. • An adaptation is safe if and only if: • It does not violate the dependencies between components • It does not interrupt any critical communications that could result in erroneous conditions [2] Enabling Safe Dynamic Component-Based Software Adaptation
Background Safe Adaptation by Zhang et al. (2) Source: J. Zhang, B. H. C. Cheng, Z. Yang, and P. K. McKinley. Enabling safe dynamic component-based software adaptation. In WADS, pages 194–211, 2004.
Tool Classification • Dependency Analysis Tools • Partially automate the safe adaptation process • Extract dependency relationships among and between components • Metrics Tools • Measure efficiency based on certain performance metrics (memory, response time); aid validation of self-optimization features • Complexity metrics – automatically generate costs of adaptation steps • Unit Testing Tools • Support or enhance previous unit testing work (REU 2006). [2] Enabling Safe Dynamic Component-Based Software Adaptation
Tool Selection Criteria (1) • Dependency Analysis • Exportable dependencies – we need a way to access the generated information for later analysis • Graphical visualization – helps us visualize these dependencies • Dependency cycle detection – cycles must be known to accurately manage the impact of changes
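The cycle-detection criterion above can be sketched in a few lines. This is a hypothetical, self-contained example (the class and method names are my own, not any surveyed tool's API): a depth-first search over a package-dependency graph that reports a cycle when it revisits a node on the current path.

```java
import java.util.*;

// Hypothetical sketch of dependency cycle detection: DFS over a
// package-dependency graph, tracking the nodes on the current path.
public class DependencyCycleDetector {

    // Returns true if the dependency graph contains a cycle.
    public static boolean hasCycle(Map<String, List<String>> deps) {
        Set<String> done = new HashSet<>();   // fully explored nodes
        Set<String> path = new HashSet<>();   // nodes on the current DFS path
        for (String node : deps.keySet()) {
            if (dfs(node, deps, done, path)) return true;
        }
        return false;
    }

    private static boolean dfs(String node, Map<String, List<String>> deps,
                               Set<String> done, Set<String> path) {
        if (path.contains(node)) return true;   // back edge -> cycle found
        if (done.contains(node)) return false;  // already explored, no cycle here
        path.add(node);
        for (String dep : deps.getOrDefault(node, Collections.emptyList())) {
            if (dfs(dep, deps, done, path)) return true;
        }
        path.remove(node);
        done.add(node);
        return false;
    }
}
```

A tool meeting the criterion would run this kind of check over its extracted dependency data rather than a hand-built map.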
Tool Selection Criteria (2) • Performance Analysis • Load – mechanisms to check an object's memory usage • Speed – mechanisms to evaluate the response time of method calls • Complexity – code-complexity metrics such as cyclomatic complexity and coupling between objects
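The "load" and "speed" checks above amount to sampling heap usage and timing a call. A minimal sketch, assuming only the standard library (the `PerfProbe` name is hypothetical, not a surveyed tool's API):

```java
// Hypothetical sketch of the load/speed criteria: heap usage via
// Runtime, response time of a call via System.nanoTime.
public class PerfProbe {

    // Approximate heap currently in use, in bytes.
    public static long usedMemoryBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    // Wall-clock time of a single call, in milliseconds.
    public static double timeMillis(Runnable call) {
        long start = System.nanoTime();
        call.run();
        return (System.nanoTime() - start) / 1_000_000.0;
    }
}
```

Profilers such as TPTP gather the same kinds of numbers without requiring the code under test to be instrumented by hand.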
Tool Selection Criteria (3) • Unit Testing Support • Java unit testing – support or enhance unit testing • Code coverage – support dynamic analysis of code for unit test coverage (branch and line coverage)
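The line-coverage figure the criterion asks for is just executed lines over executable lines. A hypothetical sketch (the `LineCoverage` class is illustrative, not how any surveyed tool computes its reports internally):

```java
import java.util.Set;

// Hypothetical sketch of a line-coverage percentage:
// coverage = executed executable lines / all executable lines.
public class LineCoverage {
    public static double percent(Set<Integer> executable, Set<Integer> executed) {
        if (executable.isEmpty()) return 100.0;
        long hit = executed.stream().filter(executable::contains).count();
        return 100.0 * hit / executable.size();
    }
}
```

Branch coverage is computed the same way over taken/untaken branch outcomes instead of lines.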
Survey of Tools Dependency Analysis Tools • JDepend • Eclipse plugin that exports generated design metrics • Dependency Finder • Standalone tool • Filters dependencies by packages, classes, or features • Code Analysis Plugin (CAP) • Presents information through diagrams that help visualize the dependencies.
Survey of Tools Performance Metrics Tools • Test and Performance Tools Platform (TPTP) • Provides a framework in which developers build test and performance tools that integrate with Eclipse • Finds problems faster and with less difficulty • Finds performance bottlenecks and other metrics easily • Addresses the entire test and performance cycle, including test editing and execution, monitoring, and profiling, among other capabilities.
Survey of Tools Unit Test Support Tools • JUnitPerf • JUnit test decorators used to measure the efficiency of JUnit tests • Timed tests – set an upper bound on elapsed time • Load tests – create an artificial load • Cobertura • Provides XML coverage reports from the system level down to an individual line of code • Provides complexity metrics • Provides a maxmemory attribute that helps restrict the amount of memory used.
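The timed-test idea above can be illustrated with the decorator pattern JUnitPerf is built on. This is not JUnitPerf's actual API, only a self-contained sketch: wrap an existing test and fail it if it exceeds an elapsed-time bound.

```java
// Hypothetical sketch of a timed-test decorator (the decorator idea
// behind JUnitPerf, not its real classes): run the wrapped test and
// fail if it exceeds the elapsed-time bound.
public class TimedTestDecorator implements Runnable {
    private final Runnable test;   // the wrapped unit test
    private final long maxMillis;  // upper bound on elapsed time

    public TimedTestDecorator(Runnable test, long maxMillis) {
        this.test = test;
        this.maxMillis = maxMillis;
    }

    @Override
    public void run() {
        long start = System.currentTimeMillis();
        test.run();                // propagate the test's own failures
        long elapsed = System.currentTimeMillis() - start;
        if (elapsed > maxMillis) {
            throw new AssertionError(
                "took " + elapsed + " ms, limit " + maxMillis + " ms");
        }
    }
}
```

A load-test decorator follows the same pattern, running several copies of the wrapped test concurrently instead of bounding its time.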
Tool Selection • Dependency Analysis • JDepend – exports an XML dependency report • CAP – visualizes dependency graphs • Performance Analysis • TPTP – addresses the entire test and performance life cycle • Unit Testing • JUnitPerf – measures the efficiency of JUnit tests • Cobertura – exports an XML coverage report
References [1] T. M. King et al. Towards Self-Testing in Autonomic Computing Systems. [2] J. Zhang, B. H. C. Cheng, Z. Yang, and P. K. McKinley. Enabling Safe Dynamic Component-Based Software Adaptation. In WADS, pages 194–211, 2004. [3] Survey of Tools to Support Testing Autonomic Computing Systems During Safe Adaptation
Questions? Questions, comments and queries.