
SENG 521 Software Reliability & Testing


  1. Cleanroom Software Development (Part 12) SENG 521 Software Reliability & Testing

  2. Background • Chaos Report [Standish 1995] • Based on data representing 8,380 SE projects: only 16.2% of projects met the delivery date and the budget with all of the specified features and functions; 31% of projects were cancelled before they were completed; and 52.7% were delivered over budget, over schedule, or with fewer features and functions than specified. • Software Productivity Research [Chapman 2000] • 60% of the United States' software workforce is dedicated to fixing software errors that could have been avoided. In addition, only 47 days in a calendar year are dedicated to development or enhancement of software applications; the rest is spent mainly on fixing bugs.

  3. Cleanroom SE • Cleanroom software engineering (CSE) is an engineering process for the development of high-quality software with certified reliability; the emphasis is on design with no defects and on testing based on software reliability engineering concepts. • CSE focuses on defect prevention instead of defect correction, and on certification of reliability for the intended environment of use. • CSE yields software that is correct by mathematically sound design, and that is certified by statistically valid testing. • CSE represents a paradigm shift from traditional, craft-based SE practices to rigorous, engineering-based practices.

  4. CSE: Characteristics • Objective: Achieve zero defects with certified reliability • Focus: Defect prevention rather than defect correction • Process: Incremental (short) development cycles; long product life

  5. CSE: History • 1983: The original idea of Cleanroom came from one of Dr. Harlan Mills' published papers • 1987: Proposed by Dr. Mills as a SE methodology; the name "Cleanroom" was borrowed from the electronics industry • 1988: Defense Advanced Research Projects Agency (DARPA) Software Technology for Adaptable Reliable Systems (STARS) focus on Cleanroom • 1991-1992: Prototyping of the Cleanroom Process Guide • 1992: The foundational book on CSE was published • 1992-1993: Army and Air Force demonstrations of Cleanroom technology • 1993-1994: Prototyping of Cleanroom tools • 1995: Commercialization of a Cleanroom certification tool • 1995: Cleanroom and CMM consistency review • …

  6. Comparison

  7. Cleanroom SE: Technologies • Development practices are based on mathematical function theory • Test practices are based on applied statistics. • Analysis and design models are based on an incremental software model and are created using a box structure representation. A box encapsulates the system (or some aspect of the system) at a specific level of abstraction. • Correctness verification is applied once the box structure design is complete.

  8. Cleanroom SE: Technologies • Software is tested by defining a set of usage scenarios (i.e., operational modes), determining the probability of use for each scenario (i.e., the operational profile), and then defining random tests that conform to those probabilities. • Error records are checked, but no corrective actions are taken. Only a certification test is conducted, checking whether the current failure intensity meets the projected reliability (i.e., the failure intensity objective) for the software component; a minimal sketch of this check follows.
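A minimal sketch of that certification check, in Java. Nothing here comes from the slides: the objective FIO_PER_HOUR and the observed counts are illustrative assumptions; the current failure intensity is simply failures divided by execution time, compared against the objective.

public class CertificationCheck {
    static final double FIO_PER_HOUR = 0.01;   // assumed objective: 1 failure per 100 h

    public static void main(String[] args) {
        int failures = 3;          // failures recorded during statistical testing (assumed)
        double execHours = 450.0;  // total execution time of the test runs (assumed)

        // Current failure intensity = failures per unit of execution time
        double currentIntensity = failures / execHours;

        System.out.printf("Current intensity: %.4f failures/h%n", currentIntensity);
        System.out.println(currentIntensity <= FIO_PER_HOUR
                ? "Objective met: component may be certified"
                : "Objective not met: reject or continue testing");
    }
}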

  9. CSE: Processes /1 Cleanroom processes: • Management process • Specification process • Development process • Certification process

  10. CSE: Processes /2 1. Cleanroom Management Process • Project planning • Project management • Performance improvement • Engineering change 2. Cleanroom Specification Process • Requirements analysis • Function specification • Usage specification • Architecture specification • Increment planning

  11. CSE: Processes /3 3. Cleanroom Development Process • Increment design • Correctness verification • Software reengineering (reuse) 4. Cleanroom Certification Process • Usage modeling and test planning • Statistical testing and certification

  12. CSE: Management Process • Project Planning • Cleanroom engineering guide • Software development plan (incremental) • Project Management • Project record • Performance Improvement • Performance improvement plan • Engineering Change • Engineering change log

  13. CSE: Specification Process /1 • Requirements Analysis • Elicit and analyze requirements • Define requirements for the software product • Obtain agreement with the customer on the requirements • Requirements are reconfirmed or clarified throughout the incremental development and certification process. • Function Specification • Based on the results of requirements analysis • Specify the complete functional behavior of the software in all possible modes of use • Obtain agreement with the customer on the specified function as the basis for software development and certification

  14. CSE: Specification Process /2 • Usage Specification • Identify and classify software users, usage scenarios, and environments of use (operational modes) • Establish and analyze the probability distribution for software usage models • Obtain agreement with the customer on the specified usage as the basis for software certification • Architecture Specification • Define the conceptual model, the structural organization, and the execution characteristics of the software • Architecture definition is a multi-level activity that spans the life cycle

  15. CSE: Specification Process /3 • Increment Planning • Allocate customer requirements defined in the Function Specification to a series of software increments that satisfy the Software Architecture • Define schedule and resource allocations for increment development and certification • Obtain agreement with the customer on the increment plan

  16. CSE: Development Process [Diagram: two overlapping increments, each flowing RG → BSS → FD → CV → CG → CI → SUT → C, with TP feeding the testing activities and SE preceding Increment 1] • SE: System Engineering • RG: Requirement Gathering • BSS: Box Structure Specification • FD: Formal Design • CV: Correctness Verification • CG: Code Generation • CI: Code Inspection • TP: Test Planning • SUT: Statistical Use Testing • C: Certification Test

  17. Cleanroom Strategy /1 • Requirement gathering (RG) • A detailed description of customer-level requirements for each increment. • Box structure specification (BSS) • Functional specification using box structures to separate behavior, data, and procedures. • Formal design (FD) • Specifications (black boxes) are refined into analogous architectural (state box) and procedural (clear box) designs.

  18. Cleanroom Strategy /2 • Correctness verification (CV) • A set of correctness verification activities is applied first to the design and later to the code. First-level verification is via application of a set of "correctness questions". • Code generation, inspection & verification (CG & CI) • The box structure is transformed into a programming language. Walkthrough and code inspection techniques are used to ensure semantic conformance with the box structure.

  19. Cleanroom Strategy /3 • Statistical test planning (TP) • Planning the tests based on operational modes, operational profiles, and reliability. • Statistical use testing (SUT) • Creating test cases, executing them, and collecting failure data. • Certification (C) • Conducting a certification test, rather than reliability growth testing, to accept/reject developed software components (using a reliability demonstration chart, etc.).

  20. Box Structure /1 • Box structures are used to move from an abstract specification to a detailed design providing implementation details

  21. Box Structure /2 • Black box • Specifies the behavior of a system or a part of a system. The system responds to specific stimuli (events) by applying a set of transition rules that map stimuli to responses. • State box • Encapsulates state data and services (operations). Inputs to and outputs from the state box are represented. • Clear box • Implements the transition functions implied by the state box. It contains the procedural design of the state box. A minimal sketch of the three views follows.
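To make the three views concrete, here is a minimal Java sketch of a trivial "running count" service seen as a black box, a state box, and a clear box. All class and method names are illustrative assumptions, not from the slides.

import java.util.List;

// Black box: the response is a pure function of the stimulus history (S* -> R);
// no state is visible.
interface CountBlackBox {
    int respond(List<Integer> stimulusHistory);  // e.g., respond = size of the history
}

// State box: the stimulus history is replaced by encapsulated state data.
class CountStateBox {
    private int count = 0;                       // state data
    public int respond(int stimulus) { count++; return count; }
}

// Clear box: the transition function implied by the state box is given a
// procedural design (here, trivially, a single assignment).
class CountClearBox {
    private int count = 0;
    public int respond(int stimulus) {
        count = count + 1;                       // procedural refinement
        return count;
    }
}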

  22. Box Structure /3 [Diagram: a black box f: S* → R is refined into a state box (state data T with transition function f: S* → R) and then into a clear box whose procedure is composed of subfunctions g11, g12, g13] • Black boxes (specifications) • State boxes (architectural designs) • Clear boxes (component designs)

  23. Box Structure /4 [Diagram: refinement hierarchy in which a top-level black box BB1 expands into black boxes BB1.1 … BB1.n; BB1.1 expands into BB1.1.1 … BB1.1.3; BB1.1.1 is refined into state box SB1.1.1 and then into clear boxes CB1.1.1 – CB1.1.3]

  24. Correctness Verification • Mathematically based techniques are used to verify the correctness of a software increment • Examples • If a function f is expanded into a sequence g and h, the correctness rule for all inputs to f is: Does g followed by h do f? • If a function f is expanded into an if-then-else conditional, the correctness rule for all inputs to f is: Whenever condition <c> is true, does g do f, and whenever <c> is false, does h do f? • When a function f is refined as a loop, the correctness rule for all inputs to f is: Is termination guaranteed? Whenever <c> is true, does g followed by f do f, and whenever <c> is false, does skipping the loop still do f?
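A tiny worked instance of the if-then-else rule (my illustration, not from the slides): take the intended function f(x) = |x| and the refinement below; the correctness questions are answered in the comments.

public class AbsVerification {
    // Intended function f: return |x|.  Refinement: if <c> then g else h.
    static int abs(int x) {
        if (x >= 0) {        // condition <c>
            return x;        // g
        } else {
            return -x;       // h
        }
    }
    // Correctness questions:
    //   Whenever <c> is true (x >= 0), does g do f?  x itself is |x|: yes.
    //   Whenever <c> is false (x < 0), does h do f?  -x is |x| here: yes.
    // Both hold for all inputs, so the refinement is verified.
    public static void main(String[] args) {
        System.out.println(abs(-7));   // prints 7
    }
}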

  25. Advantages of Verification • Design verification has the following advantages: • Verification is reduced to a finite process. • Every step of design and every line of code can be verified. • A near-zero defect level is achieved. • Scalability is possible. • It produces better code than unit testing can.

  26. CSE: Certification Process /1 • Usage modeling and test planning • A usage model represents a possible usage scenario of the software • The usage model is based on the usage specification and is used for testing

  27. CSE: Certification Process /2 • Statistical Testing and Certification • Testing is conducted in a formal statistical design under experimental control. • The software is demonstrated to perform correctly with respect to its specification. • Statistically valid estimates of the properties addressed by the certification goals are derived for the software. • Management decisions on continuation of testing and certification of the software are based on statistical estimates of software quality.

  28. Cleanroom Testing • Uses the statistical usage concept for testing. • Determine a usage probability distribution via the following steps (a sketch follows this slide): • Analyze the specification to identify a set of stimuli (direct and indirect input variables). • Create usage scenarios (operational modes). • Assign a probability of use to each stimulus (operational profile). • Generate test cases for each stimulus according to the usage probability distribution.
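A minimal sketch of the last step, generating random test stimuli from a usage probability distribution. The ATM-style stimuli and probabilities below are invented for illustration, not taken from the slides.

import java.util.Random;

public class UsageModelSampler {
    static final String[] STIMULI = {"withdraw", "deposit", "balance", "pinChange"};
    static final double[] PROB    = {0.50, 0.25, 0.20, 0.05};  // assumed operational profile

    // Inverse-transform sampling over the usage probability distribution.
    static String nextStimulus(Random rnd) {
        double u = rnd.nextDouble(), cum = 0.0;
        for (int i = 0; i < STIMULI.length; i++) {
            cum += PROB[i];
            if (u < cum) return STIMULI[i];
        }
        return STIMULI[STIMULI.length - 1];    // guard against rounding error
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);           // fixed seed -> reproducible test suite
        for (int i = 0; i < 10; i++)           // 10 random test stimuli
            System.out.println(nextStimulus(rnd));
    }
}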

  29. Certification Test • The Cleanroom approach DOES NOT emphasize: • Unit or integration testing. • Bug fixing as a result of testing, and regression testing. • The certification procedure involves the following: • Create usage scenarios. • Specify a usage profile. • Generate test cases from the profile. • Execute test cases and record failure data. • Compute reliability and certify the component or system (using a reliability demonstration chart, etc.).

  30. Reliability Demo Chart • An efficient way of checking whether the failure intensity objective (F) is met, based on failure data collected at successive time points. • Vertical axis: failure number • Horizontal axis: normalized failure data, i.e., failure time multiplied by the failure intensity objective F
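One common way to compute the chart's accept/reject boundaries is the sequential-sampling formulation from the software reliability engineering literature (e.g., Musa); the sketch below assumes that formulation and typical parameter values for supplier risk α, consumer risk β, and discrimination ratio γ. Treat it as an assumed formulation, not as something given on the slide.

public class DemoChartBoundaries {
    public static void main(String[] args) {
        double alpha = 0.1, beta = 0.1, gamma = 2.0;  // assumed risks and discrimination ratio
        double A = Math.log((1 - beta) / alpha);      // reject-boundary intercept
        double B = Math.log(beta / (1 - alpha));      // accept-boundary intercept

        // Horizontal axis: normalized failure time T = failure time x failure
        // intensity objective.  For each failure number n, print both boundaries.
        for (int n = 0; n <= 5; n++) {
            double accept = (B - n * Math.log(gamma)) / (1 - gamma);
            double reject = (A - n * Math.log(gamma)) / (1 - gamma);
            System.out.printf("n=%d  accept if T >= %.2f  reject if T <= %.2f%n",
                              n, accept, Math.max(reject, 0.0));
        }
        // With these values the accept region starts at T ~ 2.2 for n = 0,
        // matching the familiar shape of the chart.
    }
}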

  31. Example • Automated Teller Machine (ATM) • The customer has a PIN and an access card to use the ATM • The customer can deposit money to and withdraw money from the account • Transactions involve no bank employee

  32. Example: Usage Model

  33. Example: Black Boxes • Black boxes • Card Processor • In: ValidCard(cardNum) • Out: showMessage(message), Boolean • Cash Dispenser • In: enoughCashInMachine(amount), dispenseCash(amount) • Out: showMessage(message), dispense(amount), Boolean • Transaction Manager • In: ValidCustomer(cardNum, pin), AmountLimit(amount), EnoughCashInAccount(amount) • Out: showMessage(message), Boolean
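One possible rendering of these black boxes as Java interfaces. The parameter and return types are assumptions inferred from the slide (the trailing "Boolean" is read as the return type of the In operations).

interface CardProcessor {
    boolean validCard(String cardNum);            // In: ValidCard(cardNum) -> Boolean
    void showMessage(String message);             // Out: showMessage(message)
}

interface CashDispenser {
    boolean enoughCashInMachine(double amount);   // In
    void dispenseCash(double amount);             // In
    void showMessage(String message);             // Out
    void dispense(double amount);                 // Out
}

interface TransactionManager {
    boolean validCustomer(String cardNum, String pin);  // In
    boolean amountLimit(double amount);                 // In
    boolean enoughCashInAccount(double amount);         // In
    void showMessage(String message);                   // Out
}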

  34. Example: State Box [State-machine diagram: /insert card → /get pin → validity check ([true]/[false]) → Menu → /get amount → CheckAccount ([true]/[false]) → CheckMachineCash ([true]/[false]) → dispense]

  35. Example: Clear Box Spec /1 (slides 35-41 each highlight one step of the state-box diagram above) // Get customer PIN ValidCustomer(cardNum, pin)

  36. Example: Clear Box Spec /2 // Bank returns false // Show message showMessage(mesg);

  37. Example: Clear Box Spec /3 // Bank returns true // Get amount getAmount(amount);

  38. Example: Clear Box Spec /4 // Bank returns false for daily limit and/or balance // Show message showMessage(mesg);

  39. Example: Clear Box Spec /5 // Bank returns true for daily limit and balance Dispenser.enoughCashInAccount(amount)

  40. Example: Clear Box Spec /6 // Dispenser returns false for cash level // Show message showMessage(mesg);

  41. Example: Clear Box Spec /7 // Dispenser returns true for cash amount Dispenser.dispense(amount);

  42. CSE: Team • Specification team: • Responsible for developing and maintaining the system specification • Development team: • Responsible for developing and verifying the software • The software is not executed during this process • Certification team: • Responsible for developing a set of statistical tests to exercise the software after development • Use reliability growth models to assess reliability

  43. CSE: Evaluation • The basic features of Cleanroom development that distinguish it from other SE methodologies are: • Formal specification (box structure) • Correctness verification • Statistical certification test

  44. Evaluation: Formal Spec • Advantages: • Mathematical and logical foundation for defining requirements accurately with precise notation. • Proactive rather than reactive approach to requirements validation. • Ambiguous, inconsistent, and conflicting requirements are caught before the system test. • Box structure uses black, state, and clear boxes in a stepwise approach to refining requirements. • Usage models define how the software is to be used by the customer.

  45. Evaluation: Formal Spec • Disadvantages: • Requires extra skills and knowledge (e.g., mathematics). • Requires substantial effort to express the system fully in a formal specification. • On average, Cleanroom projects spend 60-80% of their time in analysis and design. • Ideal for safety- or mission-critical systems, not for ordinary commercial development. • Lacks adequate CASE tool support. • Project-specific: if time-to-market is a concern, it might not be the right choice.

  46. Evaluation: Incremental Devel • Advantages • Quick and clean development in Cleanroom engineering • Continuous validation • Provides measurable progress • Manages higher-risk requirements early (e.g., by prototyping) • Tracking of requirements • Stepwise building of functionality that satisfies stakeholders' requirements • Allows fast delivery of the most important parts • Focus on planning and discipline at both the management and technical levels • Statistical testing keeps project quality control at the proper level • Verifiable specifications

  47. Evaluation: Incremental Devel • Disadvantages: • Incomplete or conflicting requirements cannot be resolved at the beginning in order to determine the increments • Risk analysis is not incorporated explicitly • Configuration management needs more care • Requires extra planning at both the management and technical levels • Stable requirements are needed for each increment, i.e., the process cannot adapt quickly to rapidly changing requirements

  48. Evaluation: Certification Test • Advantages • Determines a level of confidence that a software system conforms to its specification. • Able to statistically evaluate and infer whether the quality of the software system meets all requirements. • A quantitative approach that is verifiable. • Quantitative data can be recorded and used later for benchmarking, etc.

  49. Evaluation: Certification Test • Disadvantages • Testing is derived from a usage model that must be exhaustive in order to select a subset for testing • Statistical testing and verification are more reliable when based on some historical data • It would be more effective if integrated with other testing methods • The testing is not suitable for bug hunting • Human residual coding errors may not be addressed

  50. CSE vs. OO Development • Similarities: • Life cycle: CSE incremental development and OO iterative development • Usage: OO use cases and the CSE usage model • State machine representation: Both use a state machine representation for the behaviour of a design • Reuse: OO classes and Cleanroom common services
