Politehnica University of Timisoara • Mobile Computing, Sensors Network and Embedded Systems Laboratory • Embedded Systems Testing: Testing throughout the software life cycle • Instructor: Razvan BOGDAN
Outline • Software Development Models • Test Levels • Testing Types • Maintenance Testing
SOFTWARE DEVELOPMENT MODELS • The development process adopted for a project will depend on the project aims and goals. • Numerous development life cycles have been developed in order to achieve different objectives. • These life cycles range from lightweight and fast methodologies, where time to market is of the essence, through to fully controlled and documented methodologies where quality and reliability are key drivers.
SOFTWARE DEVELOPMENT MODELS • The life cycle model adopted for a project will have a big impact on the testing that is carried out. • Test activities are closely related to software development activities. • The life cycle model will define the what, where, and when of our planned testing, influence regression testing, and largely determine which test techniques to use.
SOFTWARE DEVELOPMENT MODELS • In every development life cycle, a part of testing is focused on verification testing and a part is focused on validation testing. • Verification focuses on the question 'Is the deliverable built according to the specification?'. • Validation focuses on the question 'Is the deliverable fit for purpose, e.g. does it provide a solution to the problem?'.
SOFTWARE DEVELOPMENT MODELS • Waterfall • It has a natural timeline where tasks are executed in a sequential fashion. • We start at the top of the waterfall with a feasibility study and flow down through the various project tasks finishing with implementation into the live environment • Testing tends to happen towards the end of the project life cycle so defects are detected close to the live implementation date.
SOFTWARE DEVELOPMENT MODELS • Waterfall • With this model it has been difficult to get feedback passed backwards up the waterfall • There are difficulties if we need to carry out numerous iterations for a particular phase
SOFTWARE DEVELOPMENT MODELS • V-model • The V-model was developed to address some of the problems experienced using the traditional waterfall approach • The V-model provides guidance that testing needs to begin as early as possible in the life cycle • There are a variety of activities that need to be performed before the end of the coding phase. These activities should be carried out in parallel with development activities
SOFTWARE DEVELOPMENT MODELS • Testing along the V-model • Development and test are two equal branches • Each development level has a corresponding test level • Tests (right hand side) are designed in parallel with software development (left hand side) • Testing activities take place throughout the complete software life cycle
SOFTWARE DEVELOPMENT MODELS • Testing along the V-model • Software development branch • Requirements Definition • specification documents • Functional System Design • design functional program flow • Technical System Design • define architecture/interfaces, their interactions • Component specification • structure of component • Programming • create executable code
SOFTWARE DEVELOPMENT MODELS • Testing along the V-model • Software test branch • 4. Acceptance Testing • formal test of customer requirements • 3. System Testing • the integrated system, against its specifications • 2. Integration Testing • component interfaces • the software does what it should be doing from a functional point of view • 1. Component Testing • a component’s functionality
SOFTWARE DEVELOPMENT MODELS • Verification vs. Validation • Verification • Confirmation by examination and through provision of objective evidence that specified requirements have been fulfilled. [ISO 9000] • Main issue: Did we proceed correctly when building the system? Did we use this component correctly? • Validation • Confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled. [ISO 9000] • Main issue: Did we build the right software system? Was it appropriate to use that particular component, or should we have used another one?
SOFTWARE DEVELOPMENT MODELS • Verification within the general V-Model • Each development level is verified against the contents of the level above it • to verify: to give proof of evidence, to substantiate • to verify means to check whether the requirements and definitions of the previous level were implemented correctly
SOFTWARE DEVELOPMENT MODELS • Validation within the general V-Model • Validation refers to the correctness of each development level • to validate: to give proof of having value • to validate means to check the appropriateness of the results of one development level • e.g., is the component offering the expected behavior? Is the system offering the particular behavior that is mentioned/agreed in the System Design Document?
V-model • Although variants of the V-model exist, a common type of V-model uses four test levels. • component testing: searches for defects in and verifies the functioning of software components (e.g. modules, programs, objects, classes, etc.) that are separately testable; • integration testing: tests interfaces between components, interactions with different parts of a system such as the operating system, file system and hardware, or interfaces between systems;
V-model • Although variants of the V-model exist, a common type of V-model uses four test levels. • system testing: concerned with the behavior of the whole system/product as defined by the scope of a development project or product. The main focus of system testing is verification against specified requirements; • acceptance testing: validation testing with respect to user needs, requirements, and business processes, conducted to determine whether or not to accept the system.
SOFTWARE DEVELOPMENT MODELS • Iterative life cycles • Not all life cycles are sequential. The V-model takes a large amount of time and is applied in those cases where the requirements are not changing (admittedly, not that often) • There are also iterative or incremental life cycles where, instead of one large development time line from beginning to end, we cycle through a number of smaller self-contained life cycle phases for the same project.
SOFTWARE DEVELOPMENT MODELS • Iterative life cycles • A common feature of iterative approaches is that the delivery is divided into increments or builds with each increment adding new functionality • The increment produced by an iteration may be tested at several levels as part of its development. • Subsequent increments will need testing for the new functionality, regression testing of the existing functionality, and integration testing of both new and existing parts
SOFTWARE DEVELOPMENT MODELS • Iterative life cycles • Regression testing is increasingly important on all iterations after the first one. This means that more testing will be required at each subsequent delivery phase which must be allowed for in the project plans • This life cycle can give early market presence with critical functionality, can be simpler to manage because the workload is divided into smaller pieces, and can reduce initial investment although it may cost more in the long run.
SOFTWARE DEVELOPMENT MODELS • Examples of iterative or incremental development models are: • Prototyping • Quickly building a usable representation of the system, followed by successive modification until the system is ready • Rapid Application Development (RAD) • The user interface is implemented using out-of-the-box functionality, standing in for the functionality that will be developed later • Rational Unified Process (RUP) • An object-oriented model and a product of the company Rational/IBM. It mainly provides the modelling language UML and support for the Unified Process • Agile development • Development and testing take place without a formalized requirements specification
SOFTWARE DEVELOPMENT MODELS • Rapid Application Development (RAD) is formally a parallel development of functions and subsequent integration. • Components/functions are developed in parallel as if they were mini projects; the developments are time-boxed, delivered, and then assembled into a working prototype • This can very quickly give the customer something to see and use, and provide feedback regarding the delivery and their requirements • This methodology allows early validation of technology risks and a rapid response to changing customer requirements. • The Dynamic Systems Development Method (DSDM) is a refined RAD process that puts controls in place to stop the process from getting out of control
SOFTWARE DEVELOPMENT MODELS • The RAD development process encourages active customer feedback. • An early business-focused solution in the market place gives an early return on investment (ROI) and can provide valuable marketing information for the business
SOFTWARE DEVELOPMENT MODELS • Extreme Programming (XP) is currently one of the most well-known Agile development life cycle models. • Some characteristics of XP are: • It promotes the generation of business stories to define the functionality. • It demands an on-site customer for continual feedback and to define and carry out functional acceptance testing. • It promotes pair programming and shared code ownership amongst the developers.
SOFTWARE DEVELOPMENT MODELS • Agile development • Some characteristics of XP are: • It states that component test scripts shall be written before the code is written and that those tests should be automated. • It states that integration and testing of the code shall happen several times a day. • It states that we always implement the simplest solution to meet today's problems.
SOFTWARE DEVELOPMENT MODELS • XP developers write every test case they can think of and automate them. • Every time a change is made in the code it is component tested and then integrated with the existing code, which is then fully integration-tested using the full set of test cases. • This gives continuous integration, which means that changes are incorporated continuously into the software build. • At the same time, all test cases must be running at 100%, meaning that all the test cases that have been identified and automated are executed and pass.
SOFTWARE DEVELOPMENT MODELS • Iteration models: Test Driven Development (TDD) • Based on: test case suites • Prepare test cycles • Automated testing using test tools • Development according to test cases • Prepare early versions of the component for testing • Automatic execution of tests • Correct defects on further versions • Repeat test suites until no errors are found • First the tests are designed, then the software is programmed; • test first, code after
SOFTWARE DEVELOPMENT MODELS • Principles of all models • each development activity must be tested • no piece of software may be left untested, whether it was developed “in one procedure” or iteratively • each test level should be tested specifically • each test level has its own test objectives • the test performed at each level must reflect these objectives • testing begins long before test execution • as soon as development begins, the preparation of the corresponding tests can start • this is also the case for document reviews starting with concepts specification and overall design
SOFTWARE DEVELOPMENT MODELS - Summary • whichever life cycle model is being used, there are several characteristics of good testing: • for every development activity there is a corresponding testing activity; • each test level has test objectives specific to that level; • the analysis and design of tests for a given test level should begin during the corresponding development activity; • testers should be involved in reviewing documents as soon as drafts are available in the development cycle.
Outline • Software Development Models • Test Levels • Testing Types • Maintenance Testing
TEST LEVELS • Testing levels: • Component testing • Integration testing • System testing • Acceptance testing
TEST LEVELS • 1. Component (unit) testing • also known as unit, module or program testing • test of each software component after its realization • searches for defects in, and verifies the functioning of, software (e.g. modules, programs, objects, classes, etc.) that is separately testable. • Due to the naming of components in different programming languages, the component test may be referred to as: • module test (e.g. in C) • class test (e.g. in Java or C++) • unit test (e.g. in Pascal) • The components are referred to as modules, classes or units. Because of the possible involvement of developers in the test execution, these tests are also called developer’s tests
TEST LEVELS • 1. Component testing • Test cases may be derived from (= Test basis ): • Component requirements • (Detailed) software design • Code • Data models • Typical test objects: • Components/classes/units/modules • Programs • Data conversion/migration programs • Database modules
TEST LEVELS • 1. Component testing: Scope • Only single components are tested • components may consist of several smaller units • the test object often cannot be tested stand-alone • Every component is tested on its own • finding failures caused by internal defects • cross effects between components are not within the scope of this test
TEST LEVELS • 1. Component testing: Functional/non-functional testing • Testing functionality • Every function must be tested with at least one (positive) test case • are the functions working correctly, are all specifications met? • Defects found commonly are: • defects in processing data, often near boundary values • missing functions • Testing robustness (resistance to invalid input data) • How reliable is the software? • Test cases representing invalid inputs are called negative test cases • A robust system provides appropriate handling of wrong inputs • Wrong inputs accepted by the system may produce failures in further processing (wrong output, system crash) • Other non-functional attributes may be tested • e.g., performance and stress testing, reliability
TEST LEVELS • 1. Component testing: Test harness /1 • Test execution of components often requires drivers and stubs • Drivers handle the interface to the component • call the component to be tested • simulate inputs, record outputs and provide a test harness • they are built using programming tools • Stubs replace or simulate components not yet available or not part of the test object • called from the software component to be tested • To program drivers and/or stubs one • must have programming skills • needs to have the source code available • may need some special tools
TEST LEVELS • 1. Component testing: Test harness /2 • Example 1: Stubs • Q: How to test module L, when Module K and Module F, which it calls, are not yet developed?
TEST LEVELS • 1. Component testing: Test harness /3 • Example 1: Stubs • A: Stubs, called from the software component to be tested: Module K and Module F are replaced by stubs, dummy code that SIMULATES the functionality of the undeveloped modules
TEST LEVELS • 1. Component testing: Test harness /4 • Example 1: • A: Stubs: called from the software component to be tested

void functionToBeTested(/* params */) {
    ...
    int p = price(param1);
    ...
}

int price(int param) { // this is the stub
    return 10; // we don’t care what the price is; we just need a value so we can test the other function
}
TEST LEVELS • 1. Component testing: Test harness /5 • Example 2: What about drivers? • Q: How to test module L, when Module K, which calls it, is not yet developed?
TEST LEVELS • 1. Component testing: Test harness /6 • Example 2: What about drivers? • Q: How to test module L? • A: A driver calls the component to be tested: dummy code takes Module K’s place, calls Module L, and records the values it returns
TEST LEVELS • 1. Component testing: Test harness /7 • Example 2: What about drivers? • Q: How to test module L? • A: A driver calls a component to be tested

void functionThatCallsPrice(/* params */) { // this is the driver
    int p = price(param1);
    printf("Price is: %d", p);
}

int price(int param) {
    // complex equations and database queries that determine the real price
}
TEST LEVELS • 1. Component testing: Test harness /8 • Example 3: Driver as a UI in which the tester can introduce data and simulate input for the component; this application is connected to the component, the unit under test (F)
TEST LEVELS • 1. Component testing: Methods • The program code is available to the tester • In case “tester = developer”: testing takes place with a strong development focus • Knowledge about functionality, component structure and variables may be applied to design test cases • Often functional testing will apply • Additionally, the use of debuggers and other development tools (e.g. unit test frameworks) allows direct access to program variables • Source code knowledge allows the use of white-box methods for component testing
TEST LEVELS • 1. Component testing - Summary • A component is the smallest system unit specified • Module, unit, class and developer’s test are used as synonyms • Drivers will execute the component functions and adjacent functions that are replaced by stubs • Component tests may check functional and non-functional system properties
TEST LEVELS • 2. Integration testing • Test basis • Software and system design • Architectural design • Workflows • Use cases • Interface specifications • Data models • Typical test objects • Subsystems • Database implementation • Infrastructure • Interfaces • System configuration • Configuration data
TEST LEVELS • 2. Integration (interface) testing • tests interfaces between components, interactions with different parts of a system such as the operating system, file system and hardware, or interfaces between systems. • each component has already been tested for its internal functionality (component test) => integration tests examine the external functions after component testing • examines the interaction of software elements (yes, components!) between different systems or between hardware and software • Integration is the activity of combining individual software components into a larger subsystem or into a series of systems • Further integration of subsystems is also part of the system integration process • Integration testing is often carried out by the integrator, but preferably by a specific integration tester or test team