Explore the fundamentals of software testing, including testing processes, the Test Maturity Model, fault models, defect origins, and verification vs. validation concepts.
Advanced Software Engineering: Software Testing, 2007
COMP 3705 (Lecture 1)
Sada Narayanappa
TA: Seif Azgandhi
Anneliese Andrews (Chair, DU), Thomas Thelin, Carina Andersson
Facts about testing
System development [Brooks75]:
• 1/3 planning
• 1/6 coding
• 1/4 component test
• 1/4 system test
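As a quick arithmetic check of Brooks' split, the sketch below (Python, with a made-up 12 person-month project; none of this comes from the slides) confirms that the four fractions cover the whole effort and that testing accounts for half of it.

```python
# Sanity-checking Brooks' split: the fractions add up to 1, and half of the
# effort goes to testing. The 12 person-month total is a made-up figure.
shares = {"planning": 1/3, "coding": 1/6, "component test": 1/4, "system test": 1/4}
total_effort = 12  # hypothetical project size in person-months

assert abs(sum(shares.values()) - 1.0) < 1e-9            # the four parts cover the project
testing_share = shares["component test"] + shares["system test"]
print(f"testing takes {testing_share:.0%} of the effort")  # -> 50%

for activity, share in shares.items():
    print(f"{activity:15s} {share * total_effort:4.1f} person-months")
```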
TMM test process evaluation
• Guided by the Capability Maturity Model (CMM)
• Stages or levels to evolve from and to
• Each level, except Level 1, has structure
• Level 2 and above have:
  • structure
  • goals
  • organization structure
Internal structure of TMM
• Levels indicate a testing capability and contain maturity goals
• Maturity goals are supported by maturity subgoals
• Maturity subgoals are achieved by activities, tasks, and responsibilities (ATRs)
• ATRs address implementation and organizational adaptation
• ATRs are organized by 3 critical views: manager, developer, user/client
TMM levels
• Level 1 – No process defined; debugging and testing are treated as the same activity
• Level 2 – Test and debug tools / test plans / basic test process
• Level 3 – Test organization / technical training / test lifecycle / control and monitoring; supports the V-model
• Level 4 – Review process / test measurement / software quality evaluation; test logging with severity
• Level 5 – Defect prevention / quality control / process optimization
Good enough quality
To claim that any given thing is good enough is to agree with all of the following propositions:
• It has sufficient benefits
• It has no critical problems
• The benefits sufficiently outweigh the problems
• In the present situation, and all things considered, further improvement would be more harmful than helpful
James Bach, IEEE Computer, 30(8):96–98, 1997.
Why use testing?
• Risk mitigation
• Faults are found early
• Faults can be prevented
• Reduced lead time
• Deliverables can be reused
• …
Why do faults occur in software?
• Software is written by humans
  • who know something, but not everything
  • who have skills, but aren't perfect
  • who don't usually use rigorous methods
  • who do make mistakes (errors)
• Under increasing pressure to deliver to strict deadlines
  • no time to check, assumptions may be wrong
  • systems may be incomplete
• Software is complex, abstract and invisible
  • hard to understand
  • hard to see if it is complete or working correctly
  • no one person can fully understand large systems
  • numerous external interfaces and dependencies
Fault model – origins of defects
• Defect sources: lack of education, poor communication, oversight, transcription, immature process
• These sources lead to errors, which become faults/defects in the software, which surface as failures (the impact on the software artifacts)
• From the user's view, the impact is poor quality software and user dissatisfaction
Testing, verification & validation
• Testing – the process of evaluating a program or a system
• Definition 1
  • Verification – are we building the product right?
  • Validation – are we building the right product?
• Definition 2
  • Verification – the product satisfies the conditions set at the start of the phase
  • Validation – the product satisfies the requirements
Definitions
• A failure is an event; a fault is a state of the software caused by an error
• Error – human mistake
• Fault / defect – anomaly in the software
• Failure – inability to perform its required functions
• Debugging / fault localization – localizing the fault, repairing it, retesting
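A minimal sketch of the error → fault → failure chain, in Python; the `average` function and its inputs are hypothetical and only serve to make the three terms concrete.

```python
def average(values):
    """Return the arithmetic mean of a list of numbers."""
    # Error: the programmer assumed the list is never empty.
    # Fault: that assumption leaves a defect (no empty-list check) in the code.
    return sum(values) / len(values)

print(average([2, 4, 6]))   # the fault is present, but this input does not trigger it

try:
    average([])             # failure: the fault becomes observable as a ZeroDivisionError
except ZeroDivisionError as exc:
    print("failure observed:", exc)
```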
Definitions
• A test case consists of (IEEE definition; an organization may define additional attributes):
  • a set of inputs
  • execution conditions
  • expected outputs
• A test is:
  • a group of related test cases
  • the procedures needed to carry them out
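One way to make these attributes concrete is to record them as data. The sketch below is a Python illustration, not an IEEE-prescribed format; the `TestCase` fields and the login scenario are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    # The three IEEE attributes plus an id; an organization may add more fields.
    test_id: str
    inputs: dict
    execution_conditions: list = field(default_factory=list)
    expected_output: object = None

tc1 = TestCase(
    test_id="LOGIN-001",
    inputs={"user": "alice", "password": "secret"},
    execution_conditions=["user 'alice' exists", "account is not locked"],
    expected_output="login accepted",
)
```

In these terms, a test is simply a group of such related cases plus the procedure that runs them.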
Scripted and non-scripted testing
• In scripted testing, test cases are pre-documented in detailed, step-by-step descriptions
  • different levels of scripting are possible
  • scripts can be manual or automated
• Non-scripted testing is usually manual testing without detailed test case descriptions
  • can be disciplined, planned, and well documented exploratory testing
  • or ad-hoc testing
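A minimal sketch of a scripted, automated test using Python's unittest; the `ShoppingCart` class is a hypothetical system under test, stubbed inline so the script is self-contained. The point is that the steps and the expected result are written down in advance and can be replayed exactly.

```python
import unittest

class ShoppingCart:
    """Hypothetical system under test, stubbed here so the script can run."""
    def __init__(self):
        self.items = {}
    def add(self, name, price):
        self.items[name] = price
    def total(self):
        return sum(self.items.values())

class ScriptedCartTest(unittest.TestCase):
    def test_add_two_items(self):
        # Step 1: start with an empty cart
        cart = ShoppingCart()
        # Step 2: add two items with known prices
        cart.add("book", 10.0)
        cart.add("pen", 2.5)
        # Step 3: verify the documented expected total
        self.assertEqual(cart.total(), 12.5)

if __name__ == "__main__":
    unittest.main()
```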
Test oracle
• An oracle is the principle or mechanism by which you recognize a problem
• A test oracle provides the expected result for a test, for example:
  • a specification document
  • a formula
  • a computer program
  • a person
• In many cases it is very hard to find an oracle
  • even the customer and end user might not be able to tell which is the correct behaviour
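When a trusted reference exists, it can play the oracle role. In the sketch below (Python; `my_sort` is a hypothetical implementation under test), the built-in `sorted` acts as the oracle that supplies expected results for randomly generated inputs.

```python
import random

def my_sort(values):
    """Implementation under test (a simple insertion sort, for illustration)."""
    result = []
    for v in values:
        i = 0
        while i < len(result) and result[i] <= v:
            i += 1
        result.insert(i, v)
    return result

for _ in range(100):
    data = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
    # The built-in sorted() is the oracle: it provides the expected output.
    assert my_sort(data) == sorted(data), f"mismatch for input {data}"

print("oracle agreed on all 100 random inputs")
```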
Test bed
• The environment containing all the hardware and software needed to test a software component or system
• Examples:
  • simulators
  • emulators
  • memory checkers
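A simulator in a test bed can stand in for real hardware. The sketch below is a Python illustration with entirely hypothetical names: a fake temperature sensor replaces the real device so the controller logic can be exercised on a developer machine.

```python
class FakeTemperatureSensor:
    """Simulator standing in for the real hardware in the test bed."""
    def __init__(self, readings):
        self.readings = iter(readings)
    def read(self):
        return next(self.readings)

def fan_should_run(sensor, threshold=30.0):
    """Component under test: the fan runs when the temperature is too high."""
    return sensor.read() > threshold

hot = FakeTemperatureSensor([35.2])
cold = FakeTemperatureSensor([21.0])
assert fan_should_run(hot) is True
assert fan_should_run(cold) is False
```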
Other definitions
It is important to understand the following definitions:
• Quality – degree of meeting specified requirements
• Metric – a quantitative measure
• Quality metrics:
  • Correctness – performs the specified function
  • Reliability – performs under stated conditions
  • Usability – effort needed to use the system
  • Integrity – withstands attacks
  • Portability / maintainability / interoperability …
Principle 1 – purpose of testing
Testing is the process of exercising a software component using a selected set of test cases, with the intent of:
• revealing defects
• evaluating quality
Principles
2: A good test case – When the test objective is to detect defects, a good test case is one that has a high probability of revealing a yet undetected defect
3: Test results – The results should be inspected meticulously
4: Expected output – A test case must contain the expected output
Principles
5: Input – Test cases should be developed for both valid and invalid input conditions
6: Fault content estimation – The probability of the existence of additional defects in a software component is proportional to the number of defects already detected in that component
7: Test organization – Testing should be carried out by a group that is independent of the development group
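Principle 5 in practice, as a Python sketch; the `parse_age` function and its input sets are assumptions chosen only to show valid inputs paired with expected outputs and invalid inputs paired with expected rejections.

```python
def parse_age(text):
    """Hypothetical function under test: parse an age between 0 and 130."""
    age = int(text)          # raises ValueError for non-numeric input
    if not 0 <= age <= 130:
        raise ValueError(f"age out of range: {age}")
    return age

valid_cases = {"0": 0, "42": 42, "130": 130}    # valid inputs and expected outputs
invalid_cases = ["-1", "131", "forty", ""]      # invalid inputs that must be rejected

for text, expected in valid_cases.items():
    assert parse_age(text) == expected

for text in invalid_cases:
    try:
        parse_age(text)
        raise AssertionError(f"{text!r} was accepted but should have been rejected")
    except ValueError:
        pass  # rejection is the expected behaviour for an invalid input
```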
Principles
8: Repeatable – Tests must be repeatable and reusable
9: Planned – Testing should be planned
10: Life cycle – Testing activities should be integrated into the software life cycle
11: Creative – Testing is a creative and challenging task
Goals of the course
• Knowledge
• Skills
• Attitudes
A test specialist is a trained engineer who has knowledge of test-related principles, processes, measurements, standards, plans, tools, and methods, and who learns how to apply them to the testing tasks to be performed.
Defect reports/analysis – defect classes and the defect repository
• Requirement/specification defect classes: functional description, feature, feature interaction, interface description
• Design defect classes: algorithmic and processing; control, logic, and sequence; data; module interface description; external interface description
• Coding defect classes: algorithmic and processing; control, logic, and sequence; typographical; data flow; data; module interface; code documentation; external hardware, software
• Testing defect classes: test harness, test design, test procedure
• Defect repository: records each defect with its defect class, severity, and occurrences
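A defect repository can start as a structured record per defect plus simple counts. The sketch below is a Python illustration; the fields and sample entries are hypothetical, chosen to mirror the defect class, severity, and occurrences attributes above.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class DefectReport:
    defect_id: str
    phase: str          # requirements, design, coding, or testing
    defect_class: str   # e.g. "control, logic, and sequence"
    severity: str       # e.g. "minor", "major", "critical"

repository = [
    DefectReport("D-001", "coding", "control, logic, and sequence", "major"),
    DefectReport("D-002", "design", "module interface description", "minor"),
    DefectReport("D-003", "coding", "control, logic, and sequence", "critical"),
]

# Occurrence counts per defect class support the analysis step.
occurrences = Counter(report.defect_class for report in repository)
print(occurrences.most_common())
```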
Lectures
• Theory + discussions
• Cover the basic parts of software testing:
  • Introduction
  • Black-box, reliability, usability
  • Inspections, white-box testing
  • Lifecycle, documentation
  • Organization, tools
  • Metrics, TMM
  • Research presentation
• Perspectives covered range from economic overview through technical and technical/manager to managerial
Lab sessions (preparation, execution, report)
• Black-box testing
• Usage-based testing and reliability
• White-box testing
• Inspection and estimation
• Software process simulation
Project: Option 1
• Learn a specific area of software testing
• Collect and summarize research information
• Critical thinking beyond the written information
• Present information in a structured way
• Peer review
Examination
• Written exam based on the book and lab sessions
• Lab sessions (must be approved)
• Project/presentations are graded
• See the class web site for the assignment policy
Schedule
• Read:
  • the course program
  • Projects in Software Testing
• Check the homepage
• Extra lab dates: not yet decided
This week
• Read the course program
• Project
  • Read Projects in Software Testing
  • Exercise on Thursday
  • Decide on a research area/subject
  • Discuss papers with me – describe why the paper is interesting
• Lab
  • Prepare lab 1
• Read Burnstein 1–3
• Prepare Burnstein 4, 12
Project: Option 1
• Research: solve a research problem; survey the state of the art and identify the research problems in some area; develop and justify an extension to an existing technique; etc.
• Evaluation: apply and evaluate a technique, or evaluate a commercial testing or analysis tool.
• Practical: use an existing technique to test a system, or design and implement a prototype for a system.
Project: Option 1
• Read Projects in Software Testing
• Divide into groups (2–3 persons)
• Discuss with me
• http://www.cs.du.edu/~snarayan/sada/teaching/COMP3705/FilesFromCD/Project/Project_SwTest.pdf
Project: Option 2
• Research paper presentation
• Find an interesting paper
  • we come across many research papers during class – pick one for presentation
• There will be four paper presentations – choose your team and prepare
• Reading a paper takes time – start early