Foundation Level Overview • Alexandar Despotovici
About ISTQB • Founded in November 2002 • A not-for-profit association, legally registered in Belgium • In Romania, ISTQB training and certification have been handled since 2007 by ANIS (Employers’ Association of the Software and Services Industry) • ANIS delivers training in partnership with Quality House Bulgaria • ANIS administers exams under an agreement with SEETB (South East European Testing Board)
Learning materials • Foundation Level Syllabus • Standard glossary of terms used in Software Testing • Learning materials from Quality House: • Over 400 slides of training material • Basics of Software Testing • Books: • Software Testing Foundations: A Study Guide for the Certified Tester Exam; A. Spillner, T. Linz, H. Schaefer • Software Testing: An ISTQB-ISEB Foundation Guide, Second Edition; B. Hambling, P. Morgan, A. Samaroo, G. Thomson, P. Williams • Foundations of Software Testing: ISTQB Certification; D. Graham, E. van Veenendaal, I. Evans, R. Black • Exercises and sample exams
Training • Takes three days • Mainly theoretical, but includes some exercises • The whole syllabus is covered, with extra focus on the most important parts • Day three includes a mock exam
Exam • The exam consists of multiple-choice questions • Each question has only one correct answer • Format: • 40 questions • Duration: 1 hour (75 minutes for candidates taking the exam in a language other than their native one) • Questions are defined in a very strict way • The exam can be taken without attending the course • 26 correct answers (65%) are needed to pass • The overall pass rate is around 75%
Exam Questions • Questions are selected according to a defined set of rules: • Proportional distribution of questions based on the Syllabus chapter topics (7 + 6 + 3 + 12 + 8 + 4 = 40 questions in total): • Section 1 – 7 questions • Section 2 – 6 questions • Section 3 – 3 questions • Section 4 – 12 questions • Section 5 – 8 questions • Section 6 – 4 questions • Distribution of questions by cognitive level: • K1 = Remember (mnemonic content) • K2 = Understand (conceptual content) • K3 = Apply (an exercise that requires the use of testing notions or techniques) • K4 = Analyze (an exercise that also requires contextual analysis)
SW Testing Fundamentals • Error – a mistake made by a person (the cause) • Bug = Defect = Fault = Problem – a flaw in a component or system • Failure – the actual deviation of the component or system from its expected delivery, service or result • Defect masking – a fault may be hidden by one or more other faults in different parts of the application
SW Testing Fundamentals • Testing can show failures that are caused by defects • Debugging is a development activity that identifies the cause of a defect, repairs the code and checks that the defect has been fixed correctly • The responsibility for each activity is different: testers test, developers debug • Testing is not debugging!
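To make the error → defect → failure chain concrete, here is a minimal Python sketch (the function and values are illustrative, not from the syllabus); the final assertion deliberately fails to expose the failure:

```python
# The programmer's ERROR (a human mistake) introduces a DEFECT (fault)
# into the code: dividing by a constant instead of len(values).
def average(values):
    return sum(values) / 2  # defect: should be len(values)

# Executing the defective code produces a FAILURE: an observable
# deviation of the result from its expected value.
result = average([2, 4, 6])
assert result == 4.0, f"failure observed: got {result}, expected 4.0"
```

Testing stops at observing the failed assertion; debugging then traces it back to the defect and the original error.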
SW Testing Fundamentals • Principle 1: Testing shows the presence of defects, not their absence • Principle 2: Exhaustive testing is not possible • Principle 3: Testing activities should start as early as possible • Principle 4: Defects tend to cluster together • Principle 5: The pesticide paradox • Principle 6: Testing is context dependent • Principle 7: Absence-of-errors fallacy
SW Testing Fundamentals • The fundamental test process: • Test planning and control – what and how is going to be tested • Test analysis and design – the fine detail of what to test (test conditions) • Test implementation and execution – the most visible part of testing • Evaluating exit criteria and reporting – at the end of test execution • Test closure activities – making sure that everything is tidied away
Testing Throughout the SW Life Cycle • Test types • Functional testing – WHAT the system does • Black-box testing • Includes security and interoperability testing • Non-functional testing – HOW the system works • Usability testing • Stress testing • Storage testing • Performance testing • Recovery testing • Volume testing • Installability testing • Documentation testing • Load testing • Structural testing – white-box testing • Testing related to changes • Re-testing • Regression testing
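As a hedged sketch of the two change-related test types from the slide above (the function and numbers are hypothetical):

```python
def shipping_fee(item_count):
    # Defect fixed here: the threshold was '> 2' by mistake, now '> 3'.
    return 0 if item_count > 3 else 5

# Re-testing (confirmation testing): re-run the test that originally
# failed, to confirm the defect is really fixed.
assert shipping_fee(3) == 5

# Regression testing: re-run existing tests to check that the fix has
# not broken previously working behaviour elsewhere.
assert shipping_fee(4) == 0
assert shipping_fee(0) == 5
```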
Static Techniques • Reviews can be done at all points once a document is released
Static Techniques • Static analysis by tools • Is performed without actually executing the software being examined by the tool; dynamic testing does execute the software code • Can locate defects that are hard to find in dynamic testing • Finds defects rather than failures • Static analysis tools: • Analyze program code, as well as generated output such as HTML and XML • Typically used by: • Developers – before and during component testing • Designers – during software modelling
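As an illustration of the kind of defect static analysis finds without executing the code, consider this Python fragment; a typical analyzer such as pylint (named here as an assumption about tooling) would flag the commented lines:

```python
def get_discount(order_total):
    if order_total > 100:
        rate = 0.1
    # Flagged by static analysis: 'rate' may be used before assignment,
    # a defect found without ever running the code.
    return order_total * rate

unused_threshold = 50  # flagged: variable assigned but never used
```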
Test Design Techniques • The purpose of test design techniques is to identify test conditions and test cases • Some techniques fall clearly into a single category, others have elements of more than one category • The syllabus refers to: • Specification-based approaches as black-box techniques • Structure-based approaches as white-box techniques • Experience-based approaches
Test Design Techniques • Black-box testing focuses on functionality • Black-box test techniques: • Equivalence Partitioning • Boundary Value Analysis • Decision Table (Cause-Effect Graphing) • State Transition • Use Case Testing • Syntax Testing • Random Testing
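A minimal boundary value analysis sketch (the valid range 1–100 is an assumed example, not from the syllabus); the test inputs sit on and just beyond each boundary of the valid partition:

```python
# Hypothetical component under test: accepts ages in the range 1..100.
def is_valid_age(age):
    return 1 <= age <= 100

# Boundary value analysis: exercise each boundary and its neighbours.
boundary_cases = {
    0: False,    # just below the lower boundary
    1: True,     # on the lower boundary
    2: True,     # just above the lower boundary
    99: True,    # just below the upper boundary
    100: True,   # on the upper boundary
    101: False,  # just above the upper boundary
}
for age, expected in boundary_cases.items():
    assert is_valid_age(age) == expected, f"age={age}"
```

Equivalence partitioning would pick one representative per partition (say 50, -10, 200); boundary value analysis concentrates on the edges, where defects tend to cluster.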
Test Design Techniques • White-box testing focuses on code • White-box test techniques: • Statement Testing • Decision Testing • Branch Condition Testing • Branch Condition Combination Testing • Modified Condition Decision Testing • Linear Code Sequence & Jump (LCSAJ) • Data Flow Testing
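A minimal sketch of decision testing against a hypothetical function; two test cases force both outcomes of the single if decision:

```python
def apply_fee(balance):
    # One decision with two outcomes (True / False).
    if balance < 0:
        balance -= 10  # overdraft fee
    return balance

# 100% decision coverage needs both outcomes of the decision:
assert apply_fee(-5) == -15  # decision outcome: True
assert apply_fee(50) == 50   # decision outcome: False

# Note: the first test alone already gives 100% statement coverage,
# which is why decision coverage is the stronger criterion.
```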
Test Design Techniques • In experience-based testing the tests are derived from the tester’s skills, intuition and experience • Error guessing – a technique where the tester’s experience is used to anticipate likely defects • Exploratory testing – an informal test design technique where the tester actively controls the design of the tests
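As a hedged error-guessing sketch (the function and inputs are hypothetical), the tester draws on experience of what typically breaks input parsers:

```python
def parse_quantity(text):
    # Hypothetical component under test.
    return int(text)

# Error guessing: inputs that experience suggests often expose defects.
guessed_inputs = ["", "  ", "-1", "0", "007", "1e3", "ten", None]

for value in guessed_inputs:
    try:
        parse_quantity(value)
    except (ValueError, TypeError) as exc:
        print(f"input {value!r} raised {type(exc).__name__}")
```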
Test Management • Test planning and estimation • Test planning • Entry criteria • Exit criteria • Test estimation • Test strategy and approach • Test progress monitoring and control • Test progress monitoring • Test reporting • Test control
Test Management • Risk – a factor that could result in future negative consequences, usually expressed as impact and likelihood • Level of risk = probability of the risk occurring × impact if it did happen • Project risks • Supplier issues: • Failure of a third party to deliver on time or at all • Contractual issues, such as meeting acceptance criteria • Organisational factors: • Skills, training and staff shortages • Personal issues • Technical issues: • Test environment not ready on time • Problems in defining the right requirements • Product risks • Failure-prone software delivered • The potential that a defect in the software/hardware could cause harm to an individual or company • Software that does not perform its intended functions
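A minimal sketch of the risk-level formula from this slide; the 1–5 ordinal scales and the example ratings are assumptions for illustration:

```python
# Level of risk = probability of the risk occurring x impact if it did.
def risk_level(probability, impact):
    # Assumed scales: both rated from 1 (low) to 5 (high).
    return probability * impact

project_risks = {
    "third party fails to deliver on time": risk_level(4, 3),
    "test environment not ready on time": risk_level(2, 5),
}
# Higher scores drive prioritisation: address the biggest risks first.
for name, level in sorted(project_risks.items(), key=lambda kv: -kv[1]):
    print(f"{level:2d}  {name}")
```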
Test Management • Incident management • "The process of recognising, investigating, taking action and disposing of incidents." • It involves recording incidents, classifying them and identifying the impact • The process of incident management ensures that incidents are tracked from recognition to correction, and finally through retest and closure • Configuration management • For testers, configuration management helps to uniquely identify (and to reproduce) the tested item, test documents, the tests and the test harness(es)
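As an assumption-labelled sketch, an incident record usually captures fields like the following (the field names are illustrative, not mandated by the syllabus):

```python
incident = {
    "id": "INC-042",                # unique identifier (hypothetical)
    "summary": "Login button unresponsive in Safari",
    "severity": "major",            # classification / impact
    "status": "open",               # tracked from recognition through
                                    # correction, retest and closure
    "tested_item": "webapp 2.3.1",  # configuration management ties the
                                    # incident to the exact build tested
}
```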
Tool support of testing • Types of test tools: • Tool support for management of testing and tests • Test management tools • Requirements management tools • Incident management tools (defect tracking tools) • Configuration management tools • Tool support for static testing • Review tools • Static analysis tools • Modeling tools • Tool support for test specification • Test design tools • Test data preparation tools • Tool support for test execution and logging • Tool support for performance and monitoring
Tool support of testing • Stages in the process of introducing a tool into an organisation • Objectives of the pilot project • Success factors for the tool deployment